WorldWideScience

Sample records for carlos apache indians

  1. San Carlos Apache Tribe - Energy Organizational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, James; Albert, Steve

    2012-04-01

    The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late 2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: the analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"); start-up staffing and other costs associated with the Phase 1 SCAT energy organization; an intern program; staff training; and tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.

  2. Solar Feasibility Study May 2013 - San Carlos Apache Tribe

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Jim [Parametrix]; Duncan, Ken [San Carlos Apache Tribe]; Albert, Steve [Parametrix]

    2013-05-01

    The San Carlos Apache Tribe (Tribe), in the interests of strengthening tribal sovereignty, becoming more energy self-sufficient, and providing improved services and economic opportunities to tribal members and San Carlos Apache Reservation (Reservation) residents and businesses, has explored a variety of options for renewable energy development. The development of renewable energy technologies and generation is consistent with the Tribe's 2011 Strategic Plan. This Study assessed the possibilities for both commercial-scale and community-scale solar development within the southwestern portions of the Reservation around the communities of San Carlos, Peridot, and Cutter, and in the southeastern Reservation around the community of Bylas. Given the lack of any commercial-scale electric power transmission between the Reservation and the regional transmission grid, Phase 2 of this Study greatly expanded consideration of community-scale options. Three smaller sites (Point of Pines, Dudleyville/Winkleman, and Seneca Lake) were also evaluated for community-scale solar potential. Three building complexes were identified within the Reservation where the development of site-specific facility-scale solar power would be the most beneficial and cost-effective: Apache Gold Casino/Resort, Tribal College/Skill Center, and the Dudleyville (Winkleman) Casino.

  3. Remote sensing analysis of vegetation at the San Carlos Apache Reservation, Arizona and surrounding area

    Science.gov (United States)

    Norman, Laura M.; Middleton, Barry R.; Wilson, Natalie R.

    2018-01-01

    Mapping of vegetation types is of great importance to the San Carlos Apache Tribe and their management of forestry and fire fuels. Various remote sensing techniques were applied to classify multitemporal Landsat 8 satellite data, vegetation index, and digital elevation model data. A multitiered unsupervised classification generated over 900 classes that were then recoded to one of the 16 generalized vegetation/land cover classes using the Southwest Regional Gap Analysis Project (SWReGAP) map as a guide. A supervised classification was also run using field data collected in the SWReGAP project and our field campaign. Field data were gathered and accuracy assessments were generated to compare outputs. Our hypothesis was that a resulting map would update and potentially improve upon the vegetation/land cover class distributions of the older SWReGAP map over the 24,000 km2 study area. The estimated overall accuracies ranged between 43% and 75%, depending on which method and field dataset were used. The findings demonstrate the complexity of vegetation mapping, the importance of recent, high-quality field data, and the potential for misleading results when insufficient field data are collected.
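
    The abstract's two-tier workflow (many unsupervised spectral classes, a recode to a handful of generalized cover classes, then an accuracy assessment against field data) can be sketched in a few lines. The sketch below uses scikit-learn's KMeans on simulated band values; the band count, class counts, and recode table are illustrative assumptions, not the study's data.

    ```python
    # Sketch of a two-tier unsupervised classification with a recode step and an
    # accuracy assessment (illustrative only; the study used multitemporal
    # Landsat 8 bands, >900 initial classes, and SWReGAP-guided recoding).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    pixels = rng.random((10_000, 7))   # stand-in for stacked bands + vegetation index

    # Tier 1: cluster pixels into many spectral classes.
    spectral_class = KMeans(n_clusters=50, n_init=3, random_state=0).fit_predict(pixels)

    # Tier 2: recode each spectral class to one of 16 generalized cover classes,
    # using a made-up lookup table in place of the SWReGAP-guided recode.
    recode = {c: c % 16 for c in range(50)}
    landcover = np.array([recode[c] for c in spectral_class])

    # Accuracy assessment against field observations (simulated here).
    field_truth = rng.integers(0, 16, size=200)
    print(f"overall accuracy: {accuracy_score(field_truth, landcover[:200]):.2%}")
    ```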

  4. Analysis of oil-bearing Cretaceous sandstone hydrocarbon reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico; TOPICAL

    International Nuclear Information System (INIS)

    Ridgley, Jennie; Wright Dunbar, Robyn

    2000-01-01

    This is the Phase One contract report to the United States Department of Energy, United States Geological Survey, and the Jicarilla Apache Indian Tribe on the project entitled "Outcrop Analysis of the Cretaceous Mesaverde Group: Jicarilla Apache Reservation, New Mexico." Field work for this project was conducted during July and August 1998, at which time fourteen measured sections were described and correlated on or adjacent to Jicarilla Apache Reservation lands. A fifteenth section, described east of the main field area, is included in this report, although its distant location precluded its use in the correlations and cross-sections presented herein. Ground-based photo mosaics were shot for much of the exposed Mesaverde outcrop belt and were used to assist in correlation. Outcrop gamma-ray surveys were conducted at six of the fifteen measured sections using a GAD-6 scintillometer. The raw gamma-ray data are included in this report; however, analysis of those data is part of the ongoing Phase Two of this project.

  5. Apache Maven cookbook

    CERN Document Server

    Bharathan, Raghuram

    2015-01-01

    If you are a Java developer or a manager who has experience with Apache Maven and want to extend your knowledge, then this is the ideal book for you. Apache Maven Cookbook is for those who want to learn how Apache Maven can be used for build automation. It is also meant for those familiar with Apache Maven but who want to understand the finer nuances of Maven and solve specific problems.

  6. Subsurface Analysis of the Mesaverde Group on and near the Jicarilla Apache Indian Reservation, New Mexico-its implication on Sites of Oil and Gas Accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie

    2001-08-21

    The purpose of the phase 2 Mesaverde study, part of the Department of Energy funded project "Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico," was to define the facies of the oil-producing units within the subsurface units of the Mesaverde Group and integrate these results with outcrop studies that defined the depositional environments of these facies within a sequence stratigraphic context. The focus of this report will center on (1) integration of subsurface correlations with outcrop correlations of components of the Mesaverde, (2) application of the sequence stratigraphic model determined in the phase one study to these correlations, (3) determination of the facies distribution of the Mesaverde Group and their relationship to sites of oil and gas accumulation, (4) evaluation of the thermal maturity and potential source rocks for oil and gas in the Mesaverde Group, and (5) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.

  7. Learning Apache Kafka

    CERN Document Server

    Garg, Nishant

    2015-01-01

    This book is for readers who want to know more about Apache Kafka at a hands-on level; the key audience is those with software development experience but no prior exposure to Apache Kafka or similar technologies. It is also useful for enterprise application developers and big data enthusiasts who have worked with other publisher-subscriber-based systems and want to explore Apache Kafka as a futuristic solution.

  8. Apache The Definitive Guide

    CERN Document Server

    Laurie, Ben

    2003-01-01

    Apache is far and away the most widely used web server platform in the world. This versatile server runs more than half of the world's existing web sites. Apache is both free and rock-solid, running more than 21 million web sites ranging from huge e-commerce operations to corporate intranets and smaller hobby sites. With this new third edition of Apache: The Definitive Guide, web administrators new to Apache will come up to speed quickly, experienced administrators will find the logically organized, concise reference sections indispensable, and system programmers interested in customizing ...

  9. Learning Apache Karaf

    CERN Document Server

    Edstrom, Johan; Kesler, Heath

    2013-01-01

    The book is a fast-paced guide full of step-by-step instructions covering all aspects of application development using Apache Karaf. Learning Apache Karaf will benefit all Java developers and system administrators who need to develop for and/or operate Karaf's OSGi-based runtime. Basic knowledge of Java is assumed.

  10. The APACHE Project

    Directory of Open Access Journals (Sweden)

    Giacobbe P.

    2013-04-01

    First, we summarize the four-year long efforts undertaken to build the final setup of the APACHE Project, a photometric transit search for small-size planets orbiting bright, low-mass M dwarfs. Next, we describe the present status of the APACHE survey, officially started in July 2012 at the site of the Astronomical Observatory of the Autonomous Region of the Aosta Valley, in the Western Italian Alps. Finally, we briefly discuss the potentially far-reaching consequences of a multi-technique characterization program of the (potentially planet-bearing) APACHE targets.

  11. Instant Apache Wicket 6

    CERN Document Server

    Longo, João Sávio Ceregatti

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This Starter style guide takes the reader through the basic workflow of Apache Wicket in a practical and friendly style. Instant Apache Wicket 6 is for people who want to learn the basics of Apache Wicket 6 and who already have some experience with Java and object-oriented programming. Basic knowledge of web concepts like HTTP and Ajax will be an added advantage.

  12. Apache Mahout essentials

    CERN Document Server

    Withanawasam, Jayani

    2015-01-01

    If you are a Java developer or data scientist, haven't worked with Apache Mahout before, and want to get up to speed on implementing machine learning on big data, then this is the perfect guide for you.

  13. Apache Solr essentials

    CERN Document Server

    Gazzarini, Andrea

    2015-01-01

    If you are a competent developer with experience of working with technologies similar to Apache Solr and want to develop efficient search applications, then this book is for you. Familiarity with the Java programming language is required.

  14. Apache Mahout cookbook

    CERN Document Server

    Giacomelli, Piero

    2013-01-01

    Apache Mahout Cookbook uses over 35 recipes packed with illustrations and real-world examples to help beginners as well as advanced programmers get acquainted with the features of Mahout. Apache Mahout Cookbook is great for developers who want to have a fresh and fast introduction to Mahout coding. No previous knowledge of Mahout is required, and even skilled developers or system administrators will benefit from the various recipes presented.

  15. Instant Apache Maven starter

    CERN Document Server

    Turatti, Maurizio

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. The book follows a starter approach for using Maven to create and build a new Java application or Web project from scratch. Instant Apache Maven Starter is great for Java developers new to Apache Maven, but also for experts looking for immediate information. Moreover, only 20% of the necessary information about Maven is used in 80% of the activities. This book aims to focus on the most important information, those pragmatic parts you actually use.

  16. Apache Tomcat 7 Essentials

    CERN Document Server

    Khare, Tanuj

    2012-01-01

    This book is a step-by-step tutorial for anyone wanting to learn Apache Tomcat 7 from scratch. There are plenty of illustrations and examples to escalate you from a novice to an expert with minimal strain. If you are a J2EE administrator, migration administrator, technical architect, or a project manager for a web hosting domain, and are interested in Apache Tomcat 7, then this book is for you. If you are responsible for the installation, configuration, and management of Tomcat 7, this book will help you as well.

  17. Sequence Stratigraphic Analysis and Facies Architecture of the Cretaceous Mancos Shale on and Near the Jicarilla Apache Indian Reservation, New Mexico-their relation to Sites of Oil Accumulation; FINAL

    International Nuclear Information System (INIS)

    Ridgley, Jennie

    2001-01-01

    The purpose of phase 1 and phase 2 of the Department of Energy funded project "Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico" was to define the facies of the oil-producing units within the Mancos Shale and interpret the depositional environments of these facies within a sequence stratigraphic context. The focus of this report will center on (1) redefinition of the area and vertical extent of the "Gallup sandstone" or El Vado Sandstone Member of the Mancos Shale, (2) determination of the facies distribution within the "Gallup sandstone" and other oil-producing sandstones within the lower Mancos, placing these facies within the overall depositional history of the San Juan Basin, (3) application of the principles of sequence stratigraphy to the depositional units that comprise the Mancos Shale, and (4) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.

  18. Apaches push privatization

    International Nuclear Information System (INIS)

    Daniels, S.

    1994-01-01

    Trying to drum up business for what would be the first private temporary storage facility for spent nuclear fuel rods, the Mescalero Apaches are inviting officials of 30 utilities to convene March 10 at the tribe's New Mexico reservation. The state public utilities commission will also attend the meeting, which grew from an agreement the tribe signed last month with Minneapolis-based Northern States Power Co.

  19. Mastering Apache Cassandra

    CERN Document Server

    Neeraj, Nishant

    2013-01-01

    Mastering Apache Cassandra is a practical, hands-on guide with step-by-step instructions. The smooth and easy tutorial approach focuses on showing people how to utilize Cassandra to its full potential. This book is aimed at intermediate Cassandra users. It is best suited for startups where developers have to wear multiple hats: programming, DevOps, release management, client liaison, and failure handling. No prior knowledge of Cassandra is required.

  20. Apache 2 Pocket Reference For Apache Programmers & Administrators

    CERN Document Server

    Ford, Andrew

    2008-01-01

    Even if you know the Apache web server inside and out, you still need an occasional on-the-job reminder -- especially if you're moving to the newer Apache 2.x. Apache 2 Pocket Reference gives you exactly what you need to get the job done without forcing you to plow through a cumbersome, doorstop-sized reference. This book provides essential information to help you configure and maintain the server quickly, with brief explanations that get directly to the point. It covers Apache 2.x, giving web masters, web administrators, and programmers a quick and easy reference solution. This pocket reference ...

  1. Apache Cordova 3 programming

    CERN Document Server

    Wargo, John M

    2013-01-01

    Written for experienced mobile developers, Apache Cordova 3 Programming is a complete introduction to Apache Cordova 3 and Adobe PhoneGap 3. It describes what makes Cordova important and shows how to install and use the tools, the new Cordova CLI, the native SDKs, and more. If you’re brand new to Cordova, this book will be just what you need to get started. If you’re familiar with an older version of Cordova, this book will show you in detail how to use all of the new stuff that’s in Cordova 3 plus stuff that has been around for a while (like the Cordova core APIs). After walking you through the process of downloading and setting up the framework, mobile expert John M. Wargo shows you how to install and use the command line tools to manage the Cordova application lifecycle and how to set up and use development environments for several of the more popular Cordova supported mobile device platforms. Of special interest to new developers are the chapters on the anatomy of a Cordova application, as well ...

  2. Preliminary Assessment of Apache Hopefulness: Relationships with Hopelessness and with Collective as well as Personal Self-Esteem

    Science.gov (United States)

    Hammond, Vanessa Lea; Watson, P. J.; O'Leary, Brian J.; Cothran, D. Lisa

    2009-01-01

    Hopelessness is central to prominent mental health problems within American Indian (AI) communities. Apaches living on a reservation in Arizona responded to diverse expressions of hope along with Hopelessness, Personal Self-Esteem, and Collective Self-Esteem scales. An Apache Hopefulness Scale expressed five themes of hope and correlated…

  3. Apache ZooKeeper essentials

    CERN Document Server

    Haloi, Saurav

    2015-01-01

    Whether you are a novice to ZooKeeper or already have some experience, you will be able to master the concepts of ZooKeeper and its usage with ease. This book assumes some prior knowledge of distributed systems and high-level programming knowledge of C, Java, or Python, but no experience with Apache ZooKeeper is required.

  4. Instant Apache Camel message routing

    CERN Document Server

    Ibryam, Bilgin

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. This short, instruction-based guide shows you how to perform application integration using the industry-standard Enterprise Integration Patterns. This book is intended for Java developers who are new to Apache Camel and message-oriented applications.

  5. The Jicarilla Apaches. A Study in Survival.

    Science.gov (United States)

    Gunnerson, Dolores A.

    Focusing on the ultimate fate of the Cuartelejo and/or Paloma Apaches known in archaeological terms as the Dismal River people of the Central Plains, this book is divided into 2 parts. The early Apache (1525-1700) and the Jicarilla Apache (1700-1800) tribes are studied in terms of their: persistent cultural survival, social/political adaptability,…

  6. SEQUENCE STRATIGRAPHIC ANALYSIS AND FACIES ARCHITECTURE OF THE CRETACEOUS MANCOS SHALE ON AND NEAR THE JICARILLA APACHE INDIAN RESERVATION, NEW MEXICO-THEIR RELATION TO SITES OF OIL ACCUMULATION

    International Nuclear Information System (INIS)

    Jennie Ridgley

    2000-01-01

    Oil distribution in the lower part of the Mancos Shale seems to be mainly controlled by fractures and by sandier facies that are dolomite-cemented. Structure in the area of the Jicarilla Apache Indian Reservation consists of the broad northwest- to southeast-trending Chaco slope, the deep central basin, and the monocline that forms the eastern boundary of the San Juan Basin. Superimposed on the regional structure are broad low-amplitude folds. Fractures seem best developed in the areas of these folds. Using sequence stratigraphic principles, the lower part of the Mancos Shale has been subdivided into four main regressive and transgressive components. These include facies that are the basinal time equivalents to the Gallup Sandstone, an overlying interbedded sandstone and shale sequence time equivalent to the transgressive Mulatto Tongue of the Mancos Shale, the El Vado Sandstone Member, which is time equivalent to part of the Dalton Sandstone, and an unnamed interbedded sandstone and shale succession time equivalent to the regressive Dalton Sandstone and transgressive Hosta Tongue of the Mesaverde Group. Facies time equivalent to the Gallup Sandstone underlie an unconformity of regional extent. These facies are gradually truncated from south to north across the Reservation. The best potential for additional oil resources in these facies is in the southern part of the Reservation, where the top sandier part of these facies is preserved. The overlying unnamed wedge of transgressive rocks produces some oil but is underexplored, except for sandstones equivalent to the Tocito Sandstone. This wedge of rocks is divided into two to five units. The highest sand content in this wedge occurs where each of the four subdivisions above the Tocito terminates to the south and is overstepped by the next youngest unit. These terminal areas should offer the best targets for future oil exploration. The El Vado Sandstone Member overlies the transgressive wedge. It produces most of ...

  7. Learning Apache Solr high performance

    CERN Document Server

    Mohan, Surendra

    2014-01-01

    This book is an easy-to-follow guide, full of hands-on, real-world examples. Each topic is explained and demonstrated in a specific and user-friendly flow, from search optimization using Solr to deployment of ZooKeeper applications. This book is ideal for Apache Solr developers who want to learn different techniques to optimize Solr performance with utmost efficiency, along with effectively troubleshooting the problems that usually occur while trying to boost performance. Familiarity with search servers and database querying is expected.

  8. Instant Apache Camel messaging system

    CERN Document Server

    Sharapov, Evgeniy

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A beginner's guide to Apache Camel that walks you through basic operations like installation and setup right through to developing simple applications. This book is a good starting point for Java developers who have to work on an application dealing with various systems and interfaces but who haven't yet started using Enterprise System Buses or Java Business Integration frameworks.

  9. Mescalero Apache Tribe Monitored Retrievable Storage (MRS)

    Energy Technology Data Exchange (ETDEWEB)

    Peso, F.

    1992-03-13

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS; and (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, if any, under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings.

  10. Mescalero Apache Tribe Monitored Retrievable Storage (MRS)

    International Nuclear Information System (INIS)

    Peso, F.

    1992-01-01

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS; and (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, if any, under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings.

  11. Nuclear data physics issues in Monte Carlo simulations of neutron and photon transport in the Indian context

    International Nuclear Information System (INIS)

    Ganesan, S.

    2009-01-01

    In this write-up, some of the basic issues of nuclear data physics in Monte Carlo simulation of neutron transport in the Indian context are dealt with. In this lecture, some of the aspects associated with usage of the ENDF/B system, and of the PREPRO code system developed by D.E. Cullen and distributed by the IAEA Nuclear Data Section, are briefly touched upon. Some aspects of the SIGACE code system, which was developed by the author in collaboration with IPR, Ahmedabad and the IAEA Nuclear Data Section, are also briefly covered. The validation of the SIGACE package included investigations using the NJOY and the MCNP compatible ACE files. Appendix-1 of the paper provides some useful discussions pointing out that voluminous and high-quality nuclear physics data required for nuclear applications usually evolve from a national effort to provide state-of-the-art data that are based upon established needs and uncertainties. Appendix-2 deals with some interesting work that was carried out using the SIGACE Code for Generating High Temperature ACE Files. Appendix-3 briefly mentions integral nuclear data validation studies and the use of Monte Carlo codes and nuclear data. Appendix-4 provides a brief summary report on selected Indian nuclear data physics activities for the interested reader, in the light of BARC/DAE treating the subject area of nuclear data physics as a thrust area in our atomic energy programme.

  12. The Creation of a Carmeleño Identity:Marriage Practices in the Indian Village at Mission San Carlos Borromeo del Río Carmel

    OpenAIRE

    Peelo, Sarah

    2010-01-01

    Indigenous peoples from diverse tribelets lived within the Indian village at Mission San Carlos Borromeo del Río Carmel. In precolonial times, California Indians formed identities tied to their tribelets. In the mission, those identities were reproduced as members of this pluralistic community formed a connection with their new place of residence. In this paper, I illustrate how marriage was one arena within which different indigenous peoples at this mission may have created a shared sense of ...

  13. Random Decision Forests on Apache Spark

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    About the speaker: Tom White has been an Apache Hadoop committer since February 2007, and is a member of the Apache Software Foundation. He works for Cloudera, a company set up to offer Hadoop support and training. Previously he worked as an independent Hadoop consultant ...
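
    The record above is a speaker bio, but the talk's subject lends itself to a brief illustration. Below is a minimal sketch of training a random forest with Spark's DataFrame-based MLlib API; the input path and column names are hypothetical.

    ```python
    # Minimal PySpark random-forest sketch (hypothetical data path and schema).
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import RandomForestClassifier

    spark = SparkSession.builder.appName("rf-demo").getOrCreate()

    # Assume a CSV with numeric feature columns f0..f3 and an integer 'label' column.
    df = spark.read.csv("features.csv", header=True, inferSchema=True)
    assembler = VectorAssembler(inputCols=["f0", "f1", "f2", "f3"], outputCol="features")

    train, test = assembler.transform(df).randomSplit([0.8, 0.2], seed=42)
    model = RandomForestClassifier(labelCol="label", numTrees=100).fit(train)
    model.transform(test).select("label", "prediction").show(5)
    ```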

  14. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    D'Souza, Subas

    2013-01-01

    A starter guide that covers Apache Flume in detail. Apache Flume: Distributed Log Collection for Hadoop is intended for people who are responsible for moving datasets into Hadoop in a timely and reliable manner, such as software engineers, database administrators, and data warehouse administrators.

  15. The Apache OODT Project: An Introduction

    Science.gov (United States)

    Mattmann, C. A.; Crichton, D. J.; Hughes, J. S.; Ramirez, P.; Goodale, C. E.; Hart, A. F.

    2012-12-01

    Apache OODT is a science data system framework, born over the past decade, with 100s of FTEs of investment, tens of sponsoring agencies (NASA, NIH/NCI, DoD, NSF, universities, etc.), and hundreds of projects and science missions that it powers every day to their success. At its core, Apache OODT carries with it two fundamental classes of software services and components: those that deal with information integration from existing science data repositories and archives, which themselves have already-in-use business processes and models for populating those archives. Information integration allows search, retrieval, and dissemination across these heterogeneous systems, and ultimately rapid, interactive data access and retrieval. The other suite of services and components within Apache OODT handles population and processing of those data repositories and archives. Workflows, resource management, crawling, remote data retrieval, curation and ingestion, along with science data algorithm integration, all are part of these Apache OODT software elements. In this talk, I will provide an overview of the use of Apache OODT to unlock and populate information from science data repositories and archives. We'll cover the basics, along with some advanced use cases and success stories.
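
    As a rough illustration of the crawl/extract/ingest pattern that the OODT components described above implement, here is a plain-Python sketch. It is not the OODT API; the staging directory, file pattern, and metadata fields are invented for the example.

    ```python
    # Conceptual sketch of the crawl -> extract metadata -> ingest pattern that
    # OODT's CAS components implement; NOT the OODT API, just the idea in Python.
    import hashlib
    from pathlib import Path

    catalog = []  # stand-in for a file-manager catalog/archive

    def extract_metadata(path: Path) -> dict:
        """Derive simple metadata from a product file (real extractors parse headers)."""
        data = path.read_bytes()
        return {
            "name": path.name,
            "size_bytes": len(data),
            "checksum": hashlib.md5(data).hexdigest(),
        }

    def crawl_and_ingest(staging_dir: str) -> None:
        """Crawl a staging area and ingest each product with its metadata."""
        for path in Path(staging_dir).glob("*.dat"):
            catalog.append(extract_metadata(path))

    crawl_and_ingest("/tmp/staging")   # hypothetical staging directory
    print(f"ingested {len(catalog)} products")
    ```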

  16. Conservation priorities in the Apache Highlands ecoregion

    Science.gov (United States)

    Dale Turner; Rob Marshall; Carolyn A. F. Enquist; Anne Gondor; David F. Gori; Eduardo Lopez; Gonzalo Luna; Rafaela Paredes Aguilar; Chris Watts; Sabra Schwartz

    2005-01-01

    The Apache Highlands ecoregion incorporates the entire Madrean Archipelago/Sky Island region. We analyzed the current distribution of 223 target species and 26 terrestrial ecological systems there, and compared them with constraints on ecosystem integrity (e.g., road density) to determine the most efficient set of areas needed to maintain current biodiversity. The...

  17. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    Hoffman, Steve

    2015-01-01

    If you are a Hadoop programmer who wants to learn about Flume to be able to move datasets into Hadoop in a timely and replicable manner, then this book is ideal for you. No prior knowledge about Apache Flume is necessary, but a basic knowledge of Hadoop and the Hadoop File System (HDFS) is assumed.

  18. [Validity of APACHE II, APACHE III, SAPS 2, SAPS 3 and SOFA scales in obstetric patients with sepsis].

    Science.gov (United States)

    Zabolotskikh, I B; Musaeva, T S; Denisova, E A

    2012-01-01

    The aim was to estimate the efficiency of the APACHE II, APACHE III, SAPS 2, SAPS 3, and SOFA scales for obstetric patients with severe sepsis. A retrospective analysis of medical records was performed: 186 pregnant women with pulmonary sepsis, 40 women with urosepsis, and 66 puerperas with abdominal sepsis. The women's average age was 26.7 (22.4-34.5) years. In the population of puerperas with abdominal sepsis, the APACHE II, APACHE III, SAPS 2, SAPS 3, and SOFA scales showed good calibration; however, high resolution was observed only for APACHE III, SAPS 3, and SOFA (AUROC 0.95, 0.93, and 0.92, respectively). The APACHE III and SOFA scales provided a qualitative prognosis in pregnant women with urosepsis; the resolution of these scales considerably exceeded that of APACHE II, SAPS 2, and SAPS 3 (AUROC 0.73, 0.74, and 0.79, respectively). In pregnant women with pulmonary sepsis, the APACHE II scale was inapplicable because of a lack of calibration (χ2 = 13.1; p < 0.01), and the other scales (APACHE III, SAPS 2, SAPS 3, SOFA) showed insufficient resolution (AUROC < 0.9). Assessment of the prognostic capabilities of the scoring scales showed that APACHE III, SAPS 3, and SOFA can be used for mortality prognosis in puerperas with abdominal sepsis; in pregnant women with urosepsis, only APACHE III and SOFA; and in pulmonary sepsis, SAPS 3 and APACHE III only in combination with additional clinical information.
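
    For readers unfamiliar with the metrics above: "resolution" is discrimination, usually summarized by the area under the ROC curve (AUROC), while calibration is typically tested with a Hosmer-Lemeshow-type χ2 statistic. A minimal sketch of the AUROC computation on simulated patients (not the study's data):

    ```python
    # Sketch of how discrimination ("resolution") of a severity score is assessed,
    # as in the abstract's AUROC figures; patient data here are simulated.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n = 200
    died = rng.integers(0, 2, size=n)            # simulated outcomes
    # Simulated APACHE-style scores: non-survivors skew higher on average.
    score = rng.normal(15, 5, size=n) + 6 * died

    auroc = roc_auc_score(died, score)
    print(f"AUROC = {auroc:.2f}")   # the study treated AUROC >= ~0.9 as adequate
    ```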

  19. Optimizing CMS build infrastructure via Apache Mesos

    CERN Document Server

    Abduracmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad

    2015-12-23

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  20. The White Mountain Recreational Enterprise: Bio-Political Foundations for White Mountain Apache Natural Resource Control, 1945–1960

    Directory of Open Access Journals (Sweden)

    David C. Tomblin

    2016-07-01

    Among American Indian nations, the White Mountain Apache Tribe has been at the forefront of a struggle to control natural resource management within reservation boundaries. In 1952, they developed the first comprehensive tribal natural resource management program, the White Mountain Recreational Enterprise (WMRE), which became a cornerstone for fighting legal battles over the tribe's right to manage cultural and natural resources on the reservation for the benefit of the tribal community rather than outside interests. This article examines how White Mountain Apaches used the WMRE, while embracing both Euro-American and Apache traditions, as an institutional foundation for resistance and exchange with Euro-American society so as to reassert control over tribal eco-cultural resources in east-central Arizona.

  1. Network Intrusion Detection System using Apache Storm

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Manzoor

    2017-06-01

    Network security implements various strategies for the identification and prevention of security breaches. Network intrusion detection is a critical component of network management for security, quality of service, and other purposes. These systems allow early detection of network intrusion and malicious activities, so that the network security infrastructure can react to mitigate these threats. Various systems have been proposed to enhance network security. We propose to use an anomaly-based network intrusion detection system in this work, since anomaly-based systems can identify new network threats. We also propose to use a real-time big data stream processing framework, Apache Storm, for the implementation of the network intrusion detection system. Apache Storm can help to manage network traffic, which is generated at enormous speed and size and is constantly increasing. We use a Support Vector Machine as the classifier, and the Knowledge Discovery and Data Mining 1999 (KDD'99) dataset to test and evaluate our proposed solution.
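
    A minimal sketch of the classification step described above, using a scikit-learn SVM on simulated KDD'99-style records; in the paper this logic runs inside an Apache Storm topology, which is omitted here. Feature values and labels are simulated.

    ```python
    # Sketch of the detection logic: an SVM over KDD'99-style connection features.
    # Offline scikit-learn code on simulated data; the Storm topology is omitted.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    X = rng.random((1_000, 41))            # KDD'99 records have 41 features
    y = rng.integers(0, 2, size=1_000)     # 0 = normal traffic, 1 = intrusion

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    scaler = StandardScaler().fit(X_train)

    clf = SVC(kernel="rbf").fit(scaler.transform(X_train), y_train)
    print(f"test accuracy: {clf.score(scaler.transform(X_test), y_test):.2%}")
    ```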

  2. LHCbDIRAC as Apache Mesos microservices

    CERN Multimedia

    Couturier, Ben

    2016-01-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called "frameworks". The Marathon framework is suitable for long-running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy allows the running containers to be exposed to the outside world while hiding their dynamic placements. Such an arc...

  3. LHCbDIRAC as Apache Mesos microservices

    Science.gov (United States)

    Haen, Christophe; Couturier, Benjamin

    2017-10-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called "frameworks". The Marathon framework is suitable for long-running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy allows the running containers to be exposed to the outside world while hiding their dynamic placements. Such an architecture brings greater flexibility in the deployment of LHCbDIRAC services, allowing for easier deployment, maintenance, and scaling of services on demand (e.g., LHCbDIRAC relies on 138 services and 116 agents). Higher reliability is also easier to achieve, as clustering is part of the toolset, which allows constraints on the location of the services. This paper describes the investigations carried out to package the LHCbDIRAC and DIRAC components into Docker containers and orchestrate them using the previously described set of tools.
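
    As an illustration of the Marathon side of this architecture, the sketch below POSTs a Docker-based app definition to Marathon's REST API (/v2/apps). The host name, image, and resource figures are assumptions for the example, not LHCbDIRAC's actual deployment.

    ```python
    # Sketch of deploying a long-running DIRAC-style service as a Marathon app
    # via Marathon's REST API (POST /v2/apps). Host, image, and resources are
    # illustrative assumptions.
    import requests

    app_definition = {
        "id": "/lhcbdirac/example-service",       # hypothetical service id
        "container": {
            "type": "DOCKER",
            "docker": {"image": "lhcbdirac/service:latest"},  # hypothetical image
        },
        "cpus": 0.5,
        "mem": 512,
        "instances": 2,                           # Marathon keeps 2 copies running
    }

    resp = requests.post("http://marathon.example.org:8080/v2/apps",
                         json=app_definition, timeout=10)
    resp.raise_for_status()
    print("deployment accepted:", resp.json().get("deployments"))
    ```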

  4. Apache, Santa Fe energy units awarded two Myanmar blocks

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that Myanmar's state oil company has awarded production sharing contracts (PSCs) on two blocks to units of Apache Corp. and Santa Fe Energy Resources Inc., both of Houston. That comes on the heels of a report by County NatWest Woodmac noting that Myanmar's oil production, currently meeting less than half the country's demand, is set to fall further this year. 150 line km of new seismic data could be acquired and one well drilled. During the initial 2 year exploration period on Block EP-3, Apache will conduct geological studies and acquire at least 200 line km of seismic data.

  5. 77 FR 51475 - Safety Zone; Apache Pier Labor Day Fireworks; Myrtle Beach, SC

    Science.gov (United States)

    2012-08-24

    ...-AA00 Safety Zone; Apache Pier Labor Day Fireworks; Myrtle Beach, SC AGENCY: Coast Guard, DHS. ACTION... Atlantic Ocean in the vicinity of Apache Pier in Myrtle Beach, SC, during the Labor Day fireworks... [[Page 51476

  6. 77 FR 15796 - Notice of Intent To Repatriate Cultural Items: U.S. Department of the Interior, Bureau of Indian...

    Science.gov (United States)

    2012-03-16

    ... stone and 1 chert scraper. The Pinnacle Site consists of a pueblo of about 10 rooms and dates from A.D... with additional stone alignments and dates from A.D. 1275-1400, based on the ceramic assemblage. The... Assessment of White Mountain Apache Tribal Lands (Fort Apache Indian Reservation),'' by John R. Welch and T.J...

  7. A Modified APACHE II Score for Predicting Mortality of Variceal ...

    African Journals Online (AJOL)

    Conclusion: The modified APACHE II score is effective in predicting the outcome of patients with variceal bleeding. A score of ≥15 points and a long ICU stay are associated with high mortality. Keywords: liver cirrhosis, periportal fibrosis, portal hypertension, schistosomiasis. Sudan Journal of Medical Sciences Vol. 2 (2) 2007: pp. 105- ...

  8. Fallugia paradoxa (D. Don) Endl. ex Torr.: Apache-plume

    Science.gov (United States)

    Susan E. Meyer

    2008-01-01

    The genus Fallugia contains a single species - Apache-plume, F. paradoxa (D. Don) Endl. ex Torr. - found throughout the southwestern United States and northern Mexico. It occurs mostly on coarse soils on benches and especially along washes and canyons in both warm and cool desert shrub communities and up into the pinyon-juniper vegetation type. It is a sprawling, much-...

  9. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    DEFF Research Database (Denmark)

    Majewski, Steven R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.

    2017-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the...

  10. Ergonomic and anthropometric issues of the forward Apache crew station

    NARCIS (Netherlands)

    Oudenhuijzen, A.J.K.

    1999-01-01

    This paper describes the anthropometric accommodation in the Apache crew systems. These activities are part of a comprehensive project, in a cooperative effort from the Armstrong Laboratory at Wright Patterson Air Force Base (Dayton, Ohio, USA) and TNO Human Factors Research Institute (TNO HFRI) in ...

  11. FEASIBILITY STUDY FOR A PETROLEUM REFINERY FOR THE JICARILLA APACHE TRIBE

    International Nuclear Information System (INIS)

    Jones, John D.

    2004-01-01

    A feasibility study for a proposed petroleum refinery for the Jicarilla Apache Indian Reservation was performed. The available crude oil production was identified and characterized. There are 6,000 barrels per day of crude oil production available for processing in the proposed refinery. The proposed refinery will utilize a lower temperature, smaller crude fractionation unit. It will have a naphtha hydrodesulfurizer and reformer to produce high-octane gasoline. The surplus hydrogen from the reformer will be used in a specialized hydrocracker to convert the heavier crude oil fractions to ultra-low-sulfur gasoline and diesel fuel products. The proposed refinery will produce gasoline, jet fuel, diesel fuel, and a minimal amount of lube oil. The refinery will require about $86,700,000 to construct. It will have a net annual pre-tax profit of about $17,000,000. The estimated return on investment is 20%. The feasibility is positive, subject to confirmation of a long-term crude supply. The study also identified procedures for evaluating processing options as a means for American Indian Tribes and Native American Corporations to maximize the value of their crude oil production.
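
    The quoted 20% return follows directly from the cost and profit figures above; a quick check:

    ```python
    # Quick check of the stated economics: $17M annual pre-tax profit on an
    # $86.7M construction cost gives roughly the 20% return quoted above.
    construction_cost = 86_700_000
    annual_pretax_profit = 17_000_000

    roi = annual_pretax_profit / construction_cost
    payback_years = construction_cost / annual_pretax_profit

    print(f"simple ROI: {roi:.1%}")            # ~19.6%, consistent with "about 20%"
    print(f"simple payback: {payback_years:.1f} years")
    ```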

  12. Uncomfortable Experience: Lessons Lost in the Apache War

    Science.gov (United States)

    2015-03-01

    ...the Apache War gripped the focus of American and Mexican citizens throughout Arizona, New Mexico, Chihuahua, and Sonora for a period greater than... Arizona and portions of New Mexico, and northern Sonora and Chihuahua. Although confusion exists as to their true subdivisions, the Chokonen led by... contributed directly to the Victorio War, the Loco and Geronimo campaigns, and the Nana and Chatto-Chihuahua raids that followed. Once again, failure to...

  13. Integration of event streaming and microservices with Apache Kafka

    OpenAIRE

    Kljun, Matija

    2017-01-01

    Over the last decade, the microservice architecture has become a standard for big and successful internet companies, like Netflix, Amazon, and LinkedIn. The importance of stream processing, aggregation, and exchange of data is growing, as it allows companies to compete better and move faster. In this diploma thesis, we have analyzed the interactions between microservices and described the streaming platform and ordinary message queues. We have described the Apache Kafka platform and how...
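
    A minimal sketch of the event-streaming pattern the thesis discusses: one service publishes events to a Kafka topic and another consumes them independently. The kafka-python client, broker address, and topic are assumptions for the example.

    ```python
    # Minimal sketch of two microservices exchanging events through Kafka, using
    # the kafka-python client (an assumption; the thesis does not prescribe one).
    import json
    from kafka import KafkaProducer, KafkaConsumer

    # Service A publishes an order event to a topic.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("orders", {"order_id": 42, "status": "created"})
    producer.flush()

    # Service B consumes the event stream independently, at its own pace.
    consumer = KafkaConsumer(
        "orders",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print("received:", message.value)
        break   # demo: stop after one event
    ```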

  14. Satellite Imagery Production and Processing Using Apache Hadoop

    Science.gov (United States)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale, processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework and can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations and finally challenges and ongoing activities. It will also present how the LSRD project built a 102 core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
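
    A sketch of how such processing maps onto Hadoop Streaming: each input line names one whole scene, so unsplittable satellite scenes become single map tasks. The scene processing below is a stand-in, not the LSRD project's actual code.

    ```python
    #!/usr/bin/env python3
    # Sketch of a Hadoop Streaming mapper for scene-level processing: each input
    # line names one Landsat scene, so whole scenes (which cannot be split the
    # way text can) map to single tasks. Scene IDs and the "processing" are
    # illustrative stand-ins.
    import sys

    def process_scene(scene_id: str) -> str:
        """Stand-in for real per-scene science processing (e.g., an ECV algorithm)."""
        return f"processed:{scene_id}"

    for line in sys.stdin:
        scene_id = line.strip()
        if scene_id:
            # Hadoop Streaming expects tab-separated key/value pairs on stdout.
            print(f"{scene_id}\t{process_scene(scene_id)}")
    ```

    Under these assumptions the mapper would run with Hadoop Streaming flags along the lines of -input scene_list.txt -mapper mapper.py -output results.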

  15. Mechanical characterization of densely welded Apache Leap tuff

    International Nuclear Information System (INIS)

    Fuenkajorn, K.; Daemen, J.J.K.

    1991-06-01

    An empirical criterion is formulated to describe the compressive strength of the densely welded Apache Leap tuff. The criterion incorporates the effects of size, L/D ratio, loading rate and density variations. The criterion improves the correlation between the test results and the failure envelope. Uniaxial and triaxial compressive strengths, Brazilian tensile strength and elastic properties of the densely welded brown unit of the Apache Leap tuff have been determined using the ASTM standard test methods. All tuff samples were tested dry at room temperature (22 ± 2 degrees C), and had the core axis normal to the flow layers. The uniaxial compressive strength is 73.2 ± 16.5 MPa. The Brazilian tensile strength is 5.12 ± 1.2 MPa. The Young's modulus and Poisson's ratio are 22.6 ± 5.7 GPa and 0.20 ± 0.03. Smoothness and perpendicularity do not fully meet the ASTM requirements for all samples, due to the presence of voids and inclusions on the sample surfaces and the sample preparation methods. The investigations of loading rate, L/D ratio and cyclic loading effects on the compressive strength and of the size effect on the tensile strength are not conclusive. The Coulomb strength criterion adequately represents the failure envelope of the tuff under confining pressures from 0 to 62 MPa. Cohesion and internal friction angle are 16 MPa and 43 degrees. The brown unit of the Apache Leap tuff is highly heterogeneous, as suggested by large variations of the test results. The high intrinsic variability of the tuff is probably caused by the presence of flow layers and by nonuniform distributions of inclusions, voids and degree of welding. Similar variability of the properties has been found in publications on the Topopah Spring tuff at Yucca Mountain. 57 refs., 32 figs., 29 tabs.
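
    The Coulomb envelope reported above can be written out explicitly; with the stated cohesion and internal friction angle, the failure criterion (a standard form, spelled out here for clarity) is:

    ```latex
    % Coulomb failure criterion with the reported parameters
    \tau = c + \sigma_n \tan\phi
         = 16\,\text{MPa} + \sigma_n \tan 43^\circ
         \approx 16\,\text{MPa} + 0.93\,\sigma_n
    ```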

  16. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
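
    Giraph programs are written in Java against its vertex-centric Computation API; as a language-neutral illustration of the Pregel-style "think like a vertex" model the book covers, here is a plain-Python superstep loop computing PageRank on a toy graph (conceptual only, not the Giraph API).

    ```python
    # Plain-Python illustration of the Pregel-style, vertex-centric model that
    # Giraph implements (Giraph itself is Java): each superstep, vertices consume
    # incoming messages, update state, and send messages along out-edges.
    graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}       # toy out-edge lists
    rank = {v: 1.0 / len(graph) for v in graph}
    DAMPING, SUPERSTEPS = 0.85, 20

    for _ in range(SUPERSTEPS):
        # "Messages" sent this superstep: each vertex shares rank over out-edges.
        inbox = {v: [] for v in graph}
        for v, out_edges in graph.items():
            for target in out_edges:
                inbox[target].append(rank[v] / len(out_edges))
        # Vertex compute step: combine incoming messages into the new state.
        rank = {v: (1 - DAMPING) / len(graph) + DAMPING * sum(inbox[v])
                for v in graph}

    print({v: round(r, 3) for v, r in rank.items()})
    ```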

  17. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  18. Beginning PHP, Apache, MySQL web development

    CERN Document Server

    Glass, Michael K; Naramore, Elizabeth; Mailer, Gary; Stolz, Jeremy; Gerner, Jason

    2004-01-01

    An ideal introduction to the entire process of setting up a Web site using PHP (a scripting language), MySQL (a database management system), and Apache (a Web server). Programmers will be up and running in no time, whether they're using Linux or Windows servers. Shows readers step by step how to create several Web sites that share common themes, enabling readers to use these examples in real-world projects. Invaluable reading for even the experienced programmer whose current site has outgrown the traditional static structure and who is looking for a way to upgrade to a more efficient, user-friendly ...

  19. Perl and Apache Your visual blueprint for developing dynamic Web content

    CERN Document Server

    McDaniel, Adam

    2010-01-01

    Visually explore the range of built-in and third-party libraries of Perl and Apache. Perl and Apache have been providing Common Gateway Interface (CGI) access to Web sites for 20 years and are constantly evolving to support the ever-changing demands of Internet users. With this book, you will heighten your knowledge and see how to use Perl and Apache to develop dynamic Web sites. Beginning with a clear, step-by-step explanation of how to install Perl and Apache on both Windows and Linux servers, you then move on to configuring each to securely provide CGI services. CGI developer and author Adam McDaniel ...

  20. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    International Nuclear Information System (INIS)

    Lehrack, S; Duckeck, G; Ebke, J

    2014-01-01

    The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives in distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data fast and reliably with MapReduce. In the evaluation of the HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.

  1. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    Science.gov (United States)

    Lehrack, S.; Duckeck, G.; Ebke, J.

    2014-06-01

    The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives in distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data fast and reliably with MapReduce. In the evaluation of the HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
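
    A sketch of the "MapReduce as execution back-end" pattern both records describe: since binary ROOT files cannot be split like text, each input record is a file path and one map task processes one whole file. The analysis executable invoked here is a hypothetical stand-in.

    ```python
    #!/usr/bin/env python3
    # Sketch of whole-file processing under Hadoop Streaming: input records are
    # file paths (binary ROOT files are not splittable), each mapper runs the
    # analysis on one whole file and emits an event count for the reducer to sum.
    # The analysis command is a hypothetical stand-in, not the paper's code.
    import subprocess
    import sys

    for line in sys.stdin:
        path = line.strip()
        if not path:
            continue
        # Hypothetical analysis executable that prints the number of selected events.
        result = subprocess.run(["./run_root_analysis", path],
                                capture_output=True, text=True, check=True)
        print(f"events\t{result.stdout.strip()}")   # reducer sums these counts
    ```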

  2. Mescalero Apache Tribe Monitored Retrievable Storage (MRS). Phase 1 feasibility study report

    Energy Technology Data Exchange (ETDEWEB)

    Peso, F.

    1992-03-13

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS; and (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, if any, under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings.

  3. Managing Variant Calling Files the Big Data Way: Using HDFS and Apache Parquet

    NARCIS (Netherlands)

    Boufea, Aikaterini; Finkers, H.J.; Kaauwen, van M.P.W.; Kramer, M.R.; Athanasiadis, I.N.

    2017-01-01

    Big Data has been seen as a remedy for the efficient management of ever-increasing genomic data. In this paper, we investigate the use of Apache Spark to store and process Variant Calling Files (VCF) on a Hadoop cluster. We demonstrate Tomatula, a software tool for converting VCF files to Apache Parquet.
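
    As a hedged illustration of the conversion idea (this is not the authors' Tomatula code), the sketch below parses the tab-separated body of a VCF file with PySpark and writes it back out as Apache Parquet; the file names and the simplified column handling are assumptions.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("vcf2parquet").getOrCreate()
        raw = spark.read.text("sample.vcf")
        body = raw.filter(~raw.value.startswith("##"))   # drop meta-information lines
        rows = body.rdd.map(lambda r: r.value.split("\t"))
        header = rows.first()                            # the "#CHROM POS ID REF ALT ..." line
        cols = [c.lstrip("#") for c in header]
        df = rows.filter(lambda fields: fields != header).toDF(cols)
        df.write.mode("overwrite").parquet("sample.parquet")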

  4. 75 FR 68607 - BP Canada Energy Marketing Corp. Apache Corporation; Notice for Temporary Waivers

    Science.gov (United States)

    2010-11-08

    ... Energy Marketing Corp. Apache Corporation; Notice for Temporary Waivers November 1, 2010. Take notice that on October 29, 2010, BP Canada Energy Marketing Corp. and Apache Corporation filed with the... assistance with any FERC Online service, please e-mail [email protected] , or call (866) 208-3676...

  5. Spinal Pain and Occupational Disability: A Cohort Study of British Apache AH Mk1 Pilots

    Science.gov (United States)

    2013-09-01


  6. Biology and distribution of Lutzomyia apache as it relates to VSV

    Science.gov (United States)

    Phlebotomine sand flies are vectors of bacteria, parasites, and viruses. Lutzomyia apache was incriminated as a vector of vesicular stomatitis viruses(VSV)due to overlapping ranges of the sand fly and outbreaks of VSV. I report on newly discovered populations of L. apache in Wyoming from Albany and ...

  7. CMS Analysis and Data Reduction with Apache Spark

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Oliver [Fermilab; Canali, Luca [CERN; Cremer, Illia [Magnetic Corp., Waltham; Cremonesi, Matteo [Fermilab; Elmer, Peter [Princeton U.; Fisk, Ian [Flatiron Inst., New York; Girone, Maria [CERN; Jayatilaka, Bo [Fermilab; Kowalkowski, Jim [Fermilab; Khristenko, Viktor [CERN; Motesnitsalis, Evangelos [CERN; Pivarski, Jim [Princeton U.; Sehrish, Saba [Fermilab; Surdy, Kacper [CERN; Svyatkovskiy, Alexey [Princeton U.

    2017-10-31

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open-source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at the analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and released under open-source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for end-users. In this talk, we present studies of using Apache Spark for end-user data analysis. We study the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets, and the end analysis up to the publication plot. For the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility, whose goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We present the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. For the second thrust, we present studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability, and performance to the ROOT-based analysis.
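
    A hedged sketch of the first thrust described above, reducing a large centrally produced dataset to a small analysis ntuple with PySpark; the input path, column names, and selection cut are purely illustrative.

        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("reduction").getOrCreate()
        events = spark.read.parquet("/data/experiment/events.parquet")
        ntuple = (events
                  .where(F.col("nMuons") >= 2)      # event selection cut
                  .select("run", "lumi", "event", "muon_pt", "muon_eta"))
        ntuple.write.mode("overwrite").parquet("/data/experiment/ntuple.parquet")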

  8. Representation without Taxation: Citizenship and Suffrage in Indian Country.

    Science.gov (United States)

    Phelps, Glenn A.

    1985-01-01

    Reviews history of Arizona Indian voting rights. Details current dispute over voting rights in Apache County (Arizona). Explores three unanswered questions in light of current constitutional interpretation. Stresses solution to political disputes will require climate of mutual trust, awareness of constitutional rights/obligations of all concerned,…

  9. Prediction of heart disease using apache spark analysing decision trees and gradient boosting algorithm

    Science.gov (United States)

    Chugh, Saryu; Arivu Selvan, K.; Nadesh, RK

    2017-11-01

    Numerous harmful factors influence the working of the human body, such as hypertension, smoking, obesity and inappropriate medication, and these cause many different diseases such as diabetes, thyroid disorders, strokes and coronary disease. Environmental conditions also contribute to the incidence of coronary disease. Apache Spark is built for workloads that require gathering large amounts of data, and it is well suited to data-intensive analysis because it is fast, using built-in in-memory processing. Apache Spark runs in a distributed environment and splits the data into batches, giving a high throughput rate. The use of mining techniques in the diagnosis of coronary disease has been exhaustively examined, showing acceptable levels of precision. Decision trees, neural networks and the gradient boosting algorithm are among the Apache Spark techniques that help in analysing the collected information.
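
    A minimal sketch of the kind of Spark MLlib pipeline the abstract alludes to, training a gradient-boosted-tree classifier on tabular heart-disease records; the file name, feature columns, and label column are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.classification import GBTClassifier

        spark = SparkSession.builder.appName("heart").getOrCreate()
        df = spark.read.csv("heart.csv", header=True, inferSchema=True)
        features = ["age", "restingBP", "cholesterol", "maxHeartRate"]
        data = VectorAssembler(inputCols=features, outputCol="features").transform(df)
        train, test = data.randomSplit([0.8, 0.2], seed=42)
        model = GBTClassifier(labelCol="target", featuresCol="features").fit(train)
        hits = model.transform(test).where("prediction = target").count()
        print(f"holdout accuracy: {hits / test.count():.2f}")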

  10. Analyzing large data sets from XGC1 magnetic fusion simulations using apache spark

    Energy Technology Data Exchange (ETDEWEB)

    Churchill, R. Michael [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    2016-11-21

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
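
    A minimal sketch of the clustering step described above, using the k-means implementation in Spark MLlib; the Parquet path, the velocity-space column names, and the choice of k are illustrative assumptions rather than details from the report.

        from pyspark.sql import SparkSession
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.clustering import KMeans

        spark = SparkSession.builder.appName("xgc1-kmeans").getOrCreate()
        df = spark.read.parquet("/data/xgc1/distribution.parquet")
        vec = VectorAssembler(inputCols=["v_para", "v_perp"], outputCol="features")
        model = KMeans(k=8, seed=1).fit(vec.transform(df))
        for center in model.clusterCenters():    # one centroid per cluster
            print(center)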

  11. Aplikasi Search Engine Perpustakaan Petra Berbasis Android dengan Apache SOLR

    Directory of Open Access Journals (Sweden)

    Andreas Handojo

    2016-07-01

    Full Text Available Abstract: Education is an essential need for people who wish to improve their abilities and standard of living. Besides formal education, knowledge can also be gained through printed media and books, and the library is one of the most important facilities supporting this. Although very useful, library services can be difficult to use because the collections (books, journals, magazines, and so on) are so large that finding a particular book is hard. Therefore, besides expanding its collections, a library must keep pace with technology so that its services remain easy to use. The Petra Christian University library currently holds roughly 230,000 physical and digital items (based on 2014 data), and its catalogue of physical collections and digital documents can be accessed through the library website. The sheer size of this collection makes searching difficult for users. To extend the services offered, this research therefore builds a library search-engine application on the Apache SOLR platform with a PostgreSQL database. To further improve ease of access, the application is built for Android-based mobile devices. In addition to application testing, a questionnaire was distributed to 50 prospective users; the results indicate that the features built match user needs (78%). Keywords: SOLR, Search Engine, Library, PostgreSQL
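
    A hedged sketch of the search call such an application would issue, querying a Solr core over its standard /select endpoint; the host, core name, and field names are hypothetical.

        import requests

        resp = requests.get(
            "http://localhost:8983/solr/library/select",
            params={"q": "title:android", "rows": 10, "wt": "json"},
            timeout=5,
        )
        for doc in resp.json()["response"]["docs"]:   # standard Solr JSON response layout
            print(doc.get("title"), doc.get("author"))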

  12. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    Science.gov (United States)

    Majewski, Steven R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.; Allende Prieto, Carlos; Barkhouser, Robert; Bizyaev, Dmitry; Blank, Basil; Brunner, Sophia; Burton, Adam; Carrera, Ricardo; Chojnowski, S. Drew; Cunha, Kátia; Epstein, Courtney; Fitzgerald, Greg; García Pérez, Ana E.; Hearty, Fred R.; Henderson, Chuck; Holtzman, Jon A.; Johnson, Jennifer A.; Lam, Charles R.; Lawler, James E.; Maseman, Paul; Mészáros, Szabolcs; Nelson, Matthew; Nguyen, Duy Coung; Nidever, David L.; Pinsonneault, Marc; Shetrone, Matthew; Smee, Stephen; Smith, Verne V.; Stolberg, Todd; Skrutskie, Michael F.; Walker, Eric; Wilson, John C.; Zasowski, Gail; Anders, Friedrich; Basu, Sarbani; Beland, Stephane; Blanton, Michael R.; Bovy, Jo; Brownstein, Joel R.; Carlberg, Joleen; Chaplin, William; Chiappini, Cristina; Eisenstein, Daniel J.; Elsworth, Yvonne; Feuillet, Diane; Fleming, Scott W.; Galbraith-Frew, Jessica; García, Rafael A.; García-Hernández, D. Aníbal; Gillespie, Bruce A.; Girardi, Léo; Gunn, James E.; Hasselquist, Sten; Hayden, Michael R.; Hekker, Saskia; Ivans, Inese; Kinemuchi, Karen; Klaene, Mark; Mahadevan, Suvrath; Mathur, Savita; Mosser, Benoît; Muna, Demitri; Munn, Jeffrey A.; Nichol, Robert C.; O'Connell, Robert W.; Parejko, John K.; Robin, A. C.; Rocha-Pinto, Helio; Schultheis, Matthias; Serenelli, Aldo M.; Shane, Neville; Silva Aguirre, Victor; Sobeck, Jennifer S.; Thompson, Benjamin; Troup, Nicholas W.; Weinberg, David H.; Zamora, Olga

    2017-09-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ˜ 22,500), high signal-to-noise ratio (>100), infrared (1.51-1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  13. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    International Nuclear Information System (INIS)

    Majewski, Steven R.; Brunner, Sophia; Burton, Adam; Chojnowski, S. Drew; Pérez, Ana E. García; Hearty, Fred R.; Lam, Charles R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.; Prieto, Carlos Allende; Carrera, Ricardo; Barkhouser, Robert; Bizyaev, Dmitry; Blank, Basil; Henderson, Chuck; Cunha, Kátia; Epstein, Courtney; Johnson, Jennifer A.; Fitzgerald, Greg; Holtzman, Jon A.

    2017-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ∼ 22,500), high signal-to-noise ratio (>100), infrared (1.51–1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  14. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    Energy Technology Data Exchange (ETDEWEB)

    Majewski, Steven R.; Brunner, Sophia; Burton, Adam; Chojnowski, S. Drew; Pérez, Ana E. García; Hearty, Fred R.; Lam, Charles R. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Frinchaboy, Peter M. [Department of Physics and Astronomy, Texas Christian University, Fort Worth, TX 76129 (United States); Prieto, Carlos Allende; Carrera, Ricardo [Instituto de Astrofísica de Canarias, E-38200 La Laguna, Tenerife (Spain); Barkhouser, Robert [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, Sunspot, NM, 88349-0059 (United States); Blank, Basil; Henderson, Chuck [Pulse Ray Machining and Design, 4583 State Route 414, Beaver Dams, NY 14812 (United States); Cunha, Kátia [Observatório Nacional, Rio de Janeiro, RJ 20921-400 (Brazil); Epstein, Courtney; Johnson, Jennifer A. [The Ohio State University, Columbus, OH 43210 (United States); Fitzgerald, Greg [New England Optical Systems, 237 Cedar Hill Street, Marlborough, MA 01752 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); and others

    2017-09-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ∼ 22,500), high signal-to-noise ratio (>100), infrared (1.51–1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  15. Outcrop Gamma-ray Analysis of the Cretaceous mesaverde Group: Jicarilla Apache Indian Reservation, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie; Dunbar, Robyn Wright

    2001-04-25

    This report presents the results of an outcrop gamma-ray survey of six selected measured sections included in the original report. The primary objective of this second study is to provide a baseline to correlate from the outcrop and reservoir model into Mesaverde strata in the San Juan Basin subsurface. Outcrop logs were generated using a GAD-6 gamma-ray spectrometer that simultaneously recorded total counts, potassium, uranium, and thorium data.

  16. 76 FR 72969 - Proclaiming Certain Lands as Reservation for the Fort Sill Apache Indian Tribe

    Science.gov (United States)

    2011-11-28

    ... Affairs, Division of Real Estate Services, Mail Stop-4639-MIB, 1849 C Street NW., Washington, DC 20240...[deg]47'40'', a chord which bears S. 73[deg]39'52'' W., 445.63 feet through an arc length of 451.77... feet, through an arc length of 764.78 feet to I-10 P.C. marker 45+11.53; Thence N. 82[deg]45'27'' W., a...

  17. An External Independent Validation of APACHE IV in a Malaysian Intensive Care Unit.

    Science.gov (United States)

    Wong, Rowena S Y; Ismail, Noor Azina; Tan, Cheng Cheng

    2015-04-01

    Intensive care unit (ICU) prognostic models are predominantly used in more developed regions such as the United States, Europe and Australia. They are less popular in Southeast Asian countries due to cost and technology considerations. The purpose of this study is to evaluate the suitability of the acute physiology and chronic health evaluation (APACHE) IV model in a single-centre Malaysian ICU. A prospective study was conducted at the single-centre ICU in Hospital Sultanah Aminah (HSA), Malaysia. External validation of APACHE IV involved a cohort of 916 patients who were admitted in 2009. Model performance was assessed through its calibration and discrimination abilities. A first-level customisation using a logistic regression approach was also applied to improve model calibration. APACHE IV exhibited good discrimination, with an area under the receiver operating characteristic (ROC) curve of 0.78. However, the model's overall fit was observed to be poor, as indicated by the Hosmer-Lemeshow goodness-of-fit test (Ĉ = 113, P < 0.001). After first-level customisation, calibration improved while discrimination was not affected. APACHE IV is not suitable for application in the HSA ICU without further customisation. The model's lack of fit in the Malaysian study is attributed to differences in baseline characteristics between the HSA ICU and APACHE IV datasets. Other possible factors could be differences in clinical practice and in the quality and services of the health care systems of Malaysia and the United States.

  18. Carlos Romero

    Directory of Open Access Journals (Sweden)

    2008-05-01

    Full Text Available Interview (in Spanish). Presentation: Carlos Romero, a political scientist, is a professor and researcher at the Institute of Political Studies of the Faculty of Legal and Political Sciences of the Universidad Central de Venezuela, where he has served as coordinator of the doctoral programme and as deputy director and director of the Center for Postgraduate Studies. He has published eight books on political analysis and international relations, one of the most recent being Jugando con el globo. La política exter...

  19. Jicarilla Apache Utility Authority. Strategic Plan for Energy Efficiency and Renewable Energy Development

    International Nuclear Information System (INIS)

    Rabago, K.R.

    2008-01-01

    The purpose of this Strategic Plan Report is to provide an introduction and in-depth analysis of the issues and opportunities, resources, and technologies of energy efficiency and renewable energy that have potential beneficial application for the people of the Jicarilla Apache Nation and surrounding communities. The Report seeks to draw on the best available information that existed at the time of writing, and where necessary, draws on new research to assess this potential. This study provides a strategic assessment of opportunities for maximizing the potential for electrical energy efficiency and renewable energy development by the Jicarilla Apache Nation. The report analyzes electricity use on the Jicarilla Apache Reservation in buildings. The report also assesses particular resources and technologies in detail, including energy efficiency, solar, wind, geothermal, biomass, and small hydropower. The closing sections set out the elements of a multi-year, multi-phase strategy for development of resources to the maximum benefit of the Nation

  20. Jicarilla Apache Utility Authority Renewable Energy and Energy Efficiency Strategic Planning

    Energy Technology Data Exchange (ETDEWEB)

    Rabago, K.R.

    2008-06-28

    The purpose of this Strategic Plan Report is to provide an introduction and in-depth analysis of the issues and opportunities, resources, and technologies of energy efficiency and renewable energy that have potential beneficial application for the people of the Jicarilla Apache Nation and surrounding communities. The Report seeks to draw on the best available information that existed at the time of writing, and where necessary, draws on new research to assess this potential. This study provides a strategic assessment of opportunities for maximizing the potential for electrical energy efficiency and renewable energy development by the Jicarilla Apache Nation. The report analyzes electricity use on the Jicarilla Apache Reservation in buildings. The report also assesses particular resources and technologies in detail, including energy efficiency, solar, wind, geothermal, biomass, and small hydropower. The closing sections set out the elements of a multi-year, multi-phase strategy for development of resources to the maximum benefit of the Nation.

  1. Kelayakan Raspberry Pi sebagai Web Server: Perbandingan Kinerja Nginx, Apache, dan Lighttpd pada Platform Raspberry Pi

    Directory of Open Access Journals (Sweden)

    Rahmad Dawood

    2014-04-01

    Full Text Available Raspberry Pi is a small-sized computer, but it can function like an ordinary computer. Because it can function like a regular PC, it is also possible to run a web server application on the Raspberry Pi. This paper reports results from testing the feasibility and performance of running a web server on the Raspberry Pi. The test was conducted on the current top three most popular web servers: Apache, Nginx, and Lighttpd. The parameters used to evaluate their feasibility and performance were the maximum request rate and the reply time. The results show that it is feasible to run all three web servers on the Raspberry Pi, but Nginx gave the best performance, followed by Lighttpd and Apache. Keywords: Raspberry Pi, web server, Apache, Lighttpd, Nginx, web server performance
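
    A small, hedged sketch of the kind of load measurement the study describes, timing sequential GET requests against a web server running on the Raspberry Pi; the host name and request count are illustrative.

        import time
        import requests

        URL = "http://raspberrypi.local/"
        N = 500

        start = time.time()
        ok = sum(1 for _ in range(N) if requests.get(URL, timeout=5).status_code == 200)
        elapsed = time.time() - start
        print(f"{ok}/{N} replies, {N / elapsed:.1f} requests/s, "
              f"mean reply time {1000 * elapsed / N:.1f} ms")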

  2. Growth and survival of Apache Trout under static and fluctuating temperature regimes

    Science.gov (United States)

    Recsetar, Matthew S.; Bonar, Scott A.; Feuerbacher, Olin

    2014-01-01

    Increasing stream temperatures have important implications for arid-region fishes. Little is known about effects of high water temperatures that fluctuate over extended periods on Apache Trout Oncorhynchus gilae apache, a federally threatened species of southwestern USA streams. We compared survival and growth of juvenile Apache Trout held for 30 d in static temperatures (16, 19, 22, 25, and 28°C) and fluctuating diel temperatures (±3°C from 16, 19, 22 and 25°C midpoints and ±6°C from 19°C and 22°C midpoints). Lethal temperature for 50% (LT50) of the Apache Trout under static temperatures (mean [SD] = 22.8 [0.6]°C) was similar to that of ±3°C diel temperature fluctuations (23.1 [0.1]°C). Mean LT50 for the midpoint of the ±6°C fluctuations could not be calculated because survival in the two treatments (19 ± 6°C and 22 ± 6°C) was not below 50%; however, it probably was also between 22°C and 25°C because the upper limb of a ±6°C fluctuation on a 25°C midpoint is above critical thermal maximum for Apache Trout (28.5–30.4°C). Growth decreased as temperatures approached the LT50. Apache Trout can survive short-term exposure to water temperatures with daily maxima that remain below 25°C and midpoint diel temperatures below 22°C. However, median summer stream temperatures must remain below 19°C for best growth and even lower if daily fluctuations are high (≥12°C).

  3. Use of APACHE II and SAPS II to predict mortality for hemorrhagic and ischemic stroke patients.

    Science.gov (United States)

    Moon, Byeong Hoo; Park, Sang Kyu; Jang, Dong Kyu; Jang, Kyoung Sool; Kim, Jong Tae; Han, Yong Min

    2015-01-01

    We studied the applicability of the Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiology Score II (SAPS II) in patients admitted to the intensive care unit (ICU) with acute stroke and compared the results with the Glasgow Coma Scale (GCS) and National Institutes of Health Stroke Scale (NIHSS). We also conducted a comparative study of accuracy for predicting hemorrhagic and ischemic stroke mortality. Between January 2011 and December 2012, ischemic or hemorrhagic stroke patients admitted to the ICU were included in the study. APACHE II and SAPS II-predicted mortalities were compared using a calibration curve, the Hosmer-Lemeshow goodness-of-fit test, and the receiver operating characteristic (ROC) curve, and the results were compared with the GCS and NIHSS. Overall 498 patients were included in this study. The observed mortality was 26.3%, whereas APACHE II and SAPS II-predicted mortalities were 35.12% and 35.34%, respectively. The mean GCS and NIHSS scores were 9.43 and 21.63, respectively. The calibration curve was close to the line of perfect prediction. The ROC curve showed a slightly better prediction of mortality for APACHE II in hemorrhagic stroke patients and SAPS II in ischemic stroke patients. The GCS and NIHSS were inferior in predicting mortality in both patient groups. Although both the APACHE II and SAPS II systems can be used to measure performance in the neurosurgical ICU setting, the accuracy of APACHE II in hemorrhagic stroke patients and SAPS II in ischemic stroke patients was superior. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. APACHE II as an indicator of ventilator-associated pneumonia (VAP).

    Directory of Open Access Journals (Sweden)

    Kelser de Souza Kock

    2015-01-01

    Full Text Available Background and objectives: strategies for risk stratification in severe pathologies are extremely important. The aim of this study was to analyze the accuracy of the APACHE II score as an indicator of Ventilator-Associated Pneumonia (VAP) in ICU patients at Hospital Nossa Senhora da Conceição (HNSC), Tubarão-SC. Methods: a prospective cohort study was conducted with 120 patients admitted between March and August 2013, with APACHE II computed in the first 24 hours of mechanical ventilation (MV). Patients were followed until one of the following outcomes: discharge or death. The cause of ICU admission, age, gender, days of mechanical ventilation, length of ICU stay and outcome were also analyzed. Results: the incidence of VAP was 31.8% (38/120). Two variables carried a relative risk for the development of VAP: APACHE II above average (RR = 1.62; 95% CI 1.03-2.55) and male gender (RR = 1.56; 95% CI 1.18-2.08). Duration of mechanical ventilation above average (18.4 ± 14.9 days; p = 0.001) and length of ICU stay above average (20.4 ± 15.3 days; p = 0.003) were associated with the development of VAP. For an APACHE II score > 23, the accuracy in predicting VAP showed a sensitivity of 84% and a specificity of 33%. In relation to death, two variables carried a relative risk: age above average (RR = 2.08; 95% CI 1.34-3.23) and ICU stay above average (RR = 2.05; 95% CI 1.28-3.28). Conclusion: an APACHE II score of 23 or above may indicate a risk of VAP. Keywords: Pneumonia, Ventilator-Associated; Intensive Care Units; APACHE; Prognosis

  5. One hundred years of instrumental phonetic fieldwork on North America Indian languages

    Science.gov (United States)

    McDonough, Joyce

    2005-04-01

    A resurgence of interest in phonetic fieldwork on the generally morphologically complex North American Indian languages over the last 15 years continues a tradition started a century ago by Pliny Earle Goddard, who collected kymographic and palatographic field data between 1906 and 1927 on several Athabaskan languages: Coastal Athabaskan (Hupa and Kato), Apachean (Mescalero, Jicarilla, White Mountain, San Carlos Apache), and several Athabaskan languages in Northern Canada (Cold Lake and Beaver); data that remain important for their record of segmental timing profiles and rare articulatory documentation in then largely monolingual communities. These data, in combination with new work, have resulted in the emergence of a body of knowledge of these typologically distinct families that often challenges notions of phonetic universality and typology. Using the Athabaskan languages as a benchmark example and starting with Goddard's work, two types of emergent typological patterns will be discussed: the persistence of fine-grained timing and duration details across the widely dispersed family, and the broad variation in prosodic types that exists, both of which are unaccounted for by phonetic or phonological theories.

  6. 77 FR 18997 - Rim Lakes Forest Restoration Project; Apache-Sitgreaves National Forest, Black Mesa Ranger...

    Science.gov (United States)

    2012-03-29

    ... DEPARTMENT OF AGRICULTURE Forest Service Rim Lakes Forest Restoration Project; Apache-Sitgreaves National Forest, Black Mesa Ranger District, Coconino County, AZ AGENCY: Forest Service, USDA. ACTION: Notice of intent to prepare an environmental impact statement. SUMMARY: The U.S. Forest Service (FS) will...

  7. Lutzomyia (Helcocyrtomyia) Apache Young and Perkins (Diptera: Psychodidae) feeds on reptiles

    Science.gov (United States)

    Phlebotomine sand flies are vectors of bacteria, parasites, and viruses. In the western USA a sand fly, Lutzomyia apache Young and Perkins, was initially associated with epizootics of vesicular stomatitis virus (VSV), because sand flies were trapped at sites of an outbreak. Additional studies indica...

  8. Predictive value of SAPS II and APACHE II scoring systems for patient outcome in a medical intensive care unit

    Directory of Open Access Journals (Sweden)

    Amina Godinjak

    2016-11-01

    Full Text Available Objective. The aim is to determine SAPS II and APACHE II scores in medical intensive care unit (MICU) patients, to compare them for prediction of patient outcome, and to compare them with actual hospital mortality rates for different subgroups of patients. Methods. One hundred and seventy-four patients were included in this analysis over a one-year period in the MICU, Clinical Center, University of Sarajevo. The following patient data were obtained: demographics, admission diagnosis, SAPS II, APACHE II scores and final outcome. Results. Out of 174 patients, 70 patients (40.2%) died. Mean SAPS II and APACHE II scores in all patients were 48.4±17.0 and 21.6±10.3 respectively, and they were significantly different between survivors and non-survivors. SAPS II >50.5 and APACHE II >27.5 can predict the risk of mortality in these patients. There was no statistically significant difference in the clinical values of SAPS II vs APACHE II (p=0.501). A statistically significant positive correlation was established between the values of SAPS II and APACHE II (r=0.708; p=0.001). Patients with an admission diagnosis of sepsis/septic shock had the highest values of both SAPS II and APACHE II scores, as well as the highest hospital mortality rate, of 55.1%. Conclusion. Both APACHE II and SAPS II had an excellent ability to discriminate between survivors and non-survivors. There was no significant difference in their clinical values. A positive correlation was established between them. Sepsis/septic shock patients had the highest predicted and observed hospital mortality rates.
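
    A toy sketch of the discrimination measure used throughout these validation studies, the area under the ROC curve for score-predicted mortality against the observed outcome; the scores and outcomes below are synthetic placeholders, not study data.

        from sklearn.metrics import roc_auc_score

        apache_ii = [12, 30, 25, 8, 19, 33, 15, 28]   # admission scores
        died = [0, 1, 1, 0, 0, 1, 0, 1]               # observed hospital outcome
        print("AUROC:", roc_auc_score(died, apache_ii))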

  9. Evaluation of APACHE II system among intensive care patients at a teaching hospital

    Directory of Open Access Journals (Sweden)

    Paulo Antonio Chiavone

    Full Text Available CONTEXT: The high-complexity features of intensive care unit services and the clinical situation of patients themselves render correct prognosis fundamentally important not only for patients, their families and physicians, but also for hospital administrators, fund-providers and controllers. Prognostic indices have been developed for estimating hospital mortality rates for hospitalized patients, based on demographic, physiological and clinical data. OBJECTIVE: The APACHE II system was applied within an intensive care unit to evaluate its ability to predict patient outcome; to compare illness severity with outcomes for clinical and surgical patients; and to compare the recorded result with the predicted death rate. DESIGN: Diagnostic test. SETTING: Clinical and surgical intensive care unit in a tertiary-care teaching hospital. PARTICIPANTS: The study involved 521 consecutive patients admitted to the intensive care unit from July 1998 to June 1999. MAIN MEASUREMENTS: APACHE II score, in-hospital mortality, receiver operating characteristic curve, decision matrices and linear regression analysis. RESULTS: The patients' mean age was 50 ± 19 years and the mean APACHE II score was 16.7 ± 7.3. There were 166 clinical patients (32%), 173 post-elective surgery patients (33%), and 182 post-emergency surgery patients (35%), thus producing statistically similar proportions. The APACHE II scores for clinical patients (18.5 ± 7.8) were similar to those for non-elective surgery patients (18.6 ± 6.5), and both were greater than those for elective surgery patients (13.0 ± 6.3) (p < 0.05). The higher this score was, the higher the mortality rate was (p < 0.05). The predicted death rate was 25.6% and the recorded death rate was 35.5%. Through the use of receiver operating curve analysis, good discrimination was found (area under the curve = 0.80). From the 2 x 2 decision matrix, 72.2% of patients were correctly classified (sensitivity = 35.1%; specificity = 92.6%). Linear

  10. Better prognostic marker in ICU - APACHE II, SOFA or SAP II!

    Science.gov (United States)

    Naqvi, Iftikhar Haider; Mahmood, Khalid; Ziaullaha, Syed; Kashif, Syed Mohammad; Sharif, Asim

    2016-01-01

    This study was designed to determine the comparative efficacy of different scoring systems in assessing the prognosis of critically ill patients. It was a retrospective study conducted in the medical intensive care unit (MICU) and high dependency unit (HDU), Medical Unit III, Civil Hospital, from April 2012 to August 2012. All patients over 16 years of age who fulfilled the criteria for MICU admission were included. The predicted mortality of APACHE II, SAP II and SOFA was calculated. Calibration and discrimination were used to assess the validity of each scoring model. A total of 96 patients with equal gender distribution were enrolled. The average APACHE II score in non-survivors (27.97±8.53) was higher than in survivors (15.82±8.79), with a statistically significant p-value (p < 0.001), and APACHE II showed better discrimination power than SAP II and SOFA.

  11. Lot 4 AH-64E Apache Attack Helicopter Follow-on Operational Test and Evaluation Report

    Science.gov (United States)

    2014-12-01

    engine is tested to determine its Engine Torque Factor (ETF) rating. To meet contract specifications, a new engine must have an ETF of 1.0. The...published AH-64E operator's manual estimates performance based on engines with an ETF of 1.0, and pilots normally plan missions anticipating the 717...pound shortfall in hover performance at KPP conditions. The Apache Program Manager reports that new engines are delivered with an average ETF of

  12. Developer Initiation and Social Interactions in OSS: A Case Study of the Apache Software Foundation

    Science.gov (United States)

    2014-08-01


  13. 75 FR 14419 - Camp Tatiyee Land Exchange on the Lakeside Ranger District of the Apache-Sitgreaves National...

    Science.gov (United States)

    2010-03-25

    ... Ranger, Lakeside Ranger District, Apache-Sitgreaves National Forests, c/o TEC Inc., 514 Via de la Valle... to other papers serving areas affected by this proposal: Tucson Citizen, Sierra Vista Herald, Nogales...

  14. Redskins in Bluecoats: A Strategic and Cultural Analysis of General George Crooks Use of Apache Scouts in the Second Apache Campaign, 1882-1886

    Science.gov (United States)

    2010-03-31


  15. Seguridad en la configuración del servidor web Apache

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Gómez Montoya

    2013-07-01

    Full Text Available Apache is the web server with the largest presence in the world market. Although its configuration is relatively simple, hardening its security involves understanding and applying a set of general rules that are well known, accepted, and readily available. Moreover, despite being an apparently solved problem, the security of HTTP servers is a growing concern, and not all companies take it seriously. This article identifies and verifies a set of good information-security practices applied to the configuration of Apache. To achieve these objectives and to guarantee a sound process, a methodology based on the Deming quality cycle was chosen, comprising four phases: plan, do, check, and act; its application guided the development of the project. This article consists of five sections: Introduction, Frame of reference, Methodology, Results and discussion, and Conclusions.
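
    In the spirit of the checklist the article describes, the hedged sketch below probes a server for a few widely recommended hardening signals (version disclosure and common security headers); the URL is illustrative, and the directive named in the comment is the standard Apache ServerTokens setting.

        import requests

        resp = requests.get("http://localhost/", timeout=5)
        server = resp.headers.get("Server", "")
        print("Server header:", server or "(not disclosed)")
        if "/" in server:
            # e.g. "Apache/2.4.57" leaks the version string
            print("warning: version disclosed; consider 'ServerTokens Prod'")
        for h in ("X-Frame-Options", "X-Content-Type-Options", "Strict-Transport-Security"):
            print(f"{h}: {resp.headers.get(h, 'missing')}")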

  16. The customization of APACHE II for patients receiving orthotopic liver transplants

    Science.gov (United States)

    Moreno, Rui

    2002-01-01

    General outcome prediction models developed for use with large, multicenter databases of critically ill patients may not correctly estimate mortality if applied to a particular group of patients that was under-represented in the original database. The development of new diagnostic weights has been proposed as a method of adapting the general model – the Acute Physiology and Chronic Health Evaluation (APACHE) II in this case – to a new group of patients. Such customization must be empirically tested, because the original model cannot contain an appropriate set of predictive variables for the particular group. In this issue of Critical Care, Arabi and co-workers present the results of the validation of a modified model of the APACHE II system for patients receiving orthotopic liver transplants. The use of a highly heterogeneous database for which not all important variables were taken into account and of a sample too small to use the Hosmer–Lemeshow goodness-of-fit test appropriately makes their conclusions uncertain. PMID:12133174

  17. SIDELOADING – INGESTION OF LARGE POINT CLOUDS INTO THE APACHE SPARK BIG DATA ENGINE

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2016-06-01

    Full Text Available In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
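
    A hedged sketch of the sideloading idea described above: whole point-cloud files are distributed to executors with binaryFiles and each file is parsed locally by a single-CPU format library (laspy here stands in for any such library, and the HDFS path is illustrative).

        import io

        import laspy   # assumed installed on every worker node
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("sideload").getOrCreate()

        def parse(path_and_bytes):
            # parse one whole LAS file on the executor that received it
            _path, data = path_and_bytes
            las = laspy.read(io.BytesIO(data))
            return [(float(x), float(y), float(z)) for x, y, z in zip(las.x, las.y, las.z)]

        points = (spark.sparkContext
                  .binaryFiles("hdfs:///pointclouds/*.las")
                  .flatMap(parse))
        print(points.count())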

  18. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.

  19. Resonance – Journal of Science Education | Indian Academy of ...

    Indian Academy of Sciences (India)

    Markov Chain Monte Carlo Methods - Simple Monte Carlo. K B Athreya, Mohan Delampady, T Krishnan. Resonance - Journal of Science Education, Volume 8, Issue 4. School of ORIE, Rhodes Hall, Cornell University, Ithaca, New York 14853, USA; Indian Statistical Institute, 8th Mile, Mysore Road ...

  20. Exploring Monte Carlo methods

    CERN Document Server

    Dunn, William L

    2012-01-01

    Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble

  1. Monte Carlo methods

    Directory of Open Access Journals (Sweden)

    Bardenet Rémi

    2013-07-01

    Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow one to compute these integrals numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which are rejection sampling, importance sampling and Monte Carlo Markov chain (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate both. We discuss the application of Monte Carlo in experimental physics, and point to landmarks in the literature for the curious reader.
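
    A self-contained toy example of the plain Monte Carlo idea reviewed above, estimating the expectation E[f(X)] for X ~ N(0,1) by averaging over random draws; the test function is arbitrary.

        import math
        import random

        def f(x):
            return math.exp(-abs(x))   # any integrable test function

        N = 100_000
        estimate = sum(f(random.gauss(0.0, 1.0)) for _ in range(N)) / N
        print(f"Monte Carlo estimate of E[f(X)]: {estimate:.4f}")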

  2. An Indian tribal view of the back end of the nuclear fuel cycle: historical and cultural lessons

    International Nuclear Information System (INIS)

    Tano, M.L.; Powankee, D.; Lester, A.D.

    1995-01-01

    The Nez Perce Tribe, the Confederated Tribes of the Umatilla Indian Reservation and the Yakama Indian Nation have entered into cooperative agreements with the US Department of Energy to oversee the cleanup of the Hanford Reservation. The Mescalero Apache Tribe and the Meadow Lake Tribal Council have come under severe criticism from some ''ideologically pure'' Indians and non-Indians for aiding and abetting the violation of Mother Earth by permitting the land to be contaminated by radioactive wastes. This paper suggests that this view of the Indian relationship to nature and the environment is too narrow, and it describes aspects of Indian religion that support tribal involvement in radioactive waste management. (O.M.)

  3. Constructing Flexible, Configurable, ETL Pipelines for the Analysis of "Big Data" with Apache OODT

    Science.gov (United States)

    Hart, A. F.; Mattmann, C. A.; Ramirez, P.; Verma, R.; Zimdars, P. A.; Park, S.; Estrada, A.; Sumarlidason, A.; Gil, Y.; Ratnakar, V.; Krum, D.; Phan, T.; Meena, A.

    2013-12-01

    A plethora of open source technologies for manipulating, transforming, querying, and visualizing 'big data' have blossomed and matured in the last few years, driven in large part by recognition of the tremendous value that can be derived by leveraging data mining and visualization techniques on large data sets. One facet of many of these tools is that input data must often be prepared into a particular format (e.g.: JSON, CSV), or loaded into a particular storage technology (e.g.: HDFS) before analysis can take place. This process, commonly known as Extract-Transform-Load, or ETL, often involves multiple well-defined steps that must be executed in a particular order, and the approach taken for a particular data set is generally sensitive to the quantity and quality of the input data, as well as the structure and complexity of the desired output. When working with very large, heterogeneous, unstructured or semi-structured data sets, automating the ETL process and monitoring its progress becomes increasingly important. Apache Object Oriented Data Technology (OODT) provides a suite of complementary data management components called the Process Control System (PCS) that can be connected together to form flexible ETL pipelines as well as browser-based user interfaces for monitoring and control of ongoing operations. The lightweight, metadata driven middleware layer can be wrapped around custom ETL workflow steps, which themselves can be implemented in any language. Once configured, it facilitates communication between workflow steps and supports execution of ETL pipelines across a distributed cluster of compute resources. As participants in a DARPA-funded effort to develop open source tools for large-scale data analysis, we utilized Apache OODT to rapidly construct custom ETL pipelines for a variety of very large data sets to prepare them for analysis and visualization applications. OODT is free and open source software available through the Apache Software Foundation.
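
    The sketch below is not the OODT API, only a hedged, minimal illustration of the extract-transform-load pattern the abstract describes: ordered, well-defined steps that a workflow manager such as the PCS could wrap and supervise; the file names and fields are hypothetical.

        import csv
        import json

        def extract(path):      # step 1: read raw CSV records
            with open(path, newline="") as fh:
                return list(csv.DictReader(fh))

        def transform(rows):    # step 2: normalize field names and types
            return [{"station": r["station"], "temp_c": float(r["temp"])} for r in rows]

        def load(rows, path):   # step 3: emit analysis-ready JSON
            with open(path, "w") as fh:
                json.dump(rows, fh)

        load(transform(extract("raw.csv")), "clean.json")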

  4. Monte Carlo: Basics

    OpenAIRE

    Murthy, K. P. N.

    2001-01-01

    An introduction to the basics of Monte Carlo is given. The topics covered include sample space, events, probabilities, random variables, mean, variance, covariance, characteristic function, Chebyshev inequality, law of large numbers, central limit theorem (stable distribution, Levy distribution), random numbers (generation and testing), random sampling techniques (inversion, rejection, sampling from a Gaussian, Metropolis sampling), analogue Monte Carlo and importance sampling (exponential b...

  5. The Ability of the Acute Physiology and Chronic Health Evaluation (APACHE) IV Score to Predict Mortality in a Single Tertiary Hospital

    Directory of Open Access Journals (Sweden)

    Jae Woo Choi

    2017-08-01

    Full Text Available Background The Acute Physiology and Chronic Health Evaluation (APACHE) II model has been widely used in Korea. However, there have been few studies on the APACHE IV model in Korean intensive care units (ICUs). The aim of this study was to compare the ability of APACHE IV and APACHE II in predicting hospital mortality, and to investigate the ability of APACHE IV as a critical care triage criterion. Methods The study was designed as a prospective cohort study. Measurements of discrimination and calibration were performed using the area under the receiver operating characteristic curve (AUROC) and the Hosmer-Lemeshow goodness-of-fit test, respectively. We also calculated the standardized mortality ratio (SMR). Results The APACHE IV score, the Charlson Comorbidity Index (CCI) score, acute respiratory distress syndrome, and unplanned ICU admissions were independently associated with hospital mortality. The calibration, discrimination, and SMR of APACHE IV were good (H = 7.67, P = 0.465; C = 3.42, P = 0.905; AUROC = 0.759; SMR = 1.00). However, the explanatory power of an APACHE IV score >93 alone on hospital mortality was low, at 44.1%. The explanatory power increased to 53.8% when hospital mortality was predicted using a model that jointly considered an APACHE IV score >93, medical admission, and CCI >3 as risk factors. However, the discriminative ability of the prediction model was unsatisfactory (C index <0.70). Conclusions APACHE IV presented good discrimination, calibration, and SMR for hospital mortality.

  6. Outcrop Analysis of the Cretaceous Mesaverde Group: Jicarilla Apache Reservation, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie; Dunbar, Robin Wright

    2001-04-24

    Field work for this project was conducted during July and April 1998, at which time fourteen measured sections were described and correlated on or adjacent to Jicarilla Apache Reservation lands. A fifteenth section, described east of the main field area, is included in this report, although its distant location precluded use in the correlations and cross sections presented herein. Ground-based photo mosaics were shot for much of the exposed Mesaverde outcrop belt and were used to assist in correlation. Outcrop gamma-ray surveys were conducted at six of the fifteen measured sections using a GAD-6 scintillometer. The raw gamma-ray data are included in this report; however, analysis of those data is part of the ongoing Phase Two of this project.

  7. Extraction of UMLS® Concepts Using Apache cTAKES™ for German Language.

    Science.gov (United States)

    Becker, Matthias; Böckmann, Britta

    2016-01-01

    Automatic extraction of medical concepts from medical reports, and their classification with semantic standards, is useful for standardization and for clinical research. This paper presents an approach for UMLS concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objectives are to test whether the natural language processing tool is suitable for German, to identify UMLS concepts, and to map these to SNOMED-CT. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so the pipeline can normalize to domain ontologies such as SNOMED-CT using the German concepts. For testing, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms were tested with a set of 199 German reports, obtaining a result of 0.36 average F1 measure without German stemming and pre- and post-processing of the reports.

  8. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still only a small number of studies analyzing larger samples of projects and investigating the structure of activities among OSS developers. The significant amount of information that has been gathered in the publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.

  9. Are cicadas (Diceroprocta apache) both a "keystone" and a "critical-link" species in lower Colorado River riparian communities?

    Science.gov (United States)

    Andersen, Douglas C.

    1994-01-01

    Apache cicada (Homoptera: Cicadidae: Diceroprocta apache Davis) densities were estimated to be 10 individuals/m2 within a closed-canopy stand of Fremont cottonwood (Populus fremontii) and Goodding willow (Salix gooddingii) in a revegetated site adjacent to the Colorado River near Parker, Arizona. Coupled with data drawn from the literature, I estimate that up to 1.3 cm (13 l/m2) of water may be added to the upper soil layers annually through the feeding activities of cicada nymphs. This is equivalent to 12% of the annual precipitation received in the study area. Apache cicadas may have significant effects on ecosystem functioning via effects on water transport and thus act as a critical-link species in this southwest desert riverine ecosystem. Cicadas emerged later within the cottonwood-willow stand than in relatively open saltcedar-mesquite stands; this difference in temporal dynamics would affect their availability to several insectivorous bird species and may help explain the birds' recent declines. Resource managers in this region should be sensitive to the multiple and strong effects that Apache cicadas may have on ecosystem structure and functioning.

  10. Overview of the SDSS-IV MaNGA Survey: Mapping nearby Galaxies at Apache Point Observatory

    NARCIS (Netherlands)

    Bundy, Kevin; Bershady, Matthew A.; Law, David R.; Yan, Renbin; Drory, Niv; MacDonald, Nicholas; Wake, David A.; Cherinka, Brian; Sánchez-Gallego, José R.; Weijmans, Anne-Marie; Thomas, Daniel; Tremonti, Christy; Masters, Karen; Coccato, Lodovico; Diamond-Stanic, Aleksandar M.; Aragón-Salamanca, Alfonso; Avila-Reese, Vladimir; Badenes, Carles; Falcón-Barroso, Jésus; Belfiore, Francesco; Bizyaev, Dmitry; Blanc, Guillermo A.; Bland-Hawthorn, Joss; Blanton, Michael R.; Brownstein, Joel R.; Byler, Nell; Cappellari, Michele; Conroy, Charlie; Dutton, Aaron A.; Emsellem, Eric; Etherington, James; Frinchaboy, Peter M.; Fu, Hai; Gunn, James E.; Harding, Paul; Johnston, Evelyn J.; Kauffmann, Guinevere; Kinemuchi, Karen; Klaene, Mark A.; Knapen, Johan H.; Leauthaud, Alexie; Li, Cheng; Lin, Lihwai; Maiolino, Roberto; Malanushenko, Viktor; Malanushenko, Elena; Mao, Shude; Maraston, Claudia; McDermid, Richard M.; Merrifield, Michael R.; Nichol, Robert C.; Oravetz, Daniel; Pan, Kaike; Parejko, John K.; Sanchez, Sebastian F.; Schlegel, David; Simmons, Audrey; Steele, Oliver; Steinmetz, Matthias; Thanjavur, Karun; Thompson, Benjamin A.; Tinker, Jeremy L.; van den Bosch, Remco C. E.; Westfall, Kyle B.; Wilkinson, David; Wright, Shelley; Xiao, Ting; Zhang, Kai

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic …

  11. D-dimer as marker for microcirculatory failure: correlation with LOD and APACHE II scores.

    Science.gov (United States)

    Angstwurm, Matthias W A; Reininger, Armin J; Spannagl, Michael

    2004-01-01

    The relevance of plasma d-dimer levels as a marker for morbidity and organ dysfunction in severely ill patients is largely unknown. In a prospective study we determined d-dimer plasma levels of 800 unselected patients at admission to our intensive care unit. In 91% of the patients' samples d-dimer levels were elevated, in some patients up to several hundredfold compared with normal values. The highest mean d-dimer values were present in the patient group with thromboembolic diseases, and particularly in non-survivors of pulmonary embolism. In patients with circulatory impairment (r=0.794) and in patients with infections (r=0.487) a statistically significant correlation was present between d-dimer levels and the APACHE II score (P<0.001). The logistic organ dysfunction score (LOD, P<0.001) correlated with d-dimer levels only in patients with circulatory impairment (r=0.474). In contrast, patients without circulatory impairment demonstrated no correlation of d-dimer levels to the APACHE II or LOD score. Taking all patients together, no correlations of d-dimer levels with single organ failure or with indicators of infection could be detected. In conclusion, d-dimer plasma levels strongly correlated with the severity of the disease and organ dysfunction in patients with circulatory impairment or infections, suggesting that elevated d-dimer levels may reflect the extent of microcirculatory failure. Thus, a therapeutic strategy to improve the microcirculation in such patients may be monitored using d-dimer plasma levels.

  12. Accuracy and Predictability of PANC-3 Scoring System over APACHE II in Acute Pancreatitis: A Prospective Study.

    Science.gov (United States)

    Rathnakar, Surag Kajoor; Vishnu, Vikram Hubbanageri; Muniyappa, Shridhar; Prasath, Arun

    2017-02-01

    Acute Pancreatitis (AP) is one of the common conditions encountered in the emergency room. The course of the disease ranges from a mild form to a severe acute form. Most episodes are mild and subside spontaneously within 3 to 5 days. In contrast, in Severe Acute Pancreatitis (SAP), which occurs in around 15-20% of all cases, mortality can range between 10% and 85% across various centres and countries. In such a situation we need an indicator that can predict the outcome of an attack, as severe or mild, as early as possible, and such an indicator should be sensitive and specific enough to be trusted. PANC-3 is such a scoring system for predicting the outcome of an attack of AP. To assess the accuracy and predictability of the PANC-3 scoring system over APACHE II in predicting severity in an attack of AP, this prospective study was conducted on 82 patients admitted with a diagnosis of pancreatitis. Investigations to evaluate PANC-3 and APACHE II were done on all patients and the PANC-3 and APACHE II scores were calculated. The PANC-3 score has a sensitivity of 82.6% and specificity of 77.9%; the test had a Positive Predictive Value (PPV) of 0.59 and a Negative Predictive Value (NPV) of 0.92. Sensitivity of APACHE II in predicting SAP was 91.3% and specificity was 96.6%, with a PPV of 0.91 and an NPV of 0.96. Our study shows that PANC-3 can predict the severity of pancreatitis as efficiently as APACHE II. The interpretation of PANC-3 does not need expertise and can be applied at the time of admission, which is an advantage over classical scoring systems.
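
The accuracy figures quoted above are standard confusion-matrix arithmetic. As a quick reference, a minimal sketch; the helper function is our own illustration, and the counts are hypothetical numbers chosen only because they reproduce the reported PANC-3 figures for an 82-patient cohort:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Confusion-matrix measures used in scoring-system studies."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts consistent with the reported PANC-3 performance
# (sensitivity ~82.6%, specificity ~77.9%, PPV 0.59, NPV 0.92).
sens, spec, ppv, npv = diagnostic_metrics(tp=19, fp=13, fn=4, tn=46)
print(f"sens={sens:.3f} spec={spec:.3f} PPV={ppv:.2f} NPV={npv:.2f}")
```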

  13. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  14. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article, Resonance – Journal of Science Education, Volume 19, Issue 8, August 2014, pp. 713-739 ...

  15. Assessment of performance and utility of mortality prediction models in a single Indian mixed tertiary intensive care unit.

    Science.gov (United States)

    Sathe, Prachee M; Bapat, Sharda N

    2014-01-01

    To assess the performance and utility of two mortality prediction models, viz. Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiology Score II (SAPS II), in a single Indian mixed tertiary intensive care unit (ICU). Secondary objectives were benchmarking and setting a baseline for research. In this observational cohort, data needed for calculation of both scores were prospectively collected for all consecutive admissions to the 28-bed ICU in the year 2011. After excluding readmissions, discharges within 24 h and age <18 years, the records of 1543 patients were analyzed using appropriate statistical methods. Both models overpredicted mortality in this cohort [standardized mortality ratio (SMR) 0.88 ± 0.05 and 0.95 ± 0.06 using APACHE II and SAPS II, respectively]. Patterns of predicted mortality had a strong association with true mortality (R² = 0.98 for APACHE II and R² = 0.99 for SAPS II). Both models performed poorly in formal Hosmer-Lemeshow goodness-of-fit testing (chi-square = 12.8 (P = 0.03) for APACHE II, chi-square = 26.6 (P = 0.001) for SAPS II) but showed good discrimination (area under the receiver operating characteristic curve 0.86 ± 0.013 SE (P < 0.001) and 0.83 ± 0.013 SE (P < 0.001) for APACHE II and SAPS II, respectively). There were wide variations in SMRs calculated for subgroups based on the International Classification of Diseases, 10th edition (standard deviation ± 0.27 for APACHE II and 0.30 for SAPS II). Lack of fit of the data to the models and the wide variation in SMRs across subgroups limit the utility of these models as tools for assessing quality of care and comparing performances of different units without customization. Considering comparable performance and simplicity of use, efforts should be made to adapt SAPS II.
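
The standardized mortality ratio used above is simply observed deaths divided by the mortality the models predict for the cohort. A minimal sketch of the computation (the toy cohort and variable names are ours, not the study's data):

```python
import math

def smr(observed_deaths, predicted_probs):
    """Standardized mortality ratio: observed deaths / expected deaths,
    where expected deaths is the sum of per-patient predicted risks."""
    expected = sum(predicted_probs)
    ratio = observed_deaths / expected
    se = math.sqrt(observed_deaths) / expected  # rough Poisson-based error
    return ratio, se

# Toy cohort: 3 deaths among 10 patients with model-predicted risks.
risks = [0.9, 0.6, 0.5, 0.4, 0.3, 0.2, 0.2, 0.1, 0.1, 0.05]
ratio, se = smr(3, risks)
print(f"SMR = {ratio:.2f} +/- {se:.2f}")  # SMR < 1: model overpredicts deaths
```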

  16. Indian Legends.

    Science.gov (United States)

    Gurnoe, Katherine J.; Skjervold, Christian, Ed.

    Presenting American Indian legends, this material provides insight into the cultural background of the Dakota, Ojibwa, and Winnebago people. Written in a straightforward manner, each of the eight legends is associated with an Indian group. The legends included here are titled as follows: Minnesota is Minabozho's Land (Ojibwa); How We Got the…

  17. Visible Wavelength Reflectance Spectra and Taxonomies of Near-Earth Objects from Apache Point Observatory

    Science.gov (United States)

    Hammergren, Mark; Brucker, Melissa J.; Nault, Kristie A.; Gyuk, Geza; Solontoi, Michael R.

    2015-11-01

    Near-Earth Objects (NEOs) are interesting to scientists and the general public for diverse reasons: their impacts pose a threat to life and property; they present important albeit biased records of the formation and evolution of the Solar System; and their materials may provide in situ resources for future space exploration and habitation. In January 2015 we began a program of NEO astrometric follow-up and physical characterization using a 17% share of time on the Astrophysical Research Consortium (ARC) 3.5-meter telescope at Apache Point Observatory (APO). Our 500 hours of annual observing time are split into frequent, short astrometric runs (see poster by K. A. Nault et al.), and half-night runs devoted to physical characterization (see poster by M. J. Brucker et al. for preliminary rotational lightcurve results). NEO surface compositions are investigated with 0.36-1.0 μm reflectance spectroscopy using the Dual Imaging Spectrograph (DIS) instrument. As of August 25, 2015, including testing runs during fourth quarter 2014, we have obtained reflectance spectra of 68 unique NEOs, ranging in diameter from approximately 5 m to 8 km. In addition to investigating the compositions of individual NEOs to inform impact hazard and space resource evaluations, we may examine the distribution of taxonomic types and potential trends with other physical and orbital properties. For example, the Yarkovsky effect, which is dependent on asteroid shape, mass, rotation, and thermal characteristics, is believed to dominate other dynamical effects in driving the delivery of small NEOs from the main asteroid belt. Studies of the taxonomic distribution of a large sample of NEOs of a wide range of sizes will test this hypothesis. We present a preliminary analysis of the reflectance spectra obtained in our survey to date, including taxonomic classifications and potential trends with size. Acknowledgements: Based on observations obtained with the Apache Point Observatory 3.5-meter telescope, which

  18. Update on Astrometric Follow-Up at Apache Point Observatory by Adler Planetarium

    Science.gov (United States)

    Nault, Kristie A.; Brucker, Melissa; Hammergren, Mark

    2016-10-01

    We began our NEO astrometric follow-up and characterization program in 2014 Q4 using about 500 hours of observing time per year with the Astrophysical Research Consortium (ARC) 3.5m telescope at Apache Point Observatory (APO). Our observing is split into 2 hour blocks approximately every other night for astrometry (this poster) and several half-nights per month for spectroscopy (see poster by M. Hammergren et al.) and light curve studies. For astrometry, we use the ARC Telescope Imaging Camera (ARCTIC) with an SDSS r filter, in 2 hour observing blocks centered around midnight. ARCTIC has a magnitude limit of V~23 in 60s, and we target 20 NEOs per session. ARCTIC has a FOV 1.57 times larger and a readout time half as long as the previous imager, SPIcam, which we used from 2014 Q4 through 2015 Q3. Targets are selected primarily from the Minor Planet Center's (MPC) NEO Confirmation Page (NEOCP) and NEA Observation Planning Aid; we also refer to JPL's What's Observable page, the Spaceguard Priority List and Faint NEOs List, and requests from other observers. To quickly adapt to changing weather and seeing conditions, we create faint, midrange, and bright target lists. Detected NEOs are measured with Astrometrica and internal software, and the astrometry is reported to the MPC. As of June 19, 2016, we have targeted 2264 NEOs, 1955 with provisional designations, 1582 of which were detected. We began observing NEOCP asteroids on January 30, 2016, and have targeted 309, 207 of which were detected. In addition, we serendipitously observed 281 moving objects, 201 of which were identified as previously known objects. This work is based on observations obtained with the Apache Point Observatory 3.5m telescope, which is owned and operated by the Astrophysical Research Consortium. We gratefully acknowledge support from NASA NEOO award NNX14AL17G and thank the University of Chicago Department of Astronomy and Astrophysics for observing time in 2014.

  19. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  20. Vectorized Monte Carlo

    International Nuclear Information System (INIS)

    Brown, F.B.

    1981-01-01

    Examination of the global algorithms and local kernels of conventional general-purpose Monte Carlo codes shows that multigroup Monte Carlo methods have sufficient structure to permit efficient vectorization. A structured multigroup Monte Carlo algorithm for vector computers is developed in which many particle events are treated at once on a cell-by-cell basis. Vectorization of kernels for tracking and variance reduction is described, and a new method for discrete sampling is developed to facilitate the vectorization of collision analysis. To demonstrate the potential of the new method, a vectorized Monte Carlo code for multigroup radiation transport analysis was developed. This code incorporates many features of conventional general-purpose production codes, including general geometry, splitting and Russian roulette, survival biasing, variance estimation via batching, a number of cutoffs, and generalized tallies of collision, tracklength, and surface crossing estimators with response functions. Predictions of vectorized performance characteristics for the CYBER-205 were made using emulated coding and a dynamic model of vector instruction timing. Computation rates were examined for a variety of test problems to determine sensitivities to batch size and vector lengths. Significant speedups are predicted for even a few hundred particles per batch, and asymptotic speedups of about 40 over equivalent Amdahl 470V/8 scalar codes are predicted for a few thousand particles per batch. The principal conclusion is that vectorization of a general-purpose multigroup Monte Carlo code is well worth the significant effort required for stylized coding and major algorithmic changes
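
The event-based idea described here — processing many particle histories at once rather than one history at a time — maps directly onto modern array programming. A minimal NumPy sketch of the batch style (our illustration, not the stylized coding the abstract refers to), estimating uncollided transmission through a slab:

```python
import numpy as np

rng = np.random.default_rng(42)

def transmission(n_particles, slab_thickness_mfp):
    """Sample every particle's distance to first collision in one
    vectorized operation; a particle is transmitted uncollided if its
    exponential free path exceeds the slab thickness (in mean free paths)."""
    paths = rng.exponential(scale=1.0, size=n_particles)
    return np.mean(paths > slab_thickness_mfp)

# Analytic answer for a 2-mfp slab is exp(-2) ~ 0.1353.
print(transmission(1_000_000, 2.0))
```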

  1. CERN honours Carlo Rubbia

    CERN Document Server

    2009-01-01

    Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium. On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia," joked CERN’s Director-General, Rolf Heuer, in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...

  2. Arizona TeleMedicine Project.

    Science.gov (United States)

    Arizona Univ., Tucson. Coll. of Medicine.

    Designed to provide health services for American Indians living on rurally isolated reservations, the Arizona TeleMedicine Project proposes to link Phoenix and Tucson medical centers, via a statewide telecommunications system, with the Hopi, San Carlos Apache, Papago, Navajo, and White Mountain Apache reservations. Advisory boards are being…

  3. The APACHE survey hardware and software design: Tools for an automatic search of small-size transiting exoplanets

    Directory of Open Access Journals (Sweden)

    Lattanzi M.G.

    2013-04-01

    Full Text Available Small-size ground-based telescopes can effectively be used to look for transiting rocky planets around nearby low-mass M stars using the photometric transit method, as recently demonstrated for example by the MEarth project. Since 2008 at the Astronomical Observatory of the Autonomous Region of Aosta Valley (OAVdA), we have been preparing for the long-term photometric survey APACHE, aimed at finding transiting small-size planets around thousands of nearby early and mid-M dwarfs. APACHE (A PAthway toward the Characterization of Habitable Earths) is designed to use an array of five dedicated and identical 40-cm Ritchey-Chretien telescopes and its observations started at the beginning of summer 2012. The main characteristics of the survey final set up and the preliminary results from the first weeks of observations will be discussed.

  4. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    International Nuclear Information System (INIS)

    Nidever, David L.; Holtzman, Jon A.; Prieto, Carlos Allende; Mészáros, Szabolcs; Beland, Stephane; Bender, Chad; Desphande, Rohit; Bizyaev, Dmitry; Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C.; Fleming, Scott W.; Muna, Demitri; Nguyen, Duy; Schiavon, Ricardo P.; Shetrone, Matthew

    2015-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s⁻¹) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  5. The Goddard Integral Field Spectrograph at Apache Point Observatory: Current Status and Progress Towards Photon Counting

    Science.gov (United States)

    McElwain, Michael W.; Grady, Carol A.; Bally, John; Brinkmann, Jonathan V.; Bubeck, James; Gong, Qian; Hilton, George M.; Ketzeback, William F.; Lindler, Don; Llop Sayson, Jorge; Malatesta, Michael A.; Norton, Timothy; Rauscher, Bernard J.; Rothe, Johannes; Straka, Lorrie; Wilkins, Ashlee N.; Wisniewski, John P.; Woodgate, Bruce E.; York, Donald G.

    2015-01-01

    We present the current status and progress towards photon counting with the Goddard Integral Field Spectrograph (GIFS), a new instrument at the Apache Point Observatory's ARC 3.5m telescope. GIFS is a visible light imager and integral field spectrograph operating from 400-1000 nm over a 2.8' x 2.8' and 14' x 14' field of view, respectively. As an IFS, GIFS obtains over 1000 spectra simultaneously and its data reduction pipeline reconstructs them into an image cube that has 32 x 32 spatial elements and more than 200 spectral channels. The IFS mode can be applied to a wide variety of science programs including exoplanet transit spectroscopy, protostellar jets, the galactic interstellar medium probed by background quasars, Lyman-alpha emission line objects, and spectral imaging of galactic winds. An electron-multiplying CCD (EMCCD) detector enables photon counting in the high spectral resolution mode to be demonstrated at the ARC 3.5m in early 2015. The EMCCD work builds upon successful operational and characterization tests that have been conducted in the IFS laboratory at NASA Goddard. GIFS sets out to demonstrate an IFS photon-counting capability on-sky in preparation for future exoplanet direct imaging missions such as the AFTA-Coronagraph, Exo-C, and ATLAST mission concepts. This work is supported by the NASA APRA program under RTOP 10-APRA10-0103.

  6. High performance Spark best practices for scaling and optimizing Apache Spark

    CERN Document Server

    Karau, Holden

    2017-01-01

    Apache Spark is amazing when everything clicks. But if you haven’t seen the performance improvements you expected, or still don’t feel confident enough to use Spark in production, this practical book is for you. Authors Holden Karau and Rachel Warren demonstrate performance optimizations to help your Spark queries run faster and handle larger data sizes, while using fewer resources. Ideal for software engineers, data engineers, developers, and system administrators working with large-scale data applications, this book describes techniques that can reduce data infrastructure costs and developer hours. Not only will you gain a more comprehensive understanding of Spark, you’ll also learn how to make it sing. With this book, you’ll explore: How Spark SQL’s new interfaces improve performance over SQL’s RDD data structure The choice between data joins in Core Spark and Spark SQL Techniques for getting the most out of standard RDD transformations How to work around performance issues i...
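
One theme the book emphasizes — preferring Spark SQL's DataFrame interfaces over raw RDD transformations so the Catalyst optimizer can help — is easy to illustrate. A hedged PySpark sketch (table contents and column names invented for the example):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("df-vs-rdd").getOrCreate()
df = spark.createDataFrame([("a", 1), ("b", 2), ("a", 3)], ["key", "value"])

# RDD-style aggregation: opaque Python lambdas the optimizer cannot inspect.
rdd_sums = df.rdd.map(lambda r: (r.key, r.value)).reduceByKey(lambda x, y: x + y)

# DataFrame-style aggregation: declarative, so Catalyst can plan it and
# Tungsten can keep the data in efficient off-heap form.
df_sums = df.groupBy("key").agg(F.sum("value").alias("total"))

print(rdd_sums.collect())
df_sums.show()
spark.stop()
```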

  7. Detection of attack-targeted scans from the Apache HTTP Server access logs

    Directory of Open Access Journals (Sweden)

    Merve Baş Seyyar

    2018-01-01

    Full Text Available A web application can be visited for different purposes: a web site may be visited by a regular user as a normal (natural) visit, viewed by crawlers, bots, spiders, etc. for indexing purposes, or exploratorily scanned by malicious users prior to an attack. An attack-targeted web scan can be viewed as a phase of a potential attack and can enable more attack detection than traditional detection methods. In this work, we propose a method to detect attack-oriented scans and to distinguish them from other types of visits. In this context, we use access log files of Apache (or IIS) web servers and try to determine attack situations through examination of past data. In addition to web scan detection, we add a rule set to detect SQL injection and XSS attacks. Our approach has been applied to sample data sets and the results have been analyzed in terms of performance measures to compare our method with other commonly used detection techniques. Furthermore, various tests have been made on log samples from real systems. Lastly, several suggestions for further development are discussed.
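
This is not the authors' detector, but the general idea is easy to sketch: parse the combined-format access log, then flag clients whose error-heavy traffic looks like blind scanning, with a crude pattern rule for SQL injection, XSS and path traversal. Thresholds, the patterns and the log path are all illustrative assumptions:

```python
import re
from collections import defaultdict

# host ident user [time] "request" status ... (Apache common/combined format)
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" (?P<status>\d{3})'
)
ATTACK_RE = re.compile(r"union\s+select|<script|\.\./", re.IGNORECASE)

def scan_report(lines, min_requests=20, error_ratio=0.5):
    stats = defaultdict(lambda: {"total": 0, "errors": 0, "attacks": 0})
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        s = stats[m["host"]]
        s["total"] += 1
        if m["status"].startswith("4"):     # 404 bursts dominate blind scans
            s["errors"] += 1
        if ATTACK_RE.search(m["request"]):  # crude SQLi/XSS/traversal rule
            s["attacks"] += 1
    return {host: s for host, s in stats.items()
            if s["attacks"] or (s["total"] >= min_requests
                                and s["errors"] / s["total"] >= error_ratio)}

with open("access.log") as f:               # illustrative path
    for host, s in scan_report(f).items():
        print(host, s)
```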

  8. The dimensionality of stellar chemical space using spectra from the Apache Point Observatory Galactic Evolution Experiment

    Science.gov (United States)

    Price-Jones, Natalie; Bovy, Jo

    2018-03-01

    Chemical tagging of stars based on their similar compositions can offer new insights about the star formation and dynamical history of the Milky Way. We investigate the feasibility of identifying groups of stars in chemical space by forgoing the use of model derived abundances in favour of direct analysis of spectra. This facilitates the propagation of measurement uncertainties and does not pre-suppose knowledge of which elements are important for distinguishing stars in chemical space. We use ˜16 000 red giant and red clump H-band spectra from the Apache Point Observatory Galactic Evolution Experiment (APOGEE) and perform polynomial fits to remove trends not due to abundance-ratio variations. Using expectation maximized principal component analysis, we find principal components with high signal in the wavelength regions most important for distinguishing between stars. Different subsamples of red giant and red clump stars are all consistent with needing about 10 principal components to accurately model the spectra above the level of the measurement uncertainties. The dimensionality of stellar chemical space that can be investigated in the H band is therefore ≲10. For APOGEE observations with typical signal-to-noise ratios of 100, the number of chemical space cells within which stars cannot be distinguished is approximately 10^(10±2) × (5 ± 2)^(n−10), with n the number of principal components. This high dimensionality and the fine-grained sampling of chemical space are a promising first step towards chemical tagging based on spectra alone.
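
The paper's estimator is expectation-maximized PCA, which propagates per-pixel uncertainties; the underlying dimensionality test can be sketched with ordinary PCA on a synthetic spectra matrix, counting components that rise above a known noise floor (all numbers below are synthetic, not APOGEE data):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-in: 2000 "spectra" of 500 pixels generated from 10
# latent abundance-like factors plus measurement noise of known size.
n_stars, n_pix, n_latent, noise = 2000, 500, 10, 0.05
spectra = rng.normal(size=(n_stars, n_latent)) @ rng.normal(size=(n_latent, n_pix))
spectra += rng.normal(scale=noise, size=spectra.shape)

pca = PCA().fit(spectra)
# Count components whose variance clearly exceeds the per-pixel noise variance.
above_noise = int(np.sum(pca.explained_variance_ > 2 * noise**2))
print("components above the noise floor:", above_noise)  # recovers ~10
```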

  9. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); Prieto, Carlos Allende; Mészáros, Szabolcs [Instituto de Astrofísica de Canarias, Via Láctea s/n, E-38205 La Laguna, Tenerife (Spain); Beland, Stephane [Laboratory for Atmospheric and Space Sciences, University of Colorado at Boulder, Boulder, CO (United States); Bender, Chad; Desphande, Rohit [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, sunspot, NM 88349-0059 (United States); Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Fleming, Scott W. [Computer Sciences Corporation, 3700 San Martin Dr, Baltimore, MD 21218 (United States); Muna, Demitri [Department of Astronomy and the Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Nguyen, Duy [Department of Astronomy and Astrophysics, University of Toronto, Toronto, Ontario, M5S 3H4 (Canada); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Shetrone, Matthew, E-mail: dnidever@umich.edu [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States)

    2015-12-15

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s⁻¹) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  10. Spatial correlations of Diceroprocta apache and its host plants: Evidence for a negative impact from Tamarix invasion

    Science.gov (United States)

    Ellingson, A.R.; Andersen, D.C.

    2002-01-01

    1. The hypothesis that the habitat-scale spatial distribution of the Apache cicada Diceroprocta apache Davis is unaffected by the presence of the invasive exotic saltcedar Tamarix ramosissima was tested using data from 205 1-m2 quadrats placed within the flood-plain of the Bill Williams River, Arizona, U.S.A. Spatial dependencies within and between cicada density and habitat variables were estimated using Moran's I and its bivariate analogue to discern patterns and associations at spatial scales from 1 to 30 m. 2. Apache cicadas were spatially aggregated in high-density clusters averaging 3 m in diameter. A positive association between cicada density, estimated by exuvial density, and the per cent canopy cover of a native tree, Goodding's willow Salix gooddingii, was detected in a non-spatial correlation analysis. No non-spatial association between cicada density and saltcedar canopy cover was detected. 3. Tests for spatial cross-correlation using the bivariate statistic I_YZ indicated the presence of a broad-scale negative association between cicada density and saltcedar canopy cover. This result suggests that large continuous stands of saltcedar are associated with reduced cicada density. In contrast, positive associations detected at spatial scales larger than individual quadrats suggested a spill-over of high cicada density from areas featuring Goodding's willow canopy into surrounding saltcedar monoculture. 4. Taken together and considered in light of the Apache cicada's polyphagous habits, the observed spatial patterns suggest that broad-scale factors such as canopy heterogeneity affect cicada habitat use more than host plant selection. This has implications for management of lower Colorado River riparian woodlands to promote cicada presence and density through maintenance or creation of stands of native trees as well as manipulation of the characteristically dense and homogeneous saltcedar canopies.

  11. Enhancing organization and maintenance of big data with Apache Solr in IBM WebSphere Commerce deployments

    OpenAIRE

    Grigel, Rudolf

    2015-01-01

    The main objective of this thesis was to enhance the organization and maintenance of big data with Apache Solr in IBM WebSphere Commerce deployments. This objective can be split into several subtasks: reorganization of data, fast and optimised exporting and importing, efficient update and cleanup operations. E-Commerce is a fast growing and frequently changing environment. There is a constant flow of data that is rapidly growing larger and larger every day which is becoming an ...
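
The maintenance tasks the thesis names — bulk export/import, updates, cleanup — all pass through Solr's HTTP update handler, so they can be sketched with plain HTTP calls (the core URL, field names and queries below are placeholders, not taken from the thesis):

```python
import requests

SOLR = "http://localhost:8983/solr/products"  # placeholder core URL

def add_docs(docs, commit=False):
    """Index a batch of documents through Solr's JSON update handler."""
    params = {"commit": "true"} if commit else {}
    requests.post(f"{SOLR}/update", params=params, json=docs).raise_for_status()

def delete_by_query(query):
    """Cleanup: remove stale documents matching a query, then commit."""
    requests.post(f"{SOLR}/update", params={"commit": "true"},
                  json={"delete": {"query": query}}).raise_for_status()

# Import in batches and commit once at the end to keep reindexing fast.
add_docs([{"id": "1", "name": "widget"}, {"id": "2", "name": "gadget"}],
         commit=True)
delete_by_query("name:obsolete")
```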

  12. Validation of the LOD score compared with APACHE II score in prediction of the hospital outcome in critically ill patients.

    Science.gov (United States)

    Khwannimit, Bodin

    2008-01-01

    The Logistic Organ Dysfunction (LOD) score is an organ dysfunction score that can predict hospital mortality. The aim of this study was to validate the performance of the LOD score compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score in a mixed intensive care unit (ICU) at a tertiary referral university hospital in Thailand. The data were collected prospectively on consecutive ICU admissions over a 24-month period from July 1, 2004 until June 30, 2006. Discrimination was evaluated by the area under the receiver operating characteristic curve (AUROC). Calibration was assessed by the Hosmer-Lemeshow goodness-of-fit H statistic. The overall fit of each model was evaluated by the Brier score. Overall, 1,429 patients were enrolled during the study period. Mortality was 20.9% in the ICU and 27.9% in the hospital. The median ICU and hospital lengths of stay were 3 and 18 days, respectively, for all patients. Both models showed excellent discrimination. The AUROC for the LOD and APACHE II were 0.860 [95% confidence interval (CI) = 0.838-0.882] and 0.898 (95% CI = 0.879-0.917), respectively. The LOD score had perfect calibration with the Hosmer-Lemeshow goodness-of-fit H χ² = 10 (p = 0.44). However, APACHE II had poor calibration with the Hosmer-Lemeshow goodness-of-fit H χ² = 75.69 (p < 0.001). The Brier scores showing the overall fit of the two models were 0.123 (95% CI = 0.107-0.141) and 0.114 (0.098-0.132) for the LOD and APACHE II, respectively. Thus, the LOD score was found to be accurate for predicting hospital mortality for general critically ill patients in Thailand.

  13. APACHE II SCORING SYSTEM AND ITS MODIFICATION FOR THE ASSESSMENT OF DISEASE SEVERITY IN CHILDREN WHO UNDERWENT POLYCHEMOTHERAPY

    Directory of Open Access Journals (Sweden)

    А. V. Sotnikov

    2014-01-01

    Full Text Available Short-term prognosis should be considered when choosing the appropriate treatment policy, based on an assessment of disease severity in patients with acute disease. Adequate assessment of disease severity and prognosis allows the indications for transferring patients to the resuscitation and intensive care department to be defined more precisely. Disease severity of patients who underwent polychemotherapy was assessed using the APACHE II scoring system.

  14. Historical review of uranium-vanadium in the eastern Carrizo Mountains, San Juan County, New Mexico and Apache County, Arizona

    International Nuclear Information System (INIS)

    Chenoweth, W.L.

    1980-03-01

    This report is a brief review of the uranium and/or vanadium mining in the eastern Carrizo Mountains, San Juan County, New Mexico and Apache County, Arizona. It was prepared at the request of the Navajo Tribe, the New Mexico Energy and Minerals Department, and the Arizona Bureau of Geology and Mineral Technology. This report deals only with historical production data. The locations of the mines and the production are presented in figures and tables

  15. Efficient Streaming Mass Spatio-Temporal Vehicle Data Access in Urban Sensor Networks Based on Apache Storm.

    Science.gov (United States)

    Zhou, Lianjie; Chen, Nengcheng; Chen, Zeqiang

    2017-04-10

    The efficient data access of streaming vehicle data is the foundation of analyzing, using and mining vehicle data in smart cities, which is an approach to understand traffic environments. However, the number of vehicles in urban cities has grown rapidly, reaching hundreds of thousands in number. Accessing the mass streaming data of vehicles is hard and takes a long time due to limited computation capability and backward modes. We propose an efficient streaming spatio-temporal data access based on Apache Storm (ESDAS) to achieve real-time streaming data access and data cleaning. As a popular streaming data processing tool, Apache Storm can be applied to streaming mass data access and real time data cleaning. By designing the Spout/bolt workflow of topology in ESDAS and by developing the speeding bolt and other bolts, Apache Storm can achieve the prospective aim. In our experiments, Taiyuan BeiDou bus location data is selected as the mass spatio-temporal data source. In the experiments, the data access results with different bolts are shown in map form, and the filtered buses' aggregation forms are different. In terms of performance evaluation, the consumption time in ESDAS for ten thousand records per second for a speeding bolt is approximately 300 milliseconds, and that for MongoDB is approximately 1300 milliseconds. The efficiency of ESDAS is approximately three times higher than that of MongoDB.
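
Leaving the Storm topology plumbing aside, the "speeding bolt" reduces, per tuple, to a stateful filter: remember each bus's previous GPS fix and drop records that imply an impossible speed. A hedged Python sketch of that per-tuple logic (the record schema and the 40 m/s threshold are our assumptions, not the paper's):

```python
import math

last_fix = {}   # bus id -> (timestamp_s, lat, lon); the filter's only state

def speeding_filter(record, max_speed_ms=40.0):
    """Return False for fixes implying an impossible jump since the last one."""
    prev = last_fix.get(record["bus"])
    last_fix[record["bus"]] = (record["t"], record["lat"], record["lon"])
    if prev is None:
        return True
    dt = record["t"] - prev[0]
    if dt <= 0:
        return False   # out-of-order or duplicate timestamp
    # Equirectangular approximation; adequate at city scale (~111 km/degree).
    dy = (record["lat"] - prev[1]) * 111_000.0
    dx = (record["lon"] - prev[2]) * 111_000.0 * math.cos(math.radians(record["lat"]))
    return math.hypot(dx, dy) / dt <= max_speed_ms

stream = [{"bus": 7, "t": 0, "lat": 37.87, "lon": 112.55},
          {"bus": 7, "t": 10, "lat": 37.88, "lon": 112.55}]  # ~111 m/s jump
print([speeding_filter(r) for r in stream])  # [True, False]
```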

  16. Efficient Streaming Mass Spatio-Temporal Vehicle Data Access in Urban Sensor Networks Based on Apache Storm

    Directory of Open Access Journals (Sweden)

    Lianjie Zhou

    2017-04-01

    Full Text Available The efficient data access of streaming vehicle data is the foundation of analyzing, using and mining vehicle data in smart cities, which is an approach to understand traffic environments. However, the number of vehicles in urban cities has grown rapidly, reaching hundreds of thousands in number. Accessing the mass streaming data of vehicles is hard and takes a long time due to limited computation capability and backward modes. We propose an efficient streaming spatio-temporal data access based on Apache Storm (ESDAS) to achieve real-time streaming data access and data cleaning. As a popular streaming data processing tool, Apache Storm can be applied to streaming mass data access and real time data cleaning. By designing the Spout/bolt workflow of topology in ESDAS and by developing the speeding bolt and other bolts, Apache Storm can achieve the prospective aim. In our experiments, Taiyuan BeiDou bus location data is selected as the mass spatio-temporal data source. In the experiments, the data access results with different bolts are shown in map form, and the filtered buses’ aggregation forms are different. In terms of performance evaluation, the consumption time in ESDAS for ten thousand records per second for a speeding bolt is approximately 300 milliseconds, and that for MongoDB is approximately 1300 milliseconds. The efficiency of ESDAS is approximately three times higher than that of MongoDB.

  17. Carlo Caso (1940 - 2007)

    CERN Multimedia

    Leonardo Rossi

    Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of a courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then made several experiments using the CERN liquid hydrogen bubble chambers - first the 2000HBC and later BEBC - to study various facets of the production and decay of meson and baryon resonances. He later formed his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...

  18. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    ias

    on the development of nuclear weapons in Los Alamos ..... cantly improved the paper. ... Carlo simulations of solids, Reviews of Modern Physics, Vol.73, pp.33– ... The computer algorithms are usually based on a random seed that starts the ...

  19. Markov Chain Monte Carlo

    Indian Academy of Sciences (India)

    Markov Chain Monte Carlo - Examples. Arnab Chakraborty. General Article, Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 25-34. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0025-0034
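
As a concrete companion to such introductory examples, a minimal random-walk Metropolis sampler targeting a standard normal density (entirely our illustration, not code from the article):

```python
import math
import random

def metropolis(log_target, x0, n_steps, step=1.0):
    """Random-walk Metropolis: propose x' = x + U(-step, step) and accept
    with probability min(1, target(x') / target(x))."""
    x, chain = x0, []
    for _ in range(n_steps):
        proposal = x + random.uniform(-step, step)
        delta = log_target(proposal) - log_target(x)
        if delta >= 0 or random.random() < math.exp(delta):
            x = proposal
        chain.append(x)
    return chain

# Target: standard normal up to a constant, so the chain mean should be ~0.
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=50_000)
burned = chain[5_000:]          # discard burn-in
print(sum(burned) / len(burned))
```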

  20. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.
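
The Monte Carlo versus quasi-Monte Carlo contrast the book builds on can be shown in a few self-contained lines: pseudorandom points against a base-2 van der Corput low-discrepancy sequence on a one-dimensional integral (our example, not the book's):

```python
import random

def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    seq = []
    for i in range(1, n + 1):
        x, denom = 0.0, base
        while i:
            i, digit = divmod(i, base)
            x += digit / denom
            denom *= base
        seq.append(x)
    return seq

f = lambda x: x * x               # exact integral over [0, 1] is 1/3
n = 4096
mc = sum(f(random.random()) for _ in range(n)) / n
qmc = sum(f(x) for x in van_der_corput(n)) / n
print(f"MC error  {abs(mc - 1/3):.2e}")
print(f"QMC error {abs(qmc - 1/3):.2e}")  # typically far smaller
```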

  1. Field studies at the Apache Leap Research Site in support of alternative conceptual models

    Energy Technology Data Exchange (ETDEWEB)

    Woodhouse, E.G.; Davidson, G.R.; Theis, C. [eds.] [and others]

    1997-08-01

    This is a final technical report for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-51) with the University of Arizona. The contract was an optional extension that was initiated on July 21, 1994 and that expired on May 31, 1995. The project manager was Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this contract were to examine hypotheses and conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. The results discussed here are products of specific tasks that address a broad spectrum of issues related to flow and transport through fractures. Each chapter in this final report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. The tasks include detection and characterization of historical rapid fluid flow through fractured rock and its relationship to perched water systems using environmental isotopic tracers of ³H and ¹⁴C, fluid- and rock-derived ²³⁴U/²³⁸U measurements, and geophysical data. The water balance in a small watershed at the ALRS demonstrates the methods of accounting for ET and estimating the quantity of water available for infiltration through fracture networks. Grain density measurements were made for core-sized samples using a newly designed gas pycnometer. The distribution and magnitude of air permeability have been measured in a three-dimensional setting; the subsequent geostatistical analysis is presented. Electronic versions of the data presented here are available from the authors; more detailed discussions and analyses are available in technical publications referenced herein, or soon to appear in the professional literature.

  2. Field studies at the Apache Leap Research Site in support of alternative conceptual models

    International Nuclear Information System (INIS)

    Woodhouse, E.G.; Davidson, G.R.; Theis, C.

    1997-08-01

    This is a final technical report for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-51) with the University of Arizona. The contract was an optional extension that was initiated on July 21, 1994 and that expired on May 31, 1995. The project manager was Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this contract were to examine hypotheses and conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. The results discussed here are products of specific tasks that address a broad spectrum of issues related to flow and transport through fractures. Each chapter in this final report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. The tasks include detection and characterization of historical rapid fluid flow through fractured rock and its relationship to perched water systems using environmental isotopic tracers of ³H and ¹⁴C, fluid- and rock-derived ²³⁴U/²³⁸U measurements, and geophysical data. The water balance in a small watershed at the ALRS demonstrates the methods of accounting for ET and estimating the quantity of water available for infiltration through fracture networks. Grain density measurements were made for core-sized samples using a newly designed gas pycnometer. The distribution and magnitude of air permeability have been measured in a three-dimensional setting; the subsequent geostatistical analysis is presented. Electronic versions of the data presented here are available from the authors; more detailed discussions and analyses are available in technical publications referenced herein, or soon to appear in the professional literature

  3. Bayesian Monte Carlo method

    International Nuclear Information System (INIS)

    Rajabalinejad, M.

    2010-01-01

    To reduce the cost of Monte Carlo (MC) simulations for time-consuming processes, Bayesian Monte Carlo (BMC) is introduced in this paper. The BMC method reduces the number of realizations in MC according to the desired accuracy level. BMC also provides the possibility of considering more priors; in other words, different priors can be integrated into one model by using BMC to further reduce the cost of simulations. This study suggests speeding up the simulation process by considering the logical dependence of neighboring points as prior information. This information is used in the BMC method to produce a predictive tool through the simulation process. The general methodology and algorithm of the BMC method are presented in this paper. The BMC method is applied to a simplified breakwater model as well as the finite element model of the 17th Street Canal in New Orleans, and the results are compared with the MC and Dynamic Bounds methods.
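
The economy at the heart of BMC — spend only as many realizations as the desired accuracy requires — can be illustrated in a much simplified, non-Bayesian form: a sampler that stops once the estimated standard error of the mean falls below a tolerance (this shows the stopping idea only, not the paper's algorithm, which also folds in prior information):

```python
import random
import statistics

def adaptive_mc(sample, tol, batch=1_000, max_n=10**7):
    """Draw realizations until the standard error of the mean drops below tol."""
    values = []
    while len(values) < max_n:
        values.extend(sample() for _ in range(batch))
        se = statistics.stdev(values) / len(values) ** 0.5
        if se < tol:
            break
    return statistics.fmean(values), se, len(values)

# Toy limit state: probability that a standard normal exceeds 2 (~0.0228).
indicator = lambda: 1.0 if random.gauss(0.0, 1.0) > 2.0 else 0.0
p, se, n = adaptive_mc(indicator, tol=1e-3)
print(f"P(failure) ~ {p:.4f} +/- {se:.4f} after {n} samples")
```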

  4. Monte Carlo principles and applications

    Energy Technology Data Exchange (ETDEWEB)

    Raeside, D E [Oklahoma Univ., Oklahoma City (USA). Health Sciences Center

    1976-03-01

    The principles underlying the use of Monte Carlo methods are explained, for readers who may not be familiar with the approach. The generation of random numbers is discussed, and the connection between Monte Carlo methods and random numbers is indicated. Outlines of two well established Monte Carlo sampling techniques are given, together with examples illustrating their use. The general techniques for improving the efficiency of Monte Carlo calculations are considered. The literature relevant to the applications of Monte Carlo calculations in medical physics is reviewed.
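
Of the sampling techniques such reviews outline, inverse transform sampling is the quickest to demonstrate: map a uniform deviate through the inverse cumulative distribution function. A minimal sketch (our example) drawing exponential free paths, a staple distribution in transport applications:

```python
import math
import random

def sample_exponential(mean_free_path):
    """Inverse transform: if U ~ Uniform(0,1), then -mfp * ln(1 - U) follows
    the exponential free-path distribution with the given mean."""
    return -mean_free_path * math.log(1.0 - random.random())  # 1-U avoids log(0)

samples = [sample_exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))  # ~2.0, the mean free path
```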

  5. Contributon Monte Carlo

    International Nuclear Information System (INIS)

    Dubi, A.; Gerstl, S.A.W.

    1979-05-01

    The contributon Monte Carlo method is based on a new recipe to calculate target responses by means of a volume integral of the contributon current in a region between the source and the detector. A comprehensive description of the method, its implementation in the general-purpose MCNP code, and results of the method for realistic nonhomogeneous, energy-dependent problems are presented. 23 figures, 10 tables

  6. Carlos Vesga Duarte

    Directory of Open Access Journals (Sweden)

    Pedro Medina Avendaño

    1981-01-01

    Full Text Available Carlos Vega Duarte had the simplicity of elemental and pure beings. His heart was as clean as alluvial gold. His direct, colloquial manner revealed a Santander native without contaminations, who loved the gleam of weapons and was dazzled by the sparkle of perfect phrases

  7. Fundamentals of Monte Carlo

    International Nuclear Information System (INIS)

    Wollaber, Allan Benton

    2016-01-01

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
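
The lecture's opening example — estimating π by scattering random points over the unit square and counting those inside the quarter circle — takes only a few lines; this is our reconstruction of the standard exercise, not the slides' code:

```python
import random

def estimate_pi(n):
    """Fraction of uniform points with x^2 + y^2 <= 1 estimates pi/4."""
    hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
               for _ in range(n))
    return 4.0 * hits / n

print(estimate_pi(1_000_000))  # ~3.14; the error shrinks like 1/sqrt(n)
```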

  8. Microcanonical Monte Carlo

    International Nuclear Information System (INIS)

    Creutz, M.

    1986-01-01

    The author discusses a recently developed algorithm for simulating statistical systems. The procedure interpolates between molecular dynamics methods and canonical Monte Carlo. The primary advantages are extremely fast simulations of discrete systems such as the Ising model and a relative insensitivity to random number quality. A variation of the algorithm gives rise to a deterministic dynamics for Ising spins. This model may be useful for high speed simulation of non-equilibrium phenomena

  9. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.

  10. Indian Summer

    Energy Technology Data Exchange (ETDEWEB)

    Galindo, E. [Sho-Ban High School, Fort Hall, ID (United States)

    1997-08-01

    This paper focuses on preserving and strengthening two resources culturally and socially important to the Shoshone-Bannock Indian Tribe on the Fort Hall Reservation in Idaho; their young people and the Pacific-Northwest Salmon. After learning that salmon were not returning in significant numbers to ancestral fishing waters at headwater spawning sites, tribal youth wanted to know why. As a result, the Indian Summer project was conceived to give Shoshone-Bannock High School students the opportunity to develop hands-on, workable solutions to improve future Indian fishing and help make the river healthy again. The project goals were to increase the number of fry introduced into the streams, teach the Shoshone-Bannock students how to use scientific methodologies, and get students, parents, community members, and Indian and non-Indian mentors excited about learning. The students chose an egg incubation experiment to help increase self-sustaining, natural production of steelhead trout, and formulated and carried out a three step plan to increase the hatch-rate of steelhead trout in Idaho waters. With the help of local companies, governmental agencies, scientists, and mentors students have been able to meet their project goals, and at the same time, have learned how to use scientific methods to solve real life problems, how to return what they have used to the water and land, and how to have fun and enjoy life while learning.

  11. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...

  12. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency, will speak about his work with Carlo Rubbia. Finally, Hans Joachim Schellnhuber of the Potsdam Institute for Climate Research and Sven Kul...

  13. Who Writes Carlos Bulosan?

    Directory of Open Access Journals (Sweden)

    Charlie Samuya Veric

    2001-12-01

    Full Text Available The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the Pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Is in the Heart, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? The essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will ultimately secure the continuing importance of Carlos Bulosan to radical literature and history.

  14. Monte Carlo alpha calculation

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, D.; Soran, P.; Whalen, P.

    1985-01-01

    A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n·e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
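
The "direct approach" the abstract describes can at least be demonstrated on a toy branching process: follow the neutron population in time and take the logarithmic derivative. The sketch below is a caricature of supercritical multiplication for illustration, not the production algorithm; for this toy model the expected answer is α = ln(k) per generation time:

```python
import numpy as np

rng = np.random.default_rng(1)

def direct_alpha(k=1.05, lifetime=1.0, n0=100_000, generations=30):
    """Simulate generations whose total progeny is Poisson(k * n) and fit
    the slope of ln(population) against time to estimate alpha."""
    times, pops, n = [], [], n0
    for g in range(generations):
        n = int(rng.poisson(k * n))        # total progeny of this generation
        times.append((g + 1) * lifetime)
        pops.append(n)
    return np.polyfit(times, np.log(pops), 1)[0]

print(direct_alpha(), np.log(1.05))  # both ~0.0488 per unit time
```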

  15. Hardening en servidor web Linux Apache, PHP y configurar el firewall de aplicaciones modsecurity para mitigar ataques al servidor

    OpenAIRE

    Espol; Delgado Quishpe, Byron Alberto

    2017-01-01

    A hardening of the web server is carried out: the directives in the configuration files of the Apache and PHP services are reviewed, and an application firewall called mod_security is installed and configured, which allows attacks on the web server to be mitigated, based on an analysis of the vulnerabilities found on the server. Guayaquil. Master's degree in Applied Computer Security.
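
One quick check on part of such a hardening pass — that the server has stopped advertising exact Apache and PHP versions — is to inspect response headers, which directives like Apache's `ServerTokens Prod` and PHP's `expose_php = Off` control. A small sketch (the target URL is a placeholder):

```python
import requests

def check_banner(url):
    """Print version-disclosing headers that hardening should suppress."""
    headers = requests.get(url, timeout=5).headers
    for name in ("Server", "X-Powered-By"):
        value = headers.get(name, "<absent>")
        leaky = any(ch.isdigit() for ch in value)  # crude: versions carry digits
        flag = "  <-- leaks version info" if leaky else ""
        print(f"{name}: {value}{flag}")

check_banner("https://example.com/")  # placeholder target
```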

  16. Effectively Engaging in Tribal Consultation to protect Traditional Cultural Properties while navigating the 1872 Mining Law - Tonto National Forest, Western Apache Tribes, & Resolution Copper Mine

    Science.gov (United States)

    Nez, N.

    2017-12-01

    By effectively engaging in government-to-government consultation, the Tonto National Forest is able to consider oral histories and tribal cultural knowledge in decision making. These conversations have the potential to lead to the protection and preservation of public lands. Discussed here is one example of successful tribal consultation and how it led to the protection of Traditional Cultural Properties (TCPs). One hour east of Phoenix, Arizona, on the Tonto National Forest, Resolution Copper Mine is working to access a rich copper vein more than 7,000 feet deep. As part of the mining plan of operation they are investigating viable locations to store the earth removed from the mine site. One proposed storage location required hydrologic and geotechnical studies to determine viability. This constituted a significant amount of ground disturbance in an area that is of known importance to local Indian tribes. To ensure proper consideration of tribal concerns, the Forest engaged nine local tribes in government-to-government consultation. Consultation resulted in the identification of five springs in the project area considered TCPs by the Western Apache tribes. Due to the presence of identified TCPs, the Forest asked the tribes to assist in the development of mitigation measures to minimize the effects of this project on the TCPs identified. The goal of this partnership was to find a way for the Mine to still be able to gather data while protecting TCPs. During field visits and consultations, a wide range of concerns were shared, which were recorded and considered by the Tonto National Forest. The Forest developed a proposed mitigation approach to protect springs, which would prevent (not permit) the installation of water monitoring wells, geotechnical borings or trench excavations within 1,200 feet of perennial springs in the project area. As an added mitigation measure, a cultural resources specialist would be on-site during all ground-disturbing activities. Diligent work on

  17. Journal of Earth System Science | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Atmospheric correction of Earth-observation remote sensing images by Monte Carlo method ... Decision tree approach for classification of remotely sensed satellite data ... Analysis of carbon dioxide, water vapour and energy fluxes over an Indian ...

  18. Body composition assessment in American Indian children.

    Science.gov (United States)

    Lohman, T G; Caballero, B; Himes, J H; Hunsberger, S; Reid, R; Stewart, D; Skipper, B

    1999-04-01

    Although the high prevalence of obesity in American Indian children was documented in several surveys that used body mass index (BMI, in kg/m²) as the measure, there is limited information on more direct measurements of body adiposity in this population. The present study evaluated body composition in 81 boys (aged 11.2±0.6 y) and 75 girls (aged 11.0±0.4 y) attending public schools in 6 American Indian communities: White Mountain Apache, Pima, and Tohono O'Odham in Arizona; Oglala Lakota and Sicangu Lakota in South Dakota; and Navajo in New Mexico and Arizona. These communities were participating in the feasibility phase of Pathways, a multicenter intervention for the primary prevention of obesity. Body composition was estimated by using a combination of skinfold thickness and bioelectrical impedance measurements, with a prediction equation validated previously in this same population. The mean BMI was 20.4±4.2 for boys and 21.1±5.0 for girls. The sum of the triceps plus subscapular skinfold thicknesses averaged 28.6±7.0 mm in boys and 34.0±8.0 mm in girls. Mean percentage body fat was 35.6±6.9 in boys and 38.8±8.5 in girls. The results from this study confirmed the high prevalence of excess body fatness in school-age American Indian children and permitted the development of procedures, training, and quality control for measurement of the main outcome variable in the full-scale Pathways study.

  19. Monte Carlo Methods in Physics

    International Nuclear Information System (INIS)

    Santoso, B.

    1997-01-01

    The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random generators used in Monte Carlo techniques is carried out to show the behavior of the randomness of the various methods of generating them. To account for the weight function involved in the Monte Carlo integral, the Metropolis method is used. From the results of the experiment, one can see that there are no regular patterns in the numbers generated, showing that the generators are reasonably good, while the experimental results show a distribution obeying the expected statistical law. Further, some applications of the Monte Carlo method in physics are given. The physical problems are chosen such that the models have available solutions, either exact or approximate, with which the Monte Carlo calculations can be compared. The comparisons show that, for the models considered, good agreement is obtained.
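
    The Metropolis rule mentioned above fits in a few lines. The following sketch (illustrative, not the paper's program) samples an unnormalized Gaussian weight function and checks the resulting moments:

        import numpy as np

        def metropolis(log_w, x0=0.0, n_samples=50_000, step=1.0, seed=0):
            """Metropolis sampling: propose x' = x + U(-step, step) and accept
            with probability min(1, w(x')/w(x)); rejection keeps the old state."""
            rng = np.random.default_rng(seed)
            x, samples = x0, np.empty(n_samples)
            for i in range(n_samples):
                x_new = x + rng.uniform(-step, step)
                if np.log(rng.random()) < log_w(x_new) - log_w(x):
                    x = x_new
                samples[i] = x
            return samples

        # Unnormalized standard normal as the target weight function:
        s = metropolis(lambda x: -0.5 * x * x)
        print(s.mean(), s.std())   # ~0 and ~1 up to Monte Carlo noise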

  20. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    Science.gov (United States)

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still only a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information that has been gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157
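
    As an illustration of the closing claim (a generic sketch, not the study's pipeline), the exponent of a power law can be estimated from per-contributor commit counts with the continuous maximum-likelihood formula of Clauset et al.; the counts below are invented:

        import numpy as np

        def powerlaw_alpha(counts, xmin=2):
            """MLE for a power-law tail: alpha = 1 + n / sum(ln(x / xmin)),
            taken over all observations x >= xmin."""
            x = np.asarray([c for c in counts if c >= xmin], dtype=float)
            return 1.0 + len(x) / np.sum(np.log(x / xmin))

        # Hypothetical per-contributor commit counts for one project:
        commits = [1, 1, 1, 2, 2, 3, 5, 8, 13, 40, 120, 900]
        print(powerlaw_alpha(commits))   # low alpha: a few contributors dominate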

  1. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R.A.

    2002-05-09

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities, and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights), may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  2. APOGEE-2: The Second Phase of the Apache Point Observatory Galactic Evolution Experiment in SDSS-IV

    Science.gov (United States)

    Sobeck, Jennifer; Majewski, S.; Hearty, F.; Schiavon, R. P.; Holtzman, J. A.; Johnson, J.; Frinchaboy, P. M.; Skrutskie, M. F.; Munoz, R.; Pinsonneault, M. H.; Nidever, D. L.; Zasowski, G.; Garcia Perez, A.; Fabbian, D.; Meza Cofre, A.; Cunha, K. M.; Smith, V. V.; Chiappini, C.; Beers, T. C.; Steinmetz, M.; Anders, F.; Bizyaev, D.; Roman, A.; Fleming, S. W.; Crane, J. D.; SDSS-IV/APOGEE-2 Collaboration

    2014-01-01

    The second phase of the Apache Point Observatory Galactic Evolution Experiment (APOGEE-2), a part of the Sloan Digital Sky Survey IV (SDSS-IV), will commence operations in 2014. APOGEE-2 represents a significant expansion over APOGEE-1, not only in the size of the stellar sample, but also in the coverage of the sky through observations in both the Northern and Southern Hemispheres. Observations on the 2.5m Sloan Foundation Telescope of the Apache Point Observatory (APOGEE-2N) will continue immediately after the conclusion of APOGEE-1, to be followed by observations with the 2.5m du Pont Telescope of the Las Campanas Observatory (APOGEE-2S) within three years. Over the six-year lifetime of the project, high resolution (R ~ 22,500), high signal-to-noise (≥100) spectroscopic data in the H-band wavelength regime (1.51-1.69 μm) will be obtained for several hundred thousand stars, more than tripling the total APOGEE-1 sample. Accurate radial velocities and detailed chemical compositions will be generated for target stars in the main Galactic components (bulge, disk, and halo), open/globular clusters, and satellite dwarf galaxies. The spectroscopic follow-up program of Kepler targets with the APOGEE-2N instrument will be continued and expanded. APOGEE-2 will significantly extend and enhance the APOGEE-1 legacy of scientific contributions to understanding the origin and evolution of the elements, the assembly and formation history of galaxies like the Milky Way, and fundamental stellar astrophysics.

  3. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    International Nuclear Information System (INIS)

    Efroymson, R.A.

    2002-01-01

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities, and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights), may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  4. Indian Ledger Art.

    Science.gov (United States)

    Chilcoat, George W.

    1990-01-01

    Offers an innovative way to teach mid-nineteenth century North American Indian history by having students create their own Indian Ledger art. Purposes of the project are: to understand the role played by American Indians, to reveal American Indian stereotypes, and to identify relationships between cultures and environments. Background and…

  5. Lectures on Monte Carlo methods

    CERN Document Server

    Madras, Neal

    2001-01-01

    Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
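
    A one-line illustration of the "curse of dimensionality" remark (an assumed example, not from the book): the error of a Monte Carlo sample mean shrinks like 1/sqrt(n) in any dimension, whereas a grid with m points per axis costs m^d evaluations.

        import numpy as np

        rng = np.random.default_rng(0)
        d, n = 10, 100_000                 # 10 dimensions, 1e5 random samples

        # Integrate f(x) = sum_i x_i^2 over the unit cube [0, 1]^d.
        # Exact value is d / 3; a 10-point-per-axis grid would need 10^10 points.
        x = rng.random((n, d))
        estimate = np.mean(np.sum(x * x, axis=1))
        print(estimate, "vs exact", d / 3)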

  6. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay; Law, Kody; Suciu, Carina

    2017-01-01

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.
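
    A minimal sketch of the telescoping idea (illustrative Python, not the article's algorithms): cheap coarse levels absorb most of the sampling, and each correction term couples a fine and a coarse discretization of the same random path. Step counts and sample sizes are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        def euler_gbm(n_paths, n_steps, T=1.0, mu=0.05, sigma=0.2, x0=1.0, w=None):
            """Euler scheme for dX = mu X dt + sigma X dW with n_steps steps."""
            dt = T / n_steps
            dw = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps)) if w is None else w
            x = np.full(n_paths, x0)
            for k in range(n_steps):
                x = x + mu * x * dt + sigma * x * dw[:, k]
            return x, dw

        # E[X_T] = E[P_0] + sum_l E[P_l - P_{l-1}], with fewer samples per level.
        est, n_l = 0.0, [40_000, 10_000, 2_500]
        for l, n in enumerate(n_l):
            n_fine = 2 ** (l + 2)                       # 4, 8, 16 time steps
            if l == 0:
                p, _ = euler_gbm(n, n_fine)
                est += p.mean()
            else:
                pf, dw = euler_gbm(n, n_fine)
                dw_c = dw[:, 0::2] + dw[:, 1::2]        # same Brownian path, coarser grid
                pc, _ = euler_gbm(n, n_fine // 2, w=dw_c)
                est += (pf - pc).mean()                 # telescoping correction
        print(est, "vs exact E[X_T] =", np.exp(0.05))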

  7. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay

    2017-04-24

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.

  8. TARC: Carlo Rubbia's Energy Amplifier

    CERN Multimedia

    Laurent Guiraud

    1997-01-01

    Transmutation by Adiabatic Resonance Crossing (TARC) is Carlo Rubbia's energy amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as ⁹⁹Tc, can be efficiently destroyed.

  9. Monte Carlo simulation for IRRMA

    International Nuclear Information System (INIS)

    Gardner, R.P.; Liu Lianyan

    2000-01-01

    Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors

  10. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  11. Monte Carlo theory and practice

    International Nuclear Information System (INIS)

    James, F.

    1987-01-01

    Historically, the first large-scale calculations to make use of the Monte Carlo method were studies of neutron scattering and absorption, random processes for which it is quite natural to employ random numbers. Such calculations, a subset of Monte Carlo calculations, are known as direct simulation, since the 'hypothetical population' of the narrower definition above corresponds directly to the real population being studied. The Monte Carlo method may be applied wherever it is possible to establish equivalence between the desired result and the expected behaviour of a stochastic system. The problem to be solved may already be of a probabilistic or statistical nature, in which case its Monte Carlo formulation will usually be a straightforward simulation, or it may be of a deterministic or analytic nature, in which case an appropriate Monte Carlo formulation may require some imagination and may appear contrived or artificial. In any case, the suitability of the method chosen will depend on its mathematical properties and not on its superficial resemblance to the problem to be solved. The authors show how Monte Carlo techniques may be compared with other methods of solution of the same physical problem

  12. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros; Jasra, Ajay; Law, Kody; Tempone, Raul; Zhou, Yan

    2016-01-01

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.

  13. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros

    2016-08-29

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.

  14. Use of the APACHE II and ATN-ISS prognostic scores in acute renal failure treated inside and outside the intensive care unit

    OpenAIRE

    Fernandes,Natáia Maria da Silva; Pinto,Patrícia dos Santos; Lacet,Thiago Bento de Paiva; Rodrigues,Dominique Fonseca; Bastos,Marcus Gomes; Stella,Sérgio Reinaldo; Cendoroglo Neto,Miguel

    2009-01-01

    INTRODUCTION: Acute renal failure (ARF) remains high in prevalence, morbidity and mortality. OBJECTIVE: To compare the APACHE II prognostic score with the ATN-ISS and to determine whether APACHE II can be used for patients with ARF outside the ICU. METHODS: Prospective cohort of 205 patients with ARF. We analyzed demographic data, pre-existing conditions, organ failure and characteristics of the ARF. The prognostic scores were calculated on the day of the nephrologist's evaluation. RESULTS: The ...

  15. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    Science.gov (United States)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, in particular in relation to activity on cloud computing platforms, and large-scale data processing, analysis, and storage models such as cloud computing continue to grow. Today, the major challenge is to address how to monitor and control these massive amounts of data and perform analysis in real time at scale; traditional methods and model systems are unable to cope with these quantities of data in real time. Here we present a new methodology for constructing a model for optimizing the performance of real-time monitoring of big datasets, which combines machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The methodology ensures that the most sensible action is carried out during the procedure of fine-grained monitoring and generates the highest efficacy and cost-saving fault repair through three construction control steps: (I) data collection; (II) an analysis engine; and (III) a decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance; the accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
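
    As a rough illustration of the plumbing such a model needs (not the authors' system), a minimal Apache Spark Streaming job in Python can consume a metric stream and flag suspect VM records; the host, port, record format, and the fixed threshold standing in for the learned classifier are all assumptions.

        # Minimal PySpark DStream sketch: read "vm_id,cpu,mem" lines from a socket,
        # flag records whose CPU share exceeds a threshold, print them each batch.
        from pyspark import SparkContext
        from pyspark.streaming import StreamingContext

        sc = SparkContext(appName="vm-monitor-sketch")
        ssc = StreamingContext(sc, batchDuration=5)      # 5-second micro-batches

        lines = ssc.socketTextStream("localhost", 9999)  # assumed metric feed

        def parse(line):
            vm_id, cpu, mem = line.split(",")
            return vm_id, float(cpu), float(mem)

        # A fixed threshold stands in for the paper's fault classifier.
        alerts = lines.map(parse).filter(lambda rec: rec[1] > 0.9)
        alerts.pprint()                                  # decision-engine stub

        ssc.start()
        ssc.awaitTermination()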

  16. Developing Online Communities with LAMP (Linux, Apache, MySQL, PHP) - the IMIA OSNI and CHIRAD Experiences.

    Science.gov (United States)

    Murray, Peter J; Oyri, Karl

    2005-01-01

    Many health informatics organisations do not seem to use, on a practical basis and for the benefit of their activities and interaction with their members, the very technologies that they often promote for use within healthcare environments. In particular, many organisations seem to be slow to take up the benefits of interactive web technologies. This paper presents an introduction to some of the many free/libre and open source (FLOSS) applications currently available that use the LAMP (Linux, Apache, MySQL, PHP) architecture as a way of cheaply deploying reliable, scalable, and secure web applications. The experience of moving to applications using the LAMP architecture, in particular that of the Open Source Nursing Informatics (OSNI) Working Group of the Special Interest Group in Nursing Informatics of the International Medical Informatics Association (IMIA-NI), in using PostNuke, a FLOSS Content Management System (CMS), illustrates many of the benefits of such applications. The experiences of the authors in installing and maintaining a large number of websites using FLOSS CMSs to develop dynamic, interactive websites that facilitate real engagement with the members of IMIA-NI OSNI, the IMIA Open Source Working Group, and the Centre for Health Informatics Research and Development (CHIRAD), as well as other organisations, are used as the basis for discussing the potential benefits that could be realised by others within the health informatics community.

  17. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Missile Firing

    International Nuclear Information System (INIS)

    Jones, Daniel Steven; Efroymson, Rebecca Ann; Hargrove, William Walter; Suter, Glenn; Pater, Larry

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the missile launch and detonation. The primary stressor associated with this activity was sound. Other minor stressors included the detonation impact, shrapnel, and fire. Exposure to desert mule deer (Odocoileus hemionus crooki) was quantified using the Army sound contour program BNOISE2, as well as distances from the explosion to deer. Few effects data were available from related studies. Exposure-response models for the characterization of effects consisted of human 'disturbance' and hearing damage thresholds in units of C-weighted decibels (sound exposure level) and a distance-based No Observed Adverse Effects Level for moose and cannonfire. The risk characterization used a weight-of-evidence approach and concluded that risk to mule deer behavior from the missile firing was likely for a negligible number of deer, but that no risk to mule deer abundance and reproduction is expected

  18. THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT: FIRST DETECTION OF HIGH-VELOCITY MILKY WAY BAR STARS

    Energy Technology Data Exchange (ETDEWEB)

    Nidever, David L.; Zasowski, Gail; Majewski, Steven R.; Beaton, Rachael L.; Wilson, John C.; Skrutskie, Michael F.; O'Connell, Robert W. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Bird, Jonathan; Schoenrich, Ralph; Johnson, Jennifer A.; Sellgren, Kris [Department of Astronomy and the Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Robin, Annie C.; Schultheis, Mathias [Institut Utinam, CNRS UMR 6213, OSU THETA, Universite de Franche-Comte, 41bis avenue de l'Observatoire, F-25000 Besancon (France); Martinez-Valpuesta, Inma; Gerhard, Ortwin [Max-Planck-Institut fuer Extraterrestrische Physik, Giessenbachstrasse, D-85748 Garching (Germany); Shetrone, Matthew [McDonald Observatory, University of Texas at Austin, Fort Davis, TX 79734 (United States); Schiavon, Ricardo P. [Gemini Observatory, 670 North A'Ohoku Place, Hilo, HI 96720 (United States); Weiner, Benjamin [Steward Observatory, 933 North Cherry Street, University of Arizona, Tucson, AZ 85721 (United States); Schneider, Donald P. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Allende Prieto, Carlos, E-mail: dln5q@virginia.edu [Instituto de Astrofisica de Canarias, E-38205 La Laguna, Tenerife (Spain); and others

    2012-08-20

    Commissioning observations with the Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, have produced radial velocities (RVs) for ~4700 K/M-giant stars in the Milky Way (MW) bulge. These high-resolution (R ~ 22,500), high-S/N (>100 per resolution element), near-infrared (NIR; 1.51-1.70 μm) spectra provide accurate RVs (ε_V ~ 0.2 km s⁻¹) for the sample of stars in 18 Galactic bulge fields spanning -1° to -32°. This represents the largest NIR high-resolution spectroscopic sample of giant stars ever assembled in this region of the Galaxy. A cold (σ_V ~ 30 km s⁻¹), high-velocity peak (V_GSR ≈ +200 km s⁻¹) is found to comprise a significant fraction (~10%) of stars in many of these fields. These high RVs have not been detected in previous MW surveys and are not expected for a simple, circularly rotating disk. Preliminary distance estimates rule out an origin from the background Sagittarius tidal stream or a new stream in the MW disk. Comparison to various Galactic models suggests that these high RVs are best explained by stars in orbits of the Galactic bar potential, although some observational features remain unexplained.

  19. Comparative Study of Load Testing Tools: Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS, Siege

    Directory of Open Access Journals (Sweden)

    Rabiya Abbas

    2017-12-01

    Software testing is the process of verifying and validating the user's requirements. Testing is an ongoing process throughout software development. Software testing is characterized into three main types: in black-box testing, the tester has no knowledge of the internal logic and design of the system; in white-box testing, the tester knows the internal logic of the code; in grey-box testing, the tester has partial knowledge of the internal structure and working of the system, as is common in integration testing. Load testing helps us analyze the performance of the system under heavy load or under zero load, and is carried out with the help of a load testing tool. The intention of this research is to carry out a comparison of four load testing tools, i.e., Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS), and Siege, based on certain criteria: test script generation, result reports, application support, plug-in support, and cost. The main focus is to study these load testing tools and identify which tool is better and more efficient. We assume this comparison can help in selecting the most appropriate tool and motivate the use of open source load testing tools.
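
    To make the notion of load testing concrete, a few lines of Python can act as a (very small) stand-in for the tools compared: fire concurrent requests at a URL and report the latency distribution. The target URL and request counts are placeholders.

        import time
        from concurrent.futures import ThreadPoolExecutor
        from urllib.request import urlopen

        URL = "http://localhost:8080/"      # placeholder target

        def hit(_):
            t0 = time.perf_counter()
            with urlopen(URL, timeout=10) as resp:
                resp.read()
                ok = resp.status == 200
            return ok, time.perf_counter() - t0

        with ThreadPoolExecutor(max_workers=50) as pool:   # 50 concurrent "users"
            results = list(pool.map(hit, range(500)))      # 500 requests in total

        latencies = sorted(t for _, t in results)
        errors = sum(1 for ok, _ in results if not ok)
        print(f"p50={latencies[len(latencies) // 2]:.3f}s "
              f"p95={latencies[int(len(latencies) * 0.95)]:.3f}s errors={errors}")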

  20. Data collection and field experiments at the Apache Leap research site. Annual report, May 1995--1996

    International Nuclear Information System (INIS)

    Woodhouse, E.G.; Bassett, R.L.; Neuman, S.P.; Chen, G.

    1997-08-01

    This report documents the research performed during the period May 1995-May 1996 for a project of the U.S. Nuclear Regulatory Commission (sponsored under contract NRC-04-090-051) by the University of Arizona. The project manager for this research is Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this research were to examine hypotheses and test alternative conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. Each chapter in this report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. Topics include: crosshole pneumatic and gaseous tracer field and modeling experiments designed to help validate the applicability of continuum geostatistical and stochastic concepts, theories, models, and scaling relations relevant to unsaturated flow and transport in fractured porous tuffs; use of geochemistry and aquifer testing to evaluate fracture flow and perching mechanisms; investigations of ²³⁴U/²³⁸U fractionation to evaluate leaching selectivity; and transport and modeling of both conservative and non-conservative tracers

  1. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan [Chi-Mei Foundation Medical Center, Tainan (China)

    2009-10-15

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or a MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or a MELD score > 20 is predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population.

  2. The Effect of a Monocular Helmet-Mounted Display on Aircrew Health: A Longitudinal Cohort Study of Apache AH Mk 1 Pilots -(Vision and Handedness)

    Science.gov (United States)

    2015-05-19

    ... the day, night, and in adverse weather through the use of nose-mounted, forward-looking infrared (FLIR) pilotage and targeting sensors that provide ... sensor video and/or symbology to each crewmember via a helmet display unit (HDU). The HDU contains a 1-inch (in.) diameter cathode ray tube (CRT) ...

  3. Perform wordcount Map-Reduce Job in Single Node Apache Hadoop cluster and compress data using Lempel-Ziv-Oberhumer (LZO) algorithm

    OpenAIRE

    Mirajkar, Nandan; Bhujbal, Sandeep; Deshmukh, Aaradhana

    2013-01-01

    Applications like Yahoo, Facebook, and Twitter have huge data which has to be stored and retrieved as per client access. This huge data storage requires a huge database, leading to an increase in physical storage, and becomes complex for the analysis required for business growth. This storage capacity can be reduced and distributed processing of huge data can be done using Apache Hadoop, which uses the MapReduce algorithm and combines the repeating data so that the entire dataset is stored in a reduced format. The paper ...
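
    The word-count job itself is the canonical MapReduce example; one common way to express it from Python is as a Hadoop Streaming mapper/reducer pair, sketched below (the file name and invocation are illustrative, and LZO output compression would be enabled through the job's configuration rather than in this script). Hadoop Streaming sorts the mapper output by key, so the reducer sees each word's counts grouped together.

        #!/usr/bin/env python3
        # wordcount_streaming.py -- run as "mapper" or "reducer" under Hadoop Streaming.
        import sys

        def mapper():
            for line in sys.stdin:
                for word in line.split():
                    print(f"{word}\t1")          # emit word<TAB>1 per occurrence

        def reducer():
            current, total = None, 0
            for line in sys.stdin:
                word, count = line.rsplit("\t", 1)
                if word != current:              # keys arrive sorted and grouped
                    if current is not None:
                        print(f"{current}\t{total}")
                    current, total = word, 0
                total += int(count)
            if current is not None:
                print(f"{current}\t{total}")

        if __name__ == "__main__":
            mapper() if sys.argv[1] == "mapper" else reducer()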

  4. Validation of APACHE II scoring system at 24 hours after admission as a prognostic tool in urosepsis: A prospective observational study

    Directory of Open Access Journals (Sweden)

    Sundaramoorthy VijayGanapathy

    2017-11-01

    Purpose: Urosepsis implies clinically evident severe infection of the urinary tract with features of systemic inflammatory response syndrome (SIRS). We validate the role of a single Acute Physiology and Chronic Health Evaluation II (APACHE II) score at 24 hours after admission in predicting mortality in urosepsis. Materials and Methods: A prospective observational study was done in 178 patients admitted with urosepsis in the Department of Urology, in a tertiary care institute from January 2015 to August 2016. Patients >18 years diagnosed as urosepsis using SIRS criteria with positive urine or blood culture for bacteria were included. At 24 hours after admission to the intensive care unit, the APACHE II score was calculated using 12 physiological variables, age and chronic health. Results: Mean±standard deviation (SD) APACHE II score was 26.03±7.03. It was 24.31±6.48 in survivors and 32.39±5.09 in those expired (p<0.001). Among patients undergoing surgery, mean±SD score was higher (30.74±4.85) than among survivors (24.30±6.54) (p<0.001). Receiver operating characteristic (ROC) analysis revealed an area under curve (AUC) of 0.825 with cutoff 25.5 being 94.7% sensitive and 56.4% specific to predict mortality. Mean±SD score in those undergoing surgery was 25.22±6.70 and was lesser than those who did not undergo surgery (28.44±7.49) (p=0.007). ROC analysis revealed an AUC of 0.760 with cutoff 25.5 being 94.7% sensitive and 45.6% specific to predict mortality even after surgery. Conclusions: A single APACHE II score assessed at 24 hours after admission was able to predict morbidity, mortality, need for surgical intervention, length of hospitalization, treatment success and outcome in urosepsis patients.

  5. Validation of APACHE II scoring system at 24 hours after admission as a prognostic tool in urosepsis: A prospective observational study.

    Science.gov (United States)

    VijayGanapathy, Sundaramoorthy; Karthikeyan, VIlvapathy Senguttuvan; Sreenivas, Jayaram; Mallya, Ashwin; Keshavamurthy, Ramaiah

    2017-11-01

    Urosepsis implies clinically evident severe infection of urinary tract with features of systemic inflammatory response syndrome (SIRS). We validate the role of a single Acute Physiology and Chronic Health Evaluation II (APACHE II) score at 24 hours after admission in predicting mortality in urosepsis. A prospective observational study was done in 178 patients admitted with urosepsis in the Department of Urology, in a tertiary care institute from January 2015 to August 2016. Patients >18 years diagnosed as urosepsis using SIRS criteria with positive urine or blood culture for bacteria were included. At 24 hours after admission to intensive care unit, APACHE II score was calculated using 12 physiological variables, age and chronic health. Mean±standard deviation (SD) APACHE II score was 26.03±7.03. It was 24.31±6.48 in survivors and 32.39±5.09 in those expired (p<0.001). Among patients undergoing surgery, mean±SD score was higher (30.74±4.85) than among survivors (24.30±6.54) (p<0.001). Receiver operating characteristic (ROC) analysis revealed area under curve (AUC) of 0.825 with cutoff 25.5 being 94.7% sensitive and 56.4% specific to predict mortality. Mean±SD score in those undergoing surgery was 25.22±6.70 and was lesser than those who did not undergo surgery (28.44±7.49) (p=0.007). ROC analysis revealed AUC of 0.760 with cutoff 25.5 being 94.7% sensitive and 45.6% specific to predict mortality even after surgery. A single APACHE II score assessed at 24 hours after admission was able to predict morbidity, mortality, need for surgical intervention, length of hospitalization, treatment success and outcome in urosepsis patients.
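
    The reported cutoff statistics follow directly from the definitions of sensitivity and specificity; the sketch below (with invented scores and outcomes, not the study's data) shows how a single APACHE II cutoff yields them:

        import numpy as np

        def sens_spec(scores, died, cutoff):
            """Sensitivity = flagged deaths / all deaths;
            specificity = unflagged survivors / all survivors."""
            scores = np.asarray(scores)
            died = np.asarray(died, dtype=bool)
            flagged = scores > cutoff
            sens = (flagged & died).sum() / died.sum()
            spec = (~flagged & ~died).sum() / (~died).sum()
            return sens, spec

        # Invented example: ten patients' APACHE II scores and outcomes (1 = died).
        scores = [12, 18, 22, 24, 26, 27, 29, 31, 33, 36]
        died   = [ 0,  0,  0,  0,  0,  1,  0,  1,  1,  1]
        print(sens_spec(scores, died, cutoff=25.5))   # (1.0, ~0.67) for these data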

  6. ANALYSIS OF OIL-BEARING CRETACEOUS SANDSTONE HYDROCARBON RESERVOIRS, EXCLUSIVE OF THE DAKOTA SANDSTONE, ON THE JICARILLA APACHE INDIAN RESERVATION, NEW MEXICO

    International Nuclear Information System (INIS)

    Jennie Ridgley

    2000-01-01

    An additional 450 wells were added to the structural database; there are now 2550 wells in the database with corrected tops on the Juana Lopez, base of the Bridge Creek Limestone, and datum. This completes the structural database compilation. Fifteen oil and five gas fields from the Mancos-ElVado interval were evaluated with respect to the newly defined sequence stratigraphic model for this interval. The five gas fields are located away from the structural margins of the deep part of the San Juan Basin. All the fields have characteristics of basin-centered gas and can be considered as continuous gas accumulations as recently defined by the U.S. Geological Survey. Oil production occurs in thinly interbedded sandstone and shale or in discrete sandstone bodies. Production is both from transgressive and regressive strata as redefined in this study. Oil production is both stratigraphically and structurally controlled, with production occurring along the Chaco slope or in steeply west-dipping rocks along the east margin of the basin. The ElVado Sandstone of subsurface usage is redefined to encompass a narrower interval; it appears to be more time correlative with the Dalton Sandstone. Thus, it was deposited as part of a regressive sequence, in contrast to the underlying rock units which were deposited during transgression

  7. ANALYSIS OF OIL-BEARING CRETACEOUS SANDSTONE HYDROCARBON RESERVOIRS, EXCLUSIVE OF THE DAKOTA SANDSTONE, ON THE JICARILLA APACHE INDIAN RESERVATION, NEW MEXICO

    International Nuclear Information System (INIS)

    Jennie Ridgley

    2000-01-01

    A goal of the Mesaverde project was to better define the depositional system of the Mesaverde in hopes that it would provide insight into new or by-passed targets for oil exploration. The new, detailed studies of the Mesaverde give us a better understanding of the lateral variability in depositional environments and facies. Recognition of this lateral variability and establishment of the criteria for separating deltaic, strandplain-barrier, and estuarine deposits from each other permit development of better hydrocarbon exploration models, because the sandstone geometry differs in each depositional system. Although these insights will provide better exploration models for gas exploration, it does not appear that they will be instrumental in finding more oil. Oil in the Mesaverde Group is produced from isolated fields on the Chaco slope; only a few wells define each field. Production is from sandstone beds in the upper part of the Point Lookout Sandstone or from individual fluvial channel sandstones in the Menefee. Stratigraphic traps rather than structural traps are more important. Source of the oil in the Menefee and Point Lookout may be from interbedded organic-rich mudstones or coals rather than from the Lewis Shale. The Lewis Shale appears to contain more type III organic matter and, hence, should produce mainly gas. Outcrop studies have not documented oil staining that might point to past oil migration through the sandstones of the Mesaverde. The lack of oil production may be related to the following: (1) lack of abundant organic matter of the type I or II variety in the Lewis Shale needed to produce oil, (2) ineffective migration pathways due to discontinuities in sandstone reservoir geometries, (3) cementation or early formation of gas prior to oil generation that reduced effective permeabilities and served as barriers to updip migration of oil, or (4) erosion of oil-bearing reservoirs from the southern part of the basin. Any new production should mimic that of the past, i.e. be confined to small fields in isolated sandstone beds

  8. Application of Advanced Exploration Technologies for the Development of Mancos Formation Oil Reservoirs, Jicarilla Apache Indian Nation, San Juan Basin, New Mexico

    International Nuclear Information System (INIS)

    Reeves, Scott; Billingsley, Randy

    2002-01-01

    The objectives of this project are to: (1) develop an exploration rationale for the Mancos shale in the north-eastern San Juan basin; (2) assess the regional prospectivity of the Mancos in the northern Nation lands based on that rationale; (3) identify specific leads in the northern Nation as appropriate; (4) forecast pro-forma production, reserves and economics for any leads identified; and (5) package and disseminate the results to attract investment in Mancos development on the Nation lands

  9. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    International Nuclear Information System (INIS)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan

    2009-01-01

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or a MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or a MELD score > 20 is predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population

  10. Monte Carlo tree search strategies

    OpenAIRE

    VODOPIVEC, TOM

    2018-01-01

    After the breakthrough in the game of Go, Monte Carlo tree search (MCTS) methods triggered rapid progress in game-playing agents: the research community has since developed many variants and improvements of the MCTS algorithm, thereby advancing artificial intelligence not only in games but also in numerous other domains. Although MCTS methods combine the generality of random sampling with the precision of tree search, in practice they can suffer from slow conv...
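
    For context (a generic sketch, not the thesis's code): the selection rule at the heart of most MCTS variants is UCT/UCB1, which balances a child's average reward against an exploration bonus that shrinks with repeated visits.

        import math

        def uct_select(children, c=1.4142):
            """Pick the child maximizing UCB1: mean reward plus exploration bonus."""
            total = sum(ch["visits"] for ch in children)

            def ucb1(ch):
                if ch["visits"] == 0:
                    return float("inf")      # always try unvisited moves first
                mean = ch["wins"] / ch["visits"]
                return mean + c * math.sqrt(math.log(total) / ch["visits"])

            return max(children, key=ucb1)

        # One MCTS iteration would select down the tree with uct_select, expand a
        # leaf, run a random playout, and back the result up along the path.
        children = [{"move": "a", "wins": 6, "visits": 10},
                    {"move": "b", "wins": 3, "visits": 4},
                    {"move": "c", "wins": 0, "visits": 0}]
        print(uct_select(children)["move"])  # "c": unvisited, so explored first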

  11. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    Author Affiliations. K B Athreya1 Mohan Delampady2 T Krishnan3. School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA; Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560059, India. Systat Software Asia-Pacific Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, ...

  12. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    K B Athreya1 Mohan Delampady2 T Krishnan3. School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA. Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560 059, India. Systat Software Asia-Pacific Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, India.

  13. Is Monte Carlo embarrassingly parallel?

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J. E. [Delft Univ. of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Delft Nuclear Consultancy, IJsselzoom 2, 2902 LB Capelle aan den IJssel (Netherlands)

    2012-07-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
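
    The cycle-end rendezvous described here is easy to reproduce in miniature. The sketch below (illustrative Python, using multiprocessing rather than MPI) scores batches of histories in parallel but must gather every worker's tally between cycles, which is exactly the synchronization point the paper analyzes.

        import multiprocessing as mp
        import random

        def run_batch(args):
            """Score one batch of histories; here a toy hit-or-miss tally."""
            seed, n = args
            rng = random.Random(seed)
            return sum(rng.random() ** 2 + rng.random() ** 2 < 1.0 for _ in range(n))

        if __name__ == "__main__":
            n_cycles, n_workers, per_batch = 10, 4, 100_000
            tally = 0
            with mp.Pool(n_workers) as pool:
                for cycle in range(n_cycles):
                    seeds = [(cycle * n_workers + w, per_batch) for w in range(n_workers)]
                    # pool.map is the rendezvous: every worker must finish before
                    # the cycle's tallies can be combined (cf. fission-source sync).
                    tally += sum(pool.map(run_batch, seeds))
            total = n_cycles * n_workers * per_batch
            print("pi ~", 4 * tally / total)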

  14. Is Monte Carlo embarrassingly parallel?

    International Nuclear Information System (INIS)

    Hoogenboom, J. E.

    2012-01-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)

  15. Exact Monte Carlo for molecules

    International Nuclear Information System (INIS)

    Lester, W.A. Jr.; Reynolds, P.J.

    1985-03-01

    A brief summary of the fixed-node quantum Monte Carlo method is presented. Results obtained for binding energies, the classical barrier height for H + H₂, and the singlet-triplet splitting in methylene are presented and discussed. 17 refs

  16. Leadership Preferences of Indian and Non-Indian Athletes.

    Science.gov (United States)

    Malloy, D. C.; Nilson, R. N.

    1991-01-01

    Among 86 Indian and non-Indian volleyball competitors, non-Indian players indicated significantly greater preferences for leadership that involved democratic behavior, autocratic behavior, or social support. Indians may adapt their behavior by participating in non-Indian games, without changing their traditional value orientations. Contains 22…

  17. Monte Carlo - Advances and Challenges

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.

    2008-01-01

    Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k_eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β_eff, l_eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but bring new challenges to both developers and users: convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature
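
    As a generic illustration of the k_eff calculation mentioned (not the workshop's material): k_eff is the dominant eigenvalue of the fission operator, which repeated source-iteration cycles with population control recover. A small made-up matrix stands in for transport.

        import numpy as np

        # Toy 3-region "fission matrix" F: F[i, j] = expected fission neutrons
        # born in region i per fission neutron born in region j (invented numbers).
        F = np.array([[0.5, 0.2, 0.0],
                      [0.3, 0.6, 0.3],
                      [0.0, 0.2, 0.5]])

        source = np.ones(3) / 3          # initial fission source guess
        for cycle in range(200):         # each cycle mimics one MC generation
            new = F @ source
            k_eff = new.sum() / source.sum()
            source = new / new.sum()     # population control: renormalize the source
        print(k_eff, "vs", np.abs(np.linalg.eigvals(F)).max())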

  18. OVERVIEW OF THE SDSS-IV MaNGA SURVEY: MAPPING NEARBY GALAXIES AT APACHE POINT OBSERVATORY

    Energy Technology Data Exchange (ETDEWEB)

    Bundy, Kevin [Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU, WPI), Todai Institutes for Advanced Study, the University of Tokyo, Kashiwa 277-8583 (Japan); Bershady, Matthew A.; Wake, David A.; Tremonti, Christy; Diamond-Stanic, Aleksandar M. [Department of Astronomy, University of Wisconsin-Madison, 475 North Charter Street, Madison, WI 53706 (United States); Law, David R.; Cherinka, Brian [Dunlap Institute for Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, Ontario M5S 3H4 (Canada); Yan, Renbin; Sánchez-Gallego, José R. [Department of Physics and Astronomy, University of Kentucky, 505 Rose Street, Lexington, KY 40506-0055 (United States); Drory, Niv [McDonald Observatory, Department of Astronomy, University of Texas at Austin, 1 University Station, Austin, TX 78712-0259 (United States); MacDonald, Nicholas [Department of Astronomy, Box 351580, University of Washington, Seattle, WA 98195 (United States); Weijmans, Anne-Marie [School of Physics and Astronomy, University of St Andrews, North Haugh, St Andrews KY16 9SS (United Kingdom); Thomas, Daniel; Masters, Karen; Coccato, Lodovico [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth (United Kingdom); Aragón-Salamanca, Alfonso [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Avila-Reese, Vladimir [Instituto de Astronomia, Universidad Nacional Autonoma de Mexico, A.P. 70-264, 04510 Mexico D.F. (Mexico); Badenes, Carles [Department of Physics and Astronomy and Pittsburgh Particle Physics, Astrophysics and Cosmology Center (PITT PACC), University of Pittsburgh, 3941 O'Hara St, Pittsburgh, PA 15260 (United States); Falcón-Barroso, Jésus [Instituto de Astrofísica de Canarias, E-38200 La Laguna, Tenerife (Spain); Belfiore, Francesco [Cavendish Laboratory, University of Cambridge, 19 J. J. Thomson Avenue, Cambridge CB3 0HE (United Kingdom); and others

    2015-01-01

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic structure and composition of gas and stars in an unprecedented sample of 10,000 nearby galaxies. We summarize essential characteristics of the instrument and survey design in the context of MaNGA's key science goals and present prototype observations to demonstrate MaNGA's scientific potential. MaNGA employs dithered observations with 17 fiber-bundle integral field units that vary in diameter from 12'' (19 fibers) to 32'' (127 fibers). Two dual-channel spectrographs provide simultaneous wavelength coverage over 3600-10300 Å at R ∼ 2000. With a typical integration time of 3 hr, MaNGA reaches a target r-band signal-to-noise ratio of 4-8 (Å⁻¹ per 2'' fiber) at 23 AB mag arcsec⁻², which is typical for the outskirts of MaNGA galaxies. Targets are selected with M* ≳ 10⁹ M☉ using SDSS-I redshifts and i-band luminosity to achieve uniform radial coverage in terms of the effective radius, an approximately flat distribution in stellar mass, and a sample spanning a wide range of environments. Analysis of our prototype observations demonstrates MaNGA's ability to probe gas ionization, shed light on recent star formation and quenching, enable dynamical modeling, decompose constituent components, and map the composition of stellar populations. MaNGA's spatially resolved spectra will enable an unprecedented study of the astrophysics of nearby galaxies in the coming 6 yr.

  19. OVERVIEW OF THE SDSS-IV MaNGA SURVEY: MAPPING NEARBY GALAXIES AT APACHE POINT OBSERVATORY

    International Nuclear Information System (INIS)

    Bundy, Kevin; Bershady, Matthew A.; Wake, David A.; Tremonti, Christy; Diamond-Stanic, Aleksandar M.; Law, David R.; Cherinka, Brian; Yan, Renbin; Sánchez-Gallego, José R.; Drory, Niv; MacDonald, Nicholas; Weijmans, Anne-Marie; Thomas, Daniel; Masters, Karen; Coccato, Lodovico; Aragón-Salamanca, Alfonso; Avila-Reese, Vladimir; Badenes, Carles; Falcón-Barroso, Jésus; Belfiore, Francesco

    2015-01-01

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic structure and composition of gas and stars in an unprecedented sample of 10,000 nearby galaxies. We summarize essential characteristics of the instrument and survey design in the context of MaNGA's key science goals and present prototype observations to demonstrate MaNGA's scientific potential. MaNGA employs dithered observations with 17 fiber-bundle integral field units that vary in diameter from 12'' (19 fibers) to 32'' (127 fibers). Two dual-channel spectrographs provide simultaneous wavelength coverage over 3600-10300 Å at R ∼ 2000. With a typical integration time of 3 hr, MaNGA reaches a target r-band signal-to-noise ratio of 4-8 (Å⁻¹ per 2'' fiber) at 23 AB mag arcsec⁻², which is typical for the outskirts of MaNGA galaxies. Targets are selected with M* ≳ 10⁹ M☉ using SDSS-I redshifts and i-band luminosity to achieve uniform radial coverage in terms of the effective radius, an approximately flat distribution in stellar mass, and a sample spanning a wide range of environments. Analysis of our prototype observations demonstrates MaNGA's ability to probe gas ionization, shed light on recent star formation and quenching, enable dynamical modeling, decompose constituent components, and map the composition of stellar populations. MaNGA's spatially resolved spectra will enable an unprecedented study of the astrophysics of nearby galaxies in the coming 6 yr.

  20. 75 FR 61511 - Indian Gaming

    Science.gov (United States)

    2010-10-05

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs.... FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the.... SUPPLEMENTARY INFORMATION: Under section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA), Public Law 100...

  1. 76 FR 42722 - Indian Gaming

    Science.gov (United States)

    2011-07-19

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs... Date: July 19, 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming... INFORMATION: Under section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA), Public Law 100-497, 25 U.S.C...

  2. 75 FR 38834 - Indian Gaming

    Science.gov (United States)

    2010-07-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs...: July 6, 2010. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office...-4066. SUPPLEMENTARY INFORMATION: Under Section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA...

  3. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Editorial Board. Sadhana. Editor. N Viswanadham, Indian Institute of Science, Bengaluru. Senior Associate Editors. Arakeri J H, Indian Institute of Science, Bengaluru Hari K V S, Indian Institute of Science, Bengaluru Mujumdar P P, Indian Institute of Science, Bengaluru Manoj Kumar Tiwari, Indian Institute of Technology, ...

  4. (U) Introduction to Monte Carlo Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
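
    The "cook book" framing invites a concrete example. Below is a minimal, self-contained sketch of the core mechanics of an analog Monte Carlo transport code: sample a free-flight distance, check for leakage, then sample absorption versus scattering. The one-speed slab geometry and all cross-section values are assumptions for illustration, not taken from the report.

```python
import math
import random

# Minimal analog Monte Carlo for one-speed particle transport through a 1D slab.
# All values (cross sections, slab width) are illustrative assumptions.
SIGMA_T = 1.0   # total macroscopic cross section (1/cm), assumed
SIGMA_A = 0.3   # absorption cross section (1/cm), assumed
WIDTH = 5.0     # slab thickness (cm), assumed

def history():
    """Follow one particle; return its fate."""
    x, mu = 0.0, 1.0                                      # born on the left face, moving right
    while True:
        x += mu * (-math.log(random.random()) / SIGMA_T)  # sample free-flight distance
        if x >= WIDTH:
            return "transmit"                             # leaked out the back
        if x < 0.0:
            return "reflect"                              # leaked out the front
        if random.random() < SIGMA_A / SIGMA_T:
            return "absorb"                               # collision: absorption
        mu = 2.0 * random.random() - 1.0                  # collision: isotropic scatter

n = 100_000
fates = [history() for _ in range(n)]
print({f: fates.count(f) / n for f in ("transmit", "reflect", "absorb")})
```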

  5. Isotopic depletion with Monte Carlo

    International Nuclear Information System (INIS)

    Martin, W.R.; Rathkopf, J.A.

    1996-06-01

    This work considers a method to deplete isotopes during a time-dependent Monte Carlo simulation of an evolving system. The method is based on explicitly combining a conventional estimator for the scalar flux with the analytical solutions to the isotopic depletion equations. There are no auxiliary calculations; the method is an integral part of the Monte Carlo calculation. The method eliminates negative densities and reduces the variance in the estimates for the isotope densities, compared to existing methods. Moreover, existing methods are shown to be special cases of the general method described in this work, as they can be derived by combining a high variance estimator for the scalar flux with a low-order approximation to the analytical solution to the depletion equation.
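
    To make the idea concrete, here is a hedged, one-nuclide toy of the approach described: each time step, a (here simulated) Monte Carlo flux estimate is plugged into the analytical solution N(t+Δt) = N(t)·exp(−σφΔt) of the depletion equation dN/dt = −σφN, so densities can never go negative. The flux estimator, cross section and time step are made-up stand-ins, not the paper's estimator.

```python
import math
import random

# One-nuclide toy of depletion with an analytical step solution:
# N(t+dt) = N(t) * exp(-sigma * phi * dt), where phi comes from a (faked)
# Monte Carlo flux estimator. All numbers are illustrative assumptions.
sigma = 1.0e-24   # microscopic cross section (cm^2), assumed
N = 1.0e22        # initial number density (1/cm^3), assumed
dt = 86_400.0     # time step (s), assumed

def mc_flux_estimate(n_hist=10_000, true_phi=1.0e14):
    # Stand-in for a track-length estimator: unbiased but statistically noisy.
    return true_phi * sum(random.expovariate(1.0) for _ in range(n_hist)) / n_hist

for step in range(5):
    phi = mc_flux_estimate()
    N *= math.exp(-sigma * phi * dt)   # analytical depletion over the step; never negative
    print(f"step {step}: phi = {phi:.3e}, N = {N:.4e}")
```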

  6. Monte Carlo Methods in ICF

    Science.gov (United States)

    Zimmerman, George B.

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials.

  7. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, G.B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials. copyright 1997 American Institute of Physics

  8. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, George B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials

  9. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.; Dean, D.J.; Langanke, K.

    1997-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo (SMMC) methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, the thermal and rotational behavior of rare-earth and γ-soft nuclei, and the calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. (orig.)

  10. A contribution Monte Carlo method

    International Nuclear Information System (INIS)

    Aboughantous, C.H.

    1994-01-01

    A Contribution Monte Carlo method is developed and successfully applied to a sample deep-penetration shielding problem. The random walk is simulated in most of its parts as in conventional Monte Carlo methods. The probability density functions (pdf's) are expressed in terms of spherical harmonics and are continuous functions in the direction cosine and azimuthal angle variables as well as in the position coordinates; the energy is discretized in the multigroup approximation. The transport pdf is an unusual exponential kernel strongly dependent on the incident and emergent directions and energies and on the position of the collision site. The method produces the same results obtained with the deterministic method, with a very small standard deviation, with as few as 1,000 Contribution particles in both analog and nonabsorption biasing modes and with only a few minutes of CPU time.

  11. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.

    1996-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, the thermal behavior of γ-soft nuclei, and the calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs

  12. About | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    The 82nd Annual Meeting of the Indian Academy of Sciences is being held at ... by newly elected Fellows and Associates over a wide range of scientific topics. ... Indian Institute of Science Education and Research (IISER), Bhopal: Indian ...

  13. Parallel Monte Carlo reactor neutronics

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Brown, F.B.

    1994-01-01

    The issues affecting implementation of parallel algorithms for large-scale engineering Monte Carlo neutron transport simulations are discussed. For nuclear reactor calculations, these include load balancing, recoding effort, reproducibility, domain decomposition techniques, I/O minimization, and strategies for different parallel architectures. Two codes were parallelized and tested for performance. The architectures employed include SIMD, MIMD-distributed memory, and workstation network with uneven interactive load. Speedups linear with the number of nodes were achieved
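
    As an illustration of the issues listed above (reproducibility in particular), the sketch below parallelizes independent particle histories across worker processes, assigning each task a fixed seed so the answer does not depend on scheduling. The "physics" is a trivial stand-in; the structure, not the model, is the point.

```python
import math
import random
from multiprocessing import Pool

# Independent Monte Carlo histories spread over worker processes. Each task
# carries its own fixed seed, so the result is reproducible regardless of how
# the tasks are scheduled. The "physics" is a trivial stand-in: count particles
# whose sampled free path exceeds an absorber of optical depth 2.
def batch(args):
    seed, n = args
    rng = random.Random(seed)                      # per-task stream => reproducible
    return sum(1 for _ in range(n) if -math.log(rng.random()) > 2.0)

if __name__ == "__main__":
    tasks = [(seed, 50_000) for seed in range(8)]  # equal chunks for load balance
    with Pool(4) as pool:
        hits = sum(pool.map(batch, tasks))
    total = sum(n for _, n in tasks)
    print("transmission:", hits / total, " exact:", math.exp(-2.0))
```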

  14. Elements of Monte Carlo techniques

    International Nuclear Information System (INIS)

    Nagarajan, P.S.

    2000-01-01

    The Monte Carlo method essentially mimics real-world physical processes at the microscopic level. With the incredible increase in computing speeds and ever-decreasing computing costs, the method is in widespread use for practical problems. Topics covered include algorithm-generated sequences known as pseudo-random sequences (prs), probability density functions (pdf), tests for randomness, extension to multidimensional integration, etc.
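
    A minimal example of the last topic, Monte Carlo multidimensional integration driven by a seeded pseudo-random sequence: estimate the integral of f(x, y, z) = xyz over the unit cube, whose exact value is 1/8. The integrand and sample count are chosen only for illustration.

```python
import random

# Monte Carlo estimate of a 3-dimensional integral over the unit cube:
# f(x, y, z) = x*y*z, whose exact integral is 1/8. A seeded pseudo-random
# sequence makes the run reproducible.
random.seed(12345)

def f(x, y, z):
    return x * y * z

n = 1_000_000
total = sum(f(random.random(), random.random(), random.random()) for _ in range(n))
print("estimate:", total / n, " exact:", 1 / 8)
```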

  15. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. Giles proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL⁻³) for a single-level version of the adaptive algorithm to O((TOL⁻¹ log(TOL))²).
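
    For orientation, the sketch below implements the plain, non-adaptive multilevel forward Euler Monte Carlo method of Giles that this work generalizes: coupled coarse/fine Euler paths on uniform time grids, summed over levels. The SDE (geometric Brownian motion) and its parameters are assumptions for illustration; the paper's contribution, adaptive path-dependent time stepping, is not reproduced here.

```python
import math
import random

# Non-adaptive multilevel forward Euler Monte Carlo (the method this work
# generalizes), applied to dS = mu*S dt + sig*S dW with made-up parameters,
# estimating E[S(T)] (exactly S0*exp(mu*T) for geometric Brownian motion).
mu, sig, S0, T = 0.05, 0.2, 1.0, 1.0

def level_sample(level, M=2):
    """One coupled sample: fine path on M**level steps, coarse on M**(level-1)."""
    nf = M ** level
    dt = T / nf
    Sf = Sc = S0
    dW_acc = 0.0
    for i in range(nf):
        dW = random.gauss(0.0, math.sqrt(dt))
        Sf += mu * Sf * dt + sig * Sf * dW                    # fine Euler step
        dW_acc += dW
        if level > 0 and (i + 1) % M == 0:
            Sc += mu * Sc * (M * dt) + sig * Sc * dW_acc      # matching coarse step
            dW_acc = 0.0
    return Sf - (Sc if level > 0 else 0.0)                    # level-0 term is the plain payoff

L, N = 4, 20_000
estimate = sum(sum(level_sample(l) for _ in range(N)) / N for l in range(L + 1))
print("MLMC estimate:", estimate, " exact:", S0 * math.exp(mu * T))
```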

  16. Geometrical splitting in Monte Carlo

    International Nuclear Information System (INIS)

    Dubi, A.; Elperin, T.; Dudziak, D.J.

    1982-01-01

    A statistical model is presented from which a direct statistical approach yields an analytic expression for the second moment, the variance ratio, and the benefit function in a model of an n-surface-splitting Monte Carlo game. In addition to the insight into the dependence of the second moment on the splitting parameters, the main importance of the expressions developed lies in their potential to become a basis for in-code optimization of splitting through a general algorithm. Refs
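
    A minimal sketch of the splitting game being analyzed, under stated assumptions: in a 1D slab, a particle that reaches an internal surface while moving deeper is split into copies carrying equal shares of its weight, which boosts the population in the rarely reached deep region without biasing the transmission tally. Russian roulette on backward crossings is omitted, and all numerical values are illustrative.

```python
import math
import random

# Surface splitting in a 1D slab: a particle reaching an internal surface while
# moving deeper splits into SPLIT copies, each with 1/SPLIT of the weight.
# Russian roulette on backward crossings is omitted for brevity; all numbers
# are illustrative assumptions.
SIGMA_T, SIGMA_A, WIDTH = 1.0, 0.5, 8.0
SURFACES = [2.0, 4.0, 6.0]   # splitting surfaces, assumed
SPLIT = 2

def track(x, mu, w, nxt, tally):
    while True:
        step = -math.log(random.random()) / SIGMA_T          # flight distance
        if mu > 0 and nxt < len(SURFACES) and (SURFACES[nxt] - x) / mu < step:
            x = SURFACES[nxt]                                # stop at the surface
            for _ in range(SPLIT - 1):                       # spawn the extra copies
                track(x, mu, w / SPLIT, nxt + 1, tally)
            w /= SPLIT
            nxt += 1
            continue                                         # re-sample flight (memoryless)
        x += mu * step
        if x >= WIDTH:
            tally[0] += w                                    # weighted transmission score
            return
        if x < 0.0:
            return
        if random.random() < SIGMA_A / SIGMA_T:
            return                                           # absorbed
        mu = 2.0 * random.random() - 1.0                     # isotropic scatter

tally, n = [0.0], 20_000
for _ in range(n):
    track(0.0, 1.0, 1.0, 0, tally)
print("transmission estimate:", tally[0] / n)
```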

  17. Extending canonical Monte Carlo methods

    International Nuclear Information System (INIS)

    Velazquez, L; Curilef, S

    2010-01-01

    In this paper, we discuss the implications of a recently obtained equilibrium fluctuation-dissipation relation for the extension of the available Monte Carlo methods on the basis of the consideration of the Gibbs canonical ensemble to account for the existence of an anomalous regime with negative heat capacities C α with α≈0.2 for the particular case of the 2D ten-state Potts model

  18. Indianization of psychiatry utilizing Indian mental concepts

    Science.gov (United States)

    Avasthi, Ajit; Kate, Natasha; Grover, Sandeep

    2013-01-01

    Most of the psychiatry practice in India is guided by the western concepts of mental health and illness, which have largely ignored the role of religion, family, eastern philosophy, and medicine in understanding and managing psychiatric disorders. India comprises diverse cultures, languages, ethnicities, and religious affiliations. However, besides these diversities, there are certain commonalities, which include Hinduism as a religion spread across the country, the traditional family system, the ancient Indian system of medicine, and an emphasis on the use of traditional methods like Yoga and Meditation for controlling the mind. This article discusses how mind and mental health are understood from the point of view of Hinduism, Indian traditions, and Indian systems of medicine. Further, the article focuses on how these Indian concepts can be incorporated into the practice of contemporary psychiatry. PMID:23858244

  19. Non statistical Monte-Carlo

    International Nuclear Information System (INIS)

    Mercier, B.

    1985-04-01

    We have shown that the transport equation can be solved with particles, as in the Monte Carlo method, but without random numbers. In the Monte Carlo method, particles are created from the source, and are followed from collision to collision until either they are absorbed or they leave the spatial domain. In our method, particles are created from the original source, with a variable weight taking into account both collision and absorption. These particles are followed until they leave the spatial domain, and we use them to determine a first collision source. Another set of particles is then created from this first collision source, and tracked to determine a second collision source, and so on. This process introduces an approximation which does not exist in the Monte Carlo method. However, we have analyzed the effect of this approximation, and shown that it can be limited. Our method is deterministic and gives reproducible results. Furthermore, when extra accuracy is needed in some region, it is easier to get more particles to go there. It has the same kinds of applications, but is better suited to problems where streaming is dominant than to collision-dominated problems.

  20. BREM5 electroweak Monte Carlo

    International Nuclear Information System (INIS)

    Kennedy, D.C. II.

    1987-01-01

    This is an update on the progress of the BREMMUS Monte Carlo simulator, particularly in its current incarnation, BREM5. The present report is intended only as a follow-up to the Mark II/Granlibakken proceedings, and those proceedings should be consulted for a complete description of the capabilities and goals of the BREMMUS program. The new BREM5 program improves on the previous version of BREMMUS, BREM2, in a number of important ways. In BREM2, the internal loop (oblique) corrections were not treated in consistent fashion, a deficiency that led to renormalization scheme-dependence; i.e., physical results, such as cross sections, were dependent on the method used to eliminate infinities from the theory. Of course, this problem cannot be tolerated in a Monte Carlo designed for experimental use. BREM5 incorporates a new way of treating the oblique corrections, as explained in the Granlibakken proceedings, that guarantees renormalization scheme-independence and dramatically simplifies the organization and calculation of radiative corrections. This technique is to be presented in full detail in a forthcoming paper. BREM5 is, at this point, the only Monte Carlo to contain the entire set of one-loop corrections to electroweak four-fermion processes and renormalization scheme-independence. 3 figures

  1. Western Indian Ocean Journal of Marine Science - Vol 11, No 1 (2012)

    African Journals Online (AJOL)

    Using an ecosystem model to evaluate fisheries management options to mitigate climate change impacts in western Indian Ocean coral reefs · Carlos Ruiz Sebastián, Tim R. McClanahan, 77-86 ...

  2. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances in computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver. Note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper works towards an understanding of the statistical implications in Monte Carlo depletions, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to its statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) to increase the number of individual Monte Carlo histories; 2) to increase the number of time steps; 3) to run additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors, including both the local statistical error and the propagated statistical error. (authors)
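
    The batch depletion idea lends itself to a short sketch: run several independent Monte Carlo depletion cases with different seeds and take the spread of the final number densities as the overall statistical error, propagated noise included. The one-nuclide depletion model and 5% flux noise below are toy assumptions, not CASMO-5 output.

```python
import math
import random
import statistics

# Batch depletion in miniature: several independent depletion cases, each with
# its own seed, whose spread estimates the overall statistical error (local
# plus propagated). The one-nuclide model and noise level are toy assumptions.
sigma, dt, steps = 1.0e-24, 86_400.0, 10

def one_case(seed):
    rng = random.Random(seed)
    N = 1.0e22
    for _ in range(steps):
        phi = 1.0e14 * (1.0 + 0.05 * rng.gauss(0.0, 1.0))  # noisy MC flux estimate
        N *= math.exp(-sigma * phi * dt)
    return N

finals = [one_case(seed) for seed in range(25)]            # 25 independent batches
print("mean final density:", statistics.mean(finals))
print("overall statistical error (std dev):", statistics.stdev(finals))
```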

  3. Learning Apache Mahout

    CERN Document Server

    Tiwary, Chandramani

    2015-01-01

    If you are a Java developer and want to use Mahout and machine learning to solve Big Data analytics use cases, then this book is for you. Familiarity with shell scripts is assumed, but no prior experience is required.

  4. Apache Accumulo for developers

    CERN Document Server

    Halldórsson, Guðmundur Jón

    2013-01-01

    The book takes a tutorial-based approach, showing readers how to build an Accumulo cluster from scratch and how to monitor the system and implement aspects such as security. This book is great for developers new to Accumulo who are looking to get a good grounding in how to use it. It is assumed that you have an understanding of how Hadoop works, both HDFS and Map/Reduce. No prior knowledge of ZooKeeper is assumed.

  5. Apache hive essentials

    CERN Document Server

    Du, Dayong

    2015-01-01

    If you are a data analyst, developer, or simply someone who wants to use Hive to explore and analyze data in Hadoop, this is the book for you. Whether you are new to big data or an expert, with this book, you will be able to master both the basic and the advanced features of Hive. Since Hive is an SQL-like language, some previous experience with the SQL language and databases is useful to have a better understanding of this book.

  6. Learning Apache Mahout classification

    CERN Document Server

    Gupta, Ashish

    2015-01-01

    If you are a data scientist who has some experience with the Hadoop ecosystem and machine learning methods and want to try out classification on large datasets using Mahout, this book is ideal for you. Knowledge of Java is essential.

  7. Apache Maven dependency management

    CERN Document Server

    Lalou, Jonathan

    2013-01-01

    An easy-to-follow, tutorial-based guide with chapters progressing from basic to advanced dependency management. If you are working with Java or Java EE projects and you want to take advantage of Maven dependency management, then this book is ideal for you. This book is also particularly useful if you are a developer or an architect. You should be well versed with Maven and its basic functionalities if you wish to get the most out of this book.

  8. Mastering Apache Maven 3

    CERN Document Server

    Siriwardena, Prabath

    2014-01-01

    If you are working with Java or Java EE projects and you want to take full advantage of Maven in designing, executing, and maintaining your build system for optimal developer productivity, then this book is ideal for you. You should be well versed with Maven and its basic functionality if you wish to get the most out of the book.

  9. Learning Apache Cassandra

    CERN Document Server

    Brown, Mat

    2015-01-01

    If you're an application developer familiar with SQL databases such as MySQL or Postgres, and you want to explore distributed databases such as Cassandra, this is the perfect guide for you. Even if you've never worked with a distributed database before, Cassandra's intuitive programming interface coupled with the step-by-step examples in this book will have you building highly scalable persistence layers for your applications in no time.

  10. Apache Solr PHP integration

    CERN Document Server

    Kumar, Jayant

    2013-01-01

    This book is full of step-by-step example-oriented tutorials which will show readers how to integrate Solr in PHP applications using the available libraries, and boost the inherent search facilities that Solr offers.If you are a developer who knows PHP and is interested in integrating search into your applications, this is the book for you. No advanced knowledge of Solr is required. Very basic knowledge of system commands and the command-line interface on both Linux and Windows is required. You should also be familiar with the concept of Web servers.

  11. Associateship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Dept. of Electrical Engineering, Indian Institute of Technology, Kandi, ... Specialization: Elementary Particle Physics Address during Associateship: Centre for Theoretical Studies, Indian Institute of Science, Bangalore 560 012.

  12. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Director, Indian Institute of Science Education & Research, .... Address: Visiting Professor, CORAL, Indian Institute of Technology, ..... Specialization: Elementary Particles & High Energy Physics, Plasma Physics and Atomic Physics

  13. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Department of Chemistry, Indian Institute of Technology, Powai, Mumbai .... Address: Emeritus Professor, National Institute of Advanced Studies, Indian .... Specialization: High Energy & Elementary Particle Physics, Supersymmetric ...

  14. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to a rapid increase in the volume, rate of generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescales, larger numbers of attributes, and statistically significant data sizes. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model the long-term effects of brain injuries, and provide new insights into the dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volumes of data, which makes it difficult for researchers to effectively leverage the available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and the Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability - the ability to efficiently process increasing volumes of data; (b) Adaptability - the toolkit can be deployed across different computing configurations; and (c) Ease of programming - the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 data nodes. The evaluation results demonstrate that the toolkit

  15. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    ... considerable difference between procedural programming and the object-oriented PHP language, on the middle layer of the three-tier web architecture. Also, research concerning the comparison of the relational database system MySQL and the NoSQL key-value store system Apache Cassandra, on the database layer.

  16. Predictive values of urine paraquat concentration, dose of poison, arterial blood lactate and APACHE II score in the prognosis of patients with acute paraquat poisoning.

    Science.gov (United States)

    Liu, Xiao-Wei; Ma, Tao; Li, Lu-Lu; Qu, Bo; Liu, Zhi

    2017-07-01

    The present study investigated the predictive values of urine paraquat (PQ) concentration, dose of poison, arterial blood lactate and Acute Physiology and Chronic Health Evaluation (APACHE) II score in the prognosis of patients with acute PQ poisoning. A total of 194 patients with acute PQ poisoning, hospitalized between April 2012 and January 2014 at the First Affiliated Hospital of China Medical University (Shenyang, China), were selected and divided into survival and mortality groups. Logistic regression analysis, receiver operator characteristic (ROC) curve analysis and Kaplan-Meier curves were applied to evaluate the values of urine PQ concentration, dose of poison, arterial blood lactate and APACHE II score for predicting the prognosis of patients with acute PQ poisoning. Initial urine PQ concentration (C0), dose of poison, arterial blood lactate and APACHE II score of patients in the mortality group were significantly higher compared with the survival group (all P<0.05). Urine PQ concentration, dose of poison and arterial blood lactate correlated with the mortality risk of acute PQ poisoning (all P<0.05). The areas under the ROC curve (AUC) of C0, dose of poison, arterial blood lactate and APACHE II score in predicting the mortality of patients within 28 days were 0.921, 0.887, 0.808 and 0.648, respectively. The AUC values of C0 for predicting early and delayed mortality were 0.890 and 0.764, respectively. The AUC values of urine paraquat concentration on the day after poisoning (Csec) and the rebound rate of urine paraquat concentration in predicting the mortality of patients within 28 days were 0.919 and 0.805, respectively. The 28-day survival rate of patients with C0 ≤32.2 µg/ml (42/71; 59.2%) was significantly higher when compared with patients with C0 >32.2 µg/ml (38/123; 30.9%). These results suggest that the initial urine PQ concentration may be the optimal index for predicting the prognosis of patients with acute PQ poisoning. Additionally, dose of poison, arterial blood lactate, Csec and rebound rate also have referential significance.
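
    For readers unfamiliar with how AUC values such as the 0.921 quoted for C0 arise, the sketch below computes a ROC AUC from two groups of scores via the Mann-Whitney formulation: the probability that a randomly chosen non-survivor's marker value exceeds a randomly chosen survivor's, counting ties as one half. The patient values are hypothetical, not the study's data.

```python
# ROC AUC via the Mann-Whitney formulation: the probability that a randomly
# chosen non-survivor's marker value exceeds a randomly chosen survivor's,
# counting ties as one half. The values below are hypothetical illustrations.
def roc_auc(pos, neg):
    wins = ties = 0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1
            elif p == q:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

died = [88.1, 45.3, 120.7, 66.0, 39.9]           # hypothetical C0 values, non-survivors
survived = [12.4, 30.2, 8.7, 25.1, 33.0, 15.8]   # hypothetical C0 values, survivors
print("AUC:", roc_auc(died, survived))
```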

  17. Cuartel San Carlos. Yacimiento veterano

    Directory of Open Access Journals (Sweden)

    Mariana Flores

    2007-01-01

    Full Text Available The Cuartel San Carlos is a national historic monument (1986) dating from the end of the 18th century (1785-1790), characterized by having suffered various adversities during its construction and by withstanding the earthquakes of 1812 and 1900. In 2006, the body responsible for its custody, the Institute of Cultural Heritage of the Ministry of Culture, carried out three stages of archaeological exploration, covering the back courtyard (Traspatio), the central courtyard (Patio Central) and the east and west wings (Naves Este and Oeste) of the building. This paper reviews the analysis of the archaeological documentation obtained at the site through this project, called EACUSAC (Archaeological Study of the Cuartel San Carlos), which also represents the third campaign carried out at the site. The importance of this historic site lies in its role in the events that gave rise to power struggles during the emergence of the Republic and in the political events of the 20th century. Likewise, a large sample of archaeological materials was found at the site, documenting a style of everyday military life as well as the internal social dynamics that took place in the San Carlos as a strategic place for the defense of the different regimes the country passed through, from the era of Spanish imperialism to the present day.

  18. Carlos Battilana: Profesor, Gestor, Amigo

    Directory of Open Access Journals (Sweden)

    José Pacheco

    2009-12-01

    Full Text Available The Editorial Committee of Anales has lost one of its most distinguished members. A brilliant teacher at our Faculty, Carlos Alberto Battilana Guanilo (1945-2009) knew how to transmit knowledge and hold the attention of his audiences, whether young students or not-so-young contemporaries. He interested his students in the path of continuing education and in research. He also brought distinguished physicians together to form and lead groups with an interest in science and friendship. His teaching vocation linked him to medical faculties, academies and scientific societies, where he coordinated fondly remembered courses and congresses. His scientific output was devoted to nephrology, immunology, cancer, and the costs of medical treatment. His capacity for management and leadership, present since his student days, allowed him to become regional director of a highly prestigious pharmaceutical laboratory, to organize a faculty of medicine, and later to serve as dean of the faculty of health sciences of that private university. Carlos was an important element in Anales attaining a place of privilege among Peruvian biomedical journals. In the profile we publish, we attempt to briefly summarize the career of Carlos Battilana, weeks after his departure without return.

  19. Red Women, White Policy: American Indian Women and Indian Education.

    Science.gov (United States)

    Warner, Linda Sue

    This paper discusses American Indian educational policies and implications for educational leadership by Indian women. The paper begins with an overview of federal Indian educational policies from 1802 to the 1970s. As the tribes have moved toward self-determination in recent years, a growing number of American Indian women have assumed leadership…

  20. Monte Carlo Particle Lists: MCPL

    DEFF Research Database (Denmark)

    Kittelmann, Thomas; Klinkby, Esben Bryndt; Bergbäck Knudsen, Erik

    2017-01-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages. Program summary: Program Title: MCPL. Program Files doi: http://dx.doi.org/10.17632/cby92vsv5g.1 Licensing provisions: CC0 for core MCPL, see LICENSE file for details. Programming language: C and C++ External routines/libraries: Geant4, MCNP, McStas, McXtrace Nature of problem: Saving
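
    To illustrate the general shape of such a format (and only that; this is NOT the real MCPL on-disk layout, for which the portable C library should be used), the sketch below packs particle states into fixed-size binary records and reads them back.

```python
import struct

# Fixed-size binary records of particle state: particle code, position,
# direction, kinetic energy, weight. A simplified stand-in only; real MCPL
# files should be written and read with the portable C library.
RECORD = struct.Struct("<i3d3ddd")   # pdg, x, y, z, ux, uy, uz, ekin, weight

particles = [
    (2112, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 2.5e-2, 1.0),   # a neutron, assumed state
    (22,   1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0e-3, 0.5),   # a photon, assumed state
]

with open("particles.bin", "wb") as f:
    for p in particles:
        f.write(RECORD.pack(*p))

with open("particles.bin", "rb") as f:
    data = f.read()
for i in range(len(data) // RECORD.size):
    print(RECORD.unpack_from(data, i * RECORD.size))
```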

  1. Luis Carlos López

    Directory of Open Access Journals (Sweden)

    Rafael Maya

    1979-04-01

    Full Text Available Among the poets of the Centenario generation, Luis Carlos López enjoyed great popularity abroad from the publication of his first book onward. I believe his work drew the attention of philosophers such as Unamuno and, if I am not mistaken, Darío referred to it in laudatory terms. In Colombia it has been praised hyperbolically by some, while others grant it no particular merit.

  2. Defeathering the Indian.

    Science.gov (United States)

    LaRoque, Emma

    In an effort to mitigate the stultified image of the American Indian in Canada, this handbook on Native Studies is written from the Indian point of view and is designed to sensitize the dominant society, particularly educators. While numerous approaches and pointers are presented and specific materials are recommended, the focus is essentially…

  3. American Indian Community Colleges.

    Science.gov (United States)

    One Feather, Gerald

    With the emergence of reservation-based community colleges (the Navajo Community College and the Dakota Community Colleges), the American Indian people, as decision makers in these institutions, are providing Indians with the technical skills and cultural knowledge necessary for self-determination. Confronted with limited numbers of accredited…

  4. Indian Summer Arts Festival


    OpenAIRE

    Martel, Yann; Tabu; Tejpal, Tarun; Kunzru, Hari

    2011-01-01

    The SFU Woodward's Cultural Unit partnered with the Indian Summer Festival Society to kick off the inaugural Indian Summer Festival. Held at the Goldcorp Centre for the Arts, it included an interactive Literature Series with notable authors from both India and Canada, including special guests Yann Martel, Bollywood superstar Tabu, journalist Tarun Tejpal, writer Hari Kunzru, and many others.

  5. Indian Ocean Rim Cooperation

    DEFF Research Database (Denmark)

    Wippel, Steffen

    Since the mid-1990s, the Indian Ocean has been experiencing increasing economic cooperation among its rim states. Middle Eastern countries, too, participate in the work of the Indian Ocean Rim Association, which received new impetus in the course of the current decade. Notably Oman is a very active...

  6. The Indian Monsoon

    Indian Academy of Sciences (India)

    Pacific Oceans, on subseasonal scales of a few days and on an interannual scale. ... over the Indian monsoon zone2 (Figure 3) during the summer monsoon .... each 500 km ×500 km grid over the equatorial Indian Ocean, Bay of Bengal and ...

  7. Indian Arts in Canada

    Science.gov (United States)

    Tawow, 1974

    1974-01-01

    A recent publication, "Indian Arts in Canada", examines some of the forces, both past and present, which are not only affecting American Indian artists today, but which will also profoundly influence their future. The review presents a few of the illustrations used in the book, along with the Introduction and the Foreword. (KM)

  8. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book-the first of its kind-addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...

  9. The APACHE II measured on patients' discharge from the Intensive Care Unit in the prediction of mortality

    Directory of Open Access Journals (Sweden)

    Luciana Gonzaga dos Santos Cardoso

    2013-06-01

    Full Text Available OBJECTIVE: to analyze the performance of the Acute Physiology and Chronic Health Evaluation (APACHE) II, measured based on the data from the last 24 hours of hospitalization in the ICU, for patients transferred to the wards. METHOD: an observational, prospective and quantitative study using the data from 355 patients admitted to the ICU between January and July 2010 who were transferred to the wards. RESULTS: the discriminatory power of the AII-OUT prognostic index showed a statistically significant area beneath the ROC curve. The mortality observed in the sample was slightly greater than that predicted by the AII-OUT, with a Standardized Mortality Ratio of 1.12. In the calibration curve, the linear regression analysis showed the R2 value to be statistically significant. CONCLUSION: the AII-OUT could predict mortality after discharge from the ICU, with the observed mortality being slightly greater than that predicted, which shows good discrimination and good calibration. This system was shown to be useful for stratifying the patients at greater risk of death after discharge from the ICU. This fact deserves special attention from health professionals, particularly nurses, in managing human and technological resources for this group of patients.

  10. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko

  11. 76 FR 49505 - Indian Gaming

    Science.gov (United States)

    2011-08-10

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This publishes..., Director, Office of Indian Gaming, Office of the Deputy Assistant Secretary--Policy and Economic...

  12. 75 FR 38833 - Indian Gaming

    Science.gov (United States)

    2010-07-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes... Date: July 6, 2010. FOR FURTHER INFORMATION CONTACT: Paula Hart, Director, Office of Indian Gaming...

  13. 77 FR 76513 - Indian Gaming

    Science.gov (United States)

    2012-12-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Amended Tribal-State Class III Gaming Compact taking effect. SUMMARY..., 2012. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  14. 76 FR 165 - Indian Gaming

    Science.gov (United States)

    2011-01-03

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs... Wisconsin Gaming Compact of 1992, as Amended in 1999, 2000, and 2003. DATES: Effective Date: January 3, 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  15. 75 FR 68618 - Indian Gaming

    Science.gov (United States)

    2010-11-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs... of Wisconsin Gaming Compact of 1991, as Amended in 1999 and 2003. DATES: Effective Date: November 8, 2010. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  16. 77 FR 76514 - Indian Gaming

    Science.gov (United States)

    2012-12-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact taking effect. SUMMARY: This... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  17. Monte Carlo surface flux tallies

    International Nuclear Information System (INIS)

    Favorite, Jeffrey A.

    2010-01-01

    Particle fluxes on surfaces are difficult to calculate with Monte Carlo codes because the score requires a division by the surface-crossing angle cosine, and grazing angles lead to inaccuracies. We revisit the standard practice of dividing by half of a cosine 'cutoff' for particles whose surface-crossing cosines are below the cutoff. The theory behind this approximation is sound, but the application of the theory to all possible situations does not account for two implicit assumptions: (1) the grazing band must be symmetric about 0, and (2) a single linear expansion for the angular flux must be applied in the entire grazing band. These assumptions are violated in common circumstances; for example, for separate in-going and out-going flux tallies on internal surfaces, and for out-going flux tallies on external surfaces. In some situations, dividing by two-thirds of the cosine cutoff is more appropriate. If users were able to control both the cosine cutoff and the substitute value, they could use these parameters to make accurate surface flux tallies. The procedure is demonstrated in a test problem in which Monte Carlo surface fluxes in cosine bins are converted to angular fluxes and compared with the results of a discrete ordinates calculation.
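
    The scoring rule under discussion is easy to state in code. In the sketch below, each surface crossing scores w/|μ|, except in the grazing band |μ| < cutoff, where the standard practice scores w/(cutoff/2); switching the denominator to 2·cutoff/3 gives the two-thirds variant the abstract recommends for some situations. The isotropic crossing-cosine sample is a toy assumption.

```python
import random

# Surface-flux score with a grazing-angle cutoff: each crossing scores w/|mu|,
# except in the band |mu| < CUT, where the standard practice scores w/(CUT/2).
# Setting half_cut=False uses the two-thirds variant instead. The isotropic
# crossing-cosine sample below is a toy assumption.
CUT = 0.1

def surface_flux_score(w, mu, half_cut=True):
    amu = abs(mu)
    if amu >= CUT:
        return w / amu
    return w / (CUT / 2.0 if half_cut else 2.0 * CUT / 3.0)

n, tally = 100_000, 0.0
for _ in range(n):
    mu = 2.0 * random.random() - 1.0   # toy crossing cosine
    tally += surface_flux_score(1.0, mu)
print("mean score per crossing:", tally / n)
```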

  18. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Aestrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind and some of the basic principles of the McStas software will be discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)

  19. Shear velocity structure of the laterally heterogeneous crust and uppermost mantle beneath the Indian region

    Science.gov (United States)

    Mohan, G.; Rai, S. S.; Panza, G. F.

    1997-08-01

    The shear velocity structure of the Indian lithosphere is mapped by inverting regionalized Rayleigh wave group velocities at periods of 15-60 s. The regionalized maps are used to subdivide the Indian plate into several geologic units and determine the variation of velocity with depth in each unit. The Hedgehog Monte Carlo technique is used to obtain the shear wave velocity structure for each geologic unit, revealing distinct velocity variations in the lower crust and uppermost mantle. The Indian shield has a high-velocity (4.4-4.6 km/s) upper mantle which, however, is slower than other shields in the world. The central Indian platform comprised of Proterozoic basins and cratons is marked by a distinct low-velocity (4.0-4.2 km/s) upper mantle. Lower crustal velocities in the Indian lithosphere generally range between 3.8 and 4.0 km/s, with the oceanic segments and the sedimentary basins marked by marginally higher and lower velocities, respectively. A remarkable contrast is observed in upper mantle velocities between the northern and eastern convergence fronts of the Indian plate. The South Burma region along the eastern subduction front of the Indian oceanic lithosphere shows significant velocity enhancement in the lower crust and upper mantle. High velocities (≈4.8 km/s) are also observed in the upper mantle beneath the Ninetyeast Ridge in the northeastern Indian Ocean.

  20. New associates | Announcements | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Sushmee Badhulika, Indian Institute of Technology, Hyderabad ... Sankar Chakma, Indian Institute of Science Education & Research, Bhopal Joydeep ... B Praveen Kumar, Indian National Centre for Ocean Information Services, Hyderabad

  1. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-01-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration.

  2. General Monte Carlo code MONK

    International Nuclear Information System (INIS)

    Moore, J.G.

    1974-01-01

    The Monte Carlo code MONK is a general program written to provide a high degree of flexibility to the user. MONK is distinguished by its detailed representation of nuclear data in point form, i.e., the cross-section is tabulated at specific energies instead of the more usual group representation. The nuclear data are unadjusted in the point form, but recently the code has been modified to accept adjusted group data as used in fast and thermal reactor applications. The various geometrical handling capabilities and importance sampling techniques are described. In addition to the nuclear data aspects, the following features are also described: geometrical handling routines, tracking cycles, neutron source and output facilities. 12 references. (U.S.)

  3. Monte Carlo lattice program KIM

    International Nuclear Information System (INIS)

    Cupini, E.; De Matteis, A.; Simonini, R.

    1980-01-01

    The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed

  4. Monte Carlo simulation of experiments

    International Nuclear Information System (INIS)

    Opat, G.I.

    1977-07-01

    An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special-purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺ ν_e ν̄ γ and π⁺ → e⁺ ν_e γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)

  5. Rasam Indian Restaurant: Menu

    OpenAIRE

    Rasam Indian Restaurant

    2013-01-01

    Rasam Indian Restaurant is located in the Glasthule, a suburb of Dublin and opened in 2003. The objective is to serve high quality, authentic Indian cuisine. "We blend, roast and grind our own spices daily to provide a flavour that is unique to Rasam. Cooking Indian food is founded upon long held family traditions. The secret is in the varying elements of heat and spices, the tandoor clay oven is a hugely important fixture in our kitchen. Marinated meats are lowered into the oven on long m...

  6. [Indian workers in Oman].

    Science.gov (United States)

    Longuenesse, E

    1985-01-01

    Until recently Oman was a country of emigration, but by 1980 an estimated 200,000 foreign workers were in the country due to the petroleum boom. Almost 1/3 of the estimated 300,000 Indian workers in the Gulf states were in Oman, a country whose colonial heritage was closely tied to that of India and many of whose inhabitants still speak Urdu. The number of work permits granted to Indians working in the private sector in Oman increased from 47,928 in 1976 to 80,787 in 1980. An estimated 110,000 Indians were working in Oman in 1982, the great majority in the construction and public works sector. A few hundred Indian women were employed by the government of Oman, as domestics, or in other capacities. No accurate data is available on the qualifications of Indian workers in Oman, but a 1979 survey suggested a relatively low illiteracy rate among them. 60-75% of Indians in Oman are from the state of Kerala, followed by workers from the Punjab and the southern states of Tamil Nadu and Andhra Pradesh and Bombay. Indian workers are recruited by specialized agencies or by friends or relatives already employed in Oman. Employers in Oman prefer to recruit through agencies because the preselection process minimizes hiring of workers unqualified for their posts. Officially, expenses of transportation, visas, and other needs are shared by the worker and the employer, but the demand for jobs is so strong that the workers are obliged to pay commissions which amount to considerable sums for stable and well paying jobs. Wages in Oman are however 2 to 5 times the level in India. Numerous abuses have been reported in recruitment practices and in failure of employers in Oman to pay the promised wages, but Indian workers have little recourse. At the same level of qualifications, Indians are paid less then non-Omani Arabs, who in turn receive less than Oman nationals. Indians who remain in Oman long enough nevertheless are able to support families at home and to accumulate considerable

  7. Indian concepts on sexuality.

    Science.gov (United States)

    Chakraborty, Kaustav; Thakurata, Rajarshi Guha

    2013-01-01

    India is a vast country with wide social, cultural and sexual variations. The Indian concept of sexuality has evolved over time and has been immensely influenced by various rulers and religions. Indian sexuality is manifested in our attire, behavior, recreation, literature, sculptures, scriptures, religion and sports. It has influenced the way we perceive our health and disease and devise remedies for the same. In the modern era, with rapid globalization, the unique Indian sexuality is getting diffused. The time has come to rediscover ourselves in terms of sexuality to attain individual freedom and to reinvest our energy to social issues related to sexuality.

  8. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  9. Nested Sampling with Constrained Hamiltonian Monte Carlo

    OpenAIRE

    Betancourt, M. J.

    2010-01-01

    Nested sampling is a powerful approach to Bayesian inference ultimately limited by the computationally demanding task of sampling from a heavily constrained probability distribution. An effective algorithm in its own right, Hamiltonian Monte Carlo is readily adapted to efficiently sample from any smooth, constrained distribution. Utilizing this constrained Hamiltonian Monte Carlo, I introduce a general implementation of the nested sampling algorithm.
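
    To make the idea concrete, here is a deliberately small Python sketch of nested sampling on a 1-D toy problem. The constrained sampling step, drawing from the prior subject to a hard likelihood constraint, is done here with a naive random walk; the paper's contribution is to replace exactly this step with constrained Hamiltonian Monte Carlo. The Gaussian likelihood, uniform prior and all tuning constants are illustrative assumptions.

```python
import math, random

# Toy nested-sampling sketch. The hard step the abstract addresses -- sampling
# the prior subject to L(theta) > L_min -- is done with a naive random walk
# from a surviving live point; constrained Hamiltonian Monte Carlo does this
# step far more efficiently. Likelihood: standard normal; prior: U(-5, 5).
def log_like(x):
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

N_LIVE, N_ITER = 100, 600
live = [random.uniform(-5.0, 5.0) for _ in range(N_LIVE)]
log_z = -math.inf
log_width = math.log(1.0 - math.exp(-1.0 / N_LIVE))  # prior volume shed per step

for i in range(N_ITER):
    worst = min(range(N_LIVE), key=lambda j: log_like(live[j]))
    l_min = log_like(live[worst])
    term = l_min + log_width
    # Accumulate evidence: log_z = logaddexp(log_z, L_min * shed volume).
    log_z = max(log_z, term) + math.log1p(math.exp(-abs(log_z - term)))
    log_width -= 1.0 / N_LIVE
    # Replace the worst point by a constrained draw: random walk, accepting
    # only proposals that satisfy the hard likelihood constraint L > L_min.
    x = live[random.randrange(N_LIVE)]
    for _ in range(50):
        x_new = x + random.gauss(0.0, 0.5)
        if -5.0 <= x_new <= 5.0 and log_like(x_new) > l_min:
            x = x_new
    live[worst] = x

print("log-evidence estimate:", log_z, "(analytic: about", math.log(0.1), ")")
```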

  10. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    This Ph.D. project describes the development of a workflow for Monte Carlo Treatment Planning for clinical radiotherapy plans. The workflow may be utilized to perform an independent dose verification of treatment plans. Modern radiotherapy treatment delivery is often conducted by dynamically modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery, including gantry rotation during irradiation, of modern radiotherapy. Three corner stones of Monte Carlo Treatment Planning are identified: building, commissioning and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc) to a Monte Carlo input file (iii). A protocol...

  11. The MC21 Monte Carlo Transport Code

    International Nuclear Information System (INIS)

    Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H

    2007-01-01

    MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or 'tool of last resort' and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities.

  12. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows for simulating random processes by using series of pseudo-random numbers. It became an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions in data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will allow imaging and dosimetry issues to be tackled simultaneously, and, in some cases, Monte Carlo simulations may become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)

  13. 78 FR 65370 - Notice of Inventory Completion: Pima County Office of the Medical Examiner, Tucson, AZ

    Science.gov (United States)

    2013-10-31

    ... Tribe of the San Carlos Reservation, Arizona; Tohono O'odham Nation of Arizona; White Mountain Apache... Community of the Gila River Indian Reservation, Arizona; Hopi Tribe of Arizona; Tohono O'odham Nation of...; Tohono O'odham Nation of Arizona; and the Zuni Tribe of the Zuni Reservation, New Mexico. Additional...

  14. Indian refining industry

    International Nuclear Information System (INIS)

    Singh, I.J.

    2002-01-01

    The author discusses the history of the Indian refining industry and ongoing developments under the headings: the present state; refinery configuration; Indian capabilities for refinery projects; and reforms in the refining industry. Tables list India's petroleum refineries, giving location and capacity; new refinery projects together with location and capacity; and expansion projects of Indian petroleum refineries. The Indian refinery industry has undergone substantial expansion as well as technological change over the past years. There has been progressive technology upgrading, energy efficiency, better environmental control and improved capacity utilisation. Major reform processes have been set in motion by the government of India: converting the refining industry from a centrally controlled public sector dominated industry to a delicensed regime in a competitive market economy with the introduction of a liberal exploration policy; dismantling the administered price mechanism; and a 25 year hydrocarbon vision. (UK)

  15. Evaluation of APACHE II and the oncologic history as predictors of mortality in the intensive care unit of the INC, September 1996 - December 1997

    International Nuclear Information System (INIS)

    Camargo, David O; Gomez, Clara; Martinez, Teresa

    1999-01-01

    Many severity indexes have been developed to assess the prognosis and quality of life of a patient, especially on admission to the intensive care unit (ICU); however, oncologic patients present particularities in their morbidity that imply a different behavior of these indexes. In the present work, the APACHE scale and the oncologic history are compared as predictors of morbidity and mortality in the ICU. 207 patients admitted to the ICU between September 1996 and December 1997 were included. Mortality was 29%, and most of these patients stayed either less than 24 hours or more than 8 days. On admission, 50% of the patients had APACHE scores above 15; at 48 hours, only 30.4% remained above this value. Among patients with hematologic neoplasms, 87% had scores above 15, with a mortality of 63.3%; with admission scores between 15 and 24, the risk of dying was 9.8 times that of patients with lower scores. Hematologic patients had 5.7 times the risk of dying of patients with solid tumors. The respiratory system was the most frequently compromised, with the risk of dying increasing 2.8 times for each unit increment in the scale. Contrary to what is described in the literature, the oncologic diagnosis and the stage of the neoplasm did not influence patient mortality.

  16. Monte Carlo Codes Invited Session

    International Nuclear Information System (INIS)

    Trama, J.C.; Malvagi, F.; Brown, F.

    2013-01-01

    This document lists 22 Monte Carlo codes used in radiation transport applications throughout the world. For each code the names of the organization and country and/or place are given. We have the following computer codes. 1) ARCHER, USA, RPI; 2) COG11, USA, LLNL; 3) DIANE, France, CEA/DAM Bruyeres; 4) FLUKA, Italy and CERN, INFN and CERN; 5) GEANT4, International GEANT4 collaboration; 6) KENO and MONACO (SCALE), USA, ORNL; 7) MC21, USA, KAPL and Bettis; 8) MCATK, USA, LANL; 9) MCCARD, South Korea, Seoul National University; 10) MCNP6, USA, LANL; 11) MCU, Russia, Kurchatov Institute; 12) MONK and MCBEND, United Kingdom, AMEC; 13) MORET5, France, IRSN Fontenay-aux-Roses; 14) MVP2, Japan, JAEA; 15) OPENMC, USA, MIT; 16) PENELOPE, Spain, Barcelona University; 17) PHITS, Japan, JAEA; 18) PRIZMA, Russia, VNIITF; 19) RMC, China, Tsinghua University; 20) SERPENT, Finland, VTT; 21) SUPERMONTECARLO, China, CAS INEST FDS Team Hefei; and 22) TRIPOLI-4, France, CEA Saclay

  17. Advanced computers and Monte Carlo

    International Nuclear Information System (INIS)

    Jordan, T.L.

    1979-01-01

    High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables

  18. Adaptive Markov Chain Monte Carlo

    KAUST Repository

    Jadoon, Khan

    2016-08-08

    A substantial interpretation of electromagnetic induction (EMI) measurements requires quantifying optimal model parameters and uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agriculture field with non-saline and saline soil. In the MCMC simulations, the posterior distribution was computed using Bayes rule. The electromagnetic forward model, based on the full solution of Maxwell's equations, was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD mini-Explorer. The model parameters and uncertainty for the three-layered earth model are investigated by using synthetic data. Our results show that in the scenario of non-saline soil, the parameters of layer thickness are not as well estimated as the layers' electrical conductivity, because layer thickness in the model exhibits a low sensitivity to the EMI measurements and is hence difficult to resolve. Application of the proposed MCMC-based inversion to the field measurements in a drip irrigation system demonstrates that the parameters of the model can be well estimated for the saline soil as compared to the non-saline soil, and provides useful insight about parameter uncertainty for the assessment of the model outputs.
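
    For readers unfamiliar with the machinery, the sketch below shows a generic adaptive Metropolis sampler of the kind referred to in the abstract, with the proposal covariance periodically re-tuned from the chain history. It is not the authors' EMI inversion: the two-parameter target density and every constant in it are invented for illustration.

```python
import numpy as np

# Minimal adaptive Metropolis sketch (Haario-style covariance adaptation).
# The banana-shaped two-parameter target below is an invented stand-in for
# a real posterior such as the EMI inversion described in the abstract.
def log_post(theta):
    x, y = theta
    return -0.5 * (x**2 / 4.0 + (y - 0.5 * x**2)**2)

rng = np.random.default_rng(0)
n_steps, d = 20000, 2
chain = np.empty((n_steps, d))
theta, lp = np.zeros(d), log_post(np.zeros(d))
cov = np.eye(d) * 0.1                         # initial proposal covariance

for i in range(n_steps):
    prop = rng.multivariate_normal(theta, cov)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta
    # Adapt the proposal covariance to the sampled history (scaled 2.38^2/d),
    # with a small jitter so the proposal never degenerates.
    if i > 500 and i % 100 == 0:
        cov = 2.38**2 / d * np.cov(chain[:i].T) + 1e-6 * np.eye(d)

print("posterior mean estimate:", chain[5000:].mean(axis=0))
```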

  19. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-01-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example that shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation

  20. Monte Carlo approaches to light nuclei

    International Nuclear Information System (INIS)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs

  1. Monte Carlo approaches to light nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.

  2. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-02-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)

  3. Monte carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas-phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.

  4. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Author Affiliations. A Salih1 S Ghosh Moulic2. Department of Aerospace Engineering, Indian Institute of Space Science and Technology, Thiruvananthapuram 695 022; Department of Mechanical Engineering, Indian Institute of Technology, Kharagpur 721 302 ...

  5. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Sequential Bayesian technique: An alternative approach for software reliability estimation ... Software reliability; Bayesian sequential estimation; Kalman filter. ... Department of Mathematics, Indian Institute of Technology, Kharagpur 721 302; Reliability Engineering Centre, Indian Institute of Technology, Kharagpur 721 302 ...

  6. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Director, Indian Institute of Science Education & Research, Sri Rama ... Address: Department of Chemistry, Indian Institute of Technology, New Delhi 110 016, Delhi ..... Specialization: Elementary Particle Physics, Field Theory and ...

  7. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Author Affiliations. Soumen Bag1 Gaurav Harit2. Department of Computer Science and Engineering, Indian Institute of Technology Kharagpur, Kharagpur 721 302, India; Information and Communication Technology, Indian Institute of Technology Rajasthan, Jodhpur 342 011, India ...

  8. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.

  9. Quantum Monte Carlo approaches for correlated systems

    CERN Document Server

    Becca, Federico

    2017-01-01

    Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It provides a clear overview of variational wave functions and a detailed presentation of stochastic sampling techniques, including Markov chains and Langevin dynamics, which are developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...

  10. Monte Carlo simulations for plasma physics

    International Nuclear Information System (INIS)

    Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X.

    2000-07-01

    Plasma behaviours are very complicated and their analysis is generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast-particle transport associated with heating and the generation of the radial electric field. Further, it is applied to investigating neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)
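
    A toy flavor of the slowing-down studies mentioned above can be given in a few lines: a fast ion loses speed through drag while its pitch angle diffuses, and the history ends at thermalization. The constant collision frequencies NU_S and NU_D, the time step and the thermal cutoff below are all invented; a real calculation would use velocity-dependent Coulomb collision coefficients.

```python
import math, random

# Toy slowing-down sketch in the spirit of the NBI-heating application above:
# drag reduces the ion speed while the pitch cosine performs a random walk.
# All constants are illustrative, not a real Coulomb collision operator.
NU_S, NU_D, DT, V_BIRTH, V_TH = 1.0, 0.5, 1e-3, 1.0, 0.1

def slow_down_one_ion():
    v, xi, t = V_BIRTH, 1.0, 0.0           # speed, pitch cosine, time
    while v > V_TH:                        # follow until thermalized
        v *= 1.0 - NU_S * DT               # collisional drag
        xi += math.sqrt(2.0 * NU_D * DT) * random.gauss(0.0, 1.0)
        xi = max(-1.0, min(1.0, xi))       # crude clamp at the physical bounds
        t += DT
    return t, xi

times = [slow_down_one_ion()[0] for _ in range(500)]
print("mean thermalization time ~", sum(times) / len(times))
```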

  11. Frontiers of quantum Monte Carlo workshop: preface

    International Nuclear Information System (INIS)

    Gubernatis, J.E.

    1985-01-01

    The introductory remarks, table of contents, and list of attendees are presented from the proceedings of the conference, Frontiers of Quantum Monte Carlo, which appeared in the Journal of Statistical Physics

  12. Monte Carlo code development in Los Alamos

    International Nuclear Information System (INIS)

    Carter, L.L.; Cashwell, E.D.; Everett, C.J.; Forest, C.A.; Schrandt, R.G.; Taylor, W.M.; Thompson, W.L.; Turner, G.D.

    1974-01-01

    The present status of Monte Carlo code development at Los Alamos Scientific Laboratory is discussed. A brief summary is given of several of the most important neutron, photon, and electron transport codes. 17 references. (U.S.)

  13. "Shaakal" Carlos kaebas arreteerija kohtusse / Margo Pajuste

    Index Scriptorium Estoniae

    Pajuste, Margo

    2006-01-01

    Also published in: Postimees (in Russian), 3 July, p. 11. The imprisoned notorious terrorist Carlos "the Jackal" has taken his one-time arrester to court. He accuses the former head of the French intelligence service of kidnapping.

  14. Experience with the Monte Carlo Method

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, E M.A. [Department of Mechanical Engineering University of New Brunswick, Fredericton, N.B., (Canada)

    2007-06-15

    Monte Carlo simulation of radiation transport provides a powerful research and design tool that resembles in many aspects laboratory experiments. Moreover, Monte Carlo simulations can provide an insight not attainable in the laboratory. However, the Monte Carlo method has its limitations, which if not taken into account can result in misleading conclusions. This paper will present the experience of this author, over almost three decades, in the use of the Monte Carlo method for a variety of applications. Examples will be shown on how the method was used to explore new ideas, as a parametric study and design optimization tool, and to analyze experimental data. The consequences of not accounting in detail for detector response and the scattering of radiation by surrounding structures are two of the examples that will be presented to demonstrate the pitfall of condensed.

  15. Experience with the Monte Carlo Method

    International Nuclear Information System (INIS)

    Hussein, E.M.A.

    2007-01-01

    Monte Carlo simulation of radiation transport provides a powerful research and design tool that resembles in many aspects laboratory experiments. Moreover, Monte Carlo simulations can provide an insight not attainable in the laboratory. However, the Monte Carlo method has its limitations, which if not taken into account can result in misleading conclusions. This paper will present the experience of this author, over almost three decades, in the use of the Monte Carlo method for a variety of applications. Examples will be shown on how the method was used to explore new ideas, as a parametric study and design optimization tool, and to analyze experimental data. The consequences of not accounting in detail for detector response and the scattering of radiation by surrounding structures are two of the examples that will be presented to demonstrate the pitfall of condensed

  16. Monte Carlo Transport for Electron Thermal Transport

    Science.gov (United States)

    Chenhall, Jeffrey; Cao, Duc; Moses, Gregory

    2015-11-01

    The iSNB (implicit Schurtz-Nicolai-Busquet) multigroup electron thermal transport method of Cao et al. is adapted into a Monte Carlo transport method in order to better model the effects of non-local behavior. The end goal is a hybrid transport-diffusion method that combines Monte Carlo transport with discrete diffusion Monte Carlo (DDMC). The hybrid method will combine the efficiency of a diffusion method in short mean free path regions with the accuracy of a transport method in long mean free path regions. The Monte Carlo nature of the approach allows the algorithm to be massively parallelized. Work to date on the method will be presented. This work was supported by Sandia National Laboratory - Albuquerque and the University of Rochester Laboratory for Laser Energetics.

  17. A continuation multilevel Monte Carlo algorithm

    KAUST Repository

    Collier, Nathan; Haji Ali, Abdul Lateef; Nobile, Fabio; von Schwerin, Erik; Tempone, Raul

    2014-01-01

    We propose a novel Continuation Multi Level Monte Carlo (CMLMC) algorithm for weak approximation of stochastic models. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending when the required error
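
    For orientation, the sketch below implements plain (non-continuation) multilevel Monte Carlo for a geometric Brownian motion, coupling each fine Euler-Maruyama path with a coarse path driven by the same Brownian increments so the level corrections have low variance. The SDE parameters and the per-level sample counts are illustrative assumptions, not the paper's adaptive choices.

```python
import numpy as np

# Plain multilevel Monte Carlo sketch (not the authors' continuation variant):
# estimate E[X_T] for dX = a X dt + b X dW via Euler-Maruyama, using coupled
# coarse/fine paths so the level corrections have small variance.
a, b, T, X0 = 0.05, 0.2, 1.0, 1.0   # illustrative parameters
rng = np.random.default_rng(1)

def level_estimator(level, n_samples, m=4):
    """Mean of P_l - P_{l-1}, same Brownian increments on both grids."""
    nf = m**level                       # fine steps; coarse grid has nf // m
    dt = T / nf
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_samples, nf))
    xf = np.full(n_samples, X0)
    for k in range(nf):                 # fine path
        xf = xf + a * xf * dt + b * xf * dW[:, k]
    if level == 0:
        return xf.mean()
    xc = np.full(n_samples, X0)
    dWc = dW.reshape(n_samples, nf // m, m).sum(axis=2)  # summed increments
    for k in range(nf // m):            # coupled coarse path
        xc = xc + a * xc * (m * dt) + b * xc * dWc[:, k]
    return (xf - xc).mean()

# Telescoping sum over levels; the sample allocation here is hand-picked.
estimate = sum(level_estimator(l, n) for l, n in enumerate([40000, 10000, 2500, 625]))
print("MLMC estimate:", estimate, "(exact:", X0 * np.exp(a * T), ")")
```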

  18. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition over more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  19. Hybrid Monte Carlo methods in computational finance

    NARCIS (Netherlands)

    Leitao Rodriguez, A.

    2017-01-01

    Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the

  20. 77 FR 5566 - Indian Gaming

    Science.gov (United States)

    2012-02-03

    ... up to 900 gaming devices, any banking or percentage card games, and any devices or games authorized... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal--State Class III Gaming Compact Taking Effect. SUMMARY: This publishes...

  1. 76 FR 56466 - Indian Gaming

    Science.gov (United States)

    2011-09-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an approval of the gaming compact between the Flandreau Santee Sioux Tribe and the State of South...

  2. 76 FR 65208 - Indian Gaming

    Science.gov (United States)

    2011-10-20

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an Approval of the Gaming Compact between the Confederated Tribes of the...

  3. 75 FR 68823 - Indian Gaming

    Science.gov (United States)

    2010-11-09

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Amendment. SUMMARY: This notice publishes approval of the Amendments to the Class III Gaming Compact (Amendment) between the State of Oregon...

  4. 77 FR 43110 - Indian Gaming

    Science.gov (United States)

    2012-07-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  5. 75 FR 8108 - Indian Gaming

    Science.gov (United States)

    2010-02-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes... Governing Class III Gaming. DATES: Effective Date: February 23, 2010. FOR FURTHER INFORMATION CONTACT: Paula...

  6. 76 FR 8375 - Indian Gaming

    Science.gov (United States)

    2011-02-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the Gaming Compact between the Oglala Sioux Tribe and the State of South Dakota...

  7. 78 FR 10203 - Indian Gaming

    Science.gov (United States)

    2013-02-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal State Class III Gaming Compact. SUMMARY: This notice publishes the Approval of the Class III Tribal- State Gaming Compact between the Chippewa-Cree Tribe of the...

  8. 77 FR 30550 - Indian Gaming

    Science.gov (United States)

    2012-05-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes approval by the Department of an extension to the Class III Gaming Compact between the Pyramid Lake Paiute...

  9. 77 FR 45371 - Indian Gaming

    Science.gov (United States)

    2012-07-31

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Oglala Sioux Tribe and the State of South Dakota. DATES: Effective...

  10. 76 FR 11258 - Indian Gaming

    Science.gov (United States)

    2011-03-01

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal--State Class III Gaming Compact taking effect. SUMMARY: Notice is given that the Tribal-State Compact for Regulation of Class III Gaming between the Confederated Tribes of the...

  11. 78 FR 15738 - Indian Gaming

    Science.gov (United States)

    2013-03-12

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the gaming compact between the Rosebud Sioux Tribe and the State of South Dakota...

  12. 77 FR 41200 - Indian Gaming

    Science.gov (United States)

    2012-07-12

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes approval by the Department of an extension to the Class III Gaming Compact between the State of California...

  13. 77 FR 59641 - Indian Gaming

    Science.gov (United States)

    2012-09-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  14. 78 FR 17428 - Indian Gaming

    Science.gov (United States)

    2013-03-21

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of the Class III Tribal- State Gaming Compact between the Pyramid Lake Paiute Tribe and...

  15. 78 FR 26801 - Indian Gaming

    Science.gov (United States)

    2013-05-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of an amendment to the Class III Tribal-State Gaming Compact...

  16. 78 FR 62650 - Indian Gaming

    Science.gov (United States)

    2013-10-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal-State Class III Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between the Rosebud Sioux...

  17. 78 FR 54908 - Indian Gaming

    Science.gov (United States)

    2013-09-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of the Class III Tribal- State Gaming Compact between the...

  18. 78 FR 62649 - Indian Gaming

    Science.gov (United States)

    2013-10-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This notice publishes the Class III Gaming Compact between the North Fork Rancheria of Mono...

  19. 76 FR 52968 - Indian Gaming

    Science.gov (United States)

    2011-08-24

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  20. 78 FR 78377 - Indian Gaming

    Science.gov (United States)

    2013-12-26

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000814] Indian Gaming AGENCY... Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  1. 76 FR 33341 - Indian Gaming

    Science.gov (United States)

    2011-06-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  2. 75 FR 55823 - Indian Gaming

    Science.gov (United States)

    2010-09-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Oglala Sioux Tribe and the State of South Dakota. DATES: Effective...

  3. 78 FR 44146 - Indian Gaming

    Science.gov (United States)

    2013-07-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This notice publishes the Class III Amended and Restated Tribal-State Gaming Compact between the Shingle Springs Band of...

  4. 78 FR 54670 - Indian Gaming

    Science.gov (United States)

    2013-09-05

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal--State Class III Gaming Compact. SUMMARY: This publishes notice of the Extension of the Class III gaming compact between the Yankton Sioux...

  5. 78 FR 33435 - Indian Gaming

    Science.gov (United States)

    2013-06-04

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Amendments. SUMMARY: This notice publishes approval of an Agreement to Amend the Class III Tribal-State Gaming Compact between the Salt River...

  6. 78 FR 17427 - Indian Gaming

    Science.gov (United States)

    2013-03-21

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes... Gaming (Compact). DATES: Effective Date: March 21, 2013. FOR FURTHER INFORMATION CONTACT: Paula L. Hart...

  7. 78 FR 11221 - Indian Gaming

    Science.gov (United States)

    2013-02-15

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the gaming compact between the Oglala Sioux Tribe and the State of South Dakota...

  8. Facts about American Indian Education

    Science.gov (United States)

    American Indian College Fund, 2010

    2010-01-01

    As a result of living in remote rural areas, American Indians living on reservations have limited access to higher education. One-third of American Indians live on reservations, according to the U.S. Census Bureau. According to the most recent U.S. government statistics, the overall poverty rate for American Indians/Alaska Natives, including…

  9. Leadership Challenges in Indian Country.

    Science.gov (United States)

    Horse, Perry

    2002-01-01

    American Indian leaders must meld the holistic and cyclical world view of Indian peoples with the linear, rational world view of mainstream society. Tribal leaders need to be statesmen and ethical politicians. Economic and educational development must be based on disciplined long-range planning and a strong, Indian-controlled educational base.…

  10. LCG Monte-Carlo Data Base

    CERN Document Server

    Bartalini, P.; Kryukov, A.; Selyuzhenkov, Ilya V.; Sherstnev, A.; Vologdin, A.

    2004-01-01

    We present the Monte-Carlo events Data Base (MCDB) project and its development plans. MCDB facilitates communication between authors of Monte-Carlo generators and experimental users. It also provides convenient book-keeping and easy access to generator-level samples. The first release of MCDB is now operational for the CMS collaboration. In this paper we review the main ideas behind MCDB and discuss future plans to develop this Data Base further within the CERN LCG framework.

  11. Multilevel Monte Carlo in Approximate Bayesian Computation

    KAUST Repository

    Jasra, Ajay

    2017-02-13

    In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
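
    As background for the abstract, a bare-bones ABC rejection sampler is sketched below; the article's multilevel layering over a sequence of tolerances is not reproduced. The Gaussian model, prior, summary statistic and tolerances are all invented for the example, but the output shows the basic trade-off: a smaller tolerance gives a better ABC approximation at a higher cost.

```python
import numpy as np

# Bare-bones ABC rejection sketch for context. Model and "observed" data are
# invented: y ~ Normal(mu, 1) with prior mu ~ Normal(0, 4).
rng = np.random.default_rng(2)
y_obs = rng.normal(1.5, 1.0, size=50)
s_obs = y_obs.mean()                       # summary statistic

def abc_posterior_sample(eps, n_keep=2000):
    kept = []
    while len(kept) < n_keep:
        mu = rng.normal(0.0, 2.0)          # draw from the prior
        s_sim = rng.normal(mu, 1.0, size=50).mean()
        if abs(s_sim - s_obs) < eps:       # accept if summaries are close
            kept.append(mu)
    return np.array(kept)

for eps in [1.0, 0.3, 0.1]:                # smaller tolerance: better, costlier
    post = abc_posterior_sample(eps)
    print(f"eps={eps}: posterior mean ~ {post.mean():.3f}")
```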

  12. Monte Carlo method applied to medical physics

    International Nuclear Information System (INIS)

    Oliveira, C.; Goncalves, I.F.; Chaves, A.; Lopes, M.C.; Teixeira, N.; Matos, B.; Goncalves, I.C.; Ramalho, A.; Salgado, J.

    2000-01-01

    The main application of the Monte Carlo method to medical physics is dose calculation. This paper shows some results of two dose calculation studies and two other different applications: optimisation of a neutron field for Boron Neutron Capture Therapy and optimisation of a filter for a beam tube for several purposes. The long computation time needed for Monte Carlo calculations - the main barrier to their intensive utilisation - is being overcome with faster and cheaper computers. (author)

  13. Intensive Care in India: The Indian Intensive Care Case Mix and Practice Patterns Study.

    Science.gov (United States)

    Divatia, Jigeeshu V; Amin, Pravin R; Ramakrishnan, Nagarajan; Kapadia, Farhad N; Todi, Subhash; Sahu, Samir; Govil, Deepak; Chawla, Rajesh; Kulkarni, Atul P; Samavedam, Srinivas; Jani, Charu K; Rungta, Narendra; Samaddar, Devi Prasad; Mehta, Sujata; Venkataraman, Ramesh; Hegde, Ashit; Bande, B D; Dhanuka, Sanjay; Singh, Virendra; Tewari, Reshma; Zirpe, Kapil; Sathe, Prachee

    2016-04-01

    To obtain information on organizational aspects, case mix and practices in Indian Intensive Care Units (ICUs), an observational, 4-day point prevalence study was performed between 2010 and 2011 in 4209 patients from 124 ICUs. ICU and patient characteristics and interventions were recorded for 24 h of the study day, and outcomes till 30 days after the study day. Data were analyzed for 4038 adult patients from 120 ICUs. On the study day, mean age, Acute Physiology and Chronic Health Evaluation (APACHE II) and sequential organ failure assessment (SOFA) scores were 54.1 ± 17.1 years, 17.4 ± 9.2 and 3.8 ± 3.6, respectively. About 46.4% of patients had ≥1 organ failure. Nearly 37% and 22.2% of patients received mechanical ventilation (MV) and vasopressors or inotropes, respectively. Nearly 12.2% of patients developed an infection in the ICU. About 28.3% of patients had severe sepsis or septic shock (SvSpSS) during their ICU stay. About 60.7% of patients without infection received antibiotics. There were 546 deaths and 183 terminal discharges (TDs) from the ICU (including left against medical advice or discharged on request), with ICU mortality 729/4038 (18.1%). In 1627 patients admitted within 24 h of the study day, the standardized mortality ratio was 0.67. The APACHE II and SOFA scores, public hospital ICUs, medical ICUs, inadequately equipped ICUs, medical admission, self-paying patients, presence of SvSpSS, acute respiratory failure or cancer, need for a fluid bolus, and MV were independent predictors of mortality. The high proportion of TDs and the association of public hospitals, self-paying patients, and inadequately equipped hospitals with mortality have important implications for critical care in India.

  14. The Living Indian Critical Tradition

    Directory of Open Access Journals (Sweden)

    Vivek Kumar Dwivedi

    2010-11-01

    This paper attempts to establish the identity of something that is often considered to be missing – a living Indian critical tradition. I refer to the tradition that arises out of the work of those Indians who write in English. The chief architects of this tradition are Sri Aurobindo, C.D. Narasimhaiah, Gayatri Chakravorty Spivak and Homi K. Bhabha. It is possible to believe that Indian literary theories derive almost solely from ancient Sanskrit poetics. Or, alternatively, one can be concerned about the sad state of affairs regarding Indian literary theories or criticism in English. There have been scholars who have raised the question of the pathetic state of Indian scholarship in English and have even come up with some positive suggestions. But these scholars are those who are ignorant about the living Indian critical tradition. The significance of the Indian critical tradition lies in the fact that it provides the real focus to the Indian critical scene. Without an awareness of this tradition Indian literary scholarship (which is quite a different thing from Indian literary criticism and theory as it does not have the same impact as the latter two do can easily fail to see who the real Indian literary critics and theorists are.

  15. 75 FR 57290 - Notice of Inventory Completion: University of Colorado Museum, Boulder, CO

    Science.gov (United States)

    2010-09-20

    ... Indian Colony, Nevada; Lovelock Paiute Tribe of the Lovelock Indian Colony, Nevada; Mescalero Apache...; Lovelock Paiute Tribe of the Lovelock Indian Colony, Nevada; Mescalero Apache Tribe of the Mescalero...

  16. Indian Women: An Historical and Personal Perspective

    Science.gov (United States)

    Christensen, Rosemary Ackley

    1975-01-01

    Several issues relating to Indian women are discussed. These include (1) the three types of people to whom we owe our historical perceptions of Indian women, (2) role delineation in Indian society; (3) differences between Indian women and white women, and (4) literary role models of Indian women. (Author/BW)

  17. INDIAN ACADEMY OF SCIENCES

    Indian Academy of Sciences (India)

    user

    2016-07-02

    Jul 2, 2016 ... PROGRAMME. 1 July 2016 (Friday). Venue: Faculty Hall, Indian Institute of Science, Bengaluru ... 1800–1900 Session 1E – Public Lecture. Pratap Bhanu Mehta, Centre for Policy Research, New Delhi. Two ideas of India.

  18. Indian Astronomy: History of

    Science.gov (United States)

    Mercier, R.; Murdin, P.

    2002-01-01

    From the time of Āryabhaṭa (ca AD 500) there appeared in India a series of Sanskrit treatises on astronomy. Written always in verse, and normally accompanied by prose commentaries, these served to create an Indian tradition of mathematical astronomy which continued into the 18th century. There are as well texts from earlier centuries, grouped under the name Jyotiṣavedāṅga...

  19. The Indian Monsoon

    Indian Academy of Sciences (India)

    The Indian Monsoon - Links to Cloud systems over the Tropical Oceans. Sulochana Gadgil. Series Article, Resonance – Journal of Science Education, Volume 13, Issue 3, March 2008, pp 218-235.

  20. Becoming an Indian

    Indian Academy of Sciences (India)

    Ramachandra Guha

    2017-11-25

    Nov 25, 2017 ... learning science by what he later recalled as 'Gandhian or basic .... Calcutta to offer their thoughts on Indian planning. Hal- ... had come to India for good. But any .... am eager to be of help and service to a sincere soul like you.

  1. Indians of North Carolina.

    Science.gov (United States)

    Bureau of Indian Affairs (Dept. of Interior), Washington, DC.

    Published by the U.S. Department of the Interior, this brief booklet on the historical development of the Cherokee Nation emphasizes the Tribe's relationship with the Bureau of Indian Affairs and its improved economy. Citing tourism as the major tribal industry, tribal enterprises are named and described (a 61 unit motor court in existence since…

  2. Indian Health Disparities

    Science.gov (United States)

    ... reservations and in rural communities, mostly in the western United States and Alaska. The American Indian and ...

  3. Caregiving in Indian Country

    Centers for Disease Control (CDC) Podcasts

    2009-12-23

    This podcast discusses the role of caregivers in Indian Country and the importance of protecting their health. It is primarily targeted to public health and aging services professionals. Created: 12/23/2009 by National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP). Date Released: 12/23/2009.

  4. Successful vectorization - reactor physics Monte Carlo code

    International Nuclear Information System (INIS)

    Martin, W.R.

    1989-01-01

    Most particle transport Monte Carlo codes in use today are based on the 'history-based' algorithm, wherein one particle history at a time is simulated. Unfortunately, the 'history-based' approach (present in all Monte Carlo codes until recent years) is inherently scalar and cannot be vectorized. In particular, the history-based algorithm cannot take advantage of vector architectures, which characterize the largest and fastest computers at the current time, vector supercomputers such as the Cray X/MP or IBM 3090/600. However, substantial progress has been made in recent years in developing and implementing a vectorized Monte Carlo algorithm. This algorithm follows portions of many particle histories at the same time and forms the basis for all successful vectorized Monte Carlo codes in use today. This paper describes the basic vectorized algorithm along with several variations that have been developed by different researchers for specific applications. These applications have been mainly in the areas of neutron transport in nuclear reactor and shielding analysis and photon transport in fusion plasmas. The relative merits of the various approaches are discussed, and the present status of known vectorization efforts is summarized along with available timing results, including results from the successful vectorization of 3-D general geometry, continuous energy Monte Carlo. (orig.)
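
    The contrast between the two algorithms can be sketched compactly. The Python fragment below advances a whole bank of particles through the same event (free flight, then collision) with array operations, which is the essence of the event-based reorganization that makes vectorization possible; the 1-D slab transmission problem and its cross sections are invented for illustration.

```python
import numpy as np

# Sketch of the "event-based" idea behind vectorized Monte Carlo: instead of
# following one history at a time, advance a whole bank of particles through
# the same event with array operations. Problem data are illustrative only.
SIG_T, P_ABSORB, WIDTH, N = 1.0, 0.3, 5.0, 100000
rng = np.random.default_rng(3)

x = np.zeros(N)                       # positions of all live particles
mu = rng.uniform(-1.0, 1.0, N)        # direction cosines
transmitted = 0
while x.size:
    # Event 1: free flight for the entire bank at once.
    x = x + mu * rng.exponential(1.0 / SIG_T, x.size)
    escaped_right = x > WIDTH
    transmitted += int(escaped_right.sum())
    alive = (x >= 0.0) & ~escaped_right
    # Event 2: collision for every particle still inside the slab.
    survived = rng.random(alive.sum()) > P_ABSORB
    x = x[alive][survived]
    mu = rng.uniform(-1.0, 1.0, x.size)   # isotropic re-emission after scatter

print("transmission probability ~", transmitted / N)
```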

  5. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Introduction and Problem Formulation for a Multiple Stressor Risk Assessment

    International Nuclear Information System (INIS)

    Efroymson, Rebecca Ann; Peterson, Mark J.; Jones, Daniel Steven; Suter, Glenn

    2008-01-01

    An ecological risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework (MERAF). The focus of the assessment was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. The problem formulation for the assessment included conceptual models for three component activities of the test, helicopter overflight, missile firing, and tracked vehicle movement, and two ecological endpoint entities, woody desert wash communities and desert mule deer (Odocoileus hemionus crooki) populations. An activity-specific risk assessment framework was available to provide guidance for assessing risks associated with aircraft overflights. Key environmental features of the study area include barren desert pavement and tree-lined desert washes. The primary stressors associated with helicopter overflights were sound and the view of the aircraft. The primary stressor associated with Hellfire missile firing was sound. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased ponding, infiltration and/or evaporation associated with disturbances to desert pavement. A plan for estimating integrated risks from the three military activities was included in the problem formulation

  6. Yours in Revolution: Retrofitting Carlos the Jackal

    Directory of Open Access Journals (Sweden)

    Samuel Thomas

    2013-09-01

    This paper explores the representation of 'Carlos the Jackal', the one-time 'World's Most Wanted Man' and 'International Face of Terror', primarily in cinema but also encompassing other forms of popular culture and aspects of Cold War policy-making. At the centre of the analysis is Olivier Assayas's Carlos (2010), a transnational, five-and-a-half-hour film (first screened as a TV mini-series) about the life and times of the infamous militant. Concentrating on the various ways in which Assayas expresses a critical preoccupation with names and faces through complex formal composition, the project examines the play of abstraction and embodiment that emerges from the narrativisation of terrorist violence. Lastly, it seeks to engage with the hidden implications of Carlos in terms of the intertwined trajectories of formal experimentation and revolutionary politics.

  7. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  8. Random Numbers and Monte Carlo Methods

    Science.gov (United States)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages Monte Carlo methods are very useful which sample the integration volume at randomly chosen points. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with given probability distribution which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
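
    As a one-function example of the random-number generation step discussed above, inverse-transform sampling maps a uniform variate through the inverse CDF of the target distribution; the exponential case below has a closed-form inverse.

```python
import math, random

# Inverse-transform sampling: if U is uniform on (0,1) and F is a target CDF,
# then F^{-1}(U) is distributed according to F. Shown for the exponential
# distribution, whose inverse CDF is available in closed form.
def sample_exponential(rate):
    u = random.random()
    return -math.log(1.0 - u) / rate   # F^{-1}(u) for F(x) = 1 - exp(-rate*x)

samples = [sample_exponential(2.0) for _ in range(100000)]
print("sample mean:", sum(samples) / len(samples), "(expected 1/rate = 0.5)")
```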

  9. Off-diagonal expansion quantum Monte Carlo.

    Science.gov (United States)

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  10. Reflections on early Monte Carlo calculations

    International Nuclear Information System (INIS)

    Spanier, J.

    1992-01-01

    Monte Carlo methods for solving various particle transport problems developed in parallel with the evolution of increasingly sophisticated computer programs implementing diffusion theory and low-order moments calculations. In these early years, Monte Carlo calculations and high-order approximations to the transport equation were seen as too expensive to use routinely for nuclear design but served as invaluable aids and supplements to design with less expensive tools. The earliest Monte Carlo programs were quite literal; i.e., neutron and other particle random-walk histories were simulated by sampling from the probability laws inherent in the physical system without distortion. Use of such analogue sampling schemes resulted in a good deal of time being spent in examining the possibility of lowering the statistical uncertainties in the sample estimates by replacing simple, and intuitively obvious, random variables by those with identical means but lower variances.
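
    The variance-reduction idea described in the last sentence, swapping a random variable for another with the same mean but lower variance, can be shown in a few lines. The sketch estimates a deep-penetration probability two ways: by analog sampling, where successes are vanishingly rare, and by sampling from a stretched path-length distribution with a compensating weight. The slab thickness and the biased sampling density are illustrative choices.

```python
import math, random

# Same mean, lower variance: estimate the uncollided-penetration probability
# exp(-tau) through an optically thick slab (tau = 10, so about 4.5e-5).
tau, n = 10.0, 100000

# Analog sampling: almost every path collides, so the estimator is nearly
# always zero and the relative error is huge.
analog = sum(1.0 for _ in range(n) if -math.log(random.random()) > tau) / n

# Biased sampling from a stretched exponential (rate 1/tau), each success
# carrying the weight f(s)/g(s) that restores the correct mean.
total = 0.0
for _ in range(n):
    s = -math.log(random.random()) * tau   # path length from rate-1/tau pdf
    if s > tau:
        total += math.exp(-s) / ((1.0 / tau) * math.exp(-s / tau))
estimate = total / n

print(f"analog: {analog:.2e}   weighted: {estimate:.2e}   exact: {math.exp(-tau):.2e}")
```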

  11. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependencies between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
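
    An analog (unbiased, but inefficient) version of the kind of Markov reliability simulation described above, for a hypothetical two-component parallel system; the rates and mission time are invented, and the forced-transition and failure-biasing variance reductions from the record are deliberately omitted:

        import numpy as np

        rng = np.random.default_rng(1)
        lam, mu, T = 1e-3, 1e-1, 100.0   # failure rate, repair rate, mission time
        N = 200_000                      # analog histories

        failures = 0
        for _ in range(N):
            t, up = 0.0, [True, True]
            while t < T:
                # Each up component can fail, each down one can be repaired;
                # the Markov chain jumps after an exponential holding time.
                rates = [lam if u else mu for u in up]
                total = sum(rates)
                t += rng.exponential(1.0 / total)
                if t >= T:
                    break
                k = 0 if rng.random() < rates[0] / total else 1
                up[k] = not up[k]
                if not (up[0] or up[1]):    # both components down: system failure
                    failures += 1
                    break

        p = failures / N
        print(f"unreliability over T: {p:.2e} +/- {np.sqrt(p*(1-p)/N):.1e}")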

  12. Shell model the Monte Carlo way

    International Nuclear Information System (INIS)

    Ormand, W.E.

    1995-01-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined

  13. Shell model the Monte Carlo way

    Energy Technology Data Exchange (ETDEWEB)

    Ormand, W.E.

    1995-03-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.

  14. SPQR: a Monte Carlo reactor kinetics code

    International Nuclear Information System (INIS)

    Cramer, S.N.; Dodds, H.L.

    1980-02-01

    The SPQR Monte Carlo code has been developed to analyze fast reactor core accident problems where conventional methods are considered inadequate. The code is based on the adiabatic approximation of the quasi-static method. This initial version contains no automatic material motion or feedback. An existing Monte Carlo code is used to calculate the shape functions and the integral quantities needed in the kinetics module. Several sample problems have been devised and analyzed. Due to the large statistical uncertainty associated with the calculation of reactivity in accident simulations, the results, especially at later times, differ greatly from deterministic methods. It was also found that in large uncoupled systems, the Monte Carlo method has difficulty in handling asymmetric perturbations

  15. Neutronic performance optimization study of Indian fusion demo reactor first wall and breeding blanket

    International Nuclear Information System (INIS)

    Swami, H.L.; Danani, C.

    2015-01-01

    In the frame of design studies of the Indian Nuclear Fusion DEMO Reactor, a neutronic performance optimization of the first wall and breeding blanket has been carried out. The study mainly focuses on estimating the tritium breeding ratio (TBR) and power density responses of the breeding blanket. Apart from the neutronic efficiency of the existing breeding blanket concepts for the Indian DEMO, i.e., the lead lithium ceramic breeder and the helium cooled solid breeder concepts, other concepts like helium cooled lead lithium and helium-cooled Li_8PbO_6 with reflector are also explored. The aim of the study is to establish a neutronically efficient breeding blanket concept for DEMO. The effect of first wall materials and thickness on the breeding blanket neutronic performance is also evaluated. For this study a 1-D cylindrical neutronic model of DEMO has been constructed according to the preliminary radial build-up of the Indian DEMO. The assessment is done using a Monte Carlo based radiation transport code and the nuclear cross section data file ENDF/B-VII. (author)

  16. Current and future applications of Monte Carlo

    International Nuclear Information System (INIS)

    Zaidi, H.

    2003-01-01

    Full text: The use of radionuclides in medicine has a long history and encompasses a large area of applications including diagnosis and radiation treatment of cancer patients using either external or radionuclide radiotherapy. The 'Monte Carlo method' describes a very broad area of science, in which many processes, physical systems, and phenomena are simulated by statistical methods employing random numbers. The general idea of Monte Carlo analysis is to create a model which is as similar as possible to the real physical system of interest, and to create interactions within that system based on known probabilities of occurrence, with random sampling of the probability density functions (pdfs). As the number of individual events (called 'histories') is increased, the quality of the reported average behavior of the system improves, meaning that the statistical uncertainty decreases. The use of the Monte Carlo method to simulate radiation transport has become the most accurate means of predicting absorbed dose distributions and other quantities of interest in the radiation treatment of cancer patients using either external or radionuclide radiotherapy. The same trend has occurred for the estimation of the absorbed dose in diagnostic procedures using radionuclides, as well as the assessment of image quality and quantitative accuracy of radionuclide imaging. As a consequence of this generalized use, many questions are being raised, primarily about the need and potential of Monte Carlo techniques, but also about how accurate they really are, what it would take to apply them clinically, and how to make them widely available to the nuclear medicine community at large. Many of these questions will be answered when Monte Carlo techniques are implemented and used for more routine calculations and for in-depth investigations. In this paper, the conceptual role of the Monte Carlo method is briefly introduced, followed by a survey of its different applications in diagnostic and therapeutic
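
    The point that statistical uncertainty decreases as histories accumulate can be illustrated with a toy tally; the exponential "score" distribution below is an arbitrary stand-in for a real pdf:

        import numpy as np

        rng = np.random.default_rng(7)

        # Toy "simulation": each history deposits a random score; the tally is
        # the sample mean, whose uncertainty shrinks as 1/sqrt(N).
        def run(n_histories):
            scores = rng.exponential(scale=1.0, size=n_histories)
            mean = scores.mean()
            stderr = scores.std(ddof=1) / np.sqrt(n_histories)
            return mean, stderr

        for n in [100, 10_000, 1_000_000]:
            m, s = run(n)
            print(f"N = {n:>9,d}: tally = {m:.4f} +/- {s:.4f}")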

  17. Monte Carlo method for array criticality calculations

    International Nuclear Information System (INIS)

    Dickinson, D.; Whitesides, G.E.

    1976-01-01

    The Monte Carlo method for solving neutron transport problems consists of mathematically tracing paths of individual neutrons collision by collision until they are lost by absorption or leakage. The fate of the neutron after each collision is determined by the probability distribution functions that are formed from the neutron cross-section data. These distributions are sampled statistically to establish the successive steps in the neutron's path. The resulting data, accumulated from following a large number of batches, are analyzed to give estimates of k/sub eff/ and other collision-related quantities. The use of electronic computers to produce the simulated neutron histories, initiated at Los Alamos Scientific Laboratory, made the use of the Monte Carlo method practical for many applications. In analog Monte Carlo simulation, the calculation follows the physical events of neutron scattering, absorption, and leakage. To increase calculational efficiency, modifications such as the use of statistical weights are introduced. The Monte Carlo method permits the use of a three-dimensional geometry description and a detailed cross-section representation. Some of the problems in using the method are the selection of the spatial distribution for the initial batch, the preparation of the geometry description for complex units, and the calculation of error estimates for region-dependent quantities such as fluxes. The Monte Carlo method is especially appropriate for criticality safety calculations since it permits an accurate representation of interacting units of fissile material. Dissimilar units, units of complex shape, moderators between units, and reflected arrays may be calculated. Monte Carlo results must be correlated with relevant experimental data, and caution must be used to ensure that a representative set of neutron histories is produced
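
    A toy illustration of tracing histories collision by collision, as described above: a one-speed neutron in a slab, with the fate at each collision sampled from cross-section ratios. The cross sections and slab thickness are invented for the sketch:

        import numpy as np

        rng = np.random.default_rng(3)

        sigma_t, sigma_s = 1.0, 0.7   # total and scattering macroscopic XS (1/cm)
        thickness = 5.0               # slab thickness (cm)
        N = 100_000

        absorbed = leaked = 0
        for _ in range(N):
            x, mu = 0.0, 1.0          # born on the left face, moving right
            while True:
                x += mu * rng.exponential(1.0 / sigma_t)   # free flight
                if x < 0.0 or x > thickness:
                    leaked += 1
                    break
                if rng.random() < sigma_s / sigma_t:       # type from XS ratio
                    mu = 2.0 * rng.random() - 1.0          # isotropic 1-D cosine
                else:
                    absorbed += 1
                    break

        print(f"absorption fraction {absorbed/N:.3f}, leakage fraction {leaked/N:.3f}")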

  18. Depreciation of the Indian Currency: Implications for the Indian Economy.

    OpenAIRE

    Sumanjeet Singh

    2009-01-01

    The Indian currency has depreciated by more than 20 per cent since April 2008 and breached its crucial 50-level against the greenback on sustained dollar purchases by foreign banks and a stronger dollar overseas. The fall in the value of the Indian rupee has several consequences which could have mixed effects on the Indian economy. But, mainly, there are four expected implications of the falling rupee. First, it should boost exports; second, it will lead to higher cost of imported goods and make some of th...

  19. New fellows | Announcements | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    ... of Medical Sciences, New Delhi; S K Bhowmik, Indian Institute of Technology, ... Souvik Mahapatra, Indian Institute of Technology, Mumbai; Prabal K Maiti, Indian ... Math Art and Design: MAD about Math, Math Education and Outreach.

  20. Asthma and American Indians/Alaska Natives

    Science.gov (United States)

    ... Minority Population Profiles > American Indian/Alaska Native > Asthma Asthma and American Indians/Alaska Natives In 2015, 240, ... Native American adults reported that they currently have asthma. American Indian/Alaska Native children are 60% more ...

  1. Monte Carlo simulation applied to alpha spectrometry

    International Nuclear Information System (INIS)

    Baccouche, S.; Gharbi, F.; Trabelsi, A.

    2007-01-01

    Alpha particle spectrometry is a widely-used analytical method, in particular when we deal with pure alpha emitting radionuclides. Monte Carlo simulation is an adequate tool to investigate the influence of various phenomena on this analytical method. We performed an investigation of those phenomena using the simulation code GEANT of CERN. The results concerning the geometrical detection efficiency in different measurement geometries agree with analytical calculations. This work confirms that Monte Carlo simulation of solid angle of detection is a very useful tool to determine with very good accuracy the detection efficiency.
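
    The geometrical detection efficiency mentioned in the record can be estimated by direction sampling; this sketch assumes an on-axis point source and a circular detector so the Monte Carlo answer can be checked against the known analytic solid-angle formula (geometry values are invented):

        import numpy as np

        rng = np.random.default_rng(5)

        h, R = 2.0, 1.5      # source-detector distance and detector radius (cm)
        N = 2_000_000

        # Sample isotropic directions: cos(theta) uniform on [-1, 1].
        cos_t = 1.0 - 2.0 * rng.random(N)

        # A particle emitted toward the detector (cos_t > 0) hits it if its ray
        # crosses the plane z = h inside radius R.
        going_up = cos_t > 0
        sin_t = np.sqrt(1.0 - cos_t[going_up]**2)
        r_at_plane = h * sin_t / cos_t[going_up]
        hits = np.count_nonzero(r_at_plane <= R)

        mc = hits / N
        analytic = 0.5 * (1.0 - h / np.sqrt(h*h + R*R))   # on-axis solid angle
        print(f"MC efficiency {mc:.5f} vs analytic {analytic:.5f}")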

  2. Simplified monte carlo simulation for Beijing spectrometer

    International Nuclear Information System (INIS)

    Wang Taijie; Wang Shuqin; Yan Wuguang; Huang Yinzhi; Huang Deqiang; Lang Pengfei

    1986-01-01

    The Monte Carlo method, based on the functionization of the performance of detectors and the transformation of values of kinematical variables into "measured" ones by means of smearing, has been used to program the Monte Carlo simulation of the performance of the Beijing Spectrometer (BES) in the FORTRAN language, named BESMC. It can be used to investigate the multiplicity, the particle types, and the distribution of four-momenta of the final states of electron-positron collisions, as well as the response of the BES to these final states. Thus, it provides a means to examine whether the overall design of the BES is reasonable and to decide the physics topics of the BES
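
    The smearing idea, transforming "true" kinematic values into "measured" ones, reduces in its simplest form to applying a parameterized resolution; the 2% Gaussian momentum resolution below is an assumed number, not a BES parameter:

        import numpy as np

        rng = np.random.default_rng(11)

        # "True" momenta from a generator become "measured" ones by smearing
        # with a parameterized detector resolution, instead of simulating every
        # detector process in detail.
        p_true = rng.uniform(0.5, 2.0, size=50_000)      # GeV/c, toy spectrum

        def smear(p, sigma_rel=0.02):
            """Apply a Gaussian relative momentum resolution (assumed 2%)."""
            return p * (1.0 + sigma_rel * rng.normal(size=p.shape))

        p_meas = smear(p_true)
        pull = (p_meas - p_true) / (0.02 * p_true)
        print(f"mean pull {pull.mean():+.3f}, width {pull.std():.3f} (expect ~0 and ~1)")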

  3. Self-learning Monte Carlo (dynamical biasing)

    International Nuclear Information System (INIS)

    Matthes, W.

    1981-01-01

    In many applications the histories of a normal Monte Carlo game rarely reach the target region. An approximate knowledge of the importance (with respect to the target) may be used to guide the particles more frequently into the target region. A Monte Carlo method is presented in which each history contributes to update the importance field such that eventually most target histories are sampled. It is a self-learning method in the sense that the procedure itself: (a) learns which histories are important (reach the target) and increases their probability; (b) reduces the probabilities of unimportant histories; (c) concentrates gradually on the more important target histories. (U.K.)
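
    One way to realize the self-learning scheme sketched above, hedged as a toy: a biased 1-D random walk toward a rarely reached target, where each successful history updates an importance table and carries a weight that keeps the estimate unbiased. The geometry and the update rule are illustrative, not the paper's:

        import numpy as np

        rng = np.random.default_rng(13)

        L = 50                     # sites 0..L; 0 = loss, L = target
        I = np.ones(L + 1)         # importance table, learned on the fly
        I[0] = 1e-6                # losses are (nearly) worthless

        est, hits, n_hist = 0.0, 0, 20_000
        for _ in range(n_hist):
            x, w, path = 1, 1.0, [1]
            while 0 < x < L:
                p_right = I[x + 1] / (I[x + 1] + I[x - 1])   # biased step pdf
                if rng.random() < p_right:
                    w *= 0.5 / p_right                       # analog/biased pdf
                    x += 1
                else:
                    w *= 0.5 / (1.0 - p_right)
                    x -= 1
                path.append(x)
            if x == L:
                hits += 1
                est += w
                for s in set(path):            # each history updates the field
                    I[s] += 1.0

        print(f"target fraction of histories: {hits/n_hist:.2f} (analog ~ {1/L:.2f})")
        print(f"weighted estimate of P(target): {est/n_hist:.4f} vs exact {1/L:.4f}")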

  4. Burnup calculations using Monte Carlo method

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Degweker, S.B.

    2009-01-01

    In recent years, interest in burnup calculations using Monte Carlo methods has gained momentum. Previous burnup codes have used multigroup transport theory based calculations followed by diffusion theory based core calculations for the neutronic portion of the codes. The transport theory methods invariably make approximations with regard to the treatment of the energy and angle variables involved in scattering, besides approximations related to geometry simplification. Cell homogenisation to produce diffusion theory parameters adds to these approximations. Moreover, while diffusion theory works for most reactors, it does not produce accurate results in systems that have strong gradients, strong absorbers or large voids. Also, diffusion theory codes are geometry limited (rectangular, hexagonal, cylindrical, and spherical coordinates). Monte Carlo methods are ideal for solving very heterogeneous reactors and/or lattices/assemblies in which considerable burnable poisons are used. The key feature of this approach is that Monte Carlo methods permit essentially 'exact' modeling of all geometrical detail, without resort to energy and spatial homogenization of neutron cross sections. The Monte Carlo method would also be better suited for Accelerator Driven Systems (ADS), which could have strong gradients due to the external source and a sub-critical assembly. To meet the demand for an accurate burnup code, we have developed a Monte Carlo burnup calculation code system in which a Monte Carlo neutron transport code is coupled with a versatile code (McBurn) for calculating the buildup and decay of nuclides in nuclear materials. McBurn was developed from scratch by the authors. In this article we discuss our effort in developing the continuous energy Monte Carlo burnup code, McBurn. McBurn is intended for entire reactor cores as well as for unit cells and assemblies. Generally, McBurn can do burnup of any geometrical system which can be handled by the underlying Monte Carlo transport code

  5. Improvements for Monte Carlo burnup calculation

    Energy Technology Data Exchange (ETDEWEB)

    Shenglong, Q.; Dong, Y.; Danrong, S.; Wei, L., E-mail: qiangshenglong@tsinghua.org.cn, E-mail: d.yao@npic.ac.cn, E-mail: songdr@npic.ac.cn, E-mail: luwei@npic.ac.cn [Nuclear Power Inst. of China, Cheng Du, Si Chuan (China)

    2015-07-01

    Monte Carlo burnup calculation is a development trend in reactor physics, and much work remains to be done for engineering applications. Based on the Monte Carlo burnup code MOI, non-fuel burnup calculation methods and critical search suggestions are presented in this paper. For non-fuel burnup, a mixed burnup mode will improve the accuracy and efficiency of the burnup calculation. For the critical search of control rod position, a new method called ABN, based on the ABA method used by MC21, is proposed for the first time in this paper. (author)

  6. A keff calculation method by Monte Carlo

    International Nuclear Information System (INIS)

    Shen, H; Wang, K.

    2008-01-01

    The effective multiplication factor (k_eff) is defined as the ratio between the numbers of neutrons in successive generations, which is the definition adopted by most Monte Carlo codes (e.g. MCNP). Alternatively, it can be thought of as the ratio of the generation rate of neutrons to the sum of the leakage rate and the absorption rate, which should exclude the effect of neutron reactions such as (n, 2n) and (n, 3n). This article discusses the Monte Carlo method for k_eff calculation based on the second definition. A new code has been developed and the results are presented. (author)
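
    Both definitions can be tallied in the same toy model; in a purely analog simulation without (n, 2n)-type multiplication the two estimators coincide, which is exactly why the second definition must exclude such reactions. All cross sections below are invented:

        import numpy as np

        rng = np.random.default_rng(17)

        sigma_t = 1.0
        p_scatter, p_capture, p_fission = 0.6, 0.25, 0.15   # per-collision probs
        nu = 2.5                                            # neutrons per fission
        thickness = 10.0
        N = 100_000

        leak = capture = fission = 0
        for _ in range(N):
            x, mu = thickness * rng.random(), 2.0 * rng.random() - 1.0
            while True:
                x += mu * rng.exponential(1.0 / sigma_t)
                if x < 0.0 or x > thickness:
                    leak += 1
                    break
                u = rng.random()
                if u < p_scatter:
                    mu = 2.0 * rng.random() - 1.0
                elif u < p_scatter + p_capture:
                    capture += 1
                    break
                else:
                    fission += 1    # history ends; site would seed next generation
                    break

        production = nu * fission
        absorption = capture + fission
        k_gen = production / N                     # ratio of successive generations
        k_bal = production / (leak + absorption)   # production/(leakage+absorption)
        print(f"k(generation ratio) = {k_gen:.4f}")
        print(f"k(balance)          = {k_bal:.4f}   # identical here: no (n,2n)")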

  7. Monte Carlo electron/photon transport

    International Nuclear Information System (INIS)

    Mack, J.M.; Morel, J.E.; Hughes, H.G.

    1985-01-01

    A review of nonplasma coupled electron/photon transport using the Monte Carlo method is presented. Remarks are mainly restricted to linearized formalisms at electron energies from 1 keV to 1000 MeV. Applications involving pulse-height estimation, transport in external magnetic fields, and optical Cerenkov production are discussed to underscore the importance of this branch of computational physics. Advances in electron multigroup cross-section generation are reported, and their impact on future code development is assessed. Progress toward the transformation of MCNP into a generalized neutral/charged-particle Monte Carlo code is described. 48 refs

  8. Monte Carlo simulation of neutron scattering instruments

    International Nuclear Information System (INIS)

    Seeger, P.A.

    1995-01-01

    A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width

  9. Monte Carlo applications to radiation shielding problems

    International Nuclear Information System (INIS)

    Subbaiah, K.V.

    2009-01-01

    Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. However, basic concepts of MC are both simple and straightforward and can be learned by using a personal computer. Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event where the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam, coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCS) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdf) of the random variables that characterize a track; 1) free path between successive interaction events, 2) type of interaction taking place and 3) energy loss and angular deflection in a particular event (and initial state of emitted secondary particles, if any). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
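
    The free-flight sampling step described above follows from the pdf p(s) = Σ_t exp(−Σ_t s); a quick sketch verifies the inverse-transform sample s = −ln(ξ)/Σ_t against the analytic mean free path and uncollided transmission (the cross section and thickness are arbitrary):

        import numpy as np

        rng = np.random.default_rng(19)

        sigma_t = 0.8      # macroscopic total cross section (1/cm)
        d = 3.0            # shield thickness (cm)
        N = 1_000_000

        xi = rng.random(N)
        s = -np.log1p(-xi) / sigma_t      # sampled free paths (log1p avoids log(0))

        print(f"mean free path: sampled {s.mean():.4f} cm vs 1/Sigma_t = {1/sigma_t:.4f} cm")
        print(f"uncollided transmission: sampled {(s > d).mean():.5f} "
              f"vs exp(-Sigma_t d) = {np.exp(-sigma_t*d):.5f}")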

  10. Simulation of transport equations with Monte Carlo

    International Nuclear Information System (INIS)

    Matthes, W.

    1975-09-01

    The main purpose of the report is to explain the relation between the transport equation and the Monte Carlo game used for its solution. The introduction of artificial particles carrying a weight provides one with high flexibility in constructing many different games for the solution of the same equation. This flexibility opens a way to construct a Monte Carlo game for the solution of the adjoint transport equation. Emphasis is laid mostly on giving a clear understanding of what to do and not on the details of how to do a specific game

  11. Monte Carlo dose distributions for radiosurgery

    International Nuclear Information System (INIS)

    Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E.

    2001-01-01

    The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetric input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distributions of a planning system and of Monte Carlo. Relative shifts have been measured and, furthermore, dose-volume histograms have been calculated for the target and adjacent organs at risk. (orig.)

  12. BIA Indian Lands Dataset (Indian Lands of the United States)

    Data.gov (United States)

    Federal Geographic Data Committee — The American Indian Reservations / Federally Recognized Tribal Entities dataset depicts feature location, selected demographics and other associated data for the 561...

  13. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the

  14. Specialized Monte Carlo codes versus general-purpose Monte Carlo codes

    International Nuclear Information System (INIS)

    Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi

    2002-01-01

    The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time-consuming, while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that the general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses a concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)

  15. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-03-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.

  16. Celebrating National American Indian Heritage Month

    National Research Council Canada - National Science Library

    Mann, Diane

    2004-01-01

    November has been designated National American Indian Heritage Month to honor American Indians and Alaska Natives by increasing awareness of their culture, history, and, especially, their tremendous...

  17. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Last known address: Professor, Department of Chemistry, Indian Institute of ... Specialization: Natural Products & Drug Development, Reaction Mechanism, ... Specialization: Plant Molecular Biology, Plant Tissue Culture and Genetic ...

  18. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Department of Electrical Engineering, Indian Institute of Technology, Powai, Mumbai ..... Specialization: Elementary Particle Physics ..... Sciences, National Institute of Science Education & Research, Jatni, Khordha 752 050, Orissa

  19. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Specialization: DNA Double-Strand Break Repair, Genomic Instability, Cancer ... Address: Indian Institute of Science Education & Research, Dr Homi Bhabha Road, .... Inflammatory Bowel Disease, Gastrointestinal Microbiome Stem Cells

  20. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Time Programs, Logic Programs, Mobile Computing and Computer & Information Security Address: Distinguished V Professor, Computer Science & Engineering Department, Indian Institute of Technology, Powai, Mumbai 400 076, Maharashtra

  1. Combination of Mean Platelet Volume/Platelet Count Ratio and the APACHE II Score Better Predicts the Short-Term Outcome in Patients with Acute Kidney Injury Receiving Continuous Renal Replacement Therapy.

    Science.gov (United States)

    Li, Junhui; Li, Yingchuan; Sheng, Xiaohua; Wang, Feng; Cheng, Dongsheng; Jian, Guihua; Li, Yongguang; Feng, Liang; Wang, Niansong

    2018-03-29

    Both the Acute Physiology and Chronic Health Evaluation (APACHE II) score and mean platelet volume/platelet count ratio (MPR) can independently predict adverse outcomes in critically ill patients. This study aimed to investigate whether their combination could perform better in predicting the prognosis of patients with acute kidney injury (AKI) who received continuous renal replacement therapy (CRRT). Two hundred twenty-three patients with AKI who underwent CRRT between January 2009 and December 2014 in a Chinese university hospital were enrolled. They were divided into a survivor group and a non-survivor group based on the situation at discharge. A receiver operating characteristic (ROC) curve was used for MPR and the APACHE II score, and to determine the optimal cut-off value of MPR for in-hospital mortality. Factors associated with mortality were identified by univariate and multivariate logistic regression analysis. The mean age of the patients was 61.4 years, and the overall in-hospital mortality was 48.4%. Acute cardiorenal syndrome (ACRS) was the most common cause of AKI. The optimal cut-off value of MPR for mortality was 0.099, with an area under the ROC curve (AUC) of 0.636. The AUC increased to 0.851 with the addition of the APACHE II score. The mortality of patients with MPR > 0.099 was 56.4%, which was significantly higher than that of the group with MPR ≤ 0.099 (39.6%, P = 0.012). Logistic regression analysis showed that the average number of organ failures (OR = 2.372), APACHE II score (OR = 1.187), age (OR = 1.028) and vasopressor administration (OR = 38.130) were significantly associated with poor prognosis. Severity of illness was significantly associated with the prognosis of patients with AKI. The combination of MPR and APACHE II score may be helpful in predicting the short-term outcome of AKI. © 2018 The Author(s). Published by S. Karger AG, Basel.

  2. Utility of Procalcitonin (PCT) and Mid-regional pro-Adrenomedullin (MR-proADM) in risk stratification of critically ill febrile patients in the Emergency Department (ED): A comparison with APACHE II score

    Directory of Open Access Journals (Sweden)

    Travaglino Francesco

    2012-08-01

    Full Text Available Abstract Background The aim of our study was to evaluate the prognostic value of MR-proADM and PCT levels in febrile patients in the ED in comparison with a disease severity index score, the APACHE II score. We also evaluated the ability of MR-proADM and PCT to predict hospitalization. Methods This was an observational, multicentric study. We enrolled 128 patients referred to the ED with high fever and a suspicion of severe infection such as sepsis, lower respiratory tract infections, urinary tract infections, gastrointestinal infections, soft tissue infections, central nervous system infections, or osteomyelitis. The APACHE II score was calculated for each patient. Results MR-proADM median values in controls were 0.5 nmol/l as compared with 0.85 nmol/l in patients. MR-proADM and PCT levels were significantly increased in accordance with the APACHE II quartiles. In the respiratory infection, urinary infection, and sepsis-septic shock groups we found a correlation between APACHE II and MR-proADM, and between MR-proADM and PCT. We evaluated the ability of MR-proADM and PCT to predict hospitalization in patients admitted to our emergency departments complaining of fever. MR-proADM alone had an AUC of 0.694, while PCT alone had an AUC of 0.763. The combined use of PCT and MR-proADM instead showed an AUC of 0.79. Conclusions The present study highlights the way in which MR-proADM and PCT may be helpful to the febrile patient's care in the ED. Our data support the prognostic role of MR-proADM and PCT in that setting, as demonstrated by the correlation with the APACHE II score. The combined use of the two biomarkers can predict a subsequent hospitalization of febrile patients. The rational use of these two molecules could lead to several advantages, such as faster diagnosis, more accurate risk stratification, and optimization of the treatment, with consequent benefit to the patient and

  3. Combination of Mean Platelet Volume/Platelet Count Ratio and the APACHE II Score Better Predicts the Short-Term Outcome in Patients with Acute Kidney Injury Receiving Continuous Renal Replacement Therapy

    Directory of Open Access Journals (Sweden)

    Junhui Li

    2018-03-01

    Full Text Available Background/Aims: Both the Acute Physiology and Chronic Health Evaluation (APACHE II) score and mean platelet volume/platelet count ratio (MPR) can independently predict adverse outcomes in critically ill patients. This study aimed to investigate whether their combination could perform better in predicting the prognosis of patients with acute kidney injury (AKI) who received continuous renal replacement therapy (CRRT). Methods: Two hundred twenty-three patients with AKI who underwent CRRT between January 2009 and December 2014 in a Chinese university hospital were enrolled. They were divided into a survivor group and a non-survivor group based on the situation at discharge. A receiver operating characteristic (ROC) curve was used for MPR and the APACHE II score, and to determine the optimal cut-off value of MPR for in-hospital mortality. Factors associated with mortality were identified by univariate and multivariate logistic regression analysis. Results: The mean age of the patients was 61.4 years, and the overall in-hospital mortality was 48.4%. Acute cardiorenal syndrome (ACRS) was the most common cause of AKI. The optimal cut-off value of MPR for mortality was 0.099, with an area under the ROC curve (AUC) of 0.636. The AUC increased to 0.851 with the addition of the APACHE II score. The mortality of patients with MPR > 0.099 was 56.4%, which was significantly higher than that of the group with MPR ≤ 0.099 (39.6%, P = 0.012). Logistic regression analysis showed that the average number of organ failures (OR = 2.372), APACHE II score (OR = 1.187), age (OR = 1.028) and vasopressor administration (OR = 38.130) were significantly associated with poor prognosis. Conclusion: Severity of illness was significantly associated with the prognosis of patients with AKI. The combination of MPR and APACHE II score may be helpful in predicting the short-term outcome of AKI.

  4. Indian Danish intermarriage

    DEFF Research Database (Denmark)

    Singla, Rashmi; Sriram, Sujata

    This paper explores motivations of the Indian partner in mixed Indian-Danish couples living in Denmark. One of the characteristics of modernity is increased movement across borders, leading to increased intimate relationships across national/ethnic borders. The main research question here deals with the reasons for the couple 'getting together'. How do motives interplay with gender and with family-generational and socio-economic categories? The paper draws from an explorative study conducted in Denmark among intermarried couples, consisting of in-depth interviews with ten 'ordinary' intermarried couples… (TEM), transnationalism and a phenomenological approach to sexual desire and love. We find that there are three different pathways, highlighting commonality of work identity, a cosmopolitan identity and academic interests, where differential changing patterns of privileges and power are also evoked...

  5. Indian President visits CERN

    CERN Multimedia

    Katarina Anthony

    2011-01-01

    On 1 October, Her Excellency Mrs Pratibha Devisingh Patil, President of India, picked CERN as the first stop on her official state visit to Switzerland. Accompanied by a host of Indian journalists, a security team, and a group of presidential delegates, the president left quite an impression when she visited CERN's Point 2!   Upon arrival, Pratibha Patil was greeted by CERN Director General Rolf Heuer, as well as senior Indian scientists working at CERN, and various department directors. After a quick overview of the Organization, Rolf Heuer and the President addressed India's future collaboration with CERN. India is currently an Observer State of the Organization, and is considering becoming an Associate Member State. A short stop in LHC operations gave Steve Myers and the Accelerator team the opportunity to take the President on a tour through the LHC tunnel. From there, ALICE's Tapan Nayak and Spokesperson Paolo Giubellino took Pratibha Patil to the experiment…

  6. Parallel processing Monte Carlo radiation transport codes

    International Nuclear Information System (INIS)

    McKinney, G.W.

    1994-01-01

    Issues related to distributed-memory multiprocessing as applied to Monte Carlo radiation transport are discussed. Measurements of communication overhead are presented for the radiation transport code MCNP which employs the communication software package PVM, and average efficiency curves are provided for a homogeneous virtual machine

  7. Monte Carlo determination of heteroepitaxial misfit structures

    DEFF Research Database (Denmark)

    Baker, J.; Lindgård, Per-Anker

    1996-01-01

    We use Monte Carlo simulations to determine the structure of KBr overlayers on a NaCl(001) substrate, a system with large (17%) heteroepitaxial misfit. The equilibrium relaxation structure is determined for films of 2-6 ML, for which extensive helium-atom scattering data exist for comparison...

  8. Juan Carlos D'Olivo: A portrait

    Science.gov (United States)

    Aguilar-Arévalo, Alexis A.

    2013-06-01

    This report attempts to give a brief biographical sketch of the academic life of Juan Carlos D'Olivo, researcher and teacher at the Instituto de Ciencias Nucleares of UNAM, devoted to advancing the fields of High Energy Physics and Astroparticle Physics in Mexico and Latin America.

  9. The Monte Carlo applied for calculation dose

    International Nuclear Information System (INIS)

    Peixoto, J.E.

    1988-01-01

    The Monte Carlo method is presented for the calculation of absorbed dose. The trajectory of the photon is traced by simulating successive interactions between the photon and the material composing the human-body phantom. The energy deposited in each organ or tissue of the phantom per photon interaction is also calculated. (C.G.C.) [pt
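
    A deliberately crude sketch of the tallying described above: photon trajectories are traced through a 1-D phantom and the energy deposited at each sampled interaction is binned in depth. The attenuation coefficient and the per-event energy-deposition model are stand-ins, not the record's physics:

        import numpy as np

        rng = np.random.default_rng(23)

        mu_lin = 0.07        # linear attenuation coefficient (1/cm)
        E0 = 1.0             # starting photon energy (arbitrary units)
        depth, nbins = 30.0, 60
        dose = np.zeros(nbins)
        N = 200_000

        for _ in range(N):
            z, E = 0.0, E0
            while E > 0.01 * E0:
                z += rng.exponential(1.0 / mu_lin)     # straight-ahead transport
                if z >= depth:
                    break
                dep = E * rng.uniform(0.2, 0.8)        # energy left at this event
                dose[int(z / depth * nbins)] += dep
                E -= dep

        dose /= N
        print("dose per photon in the first five 0.5 cm bins:", np.round(dose[:5], 4))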

  10. Monte Carlo code for neutron radiography

    International Nuclear Information System (INIS)

    Milczarek, Jacek J.; Trzcinski, Andrzej; El-Ghany El Abd, Abd; Czachor, Andrzej

    2005-01-01

    The concise Monte Carlo code, MSX, for simulation of neutron radiography images of non-uniform objects is presented. The possibility of modeling the images of objects with continuous spatial distribution of specific isotopes is included. The code can be used for assessment of the scattered neutron component in neutron radiograms

  11. Monte Carlo code for neutron radiography

    Energy Technology Data Exchange (ETDEWEB)

    Milczarek, Jacek J. [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland)]. E-mail: jjmilcz@cyf.gov.pl; Trzcinski, Andrzej [Institute for Nuclear Studies, Swierk, 05-400 Otwock (Poland); El-Ghany El Abd, Abd [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland); Nuclear Research Center, PC 13759, Cairo (Egypt); Czachor, Andrzej [Institute of Atomic Energy, Swierk, 05-400 Otwock (Poland)

    2005-04-21

    The concise Monte Carlo code, MSX, for simulation of neutron radiography images of non-uniform objects is presented. The possibility of modeling the images of objects with continuous spatial distribution of specific isotopes is included. The code can be used for assessment of the scattered neutron component in neutron radiograms.

  12. Monte Carlo method in neutron activation analysis

    International Nuclear Information System (INIS)

    Majerle, M.; Krasa, A.; Svoboda, O.; Wagner, V.; Adam, J.; Peetermans, S.; Slama, O.; Stegajlov, V.I.; Tsupko-Sitnikov, V.M.

    2009-01-01

    Neutron activation detectors are a useful technique for neutron flux measurements in spallation experiments. The study of the usefulness and the accuracy of this method in similar experiments was performed with the help of the Monte Carlo codes MCNPX and FLUKA

  13. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction… of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol…

  14. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments, and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or Latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language

  15. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  16. Monte Carlo methods beyond detailed balance

    NARCIS (Netherlands)

    Schram, Raoul D.; Barkema, Gerard T.|info:eu-repo/dai/nl/101275080

    2015-01-01

    Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying

  17. Monte Carlo studies of ZEPLIN III

    CERN Document Server

    Dawson, J; Davidge, D C R; Gillespie, J R; Howard, A S; Jones, W G; Joshi, M; Lebedenko, V N; Sumner, T J; Quenby, J J

    2002-01-01

    A Monte Carlo simulation of the two-phase xenon dark matter detector ZEPLIN III has been performed. Results from the analysis of a simulated data set are presented, showing primary and secondary signal distributions from low-energy gamma-ray events.

  18. Biases in Monte Carlo eigenvalue calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gelbard, E.M.

    1992-12-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the "fixed-source" case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated ("replicated") over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.
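
    The origin of the eigenvalue bias, the answer being a ratio of correlated random tallies, can be seen in a toy ratio estimator; the exponential variates below are arbitrary stand-ins for per-replica tallies:

        import numpy as np

        rng = np.random.default_rng(29)

        # Estimate E[X]/E[Y] from small replicas: the mean of per-replica ratios
        # is biased, while the fixed-source analogue (a plain mean) is not.
        true_ratio = 2.0 / 1.0
        n_replicas, n_per_replica = 100_000, 10

        x = rng.exponential(2.0, size=(n_replicas, n_per_replica))
        y = rng.exponential(1.0, size=(n_replicas, n_per_replica))

        mean_of_ratios = (x.mean(axis=1) / y.mean(axis=1)).mean()
        ratio_of_means = x.mean() / y.mean()
        print(f"mean of per-replica ratios: {mean_of_ratios:.4f} (biased high)")
        print(f"ratio of pooled means:      {ratio_of_means:.4f} (exact {true_ratio:.4f})")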

  19. Biases in Monte Carlo eigenvalue calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gelbard, E.M.

    1992-01-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the "fixed-source" case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated ("replicated") over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.

  20. Dynamic bounds coupled with Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Rajabalinejad, M., E-mail: M.Rajabalinejad@tudelft.n [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands); Meester, L.E. [Delft Institute of Applied Mathematics, Delft University of Technology, Delft (Netherlands); Gelder, P.H.A.J.M. van; Vrijling, J.K. [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands)

    2011-02-15

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce the simulation cost of the MC method, variance reduction methods are applied. This paper describes a method to reduce the simulation cost even further, while retaining the accuracy of Monte Carlo, by taking into account widely present monotonicity. For models exhibiting monotonic (decreasing or increasing) behavior, dynamic bounds (DB) are defined, which in a coupled Monte Carlo simulation are updated dynamically, resulting in a failure probability estimate as well as strict (non-probabilistic) upper and lower bounds. Accurate results are obtained at a much lower cost than an equivalent ordinary Monte Carlo simulation. In a two-dimensional and a four-dimensional numerical example, the cost reduction factors are 130 and 9, respectively, where the relative error is smaller than 5%. At higher accuracy levels, this factor increases, though this effect is expected to be smaller with increasing dimension. To show the application of the DB method to real world problems, it is applied to a complex finite element model of a flood wall in New Orleans.
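
    A minimal sketch of exploiting monotonicity in the way the record describes: for a limit-state function increasing in every input, any sample dominating a known-failed point can be classified without a model call, and likewise for safe points. The linear g and its threshold are invented:

        import numpy as np

        rng = np.random.default_rng(31)

        def g(u):                  # stand-in for an expensive monotonic model
            return u[0] + 2.0 * u[1] - 2.4

        failed, safe, evals, fail_count = [], [], 0, 0
        N = 20_000
        for _ in range(N):
            u = rng.random(2)
            if any(np.all(u >= f) for f in failed):    # dominates a failure: fails
                fail_count += 1
            elif any(np.all(u <= s) for s in safe):    # dominated by safe: safe
                pass
            else:
                evals += 1
                if g(u) > 0.0:
                    fail_count += 1
                    failed.append(u)
                else:
                    safe.append(u)

        print(f"failure probability ~ {fail_count/N:.4f} using only {evals} model calls")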

  1. Dynamic bounds coupled with Monte Carlo simulations

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; Meester, L.E.; van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2011-01-01

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper

  2. Design and analysis of Monte Carlo experiments

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Gentle, J.E.; Haerdle, W.; Mori, Y.

    2012-01-01

    By definition, computer simulation or Monte Carlo models are not solved by mathematical analysis (such as differential calculus), but are used for numerical experimentation. The goal of these experiments is to answer questions about the real world; i.e., the experimenters may use their models to

  3. Some problems on Monte Carlo method development

    International Nuclear Information System (INIS)

    Pei Lucheng

    1992-01-01

    This is a short paper on some problems of Monte Carlo method development. The content covers deep-penetration problems, unbounded estimate problems, limitations of Metropolis' method, the dependency problem in Metropolis' method, random error interference problems and random equations, and intellectualisation and vectorization problems of general software

  4. Monte Carlo simulations in theoretical physics

    International Nuclear Information System (INIS)

    Billoire, A.

    1991-01-01

    After a presentation of the Monte Carlo method's principle, the method is applied first to the calculation of critical exponents in the three-dimensional Ising model, and secondly to discrete quantum chromodynamics, with calculation times given as a function of computer power. 28 refs., 4 tabs.

  5. Monte Carlo method for random surfaces

    International Nuclear Information System (INIS)

    Berg, B.

    1985-01-01

    Previously two of the authors proposed a Monte Carlo method for sampling statistical ensembles of random walks and surfaces with a Boltzmann probabilistic weight. In the present paper we work out the details for several models of random surfaces, defined on d-dimensional hypercubic lattices. (orig.)

  6. Monte Carlo simulation of the microcanonical ensemble

    International Nuclear Information System (INIS)

    Creutz, M.

    1984-01-01

    We consider simulating statistical systems with a random walk on a constant energy surface. This combines features of deterministic molecular dynamics techniques and conventional Monte Carlo simulations. For discrete systems the method can be programmed to run an order of magnitude faster than other approaches. It does not require high quality random numbers and may also be useful for nonequilibrium studies. 10 references
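
    This record describes Creutz's microcanonical method; a demon-style sketch of the constant-energy random walk for a 1-D Ising ring follows. The lattice size, initial demon energy, and the temperature read-off from the quantized demon-energy distribution are illustrative choices:

        import numpy as np

        rng = np.random.default_rng(37)

        # Total energy of system + demon is conserved; the demon can only hold
        # non-negative energy and pays for (or absorbs) each spin flip.
        L, sweeps = 1000, 400
        spins = np.ones(L, dtype=int)       # cold start: E = -L (J = 1, periodic)
        demon = 700                         # demon energy injects the "heat"

        demon_trace = []
        for _ in range(sweeps):
            for _ in range(L):
                i = rng.integers(L)
                dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % L])
                if dE <= demon:             # accept only if the demon can pay
                    spins[i] = -spins[i]
                    demon -= dE
            demon_trace.append(demon)

        ed = np.mean(demon_trace[100:])     # discard equilibration sweeps
        beta = 0.25 * np.log(1.0 + 4.0 / ed)   # demon energy quantized in 4s
        print(f"<E_demon> = {ed:.2f} -> estimated temperature T = {1/beta:.2f}")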

  7. Variance Reduction Techniques in Monte Carlo Methods

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.

    2010-01-01

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the

  8. Gian-Carlo Rota and Combinatorial Math.

    Science.gov (United States)

    Kolata, Gina Bari

    1979-01-01

    Presents the first of a series of occasional articles about mathematics as seen through the eyes of its prominent scholars. In an interview with Gian-Carlo Rota of the Massachusetts Institute of Technology he discusses how combinatorial mathematics began as a field and its future. (HM)

  9. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
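
    The MLEM update at the heart of the reconstruction is multiplicative; in the study the projection matrix comes from GATE Monte Carlo simulations of the coded mask, while the sketch below substitutes a random non-negative matrix and a synthetic Poisson measurement:

        import numpy as np

        rng = np.random.default_rng(41)

        # Toy system: y ~ Poisson(A x), with A standing in for the simulated
        # coded-aperture projection matrix.
        n_pix, n_det = 64, 96
        A = rng.random((n_det, n_pix)) * (rng.random((n_det, n_pix)) < 0.2)
        x_true = np.zeros(n_pix); x_true[[10, 30, 31]] = [50.0, 80.0, 60.0]
        y = rng.poisson(A @ x_true).astype(float)

        x = np.ones(n_pix)                   # uniform initial estimate
        sens = A.sum(axis=0)                 # sensitivity image: A^T 1
        for _ in range(200):                 # multiplicative MLEM update
            x *= (A.T @ (y / np.clip(A @ x, 1e-12, None))) / np.clip(sens, 1e-12, None)

        top = np.argsort(x)[-3:]
        print("brightest reconstructed pixels:", sorted(top), "(true: [10, 30, 31])")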

  10. Biases in Monte Carlo eigenvalue calculations

    International Nuclear Information System (INIS)

    Gelbard, E.M.

    1992-01-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the ''fixed-source'' case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated (''replicated'') over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here

  11. Monte Carlo studies of uranium calorimetry

    International Nuclear Information System (INIS)

    Brau, J.; Hargis, H.J.; Gabriel, T.A.; Bishop, B.L.

    1985-01-01

    Detailed Monte Carlo calculations of uranium calorimetry are presented which reveal a significant difference in the responses of liquid argon and plastic scintillator in uranium calorimeters. Due to saturation effects, neutrons from the uranium are found to contribute only weakly to the liquid argon signal. Electromagnetic sampling inefficiencies are significant and contribute substantially to compensation in both systems. 17 references

  12. Indian cosmogonies and cosmologies

    Directory of Open Access Journals (Sweden)

    Pajin Dušan

    2011-01-01

    Full Text Available Various ideas on how the universe appeared and develops were related in Indian tradition to mythic, religious, or philosophical ideas and contexts, and developed over some 3,000 years, from the time of the Vedas to the Puranas. Concerning its appearance, two main ideas were presented. In one concept it appeared out of itself (auto-generated), and gods were among the first to appear in the cosmic sequences. In the other, it was a kind of divine creation, with hard work (like the dismembering of the primal Purusha), or as an emanation of divine dance. Indian tradition also produced various critiques of mythic and religious concepts (from the 8th c. BC to the 6th c.), which favoured naturalistic and materialistic explanations and concepts in cosmogony and cosmology. One of the peculiarities is that Indian cosmogony and cosmology include great time spans, since they used a digit system which was later (in the 13th c.) introduced to Europe by Fibonacci (Leonardo of Pisa, 1170-1240).

  13. Working Women: Indian Perspective

    Directory of Open Access Journals (Sweden)

    Dharmendra MEHTA

    2014-06-01

    Full Text Available In India, due to an unprecedented rise in the cost of living, rising prices of commodities, growing expenses on children's education, a huge rate of unemployment, and the increasing cost of housing properties, every Indian family is compelled to explore all possible ways and means to increase household income. It is also witnessed that after globalization Indian women are able to get more jobs, but the work they get is more casual in nature, or is work that men do not prefer to do, or is left by them as they move to higher or better jobs. Working women refers to those in paid employment. They work as lawyers, nurses, doctors, teachers, secretaries, etc. There is no profession today where women are not employed. University of Oxford's Professor Linda Scott recently coined the term the Double X Economy to describe the global economy of women. The present paper makes an attempt to discuss issues and challenges that are being faced by Indian working women at their respective workstations.

  14. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbation but demands computing resources. ► The analytical method is limited to small perturbations on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of a substantial impact of the administrative margin of subcriticality on economics and safety of nuclear fuel cycle operations, recently increasing interests in reducing the administrative margin of subcriticality make the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
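
    A sampling-based k_eff uncertainty analysis in miniature: uncertain one-group data are drawn repeatedly and a toy k_eff formula stands in for the transport code. All nominal values and uncertainties below are invented:

        import numpy as np

        rng = np.random.default_rng(43)

        # Perturb uncertain one-group data, recompute a toy
        # k = nu*Sigma_f / (Sigma_a + D*B^2) for each sample, read off the spread.
        n = 50_000
        nu_sf = rng.normal(0.0320, 0.0005, n)    # nu*Sigma_f and 1-sigma uncertainty
        sig_a = rng.normal(0.0300, 0.0006, n)    # Sigma_a
        DB2   = rng.normal(0.0015, 0.0001, n)    # leakage term D*B^2

        k = nu_sf / (sig_a + DB2)
        print(f"k_eff = {k.mean():.4f} +/- {k.std(ddof=1):.4f} (from data uncertainty alone)")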

  15. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    PRIYANKA SHUKLA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 133-143 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Grad-type fourteen-moment theory for ...

  16. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    SERGEY P KUZNETSOV. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 117-132 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Chaos in three coupled rotators: ...

  17. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    NORBERT MARWAN. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short time ...

  18. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    GIOVANNA ZIMATORE. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 35-41 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. RQA correlations on real business cycles ...

  19. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    SUDHARSANA V IYENGAR. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 93-99 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Missing cycles: Effect of climate ...

  20. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    BEDARTHA GOSWAMI. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short ...

  1. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    MURILO S BAPTISTA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 17-23 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Interpreting physical flows in networks as a ...

  2. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    F REVUELTA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 145-155 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Rate calculation in two-dimensional barriers with ...

  3. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    JOYDEEP SINGHA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 195-203 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Spatial splay states in coupled map lattices ...

  4. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    F FAMILY. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 221-224 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Transport in ratchets with single-file constraint.

  5. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    JANAKI BALAKRISHNAN. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 93-99 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Missing cycles: Effect of climate change ...

  6. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    PAUL SCHULTZ. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short time ...

  7. Pore-scale uncertainty quantification with multilevel Monte Carlo

    KAUST Repository

    Icardi, Matteo; Hoel, Haakon; Long, Quan; Tempone, Raul

    2014-01-01

    …Since there are no generic ways to parametrize the randomness in the pore-scale structures, Monte Carlo techniques are the most accessible way to compute statistics. We propose a multilevel Monte Carlo (MLMC) technique to reduce the computational cost…
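
    A rough sketch of the MLMC idea referenced in this record, under stated assumptions: the expectation of a quantity Q computed at the finest resolution is decomposed into a coarse-level estimate plus level-by-level corrections, E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_(l-1)], so many cheap coarse samples are combined with few expensive fine ones. The integrand, resolution hierarchy, and per-level sample counts below are arbitrary demonstration choices, not the pore-scale solver of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def quantity(a, level):
            # Midpoint-rule approximation of integral_0^1 sin(a*x) dx using
            # 2**(level + 2) cells; finer levels are more accurate but costlier.
            n = 2 ** (level + 2)
            x = (np.arange(n) + 0.5) / n
            return np.sin(a * x).mean()

        L = 4                                                     # finest level
        n_samples = [4000 // 2 ** l + 50 for l in range(L + 1)]   # fewer samples on finer levels

        estimate = 0.0
        for level, n in enumerate(n_samples):
            a = rng.uniform(1.0, 2.0, n)    # random input, shared by the coupled pair
            fine = np.array([quantity(ai, level) for ai in a])
            if level == 0:
                correction = fine           # coarsest level: plain Monte Carlo
            else:
                coarse = np.array([quantity(ai, level - 1) for ai in a])
                correction = fine - coarse  # level correction, small variance
            estimate += correction.mean()

        print(f"MLMC estimate of E[Q] = {estimate:.5f}")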

  8. Prospect on general software of Monte Carlo method

    International Nuclear Information System (INIS)

    Pei Lucheng

    1992-01-01

    This is a short paper on the prospects for general-purpose Monte Carlo software. The content covers the cluster sampling method, the zero variance technique, self-improved methods, and vectorized Monte Carlo methods.

  9. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung; Liang, Faming

    2009-01-01

    in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method

  10. The international INTRAVAL project. Phase 2, working group 1 report. Flow and tracer experiments in unsaturated tuff and soil. Las Cruces trench and Apache Leap tuff studies

    International Nuclear Information System (INIS)

    Nicholson, T.J.; Guzman-Guzman, A.; Hills, R.; Rasmussen, T.C.

    1997-01-01

    The Working Group 1 final report summarizes two test case studies: the Las Cruces Trench (LCT) and Apache Leap Tuff Site (ALTS) experiments. The objectives of these two field studies were to evaluate models for water flow and contaminant transport in unsaturated, heterogeneous soils and fractured tuff. The LCT experiments were specifically designed to test various deterministic and stochastic models of water flow and solute transport in heterogeneous, unsaturated soils. Experimental data from the first two LCT experiments, and detailed field characterisation studies, provided information for developing and calibrating the models. Experimental results from the third experiment were held confidential from the modellers, and were used for model comparison. Comparative analyses included: point comparisons of water content; predicted mean behavior for water flow; point comparisons of solute concentrations; and predicted mean behavior for tritium transport. These analyses indicated that no model, whether uniform or heterogeneous, proved superior. Since the INTRAVAL study, however, a new method has been developed for conditioning the hydraulic properties used for flow and transport modelling based on the initial field-measured water content distributions and a set of scale-mean hydraulic parameters. Very good matches between the observed and simulated flow and transport behavior were obtained using the conditioning procedure, without model calibration. The ALTS experiments were designed to evaluate characterisation methods and their associated conceptual models for coupled matrix-fracture continua over a range of scales (i.e., 2.5 centimeter rock samples; 10 centimeter cores; 1 meter block; and 30 meter boreholes). Within these spatial scales, laboratory and field tests were conducted for estimating pneumatic, thermal, hydraulic, and transport property values for different conceptual models. The analyses included testing of current conceptual, mathematical and physical

  11. Summary of air permeability data from single-hole injection tests in unsaturated fractured tuffs at the Apache Leap Research Site: Results of steady-state test interpretation

    International Nuclear Information System (INIS)

    Guzman, A.G.; Geddis, A.M.; Henrich, M.J.; Lohrstorfer, C.F.; Neuman, S.P.

    1996-03-01

    This document summarizes air permeability estimates obtained from single hole pneumatic injection tests in unsaturated fractured tuffs at the Covered Borehole Site (CBS) within the larger Apache Leap Research Site (ALRS). Only permeability estimates obtained from a steady state interpretation of relatively stable pressure and flow rate data are included. Tests were conducted in five boreholes inclined at 45 degrees to the horizontal, and one vertical borehole. Over 180 borehole segments were tested by setting the packers 1 m apart. Additional tests were conducted in segments of lengths 0.5, 2.0, and 3.0 m in one borehole, and 2.0 m in another borehole, bringing the total number of tests to over 270. Tests were conducted by maintaining a constant injection rate until air pressure became relatively stable and remained so for some time. The injection rate was then incremented by a constant value and the procedure repeated. The air injection rate, pressure, temperature, and relative humidity were recorded. For each relatively stable period of injection rate and pressure, air permeability was estimated by treating the rock around each test interval as a uniform, isotropic porous medium within which air flows as a single phase under steady state, in a pressure field exhibiting prolate spheroidal symmetry. For each permeability estimate the authors list the corresponding injection rate, pressure, temperature and relative humidity. They also present selected graphs which show how the latter quantities vary with time; logarithmic plots of pressure versus time which demonstrate the importance of borehole storage effects during the early transient portion of each incremental test period; and semilogarithmic plots of pressure versus recovery time at the end of each test sequence.

  12. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Tracked Vehicle Movement across Desert Pavement

    International Nuclear Information System (INIS)

    Peterson, Mark J; Efroymson, Rebecca Ann; Hargrove, William Walter

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the tracked vehicle movement component of the testing program. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased infiltration and/or evaporation associated with disturbances to desert pavement. The simulated exposure of wash vegetation to water loss was quantified using estimates of exposed land area from a digital ortho quarter quad aerial photo and field observations, a 30 × 30 m digital elevation model, the flow accumulation feature of ESRI ArcInfo, and a two-step process in which runoff was estimated from direct precipitation to a land area and from water that flowed from upgradient to a land area. In all simulated scenarios, absolute water loss decreased with distance from the disturbance, downgradient in the washes; however, percentage water loss was greatest in land areas immediately downgradient of a disturbance. Potential effects on growth and survival of wash trees were quantified by using an empirical relationship derived from a local unpublished study of water infiltration rates. The risk characterization concluded that neither risk to wash vegetation growth or survival nor risk to mule deer abundance and reproduction was expected. The risk characterization was negative for both the incremental risk of the test program and the combination of the test and pretest disturbances.
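
    The hydrological part of the analysis above relies on routing water across a digital elevation model and accumulating flow downgradient. The sketch below shows the classic D8 flow-accumulation idea on a toy elevation grid (each cell drains to its lowest of eight neighbours, and accumulation counts upslope contributing cells); it is a generic illustration of the technique, not the ESRI ArcInfo implementation used in the study, and the elevations are invented.

        import numpy as np

        # Toy 5x5 elevation grid (metres); the study used a 30 x 30 m DEM.
        dem = np.array([
            [9, 8, 7, 8, 9],
            [8, 6, 5, 6, 8],
            [7, 5, 3, 5, 7],
            [8, 6, 2, 6, 8],
            [9, 8, 1, 8, 9],
        ], dtype=float)

        rows, cols = dem.shape
        neighbours = [(-1,-1),(-1,0),(-1,1),(0,-1),(0,1),(1,-1),(1,0),(1,1)]

        # D8 flow direction: each cell drains to its steepest-descent neighbour.
        target = {}
        for r in range(rows):
            for c in range(cols):
                best, drop = None, 0.0
                for dr, dc in neighbours:
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        dist = (dr * dr + dc * dc) ** 0.5
                        slope = (dem[r, c] - dem[rr, cc]) / dist
                        if slope > drop:
                            best, drop = (rr, cc), slope
                target[(r, c)] = best      # None marks a pit or an outlet

        # Accumulate flow from high cells to low; each cell contributes itself.
        acc = np.ones_like(dem)
        order = sorted(((dem[r, c], r, c) for r in range(rows) for c in range(cols)),
                       reverse=True)
        for _, r, c in order:
            t = target[(r, c)]
            if t is not None:
                acc[t] += acc[r, c]

        print(acc)   # large values trace the central wash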

  13. The 13th Data Release of the Sloan Digital Sky Survey: First Spectroscopic Data from the SDSS-IV Survey Mapping Nearby Galaxies at Apache Point Observatory

    Science.gov (United States)

    Albareti, Franco D.; Allende Prieto, Carlos; Almeida, Andres; Anders, Friedrich; Anderson, Scott; Andrews, Brett H.; Aragón-Salamanca, Alfonso; Argudo-Fernández, Maria; Armengaud, Eric; Aubourg, Eric; Avila-Reese, Vladimir; Badenes, Carles; Bailey, Stephen; Barbuy, Beatriz; Barger, Kat; Barrera-Ballesteros, Jorge; Bartosz, Curtis; Basu, Sarbani; Bates, Dominic; Battaglia, Giuseppina; Baumgarten, Falk; Baur, Julien; Bautista, Julian; Beers, Timothy C.; Belfiore, Francesco; Bershady, Matthew; Bertran de Lis, Sara; Bird, Jonathan C.; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael; Blomqvist, Michael; Bolton, Adam S.; Borissova, J.; Bovy, Jo; Nielsen Brandt, William; Brinkmann, Jonathan; Brownstein, Joel R.; Bundy, Kevin; Burtin, Etienne; Busca, Nicolás G.; Orlando Camacho Chavez, Hugo; Cano Díaz, M.; Cappellari, Michele; Carrera, Ricardo; Chen, Yanping; Cherinka, Brian; Cheung, Edmond; Chiappini, Cristina; Chojnowski, Drew; Chuang, Chia-Hsun; Chung, Haeun; Cirolini, Rafael Fernando; Clerc, Nicolas; Cohen, Roger E.; Comerford, Julia M.; Comparat, Johan; Correa do Nascimento, Janaina; Cousinou, Marie-Claude; Covey, Kevin; Crane, Jeffrey D.; Croft, Rupert; Cunha, Katia; Darling, Jeremy; Davidson, James W., Jr.; Dawson, Kyle; Da Costa, Luiz; Da Silva Ilha, Gabriele; Deconto Machado, Alice; Delubac, Timothée; De Lee, Nathan; De la Macorra, Axel; De la Torre, Sylvain; Diamond-Stanic, Aleksandar M.; Donor, John; Downes, Juan Jose; Drory, Niv; Du, Cheng; Du Mas des Bourboux, Hélion; Dwelly, Tom; Ebelke, Garrett; Eigenbrot, Arthur; Eisenstein, Daniel J.; Elsworth, Yvonne P.; Emsellem, Eric; Eracleous, Michael; Escoffier, Stephanie; Evans, Michael L.; Falcón-Barroso, Jesús; Fan, Xiaohui; Favole, Ginevra; Fernandez-Alvar, Emma; Fernandez-Trincado, J. G.; Feuillet, Diane; Fleming, Scott W.; Font-Ribera, Andreu; Freischlad, Gordon; Frinchaboy, Peter; Fu, Hai; Gao, Yang; Garcia, Rafael A.; Garcia-Dias, R.; Garcia-Hernández, D. A.; Garcia Pérez, Ana E.; Gaulme, Patrick; Ge, Junqiang; Geisler, Douglas; Gillespie, Bruce; Gil Marin, Hector; Girardi, Léo; Goddard, Daniel; Gomez Maqueo Chew, Yilen; Gonzalez-Perez, Violeta; Grabowski, Kathleen; Green, Paul; Grier, Catherine J.; Grier, Thomas; Guo, Hong; Guy, Julien; Hagen, Alex; Hall, Matt; Harding, Paul; Harley, R. E.; Hasselquist, Sten; Hawley, Suzanne; Hayes, Christian R.; Hearty, Fred; Hekker, Saskia; Hernandez Toledo, Hector; Ho, Shirley; Hogg, David W.; Holley-Bockelmann, Kelly; Holtzman, Jon A.; Holzer, Parker H.; Hu, Jian; Huber, Daniel; Hutchinson, Timothy Alan; Hwang, Ho Seong; Ibarra-Medel, Héctor J.; Ivans, Inese I.; Ivory, KeShawn; Jaehnig, Kurt; Jensen, Trey W.; Johnson, Jennifer A.; Jones, Amy; Jullo, Eric; Kallinger, T.; Kinemuchi, Karen; Kirkby, David; Klaene, Mark; Kneib, Jean-Paul; Kollmeier, Juna A.; Lacerna, Ivan; Lane, Richard R.; Lang, Dustin; Laurent, Pierre; Law, David R.; Leauthaud, Alexie; Le Goff, Jean-Marc; Li, Chen; Li, Cheng; Li, Niu; Li, Ran; Liang, Fu-Heng; Liang, Yu; Lima, Marcos; Lin, Lihwai; Lin, Lin; Lin, Yen-Ting; Liu, Chao; Long, Dan; Lucatello, Sara; MacDonald, Nicholas; MacLeod, Chelsea L.; Mackereth, J. 
Ted; Mahadevan, Suvrath; Geimba Maia, Marcio Antonio; Maiolino, Roberto; Majewski, Steven R.; Malanushenko, Olena; Malanushenko, Viktor; Dullius Mallmann, Nícolas; Manchado, Arturo; Maraston, Claudia; Marques-Chaves, Rui; Martinez Valpuesta, Inma; Masters, Karen L.; Mathur, Savita; McGreer, Ian D.; Merloni, Andrea; Merrifield, Michael R.; Meszáros, Szabolcs; Meza, Andres; Miglio, Andrea; Minchev, Ivan; Molaverdikhani, Karan; Montero-Dorta, Antonio D.; Mosser, Benoit; Muna, Demitri; Myers, Adam; Nair, Preethi; Nandra, Kirpal; Ness, Melissa; Newman, Jeffrey A.; Nichol, Robert C.; Nidever, David L.; Nitschelm, Christian; O’Connell, Julia; Oravetz, Audrey; Oravetz, Daniel J.; Pace, Zachary; Padilla, Nelson; Palanque-Delabrouille, Nathalie; Pan, Kaike; Parejko, John; Paris, Isabelle; Park, Changbom; Peacock, John A.; Peirani, Sebastien; Pellejero-Ibanez, Marcos; Penny, Samantha; Percival, Will J.; Percival, Jeffrey W.; Perez-Fournon, Ismael; Petitjean, Patrick; Pieri, Matthew; Pinsonneault, Marc H.; Pisani, Alice; Prada, Francisco; Prakash, Abhishek; Price-Jones, Natalie; Raddick, M. Jordan; Rahman, Mubdi; Raichoor, Anand; Barboza Rembold, Sandro; Reyna, A. M.; Rich, James; Richstein, Hannah; Ridl, Jethro; Riffel, Rogemar A.; Riffel, Rogério; Rix, Hans-Walter; Robin, Annie C.; Rockosi, Constance M.; Rodríguez-Torres, Sergio; Rodrigues, Thaíse S.; Roe, Natalie; Lopes, A. Roman; Román-Zúñiga, Carlos; Ross, Ashley J.; Rossi, Graziano; Ruan, John; Ruggeri, Rossana; Runnoe, Jessie C.; Salazar-Albornoz, Salvador; Salvato, Mara; Sanchez, Sebastian F.; Sanchez, Ariel G.; Sanchez-Gallego, José R.; Santiago, Basílio Xavier; Schiavon, Ricardo; Schimoia, Jaderson S.; Schlafly, Eddie; Schlegel, David J.; Schneider, Donald P.; Schönrich, Ralph; Schultheis, Mathias; Schwope, Axel; Seo, Hee-Jong; Serenelli, Aldo; Sesar, Branimir; Shao, Zhengyi; Shetrone, Matthew; Shull, Michael; Silva Aguirre, Victor; Skrutskie, M. F.; Slosar, Anže; Smith, Michael; Smith, Verne V.; Sobeck, Jennifer; Somers, Garrett; Souto, Diogo; Stark, David V.; Stassun, Keivan G.; Steinmetz, Matthias; Stello, Dennis; Storchi Bergmann, Thaisa; Strauss, Michael A.; Streblyanska, Alina; Stringfellow, Guy S.; Suarez, Genaro; Sun, Jing; Taghizadeh-Popp, Manuchehr; Tang, Baitian; Tao, Charling; Tayar, Jamie; Tembe, Mita; Thomas, Daniel; Tinker, Jeremy; Tojeiro, Rita; Tremonti, Christy; Troup, Nicholas; Trump, Jonathan R.; Unda-Sanzana, Eduardo; Valenzuela, O.; Van den Bosch, Remco; Vargas-Magaña, Mariana; Vazquez, Jose Alberto; Villanova, Sandro; Vivek, M.; Vogt, Nicole; Wake, David; Walterbos, Rene; Wang, Yuting; Wang, Enci; Weaver, Benjamin Alan; Weijmans, Anne-Marie; Weinberg, David H.; Westfall, Kyle B.; Whelan, David G.; Wilcots, Eric; Wild, Vivienne; Williams, Rob A.; Wilson, John; Wood-Vasey, W. M.; Wylezalek, Dominika; Xiao, Ting; Yan, Renbin; Yang, Meng; Ybarra, Jason E.; Yeche, Christophe; Yuan, Fang-Ting; Zakamska, Nadia; Zamora, Olga; Zasowski, Gail; Zhang, Kai; Zhao, Cheng; Zhao, Gong-Bo; Zheng, Zheng; Zheng, Zheng; Zhou, Zhi-Min; Zhu, Guangtun; Zinn, Joel C.; Zou, Hu

    2017-12-01

    The fourth generation of the Sloan Digital Sky Survey (SDSS-IV) began observations in 2014 July. It pursues three core programs: the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2), Mapping Nearby Galaxies at APO (MaNGA), and the Extended Baryon Oscillation Spectroscopic Survey (eBOSS). As well as its core program, eBOSS contains two major subprograms: the Time Domain Spectroscopic Survey (TDSS) and the SPectroscopic IDentification of ERosita Sources (SPIDERS). This paper describes the first data release from SDSS-IV, Data Release 13 (DR13). DR13 makes publicly available the first 1390 spatially resolved integral field unit observations of nearby galaxies from MaNGA. It includes new observations from eBOSS, completing the Sloan Extended QUasar, Emission-line galaxy, Luminous red galaxy Survey (SEQUELS), which also targeted variability-selected objects and X-ray-selected objects. DR13 includes new reductions of the SDSS-III BOSS data, improving the spectrophotometric calibration and redshift classification, and new reductions of the SDSS-III APOGEE-1 data, improving stellar parameters for dwarf stars and cooler stars. DR13 provides more robust and precise photometric calibrations. Value-added target catalogs relevant for eBOSS, TDSS, and SPIDERS and an updated red-clump catalog for APOGEE are also available. This paper describes the location and format of the data and provides references to important technical papers. The SDSS web site, http://www.sdss.org, provides links to the data, tutorials, examples of data access, and extensive documentation of the reduction and analysis procedures. DR13 is the first of a scheduled set that will contain new data and analyses from the planned ∼6 yr operations of SDSS-IV.

  14. Applications of Monte Carlo method in Medical Physics

    International Nuclear Information System (INIS)

    Diez Rios, A.; Labajos, M.

    1989-01-01

    The basic ideas of Monte Carlo techniques are presented. Random numbers and their generation by congruential methods, which underlie Monte Carlo calculations, are shown. Monte Carlo techniques to solve integrals are discussed, including the evaluation of a simple one-dimensional integral with a known answer by means of two different Monte Carlo approaches. The basic principles of simulating photon histories on a computer and of reducing variance, as well as the current applications in Medical Physics, are commented on. (Author)
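
    To make the two ingredients mentioned above concrete, the sketch below pairs a linear congruential generator (the classic Park-Miller parameters, chosen here as a reasonable example rather than the generator from the paper) with a one-dimensional Monte Carlo quadrature of the integral of sin(pi*x) over [0, 1], whose exact value is 2/pi, using both a sample-mean estimate and a hit-or-miss estimate.

        import math

        class LCG:
            """Park-Miller linear congruential generator: x <- a*x mod m."""
            def __init__(self, seed=12345):
                self.m, self.a = 2**31 - 1, 16807
                self.x = seed
            def random(self):
                self.x = (self.a * self.x) % self.m
                return self.x / self.m        # uniform in (0, 1)

        rng = LCG()
        n = 100_000

        # Estimator 1: sample mean of f(U), U ~ Uniform(0, 1).
        mean_est = sum(math.sin(math.pi * rng.random()) for _ in range(n)) / n

        # Estimator 2: hit-or-miss, the fraction of points under the curve.
        hits = 0
        for _ in range(n):
            x, y = rng.random(), rng.random()
            if y <= math.sin(math.pi * x):    # point falls under the curve
                hits += 1
        hitmiss_est = hits / n                # bounding box has unit area

        print(f"exact       = {2 / math.pi:.5f}")
        print(f"sample mean = {mean_est:.5f}")
        print(f"hit-or-miss = {hitmiss_est:.5f}")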

  15. Monte Carlo computation in the applied research of nuclear technology

    International Nuclear Information System (INIS)

    Xu Shuyan; Liu Baojie; Li Qin

    2007-01-01

    This article briefly introduces Monte Carlo methods and their properties. It describes Monte Carlo methods with emphasis on their applications in several domains of nuclear technology. Monte Carlo simulation methods and several commonly used computer programs that implement them are also introduced. The proposed methods are demonstrated with a real example. (authors)

  16. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2018-06-07

    Transliteration; informal information; natural language processing (NLP); information retrieval. ... Department of Computer Science and Engineering, Indian Institute of Technology (Indian School of Mines), Dhanbad 826004, India ...

  17. American Indians in Graduate Education.

    Science.gov (United States)

    Kidwell, Clara Sue

    1989-01-01

    The number of American Indians enrolled in institutions of higher education is very small. Enrollment figures for fall 1984 show Indians made up 0.68% of the total enrollment in institutions of higher education in the country, but only 15% of them were in universities. Their largest representation was in two-year institutions, where 54% of Indian…

  18. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    K Samudravijaya. Articles written in Sadhana. Volume 27 Issue 1 February 2002 pp 113-126. Indian accent text-to-speech system for web browsing · Aniruddha Sen K Samudravijaya · More Details Abstract Fulltext PDF. Incorporation of speech and Indian scripts can greatly enhance the ...

  19. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Department of Industrial Engineering and Management, Maulana Abul Kalam Azad University of Technology, Kolkata 700064, India; Indian Institute of Management Raipur, GEC Campus, Sejbahar, Raipur 492015, India; Indian National Centre for Ocean Information Services, Ministry of Earth Sciences, Hyderabad 500090, ...

  20. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Sadhana, Volume 41, Issue 2. Nearest neighbour classification of Indian sign language gestures using kinect camera. Zafar Ahmed Ansari Gaurav Harit. Volume 41 Issue 2 February 2016 pp 161-182 ... Keywords. Indian sign language recognition; multi-class classification; gesture recognition.

  1. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2013 pp 571-589. An evolutionary approach for colour constancy based on gamut mapping constraint satisfaction ... A new colour constancy algorithm based on automatic determination of gray framework parameters using neural network · Mohammad Mehdi ...

  2. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Volume 31 Issue 5 October 2006 pp 621-633. Minimizing total costs of forest roads with computer-aided design model · Abdullah E Akay · More Details Abstract Fulltext PDF.

  3. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2018-03-14

    Cloud security; network security; anomaly detection; network traffic analysis; DDoS attack detection. ... Department of Computer Science and Engineering, Indian Institute of Technology Roorkee, Roorkee 247667, India; Department of Applied Science and Engineering, Indian Institute of Technology ...

  4. Textbooks and the American Indian.

    Science.gov (United States)

    Costo, Rupert, Ed.

    An independent Indian publishing house has been formed to provide classroom instructional materials which deal accurately with the history, culture, and role of the American Indian. This book is a preliminary statement in that publishing program. General criteria, valid for instructional materials from elementary through high school, are applied…

  5. The average Indian female nose.

    Science.gov (United States)

    Patil, Surendra B; Kale, Satish M; Jaiswal, Sumeet; Khare, Nishant; Math, Mahantesh

    2011-12-01

    This study aimed to delineate the anthropometric measurements of the noses of young women of an Indian population and to compare them with the published ideals and average measurements for white women. This anthropometric survey included a volunteer sample of 100 young Indian women ages 18 to 35 years with Indian parents and no history of previous surgery or trauma to the nose. Standardized frontal, lateral, oblique, and basal photographs of the subjects' noses were taken, and 12 standard anthropometric measurements of the nose were determined. The results were compared with published standards for North American white women. In addition, nine nasal indices were calculated and compared with the standards for North American white women. The nose of Indian women differs significantly from the white nose. All the nasal measurements for the Indian women were found to be significantly different from those for North American white women. Seven of the nine nasal indices also differed significantly. Anthropometric analysis suggests differences between the Indian female nose and the North American white nose. Thus, a single aesthetic ideal is inadequate. Noses of Indian women are smaller and wider, with a less projected and rounded tip than the noses of white women. This study established the nasal anthropometric norms for nasal parameters, which will serve as a guide for cosmetic and reconstructive surgery in Indian women.

  6. epubworkshop | Indian Academy of Sciences

    Indian Academy of Sciences (India)


  7. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    ... features of Indian Heavy Water Reactors for prevention and mitigation of such extreme events. The probabilistic safety analysis revealed that the risk from Indian Heavy Water Reactors is negligibly small. Volume 38 Issue 6 December 2013 pp 1173-1217. Entrainment phenomenon in gas–liquid two-phase flow: A review.

  8. Home | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2017-07-02

    The editors Biman Bagchi (FASc, FNA, FTWAS; Indian Institute of Science, Bangalore, India), David Clary (FRS; Oxford University, Oxford, UK) and N Sathyamurthy (FASc, FNA, FTWAS; Indian Institute of Science Education and Research, Mohali, India) have put together 29 articles on theoretical physical ...

  9. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Keywords. Markov chains; Monte Carlo method; random number generator; simulation. Abstract. Markov Chain Monte Carlo (MCMC) is a popular method used to generate samples from arbitrary distributions, which may be specified indirectly. In this article, we give an introduction to this method along with some examples.
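
    As a minimal illustration of the MCMC idea summarized above, the sketch below uses random-walk Metropolis (one member of the MCMC family; the target and step size are arbitrary demonstration choices, not taken from the article) to draw samples from a density specified only up to a normalizing constant, here a standard normal.

        import math, random

        random.seed(1)

        def log_target(x):
            # Unnormalized log-density of the target, exp(-x^2/2), i.e. N(0, 1)
            # up to a constant; MCMC never needs the normalizing constant.
            return -0.5 * x * x

        x, step = 0.0, 1.0
        samples = []
        for _ in range(50_000):
            proposal = x + random.uniform(-step, step)  # symmetric random-walk proposal
            # Accept with probability min(1, target(proposal) / target(x)).
            if math.log(random.random()) < log_target(proposal) - log_target(x):
                x = proposal
            samples.append(x)

        burn = samples[10_000:]                          # discard burn-in
        mean = sum(burn) / len(burn)
        var = sum((s - mean) ** 2 for s in burn) / len(burn)
        print(f"sample mean ~ {mean:.3f} (target 0), variance ~ {var:.3f} (target 1)")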

  10. Methodology for understanding Indian culture

    DEFF Research Database (Denmark)

    Sinha, Jai; Kumar, Rajesh

    2004-01-01

    Methods of understanding cultures, including Indian culture, are embedded in a broad spectrum of sociocultural approaches to human behavior in general. The approaches examined in this paper reflect evolving perspectives on Indian culture, ranging from the starkly ethnocentric to the largely eclectic and integrative. Most of the methods herein discussed were developed in the West and were subsequently taken up with or without adaptations to fit the Indian context. The paper begins by briefly reviewing the intrinsic concept of culture. It then adopts a historical view of the different ways and means by which scholars have construed the particular facets of Indian culture, highlighting the advantages and disadvantages of each. The final section concludes with some proposals about the best ways of understanding the complexity that constitutes the Indian cultural reality.

  11. Washington Irving and the American Indian.

    Science.gov (United States)

    Littlefield, Daniel F., Jr.

    1979-01-01

    Some modern scholars feel that Washington Irving vacillated between romanticism and realism in his literary treatment of the American Indian. However, a study of all his works dealing with Indians, placed in context with his non-Indian works, reveals that his attitude towards Indians was intelligent and enlightened for his time. (CM)

  12. Equality in Education for Indian Women.

    Science.gov (United States)

    Krepps, Ethel

    1980-01-01

    Historically, Indian women have been denied education due to: early marriage and family responsibilities; lack of money; inadequate family attention to education; the threat education poses to Indian men; and geographical location. Indian tribes can best administer funds and programs to provide the education so necessary for Indian women. (SB)

  13. Monte Carlo-based tail exponent estimator

    Science.gov (United States)

    Barunik, Jozef; Vacha, Lukas

    2010-11-01

    In this paper we propose a new approach to estimation of the tail exponent in financial stock markets. We begin the study with the finite sample behavior of the Hill estimator under α-stable distributions. Using large Monte Carlo simulations, we show that the Hill estimator overestimates the true tail exponent and can hardly be used on small samples. Utilizing our results, we introduce a Monte Carlo-based method of estimation for the tail exponent. Our proposed method is not sensitive to the choice of tail size and works well also on small data samples. The new estimator also gives unbiased results with symmetrical confidence intervals. Finally, we demonstrate the power of our estimator on the international world stock market indices over the two separate periods of 2002-2005 and 2006-2009.
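
    For reference, the Hill estimator studied above is the reciprocal of the average log-excess over an order-statistic threshold. The sketch below implements it and repeats the authors' kind of Monte Carlo check in a simpler setting: for Pareto(α) data, where the true tail exponent is known, one can watch the estimate vary with the chosen tail fraction. The sample size and tail fractions are illustrative assumptions, not the paper's experimental design.

        import numpy as np

        rng = np.random.default_rng(7)

        def hill_estimator(data, k):
            """Hill tail-exponent estimate using the k largest observations."""
            x = np.sort(data)[::-1]                # descending order statistics
            logs = np.log(x[:k]) - np.log(x[k])    # log-excesses over the threshold
            return 1.0 / logs.mean()

        alpha_true, n = 1.5, 10_000
        # Pareto(alpha) sample by inverse transform: X = U**(-1/alpha), U ~ Uniform(0,1).
        sample = rng.uniform(size=n) ** (-1.0 / alpha_true)

        for frac in (0.01, 0.05, 0.10):
            k = int(n * frac)
            print(f"tail fraction {frac:.0%}: alpha_hat = "
                  f"{hill_estimator(sample, k):.3f} (true {alpha_true})")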

  14. No-compromise reptation quantum Monte Carlo

    International Nuclear Information System (INIS)

    Yuen, W K; Farrar, Thomas J; Rothstein, Stuart M

    2007-01-01

    Since its publication, the reptation quantum Monte Carlo algorithm of Baroni and Moroni (1999 Phys. Rev. Lett. 82 4745) has been applied to several important problems in physics, but its mathematical foundations are not well understood. We show that their algorithm is not of typical Metropolis-Hastings type, and we specify conditions required for the generated Markov chain to be stationary and to converge to the intended distribution. The time-step bias may add up, and in many applications it is only the middle of a reptile that is the most important. Therefore, we propose an alternative, 'no-compromise reptation quantum Monte Carlo' to stabilize the middle of the reptile. (fast track communication)

  15. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.

  16. Status of Monte Carlo at Los Alamos

    International Nuclear Information System (INIS)

    Thompson, W.L.; Cashwell, E.D.

    1980-01-01

    At Los Alamos the early work of Fermi, von Neumann, and Ulam has been developed and supplemented by many followers, notably Cashwell and Everett, and the main product today is the continuous-energy, general-purpose, generalized-geometry, time-dependent, coupled neutron-photon transport code called MCNP. The Los Alamos Monte Carlo research and development effort is concentrated in Group X-6. MCNP treats an arbitrary three-dimensional configuration of arbitrary materials in geometric cells bounded by first- and second-degree surfaces and some fourth-degree surfaces (elliptical tori). Monte Carlo has evolved into perhaps the main method for radiation transport calculations at Los Alamos. MCNP is used in every technical division at the Laboratory by over 130 users about 600 times a month, accounting for nearly 200 hours of CDC-7600 time.

  17. Monte Carlo simulations in skin radiotherapy

    International Nuclear Information System (INIS)

    Sarvari, A.; Jeraj, R.; Kron, T.

    2000-01-01

    The primary goal of this work was to develop a procedure for calculating the appropriate filter shape for a brachytherapy applicator used in skin radiotherapy. In the applicator a radioactive source is positioned close to the skin. Without a filter, the resultant dose distribution would be highly nonuniform; high uniformity is usually required, however. This can be achieved using an appropriately shaped filter, which flattens the dose profile. Because of the complexity of the transport and geometry, Monte Carlo simulations had to be used. A 192Ir high dose rate photon source was used. All necessary transport parameters were simulated with the MCNP4B Monte Carlo code. A highly efficient iterative procedure was developed, which enabled calculation of the optimal filter shape in only a few iterations. The initially non-uniform dose distributions became uniform to within a percent when applying the filter calculated by this procedure. (author)

  18. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)

    Directory of Open Access Journals (Sweden)

    Luo Ronghua

    2008-11-01

    Full Text Available An adaptive Monte Carlo localization algorithm based on the coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. And the sample size can be adjusted adaptively over time according to the uncertainty of the robot's pose by using the population growth model. In addition, by using the crossover and mutation operators in evolutionary computation, intra-species evolution can drive the samples to move towards the regions where the desired posterior density is large. So a small set of samples can represent the desired density well enough to make precise localization. The new algorithm is termed coevolution based adaptive Monte Carlo localization (CEAMCL). Experiments have been carried out to prove the efficiency of the new localization algorithm.

  19. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-01

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.

  20. Monte Carlo simulation of gas Cerenkov detectors

    International Nuclear Information System (INIS)

    Mack, J.M.; Jain, M.; Jordan, T.M.

    1984-01-01

    Theoretical study of selected gamma-ray and electron diagnostics necessitates coupling Cerenkov radiation to electron/photon cascades. A Cerenkov production model and its incorporation into a general geometry Monte Carlo coupled electron/photon transport code is discussed. A special optical photon ray-trace is implemented using bulk optical properties assigned to each Monte Carlo zone. Good agreement exists between experimental and calculated Cerenkov data in the case of a carbon-dioxide gas Cerenkov detector experiment. Cerenkov production and threshold data are presented for a typical carbon-dioxide gas detector that converts a 16.7 MeV photon source to Cerenkov light, which is collected by optics and detected by a photomultiplier.

  1. EU Commissioner Carlos Moedas visits SESAME

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    The European Commissioner for research, science and innovation, Carlos Moedas, visited the SESAME laboratory in Jordan on Monday 13 April. When it begins operation in 2016, SESAME, a synchrotron light source, will be the Middle East’s first major international science centre, carrying out experiments ranging from the physical sciences to environmental science and archaeology.   CERN Director-General Rolf Heuer (left) and European Commissioner Carlos Moedas with the model SESAME magnet. © European Union, 2015.   Commissioner Moedas was accompanied by a European Commission delegation led by Robert-Jan Smits, Director-General of DG Research and Innovation, as well as Rolf Heuer, CERN Director-General, Jean-Pierre Koutchouk, coordinator of the CERN-EC Support for SESAME Magnets (CESSAMag) project and Princess Sumaya bint El Hassan of Jordan, a leading advocate of science in the region. They toured the SESAME facility together with SESAME Director, Khaled Tou...

  2. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
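
    The testing strategy described above can be reduced to a few lines when the observable has a known reference value: run the stochastic code, form the estimate and its standard error, and reject if the standardized deviation is improbably large. The sketch below applies a two-sided z-test to a Monte Carlo estimate of pi, a deliberately simple stand-in for a physics simulation; the significance level is an assumed choice, not one prescribed by the paper.

        import math, random

        random.seed(123)

        def estimate_pi(n):
            """Plain Monte Carlo estimate of pi with its standard error."""
            inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
                         for _ in range(n))
            p = inside / n                      # fraction inside the quarter circle
            se = math.sqrt(p * (1 - p) / n)     # binomial standard error of p
            return 4 * p, 4 * se

        est, se = estimate_pi(1_000_000)
        z = (est - math.pi) / se                # standardized deviation from the truth

        # Two-sided z-test at the 1% level: |z| above ~2.576 flags a problem
        # (a bug, a bad RNG, or a 1-in-100 statistical fluctuation).
        critical = 2.576
        print(f"estimate = {est:.5f}, z = {z:+.2f}, "
              f"{'REJECT' if abs(z) > critical else 'pass'} at alpha = 0.01")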

  3. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-05

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.

  4. Monte Carlo Simulation for Particle Detectors

    CERN Document Server

    Pia, Maria Grazia

    2012-01-01

    Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...

  5. Status of Monte Carlo at Los Alamos

    International Nuclear Information System (INIS)

    Thompson, W.L.; Cashwell, E.D.; Godfrey, T.N.K.; Schrandt, R.G.; Deutsch, O.L.; Booth, T.E.

    1980-05-01

    Four papers were presented by Group X-6 on April 22, 1980, at the Oak Ridge Radiation Shielding Information Center (RSIC) Seminar-Workshop on Theory and Applications of Monte Carlo Methods. These papers are combined into one report for convenience and because they are related to each other. The first paper (by Thompson and Cashwell) is a general survey about X-6 and MCNP and is an introduction to the other three papers. It can also serve as a resume of X-6. The second paper (by Godfrey) explains some of the details of geometry specification in MCNP. The third paper (by Cashwell and Schrandt) illustrates calculating flux at a point with MCNP; in particular, the once-more-collided flux estimator is demonstrated. Finally, the fourth paper (by Thompson, Deutsch, and Booth) is a tutorial on some variance-reduction techniques. It should be required reading for a fledgling Monte Carlo practitioner.

  6. Topological zero modes in Monte Carlo simulations

    International Nuclear Information System (INIS)

    Dilger, H.

    1994-08-01

    We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)

  7. Handbook of Markov chain Monte Carlo

    CERN Document Server

    Brooks, Steve

    2011-01-01

    ""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.

  8. The lund Monte Carlo for jet fragmentation

    International Nuclear Information System (INIS)

    Sjoestrand, T.

    1982-03-01

    We present a Monte Carlo program based on the Lund model for jet fragmentation. Quark, gluon, diquark and hadron jets are considered. Special emphasis is put on the fragmentation of colour singlet jet systems, for which energy, momentum and flavour are conserved explicitly. The model for decays of unstable particles, in particular the weak decay of heavy hadrons, is described. The central part of the paper is a detailed description on how to use the FORTRAN 77 program. (Author)

  9. Monte Carlo methods for preference learning

    DEFF Research Database (Denmark)

    Viappiani, P.

    2012-01-01

    Utility elicitation is an important component of many applications, such as decision support systems and recommender systems. Such systems query the users about their preferences and give recommendations based on the system’s belief about the utility function. Critical to these applications is the acquisition of a prior distribution over the utility parameters and the possibility of real-time Bayesian inference. In this paper we consider Monte Carlo methods for these problems.

  10. Monte Carlo methods for shield design calculations

    International Nuclear Information System (INIS)

    Grimstone, M.J.

    1974-01-01

    A suite of Monte Carlo codes is being developed for use on a routine basis in commercial reactor shield design. The methods adopted for this purpose include the modular construction of codes, simplified geometries, automatic variance reduction techniques, continuous energy treatment of cross section data, and albedo methods for streaming. Descriptions are given of the implementation of these methods and of their use in practical calculations. 26 references. (U.S.)

  11. General purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.

    1983-01-01

    A general-purpose computer program called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of the computer is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations.

  12. Autocorrelations in hybrid Monte Carlo simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Virotta, Francesco

    2010-11-01

    Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be prominent in the topological charge, however, all observables are affected to various degree by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
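
    The contribution of slow modes to the errors discussed above is usually quantified through the integrated autocorrelation time tau_int, which inflates the naive error bar of a Monte Carlo average by a factor sqrt(2*tau_int). The sketch below estimates tau_int for a synthetic AR(1) time series (a stand-in for a slow observable such as the topological charge) with a simple self-consistent summation window; the windowing rule and the test series are generic assumptions, not the error analysis method proposed in the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic Monte Carlo history: an AR(1) process whose theoretical
        # integrated autocorrelation time is (1 + phi) / (2 * (1 - phi)).
        phi, n = 0.95, 100_000
        noise = rng.normal(size=n)
        series = np.empty(n)
        series[0] = noise[0]
        for t in range(1, n):
            series[t] = phi * series[t - 1] + noise[t]

        def tau_int(x, c=6.0):
            """Integrated autocorrelation time with self-consistent window W ~ c*tau."""
            x = x - x.mean()
            var = x.var()
            tau = 0.5
            for w in range(1, len(x) // 2):
                rho = np.dot(x[:-w], x[w:]) / ((len(x) - w) * var)  # autocorr at lag w
                tau += rho
                if w >= c * tau:       # stop once the window exceeds c * tau
                    break
            return tau

        tau = tau_int(series)
        print(f"estimated tau_int = {tau:.1f}, theory = {(1 + phi) / (2 * (1 - phi)):.1f}")
        print(f"naive error bars are too small by a factor of ~ {np.sqrt(2 * tau):.1f}")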

  13. Introduction to the Monte Carlo methods

    International Nuclear Information System (INIS)

    Uzhinskij, V.V.

    1993-01-01

    Codes illustrating the use of Monte Carlo methods in high energy physics such as the inverse transformation method, the ejection method, the particle propagation through the nucleus, the particle interaction with the nucleus, etc. are presented. A set of useful algorithms of random number generators is given (the binomial distribution, the Poisson distribution, β-distribution, γ-distribution and normal distribution). 5 figs., 1 tab
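
    Two of the techniques named above can be shown in a few lines. Inverse transformation maps a uniform variate through the inverse of the target CDF; the ejection (rejection) method accepts proposals with probability proportional to the ratio of target to proposal density. The exponential and half-normal targets below are standard textbook choices assumed for illustration, not examples from the record.

        import math, random

        random.seed(0)

        def exponential(lam):
            # Inverse transform: F(x) = 1 - exp(-lam*x) = u  =>  x = -ln(1-u)/lam.
            u = random.random()
            return -math.log(1.0 - u) / lam

        def half_normal():
            # Rejection ("ejection") method: propose X ~ Exp(1) and accept with
            # probability f(x)/(M*g(x)) = exp(-(x-1)^2/2), where M = sqrt(2e/pi)
            # bounds the ratio of the half-normal to the exponential density.
            while True:
                x = exponential(1.0)
                if random.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
                    return x

        n = 100_000
        exp_mean = sum(exponential(2.0) for _ in range(n)) / n
        hn_mean = sum(half_normal() for _ in range(n)) / n
        print(f"Exp(2) mean      = {exp_mean:.3f} (theory 0.5)")
        print(f"half-normal mean = {hn_mean:.3f} (theory {math.sqrt(2 / math.pi):.3f})")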

  14. Sequential Monte Carlo with Highly Informative Observations

    OpenAIRE

    Del Moral, Pierre; Murray, Lawrence M.

    2014-01-01

    We propose sequential Monte Carlo (SMC) methods for sampling the posterior distribution of state-space models under highly informative observation regimes, a situation in which standard SMC methods can perform poorly. A special case is simulating bridges between given initial and final values. The basic idea is to introduce a schedule of intermediate weighting and resampling times between observation times, which guide particles towards the final state. This can always be done for continuous-...

  15. Monte Carlo codes use in neutron therapy

    International Nuclear Information System (INIS)

    Paquis, P.; Mokhtari, F.; Karamanoukian, D.; Pignol, J.P.; Cuendet, P.; Iborra, N.

    1998-01-01

    Monte Carlo calculation codes allow accurate study of all the parameters relevant to radiation effects, such as dose deposition or the type of microscopic interactions, through one-by-one particle transport simulation. These features are very useful for neutron irradiations, from device development up to dosimetry. This paper illustrates some applications of these codes in Neutron Capture Therapy and Neutron Capture Enhancement of fast neutron irradiations. (authors)

  16. Quantum Monte Carlo calculations of light nuclei

    International Nuclear Information System (INIS)

    Pandharipande, V. R.

    1999-01-01

    Quantum Monte Carlo methods provide an essentially exact way to calculate various properties of nuclear bound and low-energy continuum states from realistic models of nuclear interactions and currents. After a brief description of the methods and modern models of nuclear forces, we review the results obtained for all the bound states, and some continuum states, of up to eight nucleons. Various other applications of the methods are reviewed along with future prospects.

  17. Monte-Carlo simulation of electromagnetic showers

    International Nuclear Information System (INIS)

    Amatuni, Ts.A.

    1984-01-01

    The universal ELSS-1 program for Monte Carlo simulation of high energy electromagnetic showers in homogeneous absorbers of arbitrary geometry has been written. The major processes and effects of electron and photon interaction with matter, particularly the Landau-Pomeranchuk-Migdal effect, are taken into account in the simulation procedures. The simulation results are compared with experimental data. Some characteristics of shower detectors and electromagnetic showers for energies up to 1 TeV are calculated.

  18. Cost of splitting in Monte Carlo transport

    International Nuclear Information System (INIS)

    Everett, C.J.; Cashwell, E.D.

    1978-03-01

    In a simple transport problem designed to estimate transmission through a plane slab of x free paths by Monte Carlo methods, it is shown that m-splitting (m ≥ 2) does not pay unless exp(x) > m(m + 3)/(m - 1). In such a case, the minimum total cost in terms of machine time is obtained as a function of m, and the optimal value of m is determined.
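
    The criterion above is easy to explore numerically. The short sketch below evaluates, for a few slab thicknesses x (in mean free paths), which splitting multiplicities m satisfy exp(x) > m(m+3)/(m-1), i.e. where splitting pays off according to the paper's rule; the ranges swept are arbitrary demonstration choices.

        import math

        def splitting_pays(x, m):
            # Splitting into m copies pays only if exp(x) > m*(m + 3)/(m - 1).
            return math.exp(x) > m * (m + 3) / (m - 1)

        for x in (1.0, 2.0, 3.0, 5.0):
            worthwhile = [m for m in range(2, 21) if splitting_pays(x, m)]
            print(f"x = {x:>3} mfp: m-splitting pays for m in {worthwhile or 'none'}")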

  19. Monte Carlo simulation of Touschek effect

    Directory of Open Access Journals (Sweden)

    Aimin Xiao

    2010-07-01

    Full Text Available We present a Monte Carlo method implementation in the code elegant for simulating Touschek scattering effects in a linac beam. The local scattering rate and the distribution of scattered electrons can be obtained from the code either for a Gaussian-distributed beam or for a general beam whose distribution function is given. In addition, scattered electrons can be tracked through the beam line and the local beam-loss rate and beam halo information recorded.

  20. The Indian ultrasound paradox

    OpenAIRE

    Akbulut-Yuksel, Mevlude; Rosenblum, Daniel

    2012-01-01

    The liberalization of the Indian economy in the 1990s made prenatal ultrasound technology affordable and available to a large fraction of the population. As a result, ultrasound use amongst pregnant women rose dramatically in many parts of India. This paper provides evidence on the consequences of the expansion of prenatal ultrasound use on sex-selection. We exploit state-by-cohort variation in ultrasound use in India as a unique quasi-experiment. We find that sex-selective abortion of female...

  1. Indian advanced nuclear reactors

    International Nuclear Information System (INIS)

    Saha, D.; Sinha, R.K.

    2005-01-01

    For sustainable development of nuclear energy, a number of important issues like safety, waste management, economics etc. are to be addressed. To do this, a number of advanced reactor designs as well as fuel cycle technologies are being pursued worldwide. The advanced reactors being developed in India are the AHWR and the CHTR. Both the reactors use thorium based fuel and have many passive features. This paper describes the Indian advanced reactors and gives a brief account of the international initiatives for the sustainable development of nuclear energy. (author)

  2. Monte Carlo method for neutron transport problems

    International Nuclear Information System (INIS)

    Asaoka, Takumi

    1977-01-01

    Some methods for decreasing variances in Monte Carlo neutron transport calculations are presented together with the results of sample calculations. The general-purpose neutron transport Monte Carlo code "MORSE" was used for the purpose. The first method discussed in this report is the method of statistical estimation. As an example of this method, the application of the coarse-mesh rebalance acceleration method to the criticality calculation of a cylindrical fast reactor is presented. The effective multiplication factor and its standard deviation are presented as functions of the number of histories, and comparisons are made between the coarse-mesh rebalance method and the standard method. Five-group neutron fluxes at the core center are also compared with the result of an S4 calculation. The second method is the method of correlated sampling. This method was applied to the perturbation calculation of control rod worths in a fast critical assembly (FCA-V-3). Two methods of sampling (similar flight paths and identical flight paths) are tested and compared with experimental results. In every case the experimental value lies within the standard deviation of the Monte Carlo calculations. The third method is importance sampling; in this report a biased selection of particle flight directions is discussed. This method was applied to the flux calculation in a spherical fast neutron system surrounded by a 10.16 cm iron reflector. Results for direction biasing, path-length stretching, and no biasing are compared with an S8 calculation. (Aoki, K.)

  3. Biased Monte Carlo optimization: the basic approach

    International Nuclear Information System (INIS)

    Campioni, Luca; Scardovelli, Ruben; Vestrucci, Paolo

    2005-01-01

    It is well-known that the Monte Carlo method is very successful in tackling several kinds of system simulations. It often happens that one has to deal with rare events, and the use of a variance reduction technique is almost mandatory in order to have efficient Monte Carlo applications. The main issue associated with variance reduction techniques is related to the choice of the value of the biasing parameter. Actually, this task is typically left to the experience of the Monte Carlo user, who has to make many attempts before achieving an advantageous biasing. A valuable result is provided: a methodology and a practical rule aimed at establishing an a priori guidance for the choice of the optimal value of the biasing parameter. This result, which has been obtained for a single component system, has the notable property of being valid for any multicomponent system. In particular, in this paper, the exponential and the uniform biases of exponentially distributed phenomena are investigated thoroughly.
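
    The kind of exponential biasing analysed above can be demonstrated on the textbook rare-event problem of estimating p = P(X > x) for X ~ Exp(1): sampling instead from a tilted density Exp(lambda) with lambda < 1 and reweighting by the likelihood ratio reduces variance dramatically when the biasing parameter is chosen well. The sweep over lambda values below is an illustrative stand-in for the a-priori rule derived in the paper, not the rule itself.

        import math
        import numpy as np

        rng = np.random.default_rng(11)

        x_threshold = 20.0              # rare event: P(X > 20) = exp(-20) ~ 2.1e-9
        p_true = math.exp(-x_threshold)
        n = 100_000

        for lam in (1.0, 0.5, 0.1, 1.0 / x_threshold):
            # Sample from the biased density g(x) = lam * exp(-lam * x).
            samples = rng.exponential(scale=1.0 / lam, size=n)
            # Importance weights w = f(x)/g(x) = exp(-x) / (lam * exp(-lam * x)).
            weights = np.exp(-samples) / (lam * np.exp(-lam * samples))
            estimates = weights * (samples > x_threshold)
            est = estimates.mean()
            err = estimates.std(ddof=1) / math.sqrt(n)
            print(f"lambda = {lam:5.3f}: estimate = {est:.3e} +- {err:.1e} "
                  f"(true {p_true:.3e})")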

  4. Quantum Monte Carlo for vibrating molecules

    International Nuclear Information System (INIS)

    Brown, W.R.; Lawrence Berkeley National Lab., CA

    1996-08-01

    Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulates the diffusion and branching processes of the imaginary-time time-dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and was employed for several calculations of the H2O and C3 vibrational states, using seven PESs, three trial wavefunction forms, two methods of non-linear basis function parameter optimization, and both serial and parallel computers. Different wavefunction forms were required to construct accurate trial wavefunctions for H2O and C3. For C3, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states, and the Monte Carlo data were collected into blocks to stabilize the statistical error estimates. Accurate vibrational state energies were computed using both the serial and parallel QMCVIB programs. Comparison of the vibrational state energies computed from the three C3 PESs suggested that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies
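
    Full CFQMC with correlation functions is beyond a short snippet, but the diffusion-and-branching walk it builds on can be sketched for a toy case. The following minimal diffusion Monte Carlo program (illustrative only; QMCVIB itself is far more elaborate) estimates the ground-state energy of a 1D harmonic oscillator, for which the exact answer is 0.5 in natural units.

```python
import math
import random

def dmc_ground_state(n_walkers=2000, n_steps=2000, dt=0.01, seed=0):
    """Minimal diffusion Monte Carlo for V(x) = x^2/2 (hbar = m = omega = 1).
    Walkers diffuse, then branch with weight exp(-(V - E_ref) * dt); the
    reference energy E_ref is steered to keep the population stable and
    fluctuates around the ground-state energy (exactly 0.5 here)."""
    rng = random.Random(seed)
    walkers = [0.0] * n_walkers
    e_ref = 0.5
    for _ in range(n_steps):
        new = []
        for x in walkers:
            x += rng.gauss(0.0, math.sqrt(dt))    # free diffusion step
            weight = math.exp(-(0.5 * x * x - e_ref) * dt)
            copies = int(weight + rng.random())   # stochastic branching
            new.extend([x] * copies)
        walkers = new or [0.0]
        # population control: nudge E_ref toward the target population size
        e_ref += 0.1 * math.log(n_walkers / len(walkers))
    return e_ref  # crude estimator; production codes average over many steps

print(dmc_ground_state())   # should settle near 0.5
```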

  5. Lattice gauge theories and Monte Carlo simulations

    International Nuclear Information System (INIS)

    Rebbi, C.

    1981-11-01

    After some preliminary considerations, the discussion of quantum gauge theories on a Euclidean lattice takes up the definition of Euclidean quantum theory and the treatment of the continuum limit; an analogy is made with statistical mechanics. Perturbative methods can produce useful results for strong or weak coupling. In attempts to investigate the properties of these systems at intermediate coupling, numerical methods known as Monte Carlo simulations have proved valuable. The bulk of this paper illustrates the basic ideas underlying the Monte Carlo numerical techniques and the major results achieved with them, according to the following program: Monte Carlo simulations (general theory, practical considerations), the phase structure of Abelian and non-Abelian models, the observables (the coefficient of the linear term in the potential between two static sources at large separation, the mass of the lowest excited state with the quantum numbers of the vacuum (the so-called glueball), the potential between two static sources at very small distance, and the critical temperature at which sources become deconfined), gauge fields coupled to bosonic matter (Higgs) fields, and systems with fermions
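
    As a minimal concrete example of such a simulation (a generic textbook-style sketch, not code from the paper), the following Python program applies Metropolis updates to a two-dimensional U(1) lattice gauge theory with the Wilson plaquette action and measures the average plaquette, which in 2D can be checked against the exact result I1(beta)/I0(beta).

```python
import math
import random

L, beta = 16, 2.0
rng = random.Random(0)
# link angles theta[x][y][mu]; mu=0 points in +x, mu=1 in +y (cold start)
theta = [[[0.0, 0.0] for _ in range(L)] for _ in range(L)]

def plaq(x, y):
    """Plaquette angle at site (x, y), periodic boundaries."""
    xp, yp = (x + 1) % L, (y + 1) % L
    return (theta[x][y][0] + theta[xp][y][1]
            - theta[x][yp][0] - theta[x][y][1])

def sweep(delta=1.0):
    for x in range(L):
        for y in range(L):
            for mu in range(2):
                # the two plaquettes that contain link (x, y, mu)
                if mu == 0:
                    sites = [(x, y), (x, (y - 1) % L)]
                else:
                    sites = [(x, y), ((x - 1) % L, y)]
                old_s = -beta * sum(math.cos(plaq(*s)) for s in sites)
                save = theta[x][y][mu]
                theta[x][y][mu] += rng.uniform(-delta, delta)
                new_s = -beta * sum(math.cos(plaq(*s)) for s in sites)
                if rng.random() >= math.exp(min(0.0, old_s - new_s)):
                    theta[x][y][mu] = save          # Metropolis reject

for _ in range(200):
    sweep()
avg = sum(math.cos(plaq(x, y)) for x in range(L) for y in range(L)) / L**2
print("average plaquette:", avg)   # exact 2D result is I1(beta)/I0(beta)
```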

  6. Efficacy of a pentavalent human-bovine reassortant rotavirus vaccine against rotavirus gastroenteritis among American Indian children.

    Science.gov (United States)

    Grant, Lindsay R; Watt, James P; Weatherholtz, Robert C; Moulton, Lawrence H; Reid, Raymond; Santosham, Mathuram; O'Brien, Katherine L

    2012-02-01

    Before the widespread use of rotavirus vaccines, rotavirus was a leading cause of gastroenteritis among children. Navajo and White Mountain Apache children suffer a disproportionate burden of severe rotavirus disease compared with the general U.S. population. We enrolled Navajo and White Mountain Apache infants in a multicenter, double-blind, placebo-controlled trial of pentavalent human-bovine reassortant rotavirus vaccine (PRV). Subjects received 3 doses of vaccine or placebo at 4- to 10-week intervals, with the first dose given between 6 and 12 weeks of age. Gastroenteritis episodes were identified by active surveillance. Disease severity was determined by a standardized scoring system. There were 509 and 494 randomized children who received vaccine and placebo, respectively. Among placebo recipients, the incidence of rotavirus gastroenteritis was 34.2 episodes/100 child-years (95% confidence interval [95% CI]: 25.8-38.9) versus 8.1 episodes/100 child-years (95% CI: 5.4-12.5) in the vaccine group. The percentage of rotavirus episodes caused by serotypes G1, G2, and G3 was 72.3%, 23.4%, and 2.1%, respectively. There were no severe rotavirus episodes among vaccinees and 4 among placebo recipients. PRV was 77.1% (95% CI: 59.7-87.6), 89.5% (95% CI: 65.9-97.9), and 82.9% (95% CI: 61.1-93.6) effective against G1-G4 rotavirus disease, severe and moderate rotavirus disease combined, and outpatient visits for rotavirus disease, respectively. The risk of adverse events was similar for the vaccine and placebo groups. PRV was highly effective in preventing rotavirus disease and related health care utilization in these American Indian infants. Vaccine efficacy and immunogenicity were similar to those in the overall study population enrolled in the multicenter trial.

  7. Generalized hybrid Monte Carlo - CMFD methods for fission source convergence

    International Nuclear Information System (INIS)

    Wolters, Emily R.; Larsen, Edward W.; Martin, William R.

    2011-01-01

    In this paper, we generalize the recently published 'CMFD-Accelerated Monte Carlo' method and present two new methods that reduce the statistical error in CMFD-Accelerated Monte Carlo. The CMFD-Accelerated Monte Carlo method uses Monte Carlo to estimate nonlinear functionals used in low-order CMFD equations for the eigenfunction and eigenvalue. The Monte Carlo fission source is then modified to match the resulting CMFD fission source in a 'feedback' procedure. The two proposed methods differ from CMFD-Accelerated Monte Carlo in the definition of the required nonlinear functionals, but they have identical CMFD equations. The proposed methods are compared with CMFD-Accelerated Monte Carlo on a high dominance ratio test problem. All hybrid methods converge the Monte Carlo fission source almost immediately, leading to a large reduction in the number of inactive cycles required. The proposed methods stabilize the fission source more efficiently than CMFD-Accelerated Monte Carlo, leading to a reduction in the number of active cycles required. Finally, as in CMFD-Accelerated Monte Carlo, the apparent variance of the eigenfunction is approximately equal to the real variance, so the real error is well-estimated from a single calculation. This is an advantage over standard Monte Carlo, in which the real error can be underestimated due to inter-cycle correlation. (author)
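
    The feedback step described above can be sketched compactly. In the toy Python fragment below (an illustration of the reweighting idea only, not the authors' implementation), `cmfd_source` stands in for the low-order CMFD solution, which is assumed to be computed elsewhere; the Monte Carlo fission bank is reweighted cell by cell so its binned source matches the CMFD shape.

```python
import numpy as np

def feedback(bank_positions, bank_weights, cmfd_source, edges):
    """Reweight a Monte Carlo fission bank so its coarse-mesh histogram
    matches a CMFD low-order source (feedback step only; the CMFD
    eigenproblem solve that produces `cmfd_source` is assumed)."""
    cells = np.digitize(bank_positions, edges) - 1
    mc_source = np.bincount(cells, weights=bank_weights,
                            minlength=len(cmfd_source))
    # per-cell multiplicative correction; guard against empty cells
    ratio = np.divide(cmfd_source, mc_source,
                      out=np.ones_like(cmfd_source), where=mc_source > 0)
    return bank_weights * ratio[cells]

# toy usage: 1D slab with 4 coarse cells on [0, 4)
edges = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
pos = np.random.default_rng(1).uniform(0.0, 4.0, 10000)
w = np.ones_like(pos)
target = np.array([0.1, 0.4, 0.4, 0.1]) * w.sum()   # assumed CMFD shape
w_new = feedback(pos, w, target, edges)
```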

  8. Monte Carlo methods and models in finance and insurance

    CERN Document Server

    Korn, Ralf; Kroisandt, Gerald

    2010-01-01

    Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of...
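
    Since the book highlights the multilevel Monte Carlo method, a hedged sketch may help fix ideas. The Python program below (illustrative parameters; not taken from the book) prices a European call under geometric Brownian motion with Euler time stepping, coupling each fine path to a coarse path built from the same Brownian increments so that the level corrections have small variance.

```python
import math
import random

def mlmc_call(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0,
              levels=5, n0=20000, seed=7):
    """Multilevel Monte Carlo for E[e^{-rT} max(S_T - K, 0)] under GBM.
    Level l uses 2^l Euler steps; the level-l correction couples a fine
    path to a coarse path driven by the same Brownian increments."""
    rng = random.Random(seed)
    total = 0.0
    for l in range(levels + 1):
        n = max(n0 // 2 ** l, 1000)          # crude sample allocation
        m = 2 ** l
        dt = t / m
        acc = 0.0
        for _ in range(n):
            dw = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(m)]
            sf = s0
            for w in dw:                      # fine path, m steps
                sf += r * sf * dt + sigma * sf * w
            pf = max(sf - k, 0.0)
            if l == 0:
                acc += pf
            else:                             # coarse path, m/2 steps,
                sc = s0                       # same Brownian increments
                for i in range(0, m, 2):
                    sc += r * sc * (2 * dt) + sigma * sc * (dw[i] + dw[i + 1])
                acc += pf - max(sc - k, 0.0)
        total += acc / n
    return math.exp(-r * t) * total

print(mlmc_call())   # the Black-Scholes value is about 10.45 for these inputs
```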

  9. Indian Vacuum Society: The Indian Vacuum Society

    Science.gov (United States)

    Saha, T. K.

    2008-03-01

    The Indian Vacuum Society (IVS) was established in 1970. It has over 800 members, including many from industry and R&D institutions spread throughout India, and has an active chapter at Kolkata. The society was formed with the main aim of promoting, encouraging and developing the growth of vacuum science, techniques and applications in India. To achieve this aim it has conducted a number of short-term courses at graduate and technician levels on vacuum science and technology, on topics ranging from low vacuum to ultrahigh vacuum. So far it has conducted 39 such courses in different parts of the country and imparted training to more than 1200 persons in the field. Some of these were in-plant training courses conducted on the premises of the establishment and designed to meet the establishment's special needs. IVS also regularly conducts national and international seminars and symposia on vacuum science and technology, with special emphasis on themes related to applications of vacuum; a large number of delegates from all over India take part in the deliberations of such seminars and symposia and present their work. IVS also arranges technical visits to different industries and research institutes. The society helped with the UNESCO-sponsored post-graduate level courses in vacuum science, technology and applications conducted by Mumbai University. It has also designed a certificate and diploma course for graduate-level students studying vacuum science and technology and has submitted a syllabus to the academic council of the University of Mumbai for approval; we hope that some colleges affiliated with the university will offer this course from the coming academic year. IVS extended its support in standardizing many vacuum instruments and played a vital role in helping to set up a Regional Testing Centre along with BARC. As part of the development of vacuum education, the society arranges the participation of

  10. Apache Wars: A Constabulary Perspective

    National Research Council Canada - National Science Library

    Siegrist, Jeremy T

    2005-01-01

    ... This monograph contends that principles of counterinsurgency, drawn from theory and doctrine, are nearly identical in post-conflict environments to the principles that guide constabularies, and that each...

  11. Apache Solr enterprise search server

    CERN Document Server

    Smiley, David; Parisa, Kranti; Mitchell, Matt

    2015-01-01

    This book is for developers who want to learn how to get the most out of Solr in their applications, whether you are new to the field, have used Solr but don't know everything, or simply want a good reference. It would be helpful to have some familiarity with basic programming concepts, but no prior experience is required.

  12. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that can handle very large data and provide interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences; these use hardware-accelerated graphics and advances in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook), co-developed by Kitware and NASA-Ames, is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library: users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https
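
    The abstract does not spell out the underlying Spark calls, so the following generic PySpark sketch (the file path and column names are hypothetical) shows the kind of large-data aggregation such tools delegate to Spark: binning a large table of lat/lon observations onto a coarse grid so a notebook front end can render the small aggregate as a heatmap.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("geo-heatmap").getOrCreate()

# Hypothetical input: a large CSV of observations with lat/lon columns.
df = spark.read.csv("s3://my-bucket/observations.csv",
                    header=True, inferSchema=True)

# Bin points onto a 0.1-degree grid and count per cell; the resulting
# small aggregate can be handed to a notebook layer for rendering.
cell = 0.1
heatmap = (df
           .withColumn("lat_bin", F.floor(F.col("lat") / cell) * cell)
           .withColumn("lon_bin", F.floor(F.col("lon") / cell) * cell)
           .groupBy("lat_bin", "lon_bin")
           .count()
           .orderBy(F.desc("count")))

heatmap.show(10)
```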

  13. Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method

    CERN Document Server

    2002-01-01

    This report condenses basic theories and advanced applications of neutron/gamma-ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and the cross-section libraries used in continuous-energy Monte Carlo codes. Chapter 6 discusses the following topics: fusion benchmark experiments, the design of ITER, experimental analyses of a fast critical assembly, core analyses of the JMTR, simulation of a pulsed neutron experiment, core analyses of the HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma-ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes and the parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.
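
    As a small taste of the bulk shielding calculations the guideline covers, the following analog Monte Carlo sketch (a one-speed toy model with made-up constants, not an excerpt from the report) estimates photon transmission through a homogeneous slab with isotropic scattering and absorption, with distances measured in mean free paths.

```python
import math
import random

def slab_transmission(thickness_mfp=5.0, p_absorb=0.5,
                      n_hist=100000, seed=3):
    """Analog one-speed photon transport through a homogeneous slab.
    Distances are in mean free paths; scattering is isotropic in mu."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_hist):
        x, mu = 0.0, 1.0                       # normally incident photon
        while True:
            x += -math.log(rng.random()) * mu  # fly to the next collision
            if x >= thickness_mfp:
                transmitted += 1               # leaked through the shield
                break
            if x < 0.0:                        # escaped backwards
                break
            if rng.random() < p_absorb:        # absorbed at the collision
                break
            mu = rng.uniform(-1.0, 1.0)        # isotropic re-emission
    return transmitted / n_hist

print(slab_transmission())
```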

  14. Rasam Indian Restaurant Menu 2017

    OpenAIRE

    Rasam Indian Restaurant

    2017-01-01

    A little bit about us: we opened our doors for business in November 2003 with the solid ambition to serve high-quality, authentic Indian cuisine in Dublin. Indian food has, over time, escaped the European misunderstanding, the notion of ‘one sauce fits all’, and has been recognised for the rich dining experience it offers, with all the wonderful, potent flavours of India. Rasam wanted to contribute to the Indian food awakening, and so when suitable premises became available in Glasthule at the heart of a busy...

  15. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Author Affiliations. SATYAM MUKHERJEE, Department of Operations Management, Quantitative Methods & Information Systems, Indian Institute of Management, Udaipur; and Research Center for Open Digital Innovation, Purdue University, IN 47906, USA ...

  16. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series, Volume 1, Issue 1: Chimera-like states generated by large perturbation of synchronous state of coupled metronomes. SERGEY BREZETSKIY, DAWID DUDKOWSKI, PATRYCJA JAROS, JERZY WOJEWODA, KRZYSZTOF CZOLCZYNSKI, YURI MAISTRENKO ...

  17. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced-transitions Monte Carlo method, direct statistical estimation Monte Carlo, and weighted statistical estimation Monte Carlo are all used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency.
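
    The contrast between direct and weighted estimation can be illustrated with a toy system (illustrative probabilities; this is not the authors' formulation). The sketch below estimates the unreliability of a highly reliable three-component series system both by direct analog scoring and by sampling component failures from biased probabilities Q while carrying likelihood-ratio weights; at equal sample sizes the weighted estimator's variance is far smaller.

```python
import random

P = [1e-4, 2e-4, 5e-5]            # component failure probabilities (made up)
Q = [0.5, 0.5, 0.5]               # biased sampling probabilities

def direct(n, rng):
    """Direct estimation: score 1 whenever the series system fails."""
    hits = 0
    for _ in range(n):
        if any(rng.random() < p for p in P):
            hits += 1
    return hits / n

def weighted(n, rng):
    """Weighted estimation: sample failures from Q, carry likelihood ratios."""
    acc = 0.0
    for _ in range(n):
        w, failed = 1.0, False
        for p, q in zip(P, Q):
            if rng.random() < q:
                w *= p / q                    # component failed under bias
                failed = True
            else:
                w *= (1.0 - p) / (1.0 - q)    # component survived under bias
        if failed:
            acc += w
    return acc / n

rng = random.Random(5)
exact = 1.0 - (1 - P[0]) * (1 - P[1]) * (1 - P[2])
print("exact   :", exact)
print("direct  :", direct(100000, rng))      # noisy at this sample size
print("weighted:", weighted(100000, rng))    # far lower variance
```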

  18. Investigating the impossible: Monte Carlo simulations

    International Nuclear Information System (INIS)

    Kramer, Gary H.; Crowley, Paul; Burns, Linda C.

    2000-01-01

    Designing and testing new equipment can be an expensive and time-consuming process, or the desired performance characteristics may preclude its construction due to technological shortcomings. Cost may also prevent equipment from being purchased so that other scenarios can be tested. An alternative is to use Monte Carlo simulations to make these investigations. This presentation exemplifies how Monte Carlo code calculations can be used to fill the gap. An example is given for the investigation of two sizes of germanium detector (70 mm and 80 mm diameter) at four different crystal thicknesses (15, 20, 25, and 30 mm), with predictions of how size affects the counting efficiency and the minimum detectable activity (MDA). The Monte Carlo simulations have shown that detector efficiencies can be adequately modelled using photon transport if the data are used to investigate trends. The investigation of the effect of detector thickness on the counting efficiency has shown that, for a fixed detector diameter of either 70 mm or 80 mm, thickness is unimportant up to 60 keV. At higher photon energies, the counting efficiency begins to decrease as the thickness decreases, as expected. The simulations predict that the MDAs of the 70 mm and 80 mm diameter detectors do not differ by more than a factor of 1.15 at 17 keV or 1.2 at 60 keV when comparing detectors of equivalent thicknesses. The MDA is slightly increased at 17 keV, and rises by about 52% at 660 keV, when the thickness is decreased from 30 mm to 15 mm. One could conclude from this information that the extra cost associated with the larger-area Ge detectors may not be justified by the slight improvement predicted in the MDA. (author)
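
    The abstract does not state which MDA formula was used; a common choice is the Currie expression, sketched below with made-up background, efficiency, and count-time values purely to show the unit bookkeeping.

```python
import math

def currie_mda(background_counts, efficiency, count_time, gamma_yield=1.0):
    """Currie minimum detectable activity (Bq) at 95% confidence:
    L_D = 2.71 + 4.65 * sqrt(B) counts, converted to activity."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * gamma_yield * count_time)

# Illustrative comparison only: the thinner crystal is assumed to have
# a lower absolute efficiency at this energy (all numbers are made up).
for thick_mm, eff in ((30, 0.010), (15, 0.006)):
    print(thick_mm, "mm:", round(currie_mda(400.0, eff, 3600.0), 3), "Bq")
```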

  19. Zoogeography of the Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, T.S.S.

    The distribution pattern of zooplankton in the Indian Ocean is briefly reviewed in terms of within-ocean and between-ocean patterns and is limited to species within a restricted set of groups, namely Copepoda, Chaetognatha, Pteropoda and Euphausiacea...

  20. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Department of Aerospace Engineering, Indian Institute of Science, Bangalore 560012, India; Structures group, ISRO Satellite Centre, Bangalore 560017, India; Department of Mechanical Engineering, PES University, Bangalore 560085, India ...