WorldWideScience

Sample records for carlos apache indians

  1. Solar Feasibility Study May 2013 - San Carlos Apache Tribe

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Jim [Parametrix; Duncan, Ken [San Carlos Apache Tribe; Albert, Steve [Parametrix

    2013-05-01

    The San Carlos Apache Tribe (Tribe), in the interests of strengthening tribal sovereignty, becoming more energy self-sufficient, and providing improved services and economic opportunities to tribal members and San Carlos Apache Reservation (Reservation) residents and businesses, has explored a variety of options for renewable energy development. The development of renewable energy technologies and generation is consistent with the Tribe’s 2011 Strategic Plan. This Study assessed the possibilities for both commercial-scale and community-scale solar development within the southwestern portions of the Reservation around the communities of San Carlos, Peridot, and Cutter, and in the southeastern Reservation around the community of Bylas. Given the lack of any commercial-scale electric power transmission between the Reservation and the regional transmission grid, Phase 2 of this Study greatly expanded consideration of community-scale options. Three smaller sites (Point of Pines, Dudleyville/Winkleman, and Seneca Lake) were also evaluated for community-scale solar potential. Three building complexes were identified within the Reservation where the development of site-specific facility-scale solar power would be the most beneficial and cost-effective: Apache Gold Casino/Resort, Tribal College/Skill Center, and the Dudleyville (Winkleman) Casino.

  2. San Carlos Apache Tribe - Energy Organizational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, James; Albert, Steve

    2012-04-01

    The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late-2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: (1) the analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"); (2) start-up staffing and other costs associated with the Phase 1 SCAT energy organization; (3) an intern program; (4) staff training; and (5) tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.

  3. Analysis of oil-bearing Cretaceous sandstone hydrocarbon reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico; TOPICAL

    International Nuclear Information System (INIS)

    Ridgley, Jennie; Wright Dunbar, Robyn

    2000-01-01

    This is the Phase One contract report to the United States Department of Energy, the United States Geological Survey, and the Jicarilla Apache Indian Tribe on the project entitled "Outcrop Analysis of the Cretaceous Mesaverde Group: Jicarilla Apache Reservation, New Mexico." Field work for this project was conducted during July and August 1998, at which time fourteen measured sections were described and correlated on or adjacent to Jicarilla Apache Reservation lands. A fifteenth section, described east of the main field area, is included in this report, although its distant location precluded its use in the correlations and cross sections presented herein. Ground-based photo mosaics were shot for much of the exposed Mesaverde outcrop belt and were used to assist in correlation. Outcrop gamma-ray surveys were conducted at six of the fifteen measured sections using a GAD-6 scintillometer. The raw gamma-ray data are included in this report; however, analysis of those data is part of the ongoing Phase Two of this project.

  4. Preliminary Assessment of Apache Hopefulness: Relationships with Hopelessness and with Collective as well as Personal Self-Esteem

    Science.gov (United States)

    Hammond, Vanessa Lea; Watson, P. J.; O'Leary, Brian J.; Cothran, D. Lisa

    2009-01-01

    Hopelessness is central to prominent mental health problems within American Indian (AI) communities. Apaches living on a reservation in Arizona responded to diverse expressions of hope along with Hopelessness, Personal Self-Esteem, and Collective Self-Esteem scales. An Apache Hopefulness Scale expressed five themes of hope and correlated…

  5. Apache 2 Pocket Reference For Apache Programmers & Administrators

    CERN Document Server

    Ford, Andrew

    2008-01-01

    Even if you know the Apache web server inside and out, you still need an occasional on-the-job reminder -- especially if you're moving to the newer Apache 2.x. Apache 2 Pocket Reference gives you exactly what you need to get the job done without forcing you to plow through a cumbersome, doorstop-sized reference. This book provides essential information to help you configure and maintain the server quickly, with brief explanations that get directly to the point. It covers Apache 2.x, giving webmasters, web administrators, and programmers a quick and easy reference solution. This pocket r

  6. Apache Maven cookbook

    CERN Document Server

    Bharathan, Raghuram

    2015-01-01

    If you are a Java developer or a manager who has experience with Apache Maven and want to extend your knowledge, then this is the ideal book for you. Apache Maven Cookbook is for those who want to learn how Apache Maven can be used for build automation. It is also meant for those familiar with Apache Maven, but want to understand the finer nuances of Maven and solve specific problems.

  7. Remote sensing analysis of vegetation at the San Carlos Apache Reservation, Arizona and surrounding area

    Science.gov (United States)

    Norman, Laura M.; Middleton, Barry R.; Wilson, Natalie R.

    2018-01-01

    Mapping of vegetation types is of great importance to the San Carlos Apache Tribe and their management of forestry and fire fuels. Various remote sensing techniques were applied to classify multitemporal Landsat 8 satellite data, vegetation index, and digital elevation model data. A multitiered unsupervised classification generated over 900 classes that were then recoded to one of the 16 generalized vegetation/land cover classes using the Southwest Regional Gap Analysis Project (SWReGAP) map as a guide. A supervised classification was also run using field data collected in the SWReGAP project and our field campaign. Field data were gathered and accuracy assessments were generated to compare outputs. Our hypothesis was that a resulting map would update and potentially improve upon the vegetation/land cover class distributions of the older SWReGAP map over the 24,000 km² study area. The estimated overall accuracies ranged between 43% and 75%, depending on which method and field dataset were used. The findings demonstrate the complexity of vegetation mapping, the importance of recent, high-quality field data, and the potential for misleading results when insufficient field data are collected.
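
    The workflow described above (unsupervised clustering of a multitemporal band stack, then recoding the spectral clusters to generalized classes using the SWReGAP map as a guide) can be illustrated in outline. The following Python sketch is a hypothetical simplification of that general idea, not the authors' actual toolchain; the function, cluster count, and input arrays are assumptions.

    # Hypothetical sketch: cluster a Landsat band stack, then recode each spectral
    # cluster to the generalized land-cover class it overlaps most in a reference map.
    import numpy as np
    from sklearn.cluster import KMeans

    def classify_and_recode(bands, reference_map, n_clusters=60):
        """bands: (rows, cols, nbands) reflectance stack; reference_map: (rows, cols) class codes."""
        rows, cols, nbands = bands.shape
        labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(
            bands.reshape(-1, nbands)).reshape(rows, cols)
        recoded = np.zeros_like(reference_map)
        for c in range(n_clusters):
            mask = labels == c
            if mask.any():
                classes, counts = np.unique(reference_map[mask], return_counts=True)
                recoded[mask] = classes[np.argmax(counts)]  # majority-overlap recode
        return recoded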

  8. Learning Apache Kafka

    CERN Document Server

    Garg, Nishant

    2015-01-01

    This book is for readers who want to know more about Apache Kafka at a hands-on level; the key audience is those with software development experience but no prior exposure to Apache Kafka or similar technologies. It is also useful for enterprise application developers and big data enthusiasts who have worked with other publisher-subscriber-based systems and want to explore Apache Kafka as a futuristic solution.

  9. Apache The Definitive Guide

    CERN Document Server

    Laurie, Ben

    2003-01-01

    Apache is far and away the most widely used web server platform in the world. This versatile server runs more than half of the world's existing web sites. Apache is both free and rock-solid, running more than 21 million web sites ranging from huge e-commerce operations to corporate intranets and smaller hobby sites. With this new third edition of Apache: The Definitive Guide, web administrators new to Apache will come up to speed quickly, and experienced administrators will find the logically organized, concise reference sections indispensable, and system programmers interested in customizin

  10. The APACHE Project

    Directory of Open Access Journals (Sweden)

    Giacobbe P.

    2013-04-01

    First, we summarize the four-year-long efforts undertaken to build the final setup of the APACHE Project, a photometric transit search for small-size planets orbiting bright, low-mass M dwarfs. Next, we describe the present status of the APACHE survey, officially started in July 2012 at the site of the Astronomical Observatory of the Autonomous Region of the Aosta Valley, in the Western Italian Alps. Finally, we briefly discuss the potentially far-reaching consequences of a multi-technique characterization program of the (potentially planet-bearing) APACHE targets.

  11. Learning Apache Karaf

    CERN Document Server

    Edstrom, Johan; Kesler, Heath

    2013-01-01

    The book is a fast-paced guide full of step-by-step instructions covering all aspects of application development using Apache Karaf. Learning Apache Karaf will benefit all Java developers and system administrators who need to develop for and/or operate Karaf's OSGi-based runtime. Basic knowledge of Java is assumed.

  12. [Validity of APACHE II, APACHE III, SAPS 2, SAPS 3 and SOFA scales in obstetric patients with sepsis].

    Science.gov (United States)

    Zabolotskikh, I B; Musaeva, T S; Denisova, E A

    2012-01-01

    The aim was to estimate the efficiency of the APACHE II, APACHE III, SAPS II, SAPS III, and SOFA scales for obstetric patients with severe sepsis. A retrospective analysis of the medical records of 186 pregnant women with pulmonary sepsis, 40 women with urosepsis, and 66 puerperas with abdominal sepsis was performed. The women's age was 26.7 (22.4-34.5) years. In the population of puerperas with abdominal sepsis, the APACHE II, APACHE III, SAPS 2, SAPS 3, and SOFA scales showed good calibration; however, high discriminative ability was observed only for APACHE III, SAPS 3, and SOFA (AUROC 0.95, 0.93, and 0.92, respectively). The APACHE III and SOFA scales provided a qualitative prognosis in pregnant women with urosepsis; the discriminative ability of these scales considerably exceeded that of APACHE II, SAPS 2, and SAPS 3 (AUROC 0.73, 0.74, and 0.79, respectively). In pregnant women with pulmonary sepsis, the APACHE II scale was inapplicable because of a lack of calibration (X2 = 13.1; p < 0.01), and the other scales (APACHE III, SAPS 2, SAPS 3, SOFA) showed insufficient discrimination (AUROC < 0.9). Assessment of the prognostic capabilities of the scoring scales showed that the APACHE III, SAPS 3, and SOFA scales can be used to predict mortality in puerperas with abdominal sepsis; in pregnant women with urosepsis, only APACHE III and SOFA; and in pulmonary sepsis, SAPS 3 and APACHE III only when additional clinical information is available.

  13. Arizona TeleMedicine Project.

    Science.gov (United States)

    Arizona Univ., Tucson. Coll. of Medicine.

    Designed to provide health services for American Indians living on rurally isolated reservations, the Arizona TeleMedicine Project proposes to link Phoenix and Tucson medical centers, via a statewide telecommunications system, with the Hopi, San Carlos Apache, Papago, Navajo, and White Mountain Apache reservations. Advisory boards are being…

  14. The White Mountain Recreational Enterprise: Bio-Political Foundations for White Mountain Apache Natural Resource Control, 1945–1960

    Directory of Open Access Journals (Sweden)

    David C. Tomblin

    2016-07-01

    Among American Indian nations, the White Mountain Apache Tribe has been at the forefront of a struggle to control natural resource management within reservation boundaries. In 1952, they developed the first comprehensive tribal natural resource management program, the White Mountain Recreational Enterprise (WMRE), which became a cornerstone for fighting legal battles over the tribe’s right to manage cultural and natural resources on the reservation for the benefit of the tribal community rather than outside interests. This article examines how White Mountain Apaches used the WMRE, while embracing both Euro-American and Apache traditions, as an institutional foundation for resistance and exchange with Euro-American society so as to reassert control over tribal eco-cultural resources in east-central Arizona.

  15. Instant Apache Wicket 6

    CERN Document Server

    Longo, João Sávio Ceregatti

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This Starter-style guide takes the reader through the basic workflow of Apache Wicket in a practical and friendly style. Instant Apache Wicket 6 is for people who want to learn the basics of Apache Wicket 6 and who already have some experience with Java and object-oriented programming. Basic knowledge of web concepts like HTTP and Ajax will be an added advantage.

  16. Apache Mahout cookbook

    CERN Document Server

    Giacomelli, Piero

    2013-01-01

    Apache Mahout Cookbook uses over 35 recipes packed with illustrations and real-world examples to help beginners as well as advanced programmers get acquainted with the features of Mahout. "Apache Mahout Cookbook" is great for developers who want to have a fresh and fast introduction to Mahout coding. No previous knowledge of Mahout is required, and even skilled developers or system administrators will benefit from the various recipes presented.

  17. The Jicarilla Apaches. A Study in Survival.

    Science.gov (United States)

    Gunnerson, Dolores A.

    Focusing on the ultimate fate of the Cuartelejo and/or Paloma Apaches known in archaeological terms as the Dismal River people of the Central Plains, this book is divided into 2 parts. The early Apache (1525-1700) and the Jicarilla Apache (1700-1800) tribes are studied in terms of their: persistent cultural survival, social/political adaptability,…

  18. The Apache OODT Project: An Introduction

    Science.gov (United States)

    Mattmann, C. A.; Crichton, D. J.; Hughes, J. S.; Ramirez, P.; Goodale, C. E.; Hart, A. F.

    2012-12-01

    Apache OODT is a science data system framework, developed over the past decade with hundreds of FTEs of investment, tens of sponsoring agencies (NASA, NIH/NCI, DoD, NSF, universities, etc.), and hundreds of projects and science missions that it powers every day. At its core, Apache OODT provides two fundamental classes of software services and components. The first deals with information integration from existing science data repositories and archives that already have in-use business processes and models for populating those archives; information integration allows search, retrieval, and dissemination across these heterogeneous systems, and ultimately rapid, interactive data access and retrieval. The other suite of services and components within Apache OODT handles population and processing of those data repositories and archives: workflows, resource management, crawling, remote data retrieval, curation and ingestion, along with science data algorithm integration, are all part of these Apache OODT software elements. In this talk, I will provide an overview of the use of Apache OODT to unlock and populate information from science data repositories and archives. We'll cover the basics, along with some advanced use cases and success stories.

  19. Random Decision Forests on Apache Spark

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    About the speaker: Tom White has been an Apache Hadoop committer since February 2007, and is a member of the Apache Software Foundation. He works for Cloudera, a company set up to offer Hadoop support and training. Previously he was an independent Hadoop consultant, work...

  20. Apache Tomcat 7 Essentials

    CERN Document Server

    Khare, Tanuj

    2012-01-01

    This book is a step-by-step tutorial for anyone wanting to learn Apache Tomcat 7 from scratch. There are plenty of illustrations and examples to escalate you from a novice to an expert with minimal strain. If you are a J2EE administrator, migration administrator, technical architect, or a project manager for a web hosting domain, and are interested in Apache Tomcat 7, then this book is for you. If you are someone responsible for installation, configuration, and management of Tomcat 7, then too, this book will be of help to you.

  1. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    D'Souza, Subas

    2013-01-01

    A starter guide that covers Apache Flume in detail. Apache Flume: Distributed Log Collection for Hadoop is intended for people who are responsible for moving datasets into Hadoop in a timely and reliable manner, such as software engineers, database administrators, and data warehouse administrators.

  2. Instant Apache Maven starter

    CERN Document Server

    Turatti, Maurizio

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. The book follows a starter approach for using Maven to create and build a new Java application or web project from scratch. Instant Apache Maven Starter is great for Java developers new to Apache Maven, but also for experts looking for immediate information. Moreover, only 20% of the necessary information about Maven is used in 80% of the activities. This book aims to focus on the most important information, the pragmatic parts you actually use.

  3. Apache Solr essentials

    CERN Document Server

    Gazzarini, Andrea

    2015-01-01

    If you are a competent developer with experience of working with technologies similar to Apache Solr and want to develop efficient search applications, then this book is for you. Familiarity with the Java programming language is required.

  4. Apache Mahout essentials

    CERN Document Server

    Withanawasam, Jayani

    2015-01-01

    If you are a Java developer or data scientist, haven't worked with Apache Mahout before, and want to get up to speed on implementing machine learning on big data, then this is the perfect guide for you.

  5. FEASIBILITY STUDY FOR A PETROLEUM REFINERY FOR THE JICARILLA APACHE TRIBE

    International Nuclear Information System (INIS)

    Jones, John D.

    2004-01-01

    A feasibility study for a proposed petroleum refinery for the Jicarilla Apache Indian Reservation was performed. The available crude oil production was identified and characterized. There are 6,000 barrels per day of crude oil production available for processing in the proposed refinery. The proposed refinery will utilize a lower temperature, smaller crude fractionation unit. It will have a naphtha hydrodesulfurizer and reformer to produce high-octane gasoline. The surplus hydrogen from the reformer will be used in a specialized hydrocracker to convert the heavier crude oil fractions to ultra-low-sulfur gasoline and diesel fuel products. The proposed refinery will produce gasoline, jet fuel, diesel fuel, and a minimal amount of lube oil. The refinery will require about $86,700,000 to construct. It will have a net annual pre-tax profit of about $17,000,000. The estimated return on investment is 20%. The feasibility is positive, subject to confirmation of a long-term crude supply. The study also identified procedures for evaluating processing options as a means for American Indian Tribes and Native American Corporations to maximize the value of their crude oil production.

  6. 78 FR 65370 - Notice of Inventory Completion: Pima County Office of the Medical Examiner, Tucson, AZ

    Science.gov (United States)

    2013-10-31

    ... Tribe of the San Carlos Reservation, Arizona; Tohono O'odham Nation of Arizona; White Mountain Apache... Community of the Gila River Indian Reservation, Arizona; Hopi Tribe of Arizona; Tohono O'odham Nation of...; Tohono O'odham Nation of Arizona; and the Zuni Tribe of the Zuni Reservation, New Mexico. Additional...

  7. 77 FR 15796 - Notice of Intent To Repatriate Cultural Items: U.S. Department of the Interior, Bureau of Indian...

    Science.gov (United States)

    2012-03-16

    ... stone and 1 chert scraper. The Pinnacle Site consists of a pueblo of about 10 rooms and dates from A.D... with additional stone alignments and dates from A.D. 1275-1400, based on the ceramic assemblage. The... Assessment of White Mountain Apache Tribal Lands (Fort Apache Indian Reservation),'' by John R. Welch and T.J...

  8. Representation without Taxation: Citizenship and Suffrage in Indian Country.

    Science.gov (United States)

    Phelps, Glenn A.

    1985-01-01

    Reviews history of Arizona Indian voting rights. Details current dispute over voting rights in Apache County (Arizona). Explores three unanswered questions in light of current constitutional interpretation. Stresses solution to political disputes will require climate of mutual trust, awareness of constitutional rights/obligations of all concerned,…

  9. Apaches push privatization

    International Nuclear Information System (INIS)

    Daniels, S.

    1994-01-01

    Trying to drum up business for what would be the first private temporary storage facility for spent nuclear fuel rods, the Mescalero Apaches are inviting officials of 30 utilities to convene March 10 at the tribe's New Mexico reservation. The state public utilities commission will also attend the meeting, which grew from an agreement the tribe signed last month with Minneapolis-based Northern States Power Co

  10. Perl and Apache Your visual blueprint for developing dynamic Web content

    CERN Document Server

    McDaniel, Adam

    2010-01-01

    Visually explore the range of built-in and third-party libraries of Perl and Apache. Perl and Apache have been providing Common Gateway Interface (CGI) access to Web sites for 20 years and are constantly evolving to support the ever-changing demands of Internet users. With this book, you will heighten your knowledge and see how to use Perl and Apache to develop dynamic Web sites. Beginning with a clear, step-by-step explanation of how to install Perl and Apache on both Windows and Linux servers, you then move on to configuring each to securely provide CGI services. CGI developer and author Adam

  11. Subsurface Analysis of the Mesaverde Group on and near the Jicarilla Apache Indian Reservation, New Mexico-its implication on Sites of Oil and Gas Accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie

    2001-08-21

    The purpose of the phase 2 Mesaverde study, part of the Department of Energy-funded project "Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico", was to define the facies of the oil-producing units within the subsurface units of the Mesaverde Group and integrate these results with outcrop studies that defined the depositional environments of these facies within a sequence stratigraphic context. The focus of this report will center on (1) integration of subsurface correlations with outcrop correlations of components of the Mesaverde, (2) application of the sequence stratigraphic model determined in the phase one study to these correlations, (3) determination of the facies distribution of the Mesaverde Group and their relationship to sites of oil and gas accumulation, (4) evaluation of the thermal maturity and potential source rocks for oil and gas in the Mesaverde Group, and (5) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.

  12. Mastering Apache Cassandra

    CERN Document Server

    Neeraj, Nishant

    2013-01-01

    Mastering Apache Cassandra is a practical, hands-on guide with step-by-step instructions. The smooth and easy tutorial approach focuses on showing people how to utilize Cassandra to its full potential.This book is aimed at intermediate Cassandra users. It is best suited for startups where developers have to wear multiple hats: programmer, DevOps, release manager, convincing clients, and handling failures. No prior knowledge of Cassandra is required.

  13. Apache Cordova 3 programming

    CERN Document Server

    Wargo, John M

    2013-01-01

    Written for experienced mobile developers, Apache Cordova 3 Programming is a complete introduction to Apache Cordova 3 and Adobe PhoneGap 3. It describes what makes Cordova important and shows how to install and use the tools, the new Cordova CLI, the native SDKs, and more. If you’re brand new to Cordova, this book will be just what you need to get started. If you’re familiar with an older version of Cordova, this book will show you in detail how to use all of the new stuff that’s in Cordova 3 plus stuff that has been around for a while (like the Cordova core APIs). After walking you through the process of downloading and setting up the framework, mobile expert John M. Wargo shows you how to install and use the command line tools to manage the Cordova application lifecycle and how to set up and use development environments for several of the more popular Cordova supported mobile device platforms. Of special interest to new developers are the chapters on the anatomy of a Cordova application, as well ...

  14. Mescalero Apache Tribe Monitored Retrievable Storage (MRS)

    Energy Technology Data Exchange (ETDEWEB)

    Peso, F.

    1992-03-13

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS, and; (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, (if any), under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings.

  15. Mescalero Apache Tribe Monitored Retrievable Storage (MRS)

    International Nuclear Information System (INIS)

    Peso, F.

    1992-01-01

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS, and; (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, (if any), under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings

  16. APACHE II as an indicator of ventilator-associated pneumonia (VAP).

    Directory of Open Access Journals (Sweden)

    Kelser de Souza Kock

    2015-01-01

    Background and objectives: Strategies for risk stratification in severe pathologies are extremely important. The aim of this study was to analyze the accuracy of the APACHE II score as an indicator of ventilator-associated pneumonia (VAP) in ICU patients at Hospital Nossa Senhora da Conceição (HNSC), Tubarão-SC. Methods: A prospective cohort study was conducted with 120 patients admitted between March and August 2013, with APACHE II calculated in the first 24 hours of mechanical ventilation (MV). Patients were followed until one of the following outcomes: discharge or death. The cause of ICU admission, age, gender, days of mechanical ventilation, length of ICU stay, and outcome were also analyzed. Results: The incidence of VAP was 31.8% (38/120). Two variables showed a relative risk for the development of VAP: APACHE II above average (RR = 1.62; 95% CI 1.03-2.55) and male gender (RR = 1.56; 95% CI 1.18-2.08). Duration of mechanical ventilation above average (18.4 ± 14.9 days; p = 0.001) and ICU stay above average (20.4 ± 15.3 days; p = 0.003) were associated with the development of VAP. An APACHE II score >23 predicted VAP with a sensitivity of 84% and a specificity of 33%. In relation to death, two variables showed a relative risk: age above average (RR = 2.08; 95% CI 1.34-3.23) and ICU stay above average (RR = 2.05; 95% CI 1.28-3.28). Conclusion: An APACHE II score of 23 or above may indicate the risk of VAP. Keywords: Pneumonia, Ventilator-Associated; Intensive Care Units; APACHE; Prognosis.
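
    For readers less familiar with the measures reported above, the standard definitions (general epidemiological formulas, not taken from the paper itself) are:

    \[ \text{sensitivity} = \frac{TP}{TP + FN}, \qquad \text{specificity} = \frac{TN}{TN + FP}, \qquad RR = \frac{a/(a+b)}{c/(c+d)} \]

    where TP, FN, TN, and FP are the true positives, false negatives, true negatives, and false positives for the APACHE II > 23 cutoff, and the relative risk RR compares the incidence of the outcome in the exposed group (a out of a+b) with that in the unexposed group (c out of c+d).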

  17. Apache ZooKeeper essentials

    CERN Document Server

    Haloi, Saurav

    2015-01-01

    Whether you are a novice to ZooKeeper or already have some experience, you will be able to master the concepts of ZooKeeper and its usage with ease. This book assumes you have some prior knowledge of distributed systems and high-level programming knowledge of C, Java, or Python, but no experience with Apache ZooKeeper is required.

  18. Sequence Stratigraphic Analysis and Facies Architecture of the Cretaceous Mancos Shale on and Near the Jicarilla Apache Indian Reservation, New Mexico-their relation to Sites of Oil Accumulation; FINAL

    International Nuclear Information System (INIS)

    Ridgley, Jennie

    2001-01-01

    The purpose of phase 1 and phase 2 of the Department of Energy-funded project "Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico" was to define the facies of the oil-producing units within the Mancos Shale and interpret the depositional environments of these facies within a sequence stratigraphic context. The focus of this report will center on (1) redefinition of the area and vertical extent of the "Gallup sandstone" or El Vado Sandstone Member of the Mancos Shale, (2) determination of the facies distribution within the "Gallup sandstone" and other oil-producing sandstones within the lower Mancos, placing these facies within the overall depositional history of the San Juan Basin, (3) application of the principles of sequence stratigraphy to the depositional units that comprise the Mancos Shale, and (4) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.

  19. The Creation of a Carmeleño Identity:Marriage Practices in the Indian Village at Mission San Carlos Borromeo del Río Carmel

    OpenAIRE

    Peelo, Sarah

    2010-01-01

    Indigenous peoples from diverse tribelets lived within the Indian village at Mission San Carlos Borromeo del Río Carmel. In precolonial times, California Indians formed identities tied to their tribelets. In the mission, those identities were reproduced as members of this pluralistic community formed a connection with their new place of residence. In this paper, I illustrate how marriage was one arena within which different indigenous peoples at this mission may have created a shared sense o...

  20. One hundred years of instrumental phonetic fieldwork on North America Indian languages

    Science.gov (United States)

    McDonough, Joyce

    2005-04-01

    A resurgence of interest in phonetic fieldwork on the generally morphologically complex North American Indian languages over the last 15 years continues a tradition started a century ago with Earle Pliny Goddard, who collected kymographic and palatographic field data between 1906 and 1927 on several Athabaskan languages: Coastal Athabaskan (Hupa and Kato), Apachean (Mescalero, Jicarilla, White Mountain, San Juan Carlos Apache), and several Athabaskan languages in Northern Canada (Cold Lake and Beaver); data that remain important for their record of segmental timing profiles and rare articulatory documentation in then largely monolingual communities. These data, in combination with new work, have resulted in the emergence of a body of knowledge of these typologically distinct families that often challenges notions of phonetic universality and typology. Using the Athabaskan languages as a benchmark example and starting with Goddard's work, two types of emergent typological patterns will be discussed: the persistence of fine-grained timing and duration details across the widely dispersed family, and the broad variation in prosodic types that exists, both of which are unaccounted for by phonetic or phonological theories.

  1. Instant Apache Camel message routing

    CERN Document Server

    Ibryam, Bilgin

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks, this short, instruction-based guide shows you how to perform application integration using the industry-standard Enterprise Integration Patterns. This book is intended for Java developers who are new to Apache Camel and message-oriented applications.

  2. Biology and distribution of Lutzomyia apache as it relates to VSV

    Science.gov (United States)

    Phlebotomine sand flies are vectors of bacteria, parasites, and viruses. Lutzomyia apache was incriminated as a vector of vesicular stomatitis viruses (VSV) due to overlapping ranges of the sand fly and outbreaks of VSV. I report on newly discovered populations of L. apache in Wyoming from Albany and ...

  3. Nuclear data physics issues in Monte Carlo simulations of neutron and photon transport in the Indian context

    International Nuclear Information System (INIS)

    Ganesan, S.

    2009-01-01

    In this write-up, some of the basic issues of nuclear data physics in Monte Carlo simulation of neutron transport in the Indian context are dealt with. In this lecture, some of the aspects associated with usage of the ENDF/B system, and of the PREPRO code system developed by D.E. Cullen and distributed by the IAEA Nuclear Data Section are briefly touched upon. Some aspects of the SIGACE code system which was developed by the author in collaboration with IPR, Ahmedabad and the IAEA Nuclear Data Section are also briefly covered. The validation of the SIGACE package included investigations using the NJOY and the MCNP compatible ACE files. Appendix-1 of the paper provides some useful discussions pointing out that voluminous and high-quality nuclear physics data required for nuclear applications usually evolve from a national effort to provide state-of-the-art data that are based upon established needs and uncertainties. Appendix-2 deals with some interesting work that was carried out using the SIGACE Code for Generating High Temperature ACE Files. Appendix-3 mentions briefly Integral nuclear data validation studies and use of Monte Carlo codes and nuclear data. Appendix-4 provides a brief summary report on selected Indian nuclear data physics activities for the interested reader in the light of BARC/DAE treating the subject area of nuclear data physics as a thrust area in our atomic energy programme

  4. An Indian tribal view of the back end of the nuclear fuel cycle: historical and cultural lessons

    International Nuclear Information System (INIS)

    Tano, M.L.; Powankee, D.; Lester, A.D.

    1995-01-01

    The Nez Perce Tribe, the Confederated Tribes of the Umatilla Indian Reservation and the Yakama Indian Nation have entered into cooperative agreements with the US Department of Energy to oversee the cleanup of the Hanford Reservation. The Mescalero Apache Tribe and the Meadow Lake Tribal Council have come under severe criticism from some "ideologically pure" Indians and non-Indians for aiding and abetting the violation of Mother Earth by permitting the land to be contaminated by radioactive wastes. This paper suggests that this view of the Indian relationship to nature and the environment is too narrow and describes aspects of Indian religion that support tribal involvement in radioactive waste management. (O.M.)

  5. 77 FR 51475 - Safety Zone; Apache Pier Labor Day Fireworks; Myrtle Beach, SC

    Science.gov (United States)

    2012-08-24

    ...-AA00 Safety Zone; Apache Pier Labor Day Fireworks; Myrtle Beach, SC AGENCY: Coast Guard, DHS. ACTION... Atlantic Ocean in the vicinity of Apache Pier in Myrtle Beach, SC, during the Labor Day fireworks...

  6. Growth and survival of Apache Trout under static and fluctuating temperature regimes

    Science.gov (United States)

    Recsetar, Matthew S.; Bonar, Scott A.; Feuerbacher, Olin

    2014-01-01

    Increasing stream temperatures have important implications for arid-region fishes. Little is known about effects of high water temperatures that fluctuate over extended periods on Apache Trout Oncorhynchus gilae apache, a federally threatened species of southwestern USA streams. We compared survival and growth of juvenile Apache Trout held for 30 d in static temperatures (16, 19, 22, 25, and 28°C) and fluctuating diel temperatures (±3°C from 16, 19, 22 and 25°C midpoints and ±6°C from 19°C and 22°C midpoints). Lethal temperature for 50% (LT50) of the Apache Trout under static temperatures (mean [SD] = 22.8 [0.6]°C) was similar to that of ±3°C diel temperature fluctuations (23.1 [0.1]°C). Mean LT50 for the midpoint of the ±6°C fluctuations could not be calculated because survival in the two treatments (19 ± 6°C and 22 ± 6°C) was not below 50%; however, it probably was also between 22°C and 25°C because the upper limb of a ±6°C fluctuation on a 25°C midpoint is above critical thermal maximum for Apache Trout (28.5–30.4°C). Growth decreased as temperatures approached the LT50. Apache Trout can survive short-term exposure to water temperatures with daily maxima that remain below 25°C and midpoint diel temperatures below 22°C. However, median summer stream temperatures must remain below 19°C for best growth and even lower if daily fluctuations are high (≥12°C).

  7. 75 FR 57290 - Notice of Inventory Completion: University of Colorado Museum, Boulder, CO

    Science.gov (United States)

    2010-09-20

    ... Indian Colony, Nevada; Lovelock Paiute Tribe of the Lovelock Indian Colony, Nevada; Mescalero Apache...; Lovelock Paiute Tribe of the Lovelock Indian Colony, Nevada; Mescalero Apache Tribe of the Mescalero...

  8. Apache, Santa Fe energy units awarded two Myanmar blocks

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that Myanmar's state oil company has awarded production sharing contracts (PSCs) on two blocks to units of Apache Corp. and Santa Fe Energy Resources Inc., both of Houston. That comes on the heels of a report by County NatWest Woodmac noting that Myanmar's oil production, currently meeting less than half the country's demand, is set to fall further this year. 150 line km of new seismic data could be acquired and one well drilled. During the initial 2-year exploration period on Block EP-3, Apache will conduct geological studies and acquire at least 200 line km of seismic data.

  9. Satellite Imagery Production and Processing Using Apache Hadoop

    Science.gov (United States)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework and can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large-scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations and finally challenges and ongoing activities. It will also present how the LSRD project built a 102-core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full-time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
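
    As a rough illustration of the pattern described above (wrapping per-scene science processing in a distributed Hadoop job), the following is a hypothetical Hadoop Streaming-style mapper written in Python; the scene list, processing routine, and output paths are assumptions, and this is not the LSRD project's actual code. Such a mapper is typically launched with the standard hadoop-streaming jar, with a text file of scene identifiers as the job input and this script as the mapper.

    #!/usr/bin/env python3
    # Hypothetical Hadoop Streaming mapper: each input line names a Landsat scene;
    # the mapper runs a per-scene processing routine and emits a status record.
    import sys

    def process_scene(scene_id):
        # placeholder for the real per-scene science processing (e.g., ECV generation)
        return "/output/{}_ecv.tif".format(scene_id)

    for line in sys.stdin:
        scene_id = line.strip()
        if not scene_id:
            continue
        try:
            out_path = process_scene(scene_id)
            print("{}\tOK\t{}".format(scene_id, out_path))
        except Exception as exc:  # report failures as job output rather than crashing the task
            print("{}\tFAIL\t{}".format(scene_id, exc))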

  10. An External Independent Validation of APACHE IV in a Malaysian Intensive Care Unit.

    Science.gov (United States)

    Wong, Rowena S Y; Ismail, Noor Azina; Tan, Cheng Cheng

    2015-04-01

    Intensive care unit (ICU) prognostic models are predominantly used in more developed nations such as the United States, Europe and Australia. These are not that popular in Southeast Asian countries due to costs and technology considerations. The purpose of this study is to evaluate the suitability of the acute physiology and chronic health evaluation (APACHE) IV model in a single centre Malaysian ICU. A prospective study was conducted at the single centre ICU in Hospital Sultanah Aminah (HSA) Malaysia. External validation of APACHE IV involved a cohort of 916 patients who were admitted in 2009. Model performance was assessed through its calibration and discrimination abilities. A first-level customisation using a logistic regression approach was also applied to improve model calibration. APACHE IV exhibited good discrimination, with an area under the receiver operating characteristic (ROC) curve of 0.78. However, the model's overall fit was observed to be poor, as indicated by the Hosmer-Lemeshow goodness-of-fit test (Ĉ = 113); after first-level customisation, discrimination was not affected. APACHE IV is not suitable for application in the HSA ICU without further customisation. The model's lack of fit in the Malaysian study is attributed to differences in the baseline characteristics between the HSA ICU and APACHE IV datasets. Other possible factors could be differences in clinical practice and in the quality and services of the health care systems between Malaysia and the United States.
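
    A first-level customisation of the kind mentioned above is commonly implemented by refitting an intercept and slope to the logit of the original model's predicted probabilities on the local cohort. The Python sketch below shows that general recipe under assumed variable names; it is not the study's own code.

    # Hypothetical first-level customisation: recalibrate APACHE IV probabilities
    # by fitting a new logistic regression on the logit of the original predictions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def first_level_customisation(p_apache, died):
        """p_apache: original predicted mortality (0-1); died: observed 0/1 outcomes."""
        p = np.clip(np.asarray(p_apache, dtype=float), 1e-6, 1 - 1e-6)
        logit = np.log(p / (1 - p)).reshape(-1, 1)
        model = LogisticRegression(C=1e6)  # large C: effectively unpenalised refit
        model.fit(logit, np.asarray(died))
        return model  # model.predict_proba(new_logits)[:, 1] gives recalibrated risk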

  11. Network Intrusion Detection System using Apache Storm

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Manzoor

    2017-06-01

    Network security implements various strategies for the identification and prevention of security breaches. Network intrusion detection is a critical component of network management for security, quality of service and other purposes. These systems allow early detection of network intrusion and malicious activities, so that the network security infrastructure can react to mitigate these threats. Various systems have been proposed to enhance network security. We propose to use an anomaly-based network intrusion detection system in this work; anomaly-based intrusion detection systems can identify new network threats. We also propose to use a real-time big data stream processing framework, Apache Storm, for the implementation of the network intrusion detection system. Apache Storm can help to manage the network traffic, which is generated at enormous speed and volume and is constantly increasing. We have used a Support Vector Machine in this work, and we use the Knowledge Discovery and Data Mining 1999 (KDD'99) dataset to test and evaluate our proposed solution.
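
    As a minimal sketch of the classification stage only (the Apache Storm topology that would feed it live traffic is not shown), the Python code below trains a Support Vector Machine on a pre-extracted, numeric KDD'99-style feature table; the file name and column layout are assumptions.

    # Hypothetical offline training of the SVM used for anomaly classification.
    import pandas as pd
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import classification_report

    df = pd.read_csv("kddcup99_features.csv")       # assumed: numeric features plus a 'label' column
    X, y = df.drop(columns=["label"]), df["label"]  # label: 'normal' vs. attack categories
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te)))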

  12. Optimizing CMS build infrastructure via Apache Mesos

    CERN Document Server

    Abduracmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad

    2015-12-23

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general-use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos-enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  13. Managing Variant Calling Files the Big Data Way: Using HDFS and Apache Parquet

    NARCIS (Netherlands)

    Boufea, Aikaterini; Finkers, H.J.; Kaauwen, van M.P.W.; Kramer, M.R.; Athanasiadis, I.N.

    2017-01-01

    Big Data has been seen as a remedy for the efficient management of the ever-increasing genomic data. In this paper, we investigate the use of Apache Spark to store and process Variant Calling Files (VCF) on a Hadoop cluster. We demonstrate Tomatula, a software tool for converting VCF files to Apache Parquet.
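
    The general idea (reading tab-separated VCF records and writing them out as columnar Apache Parquet on HDFS) can be sketched in a few lines of PySpark. This is a hypothetical simplification, not the Tomatula tool itself; the paths are assumptions and the optional per-sample genotype columns are ignored.

    # Hypothetical PySpark sketch: VCF text -> Parquet on HDFS.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("vcf-to-parquet").getOrCreate()

    # VCF is tab separated; header and metadata lines begin with '#'.
    raw = (spark.read
           .option("sep", "\t")
           .option("comment", "#")
           .csv("hdfs:///data/samples.vcf"))

    # Keep only the eight fixed VCF columns (simplification).
    vcf = raw.select(raw.columns[:8]).toDF(
        "chrom", "pos", "id", "ref", "alt", "qual", "filter", "info")
    vcf.write.mode("overwrite").parquet("hdfs:///data/samples.parquet")

    # Downstream analyses read the columnar store and push filters down, e.g.:
    spark.read.parquet("hdfs:///data/samples.parquet").filter("chrom = '1'").count()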

  14. Jicarilla Apache Utility Authority Renewable Energy and Energy Efficiency Strategic Planning

    Energy Technology Data Exchange (ETDEWEB)

    Rabago, K.R.

    2008-06-28

    The purpose of this Strategic Plan Report is to provide an introduction and in-depth analysis of the issues and opportunities, resources, and technologies of energy efficiency and renewable energy that have potential beneficial application for the people of the Jicarilla Apache Nation and surrounding communities. The Report seeks to draw on the best available information that existed at the time of writing, and, where necessary, draws on new research to assess this potential. This study provides a strategic assessment of opportunities for maximizing the potential for electrical energy efficiency and renewable energy development by the Jicarilla Apache Nation. The report analyzes electricity use in buildings on the Jicarilla Apache Reservation. The report also assesses particular resources and technologies in detail, including energy efficiency, solar, wind, geothermal, biomass, and small hydropower. The closing sections set out the elements of a multi-year, multi-phase strategy for development of resources to the maximum benefit of the Nation.

  15. 75 FR 68607 - BP Canada Energy Marketing Corp. Apache Corporation; Notice for Temporary Waivers

    Science.gov (United States)

    2010-11-08

    ... Energy Marketing Corp. Apache Corporation; Notice for Temporary Waivers November 1, 2010. Take notice that on October 29, 2010, BP Canada Energy Marketing Corp. and Apache Corporation filed with the... assistance with any FERC Online service, please e-mail [email protected] , or call (866) 208-3676...

  16. Learning Apache Solr high performance

    CERN Document Server

    Mohan, Surendra

    2014-01-01

    This book is an easy-to-follow guide, full of hands-on, real-world examples. Each topic is explained and demonstrated in a specific and user-friendly flow, from search optimization using Solr to deployment of ZooKeeper applications. This book is ideal for Apache Solr developers who want to learn different techniques to optimize Solr performance with utmost efficiency, along with effectively troubleshooting the problems that usually occur while trying to boost performance. Familiarity with search servers and database querying is expected.

  17. Prediction of heart disease using apache spark analysing decision trees and gradient boosting algorithm

    Science.gov (United States)

    Chugh, Saryu; Arivu Selvan, K.; Nadesh, RK

    2017-11-01

    Numerous harmful factors affect the functioning of the human body, such as hypertension, smoking, obesity, and inappropriate medication use, and these contribute to many different diseases such as diabetes, thyroid disorders, stroke, and coronary disease. Environmental conditions are also a contributing cause of coronary disease. Apache Spark suits analyses that require gathering and processing large volumes of data: it runs in a distributed environment, processes data in batches with high throughput, and is fast owing to its built-in in-memory processing. The use of data mining techniques in the diagnosis of coronary disease has been examined extensively, showing acceptable levels of accuracy. Decision trees, neural networks, and the gradient boosting algorithm are among the Apache Spark techniques that help in analysing this information.
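
    As a hypothetical sketch of the approach described above (tree-based classifiers in Apache Spark), the PySpark ML code below compares a decision tree with gradient-boosted trees on a heart-disease table; the file name, feature columns, and label column are assumptions, not the study's data.

    # Hypothetical comparison of DecisionTreeClassifier and GBTClassifier in Spark ML.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import DecisionTreeClassifier, GBTClassifier
    from pyspark.ml.evaluation import BinaryClassificationEvaluator

    spark = SparkSession.builder.appName("heart-disease").getOrCreate()
    df = spark.read.csv("heart.csv", header=True, inferSchema=True)  # assumed 0/1 'target' label

    features = [c for c in df.columns if c != "target"]
    data = (VectorAssembler(inputCols=features, outputCol="features")
            .transform(df)
            .withColumnRenamed("target", "label"))
    train, test = data.randomSplit([0.7, 0.3], seed=42)

    evaluator = BinaryClassificationEvaluator(metricName="areaUnderROC")
    for clf in (DecisionTreeClassifier(), GBTClassifier(maxIter=50)):
        model = clf.fit(train)
        print(type(clf).__name__, "AUROC:", evaluator.evaluate(model.transform(test)))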

  18. Conservation priorities in the Apache Highlands ecoregion

    Science.gov (United States)

    Dale Turner; Rob Marshall; Carolyn A. F. Enquist; Anne Gondor; David F. Gori; Eduardo Lopez; Gonzalo Luna; Rafaela Paredes Aguilar; Chris Watts; Sabra Schwartz

    2005-01-01

    The Apache Highlands ecoregion incorporates the entire Madrean Archipelago/Sky Island region. We analyzed the current distribution of 223 target species and 26 terrestrial ecological systems there, and compared them with constraints on ecosystem integrity (e.g., road density) to determine the most efficient set of areas needed to maintain current biodiversity. The...

  19. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    Hoffman, Steve

    2015-01-01

    If you are a Hadoop programmer who wants to learn about Flume to be able to move datasets into Hadoop in a timely and replicable manner, then this book is ideal for you. No prior knowledge about Apache Flume is necessary, but a basic knowledge of Hadoop and the Hadoop Distributed File System (HDFS) is assumed.

  20. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    DEFF Research Database (Denmark)

    Majewski, Steven R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.

    2017-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the...

  1. Instant Apache Camel messaging system

    CERN Document Server

    Sharapov, Evgeniy

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A beginner's guide to Apache Camel that walks you through basic operations like installation and setup right through to developing simple applications. This book is a good starting point for Java developers who have to work on an application dealing with various systems and interfaces but who haven't yet started using Enterprise Service Buses or Java Business Integration frameworks.

  2. The Ability of the Acute Physiology and Chronic Health Evaluation (APACHE) IV Score to Predict Mortality in a Single Tertiary Hospital

    Directory of Open Access Journals (Sweden)

    Jae Woo Choi

    2017-08-01

    Background: The Acute Physiology and Chronic Health Evaluation (APACHE) II model has been widely used in Korea. However, there have been few studies on the APACHE IV model in Korean intensive care units (ICUs). The aim of this study was to compare the ability of APACHE IV and APACHE II in predicting hospital mortality, and to investigate the ability of APACHE IV as a critical care triage criterion. Methods: The study was designed as a prospective cohort study. Measurements of discrimination and calibration were performed using the area under the receiver operating characteristic curve (AUROC) and the Hosmer-Lemeshow goodness-of-fit test, respectively. We also calculated the standardized mortality ratio (SMR). Results: The APACHE IV score, the Charlson Comorbidity Index (CCI) score, acute respiratory distress syndrome, and unplanned ICU admissions were independently associated with hospital mortality. The calibration, discrimination, and SMR of APACHE IV were good (H = 7.67, P = 0.465; C = 3.42, P = 0.905; AUROC = 0.759; SMR = 1.00). However, the explanatory power of an APACHE IV score >93 alone on hospital mortality was low, at 44.1%. The explanatory power increased to 53.8% when hospital mortality was predicted using a model that considers APACHE IV scores >93, medical admission, and CCI >3 risk factors simultaneously. However, the discriminative ability of the prediction model was unsatisfactory (C index <0.70). Conclusions: The APACHE IV presented good discrimination, calibration, and SMR for hospital mortality.
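
    For reference, the standard formulas behind the reported SMR and Hosmer-Lemeshow statistics (general definitions, not taken from the paper) are:

    \[ \mathrm{SMR} = \frac{\text{observed deaths}}{\sum_{i} \hat{p}_i}, \qquad H = \sum_{g=1}^{G} \frac{(O_g - E_g)^2}{E_g\,(1 - E_g/n_g)} \]

    where \hat{p}_i is the predicted probability of death for patient i, and O_g, E_g, and n_g are the observed deaths, expected deaths, and number of patients in each of G risk groups.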

  3. Predictive value of SAPS II and APACHE II scoring systems for patient outcome in a medical intensive care unit

    Directory of Open Access Journals (Sweden)

    Amina Godinjak

    2016-11-01

    Objective: The aim is to determine SAPS II and APACHE II scores in medical intensive care unit (MICU) patients, to compare them for prediction of patient outcome, and to compare them with actual hospital mortality rates for different subgroups of patients. Methods: One hundred and seventy-four patients were included in this analysis over a one-year period in the MICU, Clinical Center, University of Sarajevo. The following patient data were obtained: demographics, admission diagnosis, SAPS II, APACHE II scores and final outcome. Results: Out of 174 patients, 70 patients (40.2%) died. Mean SAPS II and APACHE II scores in all patients were 48.4±17.0 and 21.6±10.3 respectively, and they were significantly different between survivors and non-survivors. SAPS II >50.5 and APACHE II >27.5 can predict the risk of mortality in these patients. There was no statistically significant difference in the clinical values of SAPS II vs APACHE II (p=0.501). A statistically significant positive correlation was established between the values of SAPS II and APACHE II (r=0.708; p=0.001). Patients with an admission diagnosis of sepsis/septic shock had the highest values of both SAPS II and APACHE II scores, and also the highest hospital mortality rate of 55.1%. Conclusion: Both APACHE II and SAPS II had an excellent ability to discriminate between survivors and non-survivors. There was no significant difference in the clinical values of SAPS II and APACHE II. A positive correlation was established between them. Sepsis/septic shock patients had the highest predicted and observed hospital mortality rate.

  4. Spinal Pain and Occupational Disability: A Cohort Study of British Apache AH Mk1 Pilots

    Science.gov (United States)

    2013-09-01

    British RW community. References: Apache AH Mk1. 2012. AgustaWestland. http://www.agustawestland.com/product/apache-ah-mk1-0; Ang, B., and ...

  5. Evaluation of APACHE II system among intensive care patients at a teaching hospital

    Directory of Open Access Journals (Sweden)

    Paulo Antonio Chiavone

    CONTEXT: The high-complexity features of intensive care unit services and the clinical situation of patients themselves render correct prognosis fundamentally important not only for patients, their families and physicians, but also for hospital administrators, fund-providers and controllers. Prognostic indices have been developed for estimating hospital mortality rates for hospitalized patients, based on demographic, physiological and clinical data. OBJECTIVE: The APACHE II system was applied within an intensive care unit to evaluate its ability to predict patient outcome; to compare illness severity with outcomes for clinical and surgical patients; and to compare the recorded result with the predicted death rate. DESIGN: Diagnostic test. SETTING: Clinical and surgical intensive care unit in a tertiary-care teaching hospital. PARTICIPANTS: The study involved 521 consecutive patients admitted to the intensive care unit from July 1998 to June 1999. MAIN MEASUREMENTS: APACHE II score, in-hospital mortality, receiver operating characteristic curve, decision matrices and linear regression analysis. RESULTS: The patients' mean age was 50 ± 19 years and the APACHE II score was 16.7 ± 7.3. There were 166 clinical patients (32%), 173 post-elective surgery patients (33%), and 182 post-emergency surgery patients (35%), thus producing statistically similar proportions. The APACHE II scores for clinical patients (18.5 ± 7.8) were similar to those for non-elective surgery patients (18.6 ± 6.5), and both were greater than for elective surgery patients (13.0 ± 6.3) (p < 0.05). The higher this score was, the higher the mortality rate was (p < 0.05). The predicted death rate was 25.6% and the recorded death rate was 35.5%. Through the use of receiver operating characteristic curve analysis, good discrimination was found (area under the curve = 0.80). From the 2 x 2 decision matrix, 72.2% of patients were correctly classified (sensitivity = 35.1%; specificity = 92.6%). Linear

  6. Use of APACHE II and SAPS II to predict mortality for hemorrhagic and ischemic stroke patients.

    Science.gov (United States)

    Moon, Byeong Hoo; Park, Sang Kyu; Jang, Dong Kyu; Jang, Kyoung Sool; Kim, Jong Tae; Han, Yong Min

    2015-01-01

    We studied the applicability of the Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiology Score II (SAPS II) in patients admitted to the intensive care unit (ICU) with acute stroke and compared the results with the Glasgow Coma Scale (GCS) and National Institutes of Health Stroke Scale (NIHSS). We also conducted a comparative study of accuracy for predicting hemorrhagic and ischemic stroke mortality. Between January 2011 and December 2012, ischemic or hemorrhagic stroke patients admitted to the ICU were included in the study. APACHE II and SAPS II-predicted mortalities were compared using a calibration curve, the Hosmer-Lemeshow goodness-of-fit test, and the receiver operating characteristic (ROC) curve, and the results were compared with the GCS and NIHSS. Overall 498 patients were included in this study. The observed mortality was 26.3%, whereas APACHE II and SAPS II-predicted mortalities were 35.12% and 35.34%, respectively. The mean GCS and NIHSS scores were 9.43 and 21.63, respectively. The calibration curve was close to the line of perfect prediction. The ROC curve showed a slightly better prediction of mortality for APACHE II in hemorrhagic stroke patients and SAPS II in ischemic stroke patients. The GCS and NIHSS were inferior in predicting mortality in both patient groups. Although both the APACHE II and SAPS II systems can be used to measure performance in the neurosurgical ICU setting, the accuracy of APACHE II in hemorrhagic stroke patients and SAPS II in ischemic stroke patients was superior. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Analyzing large data sets from XGC1 magnetic fusion simulations using apache spark

    Energy Technology Data Exchange (ETDEWEB)

    Churchill, R. Michael [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    2016-11-21

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
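    A minimal sketch of the analysis pattern described, assuming PySpark with MLlib; the file path and feature column names are placeholders rather than the actual XGC1 data layout:

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("xgc1-kmeans-sketch").getOrCreate()

# Placeholder: particle distribution function samples already in columnar form.
df = spark.read.parquet("hdfs:///data/xgc1/distribution_function.parquet")

# Assemble velocity-space coordinates into a feature vector and cluster them.
assembler = VectorAssembler(inputCols=["v_parallel", "v_perp"], outputCol="features")
model = KMeans(k=8, seed=42, featuresCol="features").fit(assembler.transform(df))

for center in model.clusterCenters():
    print(center)
```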

  8. Mechanical characterization of densely welded Apache Leap tuff

    International Nuclear Information System (INIS)

    Fuenkajorn, K.; Daemen, J.J.K.

    1991-06-01

    An empirical criterion is formulated to describe the compressive strength of the densely welded Apache Leap tuff. The criterion incorporates the effects of size, L/D ratio, loading rate and density variations. The criterion improves the correlation between the test results and the failure envelope. Uniaxial and triaxial compressive strengths, Brazilian tensile strength and elastic properties of the densely welded brown unit of the Apache Leap tuff have been determined using the ASTM standard test methods. All tuff samples are tested dry at room temperature (22 ± 2 degrees C), and have the core axis normal to the flow layers. The uniaxial compressive strength is 73.2 ± 16.5 MPa. The Brazilian tensile strength is 5.12 ± 1.2 MPa. The Young's modulus and Poisson's ratio are 22.6 ± 5.7 GPa and 0.20 ± 0.03. Smoothness and perpendicularity do not fully meet the ASTM requirements for all samples, due to the presence of voids and inclusions on the sample surfaces and the sample preparation methods. The investigations of loading rate, L/D ratio and cyclic loading effects on the compressive strength and of the size effect on the tensile strength are not conclusive. The Coulomb strength criterion adequately represents the failure envelope of the tuff under confining pressures from 0 to 62 MPa. Cohesion and internal friction angle are 16 MPa and 43 degrees. The brown unit of the Apache Leap tuff is highly heterogeneous as suggested by large variations of the test results. The high intrinsic variability of the tuff is probably caused by the presence of flow layers and by nonuniform distributions of inclusions, voids and degree of welding. Similar variability of the properties has been found in publications on the Topopah Spring tuff at Yucca Mountain. 57 refs., 32 figs., 29 tabs
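    As a worked illustration of the Coulomb failure envelope reported above (cohesion c = 16 MPa, internal friction angle 43 degrees), the sketch below evaluates the shear strength tau = c + sigma_n * tan(phi) over the tested confining-pressure range; it is illustrative only.

```python
import numpy as np

COHESION_MPA = 16.0          # c, from the triaxial tests
FRICTION_ANGLE_DEG = 43.0    # phi, internal friction angle

def coulomb_shear_strength(normal_stress_mpa):
    """Coulomb criterion: tau = c + sigma_n * tan(phi)."""
    phi = np.radians(FRICTION_ANGLE_DEG)
    return COHESION_MPA + normal_stress_mpa * np.tan(phi)

for sigma_n in (0.0, 10.0, 30.0, 62.0):   # MPa, within the 0-62 MPa test range
    print(f"sigma_n = {sigma_n:5.1f} MPa  ->  tau = {coulomb_shear_strength(sigma_n):6.1f} MPa")
```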

  9. Jicarilla Apache Utility Authority. Strategic Plan for Energy Efficiency and Renewable Energy Development

    International Nuclear Information System (INIS)

    Rabago, K.R.

    2008-01-01

    The purpose of this Strategic Plan Report is to provide an introduction and in-depth analysis of the issues and opportunities, resources, and technologies of energy efficiency and renewable energy that have potential beneficial application for the people of the Jicarilla Apache Nation and surrounding communities. The Report seeks to draw on the best available information that existed at the time of writing, and where necessary, draws on new research to assess this potential. This study provides a strategic assessment of opportunities for maximizing the potential for electrical energy efficiency and renewable energy development by the Jicarilla Apache Nation. The report analyzes electricity use in buildings on the Jicarilla Apache Reservation. The report also assesses particular resources and technologies in detail, including energy efficiency, solar, wind, geothermal, biomass, and small hydropower. The closing sections set out the elements of a multi-year, multi-phase strategy for development of resources to the maximum benefit of the Nation

  10. Mescalero Apache Tribe Monitored Retrievable Storage (MRS). Phase 1 feasibility study report

    Energy Technology Data Exchange (ETDEWEB)

    Peso, F.

    1992-03-13

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS; and (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, if any, under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings.

  11. Resonance – Journal of Science Education | Indian Academy of ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 8, Issue 4. Markov Chain Monte Carlo Methods - Simple Monte Carlo. K. B. Athreya, Mohan Delampady, T. Krishnan. School of ORIE, Rhodes Hall, Cornell University, Ithaca, New York 14853, USA; Indian Statistical Institute, 8th Mile, Mysore Road ...

  12. Assessment of performance and utility of mortality prediction models in a single Indian mixed tertiary intensive care unit.

    Science.gov (United States)

    Sathe, Prachee M; Bapat, Sharda N

    2014-01-01

    To assess the performance and utility of two mortality prediction models, viz. Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiology Score II (SAPS II), in a single Indian mixed tertiary intensive care unit (ICU). Secondary objectives were benchmarking and setting a baseline for research. In this observational cohort, data needed for calculation of both scores were prospectively collected for all consecutive admissions to a 28-bed ICU in the year 2011. After excluding readmissions, discharges within 24 h and age <18 years, the records of 1543 patients were analyzed using appropriate statistical methods. Both models overpredicted mortality in this cohort [standardized mortality ratio (SMR) 0.88 ± 0.05 and 0.95 ± 0.06 using APACHE II and SAPS II, respectively]. Patterns of predicted mortality had strong association with true mortality (R2 = 0.98 for APACHE II and R2 = 0.99 for SAPS II). Both models performed poorly in formal Hosmer-Lemeshow goodness-of-fit testing (Chi-square = 12.8 (P = 0.03) for APACHE II, Chi-square = 26.6 (P = 0.001) for SAPS II) but showed good discrimination (area under receiver operating characteristic curve 0.86 ± 0.013 SE (P < 0.001) and 0.83 ± 0.013 SE (P < 0.001) for APACHE II and SAPS II, respectively). There were wide variations in SMRs calculated for subgroups based on the International Classification of Diseases, 10th edition (standard deviation ± 0.27 for APACHE II and 0.30 for SAPS II). Lack of fit of data to the models and wide variation in SMRs in subgroups put a limitation on utility of these models as tools for assessing quality of care and comparing performances of different units without customization. Considering comparable performance and simplicity of use, efforts should be made to adapt SAPS II.

  13. Better prognostic marker in ICU - APACHE II, SOFA or SAP II!

    Science.gov (United States)

    Naqvi, Iftikhar Haider; Mahmood, Khalid; Ziaullaha, Syed; Kashif, Syed Mohammad; Sharif, Asim

    2016-01-01

    This study was designed to determine the comparative efficacy of different scoring systems in assessing the prognosis of critically ill patients. This was a retrospective study conducted in the medical intensive care unit (MICU) and high dependency unit (HDU), Medical Unit III, Civil Hospital, from April 2012 to August 2012. All patients over 16 years of age who fulfilled the criteria for MICU admission were included. Predicted mortality by APACHE II, SAP II and SOFA was calculated. Calibration and discrimination were used to assess the validity of each scoring model. A total of 96 patients with equal gender distribution were enrolled. The average APACHE II score in non-survivors (27.97±8.53) was higher than in survivors (15.82±8.79), a statistically significant difference, and APACHE II showed better discrimination power than SAP II and SOFA.

  14. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  15. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  16. Ergonomic and anthropometric issues of the forward Apache crew station

    NARCIS (Netherlands)

    Oudenhuijzen, A.J.K.

    1999-01-01

    This paper describes the anthropometric accommodation in the Apache crew systems. These activities are part of a comprehensive project, in a cooperative effort from the Armstrong Laboratory at Wright Patterson Air Force Base (Dayton, Ohio, USA) and TNO Human Factors Research Institute (TNO HFRI) in

  17. 25 CFR 183.1 - What is the purpose of this part?

    Science.gov (United States)

    2010-04-01

    ... SAN CARLOS APACHE TRIBE DEVELOPMENT TRUST FUND AND SAN CARLOS APACHE TRIBE LEASE FUND Introduction... Tribe Water Settlement Act (the Act), Public Law 102-575, 106 Stat. 4748, that requires regulations to administer the Trust Fund, and the Lease Fund established by the Act. ...

  18. 78 FR 5197 - Notice of Intent To Repatriate a Cultural Item: Department of the Interior, Bureau of Land...

    Science.gov (United States)

    2013-01-24

    ... this notice. History and Description of the Cultural Items The one cultural item is a Dilzini Gaan... Jicarilla Apache Nation, New Mexico; Mescalero Apache Tribe of the Mescalero Reservation, New Mexico; San... Nation, New Mexico; Mescalero Apache Tribe of the Mescalero Reservation, New Mexico; San Carlos Apache...

  19. A Modified APACHE II Score for Predicting Mortality of Variceal ...

    African Journals Online (AJOL)

    Conclusion: Modified APACHE II score is effective in predicting outcome of patients with variceal bleeding. A score of ≥15 points and a long ICU stay are associated with high mortality. Keywords: liver cirrhosis, periportal fibrosis, portal hypertension, schistosomiasis. Sudan Journal of Medical Sciences Vol. 2 (2) 2007: pp. 105- ...

  20. Uncomfortable Experience: Lessons Lost in the Apache War

    Science.gov (United States)

    2015-03-01

    the Apache War gripped the focus of American and Mexican citizens throughout Arizona, New Mexico, Chihuahua, and Sonora for a period greater than ... Arizona and portions of New Mexico, and Northern Sonora and Chihuahua.5 Although confusion exists as to their true subdivisions, the Chokonen led by ... contributed directly to the Victorio War, the Loco and Geronimo campaigns, and the Nana and Chatto-Chihuahua raids that followed.38 Once again, failure to

  1. Fallugia paradoxa (D. Don) Endl. ex Torr.: Apache-plume

    Science.gov (United States)

    Susan E. Meyer

    2008-01-01

    The genus Fallugia contains a single species - Apache-plume, F. paradoxa (D. Don) Endl. ex Torr. - found throughout the southwestern United States and northern Mexico. It occurs mostly on coarse soils on benches and especially along washes and canyons in both warm and cool desert shrub communities and up into the pinyon-juniper vegetation type. It is a sprawling, much-...

  2. Are cicadas (Diceroprocta apache) both a "keystone" and a "critical-link" species in lower Colorado River riparian communities?

    Science.gov (United States)

    Andersen, Douglas C.

    1994-01-01

    Apache cicada (Homoptera: Cicadidae: Diceroprocta apache Davis) densities were estimated to be 10 individuals/m2 within a closed-canopy stand of Fremont cottonwood (Populus fremontii) and Goodding willow (Salix gooddingii) in a revegetated site adjacent to the Colorado River near Parker, Arizona. Coupled with data drawn from the literature, I estimate that up to 1.3 cm (13 l/m2) of water may be added to the upper soil layers annually through the feeding activities of cicada nymphs. This is equivalent to 12% of the annual precipitation received in the study area. Apache cicadas may have significant effects on ecosystem functioning via effects on water transport and thus act as a critical-link species in this southwest desert riverine ecosystem. Cicadas emerged later within the cottonwood-willow stand than in relatively open saltcedar-mesquite stands; this difference in temporal dynamics would affect their availability to several insectivorous bird species and may help explain the birds' recent declines. Resource managers in this region should be sensitive to the multiple and strong effects that Apache cicadas may have on ecosystem structure and functioning.

  3. 76 FR 14063 - Notice of Inventory Completion: University of Colorado Museum, Boulder, CO

    Science.gov (United States)

    2011-03-15

    ... Pueblo of Acoma, New Mexico. History and Description of the Remains In 1962, human remains representing a... Mescalero Reservation, New Mexico; Pueblo of Acoma, New Mexico; Pueblo of Laguna, New Mexico; Pueblo of Zia, New Mexico; San Carlos Apache of the San Carlos Reservation, Arizona; White Mountain Apache Tribe of...

  4. Kelayakan Raspberry Pi sebagai Web Server: Perbandingan Kinerja Nginx, Apache, dan Lighttpd pada Platform Raspberry Pi

    Directory of Open Access Journals (Sweden)

    Rahmad Dawood

    2014-04-01

    Raspberry Pi is a small-sized computer, but it can function like an ordinary computer. Because it can function like a regular PC it is also possible to run a web server application on the Raspberry Pi. This paper will report results from testing the feasibility and performance of running a web server on the Raspberry Pi. The test was conducted on the current top three most popular web servers, which are: Apache, Nginx, and Lighttpd. The parameters used to evaluate the feasibility and performance of these web servers were: maximum request and reply time. The results from the test showed that it is feasible to run all three web servers on the Raspberry Pi but Nginx gave the best performance followed by Lighttpd and Apache. Keywords: Raspberry Pi, web server, Apache, Lighttpd, Nginx, web server performance
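    The maximum-request and reply-time measurements described above can be approximated with a small HTTP timing loop; tools such as ApacheBench are normally used for this, but the sketch below shows the idea in Python with the requests library (server URLs and request count are placeholders):

```python
import time
import statistics
import requests

def measure_reply_times(url, n=100):
    """Issue n GET requests and return mean and 95th-percentile reply time in ms."""
    times_ms = []
    for _ in range(n):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        times_ms.append((time.perf_counter() - start) * 1000.0)
    times_ms.sort()
    return statistics.mean(times_ms), times_ms[int(0.95 * len(times_ms)) - 1]

# Placeholder endpoints for the three web servers under test.
for server in ("http://raspberrypi.local:8080",   # Nginx (hypothetical port)
               "http://raspberrypi.local:8081",   # Lighttpd (hypothetical port)
               "http://raspberrypi.local:8082"):  # Apache (hypothetical port)
    mean_ms, p95_ms = measure_reply_times(server)
    print(f"{server}: mean={mean_ms:.1f} ms  p95={p95_ms:.1f} ms")
```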

  5. Lutzomyia (Helcocyrtomyia) Apache Young and Perkins (Diptera: Psychodidae) feeds on reptiles

    Science.gov (United States)

    Phlebotomine sand flies are vectors of bacteria, parasites, and viruses. In the western USA a sand fly, Lutzomyia apache Young and Perkins, was initially associated with epizootics of vesicular stomatitis virus (VSV), because sand flies were trapped at sites of an outbreak. Additional studies indica...

  6. Seguridad en la configuración del servidor web Apache

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Gómez Montoya

    2013-07-01

    Apache is the web server with the largest presence in the world market. Although its configuration is relatively simple, strengthening its security involves understanding and applying a set of general rules that are known, accepted and available. On the other hand, despite appearing to be a solved issue, security in HTTP servers is a growing problem, and not all companies take it seriously. This article identifies and verifies a set of good information security practices applied to the configuration of Apache. To achieve the objectives, and in order to guarantee an adequate process, a methodology based on the Deming quality cycle was chosen, comprising four phases: plan, do, check and act; its application guided the development of the project. This article consists of five sections: Introduction, Frame of reference, Methodology, Results and discussion, and Conclusions.

  7. The customization of APACHE II for patients receiving orthotopic liver transplants

    Science.gov (United States)

    Moreno, Rui

    2002-01-01

    General outcome prediction models developed for use with large, multicenter databases of critically ill patients may not correctly estimate mortality if applied to a particular group of patients that was under-represented in the original database. The development of new diagnostic weights has been proposed as a method of adapting the general model – the Acute Physiology and Chronic Health Evaluation (APACHE) II in this case – to a new group of patients. Such customization must be empirically tested, because the original model cannot contain an appropriate set of predictive variables for the particular group. In this issue of Critical Care, Arabi and co-workers present the results of the validation of a modified model of the APACHE II system for patients receiving orthotopic liver transplants. The use of a highly heterogeneous database for which not all important variables were taken into account and of a sample too small to use the Hosmer–Lemeshow goodness-of-fit test appropriately makes their conclusions uncertain. PMID:12133174

  8. Beginning PHP, Apache, MySQL web development

    CERN Document Server

    Glass, Michael K; Naramore, Elizabeth; Mailer, Gary; Stolz, Jeremy; Gerner, Jason

    2004-01-01

    An ideal introduction to the entire process of setting up a Web site using PHP (a scripting language), MySQL (a database management system), and Apache (a Web server). * Programmers will be up and running in no time, whether they're using Linux or Windows servers. * Shows readers step by step how to create several Web sites that share common themes, enabling readers to use these examples in real-world projects. * Invaluable reading for even the experienced programmer whose current site has outgrown the traditional static structure and who is looking for a way to upgrade to a more efficient, user-f

  9. Accuracy and Predictability of PANC-3 Scoring System over APACHE II in Acute Pancreatitis: A Prospective Study.

    Science.gov (United States)

    Rathnakar, Surag Kajoor; Vishnu, Vikram Hubbanageri; Muniyappa, Shridhar; Prasath, Arun

    2017-02-01

    Acute Pancreatitis (AP) is one of the common conditions encountered in the emergency room. The course of the disease ranges from a mild form to a severe acute form. Most episodes are mild and subside spontaneously within 3 to 5 days. In contrast, in Severe Acute Pancreatitis (SAP), which occurs in around 15-20% of all cases, mortality can range from 10% to 85% across various centres and countries. In such a situation an indicator is needed that can predict the outcome of an attack, as severe or mild, as early as possible, and such an indicator should be sensitive and specific enough to be relied upon. PANC-3 is such a scoring system for predicting the outcome of an attack of AP. To assess the accuracy and predictability of the PANC-3 scoring system over APACHE II in predicting severity in an attack of AP. This prospective study was conducted on 82 patients admitted with the diagnosis of pancreatitis. Investigations to evaluate PANC-3 and APACHE II were done on all the patients and the PANC-3 and APACHE II scores were calculated. The PANC-3 score has a sensitivity of 82.6% and specificity of 77.9%; the test had a Positive Predictive Value (PPV) of 0.59 and Negative Predictive Value (NPV) of 0.92. Sensitivity of APACHE II in predicting SAP was 91.3% and specificity was 96.6%, with a PPV of 0.91 and an NPV of 0.96. Our study shows that PANC-3 can be used to predict the severity of pancreatitis as efficiently as APACHE II. The interpretation of PANC-3 does not need expertise and it can be applied at the time of admission, which is an advantage when compared to classical scoring systems.
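    The accuracy figures quoted (sensitivity, specificity, PPV, NPV) follow directly from a 2x2 contingency table. A minimal sketch, with counts chosen to be consistent with the PANC-3 figures reported above for the 82-patient cohort (they are reconstructed for illustration, not taken from the paper):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 contingency table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Illustrative counts: 23 severe and 59 mild attacks among 82 patients.
sens, spec, ppv, npv = diagnostic_accuracy(tp=19, fp=13, fn=4, tn=46)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  PPV={ppv:.2f}  NPV={npv:.2f}")
```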

  10. Integration of event streaming and microservices with Apache Kafka

    OpenAIRE

    Kljun, Matija

    2017-01-01

    Over the last decade, the microservice architecture has become a standard for big and successful internet companies, like Netflix, Amazon and LinkedIn. The importance of stream processing, aggregation and exchange of data is growing, as it allows companies to compete better and move faster. In this diploma thesis, we have analyzed the interactions between microservices and described the streaming platform and ordinary message queues. We have described the Apache Kafka platform and how...

  11. 76 FR 9603 - Notice of Inventory Completion: Denver Museum of Nature & Science, Denver, CO

    Science.gov (United States)

    2011-02-18

    ... Zuni Tribe of the Zuni Reservation, New Mexico (hereinafter referred to as ``The Tribes''). History and... Apache Tribe of Oklahoma; Hopi Tribe of Arizona; Jicarilla Apache Nation, New Mexico; Kiowa Indian Tribe of Oklahoma; Mescalero Apache Tribe of the Mescalero Reservation, New Mexico; Navajo Nation, Arizona...

  12. Body composition assessment in American Indian children.

    Science.gov (United States)

    Lohman, T G; Caballero, B; Himes, J H; Hunsberger, S; Reid, R; Stewart, D; Skipper, B

    1999-04-01

    Although the high prevalence of obesity in American Indian children was documented in several surveys that used body mass index (BMI, in kg/m2) as the measure, there is limited information on more direct measurements of body adiposity in this population. The present study evaluated body composition in 81 boys (aged 11.2+/-0.6 y) and 75 girls (aged 11.0+/-0.4 y) attending public schools in 6 American Indian communities: White Mountain Apache, Pima, and Tohono O'Odham in Arizona; Oglala Lakota and Sicangu Lakota in South Dakota; and Navajo in New Mexico and Arizona. These communities were participating in the feasibility phase of Pathways, a multicenter intervention for the primary prevention of obesity. Body composition was estimated by using a combination of skinfold thickness and bioelectrical impedance measurements, with a prediction equation validated previously in this same population. The mean BMI was 20.4+/-4.2 for boys and 21.1+/-5.0 for girls. The sum of the triceps plus subscapular skinfold thicknesses averaged 28.6+/-7.0 mm in boys and 34.0+/-8.0 mm in girls. Mean percentage body fat was 35.6+/-6.9 in boys and 38.8+/-8.5 in girls. The results from this study confirmed the high prevalence of excess body fatness in school-age American Indian children and permitted the development of procedures, training, and quality control for measurement of the main outcome variable in the full-scale Pathways study.

  13. Protecting Oak Flat: Narratives Of Survivance As Observed Through Digital Activism

    Directory of Open Access Journals (Sweden)

    Nicholet Deschine Parkhurst

    2017-07-01

    Full Text Available American Indians are increasingly using social media/social network platforms as a tool to influence policy through social change. The activist group Apache Stronghold represents a case of American Indians utilising social media tools to protect Oak Flat and influence federal Indian policy. Oak Flat is sacred Apache land located in Superior, Arizona. United States legislators transferred Oak Flat to the mining company Resolution Copper as part of the omnibus National Defense Authorization Act of 2015. Qualitative analysis of social media content and advocacy tactics – specifically through use of timeline and digital ethnography – of Apache Stronghold from 2015-2016 reveal the interrelated nature of on-the-ground efforts, online efforts, solidarity efforts, and legislative support efforts. In sum, these efforts express narratives of survivance, healing, and a future orientation, as a unique dimension of social change.

  14. Journal of Earth System Science | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 122; Issue 5 .... Atmospheric correction of Earth-observation remote sensing images by Monte Carlo method ... Decision tree approach for classification of remotely sensed satellite data ... Analysis of carbon dioxide, water vapour and energy fluxes over an Indian ...

  15. CMS Analysis and Data Reduction with Apache Spark

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Oliver [Fermilab; Canali, Luca [CERN; Cremer, Illia [Magnetic Corp., Waltham; Cremonesi, Matteo [Fermilab; Elmer, Peter [Princeton U.; Fisk, Ian [Flatiron Inst., New York; Girone, Maria [CERN; Jayatilaka, Bo [Fermilab; Kowalkowski, Jim [Fermilab; Khristenko, Viktor [CERN; Motesnitsalis, Evangelos [CERN; Pivarski, Jim [Princeton U.; Sehrish, Saba [Fermilab; Surdy, Kacper [CERN; Svyatkovskiy, Alexey [Princeton U.

    2017-10-31

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping in reducing the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
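    A minimal sketch of the "reduce official data to analysis ntuples" idea in PySpark; the paths, column names and selection cuts below are placeholders, not the actual CMS data model:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cms-reduction-sketch").getOrCreate()

# Placeholder input: centrally produced events already converted to a columnar format.
events = spark.read.parquet("hdfs:///cms/official/events.parquet")

# Reduction step: keep only the events and columns one analysis needs (illustrative cuts).
ntuple = (events
          .filter((F.col("met") > 200) & (F.col("n_jets") >= 2))
          .select("run", "lumi", "event", "met", "jet_pt", "jet_eta"))

ntuple.write.mode("overwrite").parquet("hdfs:///user/analysis/ntuple.parquet")
```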

  16. Tribal motor vehicle injury prevention programs for reducing disparities in motor vehicle-related injuries.

    Science.gov (United States)

    West, Bethany A; Naumann, Rebecca B

    2014-04-18

    A previous analysis of National Vital Statistics System data for 2003-2007 that examined disparities in rates of motor vehicle-related death by race/ethnicity and sex found that death rates for American Indians/Alaska Natives were two to four times the rates of other races/ethnicities. To address the disparity in motor vehicle-related injuries and deaths among American Indians/Alaska Natives, CDC funded four American Indian tribes during 2004-2009 to tailor, implement, and evaluate evidence-based road safety interventions. During the implementation of these four motor vehicle-related injury prevention pilot programs, seat belt and child safety seat use increased and alcohol-impaired driving decreased. Four American Indian/Alaska Native tribal communities-the Tohono O'odham Nation, the Ho-Chunk Nation, the White Mountain Apache Tribe, and the San Carlos Apache Tribe-implemented evidence-based road safety interventions to reduce motor vehicle-related injuries and deaths. Each community selected interventions from the Guide to Community Preventive Services and implemented them during 2004-2009. Furthermore, each community took a multifaceted approach by incorporating several strategies, such as school and community education programs, media campaigns, and collaborations with law enforcement officers into their programs. Police data and direct observational surveys were the main data sources used to assess results of the programs. Results included increased use of seat belts and child safety seats, increased enforcement of alcohol-impaired driving laws, and decreased motor vehicle crashes involving injuries or deaths. CDC's Office of Minority Health and Health Equity selected the intervention analysis and discussion as an example of a program that might be effective for reducing motor vehicle-related injury disparities in the United States. The Guide to Community Preventive Services recognizes these selected interventions as effective; this report examines the

  17. LHCbDIRAC as Apache Mesos microservices

    CERN Multimedia

    Couturier, Ben

    2016-01-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called "frameworks". The Marathon framework is suitable for long-running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy allows the running containers to be exposed to the outside world while hiding their dynamic placements. Such an arc...

  18. Spatial correlations of Diceroprocta apache and its host plants: Evidence for a negative impact from Tamarix invasion

    Science.gov (United States)

    Ellingson, A.R.; Andersen, D.C.

    2002-01-01

    1. The hypothesis that the habitat-scale spatial distribution of the Apache cicada Diceroprocta apache Davis is unaffected by the presence of the invasive exotic saltcedar Tamarix ramosissima was tested using data from 205 1-m2 quadrats placed within the flood-plain of the Bill Williams River, Arizona, U.S.A. Spatial dependencies within and between cicada density and habitat variables were estimated using Moran's I and its bivariate analogue to discern patterns and associations at spatial scales from 1 to 30 m.2. Apache cicadas were spatially aggregated in high-density clusters averaging 3 m in diameter. A positive association between cicada density, estimated by exuvial density, and the per cent canopy cover of a native tree, Goodding's willow Salix gooddingii, was detected in a non-spatial correlation analysis. No non-spatial association between cicada density and saltcedar canopy cover was detected.3. Tests for spatial cross-correlation using the bivariate IYZ indicated the presence of a broad-scale negative association between cicada density and saltcedar canopy cover. This result suggests that large continuous stands of saltcedar are associated with reduced cicada density. In contrast, positive associations detected at spatial scales larger than individual quadrats suggested a spill-over of high cicada density from areas featuring Goodding's willow canopy into surrounding saltcedar monoculture.4. Taken together and considered in light of the Apache cicada's polyphagous habits, the observed spatial patterns suggest that broad-scale factors such as canopy heterogeneity affect cicada habitat use more than host plant selection. This has implications for management of lower Colorado River riparian woodlands to promote cicada presence and density through maintenance or creation of stands of native trees as well as manipulation of the characteristically dense and homogeneous saltcedar canopies.
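    Global Moran's I, the spatial autocorrelation statistic used in the study, can be computed directly from a vector of quadrat densities and a spatial weights matrix. A minimal sketch with synthetic data and binary contiguity weights (numpy only; not the authors' data):

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x under spatial weights matrix w."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    z = x - x.mean()
    numerator = (w * np.outer(z, z)).sum()
    denominator = (z ** 2).sum()
    return (len(x) / w.sum()) * (numerator / denominator)

# Synthetic cicada exuvial densities along a short transect of quadrats,
# with neighbours defined as adjacent quadrats (binary contiguity weights).
density = np.array([0, 1, 9, 12, 10, 2, 0, 0, 8, 11])
n = len(density)
weights = np.zeros((n, n))
for i in range(n - 1):
    weights[i, i + 1] = weights[i + 1, i] = 1.0

print(f"Moran's I = {morans_i(density, weights):.3f}")
```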

  19. Efficient Streaming Mass Spatio-Temporal Vehicle Data Access in Urban Sensor Networks Based on Apache Storm.

    Science.gov (United States)

    Zhou, Lianjie; Chen, Nengcheng; Chen, Zeqiang

    2017-04-10

    The efficient data access of streaming vehicle data is the foundation of analyzing, using and mining vehicle data in smart cities, which is an approach to understand traffic environments. However, the number of vehicles in urban cities has grown rapidly, reaching hundreds of thousands in number. Accessing the mass streaming data of vehicles is hard and takes a long time due to limited computation capability and backward modes. We propose an efficient streaming spatio-temporal data access based on Apache Storm (ESDAS) to achieve real-time streaming data access and data cleaning. As a popular streaming data processing tool, Apache Storm can be applied to streaming mass data access and real time data cleaning. By designing the Spout/bolt workflow of topology in ESDAS and by developing the speeding bolt and other bolts, Apache Storm can achieve the prospective aim. In our experiments, Taiyuan BeiDou bus location data is selected as the mass spatio-temporal data source. In the experiments, the data access results with different bolts are shown in map form, and the filtered buses' aggregation forms are different. In terms of performance evaluation, the consumption time in ESDAS for ten thousand records per second for a speeding bolt is approximately 300 milliseconds, and that for MongoDB is approximately 1300 milliseconds. The efficiency of ESDAS is approximately three times higher than that of MongoDB.
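    The "speeding bolt" described above is essentially a per-record filter applied to the incoming vehicle stream. The sketch below shows that filtering logic in plain Python; it illustrates the rule only and does not use the Storm bolt API (field names and the speed threshold are assumptions):

```python
SPEED_LIMIT_KMH = 60.0  # assumed threshold for flagging a "speeding" report

def is_speeding(record):
    """record: one vehicle position report with an instantaneous speed in km/h."""
    return record.get("speed_kmh", 0.0) > SPEED_LIMIT_KMH

# Two illustrative BeiDou-style bus reports (synthetic values).
stream = [
    {"bus_id": "A12", "ts": "2017-04-01T08:00:00", "lon": 112.55, "lat": 37.87, "speed_kmh": 42.0},
    {"bus_id": "B07", "ts": "2017-04-01T08:00:01", "lon": 112.56, "lat": 37.88, "speed_kmh": 71.5},
]

for report in filter(is_speeding, stream):
    print("speeding:", report["bus_id"], report["speed_kmh"], "km/h")
```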

  20. Efficient Streaming Mass Spatio-Temporal Vehicle Data Access in Urban Sensor Networks Based on Apache Storm

    Directory of Open Access Journals (Sweden)

    Lianjie Zhou

    2017-04-01

    Full Text Available The efficient data access of streaming vehicle data is the foundation of analyzing, using and mining vehicle data in smart cities, which is an approach to understand traffic environments. However, the number of vehicles in urban cities has grown rapidly, reaching hundreds of thousands in number. Accessing the mass streaming data of vehicles is hard and takes a long time due to limited computation capability and backward modes. We propose an efficient streaming spatio-temporal data access based on Apache Storm (ESDAS to achieve real-time streaming data access and data cleaning. As a popular streaming data processing tool, Apache Storm can be applied to streaming mass data access and real time data cleaning. By designing the Spout/bolt workflow of topology in ESDAS and by developing the speeding bolt and other bolts, Apache Storm can achieve the prospective aim. In our experiments, Taiyuan BeiDou bus location data is selected as the mass spatio-temporal data source. In the experiments, the data access results with different bolts are shown in map form, and the filtered buses’ aggregation forms are different. In terms of performance evaluation, the consumption time in ESDAS for ten thousand records per second for a speeding bolt is approximately 300 milliseconds, and that for MongoDB is approximately 1300 milliseconds. The efficiency of ESDAS is approximately three times higher than that of MongoDB.

  1. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    International Nuclear Information System (INIS)

    Lehrack, S; Duckeck, G; Ebke, J

    2014-01-01

    The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives in distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data fast and reliably with MapReduce. In the evaluation of the HDFS, read/write data rates from the local Hadoop cluster have been measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses have been used and event rates have been compared to PROOF.

  2. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    Science.gov (United States)

    Lehrack, S.; Duckeck, G.; Ebke, J.

    2014-06-01

    The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives in distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data fast and reliably with MapReduce. In the evaluation of the HDFS, read/write data rates from the local Hadoop cluster have been measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses have been used and event rates have been compared to PROOF.

  3. 77 FR 18997 - Rim Lakes Forest Restoration Project; Apache-Sitgreaves National Forest, Black Mesa Ranger...

    Science.gov (United States)

    2012-03-29

    ... DEPARTMENT OF AGRICULTURE Forest Service Rim Lakes Forest Restoration Project; Apache-Sitgreaves National Forest, Black Mesa Ranger District, Coconino County, AZ AGENCY: Forest Service, USDA. ACTION: Notice of intent to prepare an environmental impact statement. SUMMARY: The U.S. Forest Service (FS) will...

  4. 75 FR 14419 - Camp Tatiyee Land Exchange on the Lakeside Ranger District of the Apache-Sitgreaves National...

    Science.gov (United States)

    2010-03-25

    ... Ranger, Lakeside Ranger District, Apache-Sitgreaves National Forests, c/o TEC Inc., 514 Via de la Valle... to other papers serving areas affected by this proposal: Tucson Citizen, Sierra Vista Herald, Nogales...

  5. Overview of the SDSS-IV MaNGA Survey: Mapping nearby Galaxies at Apache Point Observatory

    NARCIS (Netherlands)

    Bundy, Kevin; Bershady, Matthew A.; Law, David R.; Yan, Renbin; Drory, Niv; MacDonald, Nicholas; Wake, David A.; Cherinka, Brian; Sánchez-Gallego, José R.; Weijmans, Anne-Marie; Thomas, Daniel; Tremonti, Christy; Masters, Karen; Coccato, Lodovico; Diamond-Stanic, Aleksandar M.; Aragón-Salamanca, Alfonso; Avila-Reese, Vladimir; Badenes, Carles; Falcón-Barroso, Jésus; Belfiore, Francesco; Bizyaev, Dmitry; Blanc, Guillermo A.; Bland-Hawthorn, Joss; Blanton, Michael R.; Brownstein, Joel R.; Byler, Nell; Cappellari, Michele; Conroy, Charlie; Dutton, Aaron A.; Emsellem, Eric; Etherington, James; Frinchaboy, Peter M.; Fu, Hai; Gunn, James E.; Harding, Paul; Johnston, Evelyn J.; Kauffmann, Guinevere; Kinemuchi, Karen; Klaene, Mark A.; Knapen, Johan H.; Leauthaud, Alexie; Li, Cheng; Lin, Lihwai; Maiolino, Roberto; Malanushenko, Viktor; Malanushenko, Elena; Mao, Shude; Maraston, Claudia; McDermid, Richard M.; Merrifield, Michael R.; Nichol, Robert C.; Oravetz, Daniel; Pan, Kaike; Parejko, John K.; Sanchez, Sebastian F.; Schlegel, David; Simmons, Audrey; Steele, Oliver; Steinmetz, Matthias; Thanjavur, Karun; Thompson, Benjamin A.; Tinker, Jeremy L.; van den Bosch, Remco C. E.; Westfall, Kyle B.; Wilkinson, David; Wright, Shelley; Xiao, Ting; Zhang, Kai

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic

  6. The APACHE survey hardware and software design: Tools for an automatic search of small-size transiting exoplanets

    Directory of Open Access Journals (Sweden)

    Lattanzi M.G.

    2013-04-01

    Small-size ground-based telescopes can effectively be used to look for transiting rocky planets around nearby low-mass M stars using the photometric transit method, as recently demonstrated for example by the MEarth project. Since 2008, at the Astronomical Observatory of the Autonomous Region of Aosta Valley (OAVdA), we have been preparing for the long-term photometric survey APACHE, aimed at finding transiting small-size planets around thousands of nearby early and mid-M dwarfs. APACHE (A PAthway toward the Characterization of Habitable Earths) is designed to use an array of five dedicated and identical 40-cm Ritchey-Chretien telescopes, and its observations started at the beginning of summer 2012. The main characteristics of the survey's final setup and the preliminary results from the first weeks of observations will be discussed.

  7. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
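    A minimal sketch of the ingestion pattern described, i.e. distributing reads of binary point cloud files across Spark workers with a map function; it assumes PySpark and the laspy library, and the HDFS path is a placeholder:

```python
import io
import laspy
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pointcloud-ingest-sketch").getOrCreate()
sc = spark.sparkContext

def las_to_points(path_and_bytes):
    """Decode one LAS file, delivered as raw bytes, into (x, y, z) tuples."""
    _, raw = path_and_bytes
    las = laspy.read(io.BytesIO(raw))
    return zip(las.x, las.y, las.z)

# binaryFiles yields (path, bytes) pairs, one per file, spread over the cluster;
# flatMap applies the single-CPU file reader on each worker.
points = sc.binaryFiles("hdfs:///data/pointclouds/*.las").flatMap(las_to_points)

print("ingested points:", points.count())
```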

  8. Western Indian Ocean Journal of Marine Science - Vol 11, No 1 (2012)

    African Journals Online (AJOL)

    Using an ecosystem model to evaluate fisheries management options to mitigate climate change impacts in western Indian Ocean coral reefs. Carlos Ruiz Sebastián, Tim R. McClanahan, 77-86 ...

  9. Validation of the LOD score compared with APACHE II score in prediction of the hospital outcome in critically ill patients.

    Science.gov (United States)

    Khwannimit, Bodin

    2008-01-01

    The Logistic Organ Dysfunction score (LOD) is an organ dysfunction score that can predict hospital mortality. The aim of this study was to validate the performance of the LOD score compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score in a mixed intensive care unit (ICU) at a tertiary referral university hospital in Thailand. The data were collected prospectively on consecutive ICU admissions over a 24 month period from July 1, 2004 until June 30, 2006. Discrimination was evaluated by the area under the receiver operating characteristic curve (AUROC). The calibration was assessed by the Hosmer-Lemeshow goodness-of-fit H statistic. The overall fit of the model was evaluated by the Brier score. Overall, 1,429 patients were enrolled during the study period. The mortality in the ICU was 20.9% and in the hospital was 27.9%. The median ICU and hospital lengths of stay were 3 and 18 days, respectively, for all patients. Both models showed excellent discrimination. The AUROC for the LOD and APACHE II were 0.860 [95% confidence interval (CI) = 0.838-0.882] and 0.898 (95% CI = 0.879-0.917), respectively. The LOD score had perfect calibration with the Hosmer-Lemeshow goodness-of-fit H chi-square = 10 (p = 0.44). However, the APACHE II had poor calibration with the Hosmer-Lemeshow goodness-of-fit H chi-square = 75.69 (p < 0.001). The Brier scores, reflecting overall fit, were 0.123 (95% CI = 0.107-0.141) and 0.114 (0.098-0.132) for the LOD and APACHE II, respectively. Thus, the LOD score was found to be accurate for predicting hospital mortality for general critically ill patients in Thailand.

  10. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated using a software simulator. In this report the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)
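    As a toy illustration of the vectorization idea (processing many Monte Carlo histories as array operations rather than one history at a time), the sketch below contrasts a scalar loop with a vectorized numpy version for a simple slab-transmission problem; it is illustrative only and unrelated to the actual KENO/MORSE/MCNP/VIM kernels:

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA_T = 0.5      # illustrative total cross-section (1/cm)
THICKNESS = 3.0    # slab thickness (cm)

def transmitted_fraction_loop(n):
    """Scalar style: sample one history at a time."""
    transmitted = 0
    for _ in range(n):
        if -np.log(rng.random()) / SIGMA_T > THICKNESS:
            transmitted += 1
    return transmitted / n

def transmitted_fraction_vectorized(n):
    """Vectorized: sample all free paths in one array operation."""
    free_paths = rng.exponential(1.0 / SIGMA_T, size=n)
    return np.mean(free_paths > THICKNESS)

print(transmitted_fraction_loop(10_000))
print(transmitted_fraction_vectorized(1_000_000))  # both approach exp(-SIGMA_T * THICKNESS) ~ 0.223
```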

  11. SIDELOADING – INGESTION OF LARGE POINT CLOUDS INTO THE APACHE SPARK BIG DATA ENGINE

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2016-06-01

    Full Text Available In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.

  12. Validation of APACHE II scoring system at 24 hours after admission as a prognostic tool in urosepsis: A prospective observational study

    Directory of Open Access Journals (Sweden)

    Sundaramoorthy VijayGanapathy

    2017-11-01

    Purpose: Urosepsis implies clinically evident severe infection of urinary tract with features of systemic inflammatory response syndrome (SIRS). We validate the role of a single Acute Physiology and Chronic Health Evaluation II (APACHE II) score at 24 hours after admission in predicting mortality in urosepsis. Materials and Methods: A prospective observational study was done in 178 patients admitted with urosepsis in the Department of Urology, in a tertiary care institute from January 2015 to August 2016. Patients >18 years diagnosed as urosepsis using SIRS criteria with positive urine or blood culture for bacteria were included. At 24 hours after admission to intensive care unit, APACHE II score was calculated using 12 physiological variables, age and chronic health. Results: Mean±standard deviation (SD) APACHE II score was 26.03±7.03. It was 24.31±6.48 in survivors and 32.39±5.09 in those expired (p<0.001). Among patients undergoing surgery, mean±SD score was higher (30.74±4.85) than among survivors (24.30±6.54) (p<0.001). Receiver operating characteristic (ROC) analysis revealed area under curve (AUC) of 0.825 with cutoff 25.5 being 94.7% sensitive and 56.4% specific to predict mortality. Mean±SD score in those undergoing surgery was 25.22±6.70 and was lesser than those who did not undergo surgery (28.44±7.49) (p=0.007). ROC analysis revealed AUC of 0.760 with cutoff 25.5 being 94.7% sensitive and 45.6% specific to predict mortality even after surgery. Conclusions: A single APACHE II score assessed at 24 hours after admission was able to predict morbidity, mortality, need for surgical intervention, length of hospitalization, treatment success and outcome in urosepsis patients.

  13. Validation of APACHE II scoring system at 24 hours after admission as a prognostic tool in urosepsis: A prospective observational study.

    Science.gov (United States)

    VijayGanapathy, Sundaramoorthy; Karthikeyan, VIlvapathy Senguttuvan; Sreenivas, Jayaram; Mallya, Ashwin; Keshavamurthy, Ramaiah

    2017-11-01

    Urosepsis implies clinically evident severe infection of urinary tract with features of systemic inflammatory response syndrome (SIRS). We validate the role of a single Acute Physiology and Chronic Health Evaluation II (APACHE II) score at 24 hours after admission in predicting mortality in urosepsis. A prospective observational study was done in 178 patients admitted with urosepsis in the Department of Urology, in a tertiary care institute from January 2015 to August 2016. Patients >18 years diagnosed as urosepsis using SIRS criteria with positive urine or blood culture for bacteria were included. At 24 hours after admission to intensive care unit, APACHE II score was calculated using 12 physiological variables, age and chronic health. Mean±standard deviation (SD) APACHE II score was 26.03±7.03. It was 24.31±6.48 in survivors and 32.39±5.09 in those expired (p<0.001). Among patients undergoing surgery, mean±SD score was higher (30.74±4.85) than among survivors (24.30±6.54) (p<0.001). Receiver operating characteristic (ROC) analysis revealed area under curve (AUC) of 0.825 with cutoff 25.5 being 94.7% sensitive and 56.4% specific to predict mortality. Mean±SD score in those undergoing surgery was 25.22±6.70 and was lesser than those who did not undergo surgery (28.44±7.49) (p=0.007). ROC analysis revealed AUC of 0.760 with cutoff 25.5 being 94.7% sensitive and 45.6% specific to predict mortality even after surgery. A single APACHE II score assessed at 24 hours after admission was able to predict morbidity, mortality, need for surgical intervention, length of hospitalization, treatment success and outcome in urosepsis patients.

  14. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.
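
    As a small illustration of the two sampling approaches the book covers, the sketch below estimates pi/4 with pseudo-random points and with a Halton low-discrepancy sequence; it assumes SciPy's stats.qmc module is available.

        # Plain Monte Carlo versus quasi-Monte Carlo on a toy integral:
        # the area of the quarter disc inside the unit square (pi/4).
        import numpy as np
        from scipy.stats import qmc

        def quarter_disc_fraction(points):
            x, y = points[:, 0], points[:, 1]
            return np.mean(x**2 + y**2 <= 1.0)

        n = 4096
        mc_points = np.random.default_rng(0).random((n, 2))   # pseudo-random sample
        qmc_points = qmc.Halton(d=2, seed=0).random(n)        # low-discrepancy sample

        print("exact       :", np.pi / 4)
        print("Monte Carlo :", quarter_disc_fraction(mc_points))
        print("quasi-MC    :", quarter_disc_fraction(qmc_points))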

  15. Redskins in Bluecoats: A Strategic and Cultural Analysis of General George Crooks Use of Apache Scouts in the Second Apache Campaign, 1882-1886

    Science.gov (United States)

    2010-03-31

    Figure 7: Captain John Gregory Bourke. John Gregory Bourke (see Figure 7) served with Crook "for more than 15 years ... as a member of his military staff." Following his retirement, Bourke ... Bureau of Indian Affairs. John Bourke said of Crook in an obituary, "The story of his administration of Indian Affairs in that, as in every other ..."

  16. 78 FR 59953 - Notice of Inventory Completion: U.S. Department of the Interior, National Park Service...

    Science.gov (United States)

    2013-09-30

    ... Reservation, Arizona; San Carlos Apache Tribe of the San Carlos Reservation, Arizona; Tohono O'odham Nation of... notice has occurred. ADDRESSES: Bob Love, Superintendent, Tumacacori National Historical Park, P.O. Box..., Superintendent, Tumacacori National Historical Park, P.O. Box 8067, Tumacacori, AZ 85640, telephone (520) 398...

  17. 78 FR 59967 - Notice of Inventory Completion: U.S. Department of the Interior, National Park Service...

    Science.gov (United States)

    2013-09-30

    ... Carlos Apache Tribe of the San Carlos Reservation, Arizona; Tohono O'odham Nation of Arizona; Tonto... correction notice has occurred. ADDRESSES: Bob Love, Superintendent, Tumacacori National Historical Park, P.O... National Historical Park, P.O. Box 8067, Tumacacori, AZ 85640, telephone (520) 398-2341 Ext. 52, email bob...

  18. Uso do escore prognóstico APACHE II e ATN-ISS em insuficiência renal aguda tratada dentro e fora da unidade de terapia intensiva

    OpenAIRE

    Fernandes, Natália Maria da Silva; Pinto, Patrícia dos Santos; Lacet, Thiago Bento de Paiva; Rodrigues, Dominique Fonseca; Bastos, Marcus Gomes; Stella, Sérgio Reinaldo; Cendoroglo Neto, Miguel

    2009-01-01

    INTRODUCTION: Acute kidney injury (AKI) continues to show high prevalence, morbidity and mortality. OBJECTIVE: To compare the use of the APACHE II prognostic score with the ATN-ISS and to determine whether the APACHE II can be used for patients with AKI outside the ICU. METHODS: Prospective cohort of 205 patients with AKI. We analyzed demographic data, pre-existing conditions, organ failure and characteristics of the AKI. The prognostic scores were calculated on the day of the nephrologist's evaluation. RESULTS: The ...

  19. LHCbDIRAC as Apache Mesos microservices

    Science.gov (United States)

    Haen, Christophe; Couturier, Benjamin

    2017-10-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called “frameworks”. The Marathon framework is suitable for long-running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy makes it possible to expose the running containers to the outside world while hiding their dynamic placement. Such an architecture brings greater flexibility in the deployment of LHCbDIRAC services, allowing for easier deployment, maintenance and scaling of services on demand (e.g. LHCbDIRAC relies on 138 services and 116 agents). Higher reliability is also easier to achieve, as clustering is part of the toolset, which allows constraints to be placed on the location of the services. This paper describes the investigations carried out to package the LHCbDIRAC and DIRAC components into Docker containers and orchestrate them using the previously described set of tools.
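
    As a rough sketch of how a long-running DIRAC-like service could be registered with Marathon, the snippet below posts an application definition to Marathon's REST API. The Marathon URL, app id and Docker image are illustrative assumptions; the Consul/HAProxy service discovery layer is not shown.

        # Sketch: deploy a long-running containerized service via Marathon's REST API.
        # Endpoint, app id and image are assumed values for illustration.
        import requests

        MARATHON = "http://marathon.example.org:8080"  # assumed endpoint

        app_definition = {
            "id": "/dirac/configuration-service",       # hypothetical service id
            "cpus": 0.5,
            "mem": 1024,
            "instances": 2,                              # Marathon keeps two copies running
            "container": {
                "type": "DOCKER",
                "docker": {"image": "example/dirac-service:latest", "network": "BRIDGE"},
            },
        }

        resp = requests.post(f"{MARATHON}/v2/apps", json=app_definition, timeout=10)
        resp.raise_for_status()
        print("deployed:", resp.json().get("id"))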

  20. Lot 4 AH-64E Apache Attack Helicopter Follow-on Operational Test and Evaluation Report

    Science.gov (United States)

    2014-12-01

    engine is tested to determine its Engine Torque Factor (ETF) rating. To meet contract specifications, a new engine must have an ETF of 1.0. The ... published AH-64E operator's manual estimates performance based on engines with an ETF of 1.0, and pilots normally plan missions anticipating the 717 ... pound shortfall in hover performance at KPP conditions. The Apache Program Manager reports that new engines are delivered with an average ETF of

  1. Developer Initiation and Social Interactions in OSS: A Case Study of the Apache Software Foundation

    Science.gov (United States)

    2014-08-01

    pp. 201–215, 2003. 2. K. Crowston, K. Wei, J. Howison, and A. Wiggins, “Free/libre open-source software development: What we know and what we do not ... Understanding the process of participating in open source communities,” in International Workshop on Emerging Trends in Free/Libre/Open Source Software ... Developer Initiation and Social Interactions in OSS: A Case Study of the Apache Software

  2. Constructing Flexible, Configurable, ETL Pipelines for the Analysis of "Big Data" with Apache OODT

    Science.gov (United States)

    Hart, A. F.; Mattmann, C. A.; Ramirez, P.; Verma, R.; Zimdars, P. A.; Park, S.; Estrada, A.; Sumarlidason, A.; Gil, Y.; Ratnakar, V.; Krum, D.; Phan, T.; Meena, A.

    2013-12-01

    A plethora of open source technologies for manipulating, transforming, querying, and visualizing 'big data' have blossomed and matured in the last few years, driven in large part by recognition of the tremendous value that can be derived by leveraging data mining and visualization techniques on large data sets. One facet of many of these tools is that input data must often be prepared into a particular format (e.g.: JSON, CSV), or loaded into a particular storage technology (e.g.: HDFS) before analysis can take place. This process, commonly known as Extract-Transform-Load, or ETL, often involves multiple well-defined steps that must be executed in a particular order, and the approach taken for a particular data set is generally sensitive to the quantity and quality of the input data, as well as the structure and complexity of the desired output. When working with very large, heterogeneous, unstructured or semi-structured data sets, automating the ETL process and monitoring its progress becomes increasingly important. Apache Object Oriented Data Technology (OODT) provides a suite of complementary data management components called the Process Control System (PCS) that can be connected together to form flexible ETL pipelines as well as browser-based user interfaces for monitoring and control of ongoing operations. The lightweight, metadata driven middleware layer can be wrapped around custom ETL workflow steps, which themselves can be implemented in any language. Once configured, it facilitates communication between workflow steps and supports execution of ETL pipelines across a distributed cluster of compute resources. As participants in a DARPA-funded effort to develop open source tools for large-scale data analysis, we utilized Apache OODT to rapidly construct custom ETL pipelines for a variety of very large data sets to prepare them for analysis and visualization applications. We feel that OODT, which is free and open source software available through the Apache
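
    The following is not Apache OODT's actual API; it is only a generic sketch of the pattern the abstract describes, a thin metadata-driven wrapper that sequences language-agnostic ETL steps and records their status.

        # Generic ETL-pipeline pattern: each step is any executable command, wrapped
        # so that its status is tracked as metadata. Step commands are hypothetical.
        import subprocess

        def run_step(name, command, metadata):
            """Run one ETL step and record its outcome in the shared metadata dict."""
            result = subprocess.run(command, shell=True)
            metadata[name] = "done" if result.returncode == 0 else "failed"
            if result.returncode != 0:
                raise RuntimeError(f"step {name!r} failed")
            return metadata

        pipeline = [
            ("extract",   "python extract_raw.py /data/raw"),        # hypothetical steps
            ("transform", "python to_csv.py /data/raw /data/csv"),
            ("load",      "hdfs dfs -put /data/csv /ingest"),
        ]

        metadata = {}
        for name, command in pipeline:
            run_step(name, command, metadata)
        print(metadata)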

  3. D-dimer as marker for microcirculatory failure: correlation with LOD and APACHE II scores.

    Science.gov (United States)

    Angstwurm, Matthias W A; Reininger, Armin J; Spannagl, Michael

    2004-01-01

    The relevance of plasma d-dimer levels as a marker for morbidity and organ dysfunction in severely ill patients is largely unknown. In a prospective study we determined d-dimer plasma levels of 800 unselected patients at admission to our intensive care unit. In 91% of the patients' samples d-dimer levels were elevated, in some patients up to several hundredfold compared to normal values. The highest mean d-dimer values were present in the patient group with thromboembolic diseases, and particularly in non-survivors of pulmonary embolism. In patients with circulatory impairment (r=0.794) and in patients with infections (r=0.487) a statistically significant correlation was present between d-dimer levels and the APACHE II score (P<0.001). The logistic organ dysfunction score (LOD, P<0.001) correlated with d-dimer levels only in patients with circulatory impairment (r=0.474). On the contrary, patients without circulatory impairment demonstrated no correlation of d-dimer levels with the APACHE II or LOD score. Taking all patients together, no correlations of d-dimer levels with single organ failure or with indicators of infection could be detected. In conclusion, d-dimer plasma levels strongly correlated with the severity of the disease and organ dysfunction in patients with circulatory impairment or infections, suggesting that elevated d-dimer levels may reflect the extent of microcirculatory failure. Thus, a therapeutic strategy to improve the microcirculation in such patients may be monitored using d-dimer plasma levels.

  4. Shear velocity structure of the laterally heterogeneous crust and uppermost mantle beneath the Indian region

    Science.gov (United States)

    Mohan, G.; Rai, S. S.; Panza, G. F.

    1997-08-01

    The shear velocity structure of the Indian lithosphere is mapped by inverting regionalized Rayleigh wave group velocities in time periods of 15-60 s. The regionalized maps are used to subdivide the Indian plate into several geologic units and determine the variation of velocity with depth in each unit. The Hedgehog Monte Carlo technique is used to obtain the shear wave velocity structure for each geologic unit, revealing distinct velocity variations in the lower crust and uppermost mantle. The Indian shield has a high-velocity (4.4-4.6 km/s) upper mantle which, however, is slower than other shields in the world. The central Indian platform comprised of Proterozoic basins and cratons is marked by a distinct low-velocity (4.0-4.2 km/s) upper mantle. Lower crustal velocities in the Indian lithosphere generally range between 3.8 and 4.0 km/s with the oceanic segments and the sedimentary basins marked by marginally higher and lower velocities, respectively. A remarkable contrast is observed in upper mantle velocities between the northern and eastern convergence fronts of the Indian plate. The South Burma region along the eastern subduction front of the Indian oceanic lithosphere shows significant velocity enhancement in the lower crust and upper mantle. High velocities (≈4.8 km/s) are also observed in the upper mantle beneath the Ninetyeast ridge in the northeastern Indian Ocean.

  5. Enhancing organization and maintenance of big data with Apache Solr in IBM WebSphere Commerce deployments

    OpenAIRE

    Grigel, Rudolf

    2015-01-01

    The main objective of this thesis was to enhance the organization and maintenance of big data with Apache Solr in IBM WebSphere Commerce deployments. This objective can be split into several subtasks: reorganization of data, fast and optimised exporting and importing, efficient update and cleanup operations. E-Commerce is a fast growing and frequently changing environment. There is a constant flow of data that is rapidly growing larger and larger every day which is becoming an ...

  6. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan [Chi-Mei Foundation Medical Center, Tainan (China)

    2009-10-15

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or an MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or an MELD score > 20 are predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population.

  7. Outcrop Analysis of the Cretaceous Mesaverde Group: Jicarilla Apache Reservation, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie; Dunbar, Robin Wright

    2001-04-24

    Field work for this project was conducted during July and April 1998, at which time fourteen measured sections were described and correlated on or adjacent to Jicarilla Apache Reservation lands. A fifteenth section, described east of the main field area, is included in this report, although its distant location precluded use in the correlations and cross sections presented herein. Ground-based photo mosaics were shot for much of the exposed Mesaverde outcrop belt and were used to assist in correlation. Outcrop gamma-ray surveys were conducted at six of the fifteen measured sections using a GAD-6 scintillometer. The raw gamma-ray data are included in this report; however, analysis of those data is part of the ongoing Phase Two of this project.

  8. Tribal lands provide forest management laboratory for mainstream university students

    Science.gov (United States)

    Serra J. Hoagland; Ronald Miller; Kristen M. Waring; Orlando Carroll

    2017-01-01

    Northern Arizona University (NAU) faculty and Bureau of Indian Affairs (BIA) foresters initiated a partnership to expose NAU School of Forestry (SoF) graduate students to tribal forest management practices by incorporating field trips to the 1.68-million acre Fort Apache Indian Reservation as part of their silviculture curriculum. Tribal field trips were contrasted and...

  9. APACHE II SCORING SYSTEM AND ITS MODIFICATION FOR THE ASSESSMENT OF DISEASE SEVERITY IN CHILDREN WHO UNDERWENT POLYCHEMOTHERAPY

    Directory of Open Access Journals (Sweden)

    А. V. Sotnikov

    2014-01-01

    Full Text Available Short-term disease prognosis should be considered when choosing the appropriate treatment policy, based on the assessment of disease severity in patients with acute disease. An adequate assessment of disease severity and prognosis allows the indications for transferring patients to the resuscitation and intensive care department to be defined more precisely. Disease severity of patients who underwent polychemotherapy was assessed using the APACHE II scoring system.

  10. Update on Astrometric Follow-Up at Apache Point Observatory by Adler Planetarium

    Science.gov (United States)

    Nault, Kristie A.; Brucker, Melissa; Hammergren, Mark

    2016-10-01

    We began our NEO astrometric follow-up and characterization program in 2014 Q4 using about 500 hours of observing time per year with the Astrophysical Research Consortium (ARC) 3.5m telescope at Apache Point Observatory (APO). Our observing is split into 2 hour blocks approximately every other night for astrometry (this poster) and several half-nights per month for spectroscopy (see poster by M. Hammergren et al.) and light curve studies. For astrometry, we use the ARC Telescope Imaging Camera (ARCTIC) with an SDSS r filter, in 2 hour observing blocks centered around midnight. ARCTIC has a magnitude limit of V~23 in 60s, and we target 20 NEOs per session. ARCTIC has a FOV 1.57 times larger and a readout time half as long as the previous imager, SPIcam, which we used from 2014 Q4 through 2015 Q3. Targets are selected primarily from the Minor Planet Center's (MPC) NEO Confirmation Page (NEOCP), and NEA Observation Planning Aid; we also refer to JPL's What's Observable page, the Spaceguard Priority List and Faint NEOs List, and requests from other observers. To quickly adapt to changing weather and seeing conditions, we create faint, midrange, and bright target lists. Detected NEOs are measured with Astrometrica and internal software, and the astrometry is reported to the MPC. As of June 19, 2016, we have targeted 2264 NEOs, 1955 with provisional designations, 1582 of which were detected. We began observing NEOCP asteroids on January 30, 2016, and have targeted 309, 207 of which were detected. In addition, we serendipitously observed 281 moving objects, 201 of which were identified as previously known objects. This work is based on observations obtained with the Apache Point Observatory 3.5m telescope, which is owned and operated by the Astrophysical Research Consortium. We gratefully acknowledge support from NASA NEOO award NNX14AL17G and thank the University of Chicago Department of Astronomy and Astrophysics for observing time in 2014.

  11. Neutronic performance optimization study of Indian fusion demo reactor first wall and breeding blanket

    International Nuclear Information System (INIS)

    Swami, H.L.; Danani, C.

    2015-01-01

    In the frame of design studies of the Indian Nuclear Fusion DEMO Reactor, a neutronic performance optimization of the first wall and breeding blanket has been carried out. The study mainly focuses on estimating the tritium breeding ratio (TBR) and power density responses of the breeding blanket. Apart from the neutronic efficiency of the existing breeding blanket concepts for the Indian DEMO, i.e. the lead lithium ceramic breeder and the helium cooled solid breeder concept, other concepts such as helium cooled lead lithium and helium-cooled Li_8PbO_6 with reflector are also explored. The aim of the study is to establish a neutronically efficient breeding blanket concept for DEMO. The effect of first wall materials and thickness on breeding blanket neutronic performance is also evaluated. For this study a 1-D cylindrical neutronic model of DEMO has been constructed according to the preliminary radial build-up of the Indian DEMO. The assessment is done using a Monte Carlo based radiation transport code and the nuclear cross section data file ENDF/B-VII. (author)

  12. Combination of Mean Platelet Volume/Platelet Count Ratio and the APACHE II Score Better Predicts the Short-Term Outcome in Patients with Acute Kidney Injury Receiving Continuous Renal Replacement Therapy.

    Science.gov (United States)

    Li, Junhui; Li, Yingchuan; Sheng, Xiaohua; Wang, Feng; Cheng, Dongsheng; Jian, Guihua; Li, Yongguang; Feng, Liang; Wang, Niansong

    2018-03-29

    Both the Acute Physiology and Chronic Health Evaluation (APACHE II) score and the mean platelet volume/platelet count ratio (MPR) can independently predict adverse outcomes in critically ill patients. This study aimed to investigate whether the combination of them could have a better performance in predicting the prognosis of patients with acute kidney injury (AKI) who received continuous renal replacement therapy (CRRT). Two hundred twenty-three patients with AKI who underwent CRRT between January 2009 and December 2014 in a Chinese university hospital were enrolled. They were divided into survivor and non-survivor groups based on their status at discharge. Receiver operating characteristic (ROC) curve analysis was used for MPR and the APACHE II score, and to determine the optimal cut-off value of MPR for in-hospital mortality. Factors associated with mortality were identified by univariate and multivariate logistic regression analysis. The mean age of the patients was 61.4 years, and the overall in-hospital mortality was 48.4%. Acute cardiorenal syndrome (ACRS) was the most common cause of AKI. The optimal cut-off value of MPR for mortality was 0.099 with an area under the ROC curve (AUC) of 0.636. The AUC increased to 0.851 with the addition of the APACHE II score. The mortality of patients with MPR > 0.099 was 56.4%, which was significantly higher than that of the group with MPR ≤ 0.099 (39.6%, P=0.012). Logistic regression analysis showed that the average number of organ failures (OR = 2.372), APACHE II score (OR = 1.187), age (OR = 1.028) and vasopressor administration (OR = 38.130) were significantly associated with poor prognosis. Severity of illness was significantly associated with the prognosis of patients with AKI. The combination of MPR and APACHE II score may be helpful in predicting the short-term outcome of AKI. © 2018 The Author(s). Published by S. Karger AG, Basel.
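
    The kind of analysis described above (combining two predictors and comparing single-variable and combined discrimination by ROC AUC) can be sketched with scikit-learn as below; the arrays are placeholders, not the study's data.

        # Illustrative only: combine MPR and APACHE II with logistic regression and
        # compare AUCs. All input arrays below are synthetic placeholders.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        mpr = np.array([0.05, 0.12, 0.08, 0.20, 0.07, 0.15])   # MPV/platelet count ratio
        apache2 = np.array([14, 28, 18, 33, 16, 25])            # APACHE II score
        died = np.array([0, 1, 0, 1, 0, 1])                     # in-hospital death (0/1)

        X_combined = np.column_stack([mpr, apache2])
        model = LogisticRegression().fit(X_combined, died)
        risk = model.predict_proba(X_combined)[:, 1]

        print("AUC, MPR alone       :", roc_auc_score(died, mpr))
        print("AUC, APACHE II alone :", roc_auc_score(died, apache2))
        print("AUC, combined model  :", roc_auc_score(died, risk))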

  13. Utility of Procalcitonin (PCT and Mid regional pro-Adrenomedullin (MR-proADM in risk stratification of critically ill febrile patients in Emergency Department (ED. A comparison with APACHE II score

    Directory of Open Access Journals (Sweden)

    Travaglino Francesco

    2012-08-01

    Full Text Available Abstract Background The aim of our study was to evaluate the prognostic value of MR-proADM and PCT levels in febrile patients in the ED in comparison with a disease severity index score, the APACHE II score. We also evaluated the ability of MR-proADM and PCT to predict hospitalization. Methods This was an observational, multicentric study. We enrolled 128 patients referred to the ED with high fever and a suspicion of severe infection such as sepsis, lower respiratory tract infections, urinary tract infections, gastrointestinal infections, soft tissue infections, central nervous system infections, or osteomyelitis. The APACHE II score was calculated for each patient. Results MR-proADM median values in controls were 0.5 nmol/l as compared with 0.85 nmol/l in patients (P ...). MR-proADM and PCT levels were significantly increased in accordance with the APACHE II quartiles (P ..., respectively). In the respiratory infections, urinary infections, and sepsis-septic shock groups we found a correlation between the APACHE II and MR-proADM respectively and MR-proADM and PCT respectively. We evaluated the ability of MR-proADM and PCT to predict hospitalization in patients admitted to our emergency departments complaining of fever. MR-proADM alone had an AUC of 0.694, while PCT alone had an AUC of 0.763. The combined use of PCT and MR-proADM instead showed an AUC of 0.79. Conclusions The present study highlights the way in which MR-proADM and PCT may be helpful to the febrile patient’s care in the ED. Our data support the prognostic role of MR-proADM and PCT in that setting, as demonstrated by the correlation with the APACHE II score. The combined use of the two biomarkers can predict a subsequent hospitalization of febrile patients. The rational use of these two molecules could lead to several advantages, such as faster diagnosis, more accurate risk stratification, and optimization of the treatment, with consequent benefit to the patient and

  14. Combination of Mean Platelet Volume/Platelet Count Ratio and the APACHE II Score Better Predicts the Short-Term Outcome in Patients with Acute Kidney Injury Receiving Continuous Renal Replacement Therapy

    Directory of Open Access Journals (Sweden)

    Junhui Li

    2018-03-01

    Full Text Available Background/Aims: Both the Acute Physiology and Chronic Health Evaluation (APACHE II) score and the mean platelet volume/platelet count ratio (MPR) can independently predict adverse outcomes in critically ill patients. This study aimed to investigate whether the combination of them could have a better performance in predicting the prognosis of patients with acute kidney injury (AKI) who received continuous renal replacement therapy (CRRT). Methods: Two hundred twenty-three patients with AKI who underwent CRRT between January 2009 and December 2014 in a Chinese university hospital were enrolled. They were divided into survivor and non-survivor groups based on their status at discharge. Receiver operating characteristic (ROC) curve analysis was used for MPR and the APACHE II score, and to determine the optimal cut-off value of MPR for in-hospital mortality. Factors associated with mortality were identified by univariate and multivariate logistic regression analysis. Results: The mean age of the patients was 61.4 years, and the overall in-hospital mortality was 48.4%. Acute cardiorenal syndrome (ACRS) was the most common cause of AKI. The optimal cut-off value of MPR for mortality was 0.099 with an area under the ROC curve (AUC) of 0.636. The AUC increased to 0.851 with the addition of the APACHE II score. The mortality of patients with MPR > 0.099 was 56.4%, which was significantly higher than that of the group with MPR ≤ 0.099 (39.6%, P=0.012). Logistic regression analysis showed that the average number of organ failures (OR = 2.372), APACHE II score (OR = 1.187), age (OR = 1.028) and vasopressor administration (OR = 38.130) were significantly associated with poor prognosis. Conclusion: Severity of illness was significantly associated with the prognosis of patients with AKI. The combination of MPR and APACHE II score may be helpful in predicting the short-term outcome of AKI.

  15. Predictive values of urine paraquat concentration, dose of poison, arterial blood lactate and APACHE II score in the prognosis of patients with acute paraquat poisoning.

    Science.gov (United States)

    Liu, Xiao-Wei; Ma, Tao; Li, Lu-Lu; Qu, Bo; Liu, Zhi

    2017-07-01

    The present study investigated the predictive values of urine paraquat (PQ) concentration, dose of poison, arterial blood lactate and Acute Physiology and Chronic Health Evaluation (APACHE) II score in the prognosis of patients with acute PQ poisoning. A total of 194 patients with acute PQ poisoning, hospitalized between April 2012 and January 2014 at the First Affiliated Hospital of P.R. China Medical University (Shenyang, China), were selected and divided into survival and mortality groups. Logistic regression analysis, receiver operator characteristic (ROC) curve analysis and Kaplan-Meier curve were applied to evaluate the values of urine paraquat (PQ) concentration, dose of poison, arterial blood lactate and (APACHE) II score for predicting the prognosis of patients with acute PQ poisoning. Initial urine PQ concentration (C0), dose of poison, arterial blood lactate and APACHE II score of patients in the mortality group were significantly higher compared with the survival group (all P ...). ... dose of poison and arterial blood lactate correlated with mortality risk of acute PQ poisoning (all P ...). The AUC values of C0, dose of poison, arterial blood lactate and APACHE II score in predicting the mortality of patients within 28 days were 0.921, 0.887, 0.808 and 0.648, respectively. The AUC of C0 for predicting early and delayed mortality were 0.890 and 0.764, respectively. The AUC values of urine paraquat concentration the day after poisoning (Csec) and the rebound rate of urine paraquat concentration in predicting the mortality of patients within 28 days were 0.919 and 0.805, respectively. The 28-day survival rate of patients with C0 ≤32.2 µg/ml (42/71; 59.2%) was significantly higher when compared with patients with C0 >32.2 µg/ml (38/123; 30.9%). These results suggest that the initial urine PQ concentration may be the optimal index for predicting the prognosis of patients with acute PQ poisoning. Additionally, dose of poison, arterial blood lactate, Csec and rebound rate also have referential significance.

  16. High performance Spark best practices for scaling and optimizing Apache Spark

    CERN Document Server

    Karau, Holden

    2017-01-01

    Apache Spark is amazing when everything clicks. But if you haven’t seen the performance improvements you expected, or still don’t feel confident enough to use Spark in production, this practical book is for you. Authors Holden Karau and Rachel Warren demonstrate performance optimizations to help your Spark queries run faster and handle larger data sizes, while using fewer resources. Ideal for software engineers, data engineers, developers, and system administrators working with large-scale data applications, this book describes techniques that can reduce data infrastructure costs and developer hours. Not only will you gain a more comprehensive understanding of Spark, you’ll also learn how to make it sing. With this book, you’ll explore: How Spark SQL’s new interfaces improve performance over SQL’s RDD data structure The choice between data joins in Core Spark and Spark SQL Techniques for getting the most out of standard RDD transformations How to work around performance issues i...
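
    One of the choices highlighted in the blurb, expressing the same join on core RDDs versus Spark SQL DataFrames, can be sketched as follows; the tiny in-memory datasets are only for illustration.

        # The same join on a pair RDD (core Spark) and on DataFrames (Spark SQL).
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("join-comparison").getOrCreate()
        sc = spark.sparkContext

        orders = [(1, "book"), (2, "lamp"), (3, "pen")]
        customers = [(1, "Ana"), (2, "Ben"), (3, "Chen")]

        # Core Spark: pair-RDD join, shuffles plain Python objects.
        rdd_join = sc.parallelize(orders).join(sc.parallelize(customers))

        # Spark SQL: DataFrame join, planned by the Catalyst optimizer.
        orders_df = spark.createDataFrame(orders, ["id", "item"])
        customers_df = spark.createDataFrame(customers, ["id", "name"])
        df_join = orders_df.join(customers_df, on="id")

        print(rdd_join.collect())
        df_join.show()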

  17. Extraction of UMLS® Concepts Using Apache cTAKES™ for German Language.

    Science.gov (United States)

    Becker, Matthias; Böckmann, Britta

    2016-01-01

    Automatic information extraction of medical concepts and classification with semantic standards from medical reports is useful for standardization and for clinical research. This paper presents an approach for UMLS concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objective is to test whether the natural language processing tool is suitable for German-language text, that is, whether it can identify UMLS concepts and map them to SNOMED-CT. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so the pipeline can normalize to domain ontologies such as SNOMED-CT using the German concepts. For testing, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms were tested with a set of 199 German reports, obtaining an average F1 measure of 0.36 without German stemming or pre- and post-processing of the reports.

  18. Monte Carlo methods

    Directory of Open Access Journals (Sweden)

    Bardenet Rémi

    2013-07-01

    Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that make it possible to compute these integrals numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among them rejection sampling, importance sampling and Monte Carlo Markov chain (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate the two. We discuss the application of Monte Carlo in experimental physics, and point to landmarks in the literature for the curious reader.
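
    As a compact example of one of the methods mentioned (importance sampling), the sketch below estimates the tail probability P(X > 4) for a standard normal, a case where naive Monte Carlo almost never samples the region of interest.

        # Importance sampling for a rare-event probability P(X > 4), X ~ N(0, 1).
        import numpy as np
        from scipy.stats import norm, expon

        rng = np.random.default_rng(0)
        n = 100_000

        # Naive Monte Carlo: draw from N(0,1) and count exceedances (almost all zero).
        naive = np.mean(rng.standard_normal(n) > 4.0)

        # Importance sampling: draw from a proposal concentrated on the tail
        # (an exponential shifted to start at 4) and reweight by p(x)/q(x).
        x = 4.0 + rng.exponential(1.0, size=n)
        weights = norm.pdf(x) / expon.pdf(x, loc=4.0)
        is_estimate = np.mean(weights)

        print("exact      :", norm.sf(4.0))
        print("naive MC   :", naive)
        print("importance :", is_estimate)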

  19. Intensive Care in India: The Indian Intensive Care Case Mix and Practice Patterns Study.

    Science.gov (United States)

    Divatia, Jigeeshu V; Amin, Pravin R; Ramakrishnan, Nagarajan; Kapadia, Farhad N; Todi, Subhash; Sahu, Samir; Govil, Deepak; Chawla, Rajesh; Kulkarni, Atul P; Samavedam, Srinivas; Jani, Charu K; Rungta, Narendra; Samaddar, Devi Prasad; Mehta, Sujata; Venkataraman, Ramesh; Hegde, Ashit; Bande, B D; Dhanuka, Sanjay; Singh, Virendra; Tewari, Reshma; Zirpe, Kapil; Sathe, Prachee

    2016-04-01

    To obtain information on organizational aspects, case mix and practices in Indian Intensive Care Units (ICUs). An observational, 4-day point prevalence study was performed between 2010 and 2011 in 4209 patients from 124 ICUs. ICU and patient characteristics, and interventions were recorded for 24 h of the study day, and outcomes till 30 days after the study day. Data were analyzed for 4038 adult patients from 120 ICUs. On the study day, mean age, Acute Physiology and Chronic Health Evaluation (APACHE II) and sequential organ failure assessment (SOFA) scores were 54.1 ± 17.1 years, 17.4 ± 9.2 and 3.8 ± 3.6, respectively. About 46.4% patients had ≥1 organ failure. Nearly 37% and 22.2% patients received mechanical ventilation (MV) and vasopressors or inotropes, respectively. Nearly 12.2% patients developed an infection in the ICU. About 28.3% patients had severe sepsis or septic shock (SvSpSS) during their ICU stay. About 60.7% patients without infection received antibiotics. There were 546 deaths and 183 terminal discharges (TDs) from ICU (including left against medical advice or discharged on request), with ICU mortality 729/4038 (18.1%). In 1627 patients admitted within 24 h of the study day, the standardized mortality ratio was 0.67. The APACHE II and SOFA scores, public hospital ICUs, medical ICUs, inadequately equipped ICUs, medical admission, self-paying patient, presence of SvSpSS, acute respiratory failure or cancer, need for a fluid bolus, and MV were independent predictors of mortality. The high proportion of TDs and the association of public hospitals, self-paying patients, and inadequately equipped hospitals with mortality has important implications for critical care in India.

  20. Restoration of Gooseberry Creek

    Science.gov (United States)

    Jonathan W. Long

    2000-01-01

    Grazing exclusion and channel modifications were used to restore wet meadows along a stream on the Fort Apache Indian Reservation. The efforts are reestablishing functional processes to promote long-term restoration of wetland health and species conservation.

  1. APOGEE-2: The Second Phase of the Apache Point Observatory Galactic Evolution Experiment in SDSS-IV

    Science.gov (United States)

    Sobeck, Jennifer; Majewski, S.; Hearty, F.; Schiavon, R. P.; Holtzman, J. A.; Johnson, J.; Frinchaboy, P. M.; Skrutskie, M. F.; Munoz, R.; Pinsonneault, M. H.; Nidever, D. L.; Zasowski, G.; Garcia Perez, A.; Fabbian, D.; Meza Cofre, A.; Cunha, K. M.; Smith, V. V.; Chiappini, C.; Beers, T. C.; Steinmetz, M.; Anders, F.; Bizyaev, D.; Roman, A.; Fleming, S. W.; Crane, J. D.; SDSS-IV/APOGEE-2 Collaboration

    2014-01-01

    The second phase of the Apache Point Observatory Galactic Evolution Experiment (APOGEE-2), a part of the Sloan Digital Sky Survey IV (SDSS-IV), will commence operations in 2014. APOGEE-2 represents a significant expansion over APOGEE-1, not only in the size of the stellar sample, but also in the coverage of the sky through observations in both the Northern and Southern Hemispheres. Observations on the 2.5m Sloan Foundation Telescope of the Apache Point Observatory (APOGEE-2N) will continue immediately after the conclusion of APOGEE-1, to be followed by observations with the 2.5m du Pont Telescope of the Las Campanas Observatory (APOGEE-2S) within three years. Over the six-year lifetime of the project, high resolution (R˜22,500), high signal-to-noise (≥100) spectroscopic data in the H-band wavelength regime (1.51-1.69 μm) will be obtained for several hundred thousand stars, more than tripling the total APOGEE-1 sample. Accurate radial velocities and detailed chemical compositions will be generated for target stars in the main Galactic components (bulge, disk, and halo), open/globular clusters, and satellite dwarf galaxies. The spectroscopic follow-up program of Kepler targets with the APOGEE-2N instrument will be continued and expanded. APOGEE-2 will significantly extend and enhance the APOGEE-1 legacy of scientific contributions to understanding the origin and evolution of the elements, the assembly and formation history of galaxies like the Milky Way, and fundamental stellar astrophysics.

  2. Leadership Preferences of Indian and Non-Indian Athletes.

    Science.gov (United States)

    Malloy, D. C.; Nilson, R. N.

    1991-01-01

    Among 86 Indian and non-Indian volleyball competitors, non-Indian players indicated significantly greater preferences for leadership that involved democratic behavior, autocratic behavior, or social support. Indians may adapt their behavior by participating in non-Indian games, without changing their traditional value orientations. Contains 22…

  3. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    ... considerable difference between procedural programming and the object-oriented PHP language on the middle layer of the three-tier web architecture. Also, the research concerns the comparison of a relational database system, MySQL, and a NoSQL key-value store system, Apache Cassandra, on the database layer.

  4. Historical review of uranium-vanadium in the eastern Carrizo Mountains, San Juan County, New Mexico and Apache County, Arizona

    International Nuclear Information System (INIS)

    Chenoweth, W.L.

    1980-03-01

    This report is a brief review of the uranium and/or vanadium mining in the eastern Carrizo Mountains, San Juan County, New Mexico and Apache County, Arizona. It was prepared at the request of the Navajo Tribe, the New Mexico Energy and Minerals Department, and the Arizona Bureau of Geology and Mineral Technology. This report deals only with historical production data. The locations of the mines and the production are presented in figures and tables

  5. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    Science.gov (United States)

    Majewski, Steven R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.; Allende Prieto, Carlos; Barkhouser, Robert; Bizyaev, Dmitry; Blank, Basil; Brunner, Sophia; Burton, Adam; Carrera, Ricardo; Chojnowski, S. Drew; Cunha, Kátia; Epstein, Courtney; Fitzgerald, Greg; García Pérez, Ana E.; Hearty, Fred R.; Henderson, Chuck; Holtzman, Jon A.; Johnson, Jennifer A.; Lam, Charles R.; Lawler, James E.; Maseman, Paul; Mészáros, Szabolcs; Nelson, Matthew; Nguyen, Duy Coung; Nidever, David L.; Pinsonneault, Marc; Shetrone, Matthew; Smee, Stephen; Smith, Verne V.; Stolberg, Todd; Skrutskie, Michael F.; Walker, Eric; Wilson, John C.; Zasowski, Gail; Anders, Friedrich; Basu, Sarbani; Beland, Stephane; Blanton, Michael R.; Bovy, Jo; Brownstein, Joel R.; Carlberg, Joleen; Chaplin, William; Chiappini, Cristina; Eisenstein, Daniel J.; Elsworth, Yvonne; Feuillet, Diane; Fleming, Scott W.; Galbraith-Frew, Jessica; García, Rafael A.; García-Hernández, D. Aníbal; Gillespie, Bruce A.; Girardi, Léo; Gunn, James E.; Hasselquist, Sten; Hayden, Michael R.; Hekker, Saskia; Ivans, Inese; Kinemuchi, Karen; Klaene, Mark; Mahadevan, Suvrath; Mathur, Savita; Mosser, Benoît; Muna, Demitri; Munn, Jeffrey A.; Nichol, Robert C.; O'Connell, Robert W.; Parejko, John K.; Robin, A. C.; Rocha-Pinto, Helio; Schultheis, Matthias; Serenelli, Aldo M.; Shane, Neville; Silva Aguirre, Victor; Sobeck, Jennifer S.; Thompson, Benjamin; Troup, Nicholas W.; Weinberg, David H.; Zamora, Olga

    2017-09-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ˜ 22,500), high signal-to-noise ratio (>100), infrared (1.51-1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  6. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    International Nuclear Information System (INIS)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan

    2009-01-01

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or an MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or an MELD score > 20 are predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population.

  7. Hardening en servidor web Linux Apache, PHP y configurar el firewall de aplicaciones modsecurity para mitigar ataques al servidor

    OpenAIRE

    Espol; Delgado Quishpe, Byron Alberto

    2017-01-01

    A hardening of the web server is carried out: the directives in the configuration files of the Apache and PHP services are reviewed, and an application firewall called mod_security is installed and configured, which allows attacks against our web server to be mitigated, based on an analysis of the vulnerabilities found on the server. Guayaquil. Magíster en Seguridad Informática Aplicada (Master's in Applied Information Security).

  8. Aplikasi Search Engine Perpustakaan Petra Berbasis Android dengan Apache SOLR

    Directory of Open Access Journals (Sweden)

    Andreas Handojo

    2016-07-01

    Full Text Available Abstract: Education is an important need for people to improve their abilities and standard of living. Besides formal education, knowledge can also be obtained through printed media or books. The library is one of the important facilities supporting this. Although very useful, library services can be difficult to use, because the existing collection (books, journals, magazines, and so on) is so large that it is hard to find the desired book. Therefore, in addition to growing its collection, a library must keep up with current developments so that its services become easier to use. The Petra Christian University library currently holds roughly 230,000 physical and digital items (based on 2014 data), and the lists of physical items and digital documents can be accessed through the library website. This very large collection makes the search process difficult for users. To extend the services offered, this study therefore built a library search engine application using the Apache SOLR platform and a PostgreSQL database. In addition, to further improve ease of access, the application was built for Android-based mobile devices. Besides testing the application, a questionnaire was distributed to 50 prospective users; the results show that the implemented features match user needs (78%). Keywords: SOLR, Search Engine, Library, PostgreSQL. Abstract: Education is an essential requirement for people to improve their standard of living. Other than through formal education, science can also be obtained through the print media or books. Library is one important tool supporting it. Although it is useful, there are difficulties use library
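
    A minimal sketch of the kind of catalogue query such an application would send to Solr's standard /select endpoint is shown below; the Solr URL, core name and field names are illustrative assumptions, not the thesis's actual schema.

        # Query a Solr core over its standard /select endpoint.
        # URL, core name ("catalog") and field names are assumed for illustration.
        import requests

        SOLR = "http://localhost:8983/solr/catalog"  # assumed core

        def search_books(keyword, rows=10):
            params = {"q": f"title:({keyword})", "rows": rows, "wt": "json"}
            resp = requests.get(f"{SOLR}/select", params=params, timeout=5)
            resp.raise_for_status()
            return resp.json()["response"]["docs"]

        for doc in search_books("data mining"):
            print(doc.get("id"), doc.get("title"))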

  9. Exploring Monte Carlo methods

    CERN Document Server

    Dunn, William L

    2012-01-01

    Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble
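
    The Buffon's needle experiment mentioned in the blurb is easy to simulate; the short sketch below drops random needles and recovers an estimate of pi from the crossing probability 2L/(pi*D).

        # Buffon's needle: needles of length L on lines spaced D apart (L <= D)
        # cross a line with probability 2L/(pi*D), which yields an estimate of pi.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000
        L, D = 1.0, 2.0

        center = rng.uniform(0.0, D / 2.0, size=n)      # distance of needle centre to nearest line
        theta = rng.uniform(0.0, np.pi / 2.0, size=n)   # acute angle with the lines
        crossings = center <= (L / 2.0) * np.sin(theta)

        p_hat = np.mean(crossings)
        print("estimated pi:", 2.0 * L / (D * p_hat))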

  10. Efficacy of a pentavalent human-bovine reassortant rotavirus vaccine against rotavirus gastroenteritis among American Indian children.

    Science.gov (United States)

    Grant, Lindsay R; Watt, James P; Weatherholtz, Robert C; Moulton, Lawrence H; Reid, Raymond; Santosham, Mathuram; O'Brien, Katherine L

    2012-02-01

    Before the widespread use of rotavirus vaccines, rotavirus was a leading cause of gastroenteritis among children. Navajo and White Mountain Apache children suffer a disproportionate burden of severe rotavirus disease compared with the general U.S. population. We enrolled Navajo and White Mountain Apache infants in a multicenter, double-blind, placebo-controlled trial of pentavalent human-bovine reassortant rotavirus vaccine (PRV). Subjects received 3 doses of vaccine or placebo at 4 to 10 week intervals, with the first dose given between 6 and 12 weeks of age. Gastroenteritis episodes were identified by active surveillance. Disease severity was determined by a standardized scoring system. There were 509 and 494 randomized children who received vaccine and placebo, respectively. Among placebo recipients, the incidence of rotavirus gastroenteritis was 34.2 episodes/100 child-years (95% confidence interval [95% CI]: 25.8-38.9) versus 8.1 episodes/100 child-years (95% CI: 5.4-12.5) in the vaccine group. The percentage of rotavirus episodes caused by serotypes G1, G2, and G3 was 72.3%, 23.4%, and 2.1%, respectively. There were no severe rotavirus episodes among vaccinees and 4 among placebo recipients. PRV was 77.1% (95% CI: 59.7-87.6), 89.5% (95% CI: 65.9-97.9), and 82.9% (95% CI: 61.1-93.6) effective against G1-G4 rotavirus disease, severe and moderate rotavirus disease combined, and outpatient visits for rotavirus disease, respectively. The risk of adverse events was similar for the vaccine and placebo groups. PRV was highly effective in preventing rotavirus disease and related health care utilization in these American Indian infants. Vaccine efficacy and immunogenicity were similar to the overall study population enrolled in the multicenter trial.

  11. Specialized Monte Carlo codes versus general-purpose Monte Carlo codes

    International Nuclear Information System (INIS)

    Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi

    2002-01-01

    The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time consuming, while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses the concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)

  12. Monte Carlo principles and applications

    Energy Technology Data Exchange (ETDEWEB)

    Raeside, D E [Oklahoma Univ., Oklahoma City (USA). Health Sciences Center

    1976-03-01

    The principles underlying the use of Monte Carlo methods are explained, for readers who may not be familiar with the approach. The generation of random numbers is discussed, and the connection between Monte Carlo methods and random numbers is indicated. Outlines of two well established Monte Carlo sampling techniques are given, together with examples illustrating their use. The general techniques for improving the efficiency of Monte Carlo calculations are considered. The literature relevant to the applications of Monte Carlo calculations in medical physics is reviewed.

  13. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.

  14. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    International Nuclear Information System (INIS)

    Majewski, Steven R.; Brunner, Sophia; Burton, Adam; Chojnowski, S. Drew; Pérez, Ana E. García; Hearty, Fred R.; Lam, Charles R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.; Prieto, Carlos Allende; Carrera, Ricardo; Barkhouser, Robert; Bizyaev, Dmitry; Blank, Basil; Henderson, Chuck; Cunha, Kátia; Epstein, Courtney; Johnson, Jennifer A.; Fitzgerald, Greg; Holtzman, Jon A.

    2017-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution ( R  ∼ 22,500), high signal-to-noise ratio (>100), infrared (1.51–1.70 μ m) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  15. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    Energy Technology Data Exchange (ETDEWEB)

    Majewski, Steven R.; Brunner, Sophia; Burton, Adam; Chojnowski, S. Drew; Pérez, Ana E. García; Hearty, Fred R.; Lam, Charles R. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Frinchaboy, Peter M. [Department of Physics and Astronomy, Texas Christian University, Fort Worth, TX 76129 (United States); Prieto, Carlos Allende; Carrera, Ricardo [Instituto de Astrofísica de Canarias, E-38200 La Laguna, Tenerife (Spain); Barkhouser, Robert [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, Sunspot, NM, 88349-0059 (United States); Blank, Basil; Henderson, Chuck [Pulse Ray Machining and Design, 4583 State Route 414, Beaver Dams, NY 14812 (United States); Cunha, Kátia [Observatório Nacional, Rio de Janeiro, RJ 20921-400 (Brazil); Epstein, Courtney; Johnson, Jennifer A. [The Ohio State University, Columbus, OH 43210 (United States); Fitzgerald, Greg [New England Optical Systems, 237 Cedar Hill Street, Marlborough, MA 01752 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); and others

    2017-09-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ∼ 22,500), high signal-to-noise ratio (>100), infrared (1.51–1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  16. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there are still few studies that analyze larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.
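
    (Illustrative note, not part of the record above.) The skew described there can be checked with a short script: given per-contributor commit counts, the standard maximum-likelihood (Hill/Clauset) formula estimates the power-law exponent, and a concentration ratio shows how much of the work the top contributors do. The commit counts and function names below are hypothetical, not data or code from the paper.

```python
import math

def powerlaw_alpha(commit_counts, x_min=1):
    """Maximum-likelihood estimate of the power-law exponent for
    per-contributor commit counts: alpha = 1 + n / sum(ln(x_i / x_min))."""
    tail = [c for c in commit_counts if c >= x_min]
    if not tail:
        raise ValueError("no observations above x_min")
    return 1.0 + len(tail) / sum(math.log(c / x_min) for c in tail)

def top_share(commit_counts, fraction=0.1):
    """Share of all commits made by the top `fraction` of contributors."""
    counts = sorted(commit_counts, reverse=True)
    k = max(1, int(len(counts) * fraction))
    return sum(counts[:k]) / sum(counts)

if __name__ == "__main__":
    # Hypothetical commit counts for a small project, for illustration only.
    counts = [500, 120, 40, 12, 6, 3, 2, 1, 1, 1]
    print(f"alpha ~ {powerlaw_alpha(counts):.2f}, "
          f"top-10% share ~ {top_share(counts):.0%}")
```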

  17. Indian Energy Beat. Spring/Summer 2014: News on Actions to Accelerate Energy Development in Indian Country

    Energy Technology Data Exchange (ETDEWEB)

    None

    2014-03-01

    Articles include: Arizona Apache tribe set to break ground on new solar project; Native leaders give tribes a voice on White House Climate Task Force; and Chaninik Wind Group pursues innovative solutions to Native Alaska energy challenges; plus the sections Message from the Director (Tracey Lebeau), On the Horizon, Sharing Knowledge, and Building Bridges.

  18. CERN honours Carlo Rubbia

    CERN Document Server

    2009-01-01

    Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium. On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia" joked CERN’s Director-General, Rolf Heuer in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...

  19. Perform wordcount Map-Reduce Job in Single Node Apache Hadoop cluster and compress data using Lempel-Ziv-Oberhumer (LZO) algorithm

    OpenAIRE

    Mirajkar, Nandan; Bhujbal, Sandeep; Deshmukh, Aaradhana

    2013-01-01

    Applications like Yahoo, Facebook, and Twitter generate huge volumes of data that must be stored and retrieved on demand by clients. Storing such data requires very large databases, increases physical storage needs, and complicates the analysis required for business growth. The storage footprint can be reduced and the data processed in a distributed fashion using Apache Hadoop, which applies the MapReduce algorithm and combines repeating data so that the entire dataset is stored in a reduced format. The paper ...
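
    (Illustrative note, not part of the record above.) The word-count MapReduce job the paper performs follows a standard pattern; a minimal Hadoop Streaming style sketch in Python is shown below. The file name and command line are assumptions, and LZO compression is not shown here since it is configured through the cluster's codec settings rather than in the job code.

```python
#!/usr/bin/env python3
# wordcount_streaming.py -- minimal Hadoop Streaming style word count (illustrative).
import sys
from itertools import groupby

def mapper(stream):
    # Emit "<word>\t1" for every token in the input split.
    for line in stream:
        for word in line.strip().split():
            print(f"{word}\t1")

def reducer(stream):
    # Hadoop delivers mapper output sorted by key, so identical words are adjacent.
    pairs = (line.rstrip("\n").split("\t", 1) for line in stream)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        print(f"{word}\t{sum(int(count) for _, count in group)}")

if __name__ == "__main__":
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if role == "map" else reducer)(sys.stdin)
```

    Locally, `cat input.txt | python3 wordcount_streaming.py map | sort | python3 wordcount_streaming.py reduce` mimics the map-sort-reduce flow; on a cluster the same script would be supplied to the Hadoop Streaming jar as both mapper and reducer.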

  20. Arizona TeleMedicine Network: Engineering Master Plan.

    Science.gov (United States)

    Atlantic Research Corp., Alexandria, VA.

    As the planning document for establishing a statewide health communications system initially servicing the Papago, San Carlos and White Mountain Apache, Navajo, and Hopi reservations, this document prescribes the communications services to be provided by the Arizona TeleMedicine Network. Specifications include: (1) communications services for each…

  1. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-03-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.
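
    (Illustrative note, not part of the record above.) The dynamically weighted estimator mentioned there reduces, in its simplest form, to a self-normalized weighted average: draws x_i carry weights w_i, and E_pi[f] is estimated by sum(w_i f(x_i)) / sum(w_i). The sketch below shows that generic estimator with an importance-sampling toy example; it is not the SAMC construction itself, and all names and parameters are illustrative.

```python
import math
import random

def weighted_mc_estimate(f, sample_weighted, n=100_000, seed=0):
    """Self-normalized weighted Monte Carlo estimate of E_pi[f]:
    sum(w_i * f(x_i)) / sum(w_i), for any sampler that attaches weights to its draws."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x, w = sample_weighted(rng)
        num += w * f(x)
        den += w
    return num / den

if __name__ == "__main__":
    # Toy example: estimate E[x^2] under a standard normal by drawing from a
    # wider normal (sd = 2) and weighting by the (unnormalized) density ratio.
    def sample_weighted(rng):
        x = rng.gauss(0.0, 2.0)
        w = math.exp(-0.5 * x * x) / (0.5 * math.exp(-0.5 * (x / 2.0) ** 2))
        return x, w

    print(weighted_mc_estimate(lambda x: x * x, sample_weighted))  # ~1.0
```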

  2. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-01-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration

  3. Visible Wavelength Reflectance Spectra and Taxonomies of Near-Earth Objects from Apache Point Observatory

    Science.gov (United States)

    Hammergren, Mark; Brucker, Melissa J.; Nault, Kristie A.; Gyuk, Geza; Solontoi, Michael R.

    2015-11-01

    Near-Earth Objects (NEOs) are interesting to scientists and the general public for diverse reasons: their impacts pose a threat to life and property; they present important albeit biased records of the formation and evolution of the Solar System; and their materials may provide in situ resources for future space exploration and habitation. In January 2015 we began a program of NEO astrometric follow-up and physical characterization using a 17% share of time on the Astrophysical Research Consortium (ARC) 3.5-meter telescope at Apache Point Observatory (APO). Our 500 hours of annual observing time are split into frequent, short astrometric runs (see poster by K. A. Nault et al.), and half-night runs devoted to physical characterization (see poster by M. J. Brucker et al. for preliminary rotational lightcurve results). NEO surface compositions are investigated with 0.36-1.0 μm reflectance spectroscopy using the Dual Imaging Spectrograph (DIS) instrument. As of August 25, 2015, including testing runs during fourth quarter 2014, we have obtained reflectance spectra of 68 unique NEOs, ranging in diameter from approximately 5 m to 8 km. In addition to investigating the compositions of individual NEOs to inform impact hazard and space resource evaluations, we may examine the distribution of taxonomic types and potential trends with other physical and orbital properties. For example, the Yarkovsky effect, which is dependent on asteroid shape, mass, rotation, and thermal characteristics, is believed to dominate other dynamical effects in driving the delivery of small NEOs from the main asteroid belt. Studies of the taxonomic distribution of a large sample of NEOs of a wide range of sizes will test this hypothesis. We present a preliminary analysis of the reflectance spectra obtained in our survey to date, including taxonomic classifications and potential trends with size. Acknowledgements: Based on observations obtained with the Apache Point Observatory 3.5-meter telescope, which

  4. Vectorized Monte Carlo

    International Nuclear Information System (INIS)

    Brown, F.B.

    1981-01-01

    Examination of the global algorithms and local kernels of conventional general-purpose Monte Carlo codes shows that multigroup Monte Carlo methods have sufficient structure to permit efficient vectorization. A structured multigroup Monte Carlo algorithm for vector computers is developed in which many particle events are treated at once on a cell-by-cell basis. Vectorization of kernels for tracking and variance reduction is described, and a new method for discrete sampling is developed to facilitate the vectorization of collision analysis. To demonstrate the potential of the new method, a vectorized Monte Carlo code for multigroup radiation transport analysis was developed. This code incorporates many features of conventional general-purpose production codes, including general geometry, splitting and Russian roulette, survival biasing, variance estimation via batching, a number of cutoffs, and generalized tallies of collision, tracklength, and surface crossing estimators with response functions. Predictions of vectorized performance characteristics for the CYBER-205 were made using emulated coding and a dynamic model of vector instruction timing. Computation rates were examined for a variety of test problems to determine sensitivities to batch size and vector lengths. Significant speedups are predicted for even a few hundred particles per batch, and asymptotic speedups by about 40 over equivalent Amdahl 470V/8 scalar codes are predicted for a few thousand particles per batch. The principal conclusion is that vectorization of a general-purpose multigroup Monte Carlo code is well worth the significant effort required for stylized coding and major algorithmic changes.

  5. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    Science.gov (United States)

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there are still few studies that analyze larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157

  6. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  7. Monte Carlo: Basics

    OpenAIRE

    Murthy, K. P. N.

    2001-01-01

    An introduction to the basics of Monte Carlo is given. The topics covered include sample space, events, probabilities, random variables, mean, variance, covariance, characteristic function, Chebyshev inequality, law of large numbers, central limit theorem (stable distribution, Levy distribution), random numbers (generation and testing), random sampling techniques (inversion, rejection, sampling from a Gaussian, Metropolis sampling), analogue Monte Carlo and Importance sampling (exponential b...
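
    (Illustrative note, not part of the record above.) One of the listed topics, Metropolis sampling, fits in a few lines: propose a random-walk move and accept it with probability min(1, pi(x')/pi(x)), so only an unnormalized density is needed. The sketch below uses an illustrative one-dimensional Gaussian target; nothing in it is taken from the lecture notes.

```python
import math
import random

def metropolis(log_density, x0=0.0, step=1.0, n_samples=50_000, seed=1):
    """Random-walk Metropolis: propose x' = x + step*N(0,1) and accept with
    probability min(1, pi(x')/pi(x)); only an unnormalized density is needed."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + step * rng.gauss(0.0, 1.0)
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

if __name__ == "__main__":
    # Illustrative target: unnormalized standard normal, log pi(x) = -x^2/2.
    chain = metropolis(lambda x: -0.5 * x * x)
    mean = sum(chain) / len(chain)
    var = sum((x - mean) ** 2 for x in chain) / len(chain)
    print(f"mean ~ {mean:.2f}, variance ~ {var:.2f}")  # expect ~0 and ~1
```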

  8. Red Women, White Policy: American Indian Women and Indian Education.

    Science.gov (United States)

    Warner, Linda Sue

    This paper discusses American Indian educational policies and implications for educational leadership by Indian women. The paper begins with an overview of federal Indian educational policies from 1802 to the 1970s. As the tribes have moved toward self-determination in recent years, a growing number of American Indian women have assumed leadership…

  9. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  10. Monte Carlo theory and practice

    International Nuclear Information System (INIS)

    James, F.

    1987-01-01

    Historically, the first large-scale calculations to make use of the Monte Carlo method were studies of neutron scattering and absorption, random processes for which it is quite natural to employ random numbers. Such calculations, a subset of Monte Carlo calculations, are known as direct simulation, since the 'hypothetical population' of the narrower definition above corresponds directly to the real population being studied. The Monte Carlo method may be applied wherever it is possible to establish equivalence between the desired result and the expected behaviour of a stochastic system. The problem to be solved may already be of a probabilistic or statistical nature, in which case its Monte Carlo formulation will usually be a straightforward simulation, or it may be of a deterministic or analytic nature, in which case an appropriate Monte Carlo formulation may require some imagination and may appear contrived or artificial. In any case, the suitability of the method chosen will depend on its mathematical properties and not on its superficial resemblance to the problem to be solved. The authors show how Monte Carlo techniques may be compared with other methods of solution of the same physical problem
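
    (Illustrative note, not part of the record above.) A textbook example of the direct simulation described there is the hit-or-miss estimate of pi: the desired result is re-expressed as the expected behaviour of a stochastic system, namely the probability that a uniform point in the unit square lands inside the quarter circle. The sketch below is generic and not drawn from the article.

```python
import random

def estimate_pi(n=1_000_000, seed=42):
    """Hit-or-miss direct simulation: the fraction of uniform points in the
    unit square that fall inside the quarter circle estimates pi/4."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

if __name__ == "__main__":
    print(estimate_pi())  # converges to ~3.1416 at a rate of O(1/sqrt(n))
```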

  11. Indian Ledger Art.

    Science.gov (United States)

    Chilcoat, George W.

    1990-01-01

    Offers an innovative way to teach mid-nineteenth century North American Indian history by having students create their own Indian Ledger art. Purposes of the project are: to understand the role played by American Indians, to reveal American Indian stereotypes, and to identify relationships between cultures and environments. Background and…

  12. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    International Nuclear Information System (INIS)

    Nidever, David L.; Holtzman, Jon A.; Prieto, Carlos Allende; Mészáros, Szabolcs; Beland, Stephane; Bender, Chad; Desphande, Rohit; Bizyaev, Dmitry; Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C.; Fleming, Scott W.; Muna, Demitri; Nguyen, Duy; Schiavon, Ricardo P.; Shetrone, Matthew

    2015-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s⁻¹) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  13. Detection of attack-targeted scans from the Apache HTTP Server access logs

    Directory of Open Access Journals (Sweden)

    Merve Baş Seyyar

    2018-01-01

    Full Text Available A web application may be visited for different purposes: a web site can be visited by a regular user as a normal (natural) visit, viewed by crawlers, bots, spiders, etc. for indexing purposes, or exploratorily scanned by malicious users prior to an attack. An attack-targeted web scan can be viewed as a phase of a potential attack and can lead to more attack detections than traditional detection methods. In this work, we propose a method to detect attack-oriented scans and to distinguish them from other types of visits. In this context, we use access log files of Apache (or IIS) web servers and try to determine attack situations through examination of past data. In addition to web scan detection, we add a rule set to detect SQL injection and XSS attacks. Our approach has been applied to sample data sets, and the results have been analyzed in terms of performance measures to compare our method with other commonly used detection techniques. Furthermore, various tests have been made on log samples from real systems. Lastly, several suggestions for further development are also discussed.
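
    (Illustrative note, not part of the record above.) A much simplified version of this kind of rule-based check can be run directly over Apache combined-format access logs: parse each line, match the request against a few attack signatures, and flag hosts that accumulate many 404 responses as possible scanners. The regular expressions and the threshold below are illustrative stand-ins, not the rule set proposed in the paper.

```python
import re
from collections import Counter

# Combined Log Format: host ident user [time] "request" status size "referer" "agent"
LOG_RE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" (?P<status>\d{3})')

# Illustrative signatures only -- real rule sets are far more elaborate.
SQLI_RE = re.compile(r"(union[\s+]+select|or\s+1=1|sleep\s*\()", re.IGNORECASE)
XSS_RE = re.compile(r"(<script|onerror\s*=|javascript:)", re.IGNORECASE)

def scan_log(lines, not_found_threshold=20):
    """Flag requests matching SQLi/XSS signatures and hosts with many 404s
    (a crude indicator of exploratory scanning)."""
    alerts, not_found = [], Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        host, request, status = m["host"], m["request"], m["status"]
        if SQLI_RE.search(request):
            alerts.append((host, "possible SQL injection", request))
        if XSS_RE.search(request):
            alerts.append((host, "possible XSS", request))
        if status == "404":
            not_found[host] += 1
    alerts += [(h, "possible scan (many 404s)", f"{n} misses")
               for h, n in not_found.items() if n >= not_found_threshold]
    return alerts

if __name__ == "__main__":
    sample = ['10.0.0.5 - - [01/Jan/2018:00:00:01 +0000] '
              '"GET /item?id=1 UNION SELECT 1 HTTP/1.1" 200 512 "-" "-"']
    print(scan_log(sample))
```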

  14. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); Prieto, Carlos Allende; Mészáros, Szabolcs [Instituto de Astrofísica de Canarias, Via Láctea s/n, E-38205 La Laguna, Tenerife (Spain); Beland, Stephane [Laboratory for Atmospheric and Space Sciences, University of Colorado at Boulder, Boulder, CO (United States); Bender, Chad; Desphande, Rohit [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, sunspot, NM 88349-0059 (United States); Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Fleming, Scott W. [Computer Sciences Corporation, 3700 San Martin Dr, Baltimore, MD 21218 (United States); Muna, Demitri [Department of Astronomy and the Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Nguyen, Duy [Department of Astronomy and Astrophysics, University of Toronto, Toronto, Ontario, M5S 3H4 (Canada); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Shetrone, Matthew, E-mail: dnidever@umich.edu [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States)

    2015-12-15

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s⁻¹) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  15. Monte Carlo Methods in Physics

    International Nuclear Information System (INIS)

    Santoso, B.

    1997-01-01

    The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random number generators used in Monte Carlo techniques is carried out to show the behavior of the randomness of the various generation methods. To account for the weight function involved in Monte Carlo, the Metropolis method is used. The results of the experiment show no regular patterns in the numbers generated, indicating that the generators are reasonably good, while the experimental results follow the expected statistical distribution law. Some applications of the Monte Carlo methods in physics are then given. The physical problems are chosen such that the models have available solutions, either exact or approximate, against which the Monte Carlo calculations can be compared. The comparisons show that good agreement is obtained for the models considered.

  16. SEQUENCE STRATIGRAPHIC ANALYSIS AND FACIES ARCHITECTURE OF THE CRETACEOUS MANCOS SHALE ON AND NEAR THE JICARILLA APACHE INDIAN RESERVATION, NEW MEXICO-THEIR RELATION TO SITES OF OIL ACCUMULATION

    International Nuclear Information System (INIS)

    Jennie Ridgley

    2000-01-01

    Oil distribution in the lower part of the Mancos Shale seems to be mainly controlled by fractures and by sandier facies that are dolomite-cemented. Structure in the area of the Jicarilla Apache Indian Reservation consists of the broad northwest- to southeast-trending Chaco slope, the deep central basin, and the monocline that forms the eastern boundary of the San Juan Basin. Superimposed on the regional structure are broad low-amplitude folds. Fractures seem best developed in the areas of these folds. Using sequence stratigraphic principles, the lower part of the Mancos Shale has been subdivided into four main regressive and transgressive components. These include facies that are the basinal time equivalents to the Gallup Sandstone, an overlying interbedded sandstone and shale sequence time equivalent to the transgressive Mulatto Tongue of the Mancos Shale, the El Vado Sandstone Member which is time equivalent to part of the Dalton Sandstone, and an unnamed interbedded sandstone and shale succession time equivalent to the regressive Dalton Sandstone and transgressive Hosta Tongue of the Mesaverde Group. Facies time equivalent to the Gallup Sandstone underlie an unconformity of regional extent. These facies are gradually truncated from south to north across the Reservation. The best potential for additional oil resources in these facies is in the southern part of the Reservation where the top sandier part of these facies is preserved. The overlying unnamed wedge of transgressive rocks produces some oil but is underexplored, except for sandstones equivalent to the Tocito Sandstone. This wedge of rocks is divided into two to five units. The highest sand content in this wedge occurs where each of the four subdivisions above the Tocito terminates to the south and is overstepped by the next youngest unit. These terminal areas should offer the best targets for future oil exploration. The El Vado Sandstone Member overlies the transgressive wedge. It produces most of

  17. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book-the first of its kind-addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...

  18. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances in computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver. Note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper is aimed at understanding the statistical implications in Monte Carlo depletions, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to the statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) to increase the number of individual Monte Carlo histories; 2) to increase the number of time steps; 3) to run additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors including both the local statistical error and the propagated statistical error. (authors)
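
    (Illustrative note, not part of the record above.) The batch idea proposed there amounts to running several independent calculations and taking the spread of their results as the overall statistical error, since independent runs capture both the local and the propagated components. The sketch below illustrates the bookkeeping only; run_case is a hypothetical stand-in for a full independent depletion calculation, not part of CASMO-5 or any real code.

```python
import math
import random
import statistics

def run_case(seed):
    """Stand-in for one independent Monte Carlo calculation (here just a noisy
    average); in a depletion analysis this would be a full transport + burnup run."""
    rng = random.Random(seed)
    return statistics.fmean(rng.gauss(1.0, 0.05) for _ in range(1_000))

def batch_statistics(n_batches=10):
    """Mean and standard error of the mean over independent batches; the spread
    reflects the overall statistical error of a single run."""
    results = [run_case(seed) for seed in range(n_batches)]
    mean = statistics.fmean(results)
    stderr = statistics.stdev(results) / math.sqrt(n_batches)
    return mean, stderr

if __name__ == "__main__":
    mean, stderr = batch_statistics()
    print(f"batched estimate: {mean:.5f} +/- {stderr:.5f}")
```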

  19. Monte Carlo simulation for IRRMA

    International Nuclear Information System (INIS)

    Gardner, R.P.; Liu Lianyan

    2000-01-01

    Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors

  20. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R.A.

    2002-05-09

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  1. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    International Nuclear Information System (INIS)

    Efroymson, R.A.

    2002-01-01

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  2. Hybrid SN/Monte Carlo research and results

    International Nuclear Information System (INIS)

    Baker, R.S.

    1993-01-01

    The neutral particle transport equation is solved by a hybrid method that iteratively couples regions where deterministic (SN) and stochastic (Monte Carlo) methods are applied. The Monte Carlo and SN regions are fully coupled in the sense that no assumption is made about geometrical separation or decoupling. The hybrid Monte Carlo/SN method provides a new means of solving problems involving both optically thick and optically thin regions that neither Monte Carlo nor SN is well suited for by themselves. The hybrid method has been successfully applied to realistic shielding problems. The vectorized Monte Carlo algorithm in the hybrid method has been ported to the massively parallel architecture of the Connection Machine. Comparisons of performance on a vector machine (Cray Y-MP) and the Connection Machine (CM-2) show that significant speedups are obtainable for vectorized Monte Carlo algorithms on massively parallel machines, even when realistic problems requiring variance reduction are considered. However, the architecture of the Connection Machine does place some limitations on the regime in which the Monte Carlo algorithm may be expected to perform well.

  3. Indianization of psychiatry utilizing Indian mental concepts

    Science.gov (United States)

    Avasthi, Ajit; Kate, Natasha; Grover, Sandeep

    2013-01-01

    Most of the psychiatry practice in India is guided by the western concepts of mental health and illness, which have largely ignored the role of religion, family, eastern philosophy, and medicine in understanding and managing the psychiatric disorders. India comprises diverse cultures, languages, ethnicities, and religious affiliations. However, besides these diversities, there are certain commonalities, which include Hinduism as a religion which is spread across the country, the traditional family system, the ancient Indian system of medicine, and an emphasis on the use of traditional methods like Yoga and Meditation for controlling the mind. This article discusses how mind and mental health are understood from the point of view of Hinduism, Indian traditions and Indian systems of medicine. Further, the article focuses on how these Indian concepts can be incorporated in the practice of contemporary psychiatry. PMID:23858244

  4. (U) Introduction to Monte Carlo Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.

  5. Carlo Caso (1940 - 2007)

    CERN Multimedia

    Leonardo Rossi

    Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of a courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then carried out several experiments using the CERN liquid hydrogen bubble chambers -first the 2000HBC and later BEBC- to study various facets of the production and decay of meson and baryon resonances. He later formed his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...

  6. The Effect of a Monocular Helmet-Mounted Display on Aircrew Health: A Longitudinal Cohort Study of Apache AH Mk 1 Pilots -(Vision and Handedness)

    Science.gov (United States)

    2015-05-19

    the day, night, and in adverse weather through the use of nose-mounted, forward-looking infrared (FLIR) pilotage and targeting sensors that provide a...sensor video and/or symbology to each crewmember via a helmet display unit (HDU). The HDU contains a 1-inch (in.) diameter cathode ray tube (CRT...American Association for Pediatric Ophthalmology and Strabismus, 12(4): 365–369. Sale, D. F., and Lund, G. J. 1993. AH-64 Apache program update

  7. 25 CFR 137.1 - Water supply.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Water supply. 137.1 Section 137.1 Indians BUREAU OF... CARLOS INDIAN IRRIGATION PROJECT, ARIZONA § 137.1 Water supply. The engineering report dealt with in... capacity of the San Carlos reservoir created by the Coolidge Dam and the water supply therefor over a...

  8. Lectures on Monte Carlo methods

    CERN Document Server

    Madras, Neal

    2001-01-01

    Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati

  9. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows random processes to be simulated by using series of pseudo-random numbers. It became an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will make it possible to tackle imaging and dosimetry issues simultaneously, and Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)

  10. 75 FR 61511 - Indian Gaming

    Science.gov (United States)

    2010-10-05

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs.... FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the.... SUPPLEMENTARY INFORMATION: Under section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA), Public Law 100...

  11. 75 FR 38834 - Indian Gaming

    Science.gov (United States)

    2010-07-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs...: July 6, 2010. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office...-4066. SUPPLEMENTARY INFORMATION: Under Section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA...

  12. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Editorial Board. Sadhana. Editor. N Viswanadham, Indian Institute of Science, Bengaluru. Senior Associate Editors. Arakeri J H, Indian Institute of Science, Bengaluru Hari K V S, Indian Institute of Science, Bengaluru Mujumdar P P, Indian Institute of Science, Bengaluru Manoj Kumar Tiwari, Indian Institute of Technology, ...

  13. Multi-decadal modulation of the El Nino-Indian monsoon relationship by Indian Ocean variability

    International Nuclear Information System (INIS)

    Ummenhofer, Caroline C; Sen Gupta, Alexander; Li Yue; Taschetto, Andrea S; England, Matthew H

    2011-01-01

    The role of leading modes of Indo-Pacific climate variability is investigated for modulation of the strength of the Indian summer monsoon during the period 1877-2006. In particular, the effect of Indian Ocean conditions on the relationship between the El Nino-Southern Oscillation (ENSO) and the Indian monsoon is explored. Using an extended classification for ENSO and Indian Ocean dipole (IOD) events for the past 130 years and reanalyses, we have expanded previous interannual work to show that variations in Indian Ocean conditions modulate the ENSO-Indian monsoon relationship also on decadal timescales. El Nino events are frequently accompanied by a significantly reduced Indian monsoon and widespread drought conditions due to anomalous subsidence associated with a shift in the descending branch of the zonal Walker circulation. However, for El Nino events that co-occur with positive IOD (pIOD) events, Indian Ocean conditions act to counter El Nino's drought-inducing subsidence by enhancing moisture convergence over the Indian subcontinent, with an average monsoon season resulting. Decadal modulations of the frequency of independent and combined El Nino and pIOD events are consistent with a strengthened El Nino-Indian monsoon relationship observed at the start of the 20th century and the apparent recent weakening of the El Nino-Indian monsoon relationship.

  14. Multi-decadal modulation of the El Nino-Indian monsoon relationship by Indian Ocean variability

    Energy Technology Data Exchange (ETDEWEB)

    Ummenhofer, Caroline C; Sen Gupta, Alexander; Li Yue; Taschetto, Andrea S; England, Matthew H, E-mail: c.ummenhofer@unsw.edu.au [Climate Change Research Centre, University of New South Wales, Sydney (Australia)

    2011-07-15

    The role of leading modes of Indo-Pacific climate variability is investigated for modulation of the strength of the Indian summer monsoon during the period 1877-2006. In particular, the effect of Indian Ocean conditions on the relationship between the El Nino-Southern Oscillation (ENSO) and the Indian monsoon is explored. Using an extended classification for ENSO and Indian Ocean dipole (IOD) events for the past 130 years and reanalyses, we have expanded previous interannual work to show that variations in Indian Ocean conditions modulate the ENSO-Indian monsoon relationship also on decadal timescales. El Nino events are frequently accompanied by a significantly reduced Indian monsoon and widespread drought conditions due to anomalous subsidence associated with a shift in the descending branch of the zonal Walker circulation. However, for El Nino events that co-occur with positive IOD (pIOD) events, Indian Ocean conditions act to counter El Nino's drought-inducing subsidence by enhancing moisture convergence over the Indian subcontinent, with an average monsoon season resulting. Decadal modulations of the frequency of independent and combined El Nino and pIOD events are consistent with a strengthened El Nino-Indian monsoon relationship observed at the start of the 20th century and the apparent recent weakening of the El Nino-Indian monsoon relationship.

  15. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay

    2017-04-24

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.
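
    (Illustrative note, not part of the record above.) The plain MLMC estimator underlying the article writes the expectation at the finest level as a telescoping sum, E[P_L] = E[P_0] + sum over l of E[P_l - P_(l-1)], with each correction term estimated from coupled fine/coarse samples. The sketch below applies this to an illustrative Euler discretization of geometric Brownian motion with a fixed number of samples per level; the optimal sample allocation and the advanced MCMC/SMC couplings discussed in the article are not shown.

```python
import math
import random

def gbm_payoff(rng, level, T=1.0, mu=0.05, sigma=0.2, s0=1.0):
    """Coupled fine/coarse Euler paths of geometric Brownian motion; returns
    (P_fine, P_coarse) for the payoff S_T, sharing the same Brownian increments."""
    n_fine = 2 ** level
    dt = T / n_fine
    s_fine = s_coarse = s0
    dw_pair = 0.0
    for step in range(n_fine):
        dw = rng.gauss(0.0, math.sqrt(dt))
        s_fine += s_fine * (mu * dt + sigma * dw)
        dw_pair += dw
        if level > 0 and step % 2 == 1:          # coarse grid uses every 2nd step
            s_coarse += s_coarse * (mu * 2 * dt + sigma * dw_pair)
            dw_pair = 0.0
    return s_fine, (s_coarse if level > 0 else None)

def mlmc(max_level=5, samples_per_level=20_000, seed=0):
    """Telescoping MLMC estimate of E[S_T]: level 0 plus corrections E[P_l - P_(l-1)]."""
    rng = random.Random(seed)
    total = 0.0
    for level in range(max_level + 1):
        acc = 0.0
        for _ in range(samples_per_level):
            fine, coarse = gbm_payoff(rng, level)
            acc += fine - (coarse if coarse is not None else 0.0)
        total += acc / samples_per_level
    return total

if __name__ == "__main__":
    print(mlmc())  # roughly exp(0.05) ~ 1.051 for these illustrative parameters
```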

  16. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay; Law, Kody; Suciu, Carina

    2017-01-01

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.

  17. Monte Carlo - Advances and Challenges

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.

    2008-01-01

    Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k_eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β_eff, l_eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but bring new challenges to both developers and users: Convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature

  18. 76 FR 42722 - Indian Gaming

    Science.gov (United States)

    2011-07-19

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs... Date: July 19, 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming... INFORMATION: Under section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA), Public Law 100-497, 25 U.S.C...

  19. Deciphering Detailed Plate Kinematics of the Indian Ocean: A Combined Indian-Australian-French Initiative

    Science.gov (United States)

    Vadakkeyakath, Y.; Müller, R.; Dyment, J.; Bhattacharya, G.; Lister, G. S.; Kattoju, K. R.; Whittaker, J.; Shuhail, M.; Gibbons, A.; Jacob, J.; White, L. T.; Bissessur, P. D.; Kiranmai, S.

    2012-12-01

    The Indian Ocean formed as a result of the fragmentation and dispersal of East Gondwanaland since the Jurassic. The deep ocean basins in the Indian Ocean contain the imprints of this plate tectonic history, which is related with several major events such as the Kerguelen, Marion and Reunion hotspot inception and the Indo-Eurasian collision. A broad model for evolution of the Indian Ocean was proposed in the early 1980s. Subsequently, French scientists collected a large amount of magnetic data from the western and southern parts of the Indian Ocean while Indian and Australian scientists collected considerable volumes of magnetic data from the regions of Indian Ocean around their mainlands. Using these data, the Indian, French and Australian researchers independently carried out investigations over different parts of the Indian Ocean and provided improved models of plate kinematics at different sectoral plate boundaries. Under two Indo-French collaborative projects, detailed magnetic investigations were carried out in the Northwestern and Central Indian Ocean by combining the available magnetic data from conjugate regions. Those projects were complemented by additional area-specific studies in the Mascarene, Wharton, Laxmi and Gop basins, which are characterized by extinct spreading regimes. These Indo-French projects provided high resolution and improved plate tectonic models for the evolution of the conjugate Arabian and Eastern Somali basins that constrain the relative motion between the Indian-African (now Indian-Somalian) plate boundaries, and the conjugate Central Indian, Crozet and Madagascar basins that mainly constrain the relative motions of Indian-African (now Capricorn-Somalian) and Indian-Antarctic (now Capricorn-Antarctic) plate boundaries. During the same period, Australian scientists carried out investigations in the southeastern part of the Indian Ocean and provided an improved understanding of the plate tectonic evolution of the Indian

  20. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the

  1. The Living Indian Critical Tradition

    Directory of Open Access Journals (Sweden)

    Vivek Kumar Dwivedi

    2010-11-01

    Full Text Available This paper attempts to establish the identity of something that is often considered to be missing – a living Indian critical tradition. I refer to the tradition that arises out of the work of those Indians who write in English. The chief architects of this tradition are Sri Aurobindo, C.D. Narasimhaiah, Gayatri Chakravorty Spivak and Homi K. Bhabha. It is possible to believe that Indian literary theories derive almost solely from ancient Sanskrit poetics. Or, alternatively, one can be concerned about the sad state of affairs regarding Indian literary theories or criticism in English. There have been scholars who have raised the question of the pathetic state of Indian scholarship in English and have even come up with some positive suggestions. But these scholars are unaware of the living Indian critical tradition. The significance of the Indian critical tradition lies in the fact that it provides the real focus to the Indian critical scene. Without an awareness of this tradition, Indian literary scholarship (which is quite a different thing from Indian literary criticism and theory, as it does not have the same impact as the latter two do) can easily fail to see who the real Indian literary critics and theorists are.

  2. The MC21 Monte Carlo Transport Code

    International Nuclear Information System (INIS)

    Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H

    2007-01-01

    MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or "tool of last resort" and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities.

  3. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    This Ph.D. project describes the development of a workflow for Monte Carlo Treatment Planning for clinical radiotherapy plans. The workflow may be utilized to perform an independent dose verification of treatment plans. Modern radiotherapy treatment delivery is often conducted by dynamically modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery, including gantry rotation during irradiation, of modern radiotherapy. Three cornerstones of Monte Carlo Treatment Planning are identified: building, commissioning and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc) to a Monte Carlo input file (iii). A protocol

  4. Hereditary polymorphic light eruption of American Indians: occurrence in non-Indians with polymorphic light eruption.

    Science.gov (United States)

    Fusaro, R M; Johnson, J A

    1996-04-01

    Hereditary polymorphic light eruption (HPLE) occurs uniquely in the American Indian and Inuit and exhibits autosomal dominant transmission. Because the cutaneous expression of HPLE resembles that of polymorphic light eruption (PLE) and because many non-Indians in the United States have American Indian heritage, some instances of PLE may actually be HPLE. Our purpose was to determine whether non-Indian patients with PLE have characteristics suggestive of HPLE. We surveyed 25 European-Caucasian and 36 African-American patients with PLE in Nebraska for American Indian heritage and photosensitive relatives. Nonphotosensitive subjects (52 Caucasians and 40 African Americans) were surveyed for American Indian heritage. American Indian heritage occurred in 11 Caucasian patients (44%); of those, seven (64%) had photosensitive relatives. Likewise, 29 African Americans (81%) had American Indian heritage; 19 (66%) of those had photosensitive relatives. American Indian heritage occurred in 10 Caucasian control subjects (19%) and in 34 African-American control subjects (85%). If American Indian heritage and a family history of photosensitivity are definitive for HPLE, seven (28%) of our Caucasian patients and 19 (53%) of our African-American patients have HPLE rather than PLE. We urge physicians who suspect PLE in non-Indians to ask about American Indian heritage and photosensitive relatives and to screen their present patients with PLE for such characteristics.

  5. Preparation and Consumer Acceptance of Indian Mango Leather and Osmo-Dehydrated Indian Mango

    Directory of Open Access Journals (Sweden)

    Cyril John A. Domingo

    2017-05-01

    Full Text Available Indian mangoes are considered highly perishable products due to high moisture content, which results in high postharvest losses in Pangasinan, Philippines. This study exploits the potential of the underutilized indian mango for value-added products. The developed indian mango leather and osmo-dehydrated indian mango are dehydrated fruit products that can be eaten as snacks or desserts. Indian mango leather was prepared by mixing fruit puree with other additives such as sugar, citric acid, and sodium metabisulphite and then dehydrating the mixture at 55 °C for 15 hours in a convective oven. Osmo-dehydrated indian mango was prepared by immersing halves of deseeded and deskinned pulps in a 50% (w/w) sucrose solution for 20 hours, followed by drying initially at 50 °C and then, after one hour, at 60 °C for 15 hours. A thirty-three-member untrained panel was involved in the consumer acceptance evaluation. Panelists evaluated the color, sweetness, sourness, texture, and overall acceptability of the osmotically treated indian mango and indian mango leather using a seven-point hedonic scale. Overall, the indian mango leather and osmo-dehydrated indian mango developed in this study appeared to be acceptable for all the sensory parameters, as indicated by high scores of greater than five (>5).

  6. American Indian Men's Perceptions of Breast Cancer Screening for American Indian Women.

    Science.gov (United States)

    Filippi, Melissa K; Pacheco, Joseph; James, Aimee S; Brown, Travis; Ndikum-Moffor, Florence; Choi, Won S; Greiner, K Allen; Daley, Christine M

    2014-01-01

    Screening, especially screening mammography, is vital for decreasing breast cancer incidence and mortality. Screening rates in American Indian women are low compared to other racial/ethnic groups. In addition, American Indian women are diagnosed at more advanced stages and have a lower 5-year survival rate than others. To better address the screening rates of American Indian women, focus groups (N=8) were conducted with American Indian men (N=42) to explore their perceptions of breast cancer screening for American Indian women. Our intent was to understand men's level of support for screening. Using a community-based participatory approach, focus groups were audio-taped, transcribed verbatim, and analyzed using a text analysis approach developed by our team. Topics discussed included breast cancer and screening knowledge, barriers to screening, and suggestions to improve screening rates. These findings can guide strategies to improve knowledge and awareness, communication among families and health care providers, and screening rates in American Indian communities.

  7. Monte carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate the soot dynamics. Detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.

  8. 76 FR 49505 - Indian Gaming

    Science.gov (United States)

    2011-08-10

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This publishes..., Director, Office of Indian Gaming, Office of the Deputy Assistant Secretary--Policy and Economic...

  9. 75 FR 38833 - Indian Gaming

    Science.gov (United States)

    2010-07-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes... Date: July 6, 2010. FOR FURTHER INFORMATION CONTACT: Paula Hart, Director, Office of Indian Gaming...

  10. 77 FR 76514 - Indian Gaming

    Science.gov (United States)

    2012-12-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact taking effect. SUMMARY: This... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  11. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. BEDARTHA GOSWAMI. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short ...

  12. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros; Jasra, Ajay; Law, Kody; Tempone, Raul; Zhou, Yan

    2016-01-01

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. That is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
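
    The telescoping identity mentioned in the abstract can be made concrete with a small sketch. The code below is illustrative only and is not the samplers of this paper: a toy "level" integrand with an O(h) bias stands in for a PDE solve, and all function names, step sizes and sample counts are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x, h):
    """Toy level-h approximation of a quantity of interest.
    Stands in for a PDE solve with step size h (bias ~ h/2); purely illustrative."""
    return np.exp(x) + h * x

def mlmc_estimate(step_sizes, samples_per_level):
    """Telescoping MLMC estimator:
    E[g_L] ~ E[g_0] + sum_l E[g_l - g_{l-1}], with coupled draws on each level."""
    total = 0.0
    for level, (h, n) in enumerate(zip(step_sizes, samples_per_level)):
        x = rng.random(n)                       # same draws couple the two levels
        if level == 0:
            total += np.mean(g(x, h))
        else:
            total += np.mean(g(x, h) - g(x, step_sizes[level - 1]))
    return total

# geometric hierarchy of step sizes  h_0 > h_1 > ... > h_L
steps = [0.5 / 2**l for l in range(4)]
print(mlmc_estimate(steps, samples_per_level=[40000, 10000, 2500, 625]))
# close to (e - 1) + h_L / 2, i.e. about 1.75 for this toy integrand
```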

  13. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros

    2016-08-29

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. That is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.

  14. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical Monte Carlo methods for unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced transitions Monte Carlo method, direct statistical estimation Monte Carlo and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest calculational efficiency
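
    The general contrast between an analog (direct) estimate and a weighted estimate of a small failure probability can be illustrated with the hedged sketch below. These are not the estimators constructed in the paper; the series system, the biasing probability q and the likelihood-ratio weights are assumptions chosen only to show why weighting helps for highly reliable systems.

```python
import numpy as np

rng = np.random.default_rng(1)
p_fail = 1e-4          # rare per-component failure probability (assumed)
n_comp = 3             # toy series system: fails if any component fails
n_hist = 100_000

# Analog (direct) Monte Carlo: simulate each history as-is.
u = rng.random((n_hist, n_comp))
analog = np.mean((u < p_fail).any(axis=1))

# Weighted Monte Carlo: sample failures from a biased probability q and
# correct each history with a likelihood-ratio weight.
q = 0.05
v = rng.random((n_hist, n_comp))
fail = v < q
w = np.where(fail, p_fail / q, (1 - p_fail) / (1 - q)).prod(axis=1)
weighted = np.mean(fail.any(axis=1) * w)

print(f"analog estimate   {analog:.2e}")
print(f"weighted estimate {weighted:.2e}  (exact {1 - (1 - p_fail)**n_comp:.2e})")
```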

  15. Applications of Monte Carlo method in Medical Physics

    International Nuclear Information System (INIS)

    Diez Rios, A.; Labajos, M.

    1989-01-01

    The basic ideas of Monte Carlo techniques are presented. Random numbers and their generation by congruential methods, which underlie Monte Carlo calculations, are shown. Monte Carlo techniques to solve integrals are discussed. The evaluation of a simple one-dimensional integral with a known answer by means of two different Monte Carlo approaches is discussed. The basic principles of simulating photon histories on a computer and reducing variance, and the current applications in Medical Physics, are commented on. (Author)
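
    The abstract does not name the two integration approaches; a common pair is the sample-mean (crude) estimator and the hit-or-miss estimator, sketched below for a one-dimensional integral with a known answer. The integrand and sample size are assumptions made for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
f = lambda x: x**2          # known answer on [0, 1]: integral = 1/3

# Approach 1: sample-mean (crude) Monte Carlo.
x = rng.random(n)
sample_mean = np.mean(f(x))

# Approach 2: hit-or-miss Monte Carlo over the bounding box [0,1] x [0,1].
x, y = rng.random(n), rng.random(n)
hit_or_miss = np.mean(y < f(x))

print(sample_mean, hit_or_miss)   # both close to 0.3333
```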

  16. Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method

    CERN Document Server

    2002-01-01

    This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and cross section libraries used in continuous energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of a fast critical assembly, core analyses of JMTR, simulation of a pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.

  17. Experience with the Monte Carlo Method

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, E M.A. [Department of Mechanical Engineering University of New Brunswick, Fredericton, N.B., (Canada)

    2007-06-15

    Monte Carlo simulation of radiation transport provides a powerful research and design tool that resembles in many aspects laboratory experiments. Moreover, Monte Carlo simulations can provide an insight not attainable in the laboratory. However, the Monte Carlo method has its limitations, which if not taken into account can result in misleading conclusions. This paper will present the experience of this author, over almost three decades, in the use of the Monte Carlo method for a variety of applications. Examples will be shown on how the method was used to explore new ideas, as a parametric study and design optimization tool, and to analyze experimental data. The consequences of not accounting in detail for detector response and the scattering of radiation by surrounding structures are two of the examples that will be presented to demonstrate the pitfall of condensed.

  18. Experience with the Monte Carlo Method

    International Nuclear Information System (INIS)

    Hussein, E.M.A.

    2007-01-01

    Monte Carlo simulation of radiation transport provides a powerful research and design tool that resembles in many aspects laboratory experiments. Moreover, Monte Carlo simulations can provide an insight not attainable in the laboratory. However, the Monte Carlo method has its limitations, which if not taken into account can result in misleading conclusions. This paper will present the experience of this author, over almost three decades, in the use of the Monte Carlo method for a variety of applications. Examples will be shown on how the method was used to explore new ideas, as a parametric study and design optimization tool, and to analyze experimental data. The consequences of not accounting in detail for detector response and the scattering of radiation by surrounding structures are two of the examples that will be presented to demonstrate the pitfall of condensed

  19. 77 FR 76513 - Indian Gaming

    Science.gov (United States)

    2012-12-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Amended Tribal-State Class III Gaming Compact taking effect. SUMMARY..., 2012. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  20. Monte Carlo alpha calculation

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, D.; Soran, P.; Whalen, P.

    1985-01-01

    A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n·e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
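
    Only the "direct" idea described above - reading α off the logarithmic derivative of the neutron population - is easy to sketch; the α-regression on k-eigenvalue solutions used in the paper is not reproduced here. The synthetic noisy tally below is an assumption standing in for a Monte Carlo time-bin count.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true = 0.2                  # assumed exponential growth constant
t = np.linspace(0.0, 10.0, 50)

# Stand-in for a time-dependent Monte Carlo tally of the neutron population:
# exponential growth plus statistical noise on each time-bin count.
n_t = 1000.0 * np.exp(alpha_true * t) * rng.normal(1.0, 0.05, t.size)

# "Direct" estimate: the logarithmic derivative of the population, obtained
# here as the least-squares slope of ln N(t) versus t.
alpha_hat = np.polyfit(t, np.log(n_t), 1)[0]
print(f"alpha estimate {alpha_hat:.3f} (true {alpha_true})")
```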

  1. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Aestrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind and some of the basic principles of the McStas software will be discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)

  2. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. SERGEY P KUZNETSOV. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 117-132 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Chaos in three coupled rotators: ...

  3. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. PRIYANKA SHUKLA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 133-143 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Grad-type fourteen-moment theory for ...

  4. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. F FAMILY. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 221-224 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Transport in ratchets with single-file constraint.

  5. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. GIOVANNA ZIMATORE. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 35-41 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. RQA correlations on real business cycles ...

  6. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. SUDHARSANA V IYENGAR. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 93-99 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Missing cycles: Effect of climate ...

  7. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. NORBERT MARWAN. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short time ...

  8. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. JANAKI BALAKRISHNAN. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 93-99 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Missing cycles: Effect of climate change ...

  9. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. PAUL SCHULTZ. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short time ...

  10. Depreciation of the Indian Currency: Implications for the Indian Economy.

    OpenAIRE

    Sumanjeet Singh

    2009-01-01

    The Indian currency has depreciated by more than 20 per cent since April 2008 and breached its crucial 50-level against the greenback on sustained dollar purchases by foreign banks and a stronger dollar overseas. The fall in the value of the Indian rupee has several consequences which could have mixed effects on the Indian economy. But, mainly, there are four expected implications of the falling rupee. First, it should boost exports; second, it will lead to higher cost of imported goods and make some of th...

  11. 76 FR 165 - Indian Gaming

    Science.gov (United States)

    2011-01-03

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs... Wisconsin Gaming Compact of 1992, as Amended in 1999, 2000, and 2003. DATES: Effective Date: January 3, 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  12. 75 FR 68618 - Indian Gaming

    Science.gov (United States)

    2010-11-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs... of Wisconsin Gaming Compact of 1991, as Amended in 1999 and 2003. DATES: Effective Date: November 8, 2010. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  13. Linear filtering applied to Monte Carlo criticality calculations

    International Nuclear Information System (INIS)

    Morrison, G.W.; Pike, D.H.; Petrie, L.M.

    1975-01-01

    A significant improvement in the acceleration of the convergence of the eigenvalue computed by Monte Carlo techniques has been developed by applying linear filtering theory to Monte Carlo calculations for multiplying systems. A Kalman filter was applied to a KENO Monte Carlo calculation of an experimental critical system consisting of eight interacting units of fissile material. A comparison of the filter estimate and the Monte Carlo realization was made. The Kalman filter converged in five iterations to 0.9977. After 95 iterations, the average k-eff from the Monte Carlo calculation was 0.9981. This demonstrates that the Kalman filter has the potential of reducing the calculational effort of multiplying systems. Other examples and results are discussed
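
    A toy version of the idea - filtering a sequence of noisy per-cycle k-eff estimates with a scalar Kalman filter - is sketched below. The synthetic cycle data, the constant-state model and the measurement variance are all illustrative assumptions, not the KENO coupling used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
k_true = 0.9980
cycles = rng.normal(k_true, 0.003, size=100)   # noisy per-cycle k-eff estimates

# Scalar Kalman filter with a static state (k-eff assumed constant):
# the predict step is trivial, the update blends the running estimate
# with each new cycle value according to the Kalman gain.
k_est, p_est = cycles[0], 1.0      # initial state estimate and its variance
r = 0.003**2                       # assumed per-cycle measurement variance
for z in cycles[1:]:
    gain = p_est / (p_est + r)
    k_est = k_est + gain * (z - k_est)
    p_est = (1.0 - gain) * p_est

print(f"filtered k-eff {k_est:.4f}, plain cycle average {cycles.mean():.4f}")
```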

  14. Burnup calculations using Monte Carlo method

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Degweker, S.B.

    2009-01-01

    In recent years, interest in burnup calculations using Monte Carlo methods has gained momentum. Previous burnup codes have used multigroup transport theory based calculations followed by diffusion theory based core calculations for the neutronic portion of the codes. The transport theory methods invariably make approximations with regard to treatment of the energy and angle variables involved in scattering, besides approximations related to geometry simplification. Cell homogenisation to produce diffusion theory parameters adds to these approximations. Moreover, while diffusion theory works for most reactors, it does not produce accurate results in systems that have strong gradients, strong absorbers or large voids. Also, diffusion theory codes are geometry limited (rectangular, hexagonal, cylindrical, and spherical coordinates). Monte Carlo methods are ideal to solve very heterogeneous reactors and/or lattices/assemblies in which considerable burnable poisons are used. The key feature of this approach is that Monte Carlo methods permit essentially 'exact' modeling of all geometrical detail, without resort to energy and spatial homogenization of neutron cross sections. The Monte Carlo method would also be better for Accelerator Driven Systems (ADS), which could have strong gradients due to the external source and a sub-critical assembly. To meet the demand for an accurate burnup code, we have developed a Monte Carlo burnup calculation code system in which a Monte Carlo neutron transport code is coupled with a versatile code (McBurn) for calculating the buildup and decay of nuclides in nuclear materials. McBurn is developed from scratch by the authors. In this article we will discuss our effort in developing the continuous energy Monte Carlo burn-up code, McBurn. McBurn is intended for entire reactor cores as well as for unit cells and assemblies. Generally, McBurn can do burnup of any geometrical system which can be handled by the underlying Monte Carlo transport code
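
    The generic alternation such codes implement - a transport solve with the current composition followed by a depletion step - can be sketched as below. This is not McBurn; the one-group stub "transport" solver, the single-nuclide depletion and all constants are assumptions made only to show the shape of the coupling loop.

```python
import numpy as np

def transport_solve(n_fuel):
    """Stub for the Monte Carlo transport step: a one-group flux that scales
    with the fissile density. A real code would run a full transport solve."""
    return 1.0e14 * n_fuel / 1.0e22           # n/cm^2/s, illustrative only

sigma_a = 400e-24                              # one-group absorption cross section, cm^2
n_fuel = 1.0e22                                # fissile atom density, atoms/cm^3
dt = 86400.0                                   # burn step of one day, in seconds

for step in range(30):
    phi = transport_solve(n_fuel)              # (i) transport with current composition
    n_fuel *= np.exp(-sigma_a * phi * dt)      # (ii) deplete over the step
print(f"fissile density after 30 days: {n_fuel:.3e} atoms/cm^3")
```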

  15. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. F REVUELTA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 145-155 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Rate calculation in two-dimensional barriers with ...

  16. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. JOYDEEP SINGHA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 195-203 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Spatial splay states in coupled map lattices ...

  17. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Home; Journals; Indian Academy of Sciences Conference Series. MURILO S BAPTISTA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 17-23 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Interpreting physical flows in networks as a ...

  18. Monte Carlo simulations for plasma physics

    International Nuclear Information System (INIS)

    Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X.

    2000-07-01

    Plasma behaviours are very complicated and the analyses are generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral particle injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast particle transport associated with heating and the generation of the radial electric field. Further, it is applied to investigating neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)

  19. La justicia en venta. El beneficio de cargos americanos de audiencia bajo Carlos II (1683-1700

    Directory of Open Access Journals (Sweden)

    Sanz Tapia, Ángel

    2012-06-01

    Full Text Available The article deals with the provision and «beneficio» (sale) of Justice appointments to Spanish American Audiencias (Higher Courts) in the 17th century, modifying and completing previous research. The sale process of posts in American Audiencias (Crown attorneys, civil and criminal judges, and attorneys for Indian defence) is studied, beginning with the antecedents in the reign of Felipe IV and the mercantile start in 1683, up to the final years of Carlos II. Together with the Crown policy in this respect, a yearly analysis is offered of the posts provided and the beneficiaries, the amounts paid and the post holders, along with some social aspects of the buyers (origin, status and professional experience).

    The article treats the provision and «beneficio» (sale) of the Justice posts of the Spanish American audiencias during the 17th century, modifying and completing earlier research. It studies the sale process of the audiencia seats in the Indies (Crown attorneys, judges, criminal judges and attorneys-protector of Indians) from its antecedents under Felipe IV and its mercantile beginning in 1683 to the final years of Carlos II. Alongside the Crown's approach, a complete year-by-year analysis is offered of the seats provided and sold, the amounts paid and the persons appointed, together with some social aspects of the buyers (origin, status, experience).

  20. Monte Carlo simulation of calibration of shadow shield scanning bed whole body monitor using different size BOMAB phantoms

    International Nuclear Information System (INIS)

    Bhati, S.; Patni, H.K.; Singh, I.S.; Garg, S.P.

    2005-01-01

    A shadow shield scanning bed whole body monitor incorporating a (102 mm dia x 76 mm thick) NaI(Tl) detector is employed for assessment of high-energy photon emitters at BARC. The monitor is calibrated using a Reference BOMAB phantom representative of an average Indian radiation worker. However, to account for the variation in the physique of workers, the system needs to be calibrated with BOMAB phantoms of different sizes, which is both difficult and expensive. Therefore, a theoretical approach based on Monte Carlo techniques has been employed to calibrate the system with BOMAB phantoms of different sizes for several radionuclides of interest. A computer program developed for this purpose simulates the scanning geometry of the whole body monitor and computes detection efficiencies for the BARC Reference phantom (63 kg/168 cm), the ICRP Reference phantom (70 kg/170 cm) and several of its scaled versions covering a wide range of body builds. The detection efficiencies computed for different photon energies for the BARC Reference phantom were found to be in very good agreement with experimental data, thus validating the Monte Carlo scheme used in the computer code. The results from this study could be used for assessment of internal contamination due to high-energy photon emitters for radiation workers of different physiques. (author)

  1. Indian Summer

    Energy Technology Data Exchange (ETDEWEB)

    Galindo, E. [Sho-Ban High School, Fort Hall, ID (United States)

    1997-08-01

    This paper focuses on preserving and strengthening two resources culturally and socially important to the Shoshone-Bannock Indian Tribe on the Fort Hall Reservation in Idaho; their young people and the Pacific-Northwest Salmon. After learning that salmon were not returning in significant numbers to ancestral fishing waters at headwater spawning sites, tribal youth wanted to know why. As a result, the Indian Summer project was conceived to give Shoshone-Bannock High School students the opportunity to develop hands-on, workable solutions to improve future Indian fishing and help make the river healthy again. The project goals were to increase the number of fry introduced into the streams, teach the Shoshone-Bannock students how to use scientific methodologies, and get students, parents, community members, and Indian and non-Indian mentors excited about learning. The students chose an egg incubation experiment to help increase self-sustaining, natural production of steelhead trout, and formulated and carried out a three step plan to increase the hatch-rate of steelhead trout in Idaho waters. With the help of local companies, governmental agencies, scientists, and mentors students have been able to meet their project goals, and at the same time, have learned how to use scientific methods to solve real life problems, how to return what they have used to the water and land, and how to have fun and enjoy life while learning.

  2. Vegetation burn severity mapping using Landsat-8 and WorldView-2

    Science.gov (United States)

    Wu, Zhuoting; Middleton, Barry R.; Hetzler, Robert; Vogel, John M.; Dye, Dennis G.

    2015-01-01

    We used remotely sensed data from the Landsat-8 and WorldView-2 satellites to estimate vegetation burn severity of the Creek Fire on the San Carlos Apache Reservation, where wildfire occurrences affect the Tribe's crucial livestock and logging industries. Accurate pre- and post-fire canopy maps at high (0.5-meter) resolution were created from WorldView-2 data to generate canopy loss maps, and multiple indices from pre- and post-fire Landsat-8 images were used to evaluate vegetation burn severity. The normalized difference vegetation index based vegetation burn severity map had the highest correlation coefficients with the canopy loss map from WorldView-2. Two distinct approaches - canopy loss mapping from WorldView-2 and spectral index differencing from Landsat-8 - agreed well with the field-based burn severity estimates and are both effective for vegetation burn severity mapping. Canopy loss maps created with WorldView-2 imagery add to a short list of accurate vegetation burn severity mapping techniques that can help guide effective management of forest resources on the San Carlos Apache Reservation, and the broader fire-prone regions of the Southwest.

  3. Monte Carlo approaches to light nuclei

    International Nuclear Information System (INIS)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs

  4. Monte Carlo approaches to light nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.

  5. Monte Carlo methods and models in finance and insurance

    CERN Document Server

    Korn, Ralf; Kroisandt, Gerald

    2010-01-01

    Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of...

  6. Indian Legends.

    Science.gov (United States)

    Gurnoe, Katherine J.; Skjervold, Christian, Ed.

    Presenting American Indian legends, this material provides insight into the cultural background of the Dakota, Ojibwa, and Winnebago people. Written in a straightforward manner, each of the eight legends is associated with an Indian group. The legends included here are titled as follows: Minnesota is Minabozho's Land (Ojibwa); How We Got the…

  7. 77 FR 47868 - Indian Entities Recognized and Eligible To Receive Services From the Bureau of Indian Affairs

    Science.gov (United States)

    2012-08-10

    ... Indian Colony of California) Buena Vista Rancheria of Me-Wuk Indians of California Burns Paiute Tribe... of Idaho La Jolla Band of Luiseno Indians, California (previously listed as the La Jolla Band of Luiseno Mission Indians of the La Jolla Reservation) La Posta Band of Diegueno Mission Indians of the La...

  8. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition over more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  9. Lecture 1. Monte Carlo basics. Lecture 2. Adjoint Monte Carlo. Lecture 3. Coupled Forward-Adjoint calculations

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J.E. [Delft University of Technology, Interfaculty Reactor Institute, Delft (Netherlands)

    2000-07-01

    The Monte Carlo method is a statistical method to solve mathematical and physical problems using random numbers. The principle of the methods will be demonstrated for a simple mathematical problem and for neutron transport. Various types of estimators will be discussed, as well as generally applied variance reduction methods like splitting, Russian roulette and importance biasing. The theoretical formulation for solving eigenvalue problems for multiplying systems will be shown. Some reflections will be given about the applicability of the Monte Carlo method, its limitations and its future prospects for reactor physics calculations. Adjoint Monte Carlo is a Monte Carlo game to solve the adjoint neutron (or photon) transport equation. The adjoint transport equation can be interpreted in terms of simulating histories of artificial particles, which show properties of neutrons that move backwards in history. These particles will start their history at the detector from which the response must be estimated and give a contribution to the estimated quantity when they hit or pass through the neutron source. Application to the multigroup transport formulation will be demonstrated. Possible implementation for the continuous energy case will be outlined. The inherent advantages and disadvantages of the method will be discussed. The Midway Monte Carlo method will be presented for calculating a detector response due to a (neutron or photon) source. A derivation will be given of the basic formula for the Midway Monte Carlo method. The black absorber technique, allowing for a cutoff of particle histories when reaching the midway surface in one of the calculations, will be derived. An extension of the theory to coupled neutron-photon problems is given. The method will be demonstrated for an oil well logging problem, comprising a neutron source in a borehole and photon detectors to register the photons generated by inelastic neutron scattering. (author)
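
    Of the variance reduction tools listed in the lecture outline (splitting, Russian roulette, importance biasing), the first two act directly on particle weights and are easy to sketch. The thresholds and survival probability below are illustrative assumptions, not values taken from the lectures.

```python
import numpy as np

rng = np.random.default_rng(5)

def roulette_and_split(weight, w_low=0.2, w_high=2.0, survival=0.5):
    """Weight-window style variance reduction on a single particle weight:
    low-weight particles play Russian roulette, high-weight particles split."""
    if weight < w_low:
        if rng.random() < survival:
            return [weight / survival]        # survivor carries increased weight
        return []                             # particle killed, weight conserved on average
    if weight > w_high:
        n = int(np.ceil(weight / w_high))
        return [weight / n] * n               # split into n lighter copies
    return [weight]                           # weight in window: leave unchanged

print(roulette_and_split(0.05), roulette_and_split(5.0), roulette_and_split(1.0))
```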

  10. Lecture 1. Monte Carlo basics. Lecture 2. Adjoint Monte Carlo. Lecture 3. Coupled Forward-Adjoint calculations

    International Nuclear Information System (INIS)

    Hoogenboom, J.E.

    2000-01-01

    The Monte Carlo method is a statistical method to solve mathematical and physical problems using random numbers. The principle of the methods will be demonstrated for a simple mathematical problem and for neutron transport. Various types of estimators will be discussed, as well as generally applied variance reduction methods like splitting, Russian roulette and importance biasing. The theoretical formulation for solving eigenvalue problems for multiplying systems will be shown. Some reflections will be given about the applicability of the Monte Carlo method, its limitations and its future prospects for reactor physics calculations. Adjoint Monte Carlo is a Monte Carlo game to solve the adjoint neutron (or photon) transport equation. The adjoint transport equation can be interpreted in terms of simulating histories of artificial particles, which show properties of neutrons that move backwards in history. These particles will start their history at the detector from which the response must be estimated and give a contribution to the estimated quantity when they hit or pass through the neutron source. Application to the multigroup transport formulation will be demonstrated. Possible implementation for the continuous energy case will be outlined. The inherent advantages and disadvantages of the method will be discussed. The Midway Monte Carlo method will be presented for calculating a detector response due to a (neutron or photon) source. A derivation will be given of the basic formula for the Midway Monte Carlo method. The black absorber technique, allowing for a cutoff of particle histories when reaching the midway surface in one of the calculations, will be derived. An extension of the theory to coupled neutron-photon problems is given. The method will be demonstrated for an oil well logging problem, comprising a neutron source in a borehole and photon detectors to register the photons generated by inelastic neutron scattering. (author)

  11. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Univ. of New Mexico, Albuquerque, NM

    2016-01-01

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations

  12. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.

    2016-11-29

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations

  13. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...

  14. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency, will speak about his work with Carlo Rubbia. Finally, Hans Joachim Schellnhuber of the Potsdam Institute for Climate Research and Sven Kul...

  15. Monte Carlo Transport for Electron Thermal Transport

    Science.gov (United States)

    Chenhall, Jeffrey; Cao, Duc; Moses, Gregory

    2015-11-01

    The iSNB (implicit Schurtz-Nicolai-Busquet) multigroup electron thermal transport method of Cao et al. is adapted into a Monte Carlo transport method in order to better model the effects of non-local behavior. The end goal is a hybrid transport-diffusion method that combines Monte Carlo transport with discrete diffusion Monte Carlo (DDMC). The hybrid method will combine the efficiency of a diffusion method in short mean free path regions with the accuracy of a transport method in long mean free path regions. The Monte Carlo nature of the approach allows the algorithm to be massively parallelized. Work to date on the method will be presented. This work was supported by Sandia National Laboratory - Albuquerque and the University of Rochester Laboratory for Laser Energetics.

  16. Generalized hybrid Monte Carlo - CMFD methods for fission source convergence

    International Nuclear Information System (INIS)

    Wolters, Emily R.; Larsen, Edward W.; Martin, William R.

    2011-01-01

    In this paper, we generalize the recently published 'CMFD-Accelerated Monte Carlo' method and present two new methods that reduce the statistical error in CMFD-Accelerated Monte Carlo. The CMFD-Accelerated Monte Carlo method uses Monte Carlo to estimate nonlinear functionals used in low-order CMFD equations for the eigenfunction and eigenvalue. The Monte Carlo fission source is then modified to match the resulting CMFD fission source in a 'feedback' procedure. The two proposed methods differ from CMFD-Accelerated Monte Carlo in the definition of the required nonlinear functionals, but they have identical CMFD equations. The proposed methods are compared with CMFD-Accelerated Monte Carlo on a high dominance ratio test problem. All hybrid methods converge the Monte Carlo fission source almost immediately, leading to a large reduction in the number of inactive cycles required. The proposed methods stabilize the fission source more efficiently than CMFD-Accelerated Monte Carlo, leading to a reduction in the number of active cycles required. Finally, as in CMFD-Accelerated Monte Carlo, the apparent variance of the eigenfunction is approximately equal to the real variance, so the real error is well-estimated from a single calculation. This is an advantage over standard Monte Carlo, in which the real error can be underestimated due to inter-cycle correlation. (author)

  17. Is Monte Carlo embarrassingly parallel?

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J. E. [Delft Univ. of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Delft Nuclear Consultancy, IJsselzoom 2, 2902 LB Capelle aan den IJssel (Netherlands)

    2012-07-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
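
    The rendez-vous the authors describe can be mimicked in a toy setting: each cycle, every worker must report its tally before the combined result can be formed. The sketch below uses Python multiprocessing rather than MPI and estimates π instead of a fission source, so it illustrates only the per-cycle synchronization pattern, not the reactor calculation itself.

```python
import numpy as np
from multiprocessing import Pool

def run_batch(args):
    """One worker's share of histories for a single cycle (here: pi estimation)."""
    seed, n_hist = args
    rng = np.random.default_rng(seed)
    x, y = rng.random(n_hist), rng.random(n_hist)
    return np.count_nonzero(x * x + y * y < 1.0)

if __name__ == "__main__":
    n_proc, n_cycles, n_hist = 4, 10, 100_000
    estimates = []
    with Pool(n_proc) as pool:
        for cycle in range(n_cycles):
            seeds = [(cycle * n_proc + rank, n_hist) for rank in range(n_proc)]
            # The map call is the rendez-vous point: every cycle all workers
            # must finish and report before the combined tally is formed.
            hits = pool.map(run_batch, seeds)
            estimates.append(4.0 * sum(hits) / (n_proc * n_hist))
    print(np.mean(estimates))
```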

  18. Is Monte Carlo embarrassingly parallel?

    International Nuclear Information System (INIS)

    Hoogenboom, J. E.

    2012-01-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)

  19. [Indian workers in Oman].

    Science.gov (United States)

    Longuenesse, E

    1985-01-01

    Until recently Oman was a country of emigration, but by 1980 an estimated 200,000 foreign workers were in the country due to the petroleum boom. Almost 1/3 of the estimated 300,000 Indian workers in the Gulf states were in Oman, a country whose colonial heritage was closely tied to that of India and many of whose inhabitants still speak Urdu. The number of work permits granted to Indians working in the private sector in Oman increased from 47,928 in 1976 to 80,787 in 1980. An estimated 110,000 Indians were working in Oman in 1982, the great majority in the construction and public works sector. A few hundred Indian women were employed by the government of Oman, as domestics, or in other capacities. No accurate data is available on the qualifications of Indian workers in Oman, but a 1979 survey suggested a relatively low illiteracy rate among them. 60-75% of Indians in Oman are from the state of Kerala, followed by workers from the Punjab and the southern states of Tamil Nadu and Andhra Pradesh and Bombay. Indian workers are recruited by specialized agencies or by friends or relatives already employed in Oman. Employers in Oman prefer to recruit through agencies because the preselection process minimizes hiring of workers unqualified for their posts. Officially, expenses of transportation, visas, and other needs are shared by the worker and the employer, but the demand for jobs is so strong that the workers are obliged to pay commissions which amount to considerable sums for stable and well paying jobs. Wages in Oman are however 2 to 5 times the level in India. Numerous abuses have been reported in recruitment practices and in failure of employers in Oman to pay the promised wages, but Indian workers have little recourse. At the same level of qualifications, Indians are paid less than non-Omani Arabs, who in turn receive less than Omani nationals. Indians who remain in Oman long enough nevertheless are able to support families at home and to accumulate considerable

  20. Indian Ocean and Indian summer monsoon: relationships without ENSO in ocean-atmosphere coupled simulations

    Science.gov (United States)

    Crétat, Julien; Terray, Pascal; Masson, Sébastien; Sooraj, K. P.; Roxy, Mathew Koll

    2017-08-01

    The relationship between the Indian Ocean and the Indian summer monsoon (ISM) and their respective influence over the Indo-Western North Pacific (WNP) region are examined in the absence of El Niño Southern Oscillation (ENSO) in two partially decoupled global experiments. ENSO is removed by nudging the tropical Pacific simulated sea surface temperature (SST) toward SST climatology from either observations or a fully coupled control run. The control reasonably captures the observed relationships between ENSO, ISM and the Indian Ocean Dipole (IOD). Despite weaker amplitude, IODs do exist in the absence of ENSO and are triggered by a boreal spring ocean-atmosphere coupled mode over the South-East Indian Ocean similar to that found in the presence of ENSO. These pure IODs significantly affect the tropical Indian Ocean throughout boreal summer, inducing a significant modulation of both the local Walker and Hadley cells. This meridional circulation is masked in the presence of ENSO. However, these pure IODs do not significantly influence the Indian subcontinent rainfall despite overestimated SST variability in the eastern equatorial Indian Ocean compared to observations. On the other hand, they promote a late summer cross-equatorial quadrupole rainfall pattern linking the tropical Indian Ocean with the WNP, inducing important zonal shifts of the Walker circulation despite the absence of ENSO. Surprisingly, the interannual ISM rainfall variability is barely modified and the Indian Ocean does not force the monsoon circulation when ENSO is removed. On the contrary, the monsoon circulation significantly forces the Arabian Sea and Bay of Bengal SSTs, while its connection with the western tropical Indian Ocean is clearly driven by ENSO in our numerical framework. Convection and diabatic heating associated with above-normal ISM induce a strong response over the WNP, even in the absence of ENSO, favoring moisture convergence over India.

  1. Molecular phylogeny of Hemidactylus geckos (Squamata: Gekkonidae) of the Indian subcontinent reveals a unique Indian radiation and an Indian origin of Asian house geckos.

    Science.gov (United States)

    Bansal, Rohini; Karanth, K Praveen

    2010-10-01

    Represented by approximately 85 species, Hemidactylus is one of the most diverse and widely distributed genera of reptiles in the world. In the Indian subcontinent, this genus is represented by 28 species, of which at least 13 are endemic to this region. Here, we report the phylogeny of the Indian Hemidactylus geckos based on mitochondrial and nuclear DNA markers sequenced from multiple individuals of widely distributed as well as endemic congeners of India. Results indicate that a majority of the species distributed in India form a distinct clade whose members are largely confined to the Indian subcontinent, thus representing a unique Indian radiation. The remaining Hemidactylus geckos of India belong to two other geographical clades representing the Southeast Asian and West-Asian arid zone species. Additionally, the three widely distributed, commensal species (H. brookii, H. frenatus and H. flaviviridis) are nested within the Indian radiation, suggesting their Indian origin. Dispersal-vicariance analysis also supports their Indian origin and subsequent dispersal out of India into the West-Asian arid zone and Southeast Asia. Thus, the Indian subcontinent has served as an important arena for diversification amongst the Hemidactylus geckos and in the evolution and spread of its commensal geckos. Copyright 2010 Elsevier Inc. All rights reserved.

  2. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko

  3. Variational Variance Reduction for Monte Carlo Criticality Calculations

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Larsen, Edward W.

    2001-01-01

    A new variational variance reduction (VVR) method for Monte Carlo criticality calculations was developed. This method employs (a) a variational functional that is more accurate than the standard direct functional, (b) a representation of the deterministically obtained adjoint flux that is especially accurate for optically thick problems with high scattering ratios, and (c) estimates of the forward flux obtained by Monte Carlo. The VVR method requires no nonanalog Monte Carlo biasing, but it may be used in conjunction with Monte Carlo biasing schemes. Some results are presented from a class of criticality calculations involving alternating arrays of fuel and moderator regions

  4. Monte Carlo Solutions for Blind Phase Noise Estimation

    Directory of Open Access Journals (Sweden)

    Çırpan Hakan

    2009-01-01

    This paper investigates the use of Monte Carlo sampling methods for phase noise estimation on additive white Gaussian noise (AWGN) channels. The main contributions of the paper are (i) the development of a Monte Carlo framework for phase noise estimation, with special attention to sequential importance sampling and Rao-Blackwellization, (ii) the interpretation of existing Monte Carlo solutions within this generic framework, and (iii) the derivation of a novel phase noise estimator. Contrary to the ad hoc phase noise estimators that have been proposed in the past, the estimators considered in this paper are derived from solid probabilistic and performance-determining arguments. Computer simulations demonstrate that, on one hand, the Monte Carlo phase noise estimators outperform the existing estimators and, on the other hand, our newly proposed solution exhibits a lower complexity than the existing Monte Carlo solutions.
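
    As a deliberately simplified illustration of the sequential importance sampling framework mentioned above, the sketch below runs a bootstrap particle filter for Wiener phase noise on an AWGN channel with known pilot symbols. The model, parameters and resampling rule are illustrative choices, not the estimators derived in the paper.

        # Bootstrap SIS particle filter for a random-walk (Wiener) phase on AWGN.
        # Toy model with known QPSK pilots; all parameters are invented.
        import numpy as np

        rng = np.random.default_rng(0)
        N, P = 200, 500                    # symbols, particles
        sigma_w, sigma_n = 0.05, 0.3       # phase-noise and channel-noise std devs

        symbols = rng.choice(np.array([1, -1, 1j, -1j], dtype=complex), size=N)
        true_phase = np.cumsum(rng.normal(0.0, sigma_w, N))        # Wiener phase process
        noise = (rng.normal(0, sigma_n / np.sqrt(2), N)
                 + 1j * rng.normal(0, sigma_n / np.sqrt(2), N))
        r = symbols * np.exp(1j * true_phase) + noise              # received samples

        particles = np.zeros(P)            # phase hypotheses
        weights = np.full(P, 1.0 / P)
        estimates = np.empty(N)

        for k in range(N):
            particles += rng.normal(0.0, sigma_w, P)               # propagate via the prior
            lik = np.exp(-np.abs(r[k] - symbols[k] * np.exp(1j * particles)) ** 2 / sigma_n ** 2)
            weights *= lik
            weights /= weights.sum()
            # Weighted circular mean as the phase estimate at time k.
            estimates[k] = np.angle(np.sum(weights * np.exp(1j * particles)))
            if 1.0 / np.sum(weights ** 2) < P / 2:                 # resample when ESS drops
                idx = rng.choice(P, size=P, p=weights)
                particles, weights = particles[idx], np.full(P, 1.0 / P)

        err = np.angle(np.exp(1j * (estimates - true_phase)))      # wrapped phase error
        print("RMS phase error [rad]:", np.sqrt(np.mean(err ** 2)))

    A Rao-Blackwellized variant would integrate out any linear-Gaussian part of the state analytically and only sample the rest, which typically reduces the variance further; that refinement is omitted here for brevity.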

  5. Indian refining industry

    International Nuclear Information System (INIS)

    Singh, I.J.

    2002-01-01

    The author discusses the history of the Indian refining industry and ongoing developments under the headings: the present state; refinery configuration; Indian capabilities for refinery projects; and reforms in the refining industry. Tables list India's petroleum refineries giving location and capacity; new refinery projects together with location and capacity; and expansion projects of Indian petroleum refineries. The Indian refinery industry has undergone substantial expansion as well as technological changes over the past years. There has been progressive technology upgrading, improved energy efficiency, better environmental control and improved capacity utilisation. Major reform processes have been set in motion by the government of India: converting the refining industry from a centrally controlled public sector dominated industry to a delicensed regime in a competitive market economy with the introduction of a liberal exploration policy; dismantling the administered price mechanism; and a 25 year hydrocarbon vision. (UK)

  6. Monte Carlo based diffusion coefficients for LMFBR analysis

    International Nuclear Information System (INIS)

    Van Rooijen, Willem F.G.; Takeda, Toshikazu; Hazama, Taira

    2010-01-01

    A method based on Monte Carlo calculations is developed to estimate the diffusion coefficient of unit cells. The method uses a geometrical model similar to that used in lattice theory, but does not use the assumption of a separable fundamental mode used in lattice theory. The method uses standard Monte Carlo flux and current tallies, and the continuous energy Monte Carlo code MVP was used without modifications. Four models are presented to derive the diffusion coefficient from tally results of flux and partial currents. In this paper the method is applied to the calculation of a plate cell of the fast-spectrum critical facility ZEBRA. Conventional calculations of the diffusion coefficient diverge in the presence of planar voids in the lattice, but our Monte Carlo method can treat this situation without any problem. The Monte Carlo method was used to investigate the influence of geometrical modeling as well as the directional dependence of the diffusion coefficient. The method can be used to estimate the diffusion coefficient of complicated unit cells, the limitation being the capabilities of the Monte Carlo code. The method will be used in the future to confirm results for the diffusion coefficient obtained with deterministic codes. (author)
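
    For orientation only (this is the textbook Fick's-law relation, not one of the four tally-based models developed in the paper), the flux and partial-current tallies mentioned above combine as

        \[
          J_x \;=\; J_x^{+} - J_x^{-}, \qquad
          D \;\approx\; -\,\frac{J_x}{\mathrm{d}\phi/\mathrm{d}x},
        \]

    so that a Monte Carlo estimate of the net current through a cell surface, together with the flux gradient across the cell, yields a cell-averaged diffusion coefficient.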

  7. Effectively Engaging in Tribal Consultation to protect Traditional Cultural Properties while navigating the 1872 Mining Law - Tonto National Forest, Western Apache Tribes, & Resolution Copper Mine

    Science.gov (United States)

    Nez, N.

    2017-12-01

    By effectively engaging in government-to-government consultation, the Tonto National Forest is able to consider oral histories and tribal cultural knowledge in decision making. These conversations often have the potential to lead to the protection and preservation of public lands. Discussed here is one example of successful tribal consultation and how it led to the protection of Traditional Cultural Properties (TCPs). One hour east of Phoenix, Arizona on the Tonto National Forest, Resolution Copper Mine is working to access a rich copper vein more than 7,000 feet deep. As part of the mining plan of operation, the company is investigating viable locations to store the earth removed from the mine site. One proposed storage location required hydrologic and geotechnical studies to determine viability. This constituted a significant amount of ground disturbance in an area of known importance to local Indian tribes. To ensure proper consideration of tribal concerns, the Forest engaged nine local tribes in government-to-government consultation. Consultation resulted in the identification of five springs in the project area considered TCPs by the Western Apache tribes. Due to the presence of identified TCPs, the Forest asked the tribes to assist in the development of mitigation measures to minimize the effects of this project on the TCPs identified. The goal of this partnership was to find a way for the Mine to still be able to gather data while protecting the TCPs. During field visits and consultations, a wide range of concerns were shared, which were recorded and considered by the Tonto National Forest. The Forest developed a proposed mitigation approach to protect the springs, which would prohibit the installation of water monitoring wells, geotechnical borings or trench excavations within 1,200 feet of perennial springs in the project area. As an added mitigation measure, a cultural resources specialist would be on-site during all ground-disturbing activities. Diligent work on

  8. 25 CFR 309.9 - When can non-Indians make and sell products in the style of Indian arts and crafts?

    Science.gov (United States)

    2010-04-01

    ... of Indian arts and crafts? 309.9 Section 309.9 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.9 When can non-Indians make and sell products in the style of Indian arts and crafts? A non-Indian can make and sell products in the style of...

  9. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo Experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo Experiment; it also encourages the proper design of Monte Carlo Experiments and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo Experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab, or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language
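
    The idea of letting an experimental design drive replicated Monte Carlo runs can be mimicked in any ordinary language. The sketch below is plain Python with invented factors and a trivial estimator; it is not the functional language or the Fortran pipeline described in the paper.

        # A full-factorial Monte Carlo experiment: every combination of factor
        # levels is run with replications and a summary statistic is recorded.
        import itertools
        import random
        import statistics

        design_factors = {
            "sample_size":  [100, 1000],
            "distribution": ["normal", "exponential"],
            "estimator":    ["mean", "median"],
        }

        def one_run(sample_size, distribution, estimator, rng):
            draw = (lambda: rng.gauss(0.0, 1.0)) if distribution == "normal" \
                   else (lambda: rng.expovariate(1.0))
            data = [draw() for _ in range(sample_size)]
            return statistics.mean(data) if estimator == "mean" else statistics.median(data)

        rng = random.Random(42)
        for point in itertools.product(*design_factors.values()):
            settings = dict(zip(design_factors, point))
            reps = [one_run(rng=rng, **settings) for _ in range(200)]   # replications
            print({**settings, "variance": statistics.variance(reps)})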

  10. About | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    The 82nd Annual Meeting of the Indian Academy of Sciences is being held at ... by newly elected Fellows and Associates over a wide range of scientific topics. ... Indian Institute of Science Education and Research (IISER), Bhopal: Indian ...

  11. Random Numbers and Monte Carlo Methods

    Science.gov (United States)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
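
    The Metropolis step mentioned above fits in a few lines. The sketch below samples an invented double-well density with a symmetric random-walk proposal, purely to illustrate the accept/reject rule.

        # Minimal Metropolis sampler for p(x) proportional to exp(-(x^2 - 1)^2 / T).
        import numpy as np

        rng = np.random.default_rng(1)
        T, steps, step_size = 0.5, 100_000, 0.5

        def log_density(x):
            return -((x ** 2 - 1.0) ** 2) / T

        x, accepted = 0.0, 0
        samples = np.empty(steps)
        for i in range(steps):
            proposal = x + rng.normal(0.0, step_size)
            # Accept with probability min(1, p(proposal) / p(x)).
            if np.log(rng.random()) < log_density(proposal) - log_density(x):
                x, accepted = proposal, accepted + 1
            samples[i] = x

        print(f"acceptance rate {accepted / steps:.2f}, mean |x| ~ {np.abs(samples).mean():.3f}")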

  12. Indian Woman Today; Southwest Indian Women's Conference (Window Rock, Arizona, September 24-25, 1975).

    Science.gov (United States)

    1975

    Describing the activities and responses of American Indian women attending the 1975 Southwest Indian Women's Conference in Window Rock, Arizona, these proceedings present the following: (1) the keynote address (focus is on program funding and Indian female civil rights, self-concept, and cultural background); (2) observations derived from…

  13. LCG Monte-Carlo Data Base

    CERN Document Server

    Bartalini, P.; Kryukov, A.; Selyuzhenkov, Ilya V.; Sherstnev, A.; Vologdin, A.

    2004-01-01

    We present the Monte-Carlo events Data Base (MCDB) project and its development plans. MCDB facilitates communication between authors of Monte-Carlo generators and experimental users. It also provides a convenient book-keeping and an easy access to generator level samples. The first release of MCDB is now operational for the CMS collaboration. In this paper we review the main ideas behind MCDB and discuss future plans to develop this Data Base further within the CERN LCG framework.

  14. Alternative implementations of the Monte Carlo power method

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Gelbard, E.M.

    2002-01-01

    We compare nominal efficiencies, i.e. variances in power shapes for equal running time, of different versions of the Monte Carlo eigenvalue computation, as applied to criticality safety analysis calculations. The two main methods considered here are ''conventional'' Monte Carlo and the superhistory method, and both are used in criticality safety codes. Within each of these major methods, different variants are available for the main steps of the basic Monte Carlo algorithm. Thus, for example, different treatments of the fission process may vary in the extent to which they follow, in analog fashion, the details of real-world fission, or may vary in details of the methods by which they choose next-generation source sites. In general the same options are available in both the superhistory method and conventional Monte Carlo, but there seems not to have been much examination of the special properties of the two major methods and their minor variants. We find, first, that the superhistory method is just as efficient as conventional Monte Carlo and, secondly, that use of different variants of the basic algorithms may, in special cases, have a surprisingly large effect on Monte Carlo computational efficiency

  15. Igo - A Monte Carlo Code For Radiotherapy Planning

    International Nuclear Information System (INIS)

    Goldstein, M.; Regev, D.

    1999-01-01

    The goal of radiation therapy is to deliver a lethal dose to the tumor while minimizing the dose to normal tissues and vital organs. To carry out this task, it is critical to calculate the 3-D dose delivered correctly. Monte Carlo transport methods (especially the adjoint Monte Carlo) have the potential to provide more accurate predictions of the 3-D dose than the currently used methods. IG0 is a Monte Carlo code derived from the general Monte Carlo program MCNP, tailored specifically for calculating the effects of radiation therapy. This paper describes the IG0 transport code, the PIG0 interface and some preliminary results

  16. Monte Carlo techniques for analyzing deep-penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1986-01-01

    Current methods and difficulties in Monte Carlo deep-penetration calculations are reviewed, including statistical uncertainty and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multigroup Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications
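
    The two weight games named above (splitting and Russian roulette) are simple to state in code. The sketch below is a generic weight-window routine with made-up bounds, not the adjoint-optimized parameters discussed in the review.

        # Weight-window check applied to a bank of (weight, state) particles:
        # heavy particles are split, light particles play Russian roulette.
        import numpy as np

        rng = np.random.default_rng(2)
        w_low, w_high, w_survive = 0.25, 4.0, 1.0        # hypothetical window bounds

        def apply_weight_window(particles):
            out = []
            for weight, state in particles:
                if weight > w_high:                      # splitting (exact weight conservation)
                    n = int(np.ceil(weight / w_survive))
                    out.extend([(weight / n, state)] * n)
                elif weight < w_low:                     # Russian roulette (conserved on average)
                    if rng.random() < weight / w_survive:
                        out.append((w_survive, state))
                    # otherwise the particle is killed and nothing is appended
                else:
                    out.append((weight, state))
            return out

        bank = [(rng.exponential(1.0), x) for x in rng.uniform(0.0, 1.0, 1000)]
        before = sum(w for w, _ in bank)
        bank = apply_weight_window(bank)
        after = sum(w for w, _ in bank)
        print(f"total weight {before:.1f} -> {after:.1f}")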

  17. Odd-flavor Simulations by the Hybrid Monte Carlo

    CERN Document Server

    Takaishi, Tetsuya; Takaishi, Tetsuya; De Forcrand, Philippe

    2001-01-01

    The standard hybrid Monte Carlo algorithm is known to simulate only even numbers of quark flavors in QCD. Simulations of odd numbers of flavors, however, can also be performed in the framework of the hybrid Monte Carlo algorithm, where the inverse of the fermion matrix is approximated by a polynomial. In this exploratory study we perform three-flavor QCD simulations. We make a comparison of the hybrid Monte Carlo algorithm and the R-algorithm, which also simulates odd-flavor systems but has step-size errors. We find that results from our hybrid Monte Carlo algorithm are in agreement with those from the R-algorithm obtained at very small step-size.
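
    Stripped of the lattice QCD machinery, the hybrid Monte Carlo update itself is short. The sketch below applies the leapfrog-plus-Metropolis step to a toy one-dimensional target, only to illustrate the algorithm the paper builds on; it says nothing about fermion determinants or polynomial approximations.

        # Bare hybrid (Hamiltonian) Monte Carlo on a standard-normal target.
        import numpy as np

        rng = np.random.default_rng(3)

        def grad_U(q):                    # potential U(q) = q^2 / 2, so dU/dq = q
            return q

        def hmc_step(q, step=0.2, n_leapfrog=20):
            p = rng.normal()                               # refresh momentum
            q_new, p_new = q, p
            p_new -= 0.5 * step * grad_U(q_new)            # leapfrog integration
            for _ in range(n_leapfrog - 1):
                q_new += step * p_new
                p_new -= step * grad_U(q_new)
            q_new += step * p_new
            p_new -= 0.5 * step * grad_U(q_new)
            # Metropolis accept/reject on the change in total "energy".
            h_old = 0.5 * q ** 2 + 0.5 * p ** 2
            h_new = 0.5 * q_new ** 2 + 0.5 * p_new ** 2
            return q_new if np.log(rng.random()) < h_old - h_new else q

        q, chain = 0.0, []
        for _ in range(5000):
            q = hmc_step(q)
            chain.append(q)
        print("sample mean", np.mean(chain), " sample variance", np.var(chain))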

  18. The Indian Monsoon

    Indian Academy of Sciences (India)

    Pacific Oceans, on subseasonal scales of a few days and on an interannual scale. ... over the Indian monsoon zone2 (Figure 3) during the summer monsoon .... each 500 km ×500 km grid over the equatorial Indian Ocean, Bay of Bengal and ...

  19. Quantum Monte Carlo approaches for correlated systems

    CERN Document Server

    Becca, Federico

    2017-01-01

    Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It gives a clear overview of variational wave functions and a detailed presentation of stochastic sampling, including Markov chains and Langevin dynamics, which is developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes; the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...

  20. Non statistical Monte-Carlo

    International Nuclear Information System (INIS)

    Mercier, B.

    1985-04-01

    We have shown that the transport equation can be solved with particles, as in the Monte Carlo method, but without random numbers. In the Monte Carlo method, particles are created from the source, and are followed from collision to collision until either they are absorbed or they leave the spatial domain. In our method, particles are created from the original source, with a variable weight taking into account both collision and absorption. These particles are followed until they leave the spatial domain, and we use them to determine a first collision source. Another set of particles is then created from this first collision source, and tracked to determine a second collision source, and so on. This process introduces an approximation which does not exist in the Monte Carlo method. However, we have analyzed the effect of this approximation and shown that it can be limited. Our method is deterministic and gives reproducible results. Furthermore, when extra accuracy is needed in some region, it is easier to get more particles to go there. It has the same kind of applications: problems where streaming is dominant rather than collision-dominated problems

  1. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 6. Variational Variance Reduction for Monte Carlo Criticality Calculations

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Larsen, Edward W.

    2001-01-01

    Recently, it has been shown that the figure of merit (FOM) of Monte Carlo source-detector problems can be enhanced by using a variational rather than a direct functional to estimate the detector response. The direct functional, which is traditionally employed in Monte Carlo simulations, requires an estimate of the solution of the forward problem within the detector region. The variational functional is theoretically more accurate than the direct functional, but it requires estimates of the solutions of the forward and adjoint source-detector problems over the entire phase-space of the problem. In recent work, we have performed Monte Carlo simulations using the variational functional by (a) approximating the adjoint solution deterministically and representing this solution as a function in phase-space and (b) estimating the forward solution using Monte Carlo. We have called this general procedure variational variance reduction (VVR). The VVR method is more computationally expensive per history than traditional Monte Carlo because extra information must be tallied and processed. However, the variational functional yields a more accurate estimate of the detector response. Our simulations have shown that the VVR reduction in variance usually outweighs the increase in cost, resulting in an increased FOM. In recent work on source-detector problems, we have calculated the adjoint solution deterministically and represented this solution as a linear-in-angle, histogram-in-space function. This procedure has several advantages over previous implementations: (a) it requires much less adjoint information to be stored and (b) it is highly efficient for diffusive problems, due to the accurate linear-in-angle representation of the adjoint solution. (Traditional variance-reduction methods perform poorly for diffusive problems.) Here, we extend this VVR method to Monte Carlo criticality calculations, which are often diffusive and difficult for traditional variance-reduction methods
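
    As a point of reference (this is the standard Roussopoulos form, not quoted from the paper), the two functionals being contrasted can be written as

        \[
          R_{\mathrm{direct}}[\hat\psi] \;=\; \langle \Sigma_d,\, \hat\psi \rangle,
          \qquad
          R_{\mathrm{var}}[\hat\psi,\hat\psi^{\dagger}]
            \;=\; \langle \Sigma_d,\, \hat\psi \rangle
            \;+\; \langle \hat\psi^{\dagger},\, q - H\hat\psi \rangle,
        \]

    where \(H\psi = q\) is the forward source-detector problem and \(H^{\dagger}\psi^{\dagger} = \Sigma_d\) its adjoint. The error of the variational form is second order in the errors of both trial solutions, which is why combining a rough deterministic adjoint with a Monte Carlo forward estimate can raise the figure of merit despite the extra tallying cost.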

  2. Equality in Education for Indian Women.

    Science.gov (United States)

    Krepps, Ethel

    1980-01-01

    Historically, Indian women have been denied education due to: early marriage and family responsibilities; lack of money; inadequate family attention to education; the threat education poses to Indian men; and geographical location. Indian tribes can best administer funds and programs to provide the education so necessary for Indian women. (SB)

  3. 76 FR 65208 - Indian Gaming

    Science.gov (United States)

    2011-10-20

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an Approval of the Gaming Compact between the Confederated Tribes of the...

  4. 75 FR 8108 - Indian Gaming

    Science.gov (United States)

    2010-02-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes... Governing Class III Gaming. DATES: Effective Date: February 23, 2010. FOR FURTHER INFORMATION CONTACT: Paula...

  5. 78 FR 17427 - Indian Gaming

    Science.gov (United States)

    2013-03-21

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes... Gaming (Compact). DATES: Effective Date: March 21, 2013. FOR FURTHER INFORMATION CONTACT: Paula L. Hart...

  6. The average Indian female nose.

    Science.gov (United States)

    Patil, Surendra B; Kale, Satish M; Jaiswal, Sumeet; Khare, Nishant; Math, Mahantesh

    2011-12-01

    This study aimed to delineate the anthropometric measurements of the noses of young women of an Indian population and to compare them with the published ideals and average measurements for white women. This anthropometric survey included a volunteer sample of 100 young Indian women ages 18 to 35 years with Indian parents and no history of previous surgery or trauma to the nose. Standardized frontal, lateral, oblique, and basal photographs of the subjects' noses were taken, and 12 standard anthropometric measurements of the nose were determined. The results were compared with published standards for North American white women. In addition, nine nasal indices were calculated and compared with the standards for North American white women. The nose of Indian women differs significantly from the white nose. All the nasal measurements for the Indian women were found to be significantly different from those for North American white women. Seven of the nine nasal indices also differed significantly. Anthropometric analysis suggests differences between the Indian female nose and the North American white nose. Thus, a single aesthetic ideal is inadequate. Noses of Indian women are smaller and wider, with a less projected and rounded tip than the noses of white women. This study established the nasal anthropometric norms for nasal parameters, which will serve as a guide for cosmetic and reconstructive surgery in Indian women.

  7. Washington Irving and the American Indian.

    Science.gov (United States)

    Littlefield, Daniel F., Jr.

    1979-01-01

    Some modern scholars feel that Washington Irving vacillated between romanticism and realism in his literary treatment of the American Indian. However, a study of all his works dealing with Indians, placed in context with his non-Indian works, reveals that his attitude towards Indians was intelligent and enlightened for his time. (CM)

  8. Indian Women: An Historical and Personal Perspective

    Science.gov (United States)

    Christensen, Rosemary Ackley

    1975-01-01

    Several issues relating to Indian women are discussed. These include (1) the three types of people to whom we owe our historical perceptions of Indian women; (2) role delineation in Indian society; (3) differences between Indian women and white women; and (4) literary role models of Indian women. (Author/BW)

  9. Urban American Indian/Alaskan Natives Compared to Non-Indians in Out-of-Home Care

    Science.gov (United States)

    Carter, Vernon B.

    2011-01-01

    Historically, American Indian/Alaskan Native (AI/AN) children have been disproportionately represented in the foster care system. In this study, nationally representative child welfare data from October 1999 was used to compare urban AI/AN children to non-Indian children placed into out-of-home care. Compared to non-Indian children, urban AI/AN…

  10. A Cultural Resources Inventory of the John Martin Reservoir, Colorado.

    Science.gov (United States)

    1982-08-31

    ...return the pueblo Indians, who he heard arrived at the Rio Nepestle (Arkansas), which were being held captive by the Apache ... nor nutrition sufficient to nourish timber. These vast plains of ... The other group, led by Captain John Bell, continued east along the Arkansas

  11. 77 FR 5566 - Indian Gaming

    Science.gov (United States)

    2012-02-03

    ... up to 900 gaming devices, any banking or percentage card games, and any devices or games authorized... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal--State Class III Gaming Compact Taking Effect. SUMMARY: This publishes...

  12. 76 FR 56466 - Indian Gaming

    Science.gov (United States)

    2011-09-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an approval of the gaming compact between the Flandreau Santee Sioux Tribe and the State of South...

  13. 75 FR 68823 - Indian Gaming

    Science.gov (United States)

    2010-11-09

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Amendment. SUMMARY: This notice publishes approval of the Amendments to the Class III Gaming Compact (Amendment) between the State of Oregon...

  14. 77 FR 43110 - Indian Gaming

    Science.gov (United States)

    2012-07-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  15. 76 FR 8375 - Indian Gaming

    Science.gov (United States)

    2011-02-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the Gaming Compact between the Oglala Sioux Tribe and the State of South Dakota...

  16. 78 FR 10203 - Indian Gaming

    Science.gov (United States)

    2013-02-13

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal State Class III Gaming Compact. SUMMARY: This notice publishes the Approval of the Class III Tribal- State Gaming Compact between the Chippewa-Cree Tribe of the...

  17. 77 FR 45371 - Indian Gaming

    Science.gov (United States)

    2012-07-31

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Oglala Sioux Tribe and the State of South Dakota. DATES: Effective...

  18. 78 FR 15738 - Indian Gaming

    Science.gov (United States)

    2013-03-12

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the gaming compact between the Rosebud Sioux Tribe and the State of South Dakota...

  19. 77 FR 59641 - Indian Gaming

    Science.gov (United States)

    2012-09-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  20. 78 FR 17428 - Indian Gaming

    Science.gov (United States)

    2013-03-21

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of the Class III Tribal- State Gaming Compact between the Pyramid Lake Paiute Tribe and...

  1. 76 FR 52968 - Indian Gaming

    Science.gov (United States)

    2011-08-24

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  2. 76 FR 33341 - Indian Gaming

    Science.gov (United States)

    2011-06-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Rosebud Sioux Tribe and the State of South Dakota. DATES...

  3. 75 FR 55823 - Indian Gaming

    Science.gov (United States)

    2010-09-14

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming between the Oglala Sioux Tribe and the State of South Dakota. DATES: Effective...

  4. 78 FR 44146 - Indian Gaming

    Science.gov (United States)

    2013-07-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This notice publishes the Class III Amended and Restated Tribal-State Gaming Compact between the Shingle Springs Band of...

  5. 78 FR 33435 - Indian Gaming

    Science.gov (United States)

    2013-06-04

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Amendments. SUMMARY: This notice publishes approval of an Agreement to Amend the Class III Tribal-State Gaming Compact between the Salt River...

  6. 78 FR 11221 - Indian Gaming

    Science.gov (United States)

    2013-02-15

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the gaming compact between the Oglala Sioux Tribe and the State of South Dakota...

  7. Monte Carlo techniques for analyzing deep penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications

  8. Monte Carlo techniques for analyzing deep penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs

  9. 77 FR 30550 - Indian Gaming

    Science.gov (United States)

    2012-05-23

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes approval by the Department of an extension to the Class III Gaming Compact between the Pyramid Lake Paiute...

  10. 76 FR 11258 - Indian Gaming

    Science.gov (United States)

    2011-03-01

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal--State Class III Gaming Compact taking effect. SUMMARY: Notice is given that the Tribal-State Compact for Regulation of Class III Gaming between the Confederated Tribes of the...

  11. 77 FR 41200 - Indian Gaming

    Science.gov (United States)

    2012-07-12

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes approval by the Department of an extension to the Class III Gaming Compact between the State of California...

  12. 78 FR 26801 - Indian Gaming

    Science.gov (United States)

    2013-05-08

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of an amendment to the Class III Tribal-State Gaming Compact...

  13. 78 FR 54908 - Indian Gaming

    Science.gov (United States)

    2013-09-06

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of the Class III Tribal- State Gaming Compact between the...

  14. 78 FR 62649 - Indian Gaming

    Science.gov (United States)

    2013-10-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This notice publishes the Class III Gaming Compact between the North Fork Rancheria of Mono...

  15. 78 FR 78377 - Indian Gaming

    Science.gov (United States)

    2013-12-26

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000814] Indian Gaming AGENCY... Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  16. Biases in Monte Carlo eigenvalue calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gelbard, E.M.

    1992-12-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the ``fixed-source`` case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated (``replicated``) over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.

  17. Biases in Monte Carlo eigenvalue calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gelbard, E.M.

    1992-01-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the ''fixed-source'' case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated (''replicated'') over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.

  18. Biases in Monte Carlo eigenvalue calculations

    International Nuclear Information System (INIS)

    Gelbard, E.M.

    1992-01-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the ''fixed-source'' case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated (''replicated'') over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here
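
    The core of the argument, that a fixed-source tally is unbiased while an eigenvalue cycle is not, can be illustrated with a toy ratio estimator. The numbers below are invented, and a real eigenvalue calculation adds cycle-to-cycle correlations on top of the effect shown; the point is only that a nonlinear (ratio) combination of noisy tallies is biased for finite generation sizes.

        # E[X/Y] differs from E[X]/E[Y]: the per-cycle k estimate is a ratio of
        # noisy tallies, so small generations give a biased mean (toy model).
        import numpy as np

        rng = np.random.default_rng(4)
        true_k = 1.0
        for n_per_cycle in (100, 1000, 10000):
            x = rng.poisson(true_k * n_per_cycle, size=100_000)   # "production" tally
            y = rng.poisson(n_per_cycle, size=100_000)            # "source size" tally
            k_cycle = x / y
            print(f"histories per cycle {n_per_cycle:6d}:  mean k = {k_cycle.mean():.5f}")
        # The bias shrinks roughly like 1 / n_per_cycle, mirroring the behaviour of
        # Monte Carlo eigenvalue calculations run with small generation sizes.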

  19. Culturally Informed Social Work Practice with American Indian Clients: Guidelines for Non-Indian Social Workers.

    Science.gov (United States)

    Williams, Edith Ellison; Ellison, Florence

    1996-01-01

    Culturally informed social work health and mental health interventions directed toward American Indian clients must be harmonious with their environment and acculturation. Discusses American Indian beliefs about health and illness and degrees of acculturation. Guidelines are offered to help non-Indian social workers design culturally appropriate…

  20. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-01-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example that shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation

  1. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-02-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)

  2. 78 FR 62650 - Indian Gaming

    Science.gov (United States)

    2013-10-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal-State Class III Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between the Rosebud Sioux...

  3. 78 FR 54670 - Indian Gaming

    Science.gov (United States)

    2013-09-05

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal--State Class III Gaming Compact. SUMMARY: This publishes notice of the Extension of the Class III gaming compact between the Yankton Sioux...

  4. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  5. Leadership Challenges in Indian Country.

    Science.gov (United States)

    Horse, Perry

    2002-01-01

    American Indian leaders must meld the holistic and cyclical world view of Indian peoples with the linear, rational world view of mainstream society. Tribal leaders need to be statesmen and ethical politicians. Economic and educational development must be based on disciplined long-range planning and a strong, Indian-controlled educational base.…

  6. Methodology for understanding Indian culture

    DEFF Research Database (Denmark)

    Sinha, Jai; Kumar, Rajesh

    2004-01-01

    Methods of understanding cultures, including Indian culture, are embedded in a broad spectrum of sociocultural approaches to human behavior in general. The approaches examined in this paper reflect evolving perspectives on Indian culture, ranging from the starkly ethnocentric to the largely eclectic and integrative. Most of the methods herein discussed were developed in the West and were subsequently taken up with or without adaptations to fit the Indian context. The paper begins by briefly reviewing the intrinsic concept of culture. It then adopts a historical view of the different ways and means by which scholars have construed the particular facets of Indian culture, highlighting the advantages and disadvantages of each. The final section concludes with some proposals about the best ways of understanding the complexity that constitutes the Indian cultural reality....

  7. Monte Carlo Tree Search Strategies (Strategije drevesnega preiskovanja Monte Carlo)

    OpenAIRE

    VODOPIVEC, TOM

    2018-01-01

    Following the breakthrough in the game of Go, Monte Carlo tree search (MCTS) methods triggered rapid progress in game-playing agents: the research community has since developed many variants and improvements of the MCTS algorithm, thereby advancing artificial intelligence not only in games but also in numerous other domains. Although MCTS methods combine the generality of random sampling with the accuracy of tree search, in practice they can suffer from slow convergence...

  8. Monte Carlo electron/photon transport

    International Nuclear Information System (INIS)

    Mack, J.M.; Morel, J.E.; Hughes, H.G.

    1985-01-01

    A review of nonplasma coupled electron/photon transport using the Monte Carlo method is presented. Remarks are mainly restricted to linearized formalisms at electron energies from 1 keV to 1000 MeV. Applications involving pulse-height estimation, transport in external magnetic fields, and optical Cerenkov production are discussed to underscore the importance of this branch of computational physics. Advances in electron multigroup cross-section generation are reported, and their impact on future code development assessed. Progress toward the transformation of MCNP into a generalized neutral/charged-particle Monte Carlo code is described. 48 refs

  9. Prospect on general software of Monte Carlo method

    International Nuclear Information System (INIS)

    Pei Lucheng

    1992-01-01

    This is a short paper on the prospect of Monte Carlo general software. The content consists of cluster sampling method, zero variance technique, self-improved method, and vectorized Monte Carlo method

  10. Facts about American Indian Education

    Science.gov (United States)

    American Indian College Fund, 2010

    2010-01-01

    As a result of living in remote rural areas, American Indians living on reservations have limited access to higher education. One-third of American Indians live on reservations, according to the U.S. Census Bureau. According to the most recent U.S. government statistics, the overall poverty rate for American Indians/Alaska Natives, including…

  11. Defeathering the Indian.

    Science.gov (United States)

    LaRoque, Emma

    In an effort to mitigate the stultified image of the American Indian in Canada, this handbook on Native Studies is written from the Indian point of view and is designed to sensitize the dominant society, particularly educators. While numerous approaches and pointers are presented and specific materials are recommended, the focus is essentially…

  12. The Indian Gaming Regulatory Act and Its Effects on American Indian Economic Development

    OpenAIRE

    Randall K. Q. Akee; Katherine A. Spilde; Jonathan B. Taylor

    2015-01-01

    The Indian Gaming Regulatory Act (IGRA), passed by the US Congress in 1988, was a watershed in the history of policymaking directed toward reservation-resident American Indians. IGRA set the stage for tribal government-owned gaming facilities. It also shaped how this new industry would develop and how tribal governments would invest gaming revenues. Since then, Indian gaming has approached commercial, state-licensed gaming in total revenues. Gaming operations have had a far-reaching and trans...

  13. Monte Carlo method for array criticality calculations

    International Nuclear Information System (INIS)

    Dickinson, D.; Whitesides, G.E.

    1976-01-01

    The Monte Carlo method for solving neutron transport problems consists of mathematically tracing paths of individual neutrons collision by collision until they are lost by absorption or leakage. The fate of the neutron after each collision is determined by the probability distribution functions that are formed from the neutron cross-section data. These distributions are sampled statistically to establish the successive steps in the neutron's path. The resulting data, accumulated from following a large number of batches, are analyzed to give estimates of k_eff and other collision-related quantities. The use of electronic computers to produce the simulated neutron histories, initiated at Los Alamos Scientific Laboratory, made the use of the Monte Carlo method practical for many applications. In analog Monte Carlo simulation, the calculation follows the physical events of neutron scattering, absorption, and leakage. To increase calculational efficiency, modifications such as the use of statistical weights are introduced. The Monte Carlo method permits the use of a three-dimensional geometry description and a detailed cross-section representation. Some of the problems in using the method are the selection of the spatial distribution for the initial batch, the preparation of the geometry description for complex units, and the calculation of error estimates for region-dependent quantities such as fluxes. The Monte Carlo method is especially appropriate for criticality safety calculations since it permits an accurate representation of interacting units of fissile material. Dissimilar units, units of complex shape, moderators between units, and reflected arrays may be calculated. Monte Carlo results must be correlated with relevant experimental data, and caution must be used to ensure that a representative set of neutron histories is produced
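
    A collision-by-collision history loop of the kind described above can be written very compactly. The sketch below uses a bare 1-D slab with made-up cross sections and a fixed starting source, so it illustrates the random-walk mechanics and batch statistics rather than a converged fission-source iteration.

        # Analog Monte Carlo in a bare 1-D slab: each neutron is followed from
        # collision to collision until it is absorbed or leaks; k is estimated
        # as fission neutrons produced per source neutron. Hypothetical data.
        import numpy as np

        rng = np.random.default_rng(5)
        sigma_t, sigma_a, sigma_f, nu = 1.0, 0.4, 0.15, 2.5   # macroscopic XS (1/cm), yield
        half_width = 5.0                                      # slab occupies |x| <= 5 cm

        def run_batch(n_histories):
            production = 0.0
            for _ in range(n_histories):
                x, mu = 0.0, rng.uniform(-1.0, 1.0)           # source at the slab centre
                while True:
                    x += mu * rng.exponential(1.0 / sigma_t)  # distance to next collision
                    if abs(x) > half_width:                   # leakage
                        break
                    if rng.random() < sigma_a / sigma_t:      # absorption ...
                        if rng.random() < sigma_f / sigma_a:  # ... part of which is fission
                            production += nu
                        break
                    mu = rng.uniform(-1.0, 1.0)               # isotropic scattering
            return production / n_histories

        estimates = [run_batch(5_000) for _ in range(20)]     # batch statistics
        print(f"k ~ {np.mean(estimates):.3f} +/- {np.std(estimates) / np.sqrt(len(estimates)):.3f}")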

  14. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki; Long, Quan; Scavino, Marco; Tempone, Raul

    2015-01-01

    Experimental design is very important since experiments are often resource-exhaustive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information which can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of the nested high dimensional integral. The advantages are twofold. First, the Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as the concentration of measures required by the Laplace method. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Laplace equation. We also compare the performance of the Multilevel Monte Carlo, Laplace approximation and direct double loop Monte Carlo.

  15. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki

    2015-01-07

    Experimental design is very important since experiments are often resource-exhaustive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of the nested high dimensional integral. The advantages are twofold. First, the Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as concentration of measure, than the Laplace method requires. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Laplace equation. We also compare the performance of the Multilevel Monte Carlo, the Laplace approximation and the direct double loop Monte Carlo.

  16. Asthma and American Indians/Alaska Natives

    Science.gov (United States)

    In 2015, 240, ... Native American adults reported that they currently have asthma. American Indian/Alaska Native children are 60% more ...

  17. Present status of transport code development based on Monte Carlo method

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki

    1985-01-01

    The present status of development of Monte Carlo codes is briefly reviewed. The main items are the following: application fields; methods used in Monte Carlo codes (geometry specification, nuclear data, estimators and variance reduction techniques) and unfinished work; typical Monte Carlo codes; and the merits of continuous-energy Monte Carlo codes. (author)

  18. Successful vectorization - reactor physics Monte Carlo code

    International Nuclear Information System (INIS)

    Martin, W.R.

    1989-01-01

    Most particle transport Monte Carlo codes in use today are based on the ''history-based'' algorithm, wherein one particle history at a time is simulated. Unfortunately, the ''history-based'' approach (present in all Monte Carlo codes until recent years) is inherently scalar and cannot be vectorized. In particular, the history-based algorithm cannot take advantage of vector architectures, which characterize the largest and fastest computers at the current time, vector supercomputers such as the Cray X/MP or IBM 3090/600. However, substantial progress has been made in recent years in developing and implementing a vectorized Monte Carlo algorithm. This algorithm follows portions of many particle histories at the same time and forms the basis for all successful vectorized Monte Carlo codes that are in use today. This paper describes the basic vectorized algorithm along with descriptions of several variations that have been developed by different researchers for specific applications. These applications have been mainly in the areas of neutron transport in nuclear reactor and shielding analysis and photon transport in fusion plasmas. The relative merits of the various approach schemes will be discussed and the present status of known vectorization efforts will be summarized along with available timing results, including results from the successful vectorization of 3-D general geometry, continuous energy Monte Carlo. (orig.)
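
    The contrast between the scalar history-based loop and an event-based formulation that advances many histories at once (the idea underlying vectorization) can be sketched as follows. This is an illustrative toy transmission problem with invented numbers, not code from any of the cited efforts; NumPy array operations stand in for true vector hardware.

      import numpy as np

      SIG_T, ABSORB_FRAC, H, N = 1.0, 0.4, 5.0, 100_000   # invented slab and cross sections
      rng = np.random.default_rng(1)

      def history_based():
          survived = 0
          for _ in range(N):                       # one particle at a time (scalar)
              x = 0.0
              while True:
                  x += rng.exponential(1.0 / SIG_T)
                  if x > H:
                      survived += 1
                      break
                  if rng.random() < ABSORB_FRAC:   # absorbed at the collision site
                      break
          return survived / N

      def event_based():
          x = np.zeros(N)
          alive = np.ones(N, dtype=bool)
          survived = 0
          while alive.any():                       # advance all live particles together
              x[alive] += rng.exponential(1.0 / SIG_T, alive.sum())
              escaped = alive & (x > H)
              survived += int(escaped.sum())
              alive &= ~escaped
              alive &= ~(rng.random(N) < ABSORB_FRAC)
          return survived / N

      print(history_based(), event_based())        # both estimate the same transmission probability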

  19. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung; Liang, Faming

    2009-01-01

    in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method

  20. 75 FR 39697 - Indians Into Psychology Program; Correction

    Science.gov (United States)

    2010-07-12

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Indian Health Service Indians Into Psychology Program; Correction AGENCY: Indian Health Service, HHS. ACTION: Notice correction. SUMMARY: The Indian Health Service...-IHS-2010-INPSY-0001, for the Indians Into Psychology Program. The document contained an incorrect...

  1. Rasam Indian Restaurant: Menu

    OpenAIRE

    Rasam Indian Restaurant

    2013-01-01

    Rasam Indian Restaurant is located in the Glasthule, a suburb of Dublin and opened in 2003. The objective is to serve high quality, authentic Indian cuisine. "We blend, roast and grind our own spices daily to provide a flavour that is unique to Rasam. Cooking Indian food is founded upon long held family traditions. The secret is in the varying elements of heat and spices, the tandoor clay oven is a hugely important fixture in our kitchen. Marinated meats are lowered into the oven on long m...

  2. Reflections on early Monte Carlo calculations

    International Nuclear Information System (INIS)

    Spanier, J.

    1992-01-01

    Monte Carlo methods for solving various particle transport problems developed in parallel with the evolution of increasingly sophisticated computer programs implementing diffusion theory and low-order moments calculations. In these early years, Monte Carlo calculations and high-order approximations to the transport equation were seen as too expensive to use routinely for nuclear design but served as invaluable aids and supplements to design with less expensive tools. The earliest Monte Carlo programs were quite literal; i.e., neutron and other particle random walk histories were simulated by sampling from the probability laws inherent in the physical system without distortion. Use of such analogue sampling schemes resulted in a good deal of time being spent in examining the possibility of lowering the statistical uncertainties in the sample estimates by replacing simple, and intuitively obvious, random variables by those with identical means but lower variances.
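
    The closing remark about replacing a random variable by one with the same mean but lower variance can be illustrated with a small, self-contained example (not from the record): estimating E[exp(U)] for U uniform on (0, 1) with crude sampling versus antithetic variates.

      import math, random

      def crude(n):
          return [math.exp(random.random()) for _ in range(n)]

      def antithetic(n):
          out = []
          for _ in range(n):
              u = random.random()
              out.append(0.5 * (math.exp(u) + math.exp(1.0 - u)))  # same mean, lower variance
          return out

      def mean_var(xs):
          m = sum(xs) / len(xs)
          return m, sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

      print(mean_var(crude(100_000)))       # both means approach e - 1 = 1.71828...
      print(mean_var(antithetic(100_000)))  # ...but the antithetic variance is far smaller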

  3. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Missile Firing

    International Nuclear Information System (INIS)

    Jones, Daniel Steven; Efroymson, Rebecca Ann; Hargrove, William Walter; Suter, Glenn; Pater, Larry

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the missile launch and detonation. The primary stressor associated with this activity was sound. Other minor stressors included the detonation impact, shrapnel, and fire. Exposure to desert mule deer (Odocoileus hemionus crooki) was quantified using the Army sound contour program BNOISE2, as well as distances from the explosion to deer. Few effects data were available from related studies. Exposure-response models for the characterization of effects consisted of human 'disturbance' and hearing damage thresholds in units of C-weighted decibels (sound exposure level) and a distance-based No Observed Adverse Effects Level for moose and cannonfire. The risk characterization used a weight-of-evidence approach and concluded that risk to mule deer behavior from the missile firing was likely for a negligible number of deer, but that no risk to mule deer abundance and reproduction is expected

  4. Data collection and field experiments at the Apache Leap research site. Annual report, May 1995--1996

    International Nuclear Information System (INIS)

    Woodhouse, E.G.; Bassett, R.L.; Neuman, S.P.; Chen, G.

    1997-08-01

    This report documents the research performed during the period May 1995-May 1996 for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-051) by the University of Arizona. The project manager for this research is Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this research were to examine hypotheses and test alternative conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. Each chapter in this report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. Topics include: crosshole pneumatic and gaseous tracer field and modeling experiments designed to help validate the applicability of continuum geostatistical and stochastic concepts, theories, models, and scaling relations relevant to unsaturated flow and transport in fractured porous tuffs; use of geochemistry and aquifer testing to evaluate fracture flow and perching mechanisms; investigations of 234U/238U fractionation to evaluate leaching selectivity; and transport and modeling of both conservative and non-conservative tracers

  5. Leading Indian Business-Groups

    Directory of Open Access Journals (Sweden)

    Maria Alexandrovna Vorobyeva

    2016-01-01

    Full Text Available The goal of this paper is to investigate the evolution of the leading Indian business groups under the conditions of economic liberalization. It is shown that the role of modern business groups in the Indian economy is determined by their high share of gross domestic product (GDP), their large aggregate assets, their substantial part in the export of goods and services, and their activities in forming the modern branch structure of the economy and in developing labour-intensive and high-tech branches. They strongly influence national economic strategy; they have become a locomotive of the internationalization and transnationalization of India, the basis of its external economic factor system, the promoters of the Indian "economic miracle" on the world scene, and the dynamic segment of the economic and social development of modern India. The tendencies in the development of the leading Indian business groups are: gradual concentration of production in a few key sectors, a "horizontal" structure, incorporation of enterprises into joint-stock structures, attraction of hired top managers, and transnationalization. Against this background, the leading Indian business groups keep their main traditional peculiarities: they mostly still belong to the families of their founders; even today they observe caste or communal relations, which form the basis of their informal backbone ties; and they remain highly diversified structures with weak interrelations. The specific national ambivalence and the combination of tradition and innovation in the leading Indian business groups provide their high vitality and stability in the contradictory, multiform Indian reality, overloaded as it is with caste and confessional remnants. We conclude that, contrary to the dominant opinion, the transformation of these groups into multisectoral corporations of the western type is far from complete, and in the near term they will retain all their peculiarities and the attendant social and economic

  6. Reconstruction of Monte Carlo replicas from Hessian parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Tie-Jiun [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Gao, Jun [INPAC, Shanghai Key Laboratory for Particle Physics and Cosmology,Department of Physics and Astronomy, Shanghai Jiao-Tong University, Shanghai 200240 (China); High Energy Physics Division, Argonne National Laboratory,Argonne, Illinois, 60439 (United States); Huston, Joey [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Nadolsky, Pavel [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Schmidt, Carl; Stump, Daniel [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Wang, Bo-Ting; Xie, Ke Ping [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Dulat, Sayipjamal [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); School of Physics Science and Technology, Xinjiang University,Urumqi, Xinjiang 830046 (China); Center for Theoretical Physics, Xinjiang University,Urumqi, Xinjiang 830046 (China); Pumplin, Jon; Yuan, C.P. [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States)

    2017-03-20

    We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and Monte-Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte-Carlo replicas by a numerical method that reproduces important properties of CT14 Hessian PDFs: the asymmetry of CT14 uncertainties and positivity of individual parton distributions. The ensembles of CT14 Monte-Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross section reweighting. Master formulas for computation of asymmetric standard deviations in the Monte-Carlo representation are derived. A correction is proposed to address a bias in asymmetric uncertainties introduced by the Taylor series approximation. A numerical program is made available for conversion of Hessian PDFs into Monte-Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
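
    The conversion itself can be sketched very compactly. The snippet below uses the simple symmetric Gaussian prescription, in which each replica is the central PDF plus a random combination of the symmetrized Hessian eigenvector shifts; the exact sampling rules of the record (including the asymmetry and positivity treatment) are more involved, and the numbers here are invented.

      import numpy as np

      def hessian_to_replicas(central, plus, minus, n_replicas, seed=0):
          # central: central PDF value at one (x, Q) point; plus/minus: values along
          # the +/- direction of each Hessian eigenvector at the same point.
          rng = np.random.default_rng(seed)
          sym_err = 0.5 * (np.asarray(plus) - np.asarray(minus))  # symmetrized shifts
          r = rng.standard_normal((n_replicas, sym_err.size))
          return central + r @ sym_err                            # one replica per Gaussian draw

      reps = hessian_to_replicas(0.40, plus=[0.43, 0.41, 0.44], minus=[0.38, 0.39, 0.37],
                                 n_replicas=1000)
      print(reps.mean(), reps.std())   # replicas reproduce the central value and its width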

  7. Sampling from a polytope and hard-disk Monte Carlo

    International Nuclear Information System (INIS)

    Kapfer, Sebastian C; Krauth, Werner

    2013-01-01

    The hard-disk problem, the statics and the dynamics of equal two-dimensional hard spheres in a periodic box, has had a profound influence on statistical and computational physics. Markov-chain Monte Carlo and molecular dynamics were first discussed for this model. Here we reformulate hard-disk Monte Carlo algorithms in terms of another classic problem, namely the sampling from a polytope. Local Markov-chain Monte Carlo, as proposed by Metropolis et al. in 1953, appears as a sequence of random walks in high-dimensional polytopes, while the moves of the more powerful event-chain algorithm correspond to molecular dynamics evolution. We determine the convergence properties of Monte Carlo methods in a special invariant polytope associated with hard-disk configurations, and the implications for convergence of hard-disk sampling. Finally, we discuss parallelization strategies for event-chain Monte Carlo and present results for a multicore implementation
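
    The local Markov-chain Monte Carlo moves referred to above (the 1953-style algorithm) are easy to sketch for hard disks in a periodic box. The particle number, box size, radius, and step size below are illustrative assumptions, and the event-chain algorithm discussed in the record is not shown.

      import random

      L_BOX, RADIUS, STEP = 1.0, 0.05, 0.02

      def overlaps(pos, i, trial):
          for j, p in enumerate(pos):
              if j == i:
                  continue
              dx = (trial[0] - p[0] + 0.5 * L_BOX) % L_BOX - 0.5 * L_BOX  # minimum image
              dy = (trial[1] - p[1] + 0.5 * L_BOX) % L_BOX - 0.5 * L_BOX
              if dx * dx + dy * dy < (2.0 * RADIUS) ** 2:
                  return True
          return False

      def sweep(pos):
          for _ in range(len(pos)):
              i = random.randrange(len(pos))
              x, y = pos[i]
              trial = ((x + random.uniform(-STEP, STEP)) % L_BOX,
                       (y + random.uniform(-STEP, STEP)) % L_BOX)
              if not overlaps(pos, i, trial):     # accept only non-overlapping positions
                  pos[i] = trial

      positions = [((i + 0.5) / 5.0, (j + 0.5) / 5.0) for i in range(5) for j in range(5)]
      for _ in range(100):
          sweep(positions)
      print(positions[0])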

  8. Problems in radiation shielding calculations with Monte Carlo methods

    International Nuclear Information System (INIS)

    Ueki, Kohtaro

    1985-01-01

    The Monte Carlo method is a very useful tool for solving a large class of radiation transport problems. In contrast with deterministic methods, geometric complexity is a much less significant problem for Monte Carlo calculations. However, the accuracy of Monte Carlo calculations is, of course, limited by the statistical error of the quantities to be estimated. In this report, we point out some typical problems in solving large shielding systems that include radiation streaming. The Monte Carlo coupling technique was developed to treat such shielding problems accurately. However, the variance of Monte Carlo results obtained with the coupling technique for detectors located outside the radiation streaming was still not small enough. To obtain more accurate results for detectors located outside the streaming path, and also for a multi-legged-duct streaming problem, a practicable ''prism scattering technique'' is proposed in this study. (author)

  9. Cluster monte carlo method for nuclear criticality safety calculation

    International Nuclear Information System (INIS)

    Pei Lucheng

    1984-01-01

    One of the most important applications of the Monte Carlo method is the calculation of nuclear criticality safety. The fair source game problem was presented at almost the same time as the Monte Carlo method was first applied to calculating nuclear criticality safety. The source iteration cost may be reduced as much as possible, or no source iteration may be needed at all. Problems of this kind all belong to the fair source game problems, among which the optimal source game requires no source iteration. Although the single-neutron Monte Carlo method solved the problem without source iteration, it still has an apparent shortcoming: it solves the problem without source iteration only in the asymptotic sense. In this work, a new Monte Carlo method, called the cluster Monte Carlo method, is given to solve the problem further

  10. Pore-scale uncertainty quantification with multilevel Monte Carlo

    KAUST Repository

    Icardi, Matteo; Hoel, Haakon; Long, Quan; Tempone, Raul

    2014-01-01

    . Since there are no generic ways to parametrize the randomness in the porescale structures, Monte Carlo techniques are the most accessible to compute statistics. We propose a multilevel Monte Carlo (MLMC) technique to reduce the computational cost

  11. Wielandt acceleration for MCNP5 Monte Carlo eigenvalue calculations

    International Nuclear Information System (INIS)

    Brown, F.

    2007-01-01

    Monte Carlo criticality calculations use the power iteration method to determine the eigenvalue (k_eff) and eigenfunction (fission source distribution) of the fundamental mode. A recently proposed method for accelerating convergence of the Monte Carlo power iteration using Wielandt's method has been implemented in a test version of MCNP5. The method is shown to provide dramatic improvements in convergence rates and to greatly reduce the possibility of false convergence assessment. The method is effective and efficient, improving the Monte Carlo figure-of-merit for many problems. In addition, the method should eliminate most of the underprediction bias in confidence intervals for Monte Carlo criticality calculations. (authors)
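
    The reason a Wielandt shift accelerates the power iteration can be seen already in a small deterministic analogue (this is an illustration only, not the MCNP5 implementation, and the matrix and shift are invented): iterating with the shifted, inverted operator separates the fundamental mode from the higher modes much more strongly than iterating with the operator itself.

      import numpy as np

      A = np.array([[0.9, 0.4, 0.1],
                    [0.2, 0.8, 0.3],
                    [0.1, 0.2, 0.7]])
      evals, evecs = np.linalg.eig(A)
      idx = np.argmax(np.real(evals))
      true_vec = np.real(evecs[:, idx])
      true_vec /= np.linalg.norm(true_vec)
      shift = float(np.real(evals[idx])) + 0.05      # shift chosen just above the dominant eigenvalue

      def error_after(n_iter, shifted):
          x = np.ones(3) / np.sqrt(3.0)
          M = np.linalg.inv(A - shift * np.eye(3)) if shifted else A
          for _ in range(n_iter):
              x = M @ x
              x /= np.linalg.norm(x)
          return min(np.linalg.norm(x - true_vec), np.linalg.norm(x + true_vec))

      for n in (1, 3, 5, 10):                        # shifted iteration converges far faster
          print(n, error_after(n, shifted=False), error_after(n, shifted=True))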

  12. Monte Carlo shielding analyses using an automated biasing procedure

    International Nuclear Information System (INIS)

    Tang, J.S.; Hoffman, T.J.

    1988-01-01

    A systematic and automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete ordinates calculation are used to generate biasing parameters for a Monte Carlo calculation. The entire procedure of adjoint calculation, biasing parameters generation, and Monte Carlo calculation has been automated. The automated biasing procedure has been applied to several realistic deep-penetration shipping cask problems. The results obtained for neutron and gamma-ray transport indicate that with the automated biasing procedure Monte Carlo shielding calculations of spent-fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost

  13. Applications of the Monte Carlo method in radiation protection

    International Nuclear Information System (INIS)

    Kulkarni, R.N.; Prasad, M.A.

    1999-01-01

    This paper gives a brief introduction to the application of the Monte Carlo method in radiation protection. It may be noted that an exhaustive review has not been attempted. The special advantage of the Monte Carlo method is first brought out. The fundamentals of the Monte Carlo method are next explained in brief, with special reference to two applications in radiation protection. Some current applications are then briefly reported as examples: medical radiation physics, microdosimetry, calculations of thermoluminescence intensity, and probabilistic safety analysis. The limitations of the Monte Carlo method have also been mentioned in passing. (author)

  14. Irregular Warfare in the American West: The Geronimo Campaign

    Science.gov (United States)

    2010-05-05

    reaching the United States while under the escort of Apache scouts. Crook's biographer and fellow Indian Wars veteran Captain John G. Bourke described... worst. Bourke observed, however, that the psychological impact of being tracked by their own tribesmen, who were backed by highly mobile... hold a temporary higher rank. Brevet ranks disappeared from the U.S. military at the end of the nineteenth century. 5 Captain John G. Bourke, An

  15. Current and future applications of Monte Carlo

    International Nuclear Information System (INIS)

    Zaidi, H.

    2003-01-01

    Full text: The use of radionuclides in medicine has a long history and encompasses a large area of applications including diagnosis and radiation treatment of cancer patients using either external or radionuclide radiotherapy. The 'Monte Carlo method'describes a very broad area of science, in which many processes, physical systems, and phenomena are simulated by statistical methods employing random numbers. The general idea of Monte Carlo analysis is to create a model, which is as similar as possible to the real physical system of interest, and to create interactions within that system based on known probabilities of occurrence, with random sampling of the probability density functions (pdfs). As the number of individual events (called 'histories') is increased, the quality of the reported average behavior of the system improves, meaning that the statistical uncertainty decreases. The use of the Monte Carlo method to simulate radiation transport has become the most accurate means of predicting absorbed dose distributions and other quantities of interest in the radiation treatment of cancer patients using either external or radionuclide radiotherapy. The same trend has occurred for the estimation of the absorbed dose in diagnostic procedures using radionuclides as well as the assessment of image quality and quantitative accuracy of radionuclide imaging. As a consequence of this generalized use, many questions are being raised primarily about the need and potential of Monte Carlo techniques, but also about how accurate it really is, what would it take to apply it clinically and make it available widely to the nuclear medicine community at large. Many of these questions will be answered when Monte Carlo techniques are implemented and used for more routine calculations and for in-depth investigations. In this paper, the conceptual role of the Monte Carlo method is briefly introduced and followed by a survey of its different applications in diagnostic and therapeutic

  16. Quantum statistical Monte Carlo methods and applications to spin systems

    International Nuclear Information System (INIS)

    Suzuki, M.

    1986-01-01

    A short review is given concerning the quantum statistical Monte Carlo method based on the equivalence theorem that d-dimensional quantum systems are mapped onto (d+1)-dimensional classical systems. The convergence property of this approximate transformation is discussed in detail. Some applications of this general approach to quantum spin systems are reviewed. A new Monte Carlo method, ''thermo field Monte Carlo method,'' is presented, which is an extension of the projection Monte Carlo method at zero temperature to that at finite temperatures

  17. SPQR: a Monte Carlo reactor kinetics code

    International Nuclear Information System (INIS)

    Cramer, S.N.; Dodds, H.L.

    1980-02-01

    The SPQR Monte Carlo code has been developed to analyze fast reactor core accident problems where conventional methods are considered inadequate. The code is based on the adiabatic approximation of the quasi-static method. This initial version contains no automatic material motion or feedback. An existing Monte Carlo code is used to calculate the shape functions and the integral quantities needed in the kinetics module. Several sample problems have been devised and analyzed. Due to the large statistical uncertainty associated with the calculation of reactivity in accident simulations, the results, especially at later times, differ greatly from deterministic methods. It was also found that in large uncoupled systems, the Monte Carlo method has difficulty in handling asymmetric perturbations

  18. Optix: A Monte Carlo scintillation light transport code

    Energy Technology Data Exchange (ETDEWEB)

    Safari, M.J., E-mail: mjsafari@aut.ac.ir [Department of Energy Engineering and Physics, Amir Kabir University of Technology, PO Box 15875-4413, Tehran (Iran, Islamic Republic of); Afarideh, H. [Department of Energy Engineering and Physics, Amir Kabir University of Technology, PO Box 15875-4413, Tehran (Iran, Islamic Republic of); Ghal-Eh, N. [School of Physics, Damghan University, PO Box 36716-41167, Damghan (Iran, Islamic Republic of); Davani, F. Abbasi [Nuclear Engineering Department, Shahid Beheshti University, PO Box 1983963113, Tehran (Iran, Islamic Republic of)

    2014-02-11

    The paper reports on the capabilities of the Monte Carlo scintillation light transport code Optix, which is an extended version of the previously introduced code Optics. Optix provides the user with a variety of both numerical and graphical outputs with a very simple and user-friendly input structure. A benchmarking strategy has been adopted based on comparison with experimental results, semi-analytical solutions, and other Monte Carlo simulation codes to verify various aspects of the developed code. In addition, extensive comparisons have been made against the tracking abilities of the general-purpose MCNPX and FLUKA codes. The presented benchmark results for the Optix code exhibit promising agreement. -- Highlights: • Monte Carlo simulation of scintillation light transport in 3D geometry. • Evaluation of angular distribution of detected photons. • Benchmark studies to check the accuracy of Monte Carlo simulations.

  19. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees which have the highest similarity to the true trees, and the model parameter estimates which have the smallest mean square errors, but costs the least CPU time. © 2009 Elsevier Inc. All rights reserved.

  20. Working With Abusive/Neglectful Indian Parents. Revised.

    Science.gov (United States)

    National Indian Child Abuse and Neglect Resource Center, Tulsa, OK.

    Considering such factors as disruption of Indian families caused by Anglo educational programs (missionary schools, BIA boarding schools), by Indian relocation programs, and other non-Indian institutions, many of today's abusive and neglectful Indian parents were victims as children in these same institutions. The 9-page information sheet offers a…

  1. Present status and future prospects of neutronics Monte Carlo

    International Nuclear Information System (INIS)

    Gelbard, E.M.

    1990-01-01

    It is fair to say that the Monte Carlo method, over the last decade, has grown steadily more important as a neutronics computational tool. Apparently this has happened for assorted reasons. Thus, for example, as the power of computers has increased, the cost of the method has dropped, steadily becoming less and less of an obstacle to its use. In addition, more and more sophisticated input processors have now made it feasible to model extremely complicated systems routinely with really remarkable fidelity. Finally, as we demand greater and greater precision in reactor calculations, Monte Carlo is often found to be the only method accurate enough for use in benchmarking. Cross section uncertainties are now almost the only inherent limitations in our Monte Carlo capabilities. For this reason Monte Carlo has come to occupy a special position, interposed between experiment and other computational techniques. More and more often deterministic methods are tested by comparison with Monte Carlo, and cross sections are tested by comparing Monte Carlo with experiment. In this way one can distinguish very clearly between errors due to flaws in our numerical methods, and those due to deficiencies in cross section files. The special role of Monte Carlo as a benchmarking tool, often the only available benchmarking tool, makes it crucially important that this method should be polished to perfection. Problems relating to Eigenvalue calculations, variance reduction and the use of advanced computers are reviewed in this paper. (author)

  2. Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians

    Science.gov (United States)

    Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan

    2018-02-01

    Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and thus for whom destructive interference is not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as for solving MAX-k -SAT problems, use k -local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n -body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.

  3. Neutron point-flux calculation by Monte Carlo

    International Nuclear Information System (INIS)

    Eichhorn, M.

    1986-04-01

    A survey of the usual methods for estimating flux at a point is given. The associated variance-reducing techniques in direct Monte Carlo games are explained. The multigroup Monte Carlo codes MC for critical systems and PUNKT for point source-point detector-systems are represented, and problems in applying the codes to practical tasks are discussed. (author)

  4. Frequency domain Monte Carlo simulation method for cross power spectral density driven by periodically pulsed spallation neutron source using complex-valued weight Monte Carlo

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro

    2014-01-01

    Highlights: • The cross power spectral density in ADS has correlated and uncorrelated components. • A frequency domain Monte Carlo method to calculate the uncorrelated one is developed. • The method solves the Fourier transformed transport equation. • The method uses complex-valued weights to solve the equation. • The new method reproduces well the CPSDs calculated with time domain MC method. - Abstract: In an accelerator driven system (ADS), pulsed spallation neutrons are injected at a constant frequency. The cross power spectral density (CPSD), which can be used for monitoring the subcriticality of the ADS, is composed of the correlated and uncorrelated components. The uncorrelated component is described by a series of the Dirac delta functions that occur at the integer multiples of the pulse repetition frequency. In the present paper, a Monte Carlo method to solve the Fourier transformed neutron transport equation with a periodically pulsed neutron source term has been developed to obtain the CPSD in ADSs. Since the Fourier transformed flux is a complex-valued quantity, the Monte Carlo method introduces complex-valued weights to solve the Fourier transformed equation. The Monte Carlo algorithm used in this paper is similar to the one that was developed by the author of this paper to calculate the neutron noise caused by cross section perturbations. The newly-developed Monte Carlo algorithm is benchmarked to the conventional time domain Monte Carlo simulation technique. The CPSDs are obtained both with the newly-developed frequency domain Monte Carlo method and the conventional time domain Monte Carlo method for a one-dimensional infinite slab. The CPSDs obtained with the frequency domain Monte Carlo method agree well with those with the time domain method. The higher order mode effects on the CPSD in an ADS with a periodically pulsed neutron source are discussed

  5. Shell model the Monte Carlo way

    International Nuclear Information System (INIS)

    Ormand, W.E.

    1995-01-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined

  6. Shell model the Monte Carlo way

    Energy Technology Data Exchange (ETDEWEB)

    Ormand, W.E.

    1995-03-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.

  7. Research on perturbation based Monte Carlo reactor criticality search

    International Nuclear Information System (INIS)

    Li Zeguang; Wang Kan; Li Yangliu; Deng Jingkang

    2013-01-01

    Criticality search is a very important aspect of reactor physics analysis. Owing to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. The traditional Monte Carlo criticality search method suffers from the large number of individual criticality runs required and from the uncertainty and fluctuation of Monte Carlo results. A new Monte Carlo criticality search method based on perturbation calculation is put forward in this paper to overcome the disadvantages of the traditional method. Using only one criticality run to obtain the initial k_eff and the differential coefficients with respect to the concerned parameter, a polynomial estimator of the k_eff response function is solved to obtain the critical value of that parameter. The feasibility of this method was tested. The results show that the accuracy and efficiency of the perturbation-based criticality search method are quite inspiring and that the method overcomes the disadvantages of the traditional one. (authors)
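
    The search step itself reduces to root finding once a single run has supplied k_eff and its derivatives with respect to the search parameter. The sketch below assumes those values are already available from the perturbation estimates (the numbers are invented) and solves the resulting Taylor polynomial for k_eff(p) = 1.

      import numpy as np

      # Assumed outputs of one criticality run with perturbation estimates:
      k0, dk_dp, d2k_dp2, p0 = 1.02, -0.008, 1.0e-4, 5.0

      # Solve k(p) - 1 = 0 with k expanded in powers of (p - p0); highest power first.
      coeffs = [0.5 * d2k_dp2, dk_dp, k0 - 1.0]
      roots = np.roots(coeffs)
      real_roots = [p0 + r.real for r in roots if abs(r.imag) < 1.0e-12]
      critical_p = min(real_roots, key=lambda p: abs(p - p0))   # keep the root nearest the reference
      print("estimated critical parameter:", critical_p)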

  8. The Comprehensive View of Indian Education.

    Science.gov (United States)

    Kaegi, Gerda

    Relating historical conflicts between Indians and whites, the document explained how education was originally aimed at "civilizing" and domesticating the Canadian Indian. This philosophy, used extensively by church groups that established the original Indian schools, alienated children from both the white society and the educational…

  9. Monte Carlo learning/biasing experiment with intelligent random numbers

    International Nuclear Information System (INIS)

    Booth, T.E.

    1985-01-01

    A Monte Carlo learning and biasing technique is described that does its learning and biasing in the random number space rather than the physical phase-space. The technique is probably applicable to all linear Monte Carlo problems, but no proof is provided here. Instead, the technique is illustrated with a simple Monte Carlo transport problem. Problems encountered, problems solved, and speculations about future progress are discussed. 12 refs

  10. Temperature variance study in Monte-Carlo photon transport theory

    International Nuclear Information System (INIS)

    Giorla, J.

    1985-10-01

    We study different Monte-Carlo methods for solving radiative transfer problems, and particularly Fleck's Monte-Carlo method. We first give the different time-discretization schemes and the corresponding stability criteria. Then we write the temperature variance as a function of the variances of temperature and absorbed energy at the previous time step. Finally we obtain some stability criteria for the Monte-Carlo method in the stationary case. [fr]

  11. Monte Carlo applications to radiation shielding problems

    International Nuclear Information System (INIS)

    Subbaiah, K.V.

    2009-01-01

    Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. However, basic concepts of MC are both simple and straightforward and can be learned by using a personal computer. Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event where the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam, coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCS) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdf) of the random variables that characterize a track; 1) free path between successive interaction events, 2) type of interaction taking place and 3) energy loss and angular deflection in a particular event (and initial state of emitted secondary particles, if any). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
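
    The "appropriate sampling methods" mentioned above usually mean inversion of the cumulative distribution or rejection against an envelope. Both are shown below for the free-flight pdf p(s) = Sigma*exp(-Sigma*s); the cross-section value is an invented illustration, not data from the record.

      import math, random

      SIGMA = 2.0   # macroscopic total cross section, 1/cm (invented)

      def sample_by_inversion():
          # Invert the cumulative distribution F(s) = 1 - exp(-SIGMA * s).
          return -math.log(1.0 - random.random()) / SIGMA

      def sample_by_rejection(s_max=10.0):
          # Accept points falling under the pdf inside the box [0, s_max] x [0, SIGMA].
          while True:
              s = random.uniform(0.0, s_max)
              if random.uniform(0.0, SIGMA) <= SIGMA * math.exp(-SIGMA * s):
                  return s

      n = 100_000
      print(sum(sample_by_inversion() for _ in range(n)) / n)   # both approach 1/SIGMA = 0.5
      print(sum(sample_by_rejection() for _ in range(n)) / n)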

  12. Randomized quasi-Monte Carlo simulation of fast-ion thermalization

    Science.gov (United States)

    Höök, L. J.; Johnson, T.; Hellsten, T.

    2012-01-01

    This work investigates the applicability of the randomized quasi-Monte Carlo method for simulation of fast-ion thermalization processes in fusion plasmas, e.g. for simulation of neutral beam injection and radio frequency heating. In contrast to the standard Monte Carlo method, the quasi-Monte Carlo method uses deterministic numbers instead of pseudo-random numbers and has a statistical weak convergence close to O(N^{-1}), where N is the number of markers. We have compared different quasi-Monte Carlo methods for a neutral beam injection scenario, which is solved by many realizations of the associated stochastic differential equation, discretized with the Euler-Maruyama scheme. The statistical convergence of the methods is measured for time steps up to 214.
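
    The practical difference between pseudo-random and randomized quasi-Monte Carlo points can be shown with a one-dimensional toy integral (unrelated to the fast-ion code of the record): a base-2 van der Corput sequence with a random shift modulo 1 against ordinary pseudo-random sampling.

      import random

      def van_der_corput(i, base=2):
          q, denom = 0.0, 1.0
          while i > 0:
              i, r = divmod(i, base)
              denom *= base
              q += r / denom
          return q

      def f(u):
          return u * u          # E[f(U)] = 1/3 for U ~ Uniform(0, 1)

      def mc_estimate(n):
          return sum(f(random.random()) for _ in range(n)) / n

      def rqmc_estimate(n):
          shift = random.random()                                   # random shift keeps it unbiased
          return sum(f((van_der_corput(i) + shift) % 1.0) for i in range(n)) / n

      n = 4096
      print(abs(mc_estimate(n) - 1.0 / 3.0))     # error typically of order n**-0.5
      print(abs(rqmc_estimate(n) - 1.0 / 3.0))   # error typically much closer to order n**-1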

  13. Smile characterization by U.S. white, U.S. Asian Indian, and Indian populations.

    Science.gov (United States)

    Sharma, Neeru; Rosenstiel, Stephen F; Fields, Henry W; Beck, F Mike

    2012-05-01

    With growing demand for high esthetic standards, dentists must understand patient perception and incorporate their preferences into treatment. However, little is known about how cultural and ethnic differences influence esthetic perception. The purpose of this study was to determine whether differences in ethnic background, including the possibility of assimilation, affected a layperson's perception of esthetic and smile characteristics. A survey was developed containing images that were digitally manipulated into a series of barely perceptible steps, changing 1 smile parameter to form a strip of images that displayed that parameter over a wide range. Data were collected with a customized program which randomly displayed a single image and allowed the subject to use the mouse to adjust an on-screen slider according to displayed instructions, that is, "Please move the slider to select the image you find to be most ideal"; or "Please move the slider to select the first image that you find unattractive." A convenience sample (n=288) comprised of U.S. whites, U.S. Asian Indians, and Indians living in India was surveyed. This sample provided a power of .86 to detect a difference of ±1.5 mm. Subjects evaluated images showing the smile arc, buccal corridor, gingival display, vertical overlap, lateral incisal step, maxillary midline to midface, and maxillary to mandibular midline. Rater reliability was assessed with the Fleiss-Cohen weighted Kappa (Kw) statistic and corresponding 95% confidence interval after each question was repeated in a random sequence. Choice differences due to ethnicity were assessed with a multiple randomization test and the adjusted P value with the step-down Bonferrroni method of Holm (α=.05). The Kw for the 17 variables in all 3 groups ranged from 0.11 for ideal vertical overlap to 0.64 for ideal buccal corridor space. Overall reliability was fair to moderate. Differences attributed to ethnicity were demonstrated between the Asian Indians and U

  14. Debating the Social Thinking of Carlos Nelson Coutinho

    Directory of Open Access Journals (Sweden)

    Bruno Bruziguessi

    2017-10-01

    Full Text Available BRAZ, Marcelo; RODRIGUES, Mavi (Org.). Cultura, democracia e socialismo: as idéias de Carlos Nelson Coutinho em debate. [Culture, democracy and socialism: The ideas of Carlos Nelson Coutinho in debate]. Rio de Janeiro: Mórula, 2016. 248 p.

  15. A Monte Carlo algorithm for the Vavilov distribution

    International Nuclear Information System (INIS)

    Yi, Chul-Young; Han, Hyon-Soo

    1999-01-01

    Using the convolution property of the inverse Laplace transform, an improved Monte Carlo algorithm for the Vavilov energy-loss straggling distribution of the charged particle is developed, which is relatively simple and gives enough accuracy to be used for most Monte Carlo applications

  16. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced in Michael B. Giles. (Michael Giles. Oper. Res. 56(3):607–617, 2008.) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. The work (Michael Giles. Oper. Res. 56(3):607–617, 2008.) proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, Forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Raúl Tempone. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005.). This form of the adaptive algorithm generates stochastic, path dependent, time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL^{-3}) using a single level version of the adaptive algorithm to O((TOL^{-1} log(TOL))^2).
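
    A two-level version of the idea, with uniform (non-adaptive) time steps, fits in a short sketch. It estimates E[X_T] for a geometric Brownian motion with forward Euler on a coarse and a fine grid coupled through shared Brownian increments; all parameters are invented and the adaptive, stopped-diffusion machinery of the record is not reproduced.

      import math, random

      DRIFT, VOL, X0, T = 0.05, 0.2, 1.0, 1.0

      def euler_path(n_steps, dws):
          dt = T / n_steps
          x = X0
          for dw in dws:
              x += DRIFT * x * dt + VOL * x * dw
          return x

      def level_correction(n_fine):
          # One coupled sample of (fine - coarse) using shared Brownian increments.
          dt = T / n_fine
          dws = [random.gauss(0.0, math.sqrt(dt)) for _ in range(n_fine)]
          coarse_dws = [dws[2 * k] + dws[2 * k + 1] for k in range(n_fine // 2)]
          return euler_path(n_fine, dws) - euler_path(n_fine // 2, coarse_dws)

      def mlmc_two_level(n0=20_000, n1=2_000):
          dt0 = T / 2
          level0 = sum(euler_path(2, [random.gauss(0.0, math.sqrt(dt0)) for _ in range(2)])
                       for _ in range(n0)) / n0                    # coarse estimator
          correction = sum(level_correction(4) for _ in range(n1)) / n1
          return level0 + correction                               # telescoping sum

      print(mlmc_two_level(), X0 * math.exp(DRIFT * T))            # estimate vs exact mean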

  17. Effects of Alcohol Use and Anti-American Indian Attitudes on Domestic-Violence Culpability Decisions for American Indian and Euro-American Actors

    Science.gov (United States)

    Esqueda, Cynthia Willis; Hack, Lori; Tehee, Melissa

    2010-01-01

    Few studies have focused on the unique issues surrounding American Indian violence. Yet American Indian women are at high risk for domestic abuse, and domestic violence has been identified as the most important issue for American Indians now and in the future by the National Congress of American Indians. American Indian women suffer from domestic…

  18. Nested Sampling with Constrained Hamiltonian Monte Carlo

    OpenAIRE

    Betancourt, M. J.

    2010-01-01

    Nested sampling is a powerful approach to Bayesian inference ultimately limited by the computationally demanding task of sampling from a heavily constrained probability distribution. An effective algorithm in its own right, Hamiltonian Monte Carlo is readily adapted to efficiently sample from any smooth, constrained distribution. Utilizing this constrained Hamiltonian Monte Carlo, I introduce a general implementation of the nested sampling algorithm.

  19. New associates | Announcements | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Sushmee Badhulika, Indian Institute of Technology, Hyderabad ... Sankar Chakma, Indian Institute of Science Education & Research, Bhopal Joydeep ... B Praveen Kumar, Indian National Centre for Ocean Information Services, Hyderabad

  20. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Director, Indian Institute of Science Education & Research, .... Address: Visiting Professor, CORAL, Indian Institute of Technology, ..... Specialization: Elementary Particles & High Energy Physics, Plasma Physics and Atomic Physics

  1. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Department of Chemistry, Indian Institute of Technology, Powai, Mumbai .... Address: Emeritus Professor, National Institute of Advanced Studies, Indian .... Specialization: High Energy & Elementary Particle Physics, Supersymmetric ...

  2. Monte Carlo computation in the applied research of nuclear technology

    International Nuclear Information System (INIS)

    Xu Shuyan; Liu Baojie; Li Qin

    2007-01-01

    This article briefly introduces Monte Carlo methods and their properties. It describes the Monte Carlo methods with emphasis on their applications to several domains of nuclear technology. Monte Carlo simulation methods and several commonly used software packages that implement them are also introduced. The proposed methods are demonstrated by a real example. (authors)

  3. Indian concepts on sexuality.

    Science.gov (United States)

    Chakraborty, Kaustav; Thakurata, Rajarshi Guha

    2013-01-01

    India is a vast country depicting wide social, cultural and sexual variations. Indian concept of sexuality has evolved over time and has been immensely influenced by various rulers and religions. Indian sexuality is manifested in our attire, behavior, recreation, literature, sculptures, scriptures, religion and sports. It has influenced the way we perceive our health, disease and device remedies for the same. In modern era, with rapid globalization the unique Indian sexuality is getting diffused. The time has come to rediscover ourselves in terms of sexuality to attain individual freedom and to reinvest our energy to social issues related to sexuality.

  4. U. S. and Canadian Indian Periodicals.

    Science.gov (United States)

    Price, John

    The document lists and discusses Indian-published and Indian-oriented newspapers, periodicals, and other assorted publications generally designed to establish a communication system reflecting the interest of the majority of American Indians. Also provided are resumes of several publications that are thought to have gained wide acceptance through…

  5. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.

    1996-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, thermal behavior of γ-soft nuclei, and calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs

  6. Statistics of Monte Carlo methods used in radiation transport calculation

    International Nuclear Information System (INIS)

    Datta, D.

    2009-01-01

    Radiation transport calculations can be carried out using either deterministic or statistical methods. Radiation transport calculation based on statistical methods is the basic theme of the Monte Carlo methods. The aim of this lecture is to describe the fundamental statistics required to build the foundations of the Monte Carlo technique for radiation transport calculation. The lecture note is organized in the following way. Section (1) describes the basics of Monte Carlo and its classification with respect to the field of application. Section (2) describes random sampling methods, a key component of Monte Carlo radiation transport calculation. Section (3) provides the statistical uncertainty of Monte Carlo estimates. Section (4) describes in brief the importance of variance reduction techniques while sampling particles such as photons or neutrons in the process of radiation transport
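
    Section (3) of the note concerns exactly the kind of calculation sketched below (the integrand is an arbitrary choice): a Monte Carlo estimate together with its one-sigma statistical uncertainty, which shrinks like 1/sqrt(N).

      import math, random

      def mc_with_uncertainty(n):
          samples = [math.sin(math.pi * random.random()) for _ in range(n)]
          mean = sum(samples) / n
          var = sum((s - mean) ** 2 for s in samples) / (n - 1)
          return mean, math.sqrt(var / n)        # estimate of 2/pi = 0.6366..., standard error

      for n in (100, 10_000, 1_000_000):
          est, err = mc_with_uncertainty(n)
          print(n, round(est, 4), round(err, 4))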

  7. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  8. Forest canopy BRDF simulation using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.

    2006-01-01

    The Monte Carlo method is a random statistical method, which has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopies in the field of visible remote sensing. The random process between photons and the forest canopy was designed using the Monte Carlo method.

  9. Associateship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Dept. of Electrical Engineering, Indian Institute of Technology, Kandi, ... Specialization: Elementary Particle Physics Address during Associateship: Centre for Theoretical Studies, Indian Institute of Science, Bangalore 560 012.

  10. Discrete Diffusion Monte Carlo for Electron Thermal Transport

    Science.gov (United States)

    Chenhall, Jeffrey; Cao, Duc; Wollaeger, Ryan; Moses, Gregory

    2014-10-01

    The iSNB (implicit Schurtz-Nicolai-Busquet) electron thermal transport method of Cao et al. is adapted to a Discrete Diffusion Monte Carlo (DDMC) solution method for eventual inclusion in a hybrid IMC-DDMC (Implicit Monte Carlo) method. The hybrid method will combine the efficiency of a diffusion method in short mean free path regions with the accuracy of a transport method in long mean free path regions. The Monte Carlo nature of the approach allows the algorithm to be massively parallelized. Work to date on the iSNB-DDMC method will be presented. This work was supported by Sandia National Laboratory - Albuquerque.

  11. Monte Carlo techniques in diagnostic and therapeutic nuclear medicine

    International Nuclear Information System (INIS)

    Zaidi, H.

    2002-01-01

    Monte Carlo techniques have become one of the most popular tools in different areas of medical radiation physics following the development and subsequent implementation of powerful computing systems for clinical use. In particular, they have been extensively applied to simulate processes involving random behaviour and to quantify physical parameters that are difficult or even impossible to calculate analytically or to determine by experimental measurements. The use of the Monte Carlo method to simulate radiation transport turned out to be the most accurate means of predicting absorbed dose distributions and other quantities of interest in the radiation treatment of cancer patients using either external or radionuclide radiotherapy. The same trend has occurred for the estimation of the absorbed dose in diagnostic procedures using radionuclides. There is broad consensus in accepting that the earliest Monte Carlo calculations in medical radiation physics were made in the area of nuclear medicine, where the technique was used for dosimetry modelling and computations. Formalism and data based on Monte Carlo calculations, developed by the Medical Internal Radiation Dose (MIRD) committee of the Society of Nuclear Medicine, were published in a series of supplements to the Journal of Nuclear Medicine, the first one being released in 1968. Some of these pamphlets made extensive use of Monte Carlo calculations to derive specific absorbed fractions for electron and photon sources uniformly distributed in organs of mathematical phantoms. Interest in Monte Carlo-based dose calculations with β-emitters has been revived with the application of radiolabelled monoclonal antibodies to radioimmunotherapy. As a consequence of this generalized use, many questions are being raised primarily about the need and potential of Monte Carlo techniques, but also about how accurate it really is, what would it take to apply it clinically and make it available widely to the medical physics

  12. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  13. Off-diagonal expansion quantum Monte Carlo.

    Science.gov (United States)

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  14. American Indian Community Colleges.

    Science.gov (United States)

    One Feather, Gerald

    With the emergence of reservation-based community colleges (the Navajo Community College and the Dakota Community Colleges), the American Indian people, as decision makers in these institutions, are providing Indians with the technical skills and cultural knowledge necessary for self-determination. Confronted with limited numbers of accredited…

  15. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 19; Issue 8. Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article Volume 19 Issue 8 August 2014 pp 713-739 ...

  16. Dynamic bounds coupled with Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Rajabalinejad, M., E-mail: M.Rajabalinejad@tudelft.n [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands); Meester, L.E. [Delft Institute of Applied Mathematics, Delft University of Technology, Delft (Netherlands); Gelder, P.H.A.J.M. van; Vrijling, J.K. [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands)

    2011-02-15

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper describes a method to reduce the simulation cost even further, while retaining the accuracy of Monte Carlo, by taking into account widely present monotonicity. For models exhibiting monotonic (decreasing or increasing) behavior, dynamic bounds (DB) are defined, which in a coupled Monte Carlo simulation are updated dynamically, resulting in a failure probability estimate, as well as a strict (non-probabilistic) upper and lower bounds. Accurate results are obtained at a much lower cost than an equivalent ordinary Monte Carlo simulation. In a two-dimensional and a four-dimensional numerical example, the cost reduction factors are 130 and 9, respectively, where the relative error is smaller than 5%. At higher accuracy levels, this factor increases, though this effect is expected to be smaller with increasing dimension. To show the application of DB method to real world problems, it is applied to a complex finite element model of a flood wall in New Orleans.
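
    The idea can be sketched as follows (a simplification, not the authors' algorithm): for a monotonically increasing limit-state function g, any sample dominated componentwise by an already-evaluated failure point must also fail, and any sample that dominates an already-evaluated safe point must be safe, so the expensive model call can be skipped in both cases. The limit-state function, its dimension and the threshold below are invented for illustration.

        import random

        def g(x):
            """Hypothetical monotonically increasing limit-state function standing in
            for an expensive model (e.g. a finite element run). Failure: g(x) < 0."""
            return sum(x) - 2.2

        def dominates(a, b):
            """True if a <= b componentwise, so g(a) <= g(b) for increasing g."""
            return all(ai <= bi for ai, bi in zip(a, b))

        def mc_with_dynamic_bounds(n=20_000, dim=4, seed=0):
            rng = random.Random(seed)
            failure_pts, safe_pts = [], []       # evaluated points reused as bounds
            failures = evaluations = 0
            for _ in range(n):
                x = tuple(rng.random() for _ in range(dim))
                if any(dominates(x, f) for f in failure_pts):
                    failures += 1                # dominated by a known failure point
                elif any(dominates(s, x) for s in safe_pts):
                    pass                         # dominates a known safe point
                else:
                    evaluations += 1             # only now call the expensive model
                    if g(x) < 0:
                        failures += 1
                        failure_pts.append(x)
                    else:
                        safe_pts.append(x)
            return failures / n, evaluations

        if __name__ == "__main__":
            pf, n_eval = mc_with_dynamic_bounds()
            print(f"P_f ~ {pf:.4f} estimated with only {n_eval} model evaluations")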

  17. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
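
    The MLEM update itself is compact; the sketch below shows one way it might look (a generic illustration, not the authors' code), with a random non-negative matrix standing in for the projection matrix that the study computes with GATE Monte Carlo simulations.

        import numpy as np

        def mlem(A, y, n_iter=50, eps=1e-12):
            """Maximum-Likelihood Expectation-Maximization for y ~ Poisson(A @ x).
            A: (n_detector_bins, n_voxels) non-negative system matrix
            y: (n_detector_bins,) measured counts"""
            sensitivity = A.sum(axis=0)                  # A^T 1
            x = np.ones(A.shape[1])                      # flat initial image
            for _ in range(n_iter):
                forward = A @ x                          # expected projection counts
                ratio = y / np.maximum(forward, eps)
                x *= (A.T @ ratio) / np.maximum(sensitivity, eps)
            return x

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            A = rng.random((200, 64))                    # stand-in for an MC-estimated matrix
            x_true = np.zeros(64)
            x_true[20:28] = 5.0                          # simple "hot" region
            y = rng.poisson(A @ x_true)
            x_hat = mlem(A, y)
            print("reconstructed activity in the hot region:", x_hat[20:28].round(2))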

  18. Seamounts in the Central Indian Ocean Basin: indicators of the Indian plate movement

    Digital Repository Service at National Institute of Oceanography (India)

    Mukhopadhyay, R.; Khadge, N.H.

    [Abstract text not recoverable from this record; source: Proceedings of the Indian Academy of Sciences (Earth and Planetary Sciences) 99, p. 357.]

  19. Randomized quasi-Monte Carlo simulation of fast-ion thermalization

    International Nuclear Information System (INIS)

    Höök, L J; Johnson, T; Hellsten, T

    2012-01-01

    This work investigates the applicability of the randomized quasi-Monte Carlo method for simulation of fast-ion thermalization processes in fusion plasmas, e.g. for simulation of neutral beam injection and radio frequency heating. In contrast to the standard Monte Carlo method, the quasi-Monte Carlo method uses deterministic numbers instead of pseudo-random numbers and has a statistical weak convergence close to O(N^-1), where N is the number of markers. We have compared different quasi-Monte Carlo methods for a neutral beam injection scenario, which is solved by many realizations of the associated stochastic differential equation, discretized with the Euler-Maruyama scheme. The statistical convergence of the methods is measured for time steps up to 2^14. (paper)

  20. Usefulness of the Monte Carlo method in reliability calculations

    International Nuclear Information System (INIS)

    Lanore, J.M.; Kalli, H.

    1977-01-01

    Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies in the Nuclear Research Center at Saclay) are presented. First, an uncertainty analysis is given for a simplified spray system; a Monte Carlo program PATREC-MC has been written to solve the problem with the system components given in the fault tree representation. The second program MONARC 2 has been written to solve the problem of complex systems reliability by the Monte Carlo simulation, here again the system (a residual heat removal system) is in the fault tree representation. Third, the Monte Carlo program MONARC was used instead of the Markov diagram to solve the simulation problem of an electric power supply including two nets and two stand-by diesels

  1. Combinatorial nuclear level density by a Monte Carlo method

    International Nuclear Information System (INIS)

    Cerf, N.

    1994-01-01

    We present a new combinatorial method for the calculation of the nuclear level density. It is based on a Monte Carlo technique, in order to avoid a direct counting procedure which is generally impracticable for high-A nuclei. The Monte Carlo simulation, making use of the Metropolis sampling scheme, allows a computationally fast estimate of the level density for many-fermion systems in large shell model spaces. We emphasize the advantages of this Monte Carlo approach, particularly concerning the prediction of the spin and parity distributions of the excited states, and compare our results with those derived from a traditional combinatorial or a statistical method. Such a Monte Carlo technique seems very promising to determine accurate level densities in a large energy range for nuclear reaction calculations
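
    As a much-simplified illustration of Metropolis sampling over many-fermion configurations (not the scheme of the paper), the sketch below moves one particle at a time between single-particle levels of an invented, equally spaced spectrum and accepts moves with the usual Metropolis probability; it returns the sampled mean energy at a fictitious temperature.

        import math
        import random

        def metropolis_occupations(energies, n_particles, temperature,
                                   n_steps=50_000, seed=2):
            """Metropolis random walk over fermionic occupation patterns with
            weight exp(-E/T); returns the sampled mean total energy."""
            rng = random.Random(seed)
            occupied = set(range(n_particles))        # start from the ground state
            energy = sum(energies[i] for i in occupied)
            total = 0.0
            for _ in range(n_steps):
                src = rng.choice(sorted(occupied))    # particle to move
                dst = rng.randrange(len(energies))    # proposed destination level
                if dst not in occupied:               # Pauli principle: level must be empty
                    d_e = energies[dst] - energies[src]
                    if d_e <= 0 or rng.random() < math.exp(-d_e / temperature):
                        occupied.remove(src)
                        occupied.add(dst)
                        energy += d_e
                total += energy
            return total / n_steps

        if __name__ == "__main__":
            levels = [0.1 * k for k in range(40)]     # toy equally spaced spectrum
            print("<E> ~", round(metropolis_occupations(levels, 8, 0.5), 3))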

  2. Monte Carlo variance reduction approaches for non-Boltzmann tallies

    International Nuclear Information System (INIS)

    Booth, T.E.

    1992-12-01

    Quantities that depend on the collective effects of groups of particles cannot be obtained from the standard Boltzmann transport equation. Monte Carlo estimates of these quantities are called non-Boltzmann tallies and have become increasingly important recently. Standard Monte Carlo variance reduction techniques were designed for tallies based on individual particles rather than groups of particles. Experience with non-Boltzmann tallies and analog Monte Carlo has demonstrated the severe limitations of analog Monte Carlo for many non-Boltzmann tallies. In fact, many calculations absolutely require variance reduction methods to achieve practical computation times. Three different approaches to variance reduction for non-Boltzmann tallies are described and shown to be unbiased. The advantages and disadvantages of each of the approaches are discussed

  3. Discrete diffusion Monte Carlo for frequency-dependent radiative transfer

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Thompson, Kelly G.; Urbatsch, Todd J.

    2011-01-01

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique. (author)

  4. A MISCELLANY ON INDIAN TRADITIONAL MUSIC

    Directory of Open Access Journals (Sweden)

    Rauf Kerimov

    2013-06-01

    Indian music has a very long, unbroken tradition and is an accumulated heritage of centuries. Music in India was popular among all sections of society and intertwined with life and culture from birth to death. Indian music was formed with the evolution of ancient religious and secular music. The Indian culture absorbed all the best that was brought by other nations in the process of historical development. Indian music is quite diverse: there are classical instrumental and vocal works, traditional singing of sacred hymns, folk songs, and the music of different peoples. In contrast to music scholarship, where the typical image is one of regularity, discipline and harmony, the beauty of traditional Indian music lies in the free improvisation used by the performer. Listening carefully to this music, the listener enters a new world of different sounds and discovers a different idea of music. The aim of Indian music, unlike that of European musical culture, is to define, explore, create and move the depths of people's moods. And the Indian instruments are a miracle that could reflect all these philosophical and aesthetic views. Along with the vocal art, this musical tradition has a rich variety of melodic and rhythmic instruments.

  5. Who Writes Carlos Bulosan?

    Directory of Open Access Journals (Sweden)

    Charlie Samuya Veric

    2001-12-01

    The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the Pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Is in the Heart, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? The essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will secure ultimately the continuing importance of Carlos Bulosan to radical literature and history.

  6. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the least restrictions on perturbation but demands computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
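
    A sampling-based analysis can be sketched in a few lines (an illustration only, not one of the paper's examples): draw perturbed inputs from their assumed distributions, evaluate k_eff for each draw, and report the spread of the results. The two-parameter surrogate below stands in for what would, in practice, be a full Monte Carlo criticality calculation per sample; all distributions and coefficients are hypothetical.

        import random
        import statistics

        def keff_surrogate(enrichment, density):
            """Hypothetical smooth surrogate standing in for a full Monte Carlo
            criticality calculation of k_eff as a function of its inputs."""
            return 0.50 + 0.08 * enrichment + 0.02 * density

        def sampled_keff_uncertainty(n_samples=1000, seed=3):
            rng = random.Random(seed)
            samples = []
            for _ in range(n_samples):
                e = rng.gauss(4.0, 0.08)    # enrichment (wt%), assumed 1-sigma of 2%
                d = rng.gauss(10.0, 0.10)   # density (g/cm^3), assumed 1-sigma of 1%
                samples.append(keff_surrogate(e, d))
            return statistics.mean(samples), statistics.stdev(samples)

        if __name__ == "__main__":
            mean_k, sigma_k = sampled_keff_uncertainty()
            print(f"k_eff = {mean_k:.4f} +/- {sigma_k:.4f} (1 sigma, sampled)")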

  7. Modified Monte Carlo procedure for particle transport problems

    International Nuclear Information System (INIS)

    Matthes, W.

    1978-01-01

    The simulation of photon transport in the atmosphere with the Monte Carlo method forms part of the EURASEP-programme. The specifications for the problems posed for a solution were such that the direct application of the analogue Monte Carlo method was not feasible. For this reason the standard Monte Carlo procedure was modified in the sense that additional properly weighted branchings at each collision and transport process in a photon history were introduced. This modified Monte Carlo procedure leads to a clear and logical separation of the essential parts of a problem and offers a large flexibility for variance reducing techniques. More complex problems, as foreseen in the EURASEP-programme (e.g. clouds in the atmosphere, rough ocean-surface and chlorophyll distribution in the ocean) can be handled by recoding some subroutines. This collision- and transport-splitting procedure can of course be performed differently in different space- and energy regions. It is applied here only for a homogeneous problem.

  8. Evaluation of the APACHE II score and the oncologic history as predictors of mortality in the intensive care unit of the INC, September 1996 - December 1997

    International Nuclear Information System (INIS)

    Camargo, David O; Gomez, Clara; Martinez, Teresa

    1999-01-01

    There are multiple severity indexes that have been developed to assess prognosis and quality of life, especially for patients admitted to an intensive care unit (ICU); however, oncologic patients present particularities in their morbidity that imply different behavior of these indexes. In the present work the APACHE scale and the oncologic history are compared as predictors of morbidity and mortality in the ICU. 207 patients who entered the ICU between September 1996 and December 1997 were included. Mortality was 29%, and the stay of most of this group of patients was either shorter than 24 hours or longer than 8 days. On admission, 50% of the patients had APACHE scores above 15; at 48 hours, only 30.4% still had such values. Among patients with hematologic neoplasia, 87% had scores above 15, with a mortality of 63.3%; with scores between 15 and 24 on admission, the risk of dying was 9.8 times higher than with lower scores. For hematologic patients, the risk of dying was 5.7 times higher than for patients with solid tumors. The most altered system was the respiratory one, with an increase in the risk of dying of 2.8 times for each one-point increment in the scale. Contrary to what is described in the literature, the oncologic diagnosis and the stage of the neoplasia did not influence the mortality of these patients

  9. Indian Ocean Rim Cooperation

    DEFF Research Database (Denmark)

    Wippel, Steffen

    Since the mid-1990s, the Indian Ocean has been experiencing increasing economic cooperation among its rim states. Middle Eastern countries, too, participate in the work of the Indian Ocean Rim Association, which received new impetus in the course of the current decade. Notably Oman is a very active...

  10. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Address: Director, Indian Institute of Science Education & Research, Sri Rama ... Address: Department of Chemistry, Indian Institute of Technology, New Delhi 110 016, Delhi ..... Specialization: Elementary Particle Physics, Field Theory and ...

  11. An Overview of the Monte Carlo Application ToolKit (MCATK)

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Travis John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-07

    MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP; it was developed with Agile software engineering methodologies under the motivation to reduce costs. The characteristics of MCATK can be summarized as follows: MCATK physics – continuous energy neutron-gamma transport with multi-temperature treatment, static eigenvalue (k and α) algorithms, a time-dependent algorithm, and fission chain algorithms; MCATK geometry – mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters. Recent work has involved deterministic and Monte Carlo analysis of stochastic systems. Static and dynamic analysis is discussed, and the results of a dynamic test problem are given.

  12. Infant Mortality and American Indians/Alaska Natives

    Science.gov (United States)

    Infant Mortality and American Indians/Alaska Natives ... as compared to non-Hispanic white mothers. Infant Mortality Rate: infant mortality rate per 1,000 live ...

  13. New fellows | Announcements | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    ... of Medical Sciences, New Delhi; S K Bhowmik, Indian Institute of Technology, ... Souvik Mahapatra, Indian Institute of Technology, Mumbai; Prabal K Maiti, Indian ... Math Art and Design: MAD about Math, Math Education and Outreach.

  14. Efficiency and accuracy of Monte Carlo (importance) sampling

    NARCIS (Netherlands)

    Waarts, P.H.

    2003-01-01

    Monte Carlo Analysis is often regarded as the most simple and accurate reliability method. Besides, it is the most transparent method. The only problem is the accuracy in relation to the efficiency: Monte Carlo becomes less efficient or less accurate when very low probabilities are to be computed.
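
    The trade-off can be seen on a textbook example (not taken from the cited report): estimating the small tail probability P(Z > 4) for a standard normal. Crude Monte Carlo almost never scores a hit at practical sample sizes, while importance sampling from a normal shifted to the threshold scores on roughly half the samples and reweights them; the shifted sampling density is a common textbook choice and purely illustrative.

        import math
        import random

        def tail_prob_crude(n, threshold=4.0, seed=5):
            """Crude Monte Carlo estimate of P(Z > threshold), Z ~ N(0, 1)."""
            rng = random.Random(seed)
            hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > threshold)
            return hits / n

        def tail_prob_importance(n, threshold=4.0, seed=5):
            """Importance sampling from N(threshold, 1); the weight is the density
            ratio phi(z) / phi(z - threshold) = exp(threshold**2 / 2 - threshold * z)."""
            rng = random.Random(seed)
            total = 0.0
            for _ in range(n):
                z = rng.gauss(threshold, 1.0)
                if z > threshold:
                    total += math.exp(0.5 * threshold**2 - threshold * z)
            return total / n

        if __name__ == "__main__":
            n = 100_000
            print("crude MC   :", tail_prob_crude(n))       # usually 0 at this sample size
            print("importance :", tail_prob_importance(n))  # close to the exact 3.17e-5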

  15. Evaluating the Intraspecific Interactions of Indian Rosewood (Dalbergia sissoo Roxb.) Trees in the Indian Rosewood Reserve of Khuzestan Province

    Directory of Open Access Journals (Sweden)

    Y. Erfanifard

    2016-05-01

    Positive and negative (facilitative and competitive) interactions of plants are important issues in autecology and can be evaluated by spatial pattern analysis in plant ecosystems. This study investigates the intraspecific interactions of Indian rosewood (Dalbergia sissoo Roxb.) trees in the Indian rosewood Reserve of Khuzestan province. Three 150 m × 200 m plots were selected and the spatial locations of all Indian rosewoods (239 trees) were recorded. Structurally different summary statistics (nearest neighbour distribution function D(r), K2-index K2(r), pair correlation function g(r), and O-ring statistic O(r)) were implemented to analyze the spatial pattern of the trees. The distribution of Indian rosewood trees significantly followed an inhomogeneous Poisson process (α=0.05). The results of D(r) and K2(r) showed that the maximum distance to the nearest tree was 12 m and that density decreased up to this scale. The results of g(r) and O(r) also revealed significant aggregation of Indian rosewood trees at scales of 1.5 to 4 m (α=0.05). In general, it was concluded that Indian rosewood trees had positive intraspecific interactions in the Indian rosewood Reserve of Khuzestan province and that their aggregation showed their facilitative effects on one another.
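
    A nearest-neighbour summary of the kind used above can be illustrated with the Clark-Evans index (a simpler statistic than the D(r), g(r) and O(r) functions of the study): the ratio of the observed mean nearest-neighbour distance to the value 1/(2*sqrt(density)) expected under complete spatial randomness, with values well below 1 indicating aggregation. The clustered point pattern below is synthetic and edge effects are ignored.

        import math
        import random

        def mean_nn_distance(points):
            """Mean distance from each point to its nearest neighbour (brute force)."""
            total = 0.0
            for i, p in enumerate(points):
                total += min(math.dist(p, q) for j, q in enumerate(points) if j != i)
            return total / len(points)

        def clark_evans(points, area):
            """R = observed / expected mean NN distance; R < 1 suggests clustering."""
            density = len(points) / area
            expected = 1.0 / (2.0 * math.sqrt(density))
            return mean_nn_distance(points) / expected

        if __name__ == "__main__":
            rng = random.Random(7)
            width, height = 150.0, 200.0          # plot size matching the study's plots
            # synthetic clustered pattern: offspring scattered around a few parents
            parents = [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(10)]
            points = [(px + rng.gauss(0, 3), py + rng.gauss(0, 3))
                      for px, py in parents for _ in range(24)]
            print("Clark-Evans R =", round(clark_evans(points, width * height), 2))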

  16. Suppression of the initial transient in Monte Carlo criticality simulations; Suppression du regime transitoire initial des simulations Monte-Carlo de criticite

    Energy Technology Data Exchange (ETDEWEB)

    Richet, Y

    2006-12-15

    Criticality Monte Carlo calculations aim at estimating the effective multiplication factor (k-effective) for a fissile system through iterations simulating neutrons propagation (making a Markov chain). Arbitrary initialization of the neutron population can deeply bias the k-effective estimation, defined as the mean of the k-effective computed at each iteration. A simplified model of this cycle k-effective sequence is built, based on characteristics of industrial criticality Monte Carlo calculations. Statistical tests, inspired by Brownian bridge properties, are designed to discriminate stationarity of the cycle k-effective sequence. The initial detected transient is, then, suppressed in order to improve the estimation of the system k-effective. The different versions of this methodology are detailed and compared, firstly on a plan of numerical tests fitted on criticality Monte Carlo calculations, and, secondly on real criticality calculations. Eventually, the best methodologies observed in these tests are selected and allow to improve industrial Monte Carlo criticality calculations. (author)
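
    A minimal illustration of the same idea (a crude heuristic, not the Brownian-bridge-based tests developed in the thesis): keep discarding leading cycles while the first and second halves of the retained cycle k-effective sequence have significantly different means, then average what remains. The synthetic k-effective sequence and the thresholds below are invented.

        import math
        import random
        import statistics

        def truncate_initial_transient(keff_cycles, z_crit=2.0, drop_frac=0.1):
            """Discard leading cycles until the retained sequence looks stationary
            (first-half vs second-half mean comparison); returns the kept cycles."""
            data = list(keff_cycles)
            while len(data) >= 20:
                half = len(data) // 2
                a, b = data[:half], data[half:]
                diff = abs(statistics.mean(a) - statistics.mean(b))
                se = math.sqrt(statistics.variance(a) / len(a) +
                               statistics.variance(b) / len(b))
                if se == 0 or diff / se < z_crit:
                    break                                   # no significant drift left
                data = data[int(len(data) * drop_frac):]    # drop the oldest cycles
            return data

        if __name__ == "__main__":
            rng = random.Random(11)
            # synthetic cycle k-effective: exponential approach to 1.002 plus noise
            cycles = [1.002 - 0.03 * math.exp(-i / 30) + rng.gauss(0, 0.001)
                      for i in range(500)]
            kept = truncate_initial_transient(cycles)
            print(f"kept {len(kept)} of {len(cycles)} cycles, "
                  f"k-effective = {statistics.mean(kept):.5f}")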

  17. Monte Carlo criticality analysis for dissolvers with neutron poison

    International Nuclear Information System (INIS)

    Yu, Deshun; Dong, Xiufang; Pu, Fuxiang.

    1987-01-01

    Criticality analysis for dissolvers with neutron poison is given on the basis of the Monte Carlo method. In Monte Carlo calculations of thermal neutron group parameters for fuel pieces, the neutron transport length is determined in terms of a maximum cross section approach. A set of related effective multiplication factors (k_eff) is calculated by the Monte Carlo method for the three cases. The related numerical results are quite useful for the design and operation of this kind of dissolver in criticality safety analysis. (author)

  18. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    ias

    on the development of nuclear weapons in Los Alamos ..... cantly improved the paper. ... Carlo simulations of solids, Reviews of Modern Physics, Vol.73, pp.33– ... The computer algorithms are usually based on a random seed that starts the ...

  19. Improvements for Monte Carlo burnup calculation

    Energy Technology Data Exchange (ETDEWEB)

    Shenglong, Q.; Dong, Y.; Danrong, S.; Wei, L., E-mail: qiangshenglong@tsinghua.org.cn, E-mail: d.yao@npic.ac.cn, E-mail: songdr@npic.ac.cn, E-mail: luwei@npic.ac.cn [Nuclear Power Inst. of China, Cheng Du, Si Chuan (China)

    2015-07-01

    Monte Carlo burnup calculation is a development trend of reactor physics, and much work remains to be done for engineering applications. Based on the Monte Carlo burnup code MOI, non-fuel burnup calculation methods and critical search suggestions are presented in this paper. For non-fuel burnup, a mixed burnup mode will improve the accuracy and efficiency of the burnup calculation. For the critical search of control rod position, a new method called ABN, based on the ABA method used by MC21, is proposed for the first time in this paper. (author)

  20. Monte Carlo dose distributions for radiosurgery

    International Nuclear Information System (INIS)

    Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E.

    2001-01-01

    The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetrical input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distributions of both a planning system and Monte Carlo. Relative shifts have been measured and, furthermore, dose-volume histograms have been calculated for the target and adjacent organs at risk. (orig.)

  1. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.; Dean, D.J.; Langanke, K.

    1997-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo (SMMC) methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, the thermal and rotational behavior of rare-earth and γ-soft nuclei, and the calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. (orig.)

  2. Monte Carlo Methods in ICF

    Science.gov (United States)

    Zimmerman, George B.

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials.

  3. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, George B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials

  4. American Indian Studies, Multiculturalism, and the Academic Library

    Science.gov (United States)

    Alexander, David L.

    2013-01-01

    The current status of multicultural and diversity efforts suggests the need for incorporating into the discussion of librarianship an understanding of previously underrepresented populations such as the American Indian. American Indian Studies speaks from the American Indian perspective and addresses the contemporary condition of American Indians.…

  5. Comparative Study of Load Testing Tools: Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS), Siege

    Directory of Open Access Journals (Sweden)

    Rabiya Abbas

    2017-12-01

    Software testing is the process of verifying and validating the user's requirements, and testing is an ongoing process throughout software development. Software testing is characterized into three main types. In black box testing, the user has no knowledge of the internal logic and design of the system. In white box testing, the tester knows the internal logic of the code. In grey box testing, the tester has partial knowledge of the internal structure and working of the system; it is commonly used in the case of integration testing. Load testing helps us to analyze the performance of the system under heavy load or under zero load. This is achieved with the help of a load testing tool. The intention of this research is to carry out a comparison of four load testing tools, i.e. Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS), and Siege, based on certain criteria, i.e. test script generation, result reports, application support, plug-in support, and cost. The main focus is to study these load testing tools and identify which tool is better and more efficient. We assume this comparison can help in selecting the most appropriate tool and motivates the use of open source load testing tools.

  6. Quasi Monte Carlo methods for optimization models of the energy industry with pricing and load processes; Quasi-Monte Carlo Methoden fuer Optimierungsmodelle der Energiewirtschaft mit Preis- und Last-Prozessen

    Energy Technology Data Exchange (ETDEWEB)

    Leoevey, H.; Roemisch, W. [Humboldt-Univ., Berlin (Germany)

    2015-07-01

    We discuss progress in quasi-Monte Carlo methods for the numerical calculation of integrals or expected values and justify why these methods are more efficient than the classical Monte Carlo methods. Quasi-Monte Carlo methods are found to be particularly efficient if the integrands have a low effective dimension. That is why we also discuss the concept of effective dimension and show, on the example of a stochastic optimization model from the energy industry, that such models can possess a low effective dimension. Modern quasi-Monte Carlo methods are therefore very promising for such models.

  7. Indian Arts in Canada

    Science.gov (United States)

    Tawow, 1974

    1974-01-01

    A recent publication, "Indian Arts in Canada", examines some of the forces, both past and present, which are not only affecting American Indian artists today, but which will also profoundly influence their future. The review presents a few of the illustrations used in the book, along with the Introduction and the Foreword. (KM)

  8. Indian Treaties: Two Centuries of Dishonor. American Indian Reader: Current Affairs, Volume 5.

    Science.gov (United States)

    Costo, Rupert; Henry, Jeannette

    Today self-determination, economy, tribal jurisdiction, taxation, water and resource rights, and other aspects of American Indian affairs are affected by issues raised through the treaties and agreements made with Indian nations and tribes, and through the executive orders and statutes. Government policy has been influenced by the pressure brought…

  9. BREM5 electroweak Monte Carlo

    International Nuclear Information System (INIS)

    Kennedy, D.C. II.

    1987-01-01

    This is an update on the progress of the BREMMUS Monte Carlo simulator, particularly in its current incarnation, BREM5. The present report is intended only as a follow-up to the Mark II/Granlibakken proceedings, and those proceedings should be consulted for a complete description of the capabilities and goals of the BREMMUS program. The new BREM5 program improves on the previous version of BREMMUS, BREM2, in a number of important ways. In BREM2, the internal loop (oblique) corrections were not treated in consistent fashion, a deficiency that led to renormalization scheme-dependence; i.e., physical results, such as cross sections, were dependent on the method used to eliminate infinities from the theory. Of course, this problem cannot be tolerated in a Monte Carlo designed for experimental use. BREM5 incorporates a new way of treating the oblique corrections, as explained in the Granlibakken proceedings, that guarantees renormalization scheme-independence and dramatically simplifies the organization and calculation of radiative corrections. This technique is to be presented in full detail in a forthcoming paper. BREM5 is, at this point, the only Monte Carlo to contain the entire set of one-loop corrections to electroweak four-fermion processes and renormalization scheme-independence. 3 figures

  10. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Author Affiliations. A Salih1 S Ghosh Moulic2. Department of Aerospace Engineering, Indian Institute of Space Science and Technology, Thiruvananthapuram 695 022; Department of Mechanical Engineering, Indian Institute of Technology, Kharagpur 721 302 ...

  11. PEPSI: a Monte Carlo generator for polarized leptoproduction

    International Nuclear Information System (INIS)

    Mankiewicz, L.

    1992-01-01

    We describe PEPSI (Polarized Electron Proton Scattering Interactions) a Monte Carlo program for the polarized deep inelastic leptoproduction mediated by electromagnetic interaction. The code is a modification of the LEPTO 4.3 Lund Monte Carlo for unpolarized scattering and requires the standard polarization-independent JETSET routines to perform fragmentation into final hadrons. (orig.)

  12. Importance estimation in Monte Carlo modelling of neutron and photon transport

    International Nuclear Information System (INIS)

    Mickael, M.W.

    1992-01-01

    The estimation of neutron and photon importance in a three-dimensional geometry is achieved using a coupled Monte Carlo and diffusion theory calculation. The parameters required for the solution of the multigroup adjoint diffusion equation are estimated from an analog Monte Carlo simulation of the system under investigation. The solution of the adjoint diffusion equation is then used as an estimate of the particle importance in the actual simulation. This approach provides an automated and efficient variance reduction method for Monte Carlo simulations. The technique has been successfully applied to Monte Carlo simulation of neutron and coupled neutron-photon transport in the nuclear well-logging field. The results show that the importance maps obtained in a few minutes of computer time using this technique are in good agreement with Monte Carlo generated importance maps that require prohibitive computing times. The application of this method to Monte Carlo modelling of the response of neutron porosity and pulsed neutron instruments has resulted in major reductions in computation time. (Author)

  13. TARC: Carlo Rubbia's Energy Amplifier

    CERN Multimedia

    Laurent Guiraud

    1997-01-01

    Transmutation by Adiabatic Resonance Crossing (TARC) is Carlo Rubbia's energy amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as 99Tc, can be efficiently destroyed.

  14. Rasam Indian Restaurant Menu 2017

    OpenAIRE

    Rasam Indian Restaurant

    2017-01-01

    A little bit about us: we opened our doors for business in November 2003 with the solid ambition to serve high quality authentic Indian cuisine in Dublin. Indian food over time has escaped the European misunderstanding or notion of 'one sauce fits all' and has been recognised for the rich dining experience with all the wonderful potent flavours of India. Rasam wanted to contribute to the Indian food awakening, and so when a suitable premises became available in Glasthule at the heart of a busy...

  15. Carlos Vesga Duarte

    Directory of Open Access Journals (Sweden)

    Pedro Medina Avendaño

    1981-01-01

    Carlos Vega Duarte had the simplicity of elemental and pure beings. His heart was as clean as alluvial gold. His direct, colloquial manner revealed an uncontaminated Santanderean who loved the gleam of weapons and was dazzled by the sparkle of perfect phrases

  16. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    Energy Technology Data Exchange (ETDEWEB)

    Urbatsch, T.J.

    1995-11-01

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors.

  17. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    International Nuclear Information System (INIS)

    Urbatsch, T.J.

    1995-11-01

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors

  18. Study on random number generator in Monte Carlo code

    International Nuclear Information System (INIS)

    Oya, Kentaro; Kitada, Takanori; Tanaka, Shinichi

    2011-01-01

    The Monte Carlo code uses a sequence of pseudo-random numbers produced by a random number generator (RNG) to simulate particle histories. A pseudo-random sequence has a period that depends on its generation method, and this period should be long enough that it is not exhausted during a single Monte Carlo calculation, in order to ensure correctness, especially of the standard deviation of the results. The linear congruential generator (LCG) is widely used as the Monte Carlo RNG, and the period of the LCG is not very long considering the growing number of simulation histories in a Monte Carlo calculation that follows from the remarkable enhancement of computer performance. Recently, many kinds of RNG have been developed, and some of their features are better than those of the LCG. In this study, we investigate an appropriate RNG for a Monte Carlo code as an alternative to the LCG, especially for the case of enormous numbers of histories. It is found that xorshift has desirable features compared with the LCG: a larger period, a comparable speed of random number generation, better randomness, and good applicability to parallel calculation. (author)
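
    For illustration, a minimal sketch of the two generator families discussed above (the constants are standard published choices, not necessarily those examined in the paper): a 64-bit xorshift step next to a 64-bit linear congruential step, both mapped to uniform variates in [0, 1).

        # Illustrative generators: Marsaglia's xorshift64 (shift triple 13, 7, 17)
        # and a 64-bit LCG with Knuth's MMIX constants.

        MASK64 = (1 << 64) - 1

        def xorshift64(state):
            """One step of xorshift64; state must be nonzero. Returns (value, new_state)."""
            x = state & MASK64
            x ^= (x << 13) & MASK64
            x ^= x >> 7
            x ^= (x << 17) & MASK64
            return x, x

        def lcg64(state, a=6364136223846793005, c=1442695040888963407):
            """One step of a 64-bit linear congruential generator."""
            x = (a * state + c) & MASK64
            return x, x

        def sample_uniform(step, state, n):
            """Draw n floats in [0, 1) using either generator step function."""
            out = []
            for _ in range(n):
                value, state = step(state)
                out.append(value / 2.0**64)
            return out, state

        if __name__ == "__main__":
            u_xs, _ = sample_uniform(xorshift64, state=88172645463325252, n=5)
            u_lcg, _ = sample_uniform(lcg64, state=1, n=5)
            print("xorshift64:", [round(u, 6) for u in u_xs])
            print("lcg64     :", [round(u, 6) for u in u_lcg])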

  19. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Author Affiliations. Soumen Bag1 Gaurav Harit2. Department of Computer Science and Engineering, Indian Institute of Technology Kharagpur, Kharagpur 721 302, India; Information and Communication Technology, Indian Institute of Technology Rajasthan, Jodhpur 342 011, India ...

  20. History of Indian Arts Education in Santa Fe: The Institute of American Indian Arts with Historical Background 1890 to 1962.

    Science.gov (United States)

    Garmhausen, Winona

    This book traces the history of the Institute of American Indian Arts in Santa Fe, New Mexico. Sections cover four time periods in the evolution of the Institute: the United States Indian Industrial School at Sante Fe, 1890-1932; the Santa Fe Indian School, 1930-62; and the Institute of American Indian Arts, 1962-70 and 1970-78. The United States…

  1. Combinatorial geometry domain decomposition strategies for Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z. [Institute of Applied Physics and Computational Mathematics, Beijing, 100094 (China)

    2013-07-01

    Analysis and modeling of nuclear reactors can lead to memory overload for a single core processor when it comes to refined modeling. A method to solve this problem is called 'domain decomposition'. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on the JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. Besides, the influence of the domain decomposition algorithms on tally variances is discussed. (authors)

  2. Combinatorial geometry domain decomposition strategies for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z.

    2013-01-01

    Analysis and modeling of nuclear reactors can lead to memory overload for a single core processor when it comes to refined modeling. A method to solve this problem is called 'domain decomposition'. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on the JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. Besides, the influence of the domain decomposition algorithms on tally variances is discussed. (authors)

  3. Monte Carlo method applied to medical physics

    International Nuclear Information System (INIS)

    Oliveira, C.; Goncalves, I.F.; Chaves, A.; Lopes, M.C.; Teixeira, N.; Matos, B.; Goncalves, I.C.; Ramalho, A.; Salgado, J.

    2000-01-01

    The main application of the Monte Carlo method to medical physics is dose calculation. This paper shows some results of two dose calculation studies and two other different applications: optimisation of a neutron field for Boron Neutron Capture Therapy and optimisation of a filter for a beam tube for several purposes. The time necessary for Monte Carlo calculations - the main barrier to their intensive utilisation - is being overcome by faster and cheaper computers. (author)

  4. New fellows | Announcements | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Aninda J Bhattacharyya, Indian Institute of Science, Bengaluru; Suvendra N Bhattacharyya, CSIR-Indian Institute of Chemical Biology, Kolkata; Mitali Chatterjee, Institute of Postgraduate Medical Education & Research, Kolkata; Prasanta K Das, Indian Association for the Cultivation of Science, Kolkata; Swapan K Datta, ...

  5. A Canadian Indian Health Status Index.

    Science.gov (United States)

    Connop, P J

    1983-01-01

    Health care services for registered "band" Indians in Ontario are provided primarily by the Canadian Federal Government. Complex management methods preclude the direct involvement of Indian people in the decisions for their health resource allocation. Health indicators, need, and health status indexes are reviewed. The biostatistics of mortality and demography of the Indian and reference populations are aggregated with hospitalization/morbidity experience as the Chen G'1 Index, an indicator of normative and comparative need. This is weighted by linear measurements of perceived need for preventive medicine programs, as ranked and scaled values of priorities, Zj. These were determined by community survey on 11 Indian reserves using a non-probabilistic psychometric method of "pair comparisons," based upon Thurstone's Law of Comparative Judgement. The calculation of the aggregate single-unit Indian Health Status Index [Log.G'1].Zj and its potential application in a "zero-base" budget is described.

  6. A radiating shock evaluated using Implicit Monte Carlo Diffusion

    International Nuclear Information System (INIS)

    Cleveland, M.; Gentile, N.

    2013-01-01

    Implicit Monte Carlo [1] (IMC) has been shown to be very expensive when used to evaluate a radiation field in opaque media. Implicit Monte Carlo Diffusion (IMD) [2], which evaluates a spatial discretized diffusion equation using a Monte Carlo algorithm, can be used to reduce the cost of evaluating the radiation field in opaque media [2]. This work couples IMD to the hydrodynamics equations to evaluate opaque diffusive radiating shocks. The Lowrie semi-analytic diffusive radiating shock benchmark[a] is used to verify our implementation of the coupled system of equations. (authors)

  7. The Monte Carlo method the method of statistical trials

    CERN Document Server

    Shreider, YuA

    1966-01-01

    The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and in the investigation of servicing processes. This volume is comprised of seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensio

  8. Indian Summer Arts Festival


    OpenAIRE

    Martel, Yann; Tabu; Tejpal, Tarun; Kunzru, Hari

    2011-01-01

    The SFU Woodward's Cultural Unit partnered with the Indian Summer Festival Society to kick off the inaugural Indian Summer Festival. Held at the Goldcorp Centre for the Arts, it included an interactive Literature Series with notable authors from both India and Canada, including special guests Yann Martel, Bollywood superstar Tabu, journalist Tarun Tejpal, writer Hari Kunzru, and many others.

  9. Applicability of quasi-Monte Carlo for lattice systems

    International Nuclear Information System (INIS)

    Ammon, Andreas; Deutsches Elektronen-Synchrotron; Hartung, Tobias; Jansen, Karl; Leovey, Hernan; Griewank, Andreas; Mueller-Preussker, Michael

    2013-11-01

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^-1/2, where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^-1, or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.

  10. Applicability of quasi-Monte Carlo for lattice systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Physics; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hartung, Tobias [King' s College London (United Kingdom). Dept. of Mathematics; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leovey, Hernan; Griewank, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Mathematics; Mueller-Preussker, Michael [Berlin Humboldt-Univ. (Germany). Dept. of Physics

    2013-11-15

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^-1/2, where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^-1, or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.
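
    The improved error scaling can be seen already on a one-dimensional toy integral (purely illustrative; the lattice observables of the project require the full multi-dimensional machinery): replacing pseudo-random points with a van der Corput low-discrepancy sequence makes the error of the sample mean fall roughly like 1/N instead of 1/sqrt(N).

        import math
        import random

        def van_der_corput(n, base=2):
            """First n points of the base-b van der Corput low-discrepancy sequence."""
            points = []
            for i in range(1, n + 1):
                x, denom, k = 0.0, 1.0, i
                while k > 0:
                    denom *= base
                    k, digit = divmod(k, base)
                    x += digit / denom          # radical inverse of i in the given base
                points.append(x)
            return points

        def sample_mean(points, f):
            return sum(f(x) for x in points) / len(points)

        if __name__ == "__main__":
            f = math.exp                        # exact integral over [0, 1]: e - 1
            exact = math.e - 1.0
            rng = random.Random(13)
            for n in (100, 1_000, 10_000):
                mc = sample_mean([rng.random() for _ in range(n)], f)
                qmc = sample_mean(van_der_corput(n), f)
                print(f"N={n:6d}  MC error={abs(mc - exact):.2e}  "
                      f"QMC error={abs(qmc - exact):.2e}")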

  11. Automated Monte Carlo biasing for photon-generated electrons near surfaces.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Crawford, Martin James; Kensek, Ronald Patrick

    2009-09-01

    This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-windows biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
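
    The weight-window mechanics themselves are simple and can be sketched independently of the adjoint-flux machinery used to choose the windows (the bounds below are arbitrary, and this is a generic illustration rather than the report's implementation): a particle above the upper bound is split into copies whose weights fall near the window centre, and a particle below the lower bound plays Russian roulette.

        import random

        def apply_weight_window(weight, w_low, w_high, w_survive=None, rng=random):
            """Return the list of particle weights after applying a weight window.
            Above w_high: split into copies with weights near w_survive.
            Below w_low : Russian roulette, surviving with weight w_survive.
            Both operations preserve the expected total weight."""
            if w_survive is None:
                w_survive = 0.5 * (w_low + w_high)
            if weight > w_high:
                n_split = int(weight / w_survive) + 1
                return [weight / n_split] * n_split
            if weight < w_low:
                if rng.random() < weight / w_survive:
                    return [w_survive]      # survives roulette with increased weight
                return []                   # killed
            return [weight]                 # already inside the window

        if __name__ == "__main__":
            rng = random.Random(17)
            for w in (5.0, 0.01, 0.3):
                print(w, "->", apply_weight_window(w, w_low=0.1, w_high=1.0, rng=rng))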

  12. The use of perioperative serial blood lactate levels, the APACHE II and the postoperative MELD as predictors of early mortality after liver transplantation O uso da dosagem seriada do lactato sérico no perioperatório, do APACHE II e do MELD pós-operatório como preditores de mortalidade precoce após transplante hepático

    Directory of Open Access Journals (Sweden)

    Anibal Basile-Filho

    2011-12-01

    PURPOSE: To evaluate the accuracy of different parameters in predicting early (one-month) mortality of patients submitted to orthotopic liver transplantation (OLT). METHODS: This is a retrospective study of fifty-eight adult patients (44 males and 14 females, mean age of 51.7 ± 10.1 years) admitted to the Intensive Care Unit of a tertiary hospital. Serial blood lactate levels, APACHE II, MELD post-OLT, creatinine, bilirubin and INR were analyzed as predictors by receiver-operator characteristic (ROC) curves, as evidenced by the area under the curve (AUC). The level of significance was set at 0.05. RESULTS: The mortality of OLT patients within one month was 17.3%. Differences in blood lactate levels became statistically significant between survivors and nonsurvivors at the end of the surgery (p < 0.05). The AUC was 0.726 (95% CI 0.593-0.835) for APACHE II (p = 0.02), 0.770 (95% CI 0.596-0.849) for serum lactate L7-L8 (p = 0.03), 0.814 (95% CI 0.690-0.904) for MELD post-OLT (p < 0.01), 0.550 (95% CI 0.414-0.651) for creatinine (p = 0.64), 0.705 (95% CI 0.571-0.818) for bilirubin (p = 0.05) and 0...

  13. Uniform distribution and quasi-Monte Carlo methods discrepancy, integration and applications

    CERN Document Server

    Kritzer, Peter; Pillichshammer, Friedrich; Winterhof, Arne

    2014-01-01

    The survey articles in this book focus on number theoretic point constructions, uniform distribution theory, and quasi-Monte Carlo methods. As deterministic versions of the Monte Carlo method, quasi-Monte Carlo rules enjoy increasing popularity, with many fruitful applications in mathematical practice, as for example in finance, computer graphics, and biology.

  14. Clinical implementation of full Monte Carlo dose calculation in proton beam therapy

    International Nuclear Information System (INIS)

    Paganetti, Harald; Jiang, Hongyu; Parodi, Katia; Slopsema, Roelf; Engelsman, Martijn

    2008-01-01

    The goal of this work was to facilitate the clinical use of Monte Carlo proton dose calculation to support routine treatment planning and delivery. The Monte Carlo code Geant4 was used to simulate the treatment head setup, including a time-dependent simulation of modulator wheels (for broad beam modulation) and magnetic field settings (for beam scanning). Any patient-field-specific setup can be modeled according to the treatment control system of the facility. The code was benchmarked against phantom measurements. Using a simulation of the ionization chamber reading in the treatment head allows the Monte Carlo dose to be specified in absolute units (Gy per ionization chamber reading). Next, the capability of reading CT data information was implemented into the Monte Carlo code to model patient anatomy. To allow time-efficient dose calculation, the standard Geant4 tracking algorithm was modified. Finally, a software link of the Monte Carlo dose engine to the patient database and the commercial planning system was established to allow data exchange, thus completing the implementation of the proton Monte Carlo dose calculation engine ('DoC++'). Monte Carlo re-calculated plans are a valuable tool to revisit decisions in the planning process. Identification of clinically significant differences between Monte Carlo and pencil-beam-based dose calculations may also drive improvements of current pencil-beam methods. As an example, four patients (29 fields in total) with tumors in the head and neck regions were analyzed. Differences between the pencil-beam algorithm and Monte Carlo were identified in particular near the end of range, both due to dose degradation and overall differences in range prediction due to bony anatomy in the beam path. Further, the Monte Carlo reports dose-to-tissue as compared to dose-to-water by the planning system. Our implementation is tailored to a specific Monte Carlo code and the treatment planning system XiO (Computerized Medical Systems Inc

  15. Exponential convergence on a continuous Monte Carlo transport problem

    International Nuclear Information System (INIS)

    Booth, T.E.

    1997-01-01

    For more than a decade, it has been known that exponential convergence on discrete transport problems was possible using adaptive Monte Carlo techniques. An adaptive Monte Carlo method that empirically produces exponential convergence on a simple continuous transport problem is described

  16. 48 CFR 1452.226-70 - Indian Preference.

    Science.gov (United States)

    2010-10-01

    ...); and (3) “Indian-owned economic enterprise” means any Indian-owned commercial, industrial, or business... MANAGEMENT SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 1452.226-70 Indian...-owned economic enterprises in the awarding of any subcontracts consistent with the efficient performance...

  17. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Sequential Bayesian technique: An alternative approach for software reliability estimation ... Software reliability; Bayesian sequential estimation; Kalman filter. ... Department of Mathematics, Indian Institute of Technology, Kharagpur 721 302; Reliability Engineering Centre, Indian Institute of Technology, Kharagpur 721 302 ...

  18. Isotopic depletion with Monte Carlo

    International Nuclear Information System (INIS)

    Martin, W.R.; Rathkopf, J.A.

    1996-06-01

    This work considers a method to deplete isotopes during a time- dependent Monte Carlo simulation of an evolving system. The method is based on explicitly combining a conventional estimator for the scalar flux with the analytical solutions to the isotopic depletion equations. There are no auxiliary calculations; the method is an integral part of the Monte Carlo calculation. The method eliminates negative densities and reduces the variance in the estimates for the isotope densities, compared to existing methods. Moreover, existing methods are shown to be special cases of the general method described in this work, as they can be derived by combining a high variance estimator for the scalar flux with a low-order approximation to the analytical solution to the depletion equation
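
    A minimal sketch of the underlying idea, folding a (here faked) Monte Carlo flux tally into the analytical solution of the depletion equation for a single isotope with no production terms; the cross section, flux level and step length are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# Illustrative data for one isotope with pure removal (no production chains).
N0 = 1.0e24            # initial number density [atoms/cm^3]
sigma_a = 2.0e-24      # absorption cross section [cm^2]
dt = 86400.0           # depletion step [s]
true_flux = 3.0e14     # "true" scalar flux [n/cm^2/s]

# Stand-in for a Monte Carlo track-length flux tally: the true flux plus statistical noise.
flux_tally = rng.normal(true_flux, 0.01 * true_flux, size=1000).mean()

# Analytical solution of dN/dt = -sigma_a * flux * N over the step, using the tallied flux.
N_end = N0 * np.exp(-sigma_a * flux_tally * dt)
print(f"tallied flux: {flux_tally:.4e}  ->  end-of-step density: {N_end:.4e}")
```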

  19. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-01

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
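
    The coupling of successive resolutions can be sketched with plain multilevel Monte Carlo (not the sequential Monte Carlo variant of the record) for a simple discretized SDE expectation; the step sizes, sample counts and payoff below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
S0, r, sigma, T, K = 100.0, 0.05, 0.2, 1.0, 100.0

def level_estimator(level, n_samples):
    """Mean of P_l - P_{l-1}, with fine and coarse Euler paths coupled through the same Brownian increments."""
    n_fine = 2 ** level
    dt = T / n_fine
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_samples, n_fine))
    s_fine = np.full(n_samples, S0)
    s_coarse = np.full(n_samples, S0)
    for i in range(n_fine):
        s_fine += r * s_fine * dt + sigma * s_fine * dW[:, i]
    if level > 0:
        dW_c = dW[:, 0::2] + dW[:, 1::2]          # coarse increments from pairs of fine ones
        dt_c = 2 * dt
        for i in range(n_fine // 2):
            s_coarse += r * s_coarse * dt_c + sigma * s_coarse * dW_c[:, i]
        payoff_c = np.exp(-r * T) * np.maximum(s_coarse - K, 0.0)
    else:
        payoff_c = 0.0                             # no coarser level below level 0
    payoff_f = np.exp(-r * T) * np.maximum(s_fine - K, 0.0)
    return np.mean(payoff_f - payoff_c)

# Telescoping sum over levels: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]
estimate = sum(level_estimator(l, 20_000) for l in range(5))
print("multilevel estimate of the discounted call payoff:", round(estimate, 3))
```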

  20. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, G.B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials. copyright 1997 American Institute of Physics

  1. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-05

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.

  2. Investigating the Indian Ocean Geoid Low

    Science.gov (United States)

    Ghosh, A.; Gollapalli, T.; Steinberger, B. M.

    2016-12-01

    The lowest geoid anomaly on Earth lies in the Indian Ocean just south of the Indian peninsula. Several theories have been proposed to explain this geoid low, most of which invoke past subduction. Some recent studies have also argued that high velocity anomalies in the lower mantle coupled with low velocity anomalies in the upper mantle are responsible for these negative geoid anomalies. However, there is no general consensus regarding the source of the Indian Ocean negative geoid. We investigate the source of this geoid low by using forward models of density driven mantle convection using CitcomS. We test various tomography models in our flow calculations with different radial and lateral viscosity variations. Many tomography models produce a fairly high correlation to the global geoid, however none could match the precise location of the geoid low in the Indian Ocean. A merged P-wave model of LLNL-G3DV3 in the Indian Ocean region and S40rts elsewhere yields a good fit to the geoid anomaly, both in pattern and magnitude. The source of this geoid low seems to stem from a low velocity anomaly stretching from a depth of 300 km up to 700 km in the northern Indian Ocean region. This velocity anomaly could potentially arise from material rising along the edge of the African LLSVP and moving towards the northeast, facilitated by the movement of the Indian plate in the same direction.

  3. A flexible coupling scheme for Monte Carlo and thermal-hydraulics codes

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J. Eduard, E-mail: J.E.Hoogenboom@tudelft.nl [Delft University of Technology (Netherlands); Ivanov, Aleksandar; Sanchez, Victor, E-mail: Aleksandar.Ivanov@kit.edu, E-mail: Victor.Sanchez@kit.edu [Karlsruhe Institute of Technology, Institute of Neutron Physics and Reactor Technology, Eggenstein-Leopoldshafen (Germany); Diop, Cheikh, E-mail: Cheikh.Diop@cea.fr [CEA/DEN/DANS/DM2S/SERMA, Commissariat a l' Energie Atomique, Gif-sur-Yvette (France)

    2011-07-01

    A coupling scheme between a Monte Carlo code and a thermal-hydraulics code is being developed within the European NURISP project for comprehensive and validated reactor analysis. The scheme is flexible as it allows different Monte Carlo codes and different thermal-hydraulics codes to be used. At present the MCNP and TRIPOLI4 Monte Carlo codes can be used and the FLICA4 and SubChanFlow thermal-hydraulics codes. For all these codes only an original executable is necessary. A Python script drives the iterations between Monte Carlo and thermal-hydraulics calculations. It also calls a conversion program to merge a master input file for the Monte Carlo code with the appropriate temperature and coolant density data from the thermal-hydraulics calculation. Likewise it calls another conversion program to merge a master input file for the thermal-hydraulics code with the power distribution data from the Monte Carlo calculation. Special attention is given to the neutron cross section data for the various required temperatures in the Monte Carlo calculation. Results are shown for an infinite lattice of PWR fuel pin cells and a 3 x 3 fuel BWR pin cell cluster. Various possibilities for further improvement and optimization of the coupling system are discussed. (author)
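
    The abstract describes a Python script that drives the iterations and calls conversion programs; the skeleton below mimics that control flow with hypothetical executable and file names and a made-up convergence test, none of which correspond to the actual NURISP tools.

```python
import subprocess

def run(cmd):
    """Run an external code and fail loudly if it does not finish cleanly."""
    subprocess.run(cmd, shell=True, check=True)

max_iter, tol = 10, 1.0e-3
prev_power = None

for it in range(max_iter):
    # Merge the Monte Carlo master input with the latest temperature/density fields (hypothetical tools).
    run("merge_mc_input master_mc.inp th_fields.dat > mc_iter.inp")
    run("mc_code mc_iter.inp")                  # stand-in for the Monte Carlo executable
    # Merge the thermal-hydraulics master input with the new power distribution.
    run("merge_th_input master_th.inp mc_power.dat > th_iter.inp")
    run("th_code th_iter.inp")                  # stand-in for the thermal-hydraulics executable
    with open("mc_power.dat") as f:             # hypothetical flat file of pin powers
        power = [float(line) for line in f]
    if prev_power is not None:
        change = max(abs(p - q) for p, q in zip(power, prev_power))
        print(f"iteration {it}: max power change = {change:.2e}")
        if change < tol:
            break
    prev_power = power
```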

  4. A flexible coupling scheme for Monte Carlo and thermal-hydraulics codes

    International Nuclear Information System (INIS)

    Hoogenboom, J. Eduard; Ivanov, Aleksandar; Sanchez, Victor; Diop, Cheikh

    2011-01-01

    A coupling scheme between a Monte Carlo code and a thermal-hydraulics code is being developed within the European NURISP project for comprehensive and validated reactor analysis. The scheme is flexible as it allows different Monte Carlo codes and different thermal-hydraulics codes to be used. At present the MCNP and TRIPOLI4 Monte Carlo codes can be used and the FLICA4 and SubChanFlow thermal-hydraulics codes. For all these codes only an original executable is necessary. A Python script drives the iterations between Monte Carlo and thermal-hydraulics calculations. It also calls a conversion program to merge a master input file for the Monte Carlo code with the appropriate temperature and coolant density data from the thermal-hydraulics calculation. Likewise it calls another conversion program to merge a master input file for the thermal-hydraulics code with the power distribution data from the Monte Carlo calculation. Special attention is given to the neutron cross section data for the various required temperatures in the Monte Carlo calculation. Results are shown for an infinite lattice of PWR fuel pin cells and a 3 x 3 fuel BWR pin cell cluster. Various possibilities for further improvement and optimization of the coupling system are discussed. (author)

  5. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Department of Industrial Engineering and Management, Maulana Abul Kalam Azad University of Technology, Kolkata 700064, India; Indian Institute of Management Raipur, GEC Campus, Sejbahar, Raipur 492015, India; Indian National Centre for Ocean Information Services, Ministry of Earth Sciences, Hyderabad 500090, ...

  6. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2018-06-07

    Jun 7, 2018 ... Transliteration; informal information; natural language processing (NLP); information retrieval. ... Department of Computer Science and Engineering, Indian Institute of Technology (Indian School of Mines), Dhanbad 826004, India ...

  7. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2018-03-14

    Mar 14, 2018 ... Cloud security; network security; anomaly detection; network traffic analysis; DDoS attack detection. ... Department of Computer Science and Engineering, Indian Institute of Technology Roorkee, Roorkee 247667, India; Department of Applied Science and Engineering, Indian Institute of Technology ...

  8. Parallel MCNP Monte Carlo transport calculations with MPI

    International Nuclear Information System (INIS)

    Wagner, J.C.; Haghighat, A.

    1996-01-01

    The steady increase in computational performance has made Monte Carlo calculations for large/complex systems possible. However, in order to make these calculations practical, order of magnitude increases in performance are necessary. The Monte Carlo method is inherently parallel (particles are simulated independently) and thus has the potential for near-linear speedup with respect to the number of processors. Further, the ever-increasing accessibility of parallel computers, such as workstation clusters, facilitates the practical use of parallel Monte Carlo. Recognizing the nature of the Monte Carlo method and the trends in available computing, the code developers at Los Alamos National Laboratory implemented the message-passing general-purpose Monte Carlo radiation transport code MCNP (version 4A). The PVM package was chosen by the MCNP code developers because it supports a variety of communication networks, several UNIX platforms, and heterogeneous computer systems. This PVM version of MCNP has been shown to produce speedups that approach the number of processors and thus, is a very useful tool for transport analysis. Due to software incompatibilities on the local IBM SP2, PVM has not been available, and thus it is not possible to take advantage of this useful tool. Hence, it became necessary to implement an alternative message-passing library package into MCNP. Because the message-passing interface (MPI) is supported on the local system, takes advantage of the high-speed communication switches in the SP2, and is considered to be the emerging standard, it was selected
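
    The near-linear speedup argument (independent histories on each process, tallies combined by a single reduction) can be illustrated with mpi4py on a toy problem; this is not MCNP and only sketches the message-passing pattern.

```python
# Run with, e.g.: mpiexec -n 4 python mc_pi_mpi.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_per_rank = 1_000_000
rng = np.random.default_rng(seed=rank)          # independent random stream per process

# Each rank simulates its histories independently (here: hit-or-miss estimate of pi).
pts = rng.random((n_per_rank, 2))
hits = np.count_nonzero(pts[:, 0] ** 2 + pts[:, 1] ** 2 < 1.0)

# Combine the independent tallies with a single reduction.
total_hits = comm.reduce(hits, op=MPI.SUM, root=0)
if rank == 0:
    print("pi estimate:", 4.0 * total_hits / (n_per_rank * size))
```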

  9. Monte Carlo systems used for treatment planning and dose verification

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)

    2017-04-15

    General-purpose radiation transport Monte Carlo codes have been used for estimation of the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To take advantage of radiation transport Monte Carlo codes applied to routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)

  10. SERC School on Computational Statistical Physics held at the Indian Institute of Technology

    CERN Document Server

    Ray, Purusattam

    2011-01-01

    The present book is an outcome of the SERC school on Computational Statistical Physics held at the Indian Institute of Technology, Guwahati, in December 2008. Numerical experimentation has played an extremely important role in statistical physics in recent years. Lectures given at the School covered a large number of topics of current and continuing interest. Based on lectures by active researchers in the field- Bikas Chakrabarti, S Chaplot, Deepak Dhar, Sanjay Kumar, Prabal Maiti, Sanjay Puri, Purusattam Ray, Sitangshu Santra and Subir Sarkar- the nine chapters comprising the book deal with topics that range from the fundamentals of the field, to problems and questions that are at the very forefront of current research. This book aims to expose the graduate student to the basic as well as advanced techniques in computational statistical physics. Following a general introduction to statistical mechanics and critical phenomena, the various chapters cover Monte Carlo and molecular dynamics simulation methodolog...

  11. Multilevel Monte Carlo in Approximate Bayesian Computation

    KAUST Repository

    Jasra, Ajay

    2017-02-13

    In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.

  12. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependencies between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
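
    An analog (unbiased-sampling) version of such a Markov simulation can be sketched for a hypothetical two-unit parallel system with exponential failure and repair; the forced-transition and failure-biasing refinements of the record are deliberately omitted, and all rates are invented.

```python
import random

LAMBDA, MU, T_MISSION = 1.0e-3, 1.0e-1, 1000.0   # failure rate, repair rate, mission time (illustrative)

def system_fails(rng):
    """Simulate one history of a two-unit parallel system; return True if both units
    are ever down simultaneously before the mission time (system failure)."""
    t, down = 0.0, [False, False]
    while t < T_MISSION:
        rates = [MU if d else LAMBDA for d in down]   # repair rate if down, failure rate if up
        total = sum(rates)
        t += rng.expovariate(total)                   # time to the next state change
        if t >= T_MISSION:
            return False
        u = rng.random() * total                      # pick the unit that changes state
        idx = 0 if u < rates[0] else 1
        down[idx] = not down[idx]
        if all(down):
            return True
    return False

rng = random.Random(42)
n = 100_000
failures = sum(system_fails(rng) for _ in range(n))
print("estimated unreliability:", failures / n)
```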

  13. Yours in Revolution: Retrofitting Carlos the Jackal

    Directory of Open Access Journals (Sweden)

    Samuel Thomas

    2013-09-01

    Full Text Available This paper explores the representation of ‘Carlos the Jackal’, the one-time ‘World’s Most Wanted Man’ and ‘International Face of Terror’ – primarily in cinema but also encompassing other forms of popular culture and aspects of Cold War policy-making. At the centre of the analysis is Olivier Assayas’s Carlos (2010), a transnational, five and a half hour film (first screened as a TV mini-series) about the life and times of the infamous militant. Concentrating on the various ways in which Assayas expresses a critical preoccupation with names and faces through complex formal composition, the project examines the play of abstraction and embodiment that emerges from the narrativisation of terrorist violence. Lastly, it seeks to engage with the hidden implications of Carlos in terms of the intertwined trajectories of formal experimentation and revolutionary politics.

  14. Contributon Monte Carlo

    International Nuclear Information System (INIS)

    Dubi, A.; Gerstl, S.A.W.

    1979-05-01

    The contributon Monte Carlo method is based on a new recipe to calculate target responses by means of volume integral of the contributon current in a region between the source and the detector. A comprehensive description of the method, its implementation in the general-purpose MCNP code, and results of the method for realistic nonhomogeneous, energy-dependent problems are presented. 23 figures, 10 tables

  15. A residual Monte Carlo method for discrete thermal radiative diffusion

    International Nuclear Information System (INIS)

    Evans, T.M.; Urbatsch, T.J.; Lichtenstein, H.; Morel, J.E.

    2003-01-01

    Residual Monte Carlo methods reduce statistical error at a rate of exp(-bN), where b is a positive constant and N is the number of particle histories. Contrast this convergence rate with 1/√N, which is the rate of statistical error reduction for conventional Monte Carlo methods. Thus, residual Monte Carlo methods hold great promise for increased efficiency relative to conventional Monte Carlo methods. Previous research has shown that the application of residual Monte Carlo methods to the solution of continuum equations, such as the radiation transport equation, is problematic for all but the simplest of cases. However, the residual method readily applies to discrete systems as long as those systems are monotone, i.e., they produce positive solutions given positive sources. We develop a residual Monte Carlo method for solving a discrete 1D non-linear thermal radiative equilibrium diffusion equation, and we compare its performance with that of the discrete conventional Monte Carlo method upon which it is based. We find that the residual method provides efficiency gains of many orders of magnitude. Part of the residual gain is due to the fact that we begin each timestep with an initial guess equal to the solution from the previous timestep. Moreover, fully consistent non-linear solutions can be obtained in a reasonable amount of time because of the effective lack of statistical noise. We conclude that the residual approach has great potential and that further research into such methods should be pursued for more general discrete and continuum systems

  16. Swell Propagation over Indian Ocean Region

    Directory of Open Access Journals (Sweden)

    Suchandra A. Bhowmick

    2011-06-01

    Full Text Available Swells are the ocean surface gravity waves that have propagated out of their generating fetch to distant coasts without significant attenuation. Therefore they contain a clear signature of the nature and intensity of the wind at the generation location. This makes them a precursor to various atmospheric phenomena such as distant storms, tropical cyclones, or even large-scale sea-breeze-like systems such as the monsoon. Since they are not affected by wind once they propagate out of their generating region, they cannot be described by regional wave models forced by local winds. However, their prediction is important, in particular, for ship routing and offshore structure design. In the present work, the propagation of swell waves from the Southern Ocean and southern Indian Ocean to the central and northern Indian Ocean has been studied. For this purpose a spectral ocean Wave Model (WAM) has been used to simulate significant wave height for 13 years from 1993–2005 using NCEP blended winds at a horizontal spatial resolution of 1° × 1°. It has been observed that the Indian Ocean, with an average wave height of approximately 2–3 m during July, is mostly dominated by swell waves generated predominantly under the extreme windy conditions prevailing over the Southern Ocean and southern Indian Ocean. In fact the swell waves reaching the Indian Ocean in early or mid May carry unique signatures of the monsoon arriving over the Indian Subcontinent. The pre-monsoon month of April contains low swell waves ranging from 0.5–1 m. The amplitudes subsequently increase to approximately 1.5–2 meters around 7–15 days prior to the arrival of the monsoon over the Indian Subcontinent. This embedded signature may be utilized as one of the important oceanographic precursors to the monsoon onset over the Indian Ocean.

  17. Bayesian Monte Carlo method

    International Nuclear Information System (INIS)

    Rajabalinejad, M.

    2010-01-01

    To reduce the cost of Monte Carlo (MC) simulations for time-consuming processes, Bayesian Monte Carlo (BMC) is introduced in this paper. The BMC method reduces the number of realizations in MC according to the desired accuracy level. BMC also provides the possibility of considering more priors. In other words, different priors can be integrated into one model by using BMC to further reduce the cost of simulations. This study suggests speeding up the simulation process by considering the logical dependence of neighboring points as prior information. This information is used in the BMC method to produce a predictive tool through the simulation process. The general methodology and algorithm of the BMC method are presented in this paper. The BMC method is applied to a simplified breakwater model as well as the finite element model of the 17th Street Canal in New Orleans, and the results are compared with the MC and Dynamic Bounds methods.

  18. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    KAUST Repository

    Cheon, Sooyoung

    2013-02-16

    Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time, however, their performances are not always very satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for the tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  19. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    KAUST Repository

    Cheon, Sooyoung; Liang, Faming; Chen, Yuguo; Yu, Kai

    2013-01-01

    Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time, however, their performances are not always very satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for the tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  20. Promoting Indian Library Use. Guide Number 7.

    Science.gov (United States)

    Townley, Charles T.

    Individuals, organizations, and American Indian tribes are rapidly recognizing the value of libraries. They are recognizing that libraries and the information services which they offer are necessary to meet Indian goals. Specific sensitivity to Indian ways and alternatives is just developing as library and information services develop in Indian…

  1. Congressional Social Darwinism and the American Indian

    Science.gov (United States)

    Blinderman, Abraham

    1978-01-01

    Summarizing a congressional report on civil and military treatment of American Indians, this article asserts that the social Darwinism of the day prevailed among all congressional committee members ("Even friends of the Indian... knew American expansionism, technology, and racial ideology would reduce the Indian to a pitiful remnant...") (JC)

  2. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Home; Journals; Sadhana; Volume 41; Issue 2. Nearest neighbour classification of Indian sign language gestures using kinect camera. Zafar Ahmed Ansari Gaurav Harit. Volume 41 Issue 2 February 2016 pp 161-182 ... Keywords. Indian sign language recognition; multi-class classification; gesture recognition.

  3. Closed-shell variational quantum Monte Carlo simulation for the ...

    African Journals Online (AJOL)

    Closed-shell variational quantum Monte Carlo simulation for the electric dipole moment calculation of hydrazine molecule using casino-code. ... Nigeria Journal of Pure and Applied Physics ... The variational quantum Monte Carlo (VQMC) technique used in this work employed the restricted Hartree-Fock (RHF) scheme.

  4. New Approaches and Applications for Monte Carlo Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Aufiero, Manuele; Bidaud, Adrien; Kotlyar, Dan; Leppänen, Jaakko; Palmiotti, Giuseppe; Salvatores, Massimo; Sen, Sonat; Shwageraus, Eugene; Fratoni, Massimiliano

    2017-02-01

    This paper presents some of the recent and new advancements in the extension of Monte Carlo Perturbation Theory methodologies and applications. In particular, the discussed problems involve burnup calculation, perturbation calculation based on continuous energy functions, and Monte Carlo Perturbation Theory in loosely coupled systems.

  5. Celebrating National American Indian Heritage Month

    National Research Council Canada - National Science Library

    Mann, Diane

    2004-01-01

    November has been designated National American Indian Heritage Month to honor American Indians and Alaska Natives by increasing awareness of their culture, history, and, especially, their tremendous...

  6. Rapid Monte Carlo Simulation of Gravitational Wave Galaxies

    Science.gov (United States)

    Breivik, Katelyn; Larson, Shane L.

    2015-01-01

    With the detection of gravitational waves on the horizon, astrophysical catalogs produced by gravitational wave observatories can be used to characterize the populations of sources and validate different galactic population models. Efforts to simulate gravitational wave catalogs and source populations generally focus on population synthesis models that require extensive time and computational power to produce a single simulated galaxy. Monte Carlo simulations of gravitational wave source populations can also be used to generate observation catalogs from the gravitational wave source population. Monte Carlo simulations have the advantages of flexibility and speed, enabling rapid galactic realizations as a function of galactic binary parameters with less time and computational resources required. We present a Monte Carlo method for rapid galactic simulations of gravitational wave binary populations.

  7. Recommender engine for continuous-time quantum Monte Carlo methods

    Science.gov (United States)

    Huang, Li; Yang, Yi-feng; Wang, Lei

    2017-03-01

    Recommender systems play an essential role in the modern business world. They recommend favorable items such as books, movies, and search queries to users based on their past preferences. Applying similar ideas and techniques to Monte Carlo simulations of physical systems boosts their efficiency without sacrificing accuracy. Exploiting the quantum to classical mapping inherent in the continuous-time quantum Monte Carlo methods, we construct a classical molecular gas model to reproduce the quantum distributions. We then utilize powerful molecular simulation techniques to propose efficient quantum Monte Carlo updates. The recommender engine approach provides a general way to speed up the quantum impurity solvers.

  8. Acceleration of Monte Carlo solution by conjugate gradient method

    International Nuclear Information System (INIS)

    Toshihisa, Yamamoto

    2005-01-01

    The conjugate gradient method (CG) was applied to accelerate Monte Carlo solutions in fixed source problems. The equilibrium model based formulation enables the use of the CG scheme as well as an initial guess to maximize computational performance. This method is applicable to arbitrary geometry provided that the neutron source distribution in each subregion can be regarded as flat. Even if that is not the case, the method can still be used as a powerful tool to provide an initial guess very close to the converged solution. The major difference between Monte Carlo CG and deterministic CG is that the residual error is estimated using Monte Carlo sampling, thus statistical error exists in the residual. This leads to a flow diagram specific to Monte Carlo-CG. Three pre-conditioners were proposed for the CG scheme and the performance was compared with a simple 1-D slab heterogeneous test problem. One of them, the Sparse-M option, showed an excellent performance in convergence. The performance per unit cost was improved by four times in the test problem. Although direct estimation of the efficiency of the method is impossible, mainly because of the strong problem-dependence of the optimized pre-conditioner in CG, the method seems to have potential as an efficient solution algorithm for Monte Carlo calculations. (author)

  9. Returns on Indian Art during 2000-2013

    OpenAIRE

    Jenny Rae Hawkins; Viplav Saini

    2014-01-01

    The market for modern Indian art is an emerging art market, having come into a proper existence only in the late 1990s. This market saw tremendous growth in its initial years and then a downturn that started around 2007-2008. Using data from auctions conducted by a major Indian art auctioneer, we estimate via hedonic regression a price index for paintings and drawings by Indian artists sold during 2000-2013. We are able to thus estimate a rate of return on Indian art as an investment and also...

  10. CALCULATION OF STOCK PORTFOLIO VaR USING HISTORICAL DATA AND MONTE CARLO SIMULATION DATA

    Directory of Open Access Journals (Sweden)

    WAYAN ARTHINI

    2012-09-01

    Full Text Available Value at Risk (VaR) is the maximum potential loss on a portfolio at a given probability over a certain time horizon. In this research, portfolio VaR values are calculated from historical data and from Monte Carlo simulation data. The historical data are processed to obtain the stock returns, variances, correlation coefficients, and variance-covariance matrix; the Markowitz method is then used to determine the proportion of funds allocated to each stock and the portfolio risk and return. The data were then simulated by Monte Carlo simulation, using an Exact Monte Carlo Simulation and an Expected Monte Carlo Simulation. The Exact Monte Carlo Simulation has the same returns and standard deviation as the historical data, while the Expected Monte Carlo Simulation produces statistics similar to those of the historical data. The results of this research are portfolio VaR values for the time horizons T=1, T=10, and T=22 at the 95% confidence level, obtained from the historical data and from the Monte Carlo simulation data with both the exact and the expected methods. The VaR from both Monte Carlo simulations is greater than the VaR from the historical data.
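
    A compact sketch of the two routes described above, historical VaR from empirical return quantiles and Monte Carlo VaR from returns resampled with the fitted mean and covariance; the two-asset data and portfolio weights are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily returns for two stocks (stand-in for real historical data).
hist_returns = rng.multivariate_normal([0.0005, 0.0003],
                                       [[1.0e-4, 4.0e-5], [4.0e-5, 2.25e-4]], size=750)
weights = np.array([0.6, 0.4])
alpha = 0.95

# Historical VaR: empirical quantile of the realized portfolio returns.
port_hist = hist_returns @ weights
var_hist = -np.quantile(port_hist, 1.0 - alpha)

# Monte Carlo VaR: resimulate returns from the fitted mean and covariance.
mu, cov = hist_returns.mean(axis=0), np.cov(hist_returns, rowvar=False)
sim = rng.multivariate_normal(mu, cov, size=100_000) @ weights
var_mc = -np.quantile(sim, 1.0 - alpha)

print(f"1-day 95% VaR  historical: {var_hist:.4%}   Monte Carlo: {var_mc:.4%}")
```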

  11. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator

  12. A general transform for variance reduction in Monte Carlo simulations

    International Nuclear Information System (INIS)

    Becker, T.L.; Larsen, E.W.

    2011-01-01

    This paper describes a general transform to reduce the variance of the Monte Carlo estimate of some desired solution, such as flux or biological dose. This transform implicitly includes many standard variance reduction techniques, including source biasing, collision biasing, the exponential transform for path-length stretching, and weight windows. Rather than optimizing each of these techniques separately or choosing semi-empirical biasing parameters based on the experience of a seasoned Monte Carlo practitioner, this General Transform unites all these variance techniques to achieve one objective: a distribution of Monte Carlo particles that attempts to optimize the desired solution. Specifically, this transform allows Monte Carlo particles to be distributed according to the user's specification by using information obtained from a computationally inexpensive deterministic simulation of the problem. For this reason, we consider the General Transform to be a hybrid Monte Carlo/Deterministic method. The numerical results confirm that the General Transform distributes particles according to the user-specified distribution and generally provide reasonable results for shielding applications. (author)
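
    The biasing-plus-weight bookkeeping that such a transform relies on can be shown in one dimension: particles are born from a user-chosen (biased) source density and carry the weight p(x)/q(x) so the estimator stays unbiased. The slab attenuation problem below is invented and is not the hybrid scheme of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
SIGMA, SLAB = 1.0, 5.0   # attenuation coefficient and slab thickness (both illustrative)

def transmission(x):
    """Contribution of a particle born at depth x to a detector behind the slab."""
    return np.exp(-SIGMA * (SLAB - x))

def estimate(n, p_near):
    """Source-biased estimate of the detector response for a source uniform on [0, SLAB].

    With probability p_near the birth position is drawn from the half of the slab nearest
    the detector; the statistical weight p(x)/q(x) keeps the estimator unbiased.
    p_near = 0.5 reproduces the analog (unbiased-sampling) calculation.
    """
    near = rng.random(n) < p_near
    x = np.where(near,
                 SLAB / 2 + (SLAB / 2) * rng.random(n),    # nearer half of the slab
                 (SLAB / 2) * rng.random(n))               # farther half of the slab
    q = np.where(near, p_near, 1.0 - p_near) / (SLAB / 2)  # biased source density q(x)
    weight = (1.0 / SLAB) / q                              # analog density p(x) over q(x)
    score = weight * transmission(x)
    return score.mean(), score.std(ddof=1) / np.sqrt(n)

for p_near in (0.5, 0.8):
    mean, err = estimate(200_000, p_near)
    print(f"p_near={p_near}: estimate={mean:.5f}  std.err={err:.2e}")
```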

  13. A Monte Carlo approach to combating delayed completion of ...

    African Journals Online (AJOL)

    The objective of this paper is to unveil the relevance of Monte Carlo critical path analysis in resolving problem of delays in scheduled completion of development projects. Commencing with deterministic network scheduling, Monte Carlo critical path analysis was advanced by assigning probability distributions to task times.
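
    A small sketch of the approach: triangular task-time distributions on an invented activity network, with the completion-time distribution and the probability of missing a deadline estimated by simulation.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy project: task -> ((optimistic, most likely, pessimistic) duration in days, predecessors).
tasks = {
    "design":     ((10, 15, 25), []),
    "procure":    ((20, 30, 50), ["design"]),
    "earthworks": (( 5, 10, 20), ["design"]),
    "build":      ((30, 45, 80), ["procure", "earthworks"]),
    "commission": (( 5,  8, 15), ["build"]),
}
deadline = 110.0
n = 50_000

finish = {name: np.zeros(n) for name in tasks}
for name, (tri, preds) in tasks.items():          # dict order already follows precedence here
    duration = rng.triangular(*tri, size=n)
    start = np.maximum.reduce([finish[p] for p in preds]) if preds else 0.0
    finish[name] = start + duration

completion = finish["commission"]
print(f"mean completion: {completion.mean():.1f} days")
print(f"P(delay beyond {deadline:.0f} days): {np.mean(completion > deadline):.2%}")
```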

  14. Perturbation based Monte Carlo criticality search in density, enrichment and concentration

    International Nuclear Information System (INIS)

    Li, Zeguang; Wang, Kan; Deng, Jingkang

    2015-01-01

    Highlights: • A new perturbation based Monte Carlo criticality search method is proposed. • The method can obtain accurate results with only one individual criticality run. • The method is used to solve density, enrichment and concentration search problems. • Results show the feasibility and good performance of this method. • The relationship between the results’ accuracy and the perturbation order is discussed. - Abstract: Criticality search is a very important aspect of reactor physics analysis. Due to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. Existing Monte Carlo criticality search methods need a large number of individual criticality runs and may have unstable results because of the uncertainties of the criticality results. In this paper, a new perturbation based Monte Carlo criticality search method is proposed and discussed. This method needs only one individual criticality calculation with perturbation tallies: the change of k_eff as a function of the search parameter is estimated from the initial k_eff and the differential coefficient results, and polynomial equations are solved to obtain the criticality search results. The new perturbation based Monte Carlo criticality search method is implemented in the Monte Carlo code RMC, and criticality search problems in density, enrichment and concentration are carried out. Results show that this method is quite promising in accuracy and efficiency, and has advantages compared with other criticality search methods.
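
    The final step of such a search, solving a polynomial built from the initial k_eff and its differential coefficients for the parameter change that makes k_eff unity, can be sketched with invented Taylor coefficients (this is not output from RMC).

```python
import numpy as np

# Hypothetical output of one perturbation-tallied Monte Carlo run:
# k_eff(x0 + dx) ~ k0 + c1*dx + c2*dx^2  (coefficients invented for illustration).
k0, c1, c2 = 1.02500, -0.350, 0.040

# Criticality search: find dx such that k_eff(x0 + dx) = 1.
roots = np.roots([c2, c1, k0 - 1.0])
real_roots = roots[np.isreal(roots)].real
dx = real_roots[np.argmin(np.abs(real_roots))]   # take the smallest adjustment as the physical one
print(f"predicted parameter change for criticality: dx = {dx:+.4f}")
print(f"predicted k_eff at that point: {k0 + c1*dx + c2*dx**2:.5f}")
```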

  15. Monte Carlo numerical study of lattice field theories

    International Nuclear Information System (INIS)

    Gan Cheekwan; Kim Seyong; Ohta, Shigemi

    1997-01-01

    The authors are interested in exact first-principles calculations of quantum field theories. For quantum chromodynamics (QCD) at low energy scales, a nonperturbative method is needed, and the only known such method is the lattice method. The path integral can be evaluated by putting the system in a finite 4-dimensional volume and discretizing the space-time continuum into a finite set of points, the lattice. The continuum limit is taken by making the lattice infinitely fine. For such a finite-dimensional integral, a Monte Carlo numerical estimate of the path integral can be obtained. The calculation of light hadron masses in quenched lattice QCD with staggered quarks, a 3-dimensional Thirring model calculation, and the development of the self-test Monte Carlo method have been carried out using the RIKEN supercomputer. For lattice QCD, the motivation of this study, the lattice QCD formulation, the continuum limit, Monte Carlo updates, hadron propagators, light hadron masses, auto-correlation and source size dependence are described. The phase structure of the 3-dimensional Thirring model for a small 8^3 lattice has been mapped. The self-test Monte Carlo method is also discussed. (K.I.)

  16. Continuous energy Monte Carlo method based lattice homogenization

    International Nuclear Information System (INIS)

    Li Mancang; Yao Dong; Wang Kan

    2014-01-01

    Based on the Monte Carlo code MCNP, the continuous energy Monte Carlo multi-group constants generation code MCMC has been developed. The track length scheme has been used as the foundation of cross section generation. The scattering matrix and Legendre components require special techniques, and the scattering event method has been proposed to solve this problem. Three methods have been developed to calculate the diffusion coefficients for diffusion reactor core codes and the Legendre method has been applied in MCMC. To satisfy equivalence theory, the general equivalence theory (GET) and the superhomogenization method (SPH) have been applied to the Monte Carlo based group constants. The super equivalence method (SPE) has been proposed to improve the equivalence. GET, SPH and SPE have been implemented into MCMC. The numerical results showed that generating the homogenized multi-group constants via the Monte Carlo method overcomes the difficulties in geometry and treats energy as a continuum, thus providing more accurate parameters. Besides, the same code and data library can be used for a wide range of applications owing to this versatility. The MCMC scheme can be seen as a potential alternative to the widely used deterministic lattice codes. (authors)

  17. Prevalence of cataract surgery and visual outcomes in Indian immigrants in Singapore: the Singapore Indian eye study.

    Science.gov (United States)

    Gupta, Preeti; Zheng, Yingfeng; Ting, Tay Wan; Lamoureux, Ecosse L; Cheng, Ching-Yu; Wong, Tien-Yin

    2013-01-01

    To determine the prevalence of cataract surgery and factors associated with post-surgical visual outcomes in migrant Indians living in Singapore. We conducted a population-based study in 3,400 Indian immigrants residing in Singapore, the Singapore Indian Eye Study (SINDI). All participants underwent a comprehensive medical eye examination and a standardized interview. Post-operative visual impairment (VI) was defined as best-corrected or presenting visual acuity (BCVA or PVA) of 20/60 or worse. The age- and gender-standardized prevalence of cataract surgery was 9.7% (95% confidence interval [CI]: 8.9%, 10.7%) in Singapore resident Indians. Post-operative VI defined by BCVA occurred in 10.9% of eyes (87/795). The main causes of post-operative VI were diabetic retinopathy (20.7%), posterior capsular opacification (18.4%), and age-related macular degeneration (12.6%). Undercorrected refractive error doubled the prevalence of post-operative VI when PVA was used. The rate of cataract surgery is about 10% in Indian residents in Singapore. Socioeconomic variables and migration had no significant impact on the prevalence of cataract surgery. Diabetic retinopathy was a major cause of post-operative VI in migrant Indians living in Singapore. Correction of postoperative refractive error remains an efficient way to improve vision.

  18. DETERMINATION OF THE ASIAN CALL OPTION PRICE USING THE MONTE CARLO-CONTROL VARIATE METHOD

    Directory of Open Access Journals (Sweden)

    NI NYOMAN AYU ARTANADI

    2017-01-01

    Full Text Available An option is a contract between the writer and the holder which entitles the holder to buy or sell an underlying asset at the maturity date for a specified price known as the exercise price. An Asian option is a type of financial derivative whose payoff depends on the average value of the asset price over the life of the contract. The aim of the study is to present the Monte Carlo-Control Variate method as an extension of standard Monte Carlo applied to the calculation of the Asian option price. Standard Monte Carlo with 10,000,000 simulations gives a standard error of 0.06 with the option price converging at Rp 160.00, while Monte Carlo-Control Variate with 100,000 simulations gives a standard error of 0.01 with the option price converging at Rp 152.00. This shows that the Monte Carlo-Control Variate method converges to the option price faster than standard Monte Carlo.
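
    A sketch of the comparison under Black-Scholes dynamics, using the discounted terminal stock price (whose expectation is exactly S0) as a simple control variate; the contract parameters are illustrative and the rupiah figures of the study are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2024)
S0, K, r, sigma, T, n_steps = 100.0, 100.0, 0.05, 0.25, 1.0, 50

def asian_payoffs(n_paths):
    """Discounted arithmetic-average Asian call payoffs plus a control variate per path."""
    dt = T / n_steps
    z = rng.normal(size=(n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    s = S0 * np.exp(log_paths)
    payoff = np.exp(-r * T) * np.maximum(s.mean(axis=1) - K, 0.0)
    control = np.exp(-r * T) * s[:, -1]           # known expectation: E[control] = S0
    return payoff, control

# Standard Monte Carlo.
payoff, control = asian_payoffs(100_000)
plain = payoff.mean()

# Control-variate estimator: subtract beta*(control - S0), beta chosen to minimise variance.
beta = np.cov(payoff, control)[0, 1] / np.var(control)
cv = payoff - beta * (control - S0)

print(f"plain MC price: {plain:.3f}  (std.err {payoff.std(ddof=1) / np.sqrt(len(payoff)):.4f})")
print(f"control-variate price: {cv.mean():.3f}  (std.err {cv.std(ddof=1) / np.sqrt(len(cv)):.4f})")
```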

  19. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    International Nuclear Information System (INIS)

    Pevey, Ronald E.

    2005-01-01

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL
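
    The trade-off described above can be evaluated directly under the stated normality assumption; the sketch below computes the probability that a configuration actually above the USL is accepted as subcritical, as a function of the calculational standard deviation, with all benchmarking numbers invented.

```python
import numpy as np
from scipy.stats import norm

# Illustrative benchmarking outcomes (not from any real validation study).
bias, sigma_bias = 0.005, 0.003      # mean and standard deviation of the calculational bias
usl = 0.95                           # upper subcritical limit
n_sigma = 2.0                        # margin added to the calculated k-effective
k_true = 0.96                        # a configuration that is actually above the USL

# A configuration is (wrongly) accepted if k_calc + n_sigma*sigma_calc <= usl,
# where k_calc ~ Normal(k_true - bias, sqrt(sigma_bias^2 + sigma_calc^2)).
for sigma_calc in (0.0005, 0.001, 0.002, 0.004, 0.008):
    mean = k_true - bias
    spread = np.sqrt(sigma_bias**2 + sigma_calc**2)
    p_accept = norm.cdf((usl - n_sigma * sigma_calc - mean) / spread)
    print(f"sigma_calc={sigma_calc:.4f}: P(accepting a configuration above the USL) = {p_accept:.3%}")
```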

  20. Biased Monte Carlo optimization: the basic approach

    International Nuclear Information System (INIS)

    Campioni, Luca; Scardovelli, Ruben; Vestrucci, Paolo

    2005-01-01

    It is well known that the Monte Carlo method is very successful in tackling several kinds of system simulations. It often happens that one has to deal with rare events, and the use of a variance reduction technique is almost mandatory in order to have efficient Monte Carlo applications. The main issue associated with variance reduction techniques is the choice of the value of the biasing parameter. In practice, this task is typically left to the experience of the Monte Carlo user, who has to make many attempts before achieving an advantageous biasing. A valuable result is provided: a methodology and a practical rule for establishing a priori guidance for the choice of the optimal value of the biasing parameter. This result, which has been obtained for a single-component system, has the notable property of being valid for any multicomponent system. In particular, in this paper, the exponential and the uniform biasing of exponentially distributed phenomena are investigated thoroughly.
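
    A brief numerical illustration of the parameter-choice problem: estimating the tail probability of an exponentially distributed time by sampling from an exponentially biased density and scanning the biasing rate for the value that minimises the sample variance; the rates and threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
rate, t_cut = 1.0, 8.0                      # true rate and threshold: P(T > t_cut) = exp(-8)
exact = np.exp(-rate * t_cut)

def biased_estimate(n, biased_rate):
    """Importance-sampled estimate of P(T > t_cut) with T ~ Exp(rate),
    sampling from Exp(biased_rate) and weighting each sample by p(t)/q(t)."""
    t = rng.exponential(1.0 / biased_rate, n)
    weight = (rate * np.exp(-rate * t)) / (biased_rate * np.exp(-biased_rate * t))
    score = weight * (t > t_cut)
    return score.mean(), score.var(ddof=1)

print(f"exact value: {exact:.3e}")
for b in (1.0, 0.5, 0.25, 0.125, 0.0625):   # b = 1.0 is the analog (unbiased) sampling
    mean, var = biased_estimate(200_000, b)
    print(f"biased rate {b:>6}: estimate={mean:.3e}  sample variance={var:.3e}")
```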