WorldWideScience

Sample records for superfund program prior

  1. Restoration principles and criteria: superfund program policy for cleanup at radiation contaminated sites

    International Nuclear Information System (INIS)

    Walker, Stuart

    2006-01-01

    The Environmental Protection Agency (EPA) Office of Superfund Remediation and Technology Innovation (OSRTI) is responsible for implementing the long-term (non-emergency) portion of a key U.S. law regulating cleanup: the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), nicknamed 'Superfund'. The purpose of the Superfund program is to protect human health and the environment over the long term from releases or potential releases of hazardous substances from abandoned or uncontrolled hazardous waste sites. The focus of this paper is on Superfund, including how radiation is addressed by the Superfund program. This paper provides a brief overview of the approach used by EPA to conduct Superfund cleanups at contaminated sites, including those that are contaminated with radionuclides, to ensure protection of human health and the environment. The paper addresses how EPA Superfund determines if a site poses a risk to human health and the framework used to determine cleanup levels. The theme emphasized throughout the paper is that within the Superfund remediation framework, radioactive contamination is addressed in a manner consistent with chemical contamination, except where necessary to account for the technical differences between radionuclides and chemicals. This consistency is important since at every radioactively contaminated site being addressed under Superfund's primary program for long-term cleanup, the National Priorities List (NPL), chemical contamination is also present. (author)

  2. U.S. EPA Superfund Program's Policy for Community Involvement at Radioactively Contaminated Sites

    International Nuclear Information System (INIS)

    Carey, Pat; Walker, Stuart

    2008-01-01

    This paper describes the Superfund program's statutory requirements for community involvement. It also discusses the efforts the Superfund program has made that go beyond these statutory requirements to involve communities. The Environmental Protection Agency (EPA) implements the Superfund program under the authority of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA), as amended by the Superfund Amendments and Reauthorization Act of 1986 (SARA). From the beginning of the Superfund program, Congress envisioned a role for communities. This role has evolved and expanded during the implementation of the Superfund program. Initially, the CERCLA statute had community involvement requirements designed to inform surrounding communities of the work being done at a site. CERCLA's provisions required 1) development of a community relations plan for each site, 2) establishment of information repositories near each site where all publicly available materials related to the site would be accessible for public inspection, 3) opportunities for the public to comment on the proposed remedy for each site and 4) development of a responsiveness summary responding to all significant comments received on the proposed remedy. In recognition of the need for people living near Superfund sites to be well-informed and involved with decisions concerning sites in their communities, SARA expanded Superfund's community involvement activities in 1986. SARA provided the authority to award Technical Assistance Grants (TAGs) to local communities enabling them to hire independent technical advisors to assist them in understanding technical issues and data about the site. The Superfund Community Involvement Program has sought to effectively implement the statutory community involvement requirements, and to go beyond those requirements to find meaningful ways to involve citizens in the cleanup of sites in their communities. We've structured our program around

  3. Evaluating public participation in environmental decision-making: EPA's superfund community involvement program.

    Science.gov (United States)

    Susan Charnley; Bruce. Engelbert

    2005-01-01

    This article discusses an 8-year, ongoing project that evaluates the Environmental Protection Agency's Superfund community involvement program. The project originated as a response to the Government Performance and Results Act, which requires federal agencies to articulate program goals, and evaluate and report their progress in meeting those goals. The evaluation...

  4. Superfund fact sheet: The remedial program. Fact sheet

    International Nuclear Information System (INIS)

    1992-09-01

    The fact sheet describes the various actions EPA can take to clean up hazardous waste sites. It explains how the criteria for environmental and public health risk assessment are determined and describes the role of state and local governments in site remediation. The fact sheet is one in a series providing reference information about Superfund issues and is intended for readers with no formal scientific training.

  5. U.S. EPA Superfund Program's Policy for Community Involvement at Radioactively Contaminated Sites

    International Nuclear Information System (INIS)

    Martin, K.; Walker, St.

    2009-01-01

    This paper describes the EPA Superfund program's statutory requirements for community involvement. It also discusses the efforts the Superfund program has made that go beyond these statutory requirements to involve communities, and what lessons have been learned by EPA when trying to conduct meaningful community involvement at sites. In addition, it discusses tools that EPA has designed to specifically enhance community involvement at radioactively contaminated Superfund sites. In summary, the Superfund program devotes substantial resources to involving the local community in the site cleanup decision-making process. We believe community involvement provides us with highly valuable information that must be available in order to carefully consider remedial alternatives at a site. We also find that our employees enjoy their jobs more: rather than fighting with an angry public, they can work collaboratively to solve the problems created by hazardous waste sites. We have learned that the time and resources we devote at the beginning of a project to developing relationships with the local community and learning about its issues and concerns are well spent. We believe the evidence shows this up-front investment helps us make better cleanup decisions and avoids last-minute efforts to work with a hostile community that feels left out of the decision-making process. (authors)

  6. Superfund impasse

    International Nuclear Information System (INIS)

    Dowd, R.M.

    1988-01-01

    EPA recently reported to Congress on the status of the Superfund program. A review of the report reveals that Superfund is a costly, slow-moving juggernaut that consumes an ever-growing share of resources and threatens to overwhelm other, more pressing environmental issues. EPA was given a broad mandate to clean up hazardous-waste sites when Congress enacted the Comprehensive Environmental Response, Compensation, and Liability Act in 1980 and established a $1.6 billion appropriation for a Superfund. In 1986 Congress extended the program for another five years and added $8.5 billion to complete the job, an overly optimistic estimate, as we shall see. Superfund is a huge program; the inventory of potentially hazardous waste sites is large and growing quickly. By the end of fiscal year 1987, EPA's inventory listed 27,571 hazardous-waste sites, and this number is increasing steadily at a rate of about 2,500 each year. The General Accounting Office suggests that there may be as many as 150,000 such sites.

  7. HISTORY AND ACCOMPLISHMENTS OF THE US EPA'S SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION (SITE) MONITORING AND MEASUREMENT (MMT) PROGRAM

    Science.gov (United States)

    This manuscript presents the history and evolution of the U.S. Environmental Protection Agency's (EPA) Superfund Innovative Technology Evaluation (SITE) Monitoring and Measurement Technology (MMT) Program. This includes a discussion of how the fundamental concepts of a performanc...

  8. Marketing Prior Learning Assessment Programs.

    Science.gov (United States)

    Heeger, Gerald A.

    1983-01-01

    Experiential learning programs must be marketed effectively if they are to succeed. The formulation of market strategy is discussed including: strategic planning; identification of a market target; and development of a market mix. A commitment to marketing academic programs is seen as a commitment to self-assessment. (MW)

  9. EPA [Environmental Protection Agency] SITE [Superfund Innovative Technology Evaluation] program seeks technology proposals

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    EPA will issue an RFP to initiate the SITE-005 solicitation for demonstration of technologies under the Superfund Innovative Technology Evaluation (SITE) Program. This portion of the SITE program offers a mechanism for conducting a joint technology demonstration between EPA and the private sector. The goal of the demonstration program is to provide an opportunity for developers to demonstrate the performance of their technologies on actual hazardous wastes at Superfund sites, and to provide accurate and reliable data on that performance. Technologies selected must be of commercial scale and provide solutions to problems encountered at Superfund sites. Primary emphasis in the RFP is on technologies that address: treatment of mixed, low-level radioactive wastes in soils and groundwater; treatment of soils and sludges contaminated with organics and/or inorganics; materials handling as a preliminary step to treatment or further processing; treatment trains designed to handle specific wastes; and in situ technologies, especially those processes providing alternatives to conventional groundwater pump-and-treat techniques.

  10. 1992 update of US EPA's Superfund Innovative Technology Evaluation (SITE) Emerging Technology Program

    International Nuclear Information System (INIS)

    Lewis, N.M.; Barkley, N.P.; Williams, T.

    1992-01-01

    The Superfund Innovative Technology Evaluation (SITE) Emerging Technology Program (ETP) has financially supported further development of bench- and pilot-scale testing and evaluation of innovative technologies for use at hazardous waste sites for five years. The ETP was established under the Superfund Amendments and Reauthorization Act (SARA) of 1986. The ETP supports the goal of the SITE Program to promote, accelerate, and make commercially available the development of alternative/innovative treatment technologies for use at Superfund sites. Technologies are submitted to the ETP through yearly solicitations for preproposals. Applicants are asked to submit a detailed project proposal and a cooperative agreement application that requires developer/EPA cost sharing. EPA co-funds selected developers for one to two years. Second-year funding requires documentation of significant progress during the first year. Facilities, equipment, data collection, performance, and development are monitored throughout the project. The US Department of Energy (DOE) and the US Air Force (USAF) are participants in the ETP. DOE has co-funded ETP projects since 1990 and the USAF since 1991. A goal of the ETP is to move developed technologies to the field-demonstration stage. A developer may be considered for participation in the SITE Demonstration Program if performance in the ETP indicates the technology is field-ready for evaluation. Six technology categories (biological, chemical, materials handling, physical, solidification/stabilization, and thermal) are presently represented in the ETP. Technologies of primary interest to EPA are those that can treat complex mixtures of hazardous organic and inorganic contaminants and provide improved solids handling and/or pretreatment. An account of the background and progress of the ETP's first five years is presented in this paper. Technologies currently in the ETP are noted, and developers and EPA project managers are listed. 4 refs., 11 figs., 6 tabs

  11. Superfund Sites

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This layer represents active Superfund Sites published by the Environmental Protection Agency (EPA). These data were extracted from the Superfund Enterprise...

  12. Superfund Technical Assistance Grants

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes data related to the Superfund Technical Assistance Grant program, including grant number, award amounts, award dates, period of performance,...

  13. Superfund Query

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Superfund Query allows users to retrieve data from the Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) database.

  14. Case studies of community relations on DOE's Formerly Utilized Sites Remedial Action Program as models for Superfund sites

    International Nuclear Information System (INIS)

    Plant, S.W.; Adler, D.G.

    1995-01-01

    Ever since the US Department of Energy (DOE) created its Formerly Utilized Sites Remedial Action Program (FUSRAP) in 1974, there has been a community relations program. The community relations effort has grown as FUSRAP has grown. With 20 of 46 sites now cleaned up, considerable experience in working with FUSRAP stakeholders has been gained. Why not share that experience with others who labor on the Superfund sites? Many similarities exist between the Superfund sites and FUSRAP. FUSRAP is a large, multiple-site environmental restoration program. The challenges range from small sites requiring remedial actions measurable in weeks to major sites requiring the full remedial investigation/feasibility study process. The numerous Superfund sites throughout the United States offer the same diversity, both geographically and technically. But before DOE offers FUSRAP's community relations experience as a model, it needs to make clear that this will be a realistic model. As experiences are shared, DOE will certainly speak of the efforts that achieved its goals. But many of the problems that DOE encountered along the way will also be related. FUSRAP relies on a variety of one- and two-way communication techniques for involving stakeholders in the DOE decision-making process. Some of the techniques and experiences from the case studies are presented

  15. New York's new Superfund regulations: Implications for federal and other state programs

    International Nuclear Information System (INIS)

    Pavetto, C.S.; Rubinton, D.S.

    1994-01-01

    The need for cleaning up hazardous waste disposal sites was identified early in New York. In fact, New York's 'Superfund' statute preceded the federal Superfund law, thereby providing a model for CERCLA. Moreover, there are currently almost as many sites on New York's Registry of Inactive Hazardous Waste Disposal Sites as there are sites on the National Priorities List. While New York's law served as a model for the federal CERCLA, CERCLA, in turn, has served as a model for other states' statutes. Similarly, lessons learned from the implementation of state Superfund statutes such as New York's can be instructive for those whose work involves dealing with CERCLA-type issues. This is because the problems associated with site restoration and cleanup, such as exceedingly complex site review and evaluation processes, high transaction costs, and difficulties in prioritizing sites for cleanup based upon the threat or risk of environmental harm, are universal.

  16. Superfund Technology Evaluation Report: SITE Program Demonstration Test Shirco Pilot-Scale Infrared Incineration System at the Rose Township Demode Road Superfund Site Volume I

    Science.gov (United States)

    The Shirco Pilot-Scale Infrared Incineration System was evaluated during a series of seventeen test runs under varied operating conditions at the Demode Road Superfund Site located in Rose Township, Michigan. The tests sought to demonstrate the effectiveness of the unit and the t...

  17. Structure of NCI Cooperative Groups Program Prior to NCTN

    Science.gov (United States)

    Learn how the National Cancer Institute’s Cooperative Groups Program was structured prior to its being replaced by NCI’s National Clinical Trials Network (NCTN). The NCTN gives funds and other support to cancer research organizations to conduct cancer clinical trials.

  18. Superfund TIO videos: Set A. Overview of Superfund, response activities and responsibilities, site discovery, notification, and evaluation. Part 1. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape is divided into three sections. Section 1 discusses the development and framework of CERCLA and the Superfund Program and outlines the implementing rules that guide Superfund site cleanups. The Superfund response actions - remedial, removal, and enforcement - are reviewed. Section 2 outlines On-Scene Coordinator's (OSC) and Remedial Project Manager's (RPM) roles and responsibilities in Superfund removal, remedial, and enforcement activities. The other players involved in Superfund response activities also are identified. Section 3 describes how EPA learns of potential Superfund sites and lists the authorities that determine the requirements for site discovery. The procedures used to prioritize the sites and to identify and select sites for remediation are discussed

  19. Stakeholder views of superfund sites

    International Nuclear Information System (INIS)

    English, M.R.

    1992-01-01

    Nearly ten years have passed since the enactment of the federal Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), usually referred to as 'Superfund'. Nearly four years have passed since CERCLA's major overhaul through the Superfund Amendments and Reauthorization Act (SARA). Although much still remains to be done under Superfund, there is now enough experience to assess how effectively it is working. A study undertaken by the University of Tennessee's Waste Management Research and Education Institute supplies a portion of that assessment; the study was completed in the fall of 1990. Our study examines two related issues: the resources that will be needed in the coming years to fulfill the mandate of Superfund and other hazardous waste remediation programs, and the site-level experience to date in implementing CERCLA and SARA. This chapter discusses only the 'site-level experience' effort, and only its methodological approach. The purpose of the 'site-level experience' effort is to explore what counts as a 'successful' site in the eyes of different stakeholders in a Superfund cleanup - e.g., the affected community, the potentially responsible parties (PRPs), state and local officials, and the US Environmental Protection Agency (EPA)

  20. Cleanups In My Community (CIMC) - Superfund National Priority List (NPL) Sites, National Layer

    Data.gov (United States)

    U.S. Environmental Protection Agency — This data layer provides access to Superfund National Priority List Sites as part of the CIMC web service. Superfund is a program administered by the EPA to locate,...

  1. Superfund TIO videos: Set B. Financial management and SCAP. Part 8. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape covers various aspects of financial management for the Superfund Program. The importance of effective financial management and execution is discussed. The objectives and definitions of the Superfund Comprehensive Accomplishment Plan (SCAP) and the roles and responsibilities of Superfund personnel in the SCAP process are covered

  2. Superfund Site Information

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes a number of individual data sets related to site-specific information for Superfund, which is governed under the Comprehensive Environmental...

  3. Report: Remedial Project Manager Turnover at Superfund Sites

    Science.gov (United States)

    Report #2001-M-000015, June 15, 2001. We determined that EPA Region III did not have formal procedures in place to mitigate continuity problems caused by turnover of EPA personnel in the Superfund program.

  4. 13 CFR 115.62 - Prohibition on participation in Prior Approval program.

    Science.gov (United States)

    2010-01-01

    ... Prior Approval program. A PSB Surety is not eligible to submit applications under subpart B of this part. This prohibition does not extend to an Affiliate, as defined in 13 CFR § 121.103, of a PSB...

  5. Weapon System Requirements: Detailed Systems Engineering Prior to Product Development Positions Programs for Success

    Science.gov (United States)

    2016-11-01

    ...modified, replaced, or sustained by consumers or different manufacturers in addition to the manufacturer that developed the system. It also allows... Why GAO Did This Study: Cost and schedule growth in DOD major defense...

  6. SITE COMPREHENSIVE LISTING (CERCLIS) (Superfund)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) (Superfund) Public Access Database contains a selected set of...

  7. TECHNOLOGY EVALUATION REPORT, SITE PROGRAM DEMONSTRATION TEST: SHIRCO PILOT-SCALE INFRARED INCINERATION SYSTEM ROSE TOWNSHIP DEMODE ROAD SUPERFUND SITE - VOLUME II

    Science.gov (United States)

    The performance of the Shirco pilot-scale infrared thermal destruction system has been evaluated at the Rose Township, Demode Road Superfund Site and is presented in the report. The waste tested consisted of solvents, organics and heavy metals in an illegal dump site. Volume I gi...

  8. Does Teaching Experience Matter? Examining Biology Teachers' Prior Knowledge for Teaching in an Alternative Certification Program

    Science.gov (United States)

    Friedrichsen, Patricia J.; Abell, Sandra K.; Pareja, Enrique M.; Brown, Patrick L.; Lankford, Deanna M.; Volkmann, Mark J.

    2009-01-01

    Alternative certification programs (ACPs) have been proposed as a viable way to address teacher shortages, yet we know little about how teacher knowledge develops within such programs. The purpose of this study was to investigate prior knowledge for teaching among students entering an ACP, comparing individuals with teaching experience to those…

  9. SHIRCO PILOT-SCALE INFRARED INCINERATION SYSTEM AT THE ROSE TOWNSHIP DEMODE ROAD SUPERFUND SITE

    Science.gov (United States)

    Under the Superfund Innovative Technology Evaluation or SITE Program, an evaluation was made of the Shirco Pilot-Scale Infrared Incineration System during 17 separate test runs under varying operating conditions. The tests were conducted at the Demode Road Superfund site in Ros...

  10. The remediation process: Approach and elements of the Department of Energy's environmental restoration program in a Superfund environment

    International Nuclear Information System (INIS)

    Lehr, J.C.

    1993-01-01

    The Department of Energy (DOE) operates a large industrial complex located at various production, processing, testing, and research and development installations across the country. During the 40+ years of operation, this complex generated and managed waste to then-current standards. However, some of these waste management practices have subsequently been proven to be inadequate for long-term environmental protection. The Office of Environmental Restoration and Waste Management (EM) was established in 1989, when DOE's top priority changed from nuclear weapons production to environmental cleanup. The Environmental Restoration (ER) Program within EM was tasked to ensure that risks to human health and the environment posed by DOE's past operations at its nuclear facilities and sites are eliminated or reduced to prescribed, safe levels. Since its creation, the ER Program has been one of the fastest growing programs in the Department, demonstrating the Secretary's commitment to the new clean-up priority. (The 1989 budget was $400 million, while the 1993 budget is $1.8 billion.) As new technologies are developed and new management strategies implemented, the program will continue to expand. This paper describes the environmental remediation process from its early assessment phase to the final compliance effort

  11. Cleanups In My Community (CIMC) - Base Realignment and Closure (BRAC) Superfund Sites, National Layer

    Science.gov (United States)

    This data layer provides access to Base Realignment and Closure (BRAC) Superfund Sites as part of the CIMC web service. EPA works with DoD to facilitate the reuse and redevelopment of BRAC federal properties. When the BRAC program began in the early 1990s, EPA worked with DoD and the states to identify uncontaminated areas, and these parcels were immediately made available for reuse. Since then EPA has worked with DoD to clean up the contaminated portions of bases. These are usually parcels that were training ranges, landfills, maintenance facilities and other past waste-disposal areas. Superfund is a program administered by the EPA to locate, investigate, and clean up the worst hazardous waste sites throughout the United States. EPA administers the Superfund program in cooperation with individual states and tribal governments. These sites include abandoned warehouses, manufacturing facilities, processing plants, and landfills - the key word here being abandoned. This data layer shows Superfund Sites that are located at BRAC Federal Facilities. Additional Superfund sites and other BRAC sites (those that are not Superfund sites) are included in other data layers as part of this web service. BRAC Superfund Sites shown in this web service are derived from the epa.gov website and include links to the relevant web pages within the attribute table. Data about BRAC Superfund Sites are located on their own EPA web pages, and CIMC links to those pages. The CIMC web service

  12. Impact of formulary restriction with prior authorization by an antimicrobial stewardship program.

    Science.gov (United States)

    Reed, Erica E; Stevenson, Kurt B; West, Jessica E; Bauer, Karri A; Goff, Debra A

    2013-02-15

    In an era of increasing antimicrobial resistance and few antimicrobials in the developmental pipeline, many institutions have developed antimicrobial stewardship programs (ASPs) to help implement evidence-based (EB) strategies for ensuring appropriate utilization of these agents. EB strategies for accomplishing this include formulary restriction with prior authorization. Potential limitations to this particular strategy include delays in therapy, prescriber pushback, and unintended increases in use of unrestricted antimicrobials. However, our ASP found that implementing prior authorization for select antimicrobials, along with a significant effort to educate clinicians on criteria for use, ensured more appropriate prescribing of these agents, hopefully helping to preserve their utility for years to come.

  13. Key Principles of Superfund Remedy Selection

    Science.gov (United States)

    Guidance on the primary considerations of remedy selection that are universally applicable at Superfund sites. Key guidance here includes: Rules of Thumb for Superfund Remedy Selection and Role of the Baseline Risk Assessment.

  14. 75 FR 38100 - National Institute of Environmental Health Sciences Superfund Hazardous Substance Research and...

    Science.gov (United States)

    2010-07-01

    ...-traditional communication methods to make the significance and applicability of SRP-funded research... and Social Sciences Research, and National Institute of Biomedical Imaging and Bioengineering... Superfund Hazardous Substance Research and Training Program Strategic Plan; Request for Comments ACTION...

  15. Challenge of superfund community relations

    International Nuclear Information System (INIS)

    Goldman, N.J.

    1991-01-01

    Conducting a community relations effort in a community which is home to a Superfund site is a formidable challenge. Any education effort, however appropriate, quickly falls victim to the doubt, mistrust, or fears of the very public intended to be served by the effort. While each site is unique, the issues raised by affected communities in one part of the country are strikingly similar to those raised in other parts. Those most involved must join those most affected in seeking meaningful solutions and in building the trust that is so vital in moving forward with Superfund.

  16. Baccalaureate Student Nurses' Study Habits Prior to Admission to Nursing Program: A Descriptive Qualitative Study.

    Science.gov (United States)

    Felicilda-Reynaldo, Rhea Faye D; Cruz, Jonas Preposi; Bigley, Louise; Adams, Kathryn

    2017-06-01

    Faculty continue to observe students struggling as they adapt their study strategies to learn nursing core content. This study described the study habits of Bachelor of Science in Nursing (BSN) students prior to admission to the program. This study used a descriptive qualitative research design. A purposive sample of 19 BSN students (juniors [n=10] and seniors [n=9]) from a 4-year public Midwestern university were included in this study. Two focus group sessions, using a semi-structured interview guide, were conducted in the spring semester of 2013. The four themes that emerged from the analysis of data were: "I just got it," "I had a lot of time then," "I studied alone" mostly, and "…a little struggle with the sciences." The findings suggest the BSN students either did not study much or employed poor study strategies during their years completing general education courses. Academic support is needed by students prior to admission to the nursing program so they can learn effective study skills and modify their study habits for easier adaptation to the rigors of nursing education. Published by Elsevier Ltd.

  17. Alternating current electrocoagulation for Superfund site remediation

    International Nuclear Information System (INIS)

    Farrell, C.W.

    1991-01-01

    A study is being conducted by Electro-Pure Systems, Inc. (EPS) under the Emerging Technology portion of the U.S. Environmental Protection Agency's (EPA's) Superfund Innovative Technology Evaluation (SITE) Program to evaluate alternating current electrocoagulation for Superfund site remediation. Alternating current electrocoagulation has proven to be effective in agglomerating and removing colloidal solids, metals and certain organic contaminants from surrogate soils prepared from the US EPA's Synthetic Soil Matrix. Treatments under a wide range of operating conditions have enabled the optimum parameter settings to be established for multiple phase separation. Electrocoagulation enables appreciably enhanced filtration and dewatering rates to be realized for metals- and diesel fuel-spiked surrogate soil slurries; such enhancements are prompted by growth in the mean particle size of the clays and particulates from typically < 10 microns to as much as 150 microns, depending on the degree of electrocoagulation. Reductions in excess of 90% in the total suspended solids content of clays in all slurries can routinely be achieved. Bench-scale experiments on the metals-spiked surrogate soils indicate that electrocoagulation preferentially concentrates soluble metals into the sludge phase; excellent metals separation (Pb, Cr, Cu, Cd) can be realized. Experiments on surrogate wastes spiked with volatile organics suggest that this technology is not capable of effecting good volatile extractions from the aqueous phase. Reductions in excess of 80% in the total organic carbon (TOC) content of the diesel fuel-spiked surrogates can, however, be achieved.

  18. Research and Teaching: Computational Methods in General Chemistry--Perceptions of Programming, Prior Experience, and Student Outcomes

    Science.gov (United States)

    Wheeler, Lindsay B.; Chiu, Jennie L.; Grisham, Charles M.

    2016-01-01

    This article explores how integrating computational tools into a general chemistry laboratory course can influence student perceptions of programming and investigates relationships among student perceptions, prior experience, and student outcomes.

  19. SITE COMPREHENSIVE LISTING (CERCLIS) (Superfund) - NPL Sites

    Data.gov (United States)

    U.S. Environmental Protection Agency — National Priorities List (NPL) Sites - The Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) (Superfund) Public Access...

  20. CERCLIS (Superfund) ASCII Text Format - CPAD Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) (Superfund) Public Access Database (CPAD) contains a selected set...

  1. Superfund Site Information - Site Sampling Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes Superfund site-specific sampling information including location of samples, types of samples, and analytical chemistry characteristics of...

  2. Region 9 NPL Sites (Superfund Sites 2013)

    Science.gov (United States)

    NPL site POINT locations for the US EPA Region 9. NPL (National Priorities List) sites are hazardous waste sites that are eligible for extensive long-term cleanup under the Superfund program. Eligibility is determined by a scoring method called the Hazard Ranking System. Sites with high scores are listed on the NPL. The majority of the locations are derived from polygon centroids of digitized site boundaries. The remaining locations were generated from address geocoding and digitizing. The area covered by this data set includes Arizona, California, Nevada, Hawaii, Guam, American Samoa, the Northern Marianas, and the Trust Territories. Attributes include NPL status codes, NPL industry type codes, and environmental indicators. The related table NPL_Contaminants contains information about contaminated media types and chemicals. This is a one-to-many relate; the table can be related to the feature class using the relationship classes under the Feature Data Set ENVIRO_CONTAMINANT.
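
    The record above describes a one-to-many relate between the NPL site features and the NPL_Contaminants table. As a rough illustration of how such a relate behaves, the sketch below joins a toy site table to a toy contaminants table on a shared key; the column names and values (SITE_ID, SITE_NAME, MEDIA, CHEMICAL, and so on) are hypothetical placeholders, not the actual Region 9 schema.

      # Hypothetical sketch of a one-to-many relate: each site row can match
      # several rows in the contaminants table. Column names are placeholders,
      # not the actual Region 9 attribute schema.
      import pandas as pd

      npl_sites = pd.DataFrame({
          "SITE_ID": ["CA0001", "AZ0002"],
          "SITE_NAME": ["Example Site A", "Example Site B"],
          "NPL_STATUS": ["Final", "Proposed"],
      })

      npl_contaminants = pd.DataFrame({
          "SITE_ID": ["CA0001", "CA0001", "AZ0002"],
          "MEDIA": ["Groundwater", "Soil", "Sediment"],
          "CHEMICAL": ["TCE", "Lead", "Mercury"],
      })

      # Left join attaches every contaminant record to its parent site,
      # duplicating site attributes once per matching contaminant row.
      joined = npl_sites.merge(npl_contaminants, on="SITE_ID", how="left")
      print(joined)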

  3. Superfund Programmatic Information

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes an inventory of program policy and guidance documents that are used by the EPA regions, states, tribes and private parties to implement the...

  4. A linear programming computational framework integrates phosphor-proteomics and prior knowledge to predict drug efficacy.

    Science.gov (United States)

    Ji, Zhiwei; Wang, Bing; Yan, Ke; Dong, Ligang; Meng, Guanmin; Shi, Lei

    2017-12-21

    In recent years, the integration of 'omics' technologies, high-performance computation, and mathematical modeling of biological processes marks that systems biology has started to fundamentally impact the approach to drug discovery. The LINCS public data warehouse provides detailed information about cell responses to various genetic and environmental stressors. It can be greatly helpful in developing new drugs and therapeutics, as well as in addressing the lack of effective drugs, drug resistance, and relapse in cancer therapies. In this study, we developed a Ternary status based Integer Linear Programming (TILP) method to infer cell-specific signaling pathway networks and predict compounds' treatment efficacy. The novelty of our study is that phosphor-proteomic data and prior knowledge are combined for modeling and optimizing the signaling network. To test the power of our approach, a generic pathway network was constructed for the human breast cancer cell line MCF7, and the TILP model was used to infer MCF7-specific pathways with a set of phosphor-proteomic data collected from ten representative small-molecule chemical compounds (most of them studied in breast cancer treatment). Cross-validation indicated that the MCF7-specific pathway network inferred by TILP was reliable for predicting a compound's efficacy. Finally, we applied TILP to re-optimize the inferred cell-specific pathways and predict the outcomes of five small compounds (carmustine, doxorubicin, GW-8510, daunorubicin, and verapamil), which are rarely used in the clinic for breast cancer. In the simulation, the proposed approach allowed us to identify a compound's treatment efficacy qualitatively and quantitatively, and the cross-validation analysis indicated good accuracy in predicting the effects of the five compounds. In summary, the TILP model is useful for discovering new drugs for clinical use and for elucidating the potential mechanisms by which a compound acts on its targets.
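
    The abstract above describes an integer linear program over ternary node states fitted to phospho-proteomic observations under prior-knowledge network constraints. The toy sketch below (Python with PuLP) illustrates only that general idea and is not the authors' TILP formulation: the proteins, the observed states, and the single activating-edge constraint are hypothetical.

      # Minimal toy sketch of a ternary-status integer linear program in the
      # spirit of the abstract above; this is NOT the authors' TILP model.
      # Node states are ternary (-1 = down, 0 = unchanged, +1 = up). We fit
      # predicted states to hypothetical phospho-proteomic calls while a
      # prior-knowledge activating edge constrains the solution.
      from pulp import LpMinimize, LpProblem, LpStatus, LpVariable, lpSum, value

      nodes = ["EGFR", "ERK", "AKT"]                 # hypothetical proteins
      observed = {"EGFR": 1, "ERK": -1, "AKT": 0}    # hypothetical ternary calls
      activating_edges = [("EGFR", "ERK")]           # prior knowledge: EGFR activates ERK

      prob = LpProblem("ternary_pathway_fit", LpMinimize)

      # Ternary state variables and non-negative deviation variables per node.
      x = {n: LpVariable(f"x_{n}", lowBound=-1, upBound=1, cat="Integer") for n in nodes}
      d = {n: LpVariable(f"d_{n}", lowBound=0) for n in nodes}

      # Objective: minimize total mismatch with the observed ternary calls.
      prob += lpSum(d[n] for n in nodes)

      # Linearize |x_n - observed_n| with two inequalities per node.
      for n in nodes:
          prob += d[n] >= x[n] - observed[n]
          prob += d[n] >= observed[n] - x[n]

      # Activating edge (u -> v): the target's state cannot sit below its activator's.
      for u, v in activating_edges:
          prob += x[v] >= x[u]

      prob.solve()
      print(LpStatus[prob.status], {n: int(value(x[n])) for n in nodes})

    Because the hypothetical ERK observation conflicts with the activating edge, the solver returns one of several equally good compromises; the real TILP formulation optimizes states over a full prior-knowledge network rather than this two-constraint toy.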

  5. FUSRAP adapts to the amendments of Superfund

    International Nuclear Information System (INIS)

    Atkin, R.G.; Liedle, S.D.; Clemens, B.W.

    1988-01-01

    With the promulgation of the Superfund Amendments and Reauthorization Act (SARA), federal facilities were required to comply with the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) in the same manner as any non-government entity. This situation presented challenges for the Department of Energy (DOE) and other federal agencies involved in remedial action work because of the requirements under SARA that overlap other laws requiring DOE compliance, e.g., the National Environmental Policy Act (NEPA). This paper outlines options developed to comply with CERCLA and NEPA as part of an active, multi-site remedial action program. The program, the Formerly Utilized Sites Remedial Action Program (FUSRAP), was developed to identify, clean up, or control sites containing residual radioactive contamination resulting from the nation's early development of nuclear power. During the Manhattan Project, uranium was extracted from domestic and foreign ores, resulting in mill concentrates, purified metals, and waste products that were transported for use or disposal at other locations. Figure 1 shows the steps for producing uranium metal during the Manhattan Project. As a result of these activities, materials, equipment, buildings, and land became contaminated, primarily with naturally occurring radionuclides. Currently, FUSRAP includes 29 sites; three are on the Environmental Protection Agency's (EPA's) National Priorities List (NPL) of hazardous waste sites.

  6. Superfund National Priority List (NPL) Site Boundaries

    Data.gov (United States)

    U.S. Environmental Protection Agency — A set of site boundaries for each site in EPA Region 1 (Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, and Vermont) on EPA's Superfund National...

  7. Prior Work and Educational Experience Are Not Associated With Successful Completion of a Master's-Level, Distance Education Midwifery Program.

    Science.gov (United States)

    Niemczyk, Nancy A; Cutts, Alison; Perlman, Dana B

    2018-03-01

    In order to increase and diversify the midwifery workforce, admissions criteria for midwifery education programs must not contain unnecessary barriers to entry. Once accepted, students need to successfully complete the program. Many admissions criteria commonly used in midwifery education programs in the United States are not evidence based and could be unnecessary barriers to education. The primary objective of this study was to identify factors known during the admission process that were related to successful completion or failure to complete a midwifery program educating both student nurse-midwives (SNMs) and student midwives (SMs); a secondary objective was to quantify reasons for program noncompletion. This master's-level, distance education program educates a diverse group of both SNMs and SMs. A pilot, retrospective cohort study examined all students matriculating at the program from fall 2012 on and scheduled to graduate by summer 2016 (N = 58). Demographic information, admissions information, academic records, and advising notes were reviewed. Reasons for noncompletion were identified, and characteristics were compared between students who did and did not complete the program. Program completion was not significantly associated with students' status as nurses prior to admission, labor and delivery nursing experience, length of nursing experience, nursing degree held, presence of children at home, working while in school, or undergraduate grade point average. Being a nurse, years of nursing experience, type of nursing degree, or labor and delivery nursing experience were not associated with completion of this midwifery program. © 2018 by the American College of Nurse-Midwives.

  8. Stigma: The Psychology and Economics of Superfund (2004)

    Science.gov (United States)

    Study documents the long-term impacts of Superfund cleanup on property values in communities neighboring prominent Superfund sites, examining the sale prices of nearly 35,000 homes for up to a thirty-year period near six very large Superfund sites.

  9. Superfund TIO videos. Set A. Regulatory overview - CERCLA's relationship to other programs: RCRA, Title III, UST, CWA, SDWA. Part 1. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape is divided into five sections. Section 1 provides definitions and historical information on both the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The four types of RCRA regulatory programs - Subtitles C, D, I, and J - are described. Treatment, storage, and disposal (TSD) and recycling facilities are also discussed. Section 2 discusses the history behind the Emergency Planning and Community Right-to-Know Act (Title III). The four major provisions of Title III, which are emergency planning, emergency release notification, community right-to-know reporting, and the toxic chemical release inventory, are covered. Section 3 outlines the underground storage tank (UST) program, covering notification, record keeping, and the UST Trust Fund. Section 4 outlines the six major provisions of the Clean Water Act (CWA): water quality, pretreatment, prevention of oil and hazardous substance discharges, responses to oil and hazardous substance discharges, discharges of hazardous substances into the ocean, and dredge and fill. Section 5 explains the purpose, regulations, and standards of the Safe Drinking Water Act (SDWA). Specific issues such as underground injection, sole source aquifers, and lead contamination are discussed.

  10. Gender, Interest, and Prior Experience Shape Opportunities to Learn Programming in Robotics Competitions

    Science.gov (United States)

    Witherspoon, Eben B.; Schunn, Christian D.; Higashi, Ross M.; Baehr, Emily C.

    2016-01-01

    Background: Robotics competitions are increasingly popular and potentially provide an on-ramp to computer science, which is currently highly gender imbalanced. However, within competitive robotics teams, student participation in programming is not universal. This study gathered surveys from over 500 elementary, middle, and high school robotics…

  11. A prior authorization program of a radiology benefits management company and how it has affected utilization of advanced diagnostic imaging.

    Science.gov (United States)

    Levin, David C; Bree, Robert L; Rao, Vijay M; Johnson, Jean

    2010-01-01

    Radiology benefits management companies have evolved in recent years to meet the need to control the rapid growth in advanced diagnostic imaging. The Obama administration and other key policymakers have proposed using them as a cost-control mechanism, but little is known about how they operate or what results they have produced. The main tool they use is prior authorization. The authors describe the inner workings of the call center of one radiology benefits management company and how its prior authorization program seems to have slowed the growth in the utilization of MRI, CT, and PET in the large markets of one commercial payer. Copyright 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  12. SUPERFUND TREATABILITY CLEARINGHOUSE: BDAT FOR SOLIDIFICATION/STABILIZATION TECHNOLOGY FOR SUPERFUND SOILS (DRAFT FINAL REPORT)

    Science.gov (United States)

    This report evaluates the performance of solidification as a method for treating solids from Superfund sites. Tests were conducted on four different artificially contaminated soils which are representative of soils found at the sites. Contaminated soils were solidified us...

  13. Superfund Sites as Anti-landscapes

    DEFF Research Database (Denmark)

    Nye, David

    2017-01-01

    Americans have used a range of narratives to make sense of their settlement and use of natural resources. This article focuses on narratives of environmental degradation after the United States passed legislation mandating the cleanup of toxic sites and provided a Superfund for that purpose. Thre...

  14. Motivational interviewing: a part of the weight loss program for overweight and obese women prior to fertility treatment.

    Science.gov (United States)

    Karlsen, Kamilla; Humaidan, Peter; Sørensen, Lise H; Alsbjerg, Birgit; Ravn, Pernille

    2013-09-01

    This is a retrospective study to investigate whether motivational interviewing increases weight loss among obese or overweight women prior to fertility treatment. Women with body mass index (BMI) > 30 kg/m² approaching the Fertility Clinic, Regional Hospital Skive, were given advice about diet and physical activity with the purpose of weight loss. In addition, they were asked if they wanted to receive motivational interviewing. Among other data, age, height and weight were obtained. Main outcomes were weight loss measured in kg and decrease in BMI. We studied 187 women: 110 received sessions of motivational interviewing (intervention group, n = 110), while 64 received motivational support by phone or e-mail only and 13 did not wish any motivational support (together forming the control group, n = 77). The mean weight loss and decrease in BMI were greater in the intervention group compared with the control group (9.3 kg versus 7.3 kg, p = 0.01; 3.3 kg/m² versus 2.6 kg/m², p = 0.02). The mean period of intervention was comparable in the two groups (7.9 months and 7.3 months, respectively; difference not significant). The study indicates that motivational interviewing may be a valuable tool in weight loss programs for obese and overweight women prior to fertility treatment.

  15. Fifteen years of Superfund at South Valley: Reengineering required

    International Nuclear Information System (INIS)

    Cormier, J.; Horak, F.

    1995-01-01

    It is no surprise to many of Superfund's practitioners that the law and its application are flawed. The South Valley Superfund Site in Albuquerque, New Mexico has not escaped Superfund's problems. The problems and issues arising out of the South Valley Superfund site have spurred the desire to seek a better way to administer and manage cleanup. This new method applies organizational and role changes that bring Superfund closer to an efficient business-like entity. This 'Reengineered' Superfund strives for reorganization, contractor reduction, improved communication, reporting reduction, and teaming. In addition, modifications are made to the roles of regulators, potentially responsible parties (PRPs), and the public. Today the site encompasses roughly one square mile in area, includes six identified contaminant sources, and deals with solvent and petroleum by-product contamination.

  16. SITE COMPREHENSIVE LISTING (CERCLIS) - Contaminants at CERCLIS (Superfund) Sites

    Data.gov (United States)

    U.S. Environmental Protection Agency — Contaminants at Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) (Superfund) Sites - The CERCLIS Public Access Database...

  17. SITE COMPREHENSIVE LISTING (CERCLIS) (Superfund) - Responsible Parties at CERCLIS Sites

    Data.gov (United States)

    U.S. Environmental Protection Agency — Responsible Parties at CERCLIS Sites - The Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) (Superfund) Public Access...

  18. SITE COMPREHENSIVE LISTING (CERCLIS) (Superfund) - Non-NPL Sites

    Data.gov (United States)

    U.S. Environmental Protection Agency — Non-NPL Sites - The Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) (Superfund) Public Access Database contains a...

  19. Superfund at work: Hazardous waste cleanup efforts nationwide, spring 1993 (Radium Chemical Site profile, Queens, New York)

    International Nuclear Information System (INIS)

    1993-01-01

    The Radium Chemical hazardous waste site in Queens, New York was contaminated with radium, posing a grave potential threat to the community. The US Environmental Protection Agency (EPA) used the Superfund program to design a long-term cleanup for the site using input from citizens and the business community. Superfund staff: Mobilized a quick cleanup action to remove 10,000 small containers of radium; Developed a streamlined approach to long-term cleanup; Secured the site to reduce the possibility of radiation exposure to the local residents; Cooperated with the community to design a well-organized emergency response plan; and Educated local citizens about site hazards, incorporating community concerns into the cleanup process. The Radium Chemical site is a clear example of EPA's effective management and problem-solving strategies at Superfund sites

  20. 2005 to 2014 CT and MRI Utilization Trends in the Context of a Nondenial Prior Authorization Program

    Directory of Open Access Journals (Sweden)

    Adam C. Powell

    2017-10-01

    Purpose: Reducing unnecessary testing may benefit patients, as some computed tomography (CT) and magnetic resonance imaging (MRI) exams expose patients to contrast, and all CTs expose patients to radiation. This observational study with historical controls assessed shifts in CT and MRI utilization over a 9-year period after a private health insurer's implementation of a nondenial, consultative prior authorization program. Methods/Materials: Normalized rates of exams per 1000 person-years were plotted over 2005 to 2014 for people with commercial and Medicare Advantage health plans in the San Antonio market, with 2005 utilization set as the baseline. The program was implemented at the start of 2006. Computed tomography and MRI utilization changes were compared with contemporaneous changes in low-tech plain film and ultrasound utilization. Results: Growth in high-tech imaging utilization decelerated or reversed during the period. In 2006, CT utilization dropped to between 76% and 90% of what it had been in 2005, depending on the plan. In 2014, it was between 52% and 88% of its initial level. MRI utilization declined to between 86% and 94% of its initial level in 2006, and then to between 50% and 75% in 2014. Ultrasound utilization was greater in 2014 than in 2005 for some plans. Plain film utilization declined between 2005 and 2014 for all plans. Conclusion: There was an immediate and sustained decline in CT and MRI utilization after the introduction of the program. While many factors may have impacted the long-term trends, the mixed trends in low-tech imaging suggest that a decline in low-tech imaging was not responsible for the decline in CT and MRI utilization.
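
    The study above normalizes exams per 1000 person-years to a 2005 baseline. The snippet below illustrates that arithmetic only; the exam counts and person-year denominators are invented for the example and do not come from the study.

      # Illustrative arithmetic only: normalizing imaging utilization to a
      # 2005 baseline, as described above. Exam counts and person-years here
      # are invented placeholders, not data from the study.
      def rate_per_1000(exams, person_years):
          """Exams per 1000 person-years."""
          return exams / person_years * 1000.0

      ct_exams = {2005: 12000, 2006: 10200, 2014: 9000}       # hypothetical counts
      person_years = {2005: 80000, 2006: 82000, 2014: 90000}  # hypothetical denominators

      baseline = rate_per_1000(ct_exams[2005], person_years[2005])
      for year in sorted(ct_exams):
          normalized = rate_per_1000(ct_exams[year], person_years[year]) / baseline
          print(f"{year}: CT utilization at {normalized:.2f} of the 2005 rate")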

  1. Human papillomavirus prevalence among indigenous and non-indigenous Australian women prior to a national HPV vaccination program

    Directory of Open Access Journals (Sweden)

    Condon John R

    2011-09-01

    Background: Indigenous women in Australia have a disproportionate burden of cervical cancer despite a national cervical screening program. Prior to introduction of a national human papilloma virus (HPV) vaccination program, we determined HPV genotype prevalence by Indigenous status and residence in remote areas. Methods: We recruited women aged 17 to 40 years presenting to community-based primary health services for routine Pap screening across Australia. A liquid-based cytology (LBC) cervical specimen was tested for HPV DNA using the AMPLICOR HPV-DNA test and a PGMY09/11-based HPV consensus PCR; positive specimens were typed by reverse hybridization. We calculated age-adjusted prevalence by weighting to relevant population data, and determined predictors of HPV-DNA positivity by age, Indigenous status, and area of residence using logistic regression. Results: Of 2152 women (655 Indigenous), prevalence of the high-risk HPV genotypes was similar for Indigenous and non-Indigenous women (HPV 16 was 9.4% and 10.5%, respectively; HPV 18 was 4.1% and 3.8%, respectively) and did not differ by age group. In younger age groups, the prevalence of other genotypes also did not differ, but in those aged 31 to 40 years, HPV prevalence was higher for Indigenous women (35% versus 22.5%; P ... Conclusion: Although we found no difference in the prevalence of HPV16/18 among Australian women by Indigenous status or, for Indigenous women, residence in remote regions, differences were found in the prevalence of risk factors and some other HPV genotypes. This reinforces the importance of cervical screening as a complement to vaccination for all women, and the value of baseline data on HPV genotype prevalence by Indigenous status and residence for the monitoring of vaccine impact.
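
    The methods above mention age-adjusted prevalence obtained by weighting to relevant population data. The sketch below shows the usual direct-standardization arithmetic under that assumption; the age bands, prevalences, and weights are hypothetical and are not the study's data.

      # Minimal sketch of age-adjusted (directly standardized) prevalence,
      # i.e. weighting age-group prevalences to a reference population, as
      # mentioned in the methods above. All numbers are hypothetical.
      age_groups = ["17-24", "25-30", "31-40"]
      sample_prevalence = {"17-24": 0.15, "25-30": 0.11, "31-40": 0.08}   # hypothetical
      population_weights = {"17-24": 0.35, "25-30": 0.25, "31-40": 0.40}  # hypothetical; sums to 1

      adjusted = sum(sample_prevalence[g] * population_weights[g] for g in age_groups)
      print(f"Age-adjusted prevalence: {adjusted:.3f}")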

  2. 75 FR 49414 - Cooperative Agreements and Superfund State Contracts for Superfund Response Actions

    Science.gov (United States)

    2010-08-13

    ...-0276. FOR FURTHER INFORMATION CONTACT: Angelo Carasea, Assessment and Remediation Division, Office of Superfund Remediation and Technology Innovation, (5204P), Environmental Protection Agency, 1200 Pennsylvania... funds to a State, political subdivision, or Indian Tribe that assumes responsibility as the lead or...

  3. Superfund XV conference proceedings. Volume 1

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    This conference was held November 29 to December 1, 1994, in Washington, D.C. The purpose of this conference was to provide a forum for the exchange of state-of-the-art information on Superfund. Papers are included on the following topics: bioremediation; building decontamination; environmental policy issues; federal environmental restoration; groundwater remediation; innovative sampling and analytical technologies; laboratory methods; metals management; mixed wastes; PCB waste management; remediation technology and case studies; and risk assessment. Individual papers have been processed separately for inclusion in the appropriate databases.

  4. Feasibility study for the United Heckathorn Superfund Site, Richmond, California

    Energy Technology Data Exchange (ETDEWEB)

    Lincoff, A.H. [US Environmental Protection Agency, San Francisco, CA (United States). Region IX; Costan, G.P.; Montgomery, M.S.; White, P.J. [Pacific Northwest Lab., Richland, WA (United States)

    1994-07-01

    The United Heckathorn Superfund Site in Richmond, California, was used to formulate pesticides from approximately 1947 to 1966. Soils at the site and sediments in the harbor were contaminated with various chlorinated pesticides, primarily DDT, as a result of these activities. The US Environmental Protection Agency listed the site on the Superfund National Priorities List in 1990. This document is part of the Remedial Investigation and Feasibility Study phase of the Superfund response, which will provide the basis for selection of a final remedy that will protect human health and the environment and achieve compliance with federal and state environmental laws.

  5. Feasibility study for the United Heckathorn Superfund Site, Richmond, California

    International Nuclear Information System (INIS)

    Lincoff, A.H.

    1994-07-01

    The United Heckathorn Superfund Site in Richmond, California, was used to formulate pesticides from approximately 1947 to 1966. Soils at the site and sediments in the harbor were contaminated with various chlorinated pesticides, primarily DDT, as a result of these activities. The US Environmental Protection Agency listed the site on the Superfund National Priorities List in 1990. This document is part of the Remedial Investigation and Feasibility Study phase of the Superfund response, which will provide the basis for selection of a final remedy that will protect human health and the environment and achieve compliance with federal and state environmental laws.

  6. Optimization Review: Carson River Mercury Superfund Site, Carson City, Nevada

    Science.gov (United States)

    The Carson River Mercury Site (CRMS) (Figure 1) is located in northwest Nevada and was designated a Superfund site in 1990 because of elevated mercury concentrations observed in surface water, sediments and biota inhabiting the site.

  7. Strategy to Ensure Institutional Control Implementation at Superfund Sites

    Science.gov (United States)

    This document sets forth EPA’s strategy (Strategy) for ensuring that institutional controls (ICs) are successfully implemented at Superfund sites, with an emphasis on evaluating ICs at sites where construction of all remedies is complete (construction complete sites).

  8. Remediation System Evaluation, Savage Municipal Water Supply Superfund Site (PDF)

    Science.gov (United States)

    The Savage Municipal Water Supply Superfund Site, located on the western edge of Milford, New Hampshire, consists of a source area and an extended plume that is approximately 6,000 feet long and 2,500 feet wide.

  9. In-Depth Case Studies of Superfund Reuse

    Science.gov (United States)

    SRI’s in-depth case studies explore Superfund reuse stories from start to finish. Their purpose is to see what redevelopment strategies worked, acknowledge reuse barriers and understand how communities overcame the barriers to create new reuse outcomes.

  10. Privacy Impact Assessment for the Enforcement Superfund Tracking System

    Science.gov (United States)

    This Enforcement Superfund Tracking System (ESTS) collects publicly available information from the California Secretary of State on businesses. Learn how this data is collected, how it will be used, access to the data, and the purpose of data collection.

  11. Human Health Toxicity Values in Superfund Risk Assessments

    Science.gov (United States)

    This memorandum revises the hierarchy of human health toxicity values generally recommended for use in risk assessments, originally presented in Risk Assessment Guidance for Superfund, Volume I, Part A.

  12. Chromosomal aberrations in Sigmodon hispidus from a Superfund site

    International Nuclear Information System (INIS)

    Bowers, B.; McBee, K.; Lochmiller, R.; Burks, S.; Qualls, C.

    1995-01-01

    Cotton rats (Sigmodon hispidus) were collected from an EPA Superfund site located on an abandoned oil refinery. Three trapping grids were located on the refinery and three similar grids were located at uncontaminated localities which served as reference sites. Bone marrow metaphase chromosome preparations were examined for chromosomal damage. For each individual, 50 cells were scored for six classes of chromosomal lesions. For the fall 1991 trapping period, mean number of aberrant cells per individual was 2.33, 0.85, and 1.50 for the three Superfund grids. Mean number of aberrant cells per individual was 2.55, 2.55, and 2.12 from the reference grids. Mean number of lesions per cell was 2.77, 0.86, and 1.9 from the Superfund grids, and 3.55, 2.77, and 2.50 from the reference grids. For the spring 1992 trapping period, more damage was observed in animals from both Superfund and reference sites; however, animals from Superfund grids had more damage than animals from reference grids. Mean number of aberrant cells per individual was 3.50, 3.25, and 3.70 from the Superfund grids, and 2.40, 2.11, and 1.40 from the reference grids. Mean number of lesions per cell was 4.80, 4.25, and 5.50 from the Superfund grids, and 2.60, 2.33, and 1.50 from the reference grids. These data suggest animals may be more susceptible to chromosomal damage during winter months, and animals from the Superfund grids appear to be more severely affected than animals from reference grids.

  13. Blasting at a Superfund chemical waste site

    International Nuclear Information System (INIS)

    Burns, D.R.

    1991-01-01

    During the summer of 1989, Maine Drilling and Blasting of Gardiner, Maine, was contracted by Cayer Corporation of Harvard, Massachusetts, to drill and blast an interceptor trench at the Nyanza Chemical Superfund Site in Ashland, Massachusetts. The interceptor trench was to be 1,365 feet long and to be blasted out of granite. The trench was to be 12 feet wide at the bottom with 1/1 slopes, the deepest cut being 30 feet deep. A French drain 12 feet wide by 15 to 35 feet deep was blasted below the main trench on a 2% slope from its center to each end. A French drain is an excavation where the rock is blasted but not dug. The trench would be used as a perimeter road, with any ground water flow going through the French drain flowing to both ends of the trench. Being a Superfund project turned a simple blasting project into a regulatory nightmare. The US Environmental Protection Agency performed all the chemical-related functions on site. The US Army Corps of Engineers was overseeing all related excavation and construction on site, as were the Massachusetts Department of Environmental Quality Engineering, the local Hazardous Wastes Council, and the local Fire Department. All parties had some input on the blasting, and all issues had to be addressed. The paper outlines the project and how it was designed and completed. Also included is an outline of the blast plan to be submitted for approval, an outline of the Safety/Hazardous Waste training, and a description of all the problems raised by various regulatory agencies during the project.

  14. Centredale Manor Superfund Site in Rhode Island Included on EPA List of Sites Targeted for Immediate Attention

    Science.gov (United States)

    Today, the U.S. Environmental Protection Agency released the list of Superfund sites that Administrator Pruitt has targeted for immediate and intense attention. The Centredale Manor Restoration Project Superfund site is one of the 21 sites on the list.

  15. 78 FR 23563 - LWD, Inc. Superfund Site; Calvert City, Marshall County, Kentucky; Notice of Settlement

    Science.gov (United States)

    2013-04-19

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9805-2; CERCLA-04-2013-3751] LWD, Inc. Superfund Site... costs concerning the LWD, Inc., Superfund Site located in Calvert City, Marshall County, Kentucky. The... V. Painter. Submit your comments by Site name LWD, Inc., Superfund Site by one of the following...

  16. 78 FR 14543 - Ward Transformer Superfund Site; Raleigh, Wake County, NC; Notice of Settlement

    Science.gov (United States)

    2013-03-06

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL 9788-2; CERCLA-04-2013-3754] Ward Transformer Superfund Site... Ward Transformer Superfund Site located in Raleigh, Wake County, North Carolina. Under the terms of the.... Submit your comments by Site name Ward Transformer Superfund Site by one of the following methods: [[Page...

  17. An Examination on the Effect of Prior Knowledge, Personal Goals, and Incentive in an Online Employee Training Program

    Science.gov (United States)

    Zha, Shenghua; Adams, Andrea Harpine; Calcagno-Roach, Jamie Marie; Stringham, David A.

    2017-01-01

    This study explored factors that predicted learners' transformative learning in an online employee training program in a higher education institution in the U.S. A multivariate multiple regression analysis was conducted with a sample of 74 adult learners on their learning of a new learning management system. Four types of participants' behaviors…

  18. A Case Study of Prior Knowledge, Learning Approach and Conceptual Change in an Introductory College Chemistry Tutorial Program.

    Science.gov (United States)

    Braathen, Per Christian; Hewson, Peter W.

    This paper presents a case study involving a small group of students enrolled in a tutorial program learning introductory college chemistry. The underlying theoretical framework of this investigation was a constructivist view of learning, but more specifically it was based on Ausubel's theory of meaningful learning. The findings of this…

  19. Project Date SMART: a Dating Violence (DV) and Sexual Risk Prevention Program for Adolescent Girls with Prior DV Exposure.

    Science.gov (United States)

    Rizzo, Christie J; Joppa, Meredith; Barker, David; Collibee, Charlene; Zlotnick, Caron; Brown, Larry K

    2018-05-01

    This study assessed the initial feasibility, acceptability, and efficacy of an intervention aimed at reducing dating violence and sexual risk behavior in a sample of adolescent girls (ages 14-17) with prior exposure to physical dating violence (DV). One hundred and nine girls were randomly assigned to Date SMART (Skills to Manage Aggression in Relationships for Teens) or a Knowledge-only (KO) comparison group. Both intervention arms consisted of six weekly 2-h sessions and one "booster" session 6 weeks later. Based on principles of cognitive behavioral therapy, the Date SMART intervention was designed to target common underlying skills deficits linked to both DV and sexual risk behavior in adolescent females: depression, self-regulation deficits, and interpersonal skills deficits. Assessments were administered at four time points (baseline, 3, 6, and 9 months). The Date SMART group was effective in reducing sexual DV involvement across the 9-month follow-up period. Both groups evidenced clinically meaningful reductions in physical, emotional, and digital DV involvement, total time in dating relationships, as well as reductions in depression. Findings indicate that delivering a DV and sexual risk prevention intervention to DV-affected adolescent girls is feasible and well-received. Furthermore, a skills-based approach that addresses the co-occurrence of DV and sexual risk behavior may be particularly useful for promoting reductions of sexual DV among high-risk adolescent girls. A future, large-scale trial with an inactive comparison condition is needed to evaluate the efficacy of Date SMART further. Clinical Trials, NCT01326195, and http://www.clinicaltrials.gov.

  20. Superfund TIO videos: Set B. Community relations, communicating with the media and presenting technical information. Part 9. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape is divided into three sections. Section 1 discusses the Superfund Community Relations (CR) Program and its history and objectives. Community Relations requirements as defined by CERCLA for Superfund actions are outlined. Community Relations requirements, the nature of community involvement in CR plans, effective CR techniques, and the roles of the OSC, RPM, and EPA Community Relations Coordinator (CRC) are discussed. Section 2 (1) describes the media's perspective on seeking information; (2) identifies five settings and mechanisms for interacting with the media; (3) offers good media-relations techniques; and (4) lists tips for conducting media interviews. Section 3 outlines techniques for presenting technical information, describes how to be prepared to address typical issues of community concern, and identifies the four key elements in handling tough questions

  1. Remediation System Evaluation, Douglas Road Landfill Superfund Site

    Science.gov (United States)

    The Douglas Road Landfill Superfund Site is located in St. Joseph County just north of Mishawaka, Indiana. The site consists of a 16-acre capped landfill located on an approximately 32-acre lot (including the land purchased in 1999 for a wetlands...

  2. DECISION ANALYSIS OF INCINERATION COSTS IN SUPERFUND SITE REMEDIATION

    Science.gov (United States)

    This study examines the decision-making process of the remedial design (RD) phase of on-site incineration projects conducted at Superfund sites. Decisions made during RD affect the cost and schedule of remedial action (RA). Decision analysis techniques are used to determine the...

  3. PSYCHOLOGICAL FEATURES IN PATIENTS WITH CORONARY HEART DISEASE (MEN AND WOMEN) PRIOR TO CORONARY ARTERY BYPASS GRAFTING DEPENDING ON THEIR INVOLVEMENT IN THE INDIVIDUAL PSYCHO-CORRECTION PROGRAM

    Directory of Open Access Journals (Sweden)

    D. A. Starunskaya

    2017-01-01

    Importance: The study of the psychological characteristics of patients is important for creating and planning psychological correction and for improving the efficiency of the treatment of coronary heart disease. Purpose: This research is devoted to the study of the psychological features of patients with coronary heart disease (CHD) in the preoperative period, depending on their involvement in a psycho-correction program. Material and methods: We observed 30 patients with coronary heart disease before coronary bypass surgery. A clinical-psychological method (observation, conversation) and psychological testing were used. Results and conclusions: We found that patients who participated in the psycho-correction program had lower values of «anxiety», «phobic anxiety» and «obsessive-compulsive» symptoms. In both groups of patients, on average, the coping strategies «self-control» and «planning solution» predominated. Furthermore, «self-awareness» and «extraversion» were, on average, the most pronounced features in the personality structure of the surveyed patients. These features should be taken into account in planning psycho-correction programs for patients with CHD prior to CABG surgery.

  4. USA - Paper provided by the US delegation to the RWMC. Site Decontamination and Clean-up Under the U.S. EPA 'Superfund'

    International Nuclear Information System (INIS)

    2003-01-01

    Contaminated and hazardous waste sites, including nuclear facilities, may be subject to clean-up under the U.S. Environmental Protection Agency (EPA). The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), commonly known as 'Superfund', authorises EPA to respond to releases or threatened releases of hazardous substances, pollutants, or contaminants that may endanger public health or the environment. The legislation defines hazardous substances to include radiation. Entry into Superfund: The EPA may be notified of a site potentially requiring clean-up from any source. Potential sites are evaluated under a numerical hazard ranking system, and are then included on the clean-up list (the 'National Priorities List') if they meet an established threshold. Nuclear Facilities and Radioactively Contaminated Sites under Superfund: Any site may be subject to CERCLA action if EPA determines that it poses a hazard. There are three major types of sites that have been or are subject to action under this program: Federal nuclear facilities; decommissioned facilities; and privately-owned, unlicensed sites. Liabilities Under Superfund: The authorising legislation specifically provided for liability of persons responsible for releases of hazardous waste at uncontrolled sites. Liability under CERCLA is 'strict', 'retroactive', and 'joint and several'. Thus, the burden of proof for disproving liability is quite high, and the extent of the liability is not limited to the share of the waste or hazardous substance contributed by a party. The EPA may pursue liable parties to recover past and future costs associated with clean-up, including direct costs and indirect costs incurred by both EPA and its contractors. Clean-Up Levels: Clean-up goals and technologies are established on a site-specific basis. In general, clean-up goals must meet risk requirements and be consistent with applicable standards. Other factors such as community acceptance, volume reduction

  5. Superfund Removal Site Points, Region 9, 2012, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of CERCLA (Superfund) Removal sites. CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act)...

  6. Superfund at work: Hazardous waste cleanup efforts nationwide, fall 1992. (Wide Beach section of Brant, New York)

    International Nuclear Information System (INIS)

    1992-01-01

    Widespread contamination by polychlorinated biphenyls (PCBs) threatened the Wide Beach section of Brant, New York, a popular vacation resort. EPA's Superfund program effectively completed a permanent cleanup of Wide Beach in the span of one year. Other highlights included: a new and innovative technology to remove PCB contamination; reduction of PCBs to one-fifth of acceptable levels; temporary relocation of residents who were concerned for their health while cleanup activities took place; newly paved roads and driveways, re-landscaped yards, and a new storm sewer system; and restoration of ecologically sensitive wetlands. EPA's achievements significantly reduced PCB risks at Wide Beach and left a satisfied community in Brant.

  7. 40 CFR 35.4040 - How many groups can receive a TAG at one Superfund site?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment, ENVIRONMENTAL PROTECTION AGENCY... Eligible? § 35.4040 How many groups can receive a TAG at one Superfund site? (a) Only one TAG may be...

  8. 77 FR 11533 - Anniston PCB Superfund Site, Anniston, Calhoun County, Alabama; Notice of Amended Settlement

    Science.gov (United States)

    2012-02-27

    ... ENVIRONMENTAL PROTECTION AGENCY [CERCLA-04-2012-3763; FRL 9637-7] Anniston PCB Superfund Site... past response costs concerning the Anniston PCB Superfund Site located in Anniston, Calhoun County.... Submit your comments by Site name Anniston PCB by one of the following methods: www.epa.gov/region4...

  9. Smart moves in superfund - revitalization one year later. Volume 1, Number 3, January 1993. Bulletin

    International Nuclear Information System (INIS)

    1993-01-01

    This issue of the Smart Moves in Superfund bulletin series provides an update on the revitalization effort, highlighting National Priorities List (NPL) construction completions, accelerating cleanup, the Superfund Accelerated Cleanup Model, risk assessment/risk management, contracts management, enforcement policy/equity, interagency cooperation, public forums, and state meetings.

  10. 77 FR 16548 - Florida Petroleum Reprocessors Superfund Site; Davie, Broward County, FL; Notice of Settlements

    Science.gov (United States)

    2012-03-21

    ...-2012- 3766; CERCLA-04-2012-3765] Florida Petroleum Reprocessors Superfund Site; Davie, Broward County... costs concerning the Florida Petroleum Reprocessors Superfund Site located in Davie, Broward County.... Painter. Submit your comments by Site name Florida Petroleum Reprocessors by one of the following methods...

  11. 78 FR 729 - Ellman Battery Superfund Site; Orlando, Orange County, FL; Notice of Settlement

    Science.gov (United States)

    2013-01-04

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9767-6; CERCLA-04-2012-3780] Ellman Battery Superfund Site; Orlando, Orange County, FL; Notice of Settlement AGENCY: Environmental Protection Agency (EPA). ACTION... Action at the Ellman Battery Superfund Site located in Orlando, Orange County, Florida. DATES: The Agency...

  12. Remediation System Evaluation, McCormick and Baxter Superfund Site

    Science.gov (United States)

    The McCormick and Baxter Creosoting Company, Portland Plant, Superfund Site is located adjacent to the Willamette River in Portland, Oregon and addresses contamination of soil, groundwater, and river sediments stemming from creosoting operations...

  13. Restoration principles and criteria: Superfund programme policy for cleanup at radiation contaminated sites

    International Nuclear Information System (INIS)

    Shapiro, M.

    2000-01-01

    The Environmental Protection Agency (EPA) Office of Solid Waste and Emergency Response is responsible for implementing two key US laws regulating waste management and cleanup: the Resource Conservation and Recovery Act, and the Comprehensive Environmental Response, Compensation and Liability Act, CERCLA, nicknamed ''Superfund''. The purpose of the Superfund programme is to protect human health and the environment over the long term from releases or potential releases of hazardous substances from abandoned or uncontrolled hazardous waste sites. The focus of this paper is on Superfund, including how radiation is addressed by the Superfund programme. This paper provides a brief overview of the approach used by EPA to conduct Superfund cleanups at contaminated sites, including those that are contaminated with radionuclides, to ensure protection of human health and the environment. The paper addresses how EPA Superfund determines if a site poses a risk to human health and the framework used to determine cleanup levels. The theme emphasized throughout the paper is that within the Superfund remediation framework, radioactive contamination is dealt with in the same way as chemical contamination. (author)

  14. Efficient analysis using custom interactive visualization tools at a Superfund site

    International Nuclear Information System (INIS)

    Williams, G.; Durham, L.

    1992-01-01

    Custom visualization analysis programs were developed and used to analyze contaminant transport calculations from a three-dimensional numerical groundwater flow model developed for a Department of Energy Superfund site. The site hydrogeology, which is highly heterogeneous, includes both fractured limestone and dolomite and alluvium deposits. Three-dimensional interactive visualization techniques were used to understand and analyze the three-dimensional, double-porosity modeling results. A graphical object-oriented programming environment was applied to efficiently develop custom visualization programs in a coarse-grained data structure language. Comparisons were made, using the results from the three-dimensional, finite-difference model, between traditional two-dimensional analyses (contour and vector plots) and interactive three-dimensional techniques. Subjective comparison areas include the accuracy of analysis, the ability to understand the results of three-dimensional contaminant transport simulation, and the capability to transmit the results of the analysis to the project management. In addition, a quantitative comparison was made of the time required to develop a thorough analysis of the modeling results. The conclusions from the comparative study showed that the visualization analysis provided an increased awareness of the contaminant transport mechanisms, provided new insights into contaminant migration, and resulted in a significant time savings.

  15. Efficient analysis using custom interactive visualization tools at a Superfund site

    Energy Technology Data Exchange (ETDEWEB)

    Williams, G. [Northwestern Univ., Evanston, IL (United States); Durham, L. [Argonne National Lab., IL (United States)

    1992-12-01

    Custom visualization analysis programs were developed and used to analyze contaminant transport calculations from a three-dimensional numerical groundwater flow model developed for a Department of Energy Superfund site. The site hydrogeology, which is highly heterogeneous, includes both fractured limestone and dolomite and alluvium deposits. Three-dimensional interactive visualization techniques were used to understand and analyze the three-dimensional, double-porosity modeling results. A graphical object-oriented programming environment was applied to efficiently develop custom visualization programs in a coarse-grained data structure language. Comparisons were made, using the results from the three-dimensional, finite-difference model, between traditional two-dimensional analyses (contour and vector plots) and interactive three-dimensional techniques. Subjective comparison areas include the accuracy of analysis, the ability to understand the results of three-dimensional contaminant transport simulation, and the capability to transmit the results of the analysis to the project management. In addition, a quantitative comparison was made of the time required to develop a thorough analysis of the modeling results. The conclusions from the comparative study showed that the visualization analysis provided an increased awareness of the contaminant transport mechanisms, provided new insights into contaminant migration, and resulted in a significant time savings.

  16. Web-Based Alcohol Intervention in First-Year College Students: Efficacy of Full-Program Administration Prior to Second Semester.

    Science.gov (United States)

    Gilbertson, Rebecca J; Norton, Tina R; Beery, Susan H; Lee, Kassandra R

    2018-05-12

    Commercially available, web-based interventions for the prevention of alcohol use are being adopted for universal use with first-year college students, yet few have received empirical evaluation. This randomized controlled trial investigated the effectiveness of a novel, commercially available, personalized web-based alcohol intervention, Alcohol-Wise (version 4.0, 3rd Millennium Classrooms), on multiple measures of alcohol consumption, alcohol consequences, alcohol expectancies, academic achievement, and adaptation to college in first-year students. Participants received Alcohol-Wise either prior to first semester or were waitlisted and received the intervention second semester. As longitudinal effectiveness was of interest, follow-up surveys were conducted 10 weeks (n = 76) and 24 weeks (n = 64) following the web-based alcohol intervention. Completion of Alcohol-Wise had effects on academic achievement. Specifically, at the 24 week follow-up, academic achievement was higher in participants who received the intervention first semester of their freshman year as compared to the waitlist control. The incremental rise in heavy episodic drinking during the first semester of college was also reduced in waitlisted participants by Alcohol-Wise administration prior to second semester. Conclusion/Importance: Implications for the timing of web-based alcohol interventions to include administration prior to both first and second semesters of the freshman year are discussed.

  17. Guidance: Strategies to Achieve Timely Settlement and Implementation of RD/RA at Superfund Sites

    Science.gov (United States)

    Memorandum recommends strategies to encourage PRPs to enter into a settlement using the model RD/RA Consent Decree; discusses the current model UAO; and suggests practical alternatives to expedite Superfund settlements and the cleanup process.

  18. Renton's Quendall Terminals on List of EPA Superfund Sites Targeted for Immediate, Intense Attention

    Science.gov (United States)

    EPA released the list of Superfund sites that Administrator Pruitt has targeted for intense and immediate attention, including the Quendall Terminals Site, a former creosote facility on the shore of Lake Washington in Renton, Washington.

  19. Ensuring the adequacy of cost share provisions in superfund state contracts. Directive

    International Nuclear Information System (INIS)

    1993-01-01

    The memorandum requests regional offices to re-examine existing Superfund State Contracts (SSCs) for Fund-financed remedial actions to verify that they adequately reflect incurred and projected remedial action costs

  20. Cleanups In My Community (CIMC) - Base Realignment and Closure (BRAC) Superfund Sites, National Layer

    Data.gov (United States)

    U.S. Environmental Protection Agency — This data layer provides access to Base Realignment and Closure (BRAC) Superfund Sites as part of the CIMC web service. EPA works with DoD to facilitate the reuse...

  1. Cleanups In My Community (CIMC) - Federal facilities that are also Superfund sites, National Layer

    Data.gov (United States)

    U.S. Environmental Protection Agency — Federal facilities are properties owned by the federal government. This data layer provides access to Federal facilities that are Superfund sites as part of the CIMC...

  2. Towards identifying the next generation of superfund and hazardous waste site contaminants

    Science.gov (United States)

    Ela, Wendell P.; Sedlak, David L.; Barlaz, Morton A.; Henry, Heather F.; Muir, Derek C.G.; Swackhamer, Deborah L.; Weber, Eric J.; Arnold, Robert G.; Ferguson, P. Lee; Field, Jennifer A.; Furlong, Edward T.; Giesy, John P.; Halden, Rolf U.; Henry, Tala; Hites, Ronald A.; Hornbuckle, Keri C.; Howard, Philip H.; Luthy, Richard G.; Meyer, Anita K.; Saez, A. Eduardo; vom Saal, Frederick S.; Vulpe, Chris D.; Wiesner, Mark R.

    2011-01-01

    Background This commentary evolved from a workshop sponsored by the National Institute of Environmental Health Sciences titled "Superfund Contaminants: The Next Generation" held in Tucson, Arizona, in August 2009. All the authors were workshop participants.

  3. Constrained noninformative priors

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-10-01

    The Jeffreys noninformative prior distribution for a single unknown parameter is the distribution corresponding to a uniform distribution in the transformed model where the unknown parameter is approximately a location parameter. To obtain a prior distribution with a specified mean but with diffusion reflecting great uncertainty, a natural generalization of the noninformative prior is the distribution corresponding to the constrained maximum entropy distribution in the transformed model. Examples are given.
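
    As a sketch of the construction summarized above, using standard definitions rather than anything taken from the report itself: the Jeffreys prior is proportional to the square root of the Fisher information, it is flat in the transformed (approximately location) parameterization, and the constrained variant replaces that flat density with the maximum-entropy density satisfying the mean constraint, which takes an exponentially tilted form with the tilt chosen to match the specified mean.

        % Jeffreys prior: proportional to the square root of the Fisher information
        \pi_J(\theta) \propto \sqrt{I(\theta)}, \qquad
        I(\theta) = -\,\mathbb{E}_\theta\!\left[\tfrac{\partial^2}{\partial\theta^2}\log f(X \mid \theta)\right]

        % In the transformed scale \phi(\theta) = \int^{\theta}\sqrt{I(t)}\,dt the Jeffreys prior is flat.
        % Constrained sketch: maximize entropy in the \phi-scale subject to E[\theta] = \mu_0,
        % which yields an exponentially tilted density with \lambda set by the constraint:
        \pi_C(\phi) \propto \exp\{\lambda\,\theta(\phi)\}, \qquad
        \int \theta(\phi)\,\pi_C(\phi)\,d\phi = \mu_0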

  4. A method for estimating the local area economic damages of Superfund waste sites

    International Nuclear Information System (INIS)

    Walker, D.R.

    1992-01-01

    National Priority List (NPL) sites, more commonly called Superfund sites, are hazardous waste sites (HWS) deemed by the Environmental Protection Agency (EPA) to impose the greatest risks to human health or welfare or to the environment. HWS are placed and ranked for cleanup on the NPL based on a score derived from the Hazard Ranking System (HRS), which is a scientific assessment of the health and environmental risks posed by HWS. A concern with the HRS is that the ranking of sites is not based on benefit-cost analysis. The main objective of this dissertation is to develop a method for estimating the local area economic damages associated with Superfund waste sites. Secondarily, the model is used to derive county-level damage estimates for use in ranking the county-level damages from Superfund sites. The conceptual model used to describe the damages associated with Superfund sites is a household-firm location decision model. This model assumes that households and firms make their location choices based on the local level of wages, rents, and amenities. The model was empirically implemented using 1980 census microdata on households and workers in 253 counties across the US. The household sample includes data on the value and structural characteristics of homes. The worker sample includes the annual earnings of workers and a vector of worker attributes. The microdata were combined with county-level amenity data, including the number of Superfund sites. The hedonic pricing technique was used to estimate the effect of Superfund sites on average annual wages per household and on monthly expenditures on housing. The results show that Superfund sites impose statistically significant damages on households. The annual county damages from Superfund sites for a sample of 151 counties were over 14 billion dollars. The ranking of counties using the damage estimates is correlated with the rank of counties using the HRS.
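
    The hedonic pricing step described above can be sketched as a regression of (log) housing expenditure on structural attributes plus the count of Superfund sites in the county, with the coefficient on the site count read as the marginal implicit price of proximity to a site. The sketch below runs ordinary least squares on synthetic data; every variable name and number is an illustrative assumption, not the dissertation's actual specification.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500

        # Synthetic housing data (illustrative only)
        rooms = rng.integers(3, 9, size=n)        # number of rooms
        age = rng.integers(0, 60, size=n)         # age of structure, years
        sites = rng.integers(0, 5, size=n)        # Superfund sites in the county
        noise = rng.normal(0.0, 0.10, size=n)

        # Assume a "true" disamenity of -3% on monthly housing expenditure per site
        log_rent = 6.0 + 0.12 * rooms - 0.004 * age - 0.03 * sites + noise

        # Hedonic regression via ordinary least squares
        X = np.column_stack([np.ones(n), rooms, age, sites])
        coef, *_ = np.linalg.lstsq(X, log_rent, rcond=None)

        # The coefficient on the site count approximates the marginal implicit price
        print(f"Estimated effect per Superfund site: {coef[3]:.3f} log points")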

  5. Measurement and monitoring technologies are important SITE program component

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    An ongoing component of the Superfund Innovative Technology Evaluation (SITE) Program, managed by the US EPA at its Hazardous Waste Engineering Research Laboratory in Cincinnati, is the development and demonstration of new and innovative measurement and monitoring technologies that will be applicable to Superfund site characterization. There are four important roles for monitoring and measurement technologies at Superfund sites: (1) to assess the extent of contamination at a site, (2) to supply data and information to determine impacts to human health and the environment, (3) to supply data to select the appropriate remedial action, and (4) to monitor the success or effectiveness of the selected remedy. The Environmental Monitoring Systems Laboratory in Las Vegas, Nevada (EMSL-LV) has been supporting the development of improved measurement and monitoring techniques in conjunction with the SITE Program with a focus on two areas: immunoassays for toxic substances and fiber-optic sensing for in-situ analysis at Superfund sites.

  6. Post-remediation biomonitoring of pesticides and other contaminants in marine waters and sediment near the United Heckathorn Superfund Site, Richmond, California

    Energy Technology Data Exchange (ETDEWEB)

    LD Antrim; NP Kohn

    2000-05-26

    Marine sediment remediation at the United Heckathorn Superfund Site was completed in April 1997. Water and mussel tissues were sampled in February 1999 from four stations near Lauritzen Canal in Richmond, California, for Year 2 of post-remediation monitoring of marine areas near the United Heckathorn Site. Dieldrin and dichlorodiphenyl trichloroethane (DDT) were analyzed in water samples, tissue samples from resident mussels, and tissue samples from transplanted mussels deployed for 4 months. Concentrations of dieldrin and total DDT in water and total DDT in tissue were compared with Year 1 of post-remediation monitoring, and with preremediation data from the California State Mussel Watch program (tissues) and the Ecological Risk Assessment for the United Heckathorn Superfund Site (tissues and water). Mussel tissues were also analyzed for polychlorinated biphenyls (PCB), which were detected in sediment samples. Chlorinated pesticide concentrations in water samples were similar to preremediation levels and did not meet remediation goals. Mean dieldrin concentrations in water ranged from 0.62 ng/L to 12.5 ng/L and were higher than the remediation goal (0.14 ng/L) at all stations. Mean total DDT concentrations in water ranged from 14.4 ng/L to 62.3 ng/L and exceeded the remediation goal (0.59 ng/L) at all stations. The highest concentrations of both pesticides were found at the Lauritzen Canal/End station. Despite exceedance of the remediation goals, chlorinated pesticide concentrations in Lauritzen Canal water samples were notably lower in 1999 than in 1998. Tissue samples from biomonitoring organisms (mussels) provide an indication of the longer-term integrated exposure to contaminants in the water column, which overcomes the limitations of grab samples of water. Biomonitoring results indicated that the bioavailability of chlorinated pesticides has been reduced from preremediation levels both in the dredged area and throughout Richmond Harbor. Total DDT and

  7. Remediation of the Wells G & H Superfund Site, Woburn, Massachusetts.

    Science.gov (United States)

    Bair, E Scott; Metheny, Maura A

    2002-01-01

    Remediation of ground water and soil contamination at the Wells G & H Superfund Site, Woburn, Massachusetts, uses technologies that reflect differences in hydrogeologic settings, concentrations of volatile organic compounds (VOCs), and costs of treatment. The poorly permeable glacial materials that overlie fractured bedrock at the W.R. Grace property necessitate use of closely spaced recovery wells. Contaminated ground water is treated with hydrogen peroxide and ultraviolet (UV) oxidation. At UniFirst, a deep well completed in fractured bedrock removes contaminated ground water, which is treated by hydrogen peroxide, UV oxidation, and granular activated carbon (GAC). The remediation system at Wildwood integrates air sparging, soil-vapor extraction, and ground water pumping. Air stripping and GAC are used to treat contaminated water; GAC is used to treat contaminated air. New England Plastics (NEP) uses air sparging and soil-vapor extraction to remove VOCs from the unsaturated zone and shallow ground water. Contaminated air and water are treated using separate GAC systems. After nine years of operation at W.R. Grace and UniFirst, 30 and 786 kg, respectively, of VOCs have been removed. In three years of operation, 866 kg of VOCs have been removed at Wildwood. In 15 months of operation, 36 kg of VOCs were removed at NEP. Characterization work continues at the Olympia Nominee Trust, Whitney Barrel, Murphy Waste Oil, and Aberjona Auto Parts properties. Risk assessments are being finalized that address heavy metals in the floodplain sediments along the Aberjona River that are mobilized from the Industri-Plex Superfund Site located a few miles upstream.

  8. Superfund record of decision (EPA Region 7): Cherokee County Superfund Site, Cherokee County, KS, July 29, 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-08-01

    The decision document presents the selected remedial action for the mining wastes at Operable Unit No. 07 of the Galena Subsite, which is part of the Cherokee County Superfund Site in Cherokee County, Kansas. The selected remedy includes actions for residential soils impacted by mining wastes and includes: Excavation and disposal of residential soils impacted by mining wastes; Health education for the general community and medical professionals; Institutional controls to guide future development in residential areas impacted by mining wastes; Treatability studies to evaluate the effectiveness of phosphate stabilization as a future alternative; and Operation and maintenance of all remedy aspects including, but not limited to, health education, institutional controls, and long-term monitoring.

  9. Pilot-scale incineration of contaminated soils from the Drake Chemical Superfund Site. Final report

    International Nuclear Information System (INIS)

    King, C.; Lee, J.W.; Waterland, L.R.

    1993-03-01

    A series of pilot-scale incineration tests were performed at the U.S. Environmental Protection Agency's (EPA's) Incineration Research Facility to evaluate the potential of incineration as an option to treat contaminated soils from the Drake Chemical Superfund site in Lock Haven, Pennsylvania. The soils at the Drake site are reported to be contaminated to varying degrees with various organic constituents and several hazardous constituent trace metals. The purpose of the test program was to evaluate the incinerability of selected site soils in terms of the destruction of contaminant organic constituents and the fate of contaminant trace metals. All tests were conducted in the rotary kiln incineration system at the IRF. Test results show that greater than 99.995 percent principal organic hazardous constituent (POHC) destruction and removal efficiencies (DRE) can be achieved at kiln exit gas temperatures of nominally 816 C (1,500 F) and 538 C (1,000 F). Complete soil decontamination of semivolatile organics was achieved; however, kiln ash levels of three volatile organic constituents remained comparable to soil levels
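
    For reference, the destruction and removal efficiency figure quoted above is conventionally computed from the mass feed rate of a principal organic hazardous constituent and its mass emission rate in the stack gas (this is the standard definition, not a formula taken from the report):

        \mathrm{DRE} = \frac{W_{\mathrm{in}} - W_{\mathrm{out}}}{W_{\mathrm{in}}} \times 100\%

    where W_in is the mass feed rate of the POHC entering the incinerator and W_out is its mass emission rate in the stack gas; a DRE of 99.995 percent therefore corresponds to emitting no more than 0.005 percent of the POHC fed.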

  10. In-situ stabilization of the Geiger (C and M Oil) Superfund Site

    International Nuclear Information System (INIS)

    Andromalos, K.B.; Ameel, M.E.

    1994-01-01

    The Geiger (C and M Oil) Superfund Site is the first US Army Corps of Engineers-managed soil remediation project to utilize the in-situ stabilization/solidification technique to remediate the soil. This project involved the remediation of approximately 23,000 cubic yards of contaminated soil. Contaminants of concern included chromium, lead, PCBs, toluene, benzene, and other organic compounds. Clean-up criteria for the stabilized material were equal to the National Primary Drinking Water Regulations, when tested using the TCLP leachate extraction method. Chromium, lead, and toluene were the main contaminants of concern, with TCLP clean-up goals of 150, 15 and 1,000 parts per billion (ppb), respectively. This National Priorities List (NPL) site is located near Charleston, SC and was an abandoned waste oil facility that utilized unlined shallow trenches for the storage of waste oil. This paper summarizes the initial testing programs and the final production work at the site. Extensive testing was performed throughout all phases of the project for the purposes of mix optimization, quality assurance, and verification. Specific parameters tested included TCLP testing of organics, metals and PCBs, permeability testing, and unconfined compressive strength testing.

  11. Chemical dechlorination of pesticides at a superfund site in Region II

    International Nuclear Information System (INIS)

    Pendergrass, S.; Prince, J.

    1991-01-01

    Selecting technologies for cleaning up hazardous waste sites is a complex task, due in part to the rapidly changing nature of the state-of-the-art in technology. There is strong support for use of innovative technologies as specified in Section 121(b) of CERCLA. However, use of an innovative technology requires overcoming a variety of challenges. These challenges include: screening potentially appropriate technologies, including innovative technologies, and selecting one or more potential innovative technologies for which preliminary results are promising but for which site-specific data are needed prior to technology evaluation; evaluating the effectiveness of the proposed technology for the site through the use of treatability studies; gaining acceptance for the innovative technology, which may employ new or unfamiliar concepts; and determining optimal design and operating parameters for full-scale remediation. This paper discusses the technology evaluation process and how that process supported the selection of an innovative technology for the Myers Property site, a Superfund site in Region II. A case study is presented showing how technology screening and laboratory treatability studies were used to evaluate an innovative technology (chemical dechlorination), which was selected as the technology for remediation of soils and sediments contaminated with pesticides at this environmentally sensitive site in New Jersey. The remedy selected by the U.S. EPA for this site designates chemical dechlorination as the selected technology, but does not specify any particular vendor or process. Rather, the remedy sets forth technology performance standards and recommends certain design tasks which may be used to select a particular chemical process. This paper discusses these design tasks as they might apply to innovative technologies, using chemical dechlorination as a model.

  12. 78 FR 47317 - Ore Knob Mine Superfund Site; Laurel Springs, Ashe County, North Carolina; Notice of Settlement

    Science.gov (United States)

    2013-08-05

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9843-3; CERCLA-04-2013-3759] Ore Knob Mine Superfund Site; Laurel Springs, Ashe County, North Carolina; Notice of Settlement AGENCY: Environmental Protection Agency... settlement with Herbert N. Francis concerning the Ore Knob Mine Superfund Site located in Laurel Springs...

  13. The Prior-project

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Roued-Cunliffe, Henriette; Albretsen, Jørgen

    In this paper, we present a DH research infrastructure which relies heavily on a combination of domain knowledge with information technology. The general goal is to develop tools to aid scholars in their interpretations and understanding of temporal logic. This in turn is based on an extensive digitisation of Arthur Prior’s Nachlass kept in the Bodleian Library, Oxford. The DH infrastructure in question is the Prior Virtual Lab (PVL). PVL was established in 2011 in order to provide researchers in the field of temporal logic easy access to the papers of Arthur Norman Prior (1914-1969), and officially...

  14. Case history: Vertical barrier wall system for Superfund Site

    International Nuclear Information System (INIS)

    Koelling, M.A.; Kovac, C.P.; Norris, J.E.

    1997-01-01

    Design considerations and construction aspects are presented for the installation of a vertical barrier wall system for the Boeing Company at a Superfund Site near Seattle, WA. The construction was performed during 1996. The vertical barrier wall system included: (1) a soil-bentonite (SB) slurry wall, approximately 670 meters (2200 feet) in length, ranging from 12 to 21 meters (40 to 70 feet) in depth; (2) expansion of a cover system over the area enclosed by the SB wall; and (3) surface drainage improvements. Design and construction of the system addressed requirements of a Consent Decree for the site issued in 1993. The paper discusses the development of the design to meet remedial performance goals of preventing migration of contaminants in the soil/groundwater system and aiding aquifer restoration. Secondly, the paper details installation of the SB wall, highlighting the more significant construction issues, which included excavation of the wall through glacially deposited cobbles/boulders/till as well as addressing the severe elevation changes along the wall alignment. Thirdly, the paper presents Quality Assurance (QA) monitoring and testing performed during the construction phase

  15. A strategy for end point criteria for Superfund remediation

    International Nuclear Information System (INIS)

    Hwang, S.T.

    1992-06-01

    Since the inception of cleanup for hazardous waste sites, estimating target cleanup levels has been the subject of considerable investigation and debate in the Superfund remediation process. Establishing formal procedures for assessing human health risks associated with hazardous waste sites has provided a conceptual framework for determining remediation goals and target cleanup levels (TCLs) based on human health and ecological risk consideration. This approach was once considered at variance with the concept of the pre-risk assessment period; that is, cleaning up to the background level, or using containment design or best available control technologies. The concept has been gradually adopted by the regulatory agencies and the parties responsible for cleanup. Evaluation of cleanup strategies at the outset of the planning stage will eventually benefit the parties responsible for cleanup and the oversight organizations, including regulatory agencies. Development of the strategies will provide an opportunity to promote an improvement in the pace and quality of many activities to be carried out. The strategies should help address the issues related to (1) improving remediation management activities to arrive at remediation as expeditiously as possible, (2) developing alternate remediation management activities, (3) identifying obstructing issues to management for resolution, (4) adapting the existing framework to correspond to the change in remediation statutes and guidelines, and (5) providing the basis for evaluating options for the record of decision process. This paper will discuss some of the issues and the research efforts that were addressed as part of the strategies requiring future discussion and comment

  16. Estimating remediation costs for the Montclair radium superfund sites

    International Nuclear Information System (INIS)

    Turner, M.J.

    1995-01-01

    The Montclair/West Orange and Glen Ridge Superfund Sites, located in Essex County, NJ, are contaminated to varying degrees with radioactive materials. The waste originated from radium processing facilities prevalent in the area during the early 1900s. The design for remediation of these sites is managed by Bechtel National, Inc. on behalf of the United States Army Corps of Engineers, Kansas City District, which administers the project through an interagency agreement with the US Environmental Protection Agency (EPA). Design efforts for the project began in 1990. A portion of the scope, which is the topic of this article, was preparing the remediation cost estimates. These estimates were to be prepared from the detailed design packages; the Corps of Engineers required that the estimates be prepared using the Micro Computer-Aided Cost Estimating System (MCACES). This article discusses the design methods used, provides an overview of MCACES, and discusses the structure and preparation of the cost estimate and its uses. However, the main focus of the article is the methods used to generate the required project-specific cost estimate format for this project. 6 figs

  17. Arthur Prior and 'Now'

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2016-01-01

    Prior’s search led him through the work of Castañeda, and back to his own work on hybrid logic: the first made temporal reference philosophically respectable, the second made it technically feasible in a modal framework. With the aid of hybrid logic, Prior built a bridge from a two-dimensional UT calculus

  18. Prior Knowledge Assessment Guide

    Science.gov (United States)

    2014-12-01

    assessment in a reasonable amount of time. Hands-on assessments can be extremely diverse in makeup and administration depending on the subject matter...

  19. EPA (Environmental Protection Agency) Indoor-Air Quality Implementation Plan. A report to Congress under Title IV of the Superfund Amendments and Reauthorization Act of 1986: radon gas and indoor air-quality research. Final report

    International Nuclear Information System (INIS)

    1987-06-01

    The EPA Indoor Air Quality Implementation Plan provides information on the direction of EPA's indoor air program, including the Agency's policy on indoor air and priorities for research and information dissemination over the next two years. EPA submitted the report to Congress on July 2, 1987 as required by the Superfund Amendments and Reauthorization Act of 1986. There are five appendices to the report: Appendix A--Preliminary Indoor Air Pollution Information Assessment; Appendix B--FY 87 Indoor Air Research Program; Appendix C--EPA Radon Program; Appendix D--Indoor Air Resource History (Published with Appendix C); Appendix E--Indoor Air Reference Data Base

  20. Phase I Source Investigation, Heckathorn Superfund Site, Richmond, California

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P; Evans, Nathan R

    2002-12-18

    This report represents Phase I of a multi-phase approach to a source investigation of DDT at the Heckathorn Superfund Site, Richmond, California, the former site of a pesticide packaging plant, and the adjacent waterway, the Lauritzen Channel. Potential identified sources of contamination were from sloughed material from undredged areas (such as side banks) and from outfall pipes. Objectives of Phase I included the (1) evaluation of pesticide concentrations associated with discharge from outfalls, (2) identification of additional outfalls in the area, (3) identification of type, quantity, and distribution of sediment under the Levin pier, (4) quantification of pesticide concentrations in sediment under the pier, and (5) evaluation of sediment structure and slope stability under the pier. Field operations included the collection of sediment directly from inside the mouths of outfall pipes, when possible, or the deployment of specially designed particle traps where direct sampling was problematic. Passive water samplers were placed at the end of known outfall pipes and analyzed for DDT and other pesticides of concern. Underwater dive surveys were conducted beneath the Levin pier to document type, slope, and thickness of sediment. Samples were collected at locations of interest and analyzed for contaminants. Also sampled was soil from bank areas, which were suspected of potentially contributing to continued DDT contamination of the Lauritzen Channel through erosion and groundwater leaching. The Phase I Source Investigation was successful in identifying significant sources of DDT contamination to Lauritzen Channel sediment. Undredged sediment beneath the Levin pier that has been redistributed to the channel is a likely source. Two outfalls tested bear further investigation. Not as well-defined are the contributions of bank erosional material and groundwater leaching. Subsequent investigations will be based on the results of this first phase.

  1. Residential cancer cluster investigation nearby a Superfund Study Area with trichloroethylene contamination.

    Science.gov (United States)

    Press, David J; McKinley, Meg; Deapen, Dennis; Clarke, Christina A; Gomez, Scarlett Lin

    2016-05-01

    Trichloroethylene (TCE) is an industrial solvent associated with liver cancer, kidney cancer, and non-Hodgkin's lymphoma (NHL). It is unclear whether an excess of TCE-associated cancers has occurred surrounding the Middlefield-Ellis-Whisman Superfund site in Mountain View, California. We conducted a population-based cancer cluster investigation comparing the incidence of NHL, liver, and kidney cancers in the neighborhood of interest to the incidence among residents in the surrounding four-county region. Case counts and address information were obtained using routinely collected data from the Greater Bay Area Cancer Registry, part of the Surveillance, Epidemiology, and End Results program. Population denominators were obtained from the 1990, 2000, and 2010 US censuses. Standardized incidence ratios (SIRs) with two-sided 99% confidence intervals (CIs) were calculated for time intervals surrounding the US Censuses. There were no statistically significant differences between the neighborhood of interest and the larger region for cancers of the liver or kidney. A statistically significant elevation was observed for NHL during one of the three time periods evaluated (1996-2005: SIR = 1.8, 99% CI 1.1-2.8). No statistically significant NHL elevation existed in the earlier 1988-1995 (SIR = 1.3, 99% CI 0.5-2.6) or later 2006-2011 (SIR = 1.3, 99% CI 0.6-2.4) periods. There is no evidence of an increased incidence of liver or kidney cancer, and there is a lack of evidence of a consistent, sustained, or more recent elevation in NHL occurrence in this neighborhood. This evaluation included existing cancer registry data, which cannot speak to specific exposures incurred by past or current residents of this neighborhood.
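
    A standardized incidence ratio of the kind reported above is the observed case count divided by the count expected from regional rates, and an exact Poisson confidence interval for it is commonly obtained from chi-square quantiles. The sketch below shows the calculation in Python; the observed and expected counts are placeholders, not the registry's data.

        from scipy.stats import chi2

        def sir_with_ci(observed, expected, alpha=0.01):
            """Standardized incidence ratio with an exact Poisson confidence interval."""
            sir = observed / expected
            # Exact Poisson limits for the observed count, scaled by the expected count
            lower = chi2.ppf(alpha / 2, 2 * observed) / 2 / expected if observed > 0 else 0.0
            upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2 / expected
            return sir, lower, upper

        # Placeholder counts: 36 observed NHL cases versus 20 expected from regional rates
        print(sir_with_ci(observed=36, expected=20))   # two-sided 99% interval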

  2. Measurement of volatile organic compounds during start-up of bioremediation of the French Limited Superfund Site in Crosby, Texas, using wind-dependent whole-air sampling

    International Nuclear Information System (INIS)

    Pleil, J.D.; Fortune, C.R.; Yoong, M.; Oliver, K.D.

    1993-01-01

    Whole-air sampling was performed before and after the start-up of the bioremediation of an industrial (primarily petrochemical) waste lagoon in Crosby, Texas, near Houston. Four 'Sector Samplers' were deployed at the four corners of the French Limited Superfund Site. These samplers collect air into one of two SUMMA polished canisters depending upon wind direction and speed. When the wind blows at the sampler from across the waste lagoon, air is routed to the 'IN' sector canister; otherwise, the sample is collected in the 'OUT' sector canister. As such, each sampler provides its own background sample, and, upon gas chromatographic analysis, individual compounds can be associated with the waste lagoon. Five sets of 24-hour sector samples were taken; the first set was collected prior to the start of the bioremediation effort and the remaining four sets were taken sequentially for four 24-hour periods after the start-up of the procedure.

  3. Sets of priors reflecting prior-data conflict and agreement

    NARCIS (Netherlands)

    Walter, G.M.; Coolen, F.P.A.; Carvalho, J.P.; Lesot, M.-J.; Kaymak, U.; Vieira, S.; Bouchon-Meunier, B.; Yager, R.R.

    2016-01-01

    Bayesian inference enables combination of observations with prior knowledge in the reasoning process. The choice of a particular prior distribution to represent the available prior knowledge is, however, often debatable, especially when prior knowledge is limited or data are scarce, as then
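    To make the idea concrete, the following toy sketch (not taken from the paper) represents limited prior knowledge by a small set of conjugate Beta priors, indexed by a grid of illustrative prior means and prior strengths, and shows that the spread of posterior means stays narrow when the data agree with the priors but widens noticeably under prior-data conflict.

        # Toy set-of-priors sketch for a Bernoulli proportion (all numbers illustrative)
        import itertools

        def posterior_mean_range(successes, trials, means=(0.2, 0.4), strengths=(2, 8)):
            """Range of posterior means over a finite set of Beta(n0*m, n0*(1-m)) priors."""
            post = [(n0 * m + successes) / (n0 + trials)
                    for m, n0 in itertools.product(means, strengths)]
            return min(post), max(post)

        print(posterior_mean_range(3, 10))   # data agree with the priors: narrow range
        print(posterior_mean_range(9, 10))   # prior-data conflict: clearly wider range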

  4. A Case Study of Peer Educators in a Community-Based Program to Reduce Teen Pregnancy: Selected Characteristics Prior to Training, Perceptions of Training and Work, and Perceptions of How Participation in the Program Has Affected Them

    Science.gov (United States)

    Beshers, Sarah C.

    2007-01-01

    This investigation is a case study of peer educators in a community-based teen pregnancy prevention program. Research questions focused on identifying ways in which peer educators differed from other teens and exploring the perceptions of the peer educators about their experience in the program and the ways in which it has affected them. Data were…

  5. Prior indigenous technological species

    Science.gov (United States)

    Wright, Jason T.

    2018-01-01

    One of the primary open questions of astrobiology is whether there is extant or extinct life elsewhere in the solar system. Implicit in much of this work is that we are looking for microbial or, at best, unintelligent life, even though technological artefacts might be much easier to find. Search for Extraterrestrial Intelligence (SETI) work on searches for alien artefacts in the solar system typically presumes that such artefacts would be of extrasolar origin, even though life is known to have existed in the solar system, on Earth, for eons. But if a prior technological, perhaps spacefaring, species ever arose in the solar system, it might have produced artefacts or other technosignatures that have survived to the present day, meaning solar system artefact SETI provides a potential path to resolving astrobiology's question. Here, I discuss the origins and possible locations for technosignatures of such a prior indigenous technological species, which might have arisen on ancient Earth or another body, such as a pre-greenhouse Venus or a wet Mars. In the case of Venus, the arrival of its global greenhouse and potential resurfacing might have erased all evidence of its existence on the Venusian surface. In the case of Earth, erosion and, ultimately, plate tectonics may have erased most such evidence if the species lived Gyr ago. Remaining indigenous technosignatures might be expected to be extremely old, limiting the places they might still be found to beneath the surfaces of Mars and the Moon, or in the outer solar system.

  6. Optimization Evaluation: Lee Chemical Superfund Site, City Of Liberty, Clay County, Missouri

    Science.gov (United States)

    The Lee Chemical Superfund Site (site) is located along Missouri Highway 210 in Liberty, Missouri, approximately 15 miles east of Kansas City, Missouri. Currently, the site is a vacant lot of approximately 2.5 acres in a flat alluvial plain.

  7. EMERGING TECHNOLOGY BULLETIN: RECLAMATION OF LEAD FROM SUPERFUND WASTE MATERIAL USING SECONDARY LEAD SMELTERS

    Science.gov (United States)

    This process involves incorporating lead-contaminated Superfund waste with the regular feed to a secondary lead smelter. Since secondary lead smelters already recover lead from recycled automobile batteries, it seems likely that this technology could be used to treat waste from ...

  8. Private-Sector Cleanup Expenditures and Transaction Costs at 18 Superfund Sites (1993)

    Science.gov (United States)

    Superfund allows the government either to clean up a site and recover its cost from the potentially responsible parties (PRPs) or to require the PRPs to undertake the cleanup themselves. This study examines private-sector expenditures and transaction costs.

  9. 75 FR 81269 - Ward Transformer Superfund Site Raleigh, Wake County, NC; Notice of Settlements

    Science.gov (United States)

    2010-12-27

    ... ENVIRONMENTAL PROTECTION AGENCY [Docket EPA-RO4-SFUND-2010-1053, FRL-9243-2] Ward Transformer... entered into five settlements for reimbursement of past response costs concerning the Ward Transformer... Docket ID No. EPA-RO4- SFUND-2010-1053 or Site name Ward Transformer Superfund Site by one of the...

  10. 77 FR 13603 - Anniston PCB Superfund Site; Anniston, Calhoun County, AL; Correction

    Science.gov (United States)

    2012-03-07

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9644-2; CERCLA-04-2012-3763] Anniston PCB Superfund Site... FR 11533 (FRL-9637-7), EPA posted a Notice of Amended Settlement concerning the Anniston PCB... the settlement are available from Ms. Paula V. Painter. Submit your comments by Site name Anniston PCB...

  11. 78 FR 76143 - Proposed CERCLA Settlement Relating to the Paul's Tank Cleaning Service Superfund Site...

    Science.gov (United States)

    2013-12-16

    ... Paul's Tank Cleaning Service Superfund Site, Burlington County, New Jersey AGENCY: Environmental.... (``Settling Party''). The Settling Party is a potentially responsible party, pursuant to Section 107(a) of CERCLA, and thus is potentially liable for response costs incurred at or in connection Paul's Tank...

  12. A General Chemistry Assignment Analyzing Environmental Contamination for the Depue, IL, National Superfund Site

    Science.gov (United States)

    Saslow Gomez, Sarah A.; Faurie-Wisniewski, Danielle; Parsa, Arlen; Spitz, Jeff; Spitz, Jennifer Amdur; Loeb, Nancy C.; Geiger, Franz M.

    2015-01-01

    The classroom exercise outlined here is a self-directed assignment that connects students to the environmental contamination problem surrounding the DePue Superfund site. By connecting chemistry knowledge gained in the classroom with a real-world problem, students are encouraged to personally connect with the problem while simultaneously…

  13. 77 FR 21433 - Regulated Navigation Area; Pacific Sound Resources and Lockheed Shipyard EPA Superfund Cleanup...

    Science.gov (United States)

    2012-04-10

    ... superfund cleanup remediation efforts. This RNA will prohibit activities that would disturb the seabed, such... or capped are arsenic, copper, lead, mercury, zinc, PAHs and PCBs. The metal contaminants were... installed in the designated regulated navigation area, pursuant to the remediation efforts of the U.S...

  14. Remediation System Evaluation, Tutu Wellfield Superfund Site, St. Thomas, U.S. Virgin Islands

    Science.gov (United States)

    The Tutu Wellfield Superfund Site is a 1.5 square mile site located on the eastern end of St. Thomas, U.S. Virgin Islands (USVI) within the upper Turpentine Run surface drainage basin in the Anna’s Retreat area.

  15. 75 FR 53694 - Florida Petroleum Reprocessors Superfund Site; Davie, Broward County, FL; Notice of Settlement

    Science.gov (United States)

    2010-09-01

    ... ENVIRONMENTAL PROTECTION AGENCY [Docket EPA-RO4-SFUND-2010-0729, FRL-9196-1] Florida Petroleum... entered into a settlement for reimbursement of past response costs concerning the Florida Petroleum... No. EPA-RO4- SFUND-2010-0729 or Site name Florida Petroleum Reprocessors Superfund Site by one of the...

  16. Long-Term Groundwater Monitoring Optimization, Clare Water Supply Superfund Site, Permeable Reactive Barrier and Soil Remedy Areas, Clare, Michigan

    Science.gov (United States)

    This report contains a review of the long-term groundwater monitoring network for the Permeable Reactive Barrier (PRB) and Soil Remedy Areas at the Clare Water Supply Superfund Site in Clare, Michigan.

  17. Issuance of Final Guidance: Ecological Risk Assessment and Risk Management Principles for Superfund Sites, October 7, 1999

    Science.gov (United States)

    This guidance is intended to help Superfund risk managers make ecological risk management decisions that are based on sound science, consistent across Regions, and present a characterization of site risks that is transparent to the public.

  18. Prior Elicitation, Assessment and Inference with a Dirichlet Prior

    Directory of Open Access Journals (Sweden)

    Michael Evans

    2017-10-01

    Methods are developed for eliciting a Dirichlet prior based upon stating bounds on the individual probabilities that hold with high prior probability. This approach to selecting a prior is applied to a contingency table problem where it is demonstrated how to assess the prior with respect to the bias it induces as well as how to check for prior-data conflict. It is shown that the assessment of a hypothesis via relative belief can easily take into account what it means for the falsity of the hypothesis to correspond to a difference of practical importance and provide evidence in favor of a hypothesis.
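    As a rough illustration of the elicitation idea, and not the authors' procedure, the sketch below centres a Dirichlet prior on the midpoints of stated bounds for each cell probability and raises the total concentration until every marginal Beta distribution places a chosen prior probability (0.95 here) inside its bounds; the bounds and the coverage level are invented for the example.

        # Hedged Dirichlet-elicitation sketch (illustrative bounds and coverage)
        import numpy as np
        from scipy.stats import beta

        def elicit_dirichlet(bounds, coverage=0.95, max_conc=10_000):
            mids = np.array([(lo + hi) / 2 for lo, hi in bounds])
            mids = mids / mids.sum()                    # normalise to a probability vector
            for s in range(1, max_conc):                # total concentration alpha_0
                alpha = s * mids
                ok = all(beta.cdf(hi, a, alpha.sum() - a) - beta.cdf(lo, a, alpha.sum() - a) >= coverage
                         for (lo, hi), a in zip(bounds, alpha))
                if ok:
                    return alpha                        # marginals satisfy the stated bounds
            raise ValueError("no Dirichlet found; bounds may be too tight or inconsistent")

        # Illustrative bounds on three cell probabilities
        print(elicit_dirichlet([(0.1, 0.4), (0.2, 0.5), (0.2, 0.6)]))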

  19. Biomonitoring for metal contamination near two Superfund sites in Woburn, Massachusetts, using phytochelatins

    International Nuclear Information System (INIS)

    Gawel, James E.; Hemond, Harold F.

    2004-01-01

    Characterizing the spatial extent of groundwater metal contamination traditionally requires installing sampling wells, an expensive and time-consuming process in urban areas. Moreover, extrapolating biotic effects from metal concentrations alone is problematic, making ecological risk assessment difficult. Our study is the first to examine the use of phytochelatin measurements in tree leaves for delimiting biological metal stress in shallow, metal-contaminated groundwater systems. Three tree species (Rhamnus frangula, Acer platanoides, and Betula populifolia) growing above the shallow groundwater aquifer of the Aberjona River watershed in Woburn, Massachusetts, display a pattern of phytochelatin production consistent with known sources of metal contamination and groundwater flow direction near the Industri-Plex Superfund site. Results also suggest the existence of a second area of contaminated groundwater and elevated metal stress near the Wells G and H Superfund site downstream, in agreement with a recent EPA ecological risk assessment. Possible contamination pathways at this site are discussed

  20. Assessment of technologies for the remediation of radioactively contaminated Superfund sites

    International Nuclear Information System (INIS)

    1990-01-01

    The report is a screening evaluation of information needs for the development of generic treatability studies for the remediation of Superfund Radiation Sites on the National Priorities List (NPL). It presents a categorization of the 25 radiation sites currently proposed or listed on the NPL, and provides a rating system for evaluating technologies that may be used to remediate these sites. It also identifies gaps in site assessment and technology data and provides information about and recommendations for technology development

  1. Electrochemical peroxidation of PCBs and VOCs in superfund site water and sediments

    Energy Technology Data Exchange (ETDEWEB)

    Scrudato, R.J.; Chiarenzelli, J.R. [SUNY, Oswego, NY (United States)

    1996-12-31

    An electrochemical peroxidation (ECP) process has been developed and used to degrade polychlorinated biphenyl (PCB)- and volatile organic compound (VOC)-contaminated water, sludge, and sediments at a New York State Federal and State Superfund Site. The process involves passing an oscillating low-amperage (<10 amps) current through steel electrodes immersed in an acidified water or sediment slurry into which hydrogen peroxide (<1,000 ppm) is added. The generated free radicals attack organic compounds, including organo-metallic complexes and refractory compounds including PCBs. PCB degradation ranged from about 30% to 80% in experiments involving Federal Superfund Site sediments; total PCBs were reduced by approximately 97% and 68%, respectively, in water and slurry collected from a State Superfund subsurface storage tank. VOC bench-scale experiments involved chloroethane, 1,1-dichloroethane, dichloromethane, 1,1,1-trichloroethane, and acetone; after a 3-min ECP treatment, degradation ranged from >94% to about 99.9%. Results indicate the ECP is a viable process to degrade organic contaminants in water and sediment suspensions. Because the treated water suspensions are acidified, select trace metal sorbed to the particulates is solubilized and therefore can be segregated from the particulates, offering a process that simultaneously degrades organic contaminants and separates trace metals. 19 refs., 1 fig., 4 tabs.

  2. Accommodating Uncertainty in Prior Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
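    A minimal numerical illustration of the point (not the authors' method): in a conjugate Normal model, sweeping the centre of an informative prior across a plausible range moves the posterior mean by more than the posterior standard deviation reported under any single prior, so the apparent precision of any one analysis overstates what is actually known. All values below are made up.

        # Sensitivity of a Normal-model posterior to uncertainty in an informative prior
        def posterior(mu0, tau0, ybar, sigma, n):
            """Posterior mean and sd for a Normal mean with a Normal(mu0, tau0^2) prior."""
            prec = 1 / tau0**2 + n / sigma**2
            mean = (mu0 / tau0**2 + n * ybar / sigma**2) / prec
            return mean, prec**-0.5

        ybar, sigma, n = 5.0, 2.0, 10
        for mu0 in (0.0, 2.0, 4.0):                   # equally plausible prior centres
            m, s = posterior(mu0, tau0=1.0, ybar=ybar, sigma=sigma, n=n)
            print(f"prior mean {mu0:.1f} -> posterior {m:.2f} +/- {s:.2f}")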

  3. The Prior Internet Resources 2017

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Albretsen, Jørgen

    2017-01-01

    The Prior Internet Resources (PIR) are presented. Prior’s unpublished scientific manuscripts and his vast letter correspondence with fellow researchers at the time, his Nachlass, are now subject to transcription by Prior-researchers worldwide, and form an integral part of PIR. It is demonstrated...

  4. The Importance of Prior Knowledge.

    Science.gov (United States)

    Cleary, Linda Miller

    1989-01-01

    Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)

  5. Recruiting for Prior Service Market

    Science.gov (United States)

    2008-06-01

    Report documentation excerpt (01 JUN 2008): "Recruiting for Prior Service Market," MAJ Eric Givens / MAJ Brian [name truncated]. Stated objectives include identifying perceptions, expectations and issues for re-enlistment and developing potential marketing and advertising tactics and strategies targeted to the defined...

  6. Regional economic impact assessment: Evaluating remedial alternatives for the Portland Harbor Superfund Site, Portland, Oregon, USA.

    Science.gov (United States)

    Harrison, David; Coughlin, Conor; Hogan, Dylan; Edwards, Deborah A; Smith, Benjamin C

    2018-01-01

    The present paper describes a methodology for evaluating impacts of Superfund remedial alternatives on the regional economy in the context of a broader sustainability evaluation. Although economic impact methodology is well established, some applications to Superfund remedial evaluation have created confusion because of seemingly contradictory results. This confusion arises from failure to be explicit about 2 opposing impacts of remediation expenditures: 1) positive regional impacts of spending additional money in the region and 2) negative regional impacts of the need to pay for the expenditures (and thus forgo other expenditures in the region). The present paper provides a template for economic impact assessment that takes both positive and negative impacts into account, thus providing comprehensive estimates of net impacts. The paper also provides a strategy for identifying and estimating major uncertainties in the net impacts. The recommended methodology was applied at the Portland Harbor Superfund Site, located along the Lower Willamette River in Portland, Oregon, USA. The US Environmental Protection Agency (USEPA) developed remedial alternatives that it estimated would cost up to several billion dollars, with construction durations possibly lasting decades. The economic study estimated regional economic impacts, measured in terms of gross regional product (GRP), personal income, population, and employment, for 5 of the USEPA alternatives relative to the "no further action" alternative. Integr Environ Assess Manag 2018;14:32-42. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
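    The net-impact logic described above can be reduced to a very small calculation. The sketch below is a toy illustration, not the study's regional model: it nets the positive impact of remediation spending that stays in the region against the negative impact of paying for the work (forgone regional spending), and every share and multiplier in it is hypothetical.

        # Toy net regional impact calculation (all shares and multipliers hypothetical)
        def net_regional_impact(remedial_cost,
                                share_spent_in_region=0.6, spending_multiplier=1.5,
                                share_paid_by_region=0.8, payment_multiplier=1.2):
            positive = remedial_cost * share_spent_in_region * spending_multiplier
            negative = remedial_cost * share_paid_by_region * payment_multiplier
            return positive - negative

        # e.g., a $1 billion remedial alternative under these assumptions
        print(net_regional_impact(1_000_000_000))   # negative value => net regional loss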

  7. Aquatic assessment of the Ely Copper Mine Superfund site, Vershire, Vermont

    Science.gov (United States)

    Seal, Robert R.; Kiah, Richard G.; Piatak, Nadine M.; Besser, John M.; Coles, James F.; Hammarstrom, Jane M.; Argue, Denise M.; Levitan, Denise M.; Deacon, Jeffrey R.; Ingersoll, Christopher G.

    2010-01-01

    The Ely Mine, which operated from 1821 to 1905, and its area of downstream impact constitute the Ely Copper Mine Superfund site. The site was placed on the National Priorities List in 2001. The mine comprises underground workings, foundations from historical structures, several waste-rock piles, roast beds associated with the smelting operation, and slag piles resulting from the smelting. The mine site is drained by Ely Brook, which includes several tributaries, one of which drains a series of six ponds. Ely Brook empties into Schoolhouse Brook, which flows 3.3 kilometers and joins the Ompompanoosuc River.

  8. Assessment of international remedial technologies for application to Superfund sites

    International Nuclear Information System (INIS)

    Sanning, D.E.

    1990-01-01

    This paper presents some of the logical arguments for conducting research on remedial technologies for contaminated land and groundwater at an international level. It gives information on many of the international organizations that are involved in environmental programs, but it especially gives emphasis to the NATO-CCMS pilot study on Demonstration of Remedial Action Technologies for Contaminated Land and Groundwater. The purpose of the study is to field demonstrate and evaluate new/innovative technologies for remedial action at uncontrolled hazardous waste sites. This study is a logical international extension of the US EPA SITE program. It offers the opportunity to obtain a multiple data base on various remedial action unit processes without any single country having to commit a disproportionate amount of its internal resources to any specific activity. Each participating country provides the necessary resources for those demonstrations which they are contributing to the study. Sites are selected by a majority vote of all participating countries (no country is permitted to vote for its own sites). The study is a 5 year program with participants from Canada, Denmark, Federal Republic of Germany, France, Greece, Italy, Japan, the Netherlands, Norway, Spain, and the US. The need for cost-effective remedial action technologies for hazardous waste sites is a problem of all industrialized countries. The need to build a knowledge base of emerging remedial technologies was the impetus behind the USEPA's lead role and commitment to this pilot study

  9. EPA RREL's mobile volume reduction unit advances soil washing at four Superfund sites

    International Nuclear Information System (INIS)

    Gaire, R.; Borst, M.

    1994-01-01

    Research testing of the U.S. Environmental Protection Agency (EPA) Risk Reduction Engineering Laboratory's (RREL) Volume Reduction Unit (VRU) produced data helping advance soil washing as a remedial technology for contaminated soils. Based on research at four Superfund sites, each with a different matrix of organic contaminants, EPA evaluated the soil washing technology and provided information to forecast realistic, full-scale remediation costs. Primarily a research tool, the VRU is RREL's mobile test unit for investigating the breadth of this technology. During a Superfund Innovative Technology Evaluation (SITE) Demonstration at Escambia Wood Treating Company Site, Pensacola, FL, the VRU treated soil contaminated with pentachlorophenol (PCP) and polynuclear aromatic hydrocarbon-laden creosote (PAH). At Montana Pole and Treatment Plant Site, Butte, MT, the VRU treated soil containing PCP mixed with diesel oil (measured as total petroleum hydrocarbons) and a trace of dioxin. At Dover Air Force Base Site, Dover, DE, the VRU treated soil containing JP-4 jet fuel, measured as TPHC. At Sand Creek Site, Commerce City, CO, the feed soil at this site was contaminated with two pesticides: heptachlor and dieldrin. Less than 10 percent of these pesticides remained in the treated coarse soil fractions.

  10. Toward identifying the next generation of superfund and hazardous waste site contaminants.

    Science.gov (United States)

    Ela, Wendell P; Sedlak, David L; Barlaz, Morton A; Henry, Heather F; Muir, Derek C G; Swackhamer, Deborah L; Weber, Eric J; Arnold, Robert G; Ferguson, P Lee; Field, Jennifer A; Furlong, Edward T; Giesy, John P; Halden, Rolf U; Henry, Tala; Hites, Ronald A; Hornbuckle, Keri C; Howard, Philip H; Luthy, Richard G; Meyer, Anita K; Sáez, A Eduardo; Vom Saal, Frederick S; Vulpe, Chris D; Wiesner, Mark R

    2011-01-01

    This commentary evolved from a workshop sponsored by the National Institute of Environmental Health Sciences titled "Superfund Contaminants: The Next Generation" held in Tucson, Arizona, in August 2009. All the authors were workshop participants. Our aim was to initiate a dynamic, adaptable process for identifying contaminants of emerging concern (CECs) that are likely to be found in future hazardous waste sites, and to identify the gaps in primary research that cause uncertainty in determining future hazardous waste site contaminants. Superfund-relevant CECs can be characterized by specific attributes: They are persistent, bioaccumulative, toxic, occur in large quantities, and have localized accumulation with a likelihood of exposure. Although still under development and incompletely applied, methods to quantify these attributes can assist in winnowing down the list of candidates from the universe of potential CECs. Unfortunately, significant research gaps exist in detection and quantification, environmental fate and transport, health and risk assessment, and site exploration and remediation for CECs. Addressing these gaps is prerequisite to a preventive approach to generating and managing hazardous waste sites. A need exists for a carefully considered and orchestrated expansion of programmatic and research efforts to identify, evaluate, and manage CECs of hazardous waste site relevance, including developing an evolving list of priority CECs, intensifying the identification and monitoring of likely sites of present or future accumulation of CECs, and implementing efforts that focus on a holistic approach to prevention.

  11. Arsenic Fate, Transport And Stability Study: Groundwater, Surface Water, Soil And Sediment Investigation At Fort Devens Superfund Site

    Science.gov (United States)

    A field investigation was conducted to examine the distribution of arsenic in groundwater, surface water, and sediments at the Fort Devens Superfund Site. The study area encompassed a portion of plow Shop Pond (Red Cove), which receives groundwater discharge from the aquifer und...

  12. 77 FR 58989 - Proposed CERCLA Administrative Cost Recovery Settlement for the Buckbee-Mears Co. Superfund Site...

    Science.gov (United States)

    2012-09-25

    ... paid $150,000 attributable to the costs of marketing and selling the Properties; (b) The Bank will pay... ENVIRONMENTAL PROTECTION AGENCY [FRL-9720-7] Proposed CERCLA Administrative Cost Recovery... costs concerning the Buckbee-Mears Co. Superfund Site located in Cortland, Cortland County, New York...

  13. Diffusive flux of PAHs across sediment-water and water-air interfaces at urban superfund sites.

    Science.gov (United States)

    Minick, D James; Anderson, Kim A

    2017-09-01

    Superfund sites may be a source of polycyclic aromatic hydrocarbons (PAHs) to the surrounding environment. These sites can also act as PAH sinks from present-day anthropogenic activities, especially in urban locations. Understanding PAH transport across environmental compartments helps to define the relative contributions of these sources and is therefore important for informing remedial and management decisions. In the present study, paired passive samplers were co-deployed at sediment-water and water-air interfaces within the Portland Harbor Superfund Site and the McCormick and Baxter Superfund Site. These sites, located along the Willamette River (Portland, OR, USA), have PAH contamination from both legacy and modern sources. Diffusive flux calculations indicate that the Willamette River acts predominantly as a sink for low molecular weight PAHs from both the sediment and the air. The sediment was also predominantly a source of 4- and 5-ring PAHs to the river, and the river was a source of these same PAHs to the air, indicating that legacy pollution may be contributing to PAH exposure for residents of the Portland urban center. At the remediated McCormick and Baxter Superfund Site, flux measurements highlight locations within the sand and rock sediment cap where contaminant breakthrough is occurring. Environ Toxicol Chem 2017;36:2281-2289. © 2017 SETAC.
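    Where the abstract refers to diffusive flux calculations from paired passive samplers, a generic two-film sketch conveys the idea. The example below is not the authors' calculation: it estimates net air-water exchange from a freely dissolved water concentration and a gas-phase concentration, and the mass-transfer coefficient, dimensionless Henry's law constant, and concentrations are all invented for illustration.

        # Generic two-film air-water exchange sketch (all inputs hypothetical)
        def air_water_flux(c_water_ng_per_L, c_air_ng_per_m3, k_aw, k_ol_m_per_day):
            """Positive result = net volatilization (water -> air), in ng/m2/day."""
            c_water = c_water_ng_per_L * 1000.0    # ng/L -> ng/m3
            return k_ol_m_per_day * (c_water - c_air_ng_per_m3 / k_aw)

        # Hypothetical phenanthrene-like example
        print(air_water_flux(c_water_ng_per_L=10.0, c_air_ng_per_m3=5.0,
                             k_aw=1.7e-3, k_ol_m_per_day=0.2))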

  14. Optimization Review: Bunker Hill Mining and Metallurgical Complex Superfund Site, Central Treatment Plant (CTP), Kellogg, Shoshone County, Idaho

    Science.gov (United States)

    The Bunker Hill Mining and Metallurgical Complex Superfund Site includes all areas of the Coeur d’Alene Basin where mining-related contamination occurred and encompasses a 21-square mile “Box” along Interstate 90 surrounding the former smelter complex.

  15. 76 FR 24479 - In the Matter of the Taylor Lumber and Treating Superfund Site, Sheridan, Oregon, Amendment to...

    Science.gov (United States)

    2011-05-02

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9300-9] In the Matter of the Taylor Lumber and Treating... Taylor Lumber and Treating Site, which PWPO was acquiring, in exchange for several obligations related to...-553- 0705. Comments should reference the Taylor Lumber and Treating Superfund Site in Sheridan, Oregon...

  16. 76 FR 20287 - Superfund Site, New Bedford Harbor, New Bedford, MA: Anchorage Ground and Regulated Navigation Area

    Science.gov (United States)

    2011-04-12

    ... may lead to the discovery of a significant environmental impact from this proposed rule. List of... engaged in activities associated with remediation efforts in the New Bedford Harbor Superfund Site... activity can be performed without undue risk to environmental remediation efforts. Requests for waivers...

  17. Mining-related sediment and soil contamination in a large Superfund site: Characterization, habitat implications, and remediation

    Science.gov (United States)

    Juracek, Kyle E.; Drake, K. D.

    2016-01-01

    Historical mining activity (1850–1970) in the now inactive Tri-State Mining District provided an ongoing source of lead and zinc to the environment including the US Environmental Protection Agency Superfund site located in Cherokee County, southeast Kansas, USA. The resultant contamination adversely affected biota and caused human health problems and risks. Remediation in the Superfund site requires an understanding of the magnitude and extent of contamination. To provide some of the required information, a series of sediment and soil investigations were conducted in and near the Superfund site to characterize lead and zinc contamination in the aquatic and floodplain environments along the main-stem Spring River and its major tributaries. In the Superfund site, the most pronounced lead and zinc contamination, with concentrations that far exceed sediment quality guidelines associated with potential adverse biological effects, was measured for streambed sediments and floodplain soils located within or downstream from the most intensive mining-affected areas. Tributary streambeds and floodplains in affected areas are heavily contaminated with some sites having lead and zinc concentrations that are an order of magnitude (or more) greater than the sediment quality guidelines. For the main-stem Spring River, the streambed is contaminated but the floodplain is mostly uncontaminated. Measured lead and zinc concentrations in streambed sediments, lakebed sediments, and floodplain soils documented a persistence of the post-mining contamination on a decadal timescale. These results provide a basis for the prioritization, development, and implementation of plans to remediate contamination in the affected aquatic and floodplain environments within the Superfund site.

  18. Mining-Related Sediment and Soil Contamination in a Large Superfund Site: Characterization, Habitat Implications, and Remediation.

    Science.gov (United States)

    Juracek, K E; Drake, K D

    2016-10-01

    Historical mining activity (1850-1970) in the now inactive Tri-State Mining District provided an ongoing source of lead and zinc to the environment including the US Environmental Protection Agency Superfund site located in Cherokee County, southeast Kansas, USA. The resultant contamination adversely affected biota and caused human health problems and risks. Remediation in the Superfund site requires an understanding of the magnitude and extent of contamination. To provide some of the required information, a series of sediment and soil investigations were conducted in and near the Superfund site to characterize lead and zinc contamination in the aquatic and floodplain environments along the main-stem Spring River and its major tributaries. In the Superfund site, the most pronounced lead and zinc contamination, with concentrations that far exceed sediment quality guidelines associated with potential adverse biological effects, was measured for streambed sediments and floodplain soils located within or downstream from the most intensive mining-affected areas. Tributary streambeds and floodplains in affected areas are heavily contaminated with some sites having lead and zinc concentrations that are an order of magnitude (or more) greater than the sediment quality guidelines. For the main-stem Spring River, the streambed is contaminated but the floodplain is mostly uncontaminated. Measured lead and zinc concentrations in streambed sediments, lakebed sediments, and floodplain soils documented a persistence of the post-mining contamination on a decadal timescale. These results provide a basis for the prioritization, development, and implementation of plans to remediate contamination in the affected aquatic and floodplain environments within the Superfund site.

  19. Quantum steganography using prior entanglement

    International Nuclear Information System (INIS)

    Mihara, Takashi

    2015-01-01

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography by combining quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can separately create secret messages and the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can separately create secret messages and the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  20. Quantum steganography using prior entanglement

    Energy Technology Data Exchange (ETDEWEB)

    Mihara, Takashi, E-mail: mihara@toyo.jp

    2015-06-05

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography by combining quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can separately create secret messages and the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can separately create secret messages and the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  1. Prior information in structure estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Nedoma, Petr; Khailova, Natalia; Pavelková, Lenka

    2003-01-01

    Roč. 150, č. 6 (2003), s. 643-653 ISSN 1350-2379 R&D Projects: GA AV ČR IBS1075102; GA AV ČR IBS1075351; GA ČR GA102/03/0049 Institutional research plan: CEZ:AV0Z1075907 Keywords : prior knowledge * structure estimation * autoregressive models Subject RIV: BC - Control Systems Theory Impact factor: 0.745, year: 2003 http://library.utia.cas.cz/separaty/historie/karny-0411258.pdf

  2. Contingency analysis modeling for superfund sites and other sources. Final report

    International Nuclear Information System (INIS)

    Christensen, D.; Kaiser, G.D.

    1993-01-01

    The report provides information on contingency modeling for a wide range of different accidental release scenarios of hazardous air pollutants that might take place at Superfund and other sites. The scenarios are used to illustrate how atmospheric dispersion models, including dense gas models, should be applied. Particular emphasis is placed on the input data needed for proper application of the models. Flow charts direct the user to specific sections where various scenarios are discussed. A checklist of items that should be discussed before running the model is provided. Several examples are provided to specifically show how to apply the models so as to produce a credible analysis for a particular release scenario.
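    For the simplest class of dispersion estimates the report discusses, a screening-level Gaussian plume calculation is often the starting point. The sketch below is illustrative only and is not one of the report's models: the release rate, wind speed, effective release height, and dispersion coefficients are made up, and in practice the sigma values would be functions of downwind distance and atmospheric stability.

        # Screening-level Gaussian plume sketch (all inputs hypothetical)
        import math

        def plume_concentration(q_g_per_s, u_m_per_s, y_m, z_m, h_m, sigma_y_m, sigma_z_m):
            """Gaussian plume concentration (g/m3) with ground reflection."""
            lateral = math.exp(-y_m**2 / (2 * sigma_y_m**2))
            vertical = (math.exp(-(z_m - h_m)**2 / (2 * sigma_z_m**2)) +
                        math.exp(-(z_m + h_m)**2 / (2 * sigma_z_m**2)))
            return q_g_per_s / (2 * math.pi * u_m_per_s * sigma_y_m * sigma_z_m) * lateral * vertical

        # 1 g/s release, 3 m/s wind, breathing-height receptor on the plume centerline,
        # with sigma values roughly representative of ~500 m downwind in neutral conditions
        print(plume_concentration(1.0, 3.0, y_m=0.0, z_m=1.5, h_m=2.0,
                                  sigma_y_m=35.0, sigma_z_m=18.0))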

  3. Raman spectroscopy of efflorescent sulfate salts from Iron Mountain Mine Superfund Site, California

    Science.gov (United States)

    Sobron, Pablo; Alpers, Charles N.

    2013-01-01

    The Iron Mountain Mine Superfund Site near Redding, California, is a massive sulfide ore deposit that was mined for iron, silver, gold, copper, zinc, and pyrite intermittently for nearly 100 years. As a result, both water and air reached the sulfide deposits deep within the mountain, producing acid mine drainage consisting of sulfuric acid and heavy metals from the ore. Particularly, the drainage water from the Richmond Mine at Iron Mountain is among the most acidic waters naturally found on Earth. The mineralogy at Iron Mountain can serve as a proxy for understanding sulfate formation on Mars. Selected sulfate efflorescent salts from Iron Mountain, formed from extremely acidic waters via drainage from sulfide mining, have been characterized by means of Raman spectroscopy. Gypsum, ferricopiapite, copiapite, melanterite, coquimbite, and voltaite are found within the samples. This work has implications for Mars mineralogical and geochemical investigations as well as for terrestrial environmental investigations related to acid mine drainage contamination.

  4. New photocatalytic process provides 99.9+% reduction of VOC at Superfund site

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1999-03-01

    A new photocatalytic process, dubbed the A-I-R-2000 Process, is described. The process is said to offer marked economic advantages, while providing consistent 99.9+% reduction of volatile organic compounds (VOCs) from soil vapours and groundwater at the Stamina Mills Superfund site in North Smithfield, Rhode Island. The A-I-R-2000 process has been developed by KSE Inc., of Amherst, Massachusetts, and has been licensed exclusively worldwide to Trojan Technologies, Inc., of London, Ontario. The process consists essentially of adsorption of VOCs onto a UV light-activated proprietary catalyst, for breakdown to carbon dioxide and water, and also to hydrochloric acid and a small amount of chlorine gas when the VOCs are chlorinated. With a maximum internal operating temperature of 125 degrees F, it is a low-energy system when compared to other catalytic technologies that feature thermal catalytic equipment. 1 photo.

  5. Raman spectroscopy of efflorescent sulfate salts from Iron Mountain Mine Superfund Site, California.

    Science.gov (United States)

    Sobron, Pablo; Alpers, Charles N

    2013-03-01

    The Iron Mountain Mine Superfund Site near Redding, California, is a massive sulfide ore deposit that was mined for iron, silver, gold, copper, zinc, and pyrite intermittently for nearly 100 years. As a result, both water and air reached the sulfide deposits deep within the mountain, producing acid mine drainage consisting of sulfuric acid and heavy metals from the ore. Particularly, the drainage water from the Richmond Mine at Iron Mountain is among the most acidic waters naturally found on Earth. The mineralogy at Iron Mountain can serve as a proxy for understanding sulfate formation on Mars. Selected sulfate efflorescent salts from Iron Mountain, formed from extremely acidic waters via drainage from sulfide mining, have been characterized by means of Raman spectroscopy. Gypsum, ferricopiapite, copiapite, melanterite, coquimbite, and voltaite are found within the samples. This work has implications for Mars mineralogical and geochemical investigations as well as for terrestrial environmental investigations related to acid mine drainage contamination.

  6. Remedial design services for Montclair/West Orange and Glen Ridge Superfund sites

    International Nuclear Information System (INIS)

    Urbaniak, T.F.; Tomiczek, P.W. Jr.

    1994-01-01

    The Montclair/West Orange and Glen Ridge Superfund Sites are located 12 miles west of New York City in Essex County, New Jersey. The sites are contaminated with waste materials from radium-processing facilities which operated in the area during the early 1900s. The waste materials, containing radium and other radioactive isotopes, were placed in three separate landfill sites. Major public health risks are indoor radon gas build-up and indoor/outdoor gamma radiation. In 1989, the EPA issued a Record of Decision (ROD) which chose excavation and off-site disposal of material as the preferred alternative. The purpose of this presentation is to highlight key elements of the design process for the remedial action at Montclair. Those key elements are as follows: meeting community relations challenges; measuring radioactive contamination; developing plans and specifications; packaging of remedial action contracts; and continually improving both the process and the designs.

  7. On using residual risk to assess the cost effectiveness and health protectiveness of remedy selection at superfund sites

    International Nuclear Information System (INIS)

    Katsumata, Peter T.; Kastenberg, William E.

    1998-01-01

    This article examines the importance of determining residual risk and its impact on remedy selection at Superfund Sites. Within this examination, risks are assessed using probabilistic models that incorporate the uncertainty and variability of the input parameters, and utilize parameter distributions based on current and applicable site-specific data. Monte Carlo methods are used to propagate these uncertainties and variabilities through the risk calculations resulting in a distribution for the estimate of both risk and residual risk. Such an approach permits an informed decision based on a broad information base which involves considering the entire uncertainty distribution of risk rather than a point estimate for each exposure scenario. Using the probabilistic risk estimates, with current and applicable site-specific data, alternative decisions regarding cleanup are obtained for two Superfund Sites
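    The Monte Carlo propagation described above can be sketched in a few lines. The example below is illustrative only and does not use the study's inputs: the exposure distributions, slope factor, and exposure assumptions are invented simply to show how a full risk distribution, rather than a single point estimate, is produced and summarized.

        # Illustrative Monte Carlo propagation of an ingestion cancer-risk calculation
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        conc = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)   # mg/L in groundwater
        intake_rate = rng.triangular(1.0, 2.0, 3.0, size=n)          # L/day
        body_weight = rng.normal(70, 10, size=n).clip(40, 120)       # kg
        ef, ed, at = 350, 30, 70 * 365                                # days/yr, yr, days
        slope_factor = 0.1                                            # (mg/kg-day)^-1

        cdi = conc * intake_rate * ef * ed / (body_weight * at)       # chronic daily intake
        risk = cdi * slope_factor
        print(np.percentile(risk, [50, 95]))   # report the distribution, not one number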

  8. Superfund at work: Hazardous waste cleanup efforts nationwide, fall 1992. (CIBA-GEIGY Corporation, McIntosh, Alabama)

    International Nuclear Information System (INIS)

    1992-01-01

    On March 31, 1992, the U.S. Environmental Protection Agency (EPA) reached an agreement with Ciba-Geigy Corporation in McIntosh, Alabama to clean up soil and ground water contaminated by DDT, herbicides, and chemicals. The agreement is one of the largest private party settlements in Superfund history, valued at approximately $120 million. EPA activities at the site included: conducting preliminary contamination investigations jointly with the Alabama Environmental Health Administration, beginning in 1979; designing a multi-phased cleanup that is responsive to the complex nature of the contamination and reduces potential risk to the local population and environment; and awarding a grant to a community group to help them participate in cleanup decisions. Ciba-Geigy, like EPA, has made consistent efforts to build and maintain good relations with the community. These efforts demonstrate the increasing trend toward cooperation between industries, local communities, and EPA at Superfund sites

  9. Superfund record of decision (EPA Region 2): Carroll and Dubies Sewage Disposal, Port Jervis, NY, March 31, 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    This decision document presents the selected remedial action for the Carroll and Dubies Superfund Site (the Site). This operable unit (OU1) represents the first of two operable units planned for the Site. This operable unit addresses the source areas (lagoons and surrounding impacted soils) at the Site and actions needed to ensure that the source areas do not pose a threat to human health or the environment, including any potential cross media impacts to groundwater.

  10. Superfund Record of Decision (EPA Region 5): Ossineke Groundwater Contamination Site, Alpena County, Ossineke, MI. (First remedial action), June 1991. Final report

    International Nuclear Information System (INIS)

    1991-01-01

    The Ossineke Ground Water Contamination site is an area overlying a contaminated aquifer in Ossineke, Alpena County, Michigan. The site hydrogeology is characterized by an upper aquifer and lower confined aquifer, both of which supply drinking water to local residents. Historically there have been two contaminant source areas of concern within Ossineke. Area 1 is in the center of the Town of Ossineke where two gas stations are located, consisting of underground storage tanks, and a former automobile rustproofing shop. Area 2 is a laundry and dry cleaning facility that has an associated wash water pond containing chlorinated hydrocarbons and VOCs. The State advised all users of the upper aquifer to stop using their wells. In 1982, the State discovered that a snow plow had hit a gasoline pump causing an unknown amount of gasoline to spill and, subsequently, contaminate the basements of several businesses. In 1986, the State replaced residential wells affected by ground water contamination. Because the contaminants of concern have been confirmed to be related to petroleum releases from underground storage tanks, the Superfund program does not have the authority to address cleanup under CERCLA. The selected remedial action for the site is no further action.

  11. Superfund: right-to-know and hazardous waste site cleanup. Hearing before the Subcommittee on Commerce, Transportation, and Tourism of the Committee on Energy and Commerce, House of Representatives, Ninety-Ninth Congress, First Session, December 20, 1985

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    Representatives of local and state offices and the congressional representative of St. Paul, Minnesota testified at a field hearing on the Superfund program. The focus of the hearing was on community right-to-know aspects and the cleanup of hazardous materials that were abandoned on federal sites. At issue were environmental problems at the 38 priority sites listed for Minnesota and the lack of information on health effects after over 20 years of environmental study of toxic substances. The proposed legislation would subject federal facilities and sites to the same standards, cleanup schedules, and oversight as private sites. A new enforcement bill would encourage citizen suits to force cleanup. Military arsenals that contribute to water and soil pollution were of particular concern. Witnesses discussed the need for a national right-to-know law so that businesses would not be tempted to relocate to avoid Minnesota's environmental policy. The hearing record covers the testimony of seven witnesses.

  12. Assessment of prior learning in vocational education and training

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    The article deals with the results of a study of the assessment of prior learning among adult workers who want to obtain formal qualifications as skilled workers. The study contributes to developing methods for assessing prior learning, including both the teachers’ ways of eliciting the students’ knowledge, skills and competences during the students’ performances and the methods that the teachers apply in order to assess the students’ prior learning in relation to the regulations of the current VET-program. In particular the study focuses on how to assess not only the students’ explicated knowledge and skills but also their competences, i.e. the way the students use their skills and knowledge to perform in practice. Based on a description of the assessment procedures the article discusses central issues in relation to the assessment of prior learning. The empirical data have been obtained in the VET

  13. Recognising Health Care Assistants' Prior Learning through a Caring Ideology

    Science.gov (United States)

    Sandberg, Fredrik

    2010-01-01

    This article critically appraises a process of recognising prior learning (RPL) using analytical tools from Habermas' theory of communicative action. The RPL process is part of an in-service training program for health care assistants where the goal is to become a licensed practical nurse. Data about the RPL process were collected using interviews…

  14. The Transformation of Higher Education through Prior Learning Assessment

    Science.gov (United States)

    Kamenetz, Anya

    2011-01-01

    Providing college credit for prior learning is nothing new. The American Council on Education's Credit Recommendation Service (CREDIT), the largest national program making credit recommendations for workplace and other training, dates to 1974. Several colleges that specialize in the practice--Excelsior and Empire State in New York, Thomas Edison…

  15. Divergent Priors and well Behaved Bayes Factors

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2011-01-01

    Divergent priors are improper when defined on unbounded supports. Bartlett's paradox has been taken to imply that using improper priors results in ill-defined Bayes factors, preventing model comparison by posterior probabilities. However, many improper priors have attractive properties
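    A small numerical illustration of Bartlett's paradox (not taken from the paper): in a Normal model, as the prior variance on the alternative's mean is made ever more diffuse, the Bayes factor in favour of the point null grows without bound regardless of what the data say, which is why naively divergent priors break Bayes-factor model comparison.

        # Bartlett's paradox illustration for H0: mu = 0 vs H1: mu ~ N(0, tau^2)
        import math

        def bf01_normal(ybar, n, sigma=1.0, tau=1.0):
            """Bayes factor BF01 when ybar ~ N(mu, sigma^2/n)."""
            def normal_pdf(x, var):
                return math.exp(-x**2 / (2 * var)) / math.sqrt(2 * math.pi * var)
            v0 = sigma**2 / n            # marginal variance of ybar under H0
            v1 = tau**2 + v0             # marginal variance of ybar under H1
            return normal_pdf(ybar, v0) / normal_pdf(ybar, v1)

        for tau in (1, 10, 100, 1000):   # ever more diffuse prior on mu under H1
            print(tau, bf01_normal(ybar=0.5, n=25, tau=tau))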

  16. Estimating risk at a Superfund site contaminated with radiological and chemical wastes

    International Nuclear Information System (INIS)

    Temeshy, A.; Liedle, J.M.; Sims, L.M.; Efird, C.R.

    1992-01-01

    This paper describes the method and results for estimating carcinogenic and noncarcinogenic effects at a Superfund site that is radiologically and chemically contaminated. Risk to receptors from disposal of waste in soil and resulting contamination of groundwater, air, surface water, and sediment is quantified. Specific risk assessment components which are addressed are the exposure assessment, toxicity assessment, and the resulting risk characterization. In the exposure assessment, potential exposure pathways are identified using waste disposal inventory information for soil and modeled information for other media. Models are used to calculate future radionuclide concentrations in groundwater, soil, surface water and air. Chemical exposure concentrations are quantified using site characterization data. Models are used to determine concentrations of chemicals in surface water and in air. Toxicity parameters used to quantify the dose-response relationship are slope factors for carcinogenic contaminants and reference doses for noncarcinogenic contaminants. In the risk characterization step, results from the exposure assessment and toxicity assessment are summarized and integrated into quantitative risk estimates for carcinogens and hazard indices for noncarcinogens. Calculated risks for carcinogenic contaminants are compared with EPA's target risk range. At WAG 6, the risk from radionuclides and chemicals for an on-WAG homesteader exceeds EPA's target risk range. Hazard indices are compared to unity for noncarcinogenic contaminants. At WAG 6, the total pathway hazard index for the on-WAG homesteader exceeds unity.
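    The risk characterization arithmetic referenced above follows the standard CERCLA pattern of intake times slope factor for carcinogens and intake divided by reference dose for noncarcinogens. The sketch below is a hedged illustration of that pattern only; the soil concentration, ingestion rate, slope factor, and reference dose are hypothetical and are not the WAG 6 values.

        # Standard-pattern intake, cancer risk, and hazard quotient (hypothetical inputs)
        def chronic_daily_intake(conc_mg_per_kg, ingestion_mg_per_day, ef_days_per_yr,
                                 ed_yr, bw_kg, at_days):
            """Soil-ingestion intake in mg/kg-day (1e-6 converts mg soil to kg soil)."""
            return (conc_mg_per_kg * ingestion_mg_per_day * 1e-6 *
                    ef_days_per_yr * ed_yr) / (bw_kg * at_days)

        cdi_carc = chronic_daily_intake(400, 100, 350, 30, 70, 70 * 365)     # lifetime averaging
        cancer_risk = cdi_carc * 1.5          # times an assumed slope factor, (mg/kg-day)^-1
        cdi_noncarc = chronic_daily_intake(400, 100, 350, 30, 70, 30 * 365)  # AT = ED for noncancer
        hazard_quotient = cdi_noncarc / 3e-4  # divided by an assumed reference dose

        print(cancer_risk)       # compare with the 1e-6 to 1e-4 target risk range
        print(hazard_quotient)   # compare with unity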

  17. Value engineering study for selection of vertical barrier technology at a Superfund site

    International Nuclear Information System (INIS)

    Bryan, E.E.; Guglielmetti, J.L.; Butler, P.B.; Brill, M.P.

    1997-01-01

    A value engineering (VE) study was conducted to identify and evaluate vertical barrier technologies and alignments for a Superfund project in New Castle County, Delaware. The objective was to select and recommend the most appropriate vertical barrier(s) for two separate landfills and a portion of the manufacturing plant on the site. A VE team was assembled to identify and evaluate site specific issues related to effectiveness, constructability and cost for numerous vertical barrier technologies. Several cost-effective alternatives were identified that met project objectives. The VE study concluded that a composite vertical barrier system consisting of a soil-bentonite slurry trench and steel sheet piles would provide effective containment of the North Landfill. Additionally, the geologic confining unit specified in the Record of Decision (ROD) was found to be unsuitable as a vertical barrier key and a more suitable, shallow confining unit was discovered. This paper describes the value engineering process and results of the VE study for one of the landfills

  18. Case study: Montclair/West Orange and Glen Ridge Radium Superfund sites

    International Nuclear Information System (INIS)

    Pezzella, R.; Seppi, P.; Watson, D.

    1994-01-01

    The Montclair/West Orange and Glen Ridge Radium Sites are located 12 miles west of New York City in three residential communities in Essex County, New Jersey. The sites are contaminated with waste materials from a local radium processing facility which ceased operations in 1926. Houses were subsequently constructed on or near the radium waste disposal areas. The waste material was also used as backfill, which caused contamination to be spread randomly over the communities. There are 769 properties between four townships that comprise the Superfund sites. The Environmental Protection Agency (EPA) conducted an aerial survey in 1981 which identified the boundaries of the sites. In 1985, the New Jersey Department of Environmental Protection (NJDEP) began a pilot study to examine the feasibility of excavation and off-site disposal of contaminated material as a permanent solution. The study was interrupted when the permit for the disposal site was revoked by the state of Nevada. Since 1990 field testing has been completed on over 725 properties and remediation and restoration has been completed on 75 properties

  19. Assessment of Prior Learning in Adult Vocational Education and Training

    Directory of Open Access Journals (Sweden)

    Vibe Aarkrog

    2015-04-01

    The article deals with the results of a study of school-based Assessment of Prior Learning of adults who have enrolled as students in a VET college in order to qualify for occupations as skilled workers. Based on examples of VET teachers’ methods for assessing the students’ prior learning in the programs for gastronomes and child care assistants, respectively, the article discusses two issues in relation to Assessment of Prior Learning: the encounter between practical experience and school-based knowledge, and the validity and reliability of the assessment procedures. Through focusing on the students’ knowing-that and knowing-why, the assessment is based on a scholastic perception of the students’ needs for training, reflecting one of the most important challenges in Assessment of Prior Learning: how can practical experience be transformed into credits for the knowledge parts of the programs? The study shows that by combining several Assessment of Prior Learning methods and comparing the teachers’ assessments the teachers respond to the issues of validity and reliability. However, validity and reliability might be even further strengthened, if the competencies are well defined, if the education system is aware of securing a reasonable balance between knowing how, knowing that, and knowing why, and if the teachers are adequately trained for the assessment procedures.

  20. Iterated random walks with shape prior

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    We propose a new framework for image segmentation using random walks where a distance shape prior is combined with a region term. The shape prior is weighted by a confidence map to reduce the influence of the prior in high gradient areas, and the region term is computed with k-means to estimate the parametric probability density function. Then, random walks is performed iteratively, aligning the prior with the current segmentation in every iteration. We tested the proposed approach with natural and medical images and compared it with the latest techniques with random walks and shape priors. The experiments suggest that this method gives promising results for medical and natural images.
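    As a rough, simplified sketch of the iterated scheme summarized above, and not the authors' implementation, the following assumes scikit-image and SciPy are available and that the caller supplies a 2-D grayscale image and a binary shape-prior mask: it derives object and background seeds from the current prior, runs random-walker segmentation, re-aligns the prior to the new segmentation by translation, and repeats.

        # Simplified iterated random-walker sketch (placeholder inputs; translation-only alignment)
        import numpy as np
        from scipy import ndimage as ndi
        from skimage.segmentation import random_walker

        def iterated_random_walker(image, shape_prior, n_iter=3, beta=130):
            prior = shape_prior.astype(bool)
            seg = prior
            for _ in range(n_iter):
                labels = np.zeros(image.shape, dtype=np.uint8)
                labels[ndi.binary_erosion(prior, iterations=3)] = 1    # object seeds
                labels[~ndi.binary_dilation(prior, iterations=3)] = 2  # background seeds
                seg = random_walker(image, labels, beta=beta) == 1
                # re-align the prior with the current segmentation (translation only)
                shift = np.array(ndi.center_of_mass(seg)) - np.array(ndi.center_of_mass(prior))
                prior = ndi.shift(prior.astype(float), shift, order=0) > 0.5
            return seg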

  1. Programming

    International Nuclear Information System (INIS)

    Jackson, M.A.

    1982-01-01

    The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problem, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, this model is elaborated to produce the required program outputs; third, the resulting program is transformed to run efficiently in the execution environment. The first two stages deal in network structures of sequential processes; only the third is concerned with procedure hierarchies. (orig.)

  2. Programming

    OpenAIRE

    Jackson, M A

    1982-01-01

    The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problem, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, thi...

  3. Graphic products used in the evaluation of traditional and emerging remote sensing technologies for the detection of fugitive contamination at selected superfund hazardous waste sites

    Science.gov (United States)

    Slonecker, E. Terrence; Fisher, Gary B.

    2011-01-01

    This report presents the overhead imagery and field sampling results used to prepare U.S. Geological Survey Open-File Report 2011-1050, 'Evaluation of Traditional and Emerging Remote Sensing Technologies for the Detection of Fugitive Contamination at Selected Superfund Hazardous Waste Sites'. These graphic products were used in the evaluation of remote sensing technology in postclosure monitoring of hazardous waste sites and represent an ongoing research effort. Soil sampling results presented here were obtained with field portable x-ray fluorescence (XRF) technology and are used only as screening tools, representing the current conditions of metals and other contaminants at selected Superfund hazardous waste sites.

  4. Research Implementation and Quality Assurance Project Plan: An Evaluation of Hyperspectral Remote Sensing Technologies for the Detection of Fugitive Contamination at Selected Superfund Hazardous Waste Sites

    Science.gov (United States)

    Slonecker, E. Terrence; Fisher, Gary B.

    2009-01-01

    This project is a research collaboration between the U.S. Environmental Protection Agency (EPA) Office of Inspector General (OIG) and the U.S. Geological Survey (USGS) Eastern Geographic Science Center (EGSC), for the purpose of evaluating the utility of hyperspectral remote sensing technology for post-closure monitoring of residual contamination at delisted and closed hazardous waste sites as defined under the Comprehensive Environmental Response Compensation and Liability Act [CERCLA (also known as 'Superfund')] of 1980 and the Superfund Amendments and Reauthorization Act (SARA) of 1986.

  5. Geochemical Characteristics of TP3 Mine Wastes at the Elizabeth Copper Mine Superfund Site, Orange County, Vermont

    Science.gov (United States)

    Hammarstrom, Jane M.; Piatak, Nadine M.; Seal, Robert R.; Briggs, Paul H.; Meier, Allen L.; Muzik, Timothy L.

    2003-01-01

    Remediation of the Elizabeth mine Superfund site in the Vermont copper belt poses challenges for balancing environmental restoration goals with issues of historic preservation while adopting cost-effective strategies for site cleanup and long-term maintenance. The waste-rock pile known as TP3, at the headwaters of Copperas Brook, is especially noteworthy in this regard because it is the worst source of surface- and ground-water contamination identified to date, while also being the area of greatest historical significance. The U.S. Geological Survey (USGS) conducted a study of the historic mine-waste piles known as TP3 at the Elizabeth mine Superfund site near South Strafford, Orange County, VT. TP3 is a 12.3-acre (49,780 m2) subarea of the Elizabeth mine site. It is a focus area for historic preservation because it encompasses an early 19th century copperas works as well as waste from late 19th- and 20th century copper mining (Kierstead, 2001). Surface runoff and seeps from TP3 form the headwaters of Copperas Brook. The stream flows down a valley onto flotation tailings from 20th century copper mining operations and enters the West Branch of the Ompompanoosuc River approximately 1 kilometer downstream from the mine site. Shallow drinking water wells down gradient from TP3 exceed drinking water standards for copper and cadmium (Hathaway and others, 2001). The Elizabeth mine was listed as a Superfund site in 2001, mainly because of impacts of acid-mine drainage on the Ompompanoosuc River.

  6. Superfund explanation of significant difference for the record of decision (EPA Region 8): Lowry Landfill, Aurora, CO, October 24, 1997

    International Nuclear Information System (INIS)

    1999-03-01

    Please be advised that there is an error within Attachment E (Technical Evaluation of Proposed Ground-Water Treatment and Disposal Alternatives) of the "Responsiveness Summary for the Second Explanation of Significant Differences, Lowry Landfill Superfund Site" document. The evaluation table, which summarizes the rankings of the two cleanup alternatives, failed to include numerical values for State Acceptance and Community Acceptance. Enclosed is a copy of the table as it should have appeared in Attachment E. Copies of this errata sheet are being mailed to all recipients of the Responsiveness Summary.

  7. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.
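
    The generic PC-prior construction behind this approach can be sketched compactly. The formulas below follow the general penalised-complexity recipe and are not the paper's exact AR(p) parameterisation via partial autocorrelations.

```latex
% Sketch of the generic PC-prior recipe (illustrative; the AR(p) case in the paper
% re-expresses this on the partial autocorrelations).
% Distance of the flexible model (flexibility parameter xi) from its base model (xi = 0):
\[
  d(\xi) = \sqrt{2\,\mathrm{KLD}\!\left( f(x \mid \xi) \,\middle\|\, f(x \mid \xi = 0) \right)} .
\]
% An exponential prior on that distance, transformed back to xi:
\[
  d(\xi) \sim \mathrm{Exp}(\lambda)
  \quad\Longrightarrow\quad
  \pi(\xi) = \lambda\, e^{-\lambda\, d(\xi)} \left| \frac{\partial d(\xi)}{\partial \xi} \right| ,
\]
% where lambda is fixed through a user-specified tail statement such as
% P(Q(xi) > U) = alpha for an interpretable scale Q(xi).
```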

  8. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  9. Can standard sequential extraction determinations effectively define heavy metal species in superfund site soils?

    Energy Technology Data Exchange (ETDEWEB)

    Dahlin, Cheryl L.; Williamson, Connie A.; Collins, Wesley K.; Dahlin, David C.

    2001-01-01

    Speciation and distribution of heavy metals in soils controls the degree to which metals and their compounds are mobile, extractable, and plant-available. Consequently, speciation impacts the success of remediation efforts both by defining the relationship of the contaminants with their environment and by guiding development and evaluation of workable remediation strategies. The U.S. Department of Energy, Albany Research Center (Albany, OR), under a two-year interagency project with the U.S. Environmental Protection Agency (EPA), examined the suitability of sequential extraction as a definitive means to determine species of heavy metals in soil samples. Representative soil samples, contaminated with lead, arsenic, and/or chromium, were collected by EPA personnel from two Superfund sites, the National Lead Company site in Pedricktown, NJ, and the Roebling Steel, Inc., site in Florence, NJ. Data derived from Tessier's standard three-stage sequential-extraction procedure were compared to data from a comprehensive characterization study that combined optical- and scanning-electron microscopy (with energy-dispersive x-ray and wavelength-dispersive x-ray analyses), x-ray diffraction, and chemical analyses. The results show that standard sequential-extraction procedures that were developed for characterizing species of contaminants in river sediments may be unsuitable for sole evaluation of contaminant species in industrial-site materials (particularly those that contain larger particles of the contaminants, encapsulated contaminants, and/or man-made materials such as slags, metals, and plastics). However, each sequential extraction or comprehensive characterization procedure has its own strengths and weaknesses. Findings of this study indicate that the use of both approaches, during the early stages of site studies, would be a best practice. The investigation also highlights the fact that an effective speciation study does not simply identify metal contaminants as

  10. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  11. Improving Open Access through Prior Learning Assessment

    Science.gov (United States)

    Yin, Shuangxu; Kawachi, Paul

    2013-01-01

    This paper explores and presents new data on how to improve open access in distance education through using prior learning assessments. Broadly there are three types of prior learning assessment (PLAR): Type-1 for prospective students to be allowed to register for a course; Type-2 for current students to avoid duplicating work-load to gain…

  12. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative tgbgo deal with the difficulties as- sociated with the joint power prior

  13. Assessment of unskilled adults' prior learning – fair to whom

    DEFF Research Database (Denmark)

    Aarkrog, Vibe

    This paper discusses research that examined the meeting between, on the one hand, the adults' prior learning and, on the other, the school system and curricular standards. Applying a theoretical frame that includes concepts of communities of practice (Wenger), the development from novice to expert (Dreyfus & Dreyfus), and Bernstein's distinction between horizontal and vertical learning, the paper gives an account of the students' development in relation to assessment of their prior learning. The study includes a number of VET-programs. The paper focuses on two of them: Social and health care and childcare assistant. It addresses questions of what is a fair APL, perceived in relation to both the adults' knowing in practice and the qualification standards, formulated in the learning outcome descriptions of the programs.

  14. Superfund record of decision (EPA Region 2): Carroll and Dubies Sewage Disposal, Port Jervis, Town of Deerpark, Orange County, NY, September 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-01

    This decision document presents the selected remedial action for the contaminated groundwater at the Carroll and Dubies Superfund Site (the Site). This operable unit represents the second of two operable units planned for the Site. It addresses the contaminated groundwater underlying and downgradient of the Carroll and Dubies site.

  15. Final Report; Arsenic Fate, Transport and Stability Study; Groundwater, Surface Water, Soil And Sediment Investigation, Fort Devens Superfund Site, Devens, Massachusetts

    Science.gov (United States)

    This document presents results from the Fiscal Years 2006-2008 field investigation at the Fort Devens Superfund Site, Operable Unit 1 (Shepley's Hill Landfill) to fulfill the research objectives outlined in the proposal entitled, 'Fate and Transport of Arsenic in an Urban, Milita...

  16. Occurences and Fate of DDT Principal Isomers/Metabolites, DDA, and o,p'-DDD Enantiomers in Fish, Sediment and Water at a DDT-Impacted Superfund Site

    Science.gov (United States)

    In the 1950s and 60s, discharges from a DDT manufacturing plant contaminated a tributary system of the Tennessee River near Huntsville, Alabama, USA. Regulatory action resulted in declaring the area a Superfund site which required remediation and extensive monitoring. Monitoring ...

  17. DOJ News Release: New York Man Ordered to Pay Over $400,000 in Restitution and Fines for Role in Kickback Scheme at New Jersey Superfund Sites

    Science.gov (United States)

    WASHINGTON, D.C. – An Amherst, New York, man was ordered to pay over $400,000 in restitution and fines and placed on five years’ probation for his role in a kickback scheme at the Federal Creosote and Diamond Alkali Superfund sites in New Jersey.

  18. Terminology for pregnancy loss prior to viability

    DEFF Research Database (Denmark)

    Kolte, A M; Bernardi, L A; Christiansen, O B

    2015-01-01

    Pregnancy loss prior to viability is common and research in the field is extensive. Unfortunately, terminology in the literature is inconsistent. The lack of consensus regarding nomenclature and classification of pregnancy loss prior to viability makes it difficult to compare study results from different centres. In our opinion, terminology and definitions should be based on clinical findings, and when possible, transvaginal ultrasound. With this Early Pregnancy Consensus Statement, it is our goal to provide clear and consistent terminology for pregnancy loss prior to viability.

  19. Avoiding the known prior acts exclusion when insuring newly acquired entities.

    Science.gov (United States)

    Gasior, J P; Passannante, W G

    1998-09-01

    Adding a new entity to an organization's existing insurance program can be problematic if the existing policy contains a known prior acts exclusion clause. By purportedly excluding claims that a policyholder "could have reasonably foreseen or discovered," the known prior acts exclusion allows the insurer to reject those claims after a lawsuit has been filed. Policyholders should have known prior acts exclusion clauses removed from their policies or work with their insurers on language that will clarify the policy regarding this exclusion.

  20. Memorandum of Understanding Between U.S. EPA Superfund and U.S. NRC

    International Nuclear Information System (INIS)

    Walker, Stuart

    2008-01-01

    The Environmental Protection Agency (EPA) Office of Superfund Remediation and Technology Innovation (OSRTI) and the Nuclear Regulatory Commission (NRC) are responsible for implementing the 'Memorandum of Understanding Between the Environmental Protection Agency and the Nuclear Regulatory Commission: Consultation and Finality on Decommissioning and Decontamination of Contaminated Sites'. This paper provides a brief overview of the origin of the Memorandum of Understanding (MOU), the major features of the MOU, and how the MOU has been implemented site specifically. EPA and NRC developed the MOU in response to direction from the House Committee on Appropriations to EPA and NRC to work together to address the potential for dual regulation. The MOU was signed by EPA on September 30, 2002 and NRC on October 9, 2002. The two agencies had worked on the MOU since March 2000. While both EPA and NRC have statutory authority to clean up these sites, the MOU provides consultation procedures between EPA and NRC to eliminate dual regulation. Under the MOU, EPA and NRC identified the interactions of the two agencies for the decommissioning and decontamination of NRC-licensed sites and the ways in which those responsibilities will be exercised. Except for Section VI, which addresses corrective action under the Resource Conservation and Recovery Act (RCRA), this MOU is limited to the coordination between EPA, when acting under its CERCLA authority, and NRC, when a facility licensed by the NRC is undergoing decommissioning, or when a facility has completed decommissioning, and the NRC has terminated its license. EPA believes that implementation of the MOU between the two agencies will ensure that future confusion about dual regulation does not occur regarding the cleanup and reuse of NRC-licensed sites. NRC and EPA have so far exchanged MOU consultation letters on eight NRC-licensed sites. EPA has responded to each consultation request with a letter expressing its views on actions

  1. A Simulation of Pell Grant Awards and Costs Using Prior-Prior Year Financial Data

    Science.gov (United States)

    Kelchen, Robert; Jones, Gigi

    2015-01-01

    We examine the likely implications of switching from a prior year (PY) financial aid system, the current practice in which students file the Free Application for Federal Student Aid (FAFSA) using income data from the previous tax year, to prior-prior year (PPY), in which data from two years before enrollment is used. While PPY allows students to…

  2. Prior Authorization of PMDs Demonstration - Status Update

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS implemented a Prior Authorization process for scooters and power wheelchairs for people with Fee-For-Service Medicare who reside in seven states with high...

  3. Short Report Biochemical derangements prior to emergency ...

    African Journals Online (AJOL)

    Biochemical derangements prior to emergency laparotomy at QECH. Malawi Medical Journal 29(1), March 2017. Venepuncture was performed preoperatively for urgent cases, defined as those requiring …

  4. A Habermasian Analysis of a Process of Recognition of Prior Learning for Health Care Assistants

    Science.gov (United States)

    Sandberg, Fredrik

    2012-01-01

    This article discusses a process of recognition of prior learning for accreditation of prior experiential learning to qualify for course credits used in an adult in-service education program for health care assistants at the upper-secondary level in Sweden. The data are based on interviews and observations drawn from a field study, and Habermas's…

  5. Attentional and Contextual Priors in Sound Perception.

    Science.gov (United States)

    Wolmetz, Michael; Elhilali, Mounya

    2016-01-01

    Behavioral and neural studies of selective attention have consistently demonstrated that explicit attentional cues to particular perceptual features profoundly alter perception and performance. The statistics of the sensory environment can also provide cues about what perceptual features to expect, but the extent to which these more implicit contextual cues impact perception and performance, as well as their relationship to explicit attentional cues, is not well understood. In this study, the explicit cues, or attentional prior probabilities, and the implicit cues, or contextual prior probabilities, associated with different acoustic frequencies in a detection task were simultaneously manipulated. Both attentional and contextual priors had similarly large but independent impacts on sound detectability, with evidence that listeners tracked and used contextual priors for a variety of sound classes (pure tones, harmonic complexes, and vowels). Further analyses showed that listeners updated their contextual priors rapidly and optimally, given the changing acoustic frequency statistics inherent in the paradigm. A Bayesian Observer model accounted for both attentional and contextual adaptations found with listeners. These results bolster the interpretation of perception as Bayesian inference, and suggest that some effects attributed to selective attention may be a special case of contextual prior integration along a feature axis.
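
    As a hypothetical illustration of the Bayesian Observer account described above, the sketch below combines an attentional prior and a contextual prior over two candidate tone frequencies with a Gaussian likelihood; the two-frequency setup, all numbers, and the function names are invented and are not taken from the study.

```python
import numpy as np

# Hypothetical contextual prior over two candidate tone frequencies (e.g. 1 kHz vs 2 kHz),
# for instance learned from how often each frequency occurred in recent trials.
contextual_prior = np.array([0.8, 0.2])

# Hypothetical attentional prior from an explicit cue ("listen for the low tone").
attentional_prior = np.array([0.7, 0.3])

# Combine the two priors (assuming independence) and renormalise.
combined_prior = contextual_prior * attentional_prior
combined_prior /= combined_prior.sum()

def likelihood(observed_khz, candidate_khz=(1.0, 2.0), sigma=0.4):
    """Gaussian internal-noise likelihood of a noisy frequency estimate under each hypothesis."""
    return np.exp(-0.5 * ((observed_khz - np.asarray(candidate_khz)) / sigma) ** 2)

obs = 1.3  # noisy frequency estimate on one trial, in kHz (made up)
posterior = combined_prior * likelihood(obs)
posterior /= posterior.sum()
print(posterior)  # posterior belief over the two frequency hypotheses
```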

  6. Varying prior information in Bayesian inversion

    International Nuclear Information System (INIS)

    Walker, Matthew; Curtis, Andrew

    2014-01-01

    Bayes' rule is used to combine likelihood and prior probability distributions. The former represents knowledge derived from new data, the latter represents pre-existing knowledge; the Bayesian combination is the so-called posterior distribution, representing the resultant new state of knowledge. While varying the likelihood due to differing data observations is common, there are also situations where the prior distribution must be changed or replaced repeatedly. For example, in mixture density neural network (MDN) inversion, using current methods the neural network employed for inversion needs to be retrained every time prior information changes. We develop a method of prior replacement to vary the prior without re-training the network. Thus the efficiency of MDN inversions can be increased, typically by orders of magnitude when applied to geophysical problems. We demonstrate this for the inversion of seismic attributes in a synthetic subsurface geological reservoir model. We also present results which suggest that prior replacement can be used to control the statistical properties (such as variance) of the final estimate of the posterior in more general (e.g., Monte Carlo based) inverse problem solutions. (paper)
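
    The identity underlying prior replacement, namely reweighting an existing posterior by the ratio of the new prior to the old prior, can be sketched with Monte Carlo samples. This is a generic, assumption-laden illustration (fake posterior samples, arbitrary Gaussian priors), not the mixture density network machinery described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in posterior samples, as if produced by an earlier inversion under the "old" prior.
posterior_samples = rng.normal(loc=2.0, scale=0.5, size=10_000)

def old_prior(m):
    """Gaussian N(0, 2^2): the prior assumed when the posterior samples were generated."""
    return np.exp(-0.5 * (m / 2.0) ** 2) / (2.0 * np.sqrt(2 * np.pi))

def new_prior(m):
    """Gaussian N(1, 1^2): the prior we would now like to impose instead."""
    return np.exp(-0.5 * (m - 1.0) ** 2) / np.sqrt(2 * np.pi)

# Prior replacement by importance weighting: w proportional to new_prior / old_prior.
w = new_prior(posterior_samples) / old_prior(posterior_samples)
w /= w.sum()

# Moments of the posterior under the replaced prior, without re-running the inversion.
mean_new = np.sum(w * posterior_samples)
var_new = np.sum(w * (posterior_samples - mean_new) ** 2)
print(mean_new, var_new)
```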

  7. Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) or Superfund, Section 104(k); and CERCLA Section 104(d); "'Discounted Loans' Under Brownfields Revolving Loan Fund Grants"

    Science.gov (United States)

    Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) or Superfund, Section 104(k); and CERCLA Section 104(d); "'Discounted Loans' Under Brownfields Revolving Loan Fund Grants"

  8. The atmosphere as a source/sink of polychlorinated biphenyls to/from the Lower Duwamish Waterway Superfund site

    International Nuclear Information System (INIS)

    Apell, Jennifer N.; Gschwend, Philip M.

    2017-01-01

    Waterbodies polluted with polychlorinated biphenyls (PCBs) may cause the air in the surrounding area to become PCB-contaminated. Conversely, when a waterbody is located in or near an urban area, the deposition of atmospheric PCBs may act as a low-level, ongoing source of PCB contamination to that water. Distinguishing these situations is necessary to be protective of human populations and to guide efforts seeking to clean up such aquatic ecosystems. To assess the situation at the Lower Duwamish Waterway (LDW) Superfund site, low-density polyethylene passive samplers were deployed in the summer of 2015 to quantify freely dissolved water and gaseous air concentrations of PCBs, thereby enabling estimates of the direction and magnitude of air-water exchange of PCB congeners. For the sum of the 27 PCB congeners, average concentrations were 220 pg/m3 (95% C.I.: 80–610) in the air and 320 pg/L (95% C.I.: 110–960) in the water. The sum of air-water exchange fluxes of these PCB congeners was estimated to be 68 ng/m2/day (95% C.I.: 30–148) into the lower atmosphere, contrasting with the reported wet and dry depositional flux of only 5.5 ng/m2/day (95% C.I.: 1–38) from the air into the water. Therefore, the atmosphere was ultimately a sink of PCBs from the LDW Superfund site, at least under 2015 summertime conditions. However, we conclude that air-water exchange of PCBs is likely only a minor sink of PCBs from the LDW and only a minor source of contamination to the region's local atmosphere. - Highlights: • Passive samplers were used to estimate air and water concentrations. • At this site, PCBs were being transported from the water into the local atmosphere. • Air-water exchange was likely only a minor sink of PCBs for the LDW site. • The LDW was likely only a minor source of PCBs to the local atmosphere.
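
    For orientation, the reported flux is of the order produced by a standard two-film air-water exchange estimate. The sketch below uses the summed concentrations quoted above but a placeholder mass-transfer coefficient and Henry's law constant of typical magnitude for PCBs; it is illustrative only, not the study's calculation.

```python
# Two-film air-water exchange sketch for the summed PCB congeners,
#   F = k_ol * (C_w - C_a / K_aw),
# using the concentrations quoted in the abstract. k_ol and K_aw are placeholder values of
# typical magnitude for PCBs (assumed), so the result only illustrates the arithmetic.

C_w_pg_per_L = 320.0        # freely dissolved water concentration (sum of 27 congeners)
C_a_pg_per_m3 = 220.0       # gaseous air concentration

C_w = C_w_pg_per_L * 1000.0 / 1000.0   # pg/L -> ng/m3 (1000 L per m3, 1000 pg per ng)
C_a = C_a_pg_per_m3 / 1000.0           # pg/m3 -> ng/m3

k_ol = 0.2    # overall water-side mass-transfer coefficient, m/day (assumed)
K_aw = 0.01   # dimensionless Henry's law constant (assumed)

flux = k_ol * (C_w - C_a / K_aw)       # ng/m2/day; positive = net volatilisation to air
print(f"net air-water exchange flux ~ {flux:.0f} ng/m2/day")
```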

  9. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
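
    A toy sketch of the central claim, that a heuristic such as tallying is the limiting case of regularised (Bayesian) regression as the prior becomes infinitely strong: the data, the Gaussian prior centred on equal weights, and the penalty values below are all invented for illustration and are not the authors' simulations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: 5 binary cues predicting a continuous criterion (all invented).
n, k = 40, 5
X = rng.integers(0, 2, size=(n, k)).astype(float)
true_w = np.array([3.0, 2.0, 1.0, 0.5, 0.2])
y = X @ true_w + rng.normal(scale=1.0, size=n)

tally_weights = np.ones(k)  # "tallying" treats every (direction-coded) cue equally

def map_estimate(w0, lam):
    """MAP weights under a Gaussian prior centred at w0 with precision lam per weight."""
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y + lam * w0)

for lam in [0.0, 1.0, 10.0, 1e6]:
    print(f"lam={lam:>9}: {np.round(map_estimate(tally_weights, lam), 2)}")
# lam = 0 reproduces ordinary least squares; as lam grows the estimate collapses onto the
# equal-weight (tallying) solution, illustrating the heuristics-as-extreme-priors claim.
```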

  10. Assessment of unskilled adults’ prior learning – fair to whom?

    DEFF Research Database (Denmark)

    Aarkrog, Vibe

    2014-01-01

    As in many other countries, Danish adult education policy focuses on how to encourage adults for education; the most important and challenging group of adults being those with few or no formal qualifications. Assessment of prior learning (APL) is perceived as an important tool for motivating adults … The paper discusses research that examined the meeting between, on the one hand, the adults' prior learning and, on the other, the school system and curricular standards. Applying a theoretical frame that includes concepts of communities of practice (Wenger), the development from novice to expert (Dreyfus & Dreyfus), and Bernstein's distinction between horizontal and vertical learning, the paper gives an account of the students' development in relation to assessment of their prior learning. The study includes a number of VET-programs. The paper focuses on one of them: Social and health care and clerical assistant. It addresses questions of what is a fair APL, perceived in relation to both the adults' knowing in practice...

  11. External Prior Guided Internal Prior Learning for Real-World Noisy Image Denoising

    Science.gov (United States)

    Xu, Jun; Zhang, Lei; Zhang, David

    2018-06-01

    Most existing image denoising methods learn image priors from either external data or the noisy image itself to remove noise. However, priors learned from external data may not be adaptive to the image to be denoised, while priors learned from the given noisy image may not be accurate due to the interference of corrupted noise. Meanwhile, the noise in real-world noisy images is very complex, which is hard to describe with simple distributions such as the Gaussian distribution, making real noisy image denoising a very challenging problem. We propose to exploit the information in both external data and the given noisy image, and develop an external prior guided internal prior learning method for real noisy image denoising. We first learn external priors from an independent set of clean natural images. With the aid of learned external priors, we then learn internal priors from the given noisy image to refine the prior model. The external and internal priors are formulated as a set of orthogonal dictionaries to efficiently reconstruct the desired image. Extensive experiments are performed on several real noisy image datasets. The proposed method demonstrates highly competitive denoising performance, outperforming state-of-the-art denoising methods including those designed for real noisy images.
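
    To convey the flavour of dictionary-based patch priors (not the authors' guided external/internal learning algorithm), the sketch below learns an orthogonal PCA dictionary from clean patches of a synthetic image and uses it to shrink transform coefficients of noisy patches; all images, sizes and thresholds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def extract_patches(img, p=8, stride=4):
    """Collect p-by-p patches on a regular grid and flatten them into rows."""
    H, W = img.shape
    return np.array([img[i:i + p, j:j + p].ravel()
                     for i in range(0, H - p + 1, stride)
                     for j in range(0, W - p + 1, stride)])

# "External" clean data: a smooth synthetic image.
xx, yy = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
clean = np.sin(6 * xx) + np.cos(4 * yy)

# Learn an orthogonal dictionary (PCA basis) from clean patches.
P = extract_patches(clean)
P = P - P.mean(axis=0)
_, _, Vt = np.linalg.svd(P, full_matrices=False)   # rows of Vt are the dictionary atoms

# Noisy image to denoise ("internal" data) and its patch coefficients in that dictionary.
noisy = clean + rng.normal(scale=0.3, size=clean.shape)
Q = extract_patches(noisy)
mean_Q = Q.mean(axis=0)
coeffs = (Q - mean_Q) @ Vt.T

# Shrink small coefficients (hard threshold) and reconstruct the patches.
coeffs[np.abs(coeffs) < 0.5] = 0.0
denoised_patches = coeffs @ Vt + mean_Q
print("mean squared error per patch:",
      np.mean((denoised_patches - extract_patches(clean)) ** 2))
```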

  12. Geophysical logging and thermal imaging near the Hemphill Road TCE National Priorities List Superfund site near Gastonia, North Carolina

    Science.gov (United States)

    Antolino, Dominick J.; Chapman, Melinda J.

    2017-03-27

    Borehole geophysical logs and thermal imaging data were collected by the U.S. Geological Survey near the Hemphill Road TCE (trichloroethylene) National Priorities List Superfund site near Gastonia, North Carolina, during August 2014 through February 2015. In an effort to assist the U.S. Environmental Protection Agency in the development of a conceptual groundwater model for the assessment of current contaminant distribution and future migration of contaminants, surface geological mapping and borehole geophysical log and thermal imaging data collection, which included the delineation of more than 600 subsurface features (primarily fracture orientations), was completed in five open borehole wells and two private supply bedrock wells. In addition, areas of possible groundwater discharge within a nearby creek downgradient of the study site were determined based on temperature differences between the stream and bank seepage using thermal imagery.

  13. Comprehensive Environmental Response, Compensation, and Liability Act, as amended by the Superfund Amendments and Reauthorization Act Section 120(e)(5)

    International Nuclear Information System (INIS)

    1992-05-01

    The US Department of Energy (DOE) is committed to conducting its operations in a safe and environmentally sound manner. High priorities for the Department are identifying and correcting environmental problems at DOE facilities that resulted from past operations, and preventing environmental problems from occurring during present and future operations. In this regard, the Department is committed to the 30-year goal of cleanup of all facilities by the year 2019. DOE has issued an Order and guidance establishing policy and procedures for activities conducted under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended by the Superfund Amendments and Reauthorization Act (SARA), and has developed a Five-Year Plan, updated annually, that integrates planning for corrective activities, environmental restoration, and waste management operations at its facilities. During Calendar Year 1991 and early 1992, DOE made significant progress in reaching agreements with regulatory entities, undertaking cleanup actions, and initiating preventive measures designed to eliminate future environmental problems. These accomplishments are described

  14. CERCLA and RCRA requirements affecting cleanup of a hazardous waste management unit at a Superfund site: A case study

    International Nuclear Information System (INIS)

    Walsh, T.J.

    1995-03-01

    The Fernald Environmental Management Project (FEMP) attempted to address both RCRA and CERCLA requirements at the fire training facility (FTF) by integrating a CERCLA removal action work plan with a RCRA closure plan. While the regulatory agencies involved with the FTF cleanup agreed the integrated document was a good idea, implementation proved complicated, owing to disposition of clean debris from a Superfund site, treatment of contaminated media, duration of cleanup activities, and cleanup certification. While not all the complications have been resolved, solutions to all have been proposed to Ohio EPA and U.S. EPA. Both agencies have worked closely with FEMP to find the most effective fulfillment of RCRA and CERCLA requirements

  15. An evaluation of remote sensing technologies for the detection of fugitive contamination at selected Superfund hazardous waste sites in Pennsylvania

    Science.gov (United States)

    Slonecker, E. Terrence; Fisher, Gary B.

    2014-01-01

    This evaluation was conducted to assess the potential for using both traditional remote sensing, such as aerial imagery, and emerging remote sensing technology, such as hyperspectral imaging, as tools for postclosure monitoring of selected hazardous waste sites. Sixteen deleted Superfund (SF) National Priorities List (NPL) sites in Pennsylvania were imaged with a Civil Air Patrol (CAP) Airborne Real-Time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) sensor between 2009 and 2012. Deleted sites are those sites that have been remediated and removed from the NPL. The imagery was processed to radiance and atmospherically corrected to relative reflectance with standard software routines using the Environment for Visualizing Imagery (ENVI, ITT–VIS, Boulder, Colorado) software. Standard routines for anomaly detection, endmember collection, vegetation stress, and spectral analysis were applied.

  16. Offending prior to first psychiatric contact

    DEFF Research Database (Denmark)

    Stevens, H; Agerbo, E; Dean, K

    2012-01-01

    There is a well-established association between psychotic disorders and subsequent offending, but the extent to which those who develop psychosis might have a prior history of offending is less clear. Little is known about whether the association between illness and offending exists in non-psychotic disorders. The aim of this study was to determine whether the association between mental disorder and offending is present prior to illness onset in psychotic and non-psychotic disorders.

  17. GENERAL ASPECTS REGARDING THE PRIOR DISCIPLINARY RESEARCH

    Directory of Open Access Journals (Sweden)

    ANDRA PURAN (DASCĂLU)

    2012-05-01

    Disciplinary research is the first phase of the disciplinary action. According to art. 251 paragraph 1 of the Labour Code, no disciplinary sanction may be ordered before performing the prior disciplinary research. These regulations provide one exception: the sanction of the written warning. The current regulations, retained from the old regulation, provide protection for employees against abuse by employers, since sanctions affect the salary or the position held, or even the development of the individual employment contract. Thus, prior research of the act that constitutes misconduct, before a disciplinary sanction is applied, is an essential condition for the validity of the measure ordered. Through this study we try to highlight some general issues concerning the characteristics, processes and effects of prior disciplinary research.

  18. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
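
    A minimal sketch of the kind of Bayesian measurement interpretation described here: a log-normal prior on the true result combined with a Gaussian measurement likelihood, evaluated on a grid. All parameter values are illustrative placeholders, not the Los Alamos settings.

```python
import numpy as np

# Grid of candidate true results (arbitrary activity units), excluding zero for the log.
x = np.linspace(0.0, 50.0, 5001)[1:]

# Log-normal prior: median 5, geometric standard deviation e^1 (illustrative placeholders).
mu, sigma = np.log(5.0), 1.0
prior = np.exp(-0.5 * ((np.log(x) - mu) / sigma) ** 2) / (x * sigma * np.sqrt(2 * np.pi))

# Gaussian measurement likelihood around the observed bioassay value (placeholders).
y_obs, meas_sd = 8.0, 3.0
likelihood = np.exp(-0.5 * ((y_obs - x) / meas_sd) ** 2)

posterior = prior * likelihood
posterior /= np.trapz(posterior, x)

post_mean = np.trapz(x * posterior, x)
print(f"posterior mean of the true result ~ {post_mean:.1f}")
```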

  19. Can natural selection encode Bayesian priors?

    Science.gov (United States)

    Ramírez, Juan Camilo; Marshall, James A R

    2017-08-07

    The evolutionary success of many organisms depends on their ability to make decisions based on estimates of the state of their environment (e.g., predation risk) from uncertain information. These decision problems have optimal solutions and individuals in nature are expected to evolve the behavioural mechanisms to make decisions as if using the optimal solutions. Bayesian inference is the optimal method to produce estimates from uncertain data, thus natural selection is expected to favour individuals with the behavioural mechanisms to make decisions as if they were computing Bayesian estimates in typically-experienced environments, although this does not necessarily imply that favoured decision-makers do perform Bayesian computations exactly. Each individual should evolve to behave as if updating a prior estimate of the unknown environment variable to a posterior estimate as it collects evidence. The prior estimate represents the decision-maker's default belief regarding the environment variable, i.e., the individual's default 'worldview' of the environment. This default belief has been hypothesised to be shaped by natural selection and represent the environment experienced by the individual's ancestors. We present an evolutionary model to explore how accurately Bayesian prior estimates can be encoded genetically and shaped by natural selection when decision-makers learn from uncertain information. The model simulates the evolution of a population of individuals that are required to estimate the probability of an event. Every individual has a prior estimate of this probability and collects noisy cues from the environment in order to update its prior belief to a Bayesian posterior estimate with the evidence gained. The prior is inherited and passed on to offspring. Fitness increases with the accuracy of the posterior estimates produced. Simulations show that prior estimates become accurate over evolutionary time. In addition to these 'Bayesian' individuals, we also
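
    The within-lifetime update assumed for each individual can be sketched as conjugate beta-binomial learning from binary cues, with the inherited Beta parameters standing in for the genetically encoded prior. The numbers below are illustrative and are not the published simulation settings.

```python
import numpy as np

rng = np.random.default_rng(2)

# One individual's lifetime: it inherits a Beta prior over an environmental probability
# (its genetically encoded "worldview") and updates it with noisy binary cues.
true_p = 0.7             # probability of the event in the current environment (made up)
alpha, beta = 2.0, 2.0   # inherited Beta prior parameters (made up)

for _ in range(20):      # cues collected during the individual's lifetime
    cue = rng.random() < true_p
    alpha += cue
    beta += 1 - cue

posterior_estimate = alpha / (alpha + beta)
print(f"posterior estimate of the event probability ~ {posterior_estimate:.2f}")
# In the evolutionary model, fitness increases with the accuracy of this estimate, so over
# generations selection tunes the inherited prior toward the ancestral environment.
```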

  20. Recognition of Prior Learning: The Participants' Perspective

    Science.gov (United States)

    Miguel, Marta C.; Ornelas, José H.; Maroco, João P.

    2016-01-01

    The current narrative on lifelong learning goes beyond formal education and training, including learning at work, in the family and in the community. Recognition of prior learning is a process of evaluation of those skills and knowledge acquired through life experience, allowing them to be formally recognized by the qualification systems. It is a…

  1. Validity in assessment of prior learning

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    2015-01-01

    … the article discusses the need for specific criteria for assessment. The reliability and validity of the assessment procedures depend on whether the competences are well-defined, and whether the teachers are adequately trained for the assessment procedures. Keywords: assessment, prior learning, adult education, vocational training, lifelong learning, validity.

  2. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy incorporating a nonlocal means induced (NLMi) prior (NLMi-MAP), which enables the use of global similarity information in the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and a few iterative reconstruction methods clearly demonstrate that the present NLMi-MAP method performs better in lowering noise, preserving image edges, and achieving a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
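
    Read as formulas, and hedged because the abstract does not spell out the exact reconstruction filter, the NLMi idea amounts to approximating the gradient of the prior energy by the residual between the current estimate and its NLM-filtered version, which can then be inserted into a generic one-step-late MAP-EM update:

```latex
% Hedged sketch: NLM-induced approximation of the prior-energy gradient, dropped into a
% generic one-step-late MAP-EM update. The paper's exact reconstruction filter may differ.
\[
  \frac{\partial U(x)}{\partial x_j}\bigg|_{x = x^{(n)}}
  \;\approx\; x_j^{(n)} - \mathrm{NLM}\bigl(x^{(n)}\bigr)_j ,
\]
\[
  x_j^{(n+1)}
  \;=\;
  \frac{x_j^{(n)}}{\sum_i a_{ij} + \beta\,\bigl(x_j^{(n)} - \mathrm{NLM}(x^{(n)})_j\bigr)}
  \sum_i a_{ij}\,\frac{y_i}{\sum_k a_{ik}\, x_k^{(n)}} ,
\]
% where y_i are the measured counts, a_{ij} the system matrix, beta the prior weight,
% and NLM(.) the nonlocal-means filter of the current image estimate.
```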

  3. Prior learning assessment and quality assurance practice ...

    African Journals Online (AJOL)

    The use of RPL (Recognition of Prior Learning) in higher education to assess RPL candidates for admission into programmes of study met with a lot of criticism from faculty academics. Lecturers viewed the possibility of admitting large numbers of under-qualified adult learners as a threat to the institution's reputation, or an ...

  4. Action priors for learning domain invariances

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2015-04-01

    … behavioural invariances in the domain, by identifying actions to be prioritised in local contexts, invariant to task details. This information has the effect of greatly increasing the speed of solving new problems. We formalise this notion as action priors...

  5. An evaluation of traditional and emerging remote sensing technologies for the detection of fugitive contamination at selected Superfund hazardous waste sites

    Science.gov (United States)

    Slonecker, E. Terrence; Fisher, Gary B.

    2011-01-01

    This report represents a remote sensing research effort conducted by the U.S. Geological Survey in cooperation with the U.S. Environmental Protection Agency (EPA) for the EPA Office of Inspector General. The objective of this investigation was to explore the efficacy of remote sensing as a technology for postclosure monitoring of hazardous waste sites as defined under the Comprehensive Environmental Response Compensation and Liability Act of 1980 (Public Law 96-510, 42 U.S.C. §9601 et seq.), also known as "Superfund." Five delisted Superfund sites in Maryland and Virginia were imaged with a hyperspectral sensor and visited for collection of soil, water, and spectral samples and inspection of general site conditions. This report evaluates traditional and hyperspectral imagery and field spectroscopic measurement techniques in the characterization and analysis of fugitive (anthropogenic, uncontrolled) contamination at previously remediated hazardous waste disposal sites.

  6. Random template placement and prior information

    International Nuclear Information System (INIS)

    Roever, Christian

    2010-01-01

    In signal detection problems, one is usually faced with the task of searching a parameter space for peaks in the likelihood function which indicate the presence of a signal. Random searches have proven to be very efficient as well as easy to implement, compared e.g. to searches along regular grids in parameter space. Knowledge of the parameterised shape of the signal searched for adds structure to the parameter space, i.e., there are usually regions that require a dense search while in other regions a coarser search is sufficient. On the other hand, prior information identifies the regions in which a search will actually be promising or may likely be in vain. Defining specific figures of merit allows one to combine both template metric and prior distribution and devise optimal sampling schemes over the parameter space. We show an example related to the gravitational wave signal from a binary inspiral event. Here the template metric and prior information are particularly contradictory, since signals from low-mass systems tolerate the least mismatch in parameter space while high-mass systems are far more likely, as they imply a greater signal-to-noise ratio (SNR) and hence are detectable to greater distances. The derived sampling strategy is implemented in a Markov chain Monte Carlo (MCMC) algorithm where it improves convergence.
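
    A minimal sketch of random template placement from a blended density is given below; the particular metric density, prior, and blending exponents are invented stand-ins for the paper's figures of merit, chosen only to show the mechanics of sampling templates from a combined weight.

```python
import numpy as np

rng = np.random.default_rng(4)

m = np.linspace(1.0, 50.0, 2000)               # a mass-like parameter (illustrative)

metric_density = 1.0 / m ** 2                  # low-mass signals need denser coverage (assumed form)
prior = m ** 2 / np.trapz(m ** 2, m)           # heavier systems detectable to larger distances (assumed form)

weight = metric_density ** 0.5 * prior ** 0.5  # one possible compromise figure of merit (assumed)
weight /= np.trapz(weight, m)

# Draw template locations by inverse-transform sampling from the blended density.
cdf = np.cumsum(weight)
cdf /= cdf[-1]
templates = np.interp(rng.random(500), cdf, m)
print(templates[:10])
```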

  7. Temporal Chemical Data for Sediment, Water, and Biological Samples from the Lava Cap Mine Superfund Site, Nevada County, California-2006-2008

    Science.gov (United States)

    Foster, Andrea L.; Ona-Nguema, Georges; Tufano, Kate; White, Richard III

    2010-01-01

    … the possibility of future movement of tailings, and began an assessment of the risks posed by physical and chemical hazards at the site. The EPA's assessment identified arsenic (As) as the primary hazard of concern. Three main exposure routes were identified: inhalation/ingestion of mine tailings, dermal absorption/ingestion of As in lake water from swimming, and ingestion of As-contaminated ground water or surface water. Lost Lake is a private lake which is completely surrounded by low-density residential development. Prior to the dam failure, the lake was used by the local residents for swimming and boating. An estimated 1,776 people reside within one mile of the lake, and almost all residents of the area use potable groundwater for domestic use. Risk factors for human exposure to As derived from mine wastes were high enough to merit placement of the mine site and surrounding area on the National Priority List (commonly called "Superfund"). The Lava Cap Mine Superfund site (LCMS) encompasses approximately 33 acres that include the mine site, the stretch of Little Clipper Creek between the mine and Lost Lake, the lake itself, and the area between the lake and the confluence of Little Clipper Creek with its parent stream, Clipper Creek. The area between the two creeks is named the "deposition area" due to the estimated 24 m thick layer of tailings that were laid down there during and after active mining. The lobate structure of Lost Lake is also due to deposition in this area. The deposition area and Lost Lake are together estimated to contain 382,277 m3 of tailings. The primary goals of the EPA have been to minimize tailings movement downstream of Lost Lake and to ensure that residents in the area have drinking water that meets national water quality standards. EPA has officially decided to construct a public water supply line to deliver safe water to affected residences, since some residential wells in the area have As concentrations above the curr

  8. Hydrogeologic investigation of the Malvern TCE Superfund Site, Chester County, Pennsylvania

    Science.gov (United States)

    Sloto, Ronald A.

    1997-01-01

    The Malvern TCE Superfund Site, a former solvent recycling facility that now stores and sells solvents, consists of a plant and disposal area, which are approximately 1,900 ft (feet) apart. The site is underlain by an unconfined carbonate bedrock aquifer in which permeability has been enhanced in places by solution. Water levels respond quickly to precipitation and show a similar seasonal variation, response to precipitation, and range of fluctuation. The altitude of water levels in wells at the disposal area is nearly identical because of the small hydraulic gradient. A comparison of water-table maps for 1983, 1993, and 1994 shows that the general shape of the water table and hydraulic gradients in the area have remained the same through time and for different climatic conditions.The plant area is underlain by dolomite of the Elbrook Formation. The dolomite at the plant area does not yield as much water as the dolomite at the disposal area because it is less fractured, and wells penetrate few water-bearing fractures. Yields of nine wells at the plant area range from 1 to 200 gal/min (gallons per minute); the median yield is 6 gal/min. Specific capacities range from 0.08 to 2 (gal/min)/ft (gallons per minute per foot). Aquifer tests were conducted in two wells; median transmissivities estimated from the aquifer-test data ranged from 528 to 839 feet squared per day. Maximum concentrations of volatile organic compounds (VOC's) in ground water at the plant area in 1996 were 53,900 ug/L (micrograms per liter) for trichloroethylene (TCE), 7,110 ug/L for tetrachloroethylene (PCE), and 17,700 ug/L for 1,1,1-trichloroethane (TCA).A ground-water divide is located between the plant area and the disposal area. Ground-water withdrawal for dewatering the Catanach quarry has caused a cone of depression in the water-table surface that reaches to the plant area. From the plant area, ground water flows 1.2 miles to the northeast and discharges to the Catanach quarry. The regional

  9. Aquatic assessment of the Pike Hill Copper Mine Superfund site, Corinth, Vermont

    Science.gov (United States)

    Piatak, Nadine M.; Argue, Denise M.; Seal, Robert R.; Kiah, Richard G.; Besser, John M.; Coles, James F.; Hammarstrom, Jane M.; Levitan, Denise M.; Deacon, Jeffrey R.; Ingersoll, Christopher G.

    2013-01-01

    The Pike Hill Copper Mine Superfund site in Corinth, Orange County, Vermont, includes the Eureka, Union, and Smith mines along with areas of downstream aquatic ecosystem impairment. The site was placed on the U.S. Environmental Protection Agency (USEPA) National Priorities List in 2004. The mines, which operated from about 1847 to 1919, contain underground workings, foundations from historical structures, several waste-rock piles, and some flotation tailings. The mine site is drained to the northeast by Pike Hill Brook, which includes several wetland areas, and to the southeast by an unnamed tributary that flows to the south and enters Cookville Brook. Both brooks eventually drain into the Waits River, which flows into the Connecticut River. The aquatic ecosystem at the site was assessed using a variety of approaches that investigated surface-water quality, sediment quality, and various ecological indicators of stream-ecosystem health. The degradation of surface-water quality is caused by elevated concentrations of copper, and to a lesser extent cadmium, with localized effects caused by aluminum, iron, and zinc. Copper concentrations in surface waters reached or exceeded the USEPA national recommended chronic water-quality criteria for the protection of aquatic life in all of the Pike Hill Brook sampling locations except for the location farthest downstream, in half of the locations sampled in the tributary to Cookville Brook, and in about half of the locations in one wetland area located in Pike Hill Brook. Most of these same locations also contained concentrations of cadmium that exceeded the chronic water-quality criteria. In contrast, surface waters at background sampling locations were below these criteria for copper and cadmium. Comparison of hardness-based and Biotic Ligand Model (BLM)-based criteria for copper yields similar results with respect to the extent or number of stations impaired for surface waters in the affected area. However, the BLM

  10. Department of Energy Defense Programs Environmental Restoration Program update

    International Nuclear Information System (INIS)

    Lehr, J.C.; Eyman, L.D.; Thompson, W.W. Jr.

    1989-01-01

    Federal facilities are under increasing pressure to remediate inactive hazardous waste sites and associated off-site areas. The Superfund Amendments and Reauthorization Act federal facilities provision requires that the Environmental Protection Agency establish a public docket to list all federal sites contaminated by hazardous wastes or substances and to monitor the progress of investigations and cleanups against an established schedule. In addition, the Resource Conservation and Recovery Act requires that operating permits for hazardous waste treatment, storage, and disposal facilities be issued only upon binding agreements that identify specific schedules for corrective action for all hazardous waste releases that have or are occurring at the facility. Defense Programs (DP) must make remedial actions integral to its mission. Environmental cleanups are given increased emphasis with the new regulations/laws providing the right to private citizens and the states to sue to enforce these statutes and schedule commitments. 1 fig., 2 tabs

  11. 78 FR 73525 - Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) or Superfund...

    Science.gov (United States)

    2013-12-06

    ... Conservation and Recovery Act (RCRA). Many state programs also offer accompanying financial incentive programs... states and tribes that have the management and administrative capacity within their government required... identify the institutional controls relied on in the remedy and include relevant information concerning the...

  12. Influence of a chlor-alkali superfund site on mercury bioaccumulation in periphyton and low-trophic level fauna

    Science.gov (United States)

    Buckman, Kate L.; Marvin-DiPasquale, Mark C.; Taylor, Vivien F.; Chalmers, Ann T.; Broadley, Hannah J.; Agee, Jennifer L.; Jackson, Brian P.; Chen, Celia Y.

    2015-01-01

    In Berlin, New Hampshire, USA, the Androscoggin River flows adjacent to a former chlor-alkali facility that is a US Environmental Protection Agency Superfund site and source of mercury (Hg) to the river. The present study was conducted to determine the fate and bioaccumulation of methylmercury (MeHg) to lower trophic-level taxa in the river. Surface sediment directly adjacent to the source showed significantly elevated MeHg (10–40× increase, mean ± standard deviation [SD]: 20.1 ± 24.8 ng/g dry wt) and total mercury (THg; 10–30× increase, mean ± SD: 2045 ± 2669 ng/g dry wt) compared with all other reaches, with sediment THg and MeHg from downstream reaches elevated (3–7× on average) relative to the reference (THg mean ± SD: 33.5 ± 9.33 ng/g dry wt; MeHg mean ± SD: 0.52 ± 0.21 ng/g dry wt). Water column THg concentrations adjacent to the point source for both particulate (0.23 ng/L) and dissolved (0.76 ng/L) fractions were 5-fold higher than at the reference sites, and 2-fold to 5-fold higher than downstream. Methylmercury production potential of periphyton material was highest (2–9 ng/g/d dry wt) adjacent to the Superfund site; other reaches were close to or below reporting limits (0.1 ng/g/d dry wt). Total Hg and MeHg bioaccumulation in fauna was variable across sites and taxa, with no clear spatial patterns downstream of the contamination source. Crayfish, mayflies, and shiners showed a weak positive relationship with porewater MeHg concentration.

  13. Sparse Multivariate Modeling: Priors and Applications

    DEFF Research Database (Denmark)

    Henao, Ricardo

    This thesis presents a collection of statistical models that attempt to take advantage of every piece of prior knowledge available to provide the models with as much structure as possible. The main motivation for introducing these models is interpretability since in practice we want to be able … a general yet self-contained description of every model in terms of generative assumptions, interpretability goals, probabilistic formulation and target applications. Case studies, benchmark results and practical details are also provided as appendices published elsewhere, containing reprints of peer...

  14. Genome position specific priors for genomic prediction

    DEFF Research Database (Denmark)

    Brøndum, Rasmus Froberg; Su, Guosheng; Lund, Mogens Sandø

    2012-01-01

    causal mutation is different between the populations but affects the same gene. Proportions of a four-distribution mixture for SNP effects in segments of fixed size along the genome are derived from one population and set as location specific prior proportions of distributions of SNP effects...... for the target population. The model was tested using dairy cattle populations of different breeds: 540 Australian Jersey bulls, 2297 Australian Holstein bulls and 5214 Nordic Holstein bulls. The traits studied were protein-, fat- and milk yield. Genotypic data were Illumina 777K SNPs, real or imputed. Results......

  15. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    The national policies for the education/training of adults are in the 21st century highly influenced by proposals formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past...... would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to award credit to students who came with experience from working life....

  16. Depth image enhancement using perceptual texture priors

    Science.gov (United States)

    Bang, Duhyeon; Shim, Hyunjung

    2015-03-01

    A depth camera is widely used in various applications because it provides a depth image of the scene in real time. However, due to its limited power consumption, the depth camera produces severe noise and cannot provide high-quality 3D data. Although a smoothness prior is often employed to suppress the depth noise, it discards geometric details, degrading the distance resolution and hindering realism in 3D contents. In this paper, we propose a perceptual depth image enhancement technique that automatically recovers the depth details of various textures, using a statistical framework inspired by the human mechanism of perceiving surface details from texture priors. We construct a database composed of high-quality normals. Based on recent studies in human visual perception (HVP), we select pattern density as the primary feature to classify textures. Using the classification results, we match and substitute the noisy input normals with high-quality normals from the database. As a result, our method provides a high-quality depth image that preserves surface details. We expect that our work is effective in enhancing the details of depth images from 3D sensors and in providing a high-fidelity virtual reality experience.

  17. The design and construction of large diameter pre-filter packed recovery wells at the Ninth Avenue Superfund Site

    International Nuclear Information System (INIS)

    Lombardo, S.L.; Maley, T.J.; Bono, B.A.

    1992-01-01

    Large diameter groundwater/oil recovery wells were installed in an unconfined sand aquifer at the Ninth Avenue Superfund Site in Gary, Indiana. To assure adequate filter packs, prefilter packed groundwater/oil recovery wells were selected to minimize silting by using appropriate screen slot size and filter pack. A properly sized filter pack was necessary to prevent the formation material from entering the well. During field drilling operations, "having sands" and silting of existing wells were encountered. By using sieve analyses of the native aquifer soil, described by Driscoll (1989), the filter pack and screen slot size were selected. Prefilter packed well screens were selected for this site to assure the presence of a uniform filter pack, thus minimizing siltation in the wells. A prefilter packed well screen consists of a double screen with the interstitial space filled with granular filter pack material designed specifically for site conditions. These wells provide an adequate filter pack without the need to add additional filter pack material outside the well screen. Wells were installed using 12 1/4 inch ID hollow stem augers. This methodology is EPA-approved, expeditious, and inexpensive. Because Level B personal protective equipment was required during installation, the short drilling time and lack of circulation fluids offered by hollow stem drilling were particular advantages. The 14 recovery wells were successfully installed in 14 days using the hollow stem auger drilling technique. Observations during well development revealed little or no silt present in purged groundwater

  18. Extended Linear Models with Gaussian Priors

    DEFF Research Database (Denmark)

    Quinonero, Joaquin

    2002-01-01

    In extended linear models the input space is projected onto a feature space by means of an arbitrary non-linear transformation. A linear model is then applied to the feature space to construct the model output. The dimension of the feature space can be very large, or even infinite, giving the model...... a very big flexibility. Support Vector Machines (SVM's) and Gaussian processes are two examples of such models. In this technical report I present a model in which the dimension of the feature space remains finite, and where a Bayesian approach is used to train the model with Gaussian priors...... on the parameters. The Relevance Vector Machine, introduced by Tipping, is a particular case of such a model. I give the detailed derivations of the expectation-maximisation (EM) algorithm used in the training. These derivations are not found in the literature, and might be helpful for newcomers....
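
    The report's EM derivations are not reproduced in this listing, but the basic construction it describes, a linear model over nonlinearly projected features with a Gaussian prior on the weights, can be sketched in a few lines. The following minimal example is an assumed setup (the RBF feature map, hyperparameter values, and toy data are illustrative, not the report's code) showing the closed-form Gaussian posterior over weights and the resulting predictive mean and variance:

```python
# Minimal sketch (assumed setup, not the report's derivation): a Bayesian extended
# linear model. Inputs are projected onto RBF basis functions (the "feature space"),
# and a zero-mean Gaussian prior on the weights gives a closed-form Gaussian posterior.
import numpy as np

def rbf_features(x, centres, width=0.1):
    """Project 1-D inputs onto Gaussian basis functions."""
    d2 = (x[:, None] - centres[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2))

def posterior_weights(Phi, y, alpha=1.0, beta=25.0):
    """Posterior w ~ N(m, S) for prior N(0, alpha^-1 I) and noise precision beta."""
    S = np.linalg.inv(alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi)
    m = beta * S @ Phi.T @ y
    return m, S

# Toy regression data: a noisy sine wave.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 40)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.2, 40)

centres = np.linspace(0.0, 1.0, 12)
Phi = rbf_features(x, centres)
m, S = posterior_weights(Phi, y)

# Predictive mean and variance at a few new points (1/beta is the noise variance).
x_new = np.linspace(0.0, 1.0, 5)
Phi_new = rbf_features(x_new, centres)
pred_mean = Phi_new @ m
pred_var = 1.0 / 25.0 + np.einsum("ij,jk,ik->i", Phi_new, S, Phi_new)
print(np.round(pred_mean, 2), np.round(pred_var, 3))
```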

  19. Savings for visuomotor adaptation require prior history of error, not prior repetition of successful actions.

    Science.gov (United States)

    Leow, Li-Ann; de Rugy, Aymar; Marinovic, Welber; Riek, Stephan; Carroll, Timothy J

    2016-10-01

    When we move, perturbations to our body or the environment can elicit discrepancies between predicted and actual outcomes. We readily adapt movements to compensate for such discrepancies, and the retention of this learning is evident as savings, or faster readaptation to a previously encountered perturbation. The mechanistic processes contributing to savings, or even the necessary conditions for savings, are not fully understood. One theory suggests that savings requires increased sensitivity to previously experienced errors: when perturbations evoke a sequence of correlated errors, we increase our sensitivity to the errors experienced, which subsequently improves error correction (Herzfeld et al. 2014). An alternative theory suggests that a memory of actions is necessary for savings: when an action becomes associated with successful target acquisition through repetition, that action is more rapidly retrieved at subsequent learning (Huang et al. 2011). In the present study, to better understand the necessary conditions for savings, we tested how savings is affected by prior experience of similar errors and prior repetition of the action required to eliminate errors using a factorial design. Prior experience of errors induced by a visuomotor rotation in the savings block was either prevented at initial learning by gradually removing an oppositely signed perturbation or enforced by abruptly removing the perturbation. Prior repetition of the action required to eliminate errors in the savings block was either deprived or enforced by manipulating target location in preceding trials. The data suggest that prior experience of errors is both necessary and sufficient for savings, whereas prior repetition of a successful action is neither necessary nor sufficient for savings. Copyright © 2016 the American Physiological Society.

  20. Effects of changes in pumping on regional groundwater-flow paths, 2005 and 2010, and areas contributing recharge to discharging wells, 1990–2010, in the vicinity of North Penn Area 7 Superfund site, Montgomery County, Pennsylvania

    Science.gov (United States)

    Senior, Lisa A.; Goode, Daniel J.

    2017-06-06

    A previously developed regional groundwater flow model was used to simulate the effects of changes in pumping rates on groundwater-flow paths and extent of recharge discharging to wells for a contaminated fractured bedrock aquifer in southeastern Pennsylvania. Groundwater in the vicinity of the North Penn Area 7 Superfund site, Montgomery County, Pennsylvania, was found to be contaminated with organic compounds, such as trichloroethylene (TCE), in 1979. At the time contamination was discovered, groundwater from the underlying fractured bedrock (shale) aquifer was the main source of supply for public drinking water and industrial use. As part of technical support to the U.S. Environmental Protection Agency (EPA) during the Remedial Investigation of the North Penn Area 7 Superfund site from 2000 to 2005, the U.S. Geological Survey (USGS) developed a model of regional groundwater flow to describe changes in groundwater flow and contaminant directions as a result of changes in pumping. Subsequently, large decreases in TCE concentrations (as much as 400 micrograms per liter) were measured in groundwater samples collected by the EPA from selected wells in 2010 compared to 2005‒06 concentrations. To provide insight on the fate of potentially contaminated groundwater during the period of generally decreasing pumping rates from 1990 to 2010, steady-state simulations were run using the previously developed groundwater-flow model for two conditions prior to extensive remediation, 1990 and 2000, two conditions subsequent to some remediation, 2005 and 2010, and a No Pumping case, representing pre-development or cessation of pumping conditions. The model was used to (1) quantify the amount of recharge, including potentially contaminated recharge from sources near the land surface, that discharged to wells or streams and (2) delineate the areas contributing recharge that discharged to wells or streams for the five conditions. In all simulations, groundwater divides differed from

  1. RCRA and CERCLA requirements affecting cleanup activities at a federal facility superfund site

    International Nuclear Information System (INIS)

    Walsh, T.J.

    1994-01-01

    The Fernald Environmental Management Project (FEMP) achieved success on an integrated groundwater monitoring program which addressed both RCRA and CERCLA requirements. The integrated plan resulted in a cost savings of approximately $2.6 million. At present, the FEMP is also working on an integrated closure process to address Hazardous Waste Management Units (HWMUs) at the site. To date, Ohio EPA seems willing to discuss an integrated program with some stipulations. If an integrated program is implemented, a cost savings of several million dollars will be realized since the CERCLA documents can be used in place of a RCRA closure plan. The success of an integrated program at the FEMP is impossible without the support of DOE and the regulators. Since DOE is an owner/operator of the facility and Ohio EPA regulates hazardous waste management activities at the FEMP, both parties must be satisfied with the proposed integration activities. Similarly, US EPA retains CERCLA authority over the site along with a signed consent agreement with DOE, which dictates the schedule of the CERCLA activities. Another federal facility used RCRA closure plans to satisfy CERCLA activities. This federal facility was in a different US EPA Region than the FEMP. While this approach was successful for this site, an integrated approach was required at the FEMP because of the signed Consent Agreement and Consent Decree. For federal facilities which have a large number of HWMUs along with OUs, an integrated approach may result in a timely and cost-effective cleanup

  2. Putting Priors in Mixture Density Mercer Kernels

    Science.gov (United States)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using predefined kernels. These data adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.
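
    The AUTOBAYES-generated code is not shown in the abstract, so the sketch below only illustrates the general mixture-density-kernel idea in broad strokes: fit several mixture models (here on bootstrap resamples, using scikit-learn's GaussianMixture for convenience, which is an assumption rather than the paper's implementation) and define k(x, y) as the averaged probability that x and y fall in the same mixture component, which yields a symmetric positive semi-definite kernel:

```python
# Rough sketch of the mixture-density-kernel idea (not the AUTOBAYES-generated code):
# fit several Gaussian mixture models on bootstrap resamples of the data, then define
# k(x, y) as the average probability that x and y fall in the same mixture component.
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_density_kernel(X, X_fit, n_models=5, n_components=3, seed=0):
    rng = np.random.default_rng(seed)
    resp = []
    for m in range(n_models):
        idx = rng.integers(0, len(X_fit), len(X_fit))            # bootstrap resample
        gmm = GaussianMixture(n_components=n_components, random_state=m).fit(X_fit[idx])
        resp.append(gmm.predict_proba(X))                        # P(component | x) per model
    # k(x, y) = (1/M) * sum_m sum_c P_m(c|x) P_m(c|y): symmetric and positive semi-definite.
    return sum(R @ R.T for R in resp) / n_models

X_fit = np.random.default_rng(1).normal(size=(200, 2))           # toy unlabeled data
K = mixture_density_kernel(X_fit[:10], X_fit)
print(K.shape, np.allclose(K, K.T))                              # (10, 10) True
```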

  3. Prior expectations facilitate metacognition for perceptual decision.

    Science.gov (United States)

    Sherman, M T; Seth, A K; Barrett, A B; Kanai, R

    2015-09-01

    The influential framework of 'predictive processing' suggests that prior probabilistic expectations influence, or even constitute, perceptual contents. This notion is evidenced by the facilitation of low-level perceptual processing by expectations. However, whether expectations can facilitate high-level components of perception remains unclear. We addressed this question by considering the influence of expectations on perceptual metacognition. To isolate the effects of expectation from those of attention we used a novel factorial design: expectation was manipulated by changing the probability that a Gabor target would be presented; attention was manipulated by instructing participants to perform or ignore a concurrent visual search task. We found that, independently of attention, metacognition improved when yes/no responses were congruent with expectations of target presence/absence. Results were modeled under a novel Bayesian signal detection theoretic framework which integrates bottom-up signal propagation with top-down influences, to provide a unified description of the mechanisms underlying perceptual decision and metacognition. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Washing of waste prior to landfilling.

    Science.gov (United States)

    Cossu, Raffaello; Lai, Tiziana

    2012-05-01

    The main impact produced by landfills is represented by the release of leachate emissions. Waste washing treatment has been investigated to evaluate its efficiency in reducing the waste leaching fraction prior to landfilling. The results of laboratory-scale washing tests applied to several significant residues from integrated management of solid waste are presented in this study, specifically: non-recyclable plastics from source separation, mechanical-biological treated municipal solid waste and a special waste, automotive shredded residues. Results obtained demonstrate that washing treatment contributes towards combating the environmental impacts of raw wastes. Accordingly, a leachate production model was applied, leading to the consideration that the concentrations of chemical oxygen demand (COD) and total Kjeldahl nitrogen (TKN), parameters of fundamental importance in the characterization of landfill leachate, from a landfill containing washed wastes, are comparable to those that would only be reached between 90 and 220 years later in the presence of raw wastes. The findings obtained demonstrated that washing of waste may represent an effective means of reducing the leachable fraction resulting in a consequent decrease in landfill emissions. Further studies on pilot scale are needed to assess the potential for full-scale application of this treatment. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Pitch perception prior to cortical maturation

    Science.gov (United States)

    Lau, Bonnie K.

    Pitch perception plays an important role in many complex auditory tasks including speech perception, music perception, and sound source segregation. Because of the protracted and extensive development of the human auditory cortex, pitch perception might be expected to mature, at least over the first few months of life. This dissertation investigates complex pitch perception in 3-month-olds, 7-month-olds and adults -- time points when the organization of the auditory pathway is distinctly different. Using an observer-based psychophysical procedure, a series of four studies were conducted to determine whether infants (1) discriminate the pitch of harmonic complex tones, (2) discriminate the pitch of unresolved harmonics, (3) discriminate the pitch of missing fundamental melodies, and (4) have comparable sensitivity to pitch and spectral changes as adult listeners. The stimuli used in these studies were harmonic complex tones, with energy missing at the fundamental frequency. Infants at both three and seven months of age discriminated the pitch of missing fundamental complexes composed of resolved and unresolved harmonics as well as missing fundamental melodies, demonstrating perception of complex pitch by three months of age. More surprisingly, infants in both age groups had lower pitch and spectral discrimination thresholds than adult listeners. Furthermore, no differences in performance on any of the tasks presented were observed between infants at three and seven months of age. These results suggest that subcortical processing is not only sufficient to support pitch perception prior to cortical maturation, but provides adult-like sensitivity to pitch by three months.

  6. Febrile seizures prior to sudden cardiac death

    DEFF Research Database (Denmark)

    Stampe, Niels Kjær; Glinge, Charlotte; Jabbari, Reza

    2018-01-01

    Aims: Febrile seizure (FS) is a common disorder affecting 2-5% of children up to 5 years of age. The aim of this study was to determine whether FS in early childhood are over-represented in young adults dying from sudden cardiac death (SCD). Methods and results: We included all deaths (n = 4595) nationwide and, through review of all death certificates, we identified 245 SCD in Danes aged 1-30 years in 2000-09. Through the usage of nationwide registries, we identified all persons admitted with first FS among SCD cases (14/245; 5.7%) and in the corresponding living Danish population (71 027/2 369 785...)... with FS was sudden arrhythmic death syndrome (5/8; 62.5%). Conclusion: In conclusion, this study demonstrates a significant two-fold increase in the frequency of FS prior to death in young SCD cases compared with the two control groups, suggesting that FS could potentially contribute in a risk......

  7. Effect of Prior Health-Related Employment on the Registered Nurse Workforce Supply.

    Science.gov (United States)

    Yoo, Byung-kwan; Lin, Tzu-chun; Kim, Minchul; Sasaki, Tomoko; Spetz, Joanne

    2016-01-01

    Registered nurses (RN) who held prior health-related employment in occupations other than licensed practical or vocational nursing (LPN/LVN) are reported to have increased rapidly in the past decades. Researchers examined whether prior health-related employment affects RN workforce supply. A cross-sectional bivariate probit model using the 2008 National Sample Survey of Registered Nurses was estimated. Prior health-related employment in relatively lower-wage occupations, such as allied health, clerk, or nursing aide, was positively associated with working as an RN. Prior health-related employment in relatively higher-wage categories, such as a health care manager or LPN/LVN, was positively associated with working full-time as an RN. Policy implications are to promote an expanded career ladder program and a nursing school admission policy that targets non-RN health care workers with an interest in becoming RNs.

  8. Relating Magnetic Parameters to Heavy Metal Concentrations and Environmental Factors at Formosa Mine Superfund Site, Douglas County, OR

    Science.gov (United States)

    Upton, T. L.

    2016-12-01

    Advances in the field of environmental magnetism have led to exciting new applications for this field. Magnetic minerals are ubiquitous in the environment and tend to have an affinity for heavy metals. Hence, it has been demonstrated that magnetic properties are often significantly related to concentrations of heavy metals and other pollutants. As a result, magnetic techniques have been used as a proxy for determining hot spots of several types of pollution produced from a diversity of anthropogenic sources. Magnetic measurements are non-destructive and relatively inexpensive compared to geochemical analyses. The utility of environmental magnetic methods varies widely depending on biological, chemical and physical processes that create and transform soils and sediments. Applications in the direction of mapping heavy metals have been studied and shown to be quite useful in countries such as China and India but to date, little research has been done in the US. As such, there is a need to expand the scope of research to a wider range of soil types and land uses, especially within the US. This study investigates the application of environmental magnetic techniques to mapping of heavy metal concentrations at the Formosa Mine Superfund Site, an abandoned mine about 25 miles southwest of Roseburg, OR. The soils and sediment at this site are derived from pyrite-rich bedrock which is weak in terms of magnetic susceptibility. Using hotspot analysis, correlation and cluster analyses, interactions between metals and magnetic parameters are investigated in relation to environmental factors such as proximity to seeps and adits. Preliminary results suggest significant correlation of magnetic susceptibility with certain heavy metals, signifying that magnetic methods may be useful in mapping heavy metal hotspots at this site. Further analysis examines how land use differences relate to the magnetic signatures obtained throughout the Cow Creek watershed.

  9. Stakeholder value-linked sustainability assessment: Evaluating remedial alternatives for the Portland Harbor Superfund Site, Portland, Oregon, USA.

    Science.gov (United States)

    Apitz, Sabine E; Fitzpatrick, Anne G; McNally, Amanda; Harrison, David; Coughlin, Conor; Edwards, Deborah A

    2018-01-01

    Regulatory decisions on remediation should consider affected communities' needs and values, and how these might be impacted by remedial options; this process requires that diverse stakeholders are able to engage in a transparent consideration of value trade-offs and of the distribution of risks and benefits associated with remedial actions and outcomes. The Stakeholder Values Assessment (SVA) tool was developed to evaluate remedial impacts on environmental quality, economic viability, and social equity in the context of stakeholder values and priorities. Stakeholder values were linked to the pillars of sustainability and also to a range of metrics to evaluate how sediment remediation affects these values. Sediment remedial alternatives proposed by the US Environmental Protection Agency (USEPA) for the Portland Harbor Superfund Site were scored for each metric, based upon data provided in published feasibility study (FS) documents. Metric scores were aggregated to generate scores for each value; these were then aggregated to generate scores for each pillar of sustainability. In parallel, the inferred priorities (in terms of regional remediation, restoration, planning, and development) of diverse stakeholder groups (SGs) were used to evaluate the sensitivity and robustness of the values-based sustainability assessment to diverse SG priorities. This approach, which addresses social indicators of impact and then integrates them with indicators of environmental and economic impacts, goes well beyond the Comprehensive Environmental Response, Compensation and Liability Act's (CERCLA) 9 criteria for evaluating remedial alternatives because it evaluates how remedial alternatives might be ranked in terms of the diverse values and priorities of stakeholders. This approach identified trade-offs and points of potential contention, providing a systematic, semiquantitative, transparent valuation tool that can be used in community engagement. Integr Environ Assess Manag 2018

  10. "The Teacher Is an Octopus": Uncovering Preservice English Language Teachers' Prior Beliefs through Metaphor Analysis

    Science.gov (United States)

    Farrell, Thomas S. C.

    2006-01-01

    Preservice teachers come to any teacher education course with prior experiences, knowledge and beliefs about learning and teaching. Additionally, the belief systems of preservice teachers often serve as a lens through which they view the content of the teacher education program. Consequently, it is essential that teacher educators take these prior…

  11. Racial/Ethnic Differences in Dietary Intake among WIC Families Prior to Food Package Revisions

    Science.gov (United States)

    Kong, Angela; Odoms-Young, Angela M.; Schiffer, Linda A.; Berbaum, Michael L.; Porter, Summer J.; Blumstein, Lara; Fitzgibbon, Marian L.

    2013-01-01

    Objective: To compare the diets of African American and Hispanic families in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) prior to the 2009 food package revisions. Methods: Mother-child dyads were recruited from 12 WIC sites in Chicago, IL. Individuals with 1 valid 24-hour recall were included in the analyses…

  12. Moving the Starting Line through Prior Learning Assessment (PLA). Research Brief

    Science.gov (United States)

    Council for Adult and Experiential Learning (NJ1), 2011

    2011-01-01

    Prior learning assessment (PLA) methods can help adult students earn college credit for what they already know. PLA can be an important offering by postsecondary degree programs because it can save students time and money. In addition, the Council for Adult and Experiential Learning's (CAEL's) "Fueling the Race to Postsecondary Success"…

  13. Prior Mental Fatigue Impairs Marksmanship Decision Performance

    Directory of Open Access Journals (Sweden)

    James Head

    2017-09-01

    Full Text Available Purpose: Mental fatigue has been shown to impair subsequent physical performance in continuous and discontinuous exercise. However, its influence on subsequent fine-motor performance in an applied setting (e.g., marksmanship) for trained soldiers is relatively unknown. The purpose of this study was to investigate whether prior mental fatigue influences subsequent marksmanship performance as measured by shooting accuracy and judgment of soldiers in a live-fire scenario. Methods: Twenty trained infantry soldiers engaged targets after completing either a mental fatigue or control intervention in a repeated measures design. Heart rate variability and the NASA-TLX were used to gauge physiological and subjective effects of the interventions. Target hit proportion, projectile group accuracy, and precision were used to measure marksmanship accuracy. Marksmanship accuracy was assessed by measuring bullet group accuracy (i.e., how close a group of shots is relative to the center of mass) and bullet group precision (i.e., how close each individual shot is to the others). Additionally, marksmanship decision accuracy (correctly shooting vs. correctly withholding a shot when engaging targets) was used to examine marksmanship performance. Results: Soldiers rated the mentally fatiguing task (59.88 ± 23.7) as having greater mental workload relative to the control intervention [31.29 ± 12.3, t(19) = 1.72, p < 0.001]. Additionally, soldiers completing the mental fatigue intervention (96.04 ± 37.1) also had lower time-domain (standard deviation of normal-to-normal R-R intervals) heart rate variability relative to the control [134.39 ± 47.4, t(18) = 3.59, p < 0.001]. Projectile group accuracy and group precision failed to show differences between interventions [t(19) = 0.98, p = 0.34; t(19) = 0.18, p = 0.87, respectively]. Marksmanship decision errors significantly increased after soldiers completed the mental fatigue intervention (48% ± 22.4) relative to the control

  14. Digital communication constraints in prior space missions

    Science.gov (United States)

    Yassine, Nathan K.

    2004-01-01

    Digital communication is crucial for space endeavors. It transmits scientific and command data between earth stations and the spacecraft crew. It facilitates communications between astronauts, and provides live coverage during all phases of the mission. Digital communications provide ground stations and spacecraft crew precise data on the spacecraft position throughout the entire mission. Lessons learned from prior space missions are valuable for our new lunar and Mars missions set by our president's speech. These data will save our agency time and money, and set the course for our current developing technologies. Limitations on digital communications equipment pertaining to mass, volume, data rate, frequency, antenna type and size, modulation, format, and power in past space missions are of particular interest. This activity is in support of ongoing communication architectural studies pertaining to robotic and human lunar exploration. The design capabilities and functionalities will depend on the space and power allocated for digital communication equipment. My contribution will be gathering these data, writing a report, and presenting it to Communications Technology Division Staff. Antenna design is very carefully studied for each mission scenario. Currently, phased array antennas are being developed for the lunar mission. Phased array antennas use little power, and electronically steer a beam instead of using DC motors. There are 615 patches in the phased array antenna. These patches have to be modified to have high yield. 50 patches were created for testing. My part is to assist in the characterization of these patch antennas, and determine whether or not certain modifications to quartz micro-strip patch radiators result in a significant yield to warrant proceeding with repairs to the prototype 19 GHz ferroelectric reflect-array antenna. This work requires learning how to calibrate an automatic network, and mounting and testing antennas in coaxial fixtures. The purpose of this

  15. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural

  16. The standard for program management

    CERN Document Server

    2017-01-01

    The Standard for Program Management – Fourth Edition differs from prior editions by focusing on the principles of good program management. Program activities have been realigned to program lifecycle phases rather than topics, and the first section was expanded to address the key roles of program manager, program sponsor and program management office. It has also been updated to better align with PMI’s Governance of Portfolios, Programs, and Projects: A Practice Guide.

  17. Concentration and trend of 1,4-dioxane in wells sampled during 2002–2017 in the vicinity of the Tucson International Airport Area Superfund Site, Arizona

    Science.gov (United States)

    Tillman, Fred D.

    2017-09-25

    Industrial activities causing extensive groundwater contamination led to the listing of the Tucson International Airport Area (TIAA) as a Superfund Site in 1983. Early groundwater investigations identified volatile organic compounds (VOCs), including the chlorinated solvents trichloroethylene (TCE) and perchloroethylene (PCE), in wells in the area. Several responsible parties were identified and cleanup activities began in the late 1980s. In 2002, the compound 1,4-dioxane was discovered in wells in the area and has since been detected in measurable concentrations throughout the site. The U.S. Environmental Protection Agency (USEPA) classifies 1,4-dioxane as a likely human carcinogen. The purpose of this map is to present 1,4-dioxane concentrations in wells sampled from 2002 through mid-2017 in the TIAA Superfund Site area to indicate both the current status and trends in 1,4-dioxane groundwater contamination. This map includes data from wells in the commercial and residential community in the TIAA and does not include data from wells in suspected or confirmed source areas, such as Air Force Plant 44 and Tucson International Airport, or from wells within treatment facilities.

  18. Birth-death prior on phylogeny and speed dating

    Directory of Open Access Journals (Sweden)

    Sennblad Bengt

    2008-03-01

    Full Text Available Abstract Background In recent years there has been a trend of leaving the strict molecular clock in order to infer dating of speciations and other evolutionary events. Explicit modeling of substitution rates and divergence times makes formulation of informative prior distributions for branch lengths possible. Models with birth-death priors on tree branching and auto-correlated or iid substitution rates among lineages have been proposed, enabling simultaneous inference of substitution rates and divergence times. This problem has, however, mainly been analysed in the Markov chain Monte Carlo (MCMC) framework, an approach requiring computation times of hours or days when applied to large phylogenies. Results We demonstrate that a hill-climbing maximum a posteriori (MAP) adaptation of the MCMC scheme results in considerable gain in computational efficiency. We demonstrate also that a novel dynamic programming (DP) algorithm for branch length factorization, useful both in the hill-climbing and in the MCMC setting, further reduces computation time. For the problem of inferring rates and times parameters on a fixed tree, we perform simulations, comparisons between hill-climbing and MCMC on a plant rbcL gene dataset, and dating analysis on an animal mtDNA dataset, showing that our methodology enables efficient, highly accurate analysis of very large trees. Datasets requiring a computation time of several days with MCMC can with our MAP algorithm be accurately analysed in less than a minute. From the results of our example analyses, we conclude that our methodology generally avoids getting trapped early in local optima. For the cases where this nevertheless can be a problem, for instance when we in addition to the parameters also infer the tree topology, we show that the problem can be evaded by using a simulated-annealing like (SAL) method in which we favour tree swaps early in the inference while biasing our focus towards rate and time parameter changes

  19. Voluntary program promotes equitable and expedited remediation of contaminated properties

    Energy Technology Data Exchange (ETDEWEB)

    Wolfenden, A.K.; Cambridge, M. [California Environmental Protection Agency, Sacramento, CA (United States). Dept. of Toxic Substances Control

    1995-12-31

    In California, the California Environmental Protection Agency (Cal/EPA) has developed a more equitable and expedited approach for the redevelopment of sites contaminated with hazardous substances. Senate Bill 923 enacted in 1994, established the Expedited Remedial Action Program (ERAP) under Chapter 6.85 of the California Health and Safety Code. This bill responds to a nationwide demand to reform Superfund laws and promote the restoration of blighted and contaminated parcels--often referred to as Brownfields. The program was designed as an alternative to CERCLA, which has come under criticism for being inefficient, unfair and restricting opportunities for effective cleanups. Cal/EPA's Department of Toxic Substances Control will implement this pilot program. This pilot program, which will eventually comprise 30 sites, provides incentives for voluntary remediation by addressing key economic issues associated with the remediation and redevelopment of contaminated properties.

  20. TREATABILITY STUDY REPORT OF GREEN MOUNTAIN LABORATORIES, INC.'S BIOREMEDIATION PROCESS, TREATMENT OF PCB CONTAMINATED SOILS, AT BEEDE WASTE OIL/CASH ENERGY SUPERFUND SITE, PLAISTOW, NEW HAMPSHIRE

    Science.gov (United States)

    In 1998, Green Mountain Laboratories, Inc. (GML) and the USEPA agreed to carry out a Superfund Innovative Technology Evaluation (SITE) project to evaluate the effectiveness of GML's Bioremediation Process for the treatment of PCB contaminated soils at the Beede Waste Oil/Cash Ene...

  1. Generalized Bayesian inference with sets of conjugate priors for dealing with prior-data conflict : course at Lund University

    NARCIS (Netherlands)

    Walter, G.

    2015-01-01

    In the Bayesian approach to statistical inference, possibly subjective knowledge on model parameters can be expressed by so-called prior distributions. A prior distribution is updated, via Bayes’ Rule, to the so-called posterior distribution, which combines prior information and information from
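
    As a minimal illustration of the conjugate updating that the course notes build on, the sketch below updates a handful of Beta priors with the same Binomial data; comparing the resulting posteriors hints at how prior-data conflict surfaces when an informative prior disagrees with the observations. The priors and data are invented for illustration, and the sets-of-priors machinery itself is not reproduced:

```python
# Minimal Beta-Binomial sketch of conjugate updating with a small set of priors.
# The priors and data are invented; the course's conflict-handling machinery based on
# sets of priors is only hinted at by comparing the posteriors side by side.
from scipy import stats

def update_beta(a, b, successes, trials):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior."""
    return a + successes, b + (trials - successes)

successes, trials = 9, 10   # data that will conflict with a pessimistic prior

priors = {"optimistic": (8, 2), "neutral": (1, 1), "pessimistic": (2, 8)}
for name, (a, b) in priors.items():
    a_post, b_post = update_beta(a, b, successes, trials)
    post = stats.beta(a_post, b_post)
    print(f"{name:12s} posterior mean = {post.mean():.2f}, "
          f"95% interval = ({post.ppf(0.025):.2f}, {post.ppf(0.975):.2f})")
```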

  2. The Influence of Prior Knowledge on the Retrieval-Directed Function of Note Taking in Prior Knowledge Activation

    Science.gov (United States)

    Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.

    2011-01-01

    Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…

  3. Northwest Hazardous Waste Research, Development, and Demonstration Center: Program Plan

    International Nuclear Information System (INIS)

    1988-02-01

    The Northwest Hazardous Waste Research, Development, and Demonstration Center was created as part of an ongoing federal effort to provide technologies and methods that protect human health, welfare, and the environment from hazardous wastes. The Center was established by the Superfund Amendments and Reauthorization Act (SARA) to develop and adapt innovative technologies and methods for assessing the impacts of and remediating inactive hazardous and radioactive mixed-waste sites. The Superfund legislation authorized $10 million for Pacific Northwest Laboratory to establish and operate the Center over a 5-year period. Under this legislation, Congress authorized $10 million each to support research, development, and demonstration (RD and D) on hazardous and radioactive mixed-waste problems in Idaho, Montana, Oregon, and Washington, including the Hanford Site. In 1987, the Center initiated its RD and D activities and prepared this Program Plan that presents the framework within which the Center will carry out its mission. Section 1.0 describes the Center, its mission, objectives, organization, and relationship to other programs. Section 2.0 describes the Center's RD and D strategy and contains the RD and D objectives, priorities, and process to be used to select specific projects. Section 3.0 contains the Center's FY 1988 operating plan and describes the specific RD and D projects to be carried out and their budgets and schedules. 9 refs., 18 figs., 5 tabs

  4. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners to conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
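
    A full BSEM refit is beyond this listing, but the spirit of a prior sensitivity analysis can be sketched on a much simpler conjugate normal-mean model: analyse the same small sample under priors ranging from informative to vague and compare the posteriors. The model, prior settings, and data below are illustrative assumptions only, not the article's worked example:

```python
# Illustrative prior sensitivity check on a conjugate normal-mean model (not a full
# BSEM): the same small sample is analysed under prior standard deviations ranging
# from informative to vague, and the resulting posteriors are compared.
import numpy as np

def posterior_mean_sd(y, sigma, prior_mean, prior_sd):
    """Posterior for a normal mean with known noise sigma and a normal prior."""
    prior_prec = 1.0 / prior_sd ** 2
    data_prec = len(y) / sigma ** 2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * np.mean(y))
    return post_mean, np.sqrt(post_var)

rng = np.random.default_rng(42)
y = rng.normal(loc=0.4, scale=1.0, size=25)       # small sample, where priors matter most

for prior_sd in (0.1, 1.0, 10.0, 100.0):          # informative through vague
    m, s = posterior_mean_sd(y, sigma=1.0, prior_mean=0.0, prior_sd=prior_sd)
    print(f"prior sd = {prior_sd:6.1f} -> posterior mean = {m:5.2f}, sd = {s:4.2f}")
```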

  5. Turner syndrome: counseling prior to oocyte donation

    Directory of Open Access Journals (Sweden)

    Ester Silveira Ramos

    2007-03-01

    Full Text Available Ovarian failure is a typical feature of Turner syndrome (TS. Patients are followed clinically with hormone replacement therapy (HRT and inclusion in the oocyte donation program, if necessary. For patients with spontaneous puberty, genetic counseling regarding preimplantation genetic diagnosis and prenatal diagnosis is indicated. Patients with dysgenetic gonads and a Y chromosome are at increased risk of developing gonadoblastoma. Even though this is not an invasive tumor, its frequent association with other malignant forms justifies prophylactic gonadectomy. It is important to perform gonadectomy before HRT and pregnancy with oocyte donation. Among patients with TS stigmata and female genitalia, many have the Y chromosome in one of the cell lines. For this reason, all patients should undergo cytogenetic analysis. Nevertheless, in cases of structural chromosomal alterations or hidden mosaicism, the conventional cytogenetic techniques may be ineffective and molecular investigation is indicated. The author proposes a practical approach for investigating women with TS stigmata in whom identification of the X or Y chromosome is important for clinical management and follow-up.

  6. Third party Superfund lawsuit defense influenced by the choice of remediation method

    International Nuclear Information System (INIS)

    Haddad, B.I.; Parish, G.B.

    1994-01-01

    Paper Company A was sued in a third party action suit initiated by a local utility who was a potentially responsible party (PRP) to a contaminated site regulated under the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) program. In addition to Paper Company A, other parties to the third party suit included Paper Company B and Contractor C, a demolition contractor/waste hauler. Other PRPs included land owners where the contaminated debris was dumped, Mr. and Mrs. D. Based on background information, Paper Company A dumped coal ash, off quality feed stock, wood and trash north of the D-property. Paper Company B admitted dumping material north of the D-property. Samples of industrial sludges on the D-property had properties characteristic of Paper Company B's sludges. Paper Company B dumped ash, chromium contaminated gypsum sludge and other waste. The utility company dumped ash on the D-property. Contractor C hauled demolition debris to the D-property. A third PRP, Company E, was the original owner of the buildings that were demolished. This PRP settled with the EPA as part of a bankruptcy settlement. The hazardous substances encountered at the site included PCBs, chromium and lead in the coal ash, demolition debris and industrial sludges. Disposal of material containing hazardous substances resulted in PCB contaminated debris and sediment, and chromium and lead contamination in the sediment, soil and groundwater

  7. Vadose zone investigations at the Lawrence Livermore National Laboratory Superfund Site: An overview

    International Nuclear Information System (INIS)

    Iovenitti, J.L.; Nitao, J.J.; Bishop, D.J.

    1992-09-01

    Lawrence Livermore National Laboratory (LLNL) is investigating the fate and transport of vadose zone contaminants at their Livermore site in Livermore, California. The principal objectives of this work are to identify potential source areas at the Livermore site which require remediation, to prioritize those areas, and finally, to optimize the remediation process. Primary contaminants of interest for this investigation are volatile organic compounds (VOCs) and tritium. A fully integrated, three-part program, consisting of quantitative modeling, field studies, and laboratory measurements, is in progress. To evaluate and predict vadose zone contaminant migration, quantitative modeling is used. Our modeling capabilities are being enhanced through the development of a multicomponent, three-dimensional, nonaqueous phase liquid-liquid-vapor, nonisothermal flow and transport computer code. This code will also be used to evaluate vadose zone remediation requirements. Field studies to acquire LLNL site-specific soil (sediment) characteristics for computer code calibration and validation include subsurface lithologic and contaminant profiling, in situ soil moisture content, ground surface emission flux of VOCs and tritium, transpiration of tritium, and ground surface evapotranspiration of water. Multilevel vadose zone monitoring devices are used to monitor the gaseous and aqueous transport of contaminants

  8. Socializing processes in relation to the recognition of unskilled adults’ prior learning

    DEFF Research Database (Denmark)

    Aarkrog, Vibe

    The ordinary Danish VET programs are organized as dual programs in which the students alternate between school-based education and training and workplace-based training. The adult students in the course "From unskilled worker to skilled worker in record time" are automatically credited... for the workplace-based training. However, the study can contribute to the discussion of the value of practical experiences: are practical experiences creditable in educational programs? The study shows that the recognition and assessment of prior learning requires that the students can verbalize and preferably also......

  9. Application of probabilistic risk assessment: Evaluating remedial alternatives at the Portland Harbor Superfund Site, Portland, Oregon, USA.

    Science.gov (United States)

    Ruffle, Betsy; Henderson, James; Murphy-Hagan, Clare; Kirkwood, Gemma; Wolf, Frederick; Edwards, Deborah A

    2018-01-01

    A probabilistic risk assessment (PRA) was performed to evaluate the range of potential baseline and postremedy health risks to fish consumers at the Portland Harbor Superfund Site (the "Site"). The analysis focused on risks of consuming fish resident to the Site containing polychlorinated biphenyls (PCBs), given that this exposure scenario and contaminant are the primary basis for US Environmental Protection Agency's (USEPA's) selected remedy per the January 2017 Record of Decision (ROD). The PRA used probability distributions fit to the same data sets used in the deterministic baseline human health risk assessment (BHHRA) as well as recent sediment and fish tissue data to evaluate the range and likelihood of current baseline cancer risks and noncancer hazards for anglers. Areas of elevated PCBs in sediment were identified on the basis of a geospatial evaluation of the surface sediment data, and the ranges of risks and hazards associated with pre- and postremedy conditions were calculated. The analysis showed that less active remediation (targeted to areas with the highest concentrations) compared to the remedial alternative selected by USEPA in the ROD can achieve USEPA's interim risk management benchmarks (cancer risk of 10-4 and noncancer hazard index [HI] of 10) immediately postremediation for the vast majority of subsistence anglers that consume smallmouth bass (SMB) fillet tissue. In addition, the same targeted remedy achieves USEPA's long-term benchmarks (10-5 and HI of 1) for the majority of recreational anglers. Additional sediment remediation would result in negligible additional risk reduction due to the influence of background. The PRA approach applied here provides a simple but adaptive framework for analysis of risks and remedial options focused on variability in exposures. It can be updated and refined with new data to evaluate and reduce uncertainty, improve understanding of the Site and target populations, and foster informed remedial decision
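
    The fitted exposure distributions used in the PRA are not given in the abstract, so the sketch below only shows the generic mechanics of such an analysis: sample exposure inputs from assumed distributions, propagate them through a standard intake equation, and read percentiles of the resulting risk distribution against a benchmark such as 10-4. Every distribution and parameter value is an illustrative placeholder, not a Portland Harbor input:

```python
# Generic Monte Carlo sketch of a probabilistic risk calculation. All distributions
# and parameter values are illustrative placeholders, not the Portland Harbor inputs;
# the point is propagating variability in exposure through to a distribution of risk.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

conc = rng.lognormal(np.log(0.1), 0.8, n)      # fish tissue PCB concentration, mg/kg (assumed)
intake = rng.lognormal(np.log(17.5), 1.0, n)   # fish ingestion rate, g/day (assumed)
ef, ed = 365.0, 30.0                           # exposure frequency (d/yr) and duration (yr) (assumed)
bw, at = 70.0, 70.0 * 365.0                    # body weight (kg) and averaging time (d) (assumed)
csf = 2.0                                      # cancer slope factor, (mg/kg-day)^-1 (assumed)

dose = conc * (intake / 1000.0) * ef * ed / (bw * at)   # lifetime average daily dose, mg/kg-day
risk = csf * dose

for q in (50, 90, 95):
    print(f"{q}th percentile risk = {np.percentile(risk, q):.1e}")
print("fraction of simulated anglers above the 1e-4 benchmark:", np.mean(risk > 1e-4))
```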

  10. The Role of Prior Knowledge in International Franchise Partner Recruitment

    OpenAIRE

    Wang, Catherine; Altinay, Levent

    2006-01-01

    Purpose: To investigate the role of prior knowledge in the international franchise partner recruitment process and to evaluate how cultural distance influences the role of prior knowledge in this process. Design/Methodology/Approach: A single embedded case study of an international hotel firm was the focus of the enquiry. Interviews, observations and document analysis were used as the data collection techniques. Findings: Findings reveal that prior knowledge of the franchisor enab...

  11. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented, a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed......, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing....

  12. Acquisition of multiple prior distributions in tactile temporal order judgment

    Directory of Open Access Journals (Sweden)

    Yasuhito eNagai

    2012-08-01

    Full Text Available The Bayesian estimation theory proposes that the brain acquires the prior distribution of a task and integrates it with sensory signals to minimize the effect of sensory noise. Psychophysical studies have demonstrated that our brain actually implements Bayesian estimation in a variety of sensory-motor tasks. However, these studies only imposed one prior distribution on participants within a task period. In this study, we investigated the conditions that enable the acquisition of multiple prior distributions in temporal order judgment (TOJ) of two tactile stimuli across the hands. In Experiment 1, stimulation intervals were randomly selected from one of two prior distributions (biased to right hand earlier and biased to left hand earlier) in association with color cues (green and red, respectively). Although the acquisition of the two priors was not enabled by the color cues alone, it was significant when participants shifted their gaze (above or below) in response to the color cues. However, the acquisition of multiple priors was not significant when participants moved their mouths (opened or closed). In Experiment 2, the spatial cues (above and below) were used to identify which eye position or retinal cue position was crucial for the eye-movement-dependent acquisition of multiple priors in Experiment 1. The acquisition of the two priors was significant when participants moved their gaze to the cues (i.e., the cue positions on the retina were constant across the priors), as well as when participants did not shift their gazes (i.e., the cue positions on the retina changed according to the priors). Thus, both eye and retinal cue positions were effective in acquiring multiple priors. Based on previous neurophysiological reports, we discuss possible neural correlates that contribute to the acquisition of multiple priors.

  13. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...
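
    As a toy illustration of folding a spatial prior into a shortest-path formulation (not the authors' pipeline, and with all costs and the prior map invented for the example), the sketch below runs Dijkstra's algorithm on a 2-D grid where each step cost combines a local data term with the negative log of a prior probability map, so that paths are pulled towards locations the prior considers plausible:

```python
# Toy illustration (not the authors' pipeline): shortest-path "tractography" on a 2-D
# grid. Each step cost combines a local data term with -log of a spatial prior map,
# so the recovered path is pulled towards cells the prior considers plausible.
import heapq
import numpy as np

def shortest_path(data_cost, prior, start, goal, prior_weight=1.0):
    h, w = data_cost.shape
    dist, parent = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, np.inf):
            continue                                   # stale heap entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                step = data_cost[nr, nc] - prior_weight * np.log(prior[nr, nc] + 1e-9)
                nd = d + step
                if nd < dist.get((nr, nc), np.inf):
                    dist[(nr, nc)] = nd
                    parent[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:                               # walk back from goal to start
        path.append(node)
        node = parent[node]
    return [start] + path[::-1]

rng = np.random.default_rng(3)
data_cost = rng.uniform(0.5, 1.5, (20, 20))            # stand-in for a diffusion-derived cost
prior = np.full((20, 20), 0.05)
prior[10, :] = 0.9                                     # a learned spatial prior favouring row 10
print(shortest_path(data_cost, prior, (10, 0), (10, 19))[:5])
```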

  14. Crowdsourcing prior information to improve study design and data analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Chrabaszcz

    Full Text Available Though Bayesian methods are being used more frequently, many still struggle with the best method for setting priors with novel measures or task environments. We propose a method for setting priors by eliciting continuous probability distributions from naive participants. This allows us to include any relevant information participants have for a given effect. Even when prior means are near-zero, this method provides a principled way to estimate dispersion and produce shrinkage, reducing the occurrence of overestimated effect sizes. We demonstrate this method with a number of published studies and compare the effect of different prior estimation and aggregation methods.
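
    One simple way to turn elicited distributions into a single prior, shown below purely as an assumed illustration rather than the authors' aggregation method, is a linear opinion pool: each participant's elicited mean and standard deviation defines a normal density, and the equal-weight mixture of those densities serves as the aggregate prior:

```python
# Minimal sketch of pooling elicited priors with a linear opinion pool. Each
# (hypothetical) participant supplies a mean and standard deviation for the effect;
# each pair defines a normal density, and the equal-weight mixture is the aggregate prior.
import numpy as np
from scipy import stats

elicited = [(0.2, 0.5), (0.0, 0.3), (0.5, 0.8), (0.1, 0.4)]   # (mean, sd) per participant (assumed)

grid = np.linspace(-2.0, 2.0, 401)
pool = np.mean([stats.norm.pdf(grid, m, s) for m, s in elicited], axis=0)
pool /= np.trapz(pool, grid)                                   # renormalise the mixture on the grid

pool_mean = np.trapz(grid * pool, grid)
pool_sd = np.sqrt(np.trapz((grid - pool_mean) ** 2 * pool, grid))
print(f"aggregate prior: mean = {pool_mean:.2f}, sd = {pool_sd:.2f}")
```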

  15. Prior knowledge in recalling arguments in bioethical dilemmas

    Directory of Open Access Journals (Sweden)

    Hiemke Katharina Schmidt

    2015-09-01

    Full Text Available Prior knowledge is known to facilitate learning new information. Normally in studies confirming this outcome the relationship between prior knowledge and the topic to be learned is obvious: the information to be acquired is part of the domain or topic to which the prior knowledge belongs. This raises the question as to whether prior knowledge of various domains facilitates recalling information. In this study 79 eleventh-grade students completed a questionnaire on their prior knowledge of seven different domains related to the bioethical dilemma of prenatal diagnostics. The students read a text containing arguments for and arguments against prenatal diagnostics. After one week and again 12 weeks later they were asked to write down all the arguments they remembered. Prior knowledge helped them recall the arguments one week (r = .350) and 12 weeks (r = .316) later. Prior knowledge of three of the seven domains significantly helped them recall the arguments one week later (correlations between r = .194 and r = .394). Partial correlations with interest as a control item revealed that interest did not explain the relationship between prior knowledge and recall. Prior knowledge of different domains jointly supports the recall of arguments related to bioethical topics.

  16. Chromosomal differences between acute nonlymphocytic leukemia in patients with prior solid tumors and prior hematologic malignancies. A study of 14 cases with prior breast cancer

    International Nuclear Information System (INIS)

    Mamuris, Z.; Dumont, J.; Dutrillaux, B.; Aurias, A.

    1989-01-01

    A cytogenetic study of 14 patients with secondary acute nonlymphocytic leukemia (S-ANLL) with prior treatment for breast cancer is reported. The chromosomes recurrently involved in numerical or structural anomalies are chromosomes 7, 5, 17, and 11, in decreasing order of frequency. The distribution of the anomalies detected in this sample of patients is similar to that observed in published cases with prior breast or other solid tumors (though anomalies of chromosome 11 were not previously noted), but it differs significantly from that of the S-ANLL with prior hematologic malignancies. This difference is principally due to a higher involvement of chromosome 7 in patients with prior hematologic malignancies and of chromosomes 11 and 17 in patients with prior solid tumors. A genetic determinism involving abnormal recessive alleles located on chromosomes 5, 7, 11, and 17 uncovered by deletions of the normal homologs may be a cause of S-ANLL. The difference between patients with prior hematologic malignancies or solid tumors may be explained by different constitutional mutations of recessive genes in the two groups of patients

  17. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Full Text Available Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
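
    The record above describes combining priors from several sources into a credibility-weighted mixture and updating it with sparse demonstration data. Below is a toy sketch of that last step, assuming the sources have already been summarized as Beta priors for the fault detection rate and the credibilities are given; the specific numbers and weights are invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.special import betaln, comb
from scipy.stats import beta

# Hypothetical Beta priors for the fault detection rate (FDR), e.g. one from
# expert mean/interval information and one from earlier success/fail data,
# with assumed prior credibilities (weights). None of these come from the paper.
components = [(18.0, 2.0), (12.0, 3.0)]   # (a, b) parameters of each Beta prior
weights = np.array([0.6, 0.4])

# Small testability demonstration test: s detected faults out of n injected.
n, s = 20, 18

# Posterior is again a Beta mixture; each component's weight is updated by
# its Beta-binomial evidence for the observed data.
a = np.array([c[0] for c in components])
b = np.array([c[1] for c in components])
evidence = comb(n, s) * np.exp(betaln(a + s, b + n - s) - betaln(a, b))
post_w = weights * evidence
post_w /= post_w.sum()
a_post, b_post = a + s, b + n - s

point = np.sum(post_w * a_post / (a_post + b_post))   # posterior mean FDR

# 90% credible interval for the mixture by Monte Carlo.
rng = np.random.default_rng(0)
comp = rng.choice(len(components), size=100_000, p=post_w)
draws = beta.rvs(a_post[comp], b_post[comp], random_state=rng)
lo, hi = np.percentile(draws, [5, 95])
print(f"posterior mean FDR = {point:.3f}, 90% interval = ({lo:.3f}, {hi:.3f})")
```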

  18. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  19. Superfund TIO videos: Set A. Settlement tools and practices, win-win negotiations, closeout, records management, authorities and liabilities. Part 5. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape is divided into 5 sections. Section 1 provides an overview of settlement activities including conducting an information exchange, issuing general notice letters, initiating special notice procedures, receiving good faith offers (GFO), negotiating settlements, and pursuing enforcement actions. Section 2 covers the types of negotiations that commonly involve OSCs and RPMs. The characteristics of a negotiating style that satisfy all the parties as well as methods for preparing and conducting this type of negotiation are outlined. Section 3 deals with post-removal site control arrangements and other closeout requirements for a removal site, such as completing necessary paperwork. The remedial project closeout procedures also are covered, including the remedial closeout report, operation and maintenance (O&M) arrangements, transfer of site responsibility, and deletion from the National Priorities List (NPL). Section 4 discusses the purpose, procedures, roles and responsibilities associated with records management under Superfund. Section 5 outlines the response authority provided by CERCLA to OSCs and RPMs.

  20. Geophysical log analysis of selected test and residential wells at the Shenandoah Road National Superfund Site, East Fishkill, Dutchess County, New York

    Science.gov (United States)

    Reynolds, Richard J.; Anderson, J. Alton; Williams, John H.

    2015-01-01

    The U.S. Geological Survey collected and analyzed geophysical logs from 20 test wells and 23 residential wells at the Shenandoah Road National Superfund Site in East Fishkill, New York, from 2006 through 2010 as part of an Interagency Agreement to provide hydrogeologic technical support to the U.S. Environmental Protection Agency, Region 2. The geophysical logs collected include caliper, gamma, acoustic and optical televiewer, deviation, electromagnetic-induction, magnetic-susceptibility, fluid-property, and flow under ambient and pumped conditions. The geophysical logs were analyzed along with single-well aquifer test data and drilling logs to characterize the lithology, fabric, fractures, and flow zones penetrated by the wells. The results of the geophysical log analysis were used as part of the hydrogeologic characterization of the site and in the design of discrete-zone monitoring installations in the test wells and selected residential wells.

  1. Superfund Record of Decision (EPA region 2): Glen Ridge Radium site, Essex County, NJ. (Second remedial action), June 1990. Final report

    International Nuclear Information System (INIS)

    1990-01-01

    The 90-acre Glen Ridge Radium site is a residential community in the Borough of Glen Ridge, Essex County, New Jersey. The site is adjacent to another Superfund site, the Montclair/West Orange site. The Glen Ridge site includes a community of 274 properties serviced by surface reservoirs in northern New Jersey. In the early 1900s, a radium processing or utilization facility was located in the vicinity of the site. EPA investigations in 1981 and 1983 confirmed the presence of gamma radiation contamination in the Glen Ridge area and in several adjacent houses. The ROD complements the previous 1989 ROD for this site and provides a final remedy. The primary contaminant of concern affecting the soil is radium 226

  2. Treatability Study of In Situ Technologies for Remediation of Hexavalent Chromium in Groundwater at the Puchack Well Field Superfund Site, New Jersey

    Energy Technology Data Exchange (ETDEWEB)

    Vermeul, Vince R.; Szecsody, Jim E.; Truex, Michael J.; Burns, Carolyn A.; Girvin, Donald C.; Phillips, Jerry L.; Devary, Brooks J.; Fischer, Ashley E.; Li, Shu-Mei W.

    2006-11-13

    This treatability study was conducted by Pacific Northwest National Laboratory (PNNL), at the request of the U. S. Environmental Protection Agency (EPA) Region 2, to evaluate the feasibility of using in situ treatment technologies for chromate reduction and immobilization at the Puchack Well Field Superfund Site in Pennsauken Township, New Jersey. In addition to in situ reductive treatments, which included the evaluation of both abiotic and biotic reduction of Puchack aquifer sediments, natural attenuation mechanisms were evaluated (i.e., chromate adsorption and reduction). Chromate exhibited typical anionic adsorption behavior, with greater adsorption at lower pH, at lower chromate concentration, and at lower concentrations of other competing anions. In particular, sulfate (at 50 mg/L) suppressed chromate adsorption by up to 50%. Chromate adsorption was not influenced by inorganic colloids.

  3. Analysis of the IJCNN 2007 agnostic learning vs. prior knowledge challenge.

    Science.gov (United States)

    Guyon, Isabelle; Saffari, Amir; Dror, Gideon; Cawley, Gavin

    2008-01-01

    We organized a challenge for IJCNN 2007 to assess the added value of prior domain knowledge in machine learning. Most commercial data mining programs accept data pre-formatted in the form of a table, with each example being encoded as a linear feature vector. Is it worth spending time incorporating domain knowledge in feature construction or algorithm design, or can off-the-shelf programs working directly on simple low-level features do better than skilled data analysts? To answer these questions, we formatted five datasets using two data representations. The participants in the "prior knowledge" track used the raw data, with full knowledge of the meaning of the data representation. Conversely, the participants in the "agnostic learning" track used a pre-formatted data table, with no knowledge of the identity of the features. The results indicate that black-box methods using relatively unsophisticated features work quite well and rapidly approach the best attainable performance. The winners on the prior knowledge track used feature extraction strategies yielding a large number of low-level features. Incorporating prior knowledge in the form of generic coding/smoothing methods to exploit regularities in data is beneficial, but incorporating actual domain knowledge in feature construction is very time consuming and seldom leads to significant improvements. The AL vs. PK challenge web site remains open for post-challenge submissions: http://www.agnostic.inf.ethz.ch/.

  4. Oak Ridge Reservation Site Management Plan for the Environmental Restoration Program

    International Nuclear Information System (INIS)

    1991-09-01

    This site management plan for the Environmental Restoration (ER) Program implements the Oak Ridge Reservation (ORR) Federal Facility Agreement (FFA) (EPA 1990), also known as an Interagency Agreement (IAG), hereafter referred to as ''the Agreement.'' The Department of Energy (DOE), the US Environmental Protection Agency (EPA), and the Tennessee Department of Environment and Conservation (TDEC), hereafter known as ''the Parties,'' entered into this Agreement for the purpose of coordinating remediation activities undertaken on the ORR to comply with the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) as amended by the Superfund Amendments and Reauthorization Act (SARA), the Resource Conservation and Recovery Act (RCRA), and the National Environmental Policy Act (NEPA). 7 refs., 17 figs

  5. Construction and test of the PRIOR proton microscope; Aufbau und Test des Protonenmikroskops PRIOR

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Philipp-Michael

    2015-01-15

    The study of High Energy Density Matter (HEDM) in the laboratory makes great demands on the diagnostics because these states can usually only be created for a short time and usual diagnostic techniques with visible light or X-rays come to their limit because of the high density. The high energy proton radiography technique that was developed in the 1990s at the Los Alamos National Laboratory is a very promising possibility to overcome those limits so that one can measure the density of HEDM with high spatial and time resolution. For this purpose the proton microscope PRIOR (Proton Radiography for FAIR) was set up at GSI, which not only reproduces the image, but also magnifies it by a factor of 4.2 and thereby penetrates matter with a density up to 20 g/cm². Straightaway a spatial resolution of less than 30 μm and a time resolution on the nanosecond scale was achieved. This work describes details of the principle, design and construction of the proton microscope as well as first measurements and simulations of essential components like magnetic lenses, a collimator and a scintillator screen. For the latter one it was possible to show that plastic scintillators can be used as converter as an alternative to the slower but more radiation resistant crystals, so that it is possible to reach a time resolution of 10 ns. Moreover the characteristics were investigated for the system at the commissioning in April 2014. Also the changes in the magnetic field due to radiation damage were studied. Besides that, an overview of future applications is given. First experiments with Warm Dense Matter created by using a Pulsed Power Setup have already been performed. Furthermore the promising concept of combining proton radiography with particle therapy has been investigated in the context of the PaNTERA project. An outlook on the possibilities with future experiments at the FAIR accelerator facility is given as well. Because of higher beam intensity and energy one can expect even

  6. Mercury in tree swallow food, eggs, bodies, and feathers at Acadia National Park, Maine, and an EPA superfund site, Ayer, Massachusetts.

    Science.gov (United States)

    Longcore, Jerry R; Haines, Terry A; Halteman, William A

    2007-03-01

    We monitored nest boxes during 1997-1999 at Acadia National Park, Mt. Desert Island, ME and at an old-field site in Orono, ME to determine mercury (Hg) uptake in tree swallow (Tachycineta bicolor) eggs, tissues, and food boluses. Also, in 1998-1999 we monitored nest boxes at Grove Pond and Plow Shop Pond at a U.S. Environmental Protection Agency Superfund site in Ayer, MA. We recorded breeding success at all locations. On average among locations, total mercury (THg) biomagnified 2 to 4-fold from food to eggs and 9 to 18-fold from food to feathers. These are minimum values because the proportion of transferable methyl mercury (MeHg) of the THg in insects varies (i.e., 35%-95% of THg) in food boluses. THg was highest in food boluses at Aunt Betty Pond at Acadia, whereas THg in eggs was highest at the Superfund site. A few eggs from nests at each of these locations exceeded the threshold (i.e., 800-1,000 ng/g, wet wt.) of embryotoxicity established for Hg. Hatching success was 88.9% to 100% among locations, but five eggs failed to hatch from 4 of the 11 clutches in which an egg exceeded this threshold. MeHg in feathers was highest in tree swallows at Aunt Betty Pond and the concentration of THg in bodies was related to the concentration in feathers. Transfer of an average of 80%-92% of the Hg in bodies to feathers may have enhanced nestling survival. Residues of Hg in tissues of tree swallows in the Northeast seem higher than those of the Midwest.

  7. Adaptive nonparametric Bayesian inference using location-scale mixture priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2010-01-01

    We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if

  8. Nudging toward Inquiry: Awakening and Building upon Prior Knowledge

    Science.gov (United States)

    Fontichiaro, Kristin, Comp.

    2010-01-01

    "Prior knowledge" (sometimes called schema or background knowledge) is information one already knows that helps him/her make sense of new information. New learning builds on existing prior knowledge. In traditional reporting-style research projects, students bypass this crucial step and plow right into answer-finding. It's no wonder that many…

  9. Drunkorexia: Calorie Restriction Prior to Alcohol Consumption among College Freshman

    Science.gov (United States)

    Burke, Sloane C.; Cremeens, Jennifer; Vail-Smith, Karen; Woolsey, Conrad

    2010-01-01

    Using a sample of 692 freshmen at a southeastern university, this study examined caloric restriction among students prior to planned alcohol consumption. Participants were surveyed for self-reported alcohol consumption, binge drinking, and caloric intake habits prior to drinking episodes. Results indicated that 99 of 695 (14%) of first year…

  10. Personality, depressive symptoms and prior trauma exposure of new ...

    African Journals Online (AJOL)

    Background. Police officers are predisposed to trauma exposure. The development of depression and post-traumatic stress disorder (PTSD) may be influenced by personality style, prior exposure to traumatic events and prior depression. Objectives. To describe the personality profiles of new Metropolitan Police Service ...

  11. 34 CFR 303.403 - Prior notice; native language.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Prior notice; native language. 303.403 Section 303.403... TODDLERS WITH DISABILITIES Procedural Safeguards General § 303.403 Prior notice; native language. (a... file a complaint and the timelines under those procedures. (c) Native language. (1) The notice must be...

  12. On the use of a pruning prior for neural networks

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1996-01-01

    We address the problem of using a regularization prior that prunes unnecessary weights in a neural network architecture. This prior provides a convenient alternative to traditional weight-decay. Two examples are studied to support this method and illustrate its use. First we use the sunspots...

  13. Bayesian Inference for Structured Spike and Slab Priors

    DEFF Research Database (Denmark)

    Andersen, Michael Riis; Winther, Ole; Hansen, Lars Kai

    2014-01-01

    Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint. We propose a novel prior formulation, the structured spike and slab prior, which allows to incorporate a priori knowledge of the sparsity pattern by imposing a spatial...

  14. 5 CFR 6201.103 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Prior approval for outside employment. 6201.103 Section 6201.103 Administrative Personnel EXPORT-IMPORT BANK OF THE UNITED STATES SUPPLEMENTAL STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE EXPORT-IMPORT BANK OF THE UNITED STATES § 6201.103 Prior...

  15. Prior authorisation schemes: trade barriers in need of scientific justification

    NARCIS (Netherlands)

    Meulen, van der B.M.J.

    2010-01-01

    Case C-333/08 Commission v. French Republic ‘processing aids’ [2010] ECR-0000 French prior authorisation scheme for processing aids in food production infringes upon Article 34 TFEU. 1. A prior authorisation scheme not complying with the principle of proportionality infringes upon Article 34 TFEU.

  16. Arsenic species in weathering mine tailings and biogenic solids at the Lava Cap Mine Superfund Site, Nevada City, CA

    Directory of Open Access Journals (Sweden)

    Ashley Roger P

    2011-01-01

    Full Text Available Abstract Background A realistic estimation of the health risk of human exposure to solid-phase arsenic (As) derived from historic mining operations is a major challenge to redevelopment of California's famed "Mother Lode" region. Arsenic, a known carcinogen, occurs in multiple solid forms that vary in bioaccessibility. X-ray absorption fine-structure spectroscopy (XAFS) was used to identify and quantify the forms of As in mine wastes and biogenic solids at the Lava Cap Mine Superfund (LCMS) site, a historic "Mother Lode" gold mine. Principal component analysis (PCA) was used to assess variance within water chemistry, solids chemistry, and XAFS spectral datasets. Linear combination, least-squares fits constrained in part by PCA results were then used to quantify arsenic speciation in XAFS spectra of tailings and biogenic solids. Results The highest dissolved arsenic concentrations were found in Lost Lake porewater and in a groundwater-fed pond in the tailings deposition area. Iron, dissolved oxygen, alkalinity, specific conductivity, and As were the major variables in the water chemistry PCA. Arsenic was, on average, 14 times more concentrated in biologically-produced iron (hydr)oxide than in mine tailings. Phosphorus, manganese, calcium, aluminum, and As were the major variables in the solids chemistry PCA. Linear combination fits to XAFS spectra indicate that arsenopyrite (FeAsS), the dominant form of As in ore material, remains abundant (average: 65%) in minimally-weathered ore samples and water-saturated tailings at the bottom of Lost Lake. However, tailings that underwent drying and wetting cycles contain an average of only 30% arsenopyrite. The predominant products of arsenopyrite weathering were identified by XAFS to be As-bearing Fe (hydr)oxide and arseniosiderite (Ca2Fe(AsO4)3O3•3H2O). Existence of the former species is not in question, but the presence of the latter species was not confirmed by additional measurements, so its identification is
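
    As an aside on the linear combination, least-squares fitting mentioned above: the core numerical step can be illustrated with a non-negative least-squares fit of a measured spectrum to reference spectra. The sketch below uses synthetic Gaussian "spectra" and scipy's nnls; it is not the authors' fitting code and ignores XAFS-specific preprocessing.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic "reference spectra" (Gaussian peaks stand in for real standards).
energy = np.linspace(0.0, 1.0, 200)
ref_arsenopyrite    = np.exp(-((energy - 0.3) / 0.05) ** 2)
ref_fe_hydroxide    = np.exp(-((energy - 0.5) / 0.08) ** 2)
ref_arseniosiderite = np.exp(-((energy - 0.7) / 0.06) ** 2)
refs = np.column_stack([ref_arsenopyrite, ref_fe_hydroxide, ref_arseniosiderite])

# A synthetic "measured" spectrum: a known mixture of the references plus noise.
true_fractions = np.array([0.30, 0.55, 0.15])
rng = np.random.default_rng(3)
measured = refs @ true_fractions + 0.01 * rng.normal(size=energy.size)

# Non-negative least-squares linear combination fit, normalised to fractions.
coef, _ = nnls(refs, measured)
fractions = coef / coef.sum()
print(fractions.round(3))   # should recover roughly 0.30 / 0.55 / 0.15
```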

  17. Variational segmentation problems using prior knowledge in imaging and vision

    DEFF Research Database (Denmark)

    Fundana, Ketut

    This dissertation addresses variational formulation of segmentation problems using prior knowledge. Variational models are among the most successful approaches for solving many Computer Vision and Image Processing problems. The models aim at finding the solution to a given energy functional defined......, prior knowledge is needed to obtain the desired solution. The introduction of shape priors in particular, has proven to be an effective way to segment objects of interests. Firstly, we propose a prior-based variational segmentation model to segment objects of interest in image sequences, that can deal....... Many objects have high variability in shape and orientation. This often leads to unsatisfactory results, when using a segmentation model with single shape template. One way to solve this is by using more sophisticated shape models. We propose to incorporate shape priors from a shape sub...

  18. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging.

    Science.gov (United States)

    Zhang, Shuanghui; Liu, Yongxiang; Li, Xiang; Bi, Guoan

    2016-04-28

    This paper presents a novel Inverse Synthetic Aperture Radar Imaging (ISAR) algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better-focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posteriori (MAP) estimation and the maximum likelihood estimation (MLE) are utilized to estimate the model parameters to avoid a manual tuning process. Additionally, the fast Fourier Transform (FFT) and Hadamard product are used to minimize the required computational cost. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.

  19. Total Variability Modeling using Source-specific Priors

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2016-01-01

    sequence of an utterance. In both cases the prior for the latent variable is assumed to be non-informative, since for homogeneous datasets there is no gain in generality in using an informative prior. This work shows, in the heterogeneous case, that using informative priors for computing the posterior......, can lead to favorable results. We focus on modeling the priors using minimum divergence criterion or factor analysis techniques. Tests on the NIST 2008 and 2010 Speaker Recognition Evaluation (SRE) dataset show that our proposed method beats four baselines: For i-vector extraction using an already...... trained matrix, for the short2-short3 task in SRE’08, five out of eight female and four out of eight male common conditions were improved. For the core-extended task in SRE’10, four out of nine female and six out of nine male common conditions were improved. When incorporating prior information...

  20. Example-driven manifold priors for image deconvolution.

    Science.gov (United States)

    Ni, Jie; Turaga, Pavan; Patel, Vishal M; Chellappa, Rama

    2011-11-01

    Image restoration methods that exploit prior information about images to be estimated have been extensively studied, typically using the Bayesian framework. In this paper, we consider the role of prior knowledge of the object class in the form of a patch manifold to address the deconvolution problem. Specifically, we incorporate unlabeled image data of the object class, say natural images, in the form of a patch-manifold prior for the object class. The manifold prior is implicitly estimated from the given unlabeled data. We show how the patch-manifold prior effectively exploits the available sample class data for regularizing the deblurring problem. Furthermore, we derive a generalized cross-validation (GCV) function to automatically determine the regularization parameter at each iteration without explicitly knowing the noise variance. Extensive experiments show that this method performs better than many competitive image deconvolution methods.

  1. Littoral Combat Ship: Knowledge of Survivability and Lethality Capabilities Needed Prior to Making Major Funding Decisions

    Science.gov (United States)

    2015-12-01

    USS Port Royal hit a coral reef in order to provide an independent review of the damage the ship sustained. Our classified report discussed...Improved Weight Management Needed Prior to Further Investments, GAO-14-349SU (Washington, D.C.: Apr. 8, 2014); Littoral Combat Ship: Knowledge of...Early in the program, the Navy decided to forgo a number of traditional ship requirements in order to help reduce the costs and the weight and size

  2. Improving experimental phases for strong reflections prior to density modification

    International Nuclear Information System (INIS)

    Uervirojnangkoorn, Monarin; Hilgenfeld, Rolf; Terwilliger, Thomas C.; Read, Randy J.

    2013-01-01

    A genetic algorithm has been developed to optimize the phases of the strongest reflections in SIR/SAD data. This is shown to facilitate density modification and model building in several test cases. Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. A computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography
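
    A toy 1-D analogue of the idea, assuming only that map skewness is the target function: amplitudes are kept fixed, the phases of a few strong "reflections" are perturbed, and a simple mutate-and-select loop (a stand-in for the paper's genetic algorithm and its SISA program) searches for phases that maximize the skewness of the reconstructed density.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(4)

# Toy 1-D "protein-like" density: a few positive peaks on a flat background.
n = 256
x = np.arange(n)
density = sum(np.exp(-0.5 * ((x - c) / 3.0) ** 2) for c in (40, 90, 170, 200))

coeffs = np.fft.rfft(density)
amps, true_phases = np.abs(coeffs), np.angle(coeffs)

# "Experimental" phases: true phases plus noise. Optimise only the strongest
# reflections (excluding the k = 0 term).
noisy_phases = true_phases + rng.normal(0.0, 1.2, true_phases.size)
strong = np.argsort(amps[1:])[::-1][:8] + 1

def map_skew(phases_for_strong):
    ph = noisy_phases.copy()
    ph[strong] = phases_for_strong
    return skew(np.fft.irfft(amps * np.exp(1j * ph), n))

# Simple mutate-and-select loop (stand-in for a genetic algorithm):
best = noisy_phases[strong].copy()
best_score = map_skew(best)
for _ in range(300):
    child = best + rng.normal(0.0, 0.3, best.size)
    score = map_skew(child)
    if score > best_score:
        best, best_score = child, score

print("map skewness with noisy phases    :", round(map_skew(noisy_phases[strong]), 3))
print("map skewness with optimised phases:", round(best_score, 3))
```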

  3. Superfund Green Remediation

    Science.gov (United States)

    Green remediation is the practice of considering all environmental effects of site cleanup and incorporating options – like the use of renewable energy resources – to maximize the environmental benefits of cleanups.

  4. The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation

    NARCIS (Netherlands)

    Wetzels, Sandra; Kester, Liesbeth; Van Merriënboer, Jeroen; Broers, Nick

    2010-01-01

    Wetzels, S. A. J., Kester, L., Van Merriënboer, J. J. G., & Broers, N. J. (2011). The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation. British Journal of Educational Psychology, 81(2), 274-291. doi: 10.1348/000709910X517425

  5. Learning priors for Bayesian computations in the nervous system.

    Directory of Open Access Journals (Sweden)

    Max Berniker

    Full Text Available Our nervous system continuously combines new information from our senses with information it has acquired throughout life. Numerous studies have found that human subjects manage this by integrating their observations with their previous experience (priors) in a way that is close to the statistical optimum. However, little is known about the way the nervous system acquires or learns priors. Here we present results from experiments where the underlying distribution of target locations in an estimation task was switched, manipulating the prior subjects should use. Our experimental design allowed us to measure a subject's evolving prior while they learned. We confirm that through extensive practice subjects learn the correct prior for the task. We found that subjects can rapidly learn the mean of a new prior while the variance is learned more slowly and with a variable learning rate. In addition, we found that a Bayesian inference model could predict the time course of the observed learning while offering an intuitive explanation for the findings. The evidence suggests the nervous system continuously updates its priors to enable efficient behavior.
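
    A worked numerical example of the optimal integration the record refers to (illustrative numbers only): with a normal prior over target location and a normal likelihood for a single noisy observation, the Bayes-optimal estimate is the precision-weighted average of prior mean and observation.

```python
# Illustrative numbers only: a learned normal prior over target location
# and a normal likelihood from one noisy observation.
prior_mu, prior_sd = 0.0, 2.0     # prior over target locations
obs, obs_sd = 3.0, 1.0            # noisy sensory observation

# Bayes-optimal estimate = precision-weighted average of prior mean and observation.
w_prior = 1.0 / prior_sd**2
w_obs = 1.0 / obs_sd**2
estimate = (w_prior * prior_mu + w_obs * obs) / (w_prior + w_obs)
print(estimate)   # 2.4 -- the observation is pulled toward the prior mean

# When the experimenter switches the target distribution, the subject must
# relearn prior_mu (quickly, per the study) and prior_sd (more slowly).
```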

  6. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    Science.gov (United States)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.

  7. Comparison of different strategies for using fossil calibrations to generate the time prior in Bayesian molecular clock dating.

    Science.gov (United States)

    Barba-Montoya, Jose; Dos Reis, Mario; Yang, Ziheng

    2017-09-01

    Fossil calibrations are the utmost source of information for resolving the distances between molecular sequences into estimates of absolute times and absolute rates in molecular clock dating analysis. The quality of calibrations is thus expected to have a major impact on divergence time estimates even if a huge amount of molecular data is available. In Bayesian molecular clock dating, fossil calibration information is incorporated in the analysis through the prior on divergence times (the time prior). Here, we evaluate three strategies for converting fossil calibrations (in the form of minimum- and maximum-age bounds) into the prior on times, which differ according to whether they borrow information from the maximum age of ancestral nodes and minimum age of descendent nodes to form constraints for any given node on the phylogeny. We study a simple example that is analytically tractable, and analyze two real datasets (one of 10 primate species and another of 48 seed plant species) using three Bayesian dating programs: MCMCTree, MrBayes and BEAST2. We examine how different calibration strategies, the birth-death process, and automatic truncation (to enforce the constraint that ancestral nodes are older than descendent nodes) interact to determine the time prior. In general, truncation has a great impact on calibrations so that the effective priors on the calibration node ages after the truncation can be very different from the user-specified calibration densities. The different strategies for generating the effective prior also had considerable impact, leading to very different marginal effective priors. Arbitrary parameters used to implement minimum-bound calibrations were found to have a strong impact upon the prior and posterior of the divergence times. Our results highlight the importance of inspecting the joint time prior used by the dating program before any Bayesian dating analysis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
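
    A small illustration of the truncation effect discussed above, under simplified assumptions (uniform calibration densities on two nodes, no birth-death process): enforcing ancestor-older-than-descendant by rejection shows how the effective marginal prior can drift away from the user-specified density.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical minimum/maximum age bounds (Ma) for an ancestral node and one
# of its descendants; uniform calibration densities for simplicity.
anc_lo, anc_hi = 50.0, 100.0
des_lo, des_hi = 40.0, 90.0

n = 200_000
anc = rng.uniform(anc_lo, anc_hi, n)
des = rng.uniform(des_lo, des_hi, n)

# Truncation: keep only draws where the ancestor is older than the descendant.
keep = anc > des
print("fraction of joint prior kept:", keep.mean().round(3))
print("specified descendant prior mean:", (des_lo + des_hi) / 2)
print("effective descendant prior mean:", des[keep].mean().round(2))
# The effective (marginal) prior on the descendant age is shifted younger than
# the user-specified uniform density -- the kind of distortion the authors
# recommend inspecting before running a dating analysis.
```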

  8. Estimating Functions with Prior Knowledge, (EFPK) for diffusions

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Madsen, Henrik

    2003-01-01

    In this paper a method is formulated in an estimating function setting for parameter estimation, which allows the use of prior information. The main idea is to use prior knowledge of the parameters, either specified as moments restrictions or as a distribution, and use it in the construction of an estimating function. It may be useful when the full Bayesian analysis is difficult to carry out for computational reasons. This is almost always the case for diffusions, which is the focus of this paper, though the method applies in other settings.

  9. 29 CFR 452.40 - Prior office holding.

    Science.gov (United States)

    2010-07-01

    ... DISCLOSURE ACT OF 1959 Candidacy for Office; Reasonable Qualifications § 452.40 Prior office holding. A.... 26 26 Wirtz v. Hotel, Motel and Club Employees Union, Local 6, 391 U.S. 492 at 504. The Court stated...

  10. Form of prior for constrained thermodynamic processes with uncertainty

    Science.gov (United States)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider the quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both the entropy-conserving and the energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation for the prior and the estimated value of temperature obtained therefrom. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.

  11. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required for other techniques. In the first scheme, the prior is obtained through a true frequency based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least-squares technique.

  12. Prior Expectations Bias Sensory Representations in Visual Cortex

    NARCIS (Netherlands)

    Kok, P.; Brouwer, G.J.; Gerven, M.A.J. van; Lange, F.P. de

    2013-01-01

    Perception is strongly influenced by expectations. Accordingly, perception has sometimes been cast as a process of inference, whereby sensory inputs are combined with prior knowledge. However, despite a wealth of behavioral literature supporting an account of perception as probabilistic inference,

  13. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-01

    to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method, more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate

  14. What good are actions? Accelerating learning using learned action priors

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2012-11-01

    Full Text Available The computational complexity of learning in sequential decision problems grows exponentially with the number of actions available to the agent at each state. We present a method for accelerating this process by learning action priors that express...
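
    One plausible, minimal reading of an action prior (illustrative only, not the paper's formulation): count how often each action was useful in previously solved tasks, smooth the counts into a distribution, and sample exploratory actions from that distribution instead of uniformly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_actions = 6

# Hypothetical counts of how often each action appeared in the optimal
# policies of previously solved tasks (invented numbers).
counts = np.array([40, 35, 10, 5, 5, 5])

# Dirichlet-smoothed action prior: frequently useful actions get most of the
# probability mass, but no action is ruled out entirely.
alpha0 = 1.0
action_prior = (counts + alpha0) / (counts + alpha0).sum()

def explore(q_values, eps=0.2):
    """Epsilon-greedy step that samples exploratory actions from the learned
    action prior instead of the uniform distribution."""
    if rng.random() < eps:
        return int(rng.choice(n_actions, p=action_prior))
    return int(np.argmax(q_values))

print(action_prior.round(3))
print(explore(np.zeros(n_actions)))
```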

  15. Investigations of groundwater system and simulation of regional groundwater flow for North Penn Area 7 Superfund site, Montgomery County, Pennsylvania

    Science.gov (United States)

    Senior, Lisa A.; Goode, Daniel J.

    2013-01-01

    Groundwater in the vicinity of several industrial facilities in Upper Gwynedd Township and vicinity, Montgomery County, in southeast Pennsylvania has been shown to be contaminated with volatile organic compounds (VOCs), the most common of which is the solvent trichloroethylene (TCE). The 2-square-mile area was placed on the National Priorities List as the North Penn Area 7 Superfund site by the U.S. Environmental Protection Agency (USEPA) in 1989. The U.S. Geological Survey (USGS) conducted geophysical logging, aquifer testing, and water-level monitoring, and measured streamflows in and near North Penn Area 7 from fall 2000 through fall 2006 in a technical assistance study for the USEPA to develop an understanding of the hydrogeologic framework in the area as part of the USEPA Remedial Investigation. In addition, the USGS developed a groundwater-flow computer model based on the hydrogeologic framework to simulate regional groundwater flow and to estimate directions of groundwater flow and pathways of groundwater contaminants. The study area is underlain by Triassic- and Jurassic-age sandstones and shales of the Lockatong Formation and Brunswick Group in the Mesozoic Newark Basin. Regionally, these rocks strike northeast and dip to the northwest. The sequence of rocks form a fractured-sedimentary-rock aquifer that acts as a set of confined to partially confined layers of differing permeabilities. Depth to competent bedrock typically is less than 20 ft below land surface. The aquifer layers are recharged locally by precipitation and discharge locally to streams. The general configuration of the potentiometric surface in the aquifer is similar to topography, except in areas affected by pumping. The headwaters of Wissahickon Creek are nearby, and the stream flows southwest, parallel to strike, to bisect North Penn Area 7. Groundwater is pumped in the vicinity of North Penn Area 7 for industrial use, public supply, and residential supply. Results of field investigations

  16. Valid MR imaging predictors of prior knee arthroscopy

    International Nuclear Information System (INIS)

    Discepola, Federico; Le, Huy B.Q.; Park, John S.; Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L.

    2012-01-01

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)
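
    For reference, the reported operating characteristics follow from a standard 2×2 table. The counts below are illustrative values chosen to roughly match the reported sensitivity and specificity for the 50/50 design; they are not the study's raw data.

```python
# Illustrative 2x2 counts for 50 knees with and 50 without prior arthroscopy,
# chosen to roughly match the reported sensitivity (~82%) and specificity (~72%).
tp, fn = 41, 9    # MPR fibrosis present / absent among prior-arthroscopy knees
fp, tn = 14, 36   # MPR fibrosis present / absent among control knees

sensitivity = tp / (tp + fn)                  # 0.82
specificity = tn / (tn + fp)                  # 0.72
ppv = tp / (tp + fp)                          # ~0.75
npv = tn / (tn + fn)                          # 0.80 (reported: 81%)
accuracy = (tp + tn) / (tp + fn + fp + tn)    # 0.77

print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} acc={accuracy:.2f}")
```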

  17. Valid MR imaging predictors of prior knee arthroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Discepola, Federico; Le, Huy B.Q. [McGill University Health Center, Jewsih General Hospital, Division of Musculoskeletal Radiology, Montreal, Quebec (Canada); Park, John S. [Annapolis Radiology Associates, Division of Musculoskeletal Radiology, Annapolis, MD (United States); Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L. [University of California San Diego (UCSD), Division of Musculoskeletal Radiology, San Diego, CA (United States)

    2012-01-15

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)

  18. Generalized multiple kernel learning with data-dependent priors.

    Science.gov (United States)

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.

  19. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.
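
    For context, the autocovariance structure that the Hurst exponent controls has a standard closed form; the short sketch below evaluates it and shows that H = 0.5 reduces to white noise, the base model the PC prior penalises departures from. This is background to the record, not the PC-prior derivation itself.

```python
import numpy as np

def fgn_autocov(h, lags):
    """Autocovariance of unit-variance fractional Gaussian noise with Hurst
    exponent h (standard closed form)."""
    k = np.abs(np.asarray(lags, dtype=float))
    return 0.5 * (np.abs(k + 1) ** (2 * h) - 2 * k ** (2 * h) + np.abs(k - 1) ** (2 * h))

lags = np.arange(0, 5)
for h in (0.5, 0.7, 0.9):
    print(h, fgn_autocov(h, lags).round(3))
# H = 0.5 gives zero autocovariance at all non-zero lags (white noise), the
# base model the PC prior shrinks toward; H > 0.5 gives persistent correlations.
```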

  20. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-07-07

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.

  1. SUSPENSION OF THE PRIOR DISCIPLINARY INVESTIGATION ACCORDING TO LABOR LAW

    Directory of Open Access Journals (Sweden)

    Nicolae, GRADINARU

    2014-11-01

    Full Text Available In order to conduct the prior disciplinary investigation, the employee shall be convoked in writing by the person authorized by the employer to carry out the investigation, specifying the subject, date, time and place of the meeting. For this purpose the employer shall appoint a committee charged with conducting the prior disciplinary investigation. The prior disciplinary investigation cannot be carried out without giving the accused person the possibility to defend himself; it would be an abuse for the employer to violate these provisions. Since the employee is entitled to formulate and sustain a defence, proving innocence or a lesser degree of guilt than imputed, a reasonable period must elapse between the moment the charges are disclosed to the employee and the moment the prior disciplinary investigation is carried out, so that the employee is able to prepare a defence. The employee's failure to appear at the convocation without an objective reason entitles the employer to impose the sanction without conducting the prior disciplinary investigation. The objective reason that prevents the employee subject to the prior disciplinary investigation from attending must exist at the time of the investigation in question.

  2. Identification of subsurface structures using electromagnetic data and shape priors

    Energy Technology Data Exchange (ETDEWEB)

    Tveit, Svenn, E-mail: svenn.tveit@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway); Bakr, Shaaban A., E-mail: shaaban.bakr1@gmail.com [Department of Mathematics, Faculty of Science, Assiut University, Assiut 71516 (Egypt); Uni CIPR, Uni Research, Bergen 5020 (Norway); Lien, Martha, E-mail: martha.lien@octio.com [Uni CIPR, Uni Research, Bergen 5020 (Norway); Octio AS, Bøhmergaten 44, Bergen 5057 (Norway); Mannseth, Trond, E-mail: trond.mannseth@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway)

    2015-03-01

    We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of kernel function, which is application dependent. We argue for using the conditionally positive definite kernel which is shown to have computational advantages over the commonly applied gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.

  3. Neutrino masses and their ordering: global data, priors and models

    Science.gov (United States)

    Gariazzo, S.; Archidiacono, M.; de Salas, P. F.; Mena, O.; Ternes, C. A.; Tórtola, M.

    2018-03-01

    We present a full Bayesian analysis of the combination of current neutrino oscillation, neutrinoless double beta decay and Cosmic Microwave Background observations. Our major goal is to carefully investigate the possibility to single out one neutrino mass ordering, namely Normal Ordering or Inverted Ordering, with current data. Two possible parametrizations (three neutrino masses versus the lightest neutrino mass plus the two oscillation mass splittings) and priors (linear versus logarithmic) are exhaustively examined. We find that the preference for NO is only driven by neutrino oscillation data. Moreover, the values of the Bayes factor indicate that the evidence for NO is strong only when the scan is performed over the three neutrino masses with logarithmic priors; for every other combination of parameterization and prior, the preference for NO is only weak. As a by-product of our Bayesian analyses, we are able to (a) compare the Bayesian bounds on the neutrino mixing parameters to those obtained by means of frequentist approaches, finding a very good agreement; (b) determine that the lightest neutrino mass plus the two mass splittings parametrization, motivated by the physical observables, is strongly preferred over the three neutrino mass eigenstates scan and (c) find that logarithmic priors guarantee a weakly-to-moderately more efficient sampling of the parameter space. These results establish the optimal strategy to successfully explore the neutrino parameter space, based on the use of the oscillation mass splittings and a logarithmic prior on the lightest neutrino mass, when combining neutrino oscillation data with cosmology and neutrinoless double beta decay. We also show that the limits on the total neutrino mass ∑ mν can change dramatically when moving from one prior to the other. These results have profound implications for future studies on the neutrino mass ordering, as they crucially state the need for self-consistent analyses which explore the
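
    A rough illustration of why the prior on the lightest neutrino mass matters for the total mass (normal ordering assumed; the mass splittings are approximate representative values and the prior ranges are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Approximate oscillation mass splittings for normal ordering (eV^2);
# representative values, used only for illustration.
dm21_sq, dm31_sq = 7.4e-5, 2.5e-3

def total_mass(m1):
    m2 = np.sqrt(m1**2 + dm21_sq)
    m3 = np.sqrt(m1**2 + dm31_sq)
    return m1 + m2 + m3

n = 100_000
m_lin = rng.uniform(1e-5, 0.3, n)                            # linear prior on m1 (eV)
m_log = 10 ** rng.uniform(np.log10(1e-5), np.log10(0.3), n)  # logarithmic prior on m1

print("median sum(m_nu), linear prior:", round(float(np.median(total_mass(m_lin))), 3), "eV")
print("median sum(m_nu), log prior   :", round(float(np.median(total_mass(m_log))), 3), "eV")
# The log prior piles up near the minimal sum allowed by the splittings
# (about 0.06 eV for normal ordering), while the linear prior favours much
# larger sums -- one reason bounds on sum(m_nu) are prior dependent.
```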

  4. Finding A Minimally Informative Dirichlet Prior Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
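
    As a rough illustration of the constrained least-squares idea described above (not the authors' exact formulation), the sketch below fits Dirichlet parameters so that the marginal means match nominal alpha-factor values while a variance constraint on the first component keeps the prior diffuse; all target values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative targets only: nominal marginal means for three alpha-factors
# and a target variance for the first component (not values from the paper).
target_mean = np.array([0.90, 0.07, 0.03])
target_var1 = 0.01

def dirichlet_moments(alpha):
    a0 = alpha.sum()
    mean = alpha / a0
    var = alpha * (a0 - alpha) / (a0**2 * (a0 + 1.0))
    return mean, var

def objective(alpha):
    mean, var = dirichlet_moments(alpha)
    # Least-squares misfit between the Dirichlet moments and the constraints.
    return np.sum((mean - target_mean) ** 2) + (var[0] - target_var1) ** 2

res = minimize(objective, x0=np.ones(3), bounds=[(1e-6, None)] * 3)
print("fitted Dirichlet parameters:", res.x)
print("implied marginal means and variances:", dirichlet_moments(res.x))
```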

  5. Prior-based artifact correction (PBAC) in computed tomography

    International Nuclear Information System (INIS)

    Heußer, Thorsten; Brehm, Marcus; Ritschl, Ludwig; Sawall, Stefan; Kachelrieß, Marc

    2014-01-01

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or in the form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data
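
    The data-completion step can be pictured with a small sketch. It assumes a 2D parallel-beam geometry, uses scikit-image's radon/iradon as stand-ins for the clinical forward projector and reconstructor, and overwrites corrupt sinogram bins with forward projections of the registered prior (direct substitution standing in for the smooth sinogram inpainting described above); the phantom, mask, and sizes are illustrative.

```python
import numpy as np
from skimage.transform import radon, iradon

def pbac_like_completion(patient_sino, prior_image, corrupt_mask, theta):
    """Replace corrupt sinogram bins with forward projections of a registered
    prior image; direct substitution stands in for smooth sinogram inpainting."""
    prior_sino = radon(prior_image, theta=theta)
    completed = np.where(corrupt_mask, prior_sino, patient_sino)
    return iradon(completed, theta=theta)

# Toy usage with a synthetic phantom standing in for the patient and prior data.
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
prior = np.zeros((128, 128)); prior[32:96, 32:96] = 1.0
patient = prior.copy(); patient[60:68, 60:68] = 2.0        # small anatomical change
sino = radon(patient, theta=theta)
mask = np.zeros_like(sino, dtype=bool); mask[:, 80:100] = True   # "missing" views
recon = pbac_like_completion(sino, prior, mask, theta)
```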

  6. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  7. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straight-forward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that is often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  8. Advanced prior modeling for 3D bright field electron tomography

    Science.gov (United States)

    Sreehari, Suhas; Venkatakrishnan, S. V.; Drummy, Lawrence F.; Simmons, Jeffrey P.; Bouman, Charles A.

    2015-03-01

    Many important imaging problems in material science involve reconstruction of images containing repetitive non-local structures. Model-based iterative reconstruction (MBIR) could in principle exploit such redundancies through the selection of a log prior probability term. However, in practice, determining such a log prior term that accounts for the similarity between distant structures in the image is quite challenging. Much progress has been made in the development of denoising algorithms like non-local means and BM3D, and these are known to successfully capture non-local redundancies in images. But the fact that these denoising operations are not explicitly formulated as cost functions makes it unclear how to incorporate them into the MBIR framework. In this paper, we formulate a solution to bright field electron tomography by augmenting the existing bright field MBIR method to incorporate any non-local denoising operator as a prior model. We accomplish this using a framework we call plug-and-play priors that decouples the log likelihood and the log prior probability terms in the MBIR cost function. We specifically use 3D non-local means (NLM) as the prior model in the plug-and-play framework, and showcase high-quality tomographic reconstructions of a simulated aluminum spheres dataset, and two real datasets of aluminum spheres and ferritin structures. We observe that streak and smear artifacts are visibly suppressed, and that edges are preserved. Also, we report lower RMSE values compared to the conventional MBIR reconstruction using qGGMRF as the prior model.
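
    A minimal sketch of the plug-and-play decoupling described above is given below, assuming a generic linear forward model supplied as callables A and AT and using a Gaussian filter as a stand-in for the 3D non-local means denoiser; it is not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def plug_and_play_admm(y, A, AT, shape, sigma=1.0, rho=0.5, n_iter=50):
    """Sketch of plug-and-play ADMM: the data term is handled by gradient
    steps on ||Ax - y||^2, and the prior is applied implicitly by calling a
    denoiser (here a Gaussian filter standing in for 3D non-local means)."""
    x = np.zeros(shape)
    v = np.zeros(shape)
    u = np.zeros(shape)
    step = 1e-2
    for _ in range(n_iter):
        # x-update: a few gradient steps on the augmented-Lagrangian data term.
        for _ in range(5):
            grad = AT(A(x) - y) + rho * (x - v + u)
            x = x - step * grad
        # v-update: the proximal step is replaced by the plug-in denoiser.
        v = gaussian_filter(x + u, sigma=sigma)
        # Dual update.
        u = u + x - v
    return x
```

    Any denoiser with the same call signature could be plugged in place of gaussian_filter, which is the point of the decoupled formulation.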

  9. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shuanghui Zhang

    2016-04-01

    Full Text Available This paper presents a novel Inverse Synthetic Aperture Radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve improved performance in sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better-focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. Maximum a posteriori (MAP) estimation and maximum likelihood estimation (MLE) are utilized to estimate the model parameters and so avoid a manual tuning process. Additionally, the fast Fourier transform (FFT) and the Hadamard product are used to reduce the required computation. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.
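
    The abstract does not give the exact functional form of the prior, so the following sketch only contrasts the negative-log penalty of a Laplacian prior with a commonly used logarithmic penalty, log(1 + |x|/b), to illustrate the sharply peaked, heavy-tailed behaviour being claimed; the scale b is an assumption.

```python
import numpy as np

b = 0.5                       # scale parameter (assumed)
x = np.linspace(-3, 3, 601)

laplace_penalty = np.abs(x) / b                    # -log prior for a Laplacian, up to a constant
log_laplace_penalty = np.log(1.0 + np.abs(x) / b)  # assumed "logarithmic" variant

# The log penalty rises much more slowly for large |x| (heavier tails) while
# remaining sharply peaked at zero, which is the property credited above for
# improved sparse representation.
print(laplace_penalty[[300, 500]], log_laplace_penalty[[300, 500]])
```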

  10. Source Localization by Entropic Inference and Backward Renormalization Group Priors

    Directory of Open Access Journals (Sweden)

    Nestor Caticha

    2015-04-01

    Full Text Available A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors in finer scales from posteriors in coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posteriors by Maximum Entropy. The resulting inference method, backward RG (BRG) priors, is tested by doing simulations of a functional magnetic resonance imaging (fMRI) experiment. Its results are compared with a Bayesian approach working in the finest available resolution. Using BRG priors, sources can be partially identified even when signal-to-noise ratio levels are up to ~ -25 dB, improving vastly on the single-step Bayesian approach. For low levels of noise the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the histograms of hyperparameters can show how to distinguish if the method is failing, due to very high levels of noise, or if the identification of the sources is, at least partially, possible.

  11. A Noninformative Prior on a Space of Distribution Functions

    Directory of Open Access Journals (Sweden)

    Alexander Terenin

    2017-07-01

    Full Text Available In a given problem, the Bayesian statistical paradigm requires the specification of a prior distribution that quantifies relevant information about the unknowns of main interest external to the data. In cases where little such information is available, the problem under study may possess an invariance under a transformation group that encodes a lack of information, leading to a unique prior—this idea was explored at length by E.T. Jaynes. Previous successful examples have included location-scale invariance under linear transformation, multiplicative invariance of the rate at which events in a counting process are observed, and the derivation of the Haldane prior for a Bernoulli success probability. In this paper we show that this method can be extended, by generalizing Jaynes, in two ways: (1) to yield families of approximately invariant priors; and (2) to the infinite-dimensional setting, yielding families of priors on spaces of distribution functions. Our results can be used to describe conditions under which a particular Dirichlet Process posterior arises from an optimal Bayesian analysis, in the sense that invariances in the prior and likelihood lead to one and only one posterior distribution.

  12. On Bayesian reliability analysis with informative priors and censoring

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1996-01-01

    In the statistical literature many methods have been presented to deal with censored observations, both within the Bayesian and non-Bayesian frameworks, and such methods have been successfully applied to, e.g., reliability problems. Also, in reliability theory it is often emphasized that, through shortage of statistical data and possibilities for experiments, one often needs to rely heavily on judgements of engineers, or other experts, for which means Bayesian methods are attractive. It is therefore important that such judgements can be elicited easily to provide informative prior distributions that reflect the knowledge of the engineers well. In this paper we focus on this aspect, especially on the situation that the judgements of the consulted engineers are based on experiences in environments where censoring has also been present previously. We suggest the use of the attractive interpretation of hyperparameters of conjugate prior distributions when these are available for assumed parametric models for lifetimes, and we show how one may go beyond the standard conjugate priors, using similar interpretations of hyper-parameters, to enable easier elicitation when censoring has been present in the past. This may even lead to more flexibility for modelling prior knowledge than when using standard conjugate priors, whereas the disadvantage of more complicated calculations that may be needed to determine posterior distributions play a minor role due to the advanced mathematical and statistical software that is widely available these days
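
    A textbook example of the kind of conjugate updating with censoring discussed above (not the authors' extended construction) is a gamma prior on an exponential failure rate, where right-censored units contribute exposure time but no failure count.

```python
# Gamma-exponential conjugate updating with right-censored lifetimes:
# prior Gamma(a, b) on the failure rate lambda; the posterior is
# Gamma(a + number of observed failures, b + total time on test).
failures = [120.0, 340.0, 455.0]          # observed failure times (hours, illustrative)
censored = [500.0, 500.0, 500.0, 500.0]   # units still running at 500 h

a_prior, b_prior = 2.0, 1000.0            # elicited prior (illustrative values)
a_post = a_prior + len(failures)
b_post = b_prior + sum(failures) + sum(censored)

print("posterior mean failure rate:", a_post / b_post, "per hour")
```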

  13. Inflammatory pathways are central to posterior cerebrovascular artery remodelling prior to the onset of congenital hypertension.

    Science.gov (United States)

    Walas, Dawid; Nowicki-Osuch, Karol; Alibhai, Dominic; von Linstow Roloff, Eva; Coghill, Jane; Waterfall, Christy; Paton, Julian Fr

    2018-01-01

    Cerebral artery hypoperfusion may provide the basis for linking ischemic stroke with hypertension. Brain hypoperfusion may induce hypertension that may serve as an auto-protective mechanism to prevent ischemic stroke. We hypothesised that hypertension is caused by remodelling of the cerebral arteries, which is triggered by inflammation. We used a congenital rat model of hypertension and examined age-related changes in gene expression of the cerebral arteries using RNA sequencing. Prior to hypertension, we found changes in signalling pathways associated with the immune system and fibrosis. Validation studies using second harmonics generation microscopy revealed upregulation of collagen type I and IV in both tunica externa and media. These changes in the extracellular matrix of cerebral arteries pre-empted hypertension accounting for their increased stiffness and resistance, both potentially conducive to stroke. These data indicate that inflammatory driven cerebral artery remodelling occurs prior to the onset of hypertension and may be a trigger elevating systemic blood pressure in genetically programmed hypertension.

  14. Use of prior mammograms in the classification of benign and malignant masses

    International Nuclear Information System (INIS)

    Varela, Celia; Karssemeijer, Nico; Hendriks, Jan H.C.L.; Holland, Roland

    2005-01-01

    The purpose of this study was to determine the importance of using prior mammograms for classification of benign and malignant masses. Five radiologists and one resident classified mass lesions in 198 mammograms obtained from a population-based screening program. Cases were interpreted twice, once without and once with comparison of previous mammograms, in a sequential reading order using soft copy image display. The radiologists' performances in classifying benign and malignant masses without and with previous mammograms were evaluated with receiver operating characteristic (ROC) analysis. The statistical significance of the difference in performances was calculated using analysis of variance. The use of prior mammograms improved the classification performance of all participants in the study. The mean area under the ROC curve of the readers increased from 0.763 to 0.796. This difference in performance was statistically significant (P = 0.008)

  15. 29 CFR 5.16 - Training plans approved or recognized by the Department of Labor prior to August 20, 1975.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Training plans approved or recognized by the Department of... STANDARDS ACT) Davis-Bacon and Related Acts Provisions and Procedures § 5.16 Training plans approved or... contractor shall be required to obtain approval of a training program which, prior to August 20, 1975, was...

  16. Superposing pure quantum states with partial prior information

    Science.gov (United States)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

    The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d -dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.

  17. Understanding sleep disturbance in athletes prior to important competitions.

    Science.gov (United States)

    Juliff, Laura E; Halson, Shona L; Peiffer, Jeremiah J

    2015-01-01

    Anecdotally, many athletes report worse sleep in the nights prior to important competitions. Despite sleep being acknowledged as an important factor for optimal athletic performance and overall health, little is understood about athlete sleep around competition. The aims of this study were to identify sleep complaints of athletes prior to competitions and determine whether complaints were confined to competition periods. Cross-sectional study. A sample of 283 elite Australian athletes (129 male, 157 female, age 24±5 y) completed two questionnaires: the Competitive Sport and Sleep questionnaire and the Pittsburgh Sleep Quality Index. In all, 64.0% of athletes indicated worse sleep on at least one occasion in the nights prior to an important competition over the past 12 months. The main sleep problem specified by athletes was difficulty falling asleep (82.1%), with the main reasons for poor sleep given as thoughts about the competition (83.5%) and nervousness (43.8%). Overall, 59.1% of team sport athletes reported having no strategy to overcome poor sleep, compared with individual athletes (32.7%, p=0.002), who utilised relaxation and reading as strategies. Individual sport athletes had increased likelihood of poor sleep as they aged. The poor sleep reported by athletes prior to competition was situational rather than a global sleep problem. Poor sleep is common prior to major competitions in Australian athletes, yet most athletes are unaware of strategies to overcome the poor sleep experienced. It is essential that coaches and scientists monitor and educate both individual and team sport athletes to facilitate sleep prior to important competitions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  18. Neutrino mass priors for cosmology from random matrices

    Science.gov (United States)

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; Dodelson, Scott

    2018-02-01

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σ mν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π (Σ mν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix Mν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over Mν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σ mν that we interpret as a Bayesian prior probability π (Σ mν). Assuming a basis-invariant probability distribution on Mν, also known as the anarchy hypothesis, we find that π (Σ mν) peaks close to the smallest Σ mν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π (Σ mν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. We present fitting functions for π (Σ mν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
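
    The eigenvalue-repulsion effect described above can be seen in a small Monte Carlo sketch: draw complex symmetric (Majorana-like) mass matrices with i.i.d. Gaussian entries, take their singular values as masses, and histogram the sum. Units, normalization, and the measured mass-splitting constraints are omitted, so this is only a qualitative illustration of the anarchy-hypothesis prior, not the paper's calculation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 20000
sums = np.empty(n_samples)

for i in range(n_samples):
    # Complex symmetric 3x3 matrix with i.i.d. Gaussian entries (Majorana-like).
    a = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    m = (a + a.T) / 2.0
    masses = np.linalg.svd(m, compute_uv=False)   # singular values play the role of masses
    sums[i] = masses.sum()

# Histogram of the induced "prior" on the sum of masses (arbitrary units);
# eigenvalue repulsion pushes the distribution away from degenerate spectra.
hist, edges = np.histogram(sums, bins=50, density=True)
print("mode of the induced prior (arbitrary units):", edges[np.argmax(hist)])
```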

  19. Physical examination prior to initiating hormonal contraception: a systematic review.

    Science.gov (United States)

    Tepper, Naomi K; Curtis, Kathryn M; Steenland, Maria W; Marchbanks, Polly A

    2013-05-01

    Provision of contraception is often linked with physical examination, including clinical breast examination (CBE) and pelvic examination. This review was conducted to evaluate the evidence regarding outcomes among women with and without physical examination prior to initiating hormonal contraceptives. The PubMed database was searched from database inception through March 2012 for all peer-reviewed articles in any language concerning CBE and pelvic examination prior to initiating hormonal contraceptives. The quality of each study was assessed using the United States Preventive Services Task Force grading system. The search did not identify any evidence regarding outcomes among women screened versus not screened with CBE prior to initiation of hormonal contraceptives. The search identified two case-control studies of fair quality which compared women who did or did not undergo pelvic examination prior to initiating oral contraceptives (OCs) or depot medroxyprogesterone acetate (DMPA). No differences in risk factors for cervical neoplasia, incidence of sexually transmitted infections, incidence of abnormal Pap smears or incidence of abnormal wet mount findings were observed. Although women with breast cancer should not use hormonal contraceptives, there is little utility in screening prior to initiation, due to the low incidence of breast cancer and uncertain value of CBE among women of reproductive age. Two fair quality studies demonstrated no differences between women who did or did not undergo pelvic examination prior to initiating OCs or DMPA with respect to risk factors or clinical outcomes. In addition, pelvic examination is not likely to detect any conditions for which hormonal contraceptives would be unsafe. Published by Elsevier Inc.

  20. Compressive Online Robust Principal Component Analysis with Multiple Prior Information

    DEFF Research Database (Denmark)

    Van Luong, Huynh; Deligiannis, Nikos; Seiler, Jürgen

    -rank components. Unlike conventional batch RPCA, which processes all the data directly, our method considers a small set of measurements taken per data vector (frame). Moreover, our method incorporates multiple prior information signals, namely previously reconstructed frames, to improve the separation...... and thereafter, update the prior information for the next frame. Using experiments on synthetic data, we evaluate the separation performance of the proposed algorithm. In addition, we apply the proposed algorithm to online video foreground and background separation from compressive measurements. The results show

  1. Prior knowledge processing for initial state of Kalman filter

    Czech Academy of Sciences Publication Activity Database

    Suzdaleva, Evgenia

    2010-01-01

    Roč. 24, č. 3 (2010), s. 188-202 ISSN 0890-6327 R&D Projects: GA ČR(CZ) GP201/06/P434 Institutional research plan: CEZ:AV0Z10750506 Keywords : Kalman filtering * prior knowledge * state-space model * initial state distribution Subject RIV: BC - Control Systems Theory Impact factor: 0.729, year: 2010 http://library.utia.cas.cz/separaty/2009/AS/suzdaleva-prior knowledge processing for initial state of kalman filter.pdf

  2. Role of strategies and prior exposure in mental rotation.

    Science.gov (United States)

    Cherney, Isabelle D; Neff, Nicole L

    2004-06-01

    The purpose of these two studies was to examine sex differences in strategy use and the effect of prior exposure on the performance on Vandenberg and Kuse's 1978 Mental Rotation Test. A total of 152 participants completed the spatial task and self-reported their strategy use. Consistent with previous studies, men outperformed women. Strategy usage did not account for these differences, although guessing did. Previous exposure to the Mental Rotation Test, American College Test scores and frequent computer or video game play predicted performance on the test. These results suggest that prior exposure to spatial tasks may provide cues to improve participants' performance.

  3. Phase transitions in restricted Boltzmann machines with generic priors

    Science.gov (United States)

    Barra, Adriano; Genovese, Giuseppe; Sollich, Peter; Tantari, Daniele

    2017-10-01

    We study generalized restricted Boltzmann machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables. We present a complete analysis of the replica symmetric phase diagram of these systems, which can be regarded as generalized Hopfield models. We underline the role of the retrieval phase for both inference and learning processes and we show that retrieval is robust for a large class of weight and unit priors, beyond the standard Hopfield scenario. Furthermore, we show how the paramagnetic phase boundary is directly related to the optimal size of the training set necessary for good generalization in a teacher-student scenario of unsupervised learning.

  4. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in
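
    Schematically, PIBR-type estimators balance a data-fidelity term against a penalty on deviations from the registered prior image, with the prior-image strength as the tuning parameter discussed above. The sketch below uses simple quadratic penalties and plain gradient descent, with beta_p allowed to be a spatially varying map; it illustrates the objective, not the paper's algorithm or its prospective parameter-selection method.

```python
import numpy as np

def pibr_reconstruct(y, A, AT, x_prior, beta_p, n_iter=200, step=1e-3):
    """Minimise ||A x - y||^2 + beta_p * ||x - x_prior||^2 by gradient descent
    (constant factors of two are absorbed into the step size). beta_p may be a
    scalar or a spatially varying map with the same shape as x_prior, mirroring
    the prior-image strength map discussed above. Quadratic penalties stand in
    for the more general penalties used in PIBR."""
    x = x_prior.copy()
    for _ in range(n_iter):
        grad = AT(A(x) - y) + beta_p * (x - x_prior)
        x = x - step * grad
    return x
```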

  5. C. A. Meredith, A. N. Prior, and Possible Worlds

    DEFF Research Database (Denmark)

    Hasle, Per Frederik Vilhelm; Rybaříková, Zuzana

    of Meredith’s and Prior’s work. On the one hand, it might cause corruption of Meredith’s system of logic and lead to paradoxes, as Prior pointed out in ‘Modal Logic with Functorial Variables and a Contingent Constant’. On the other hand, considering Prior as a mere follower of Meredith could cause......, their understanding of the relevant formal representations and indeed their general approach to modal logic considerably differed. These differences should be pointed out in order to more precisely appreciate the contribution of each of these authors. To neglect the differences could cause the misinterpretation...

  6. Identification of potential water-bearing zones by the use of borehole geophysics in the vicinity of Keystone Sanitation Superfund Site, Adams County, Pennsylvania and Carroll County, Maryland

    Science.gov (United States)

    Conger, Randall W.

    1997-01-01

    Between April 23, 1996, and June 21, 1996, the U.S. Environmental Protection Agency contracted Haliburton-NUS, Inc., to drill four clusters of three monitoring wells near the Keystone Sanitation Superfund Site. The wells allow monitoring and sampling of shallow, intermediate, and deep water-bearing zones in order to determine the horizontal and vertical distribution of any contaminated ground water migrating from the Keystone Site. Twelve monitoring wells, ranging in depth from 50 to 397.9 feet below land surface, were drilled in the vicinity of the Keystone Site. The U.S. Geological Survey conducted borehole-geophysical logging and determined, with geophysical logs and other available data, the ideal intervals to be screened in each well. Geophysical logs were run on four intermediate and four deep wells, and only a caliper log was run on shallow well CL-AD-173 (HN-1S). Interpretation of geophysical logs and existing data determined the placement of screens within each borehole.

  7. Occurrences and fate of DDT principal isomers/metabolites, DDA, and o,p'-DDD enantiomers in fish, sediment and water at a DDT-impacted Superfund site.

    Science.gov (United States)

    Garrison, A W; Cyterski, M; Roberts, K D; Burdette, D; Williamson, J; Avants, J K

    2014-11-01

    In the 1950s and 60s, discharges from a DDT manufacturing plant contaminated a tributary system of the Tennessee River near Huntsville, Alabama, USA. Regulatory action resulted in declaring the area a Superfund site which required remediation and extensive monitoring. Monitoring data collected from 1988, after remediation, through 2011 showed annual decreases approximating first-order decay in concentrations of total DDT and its six principal congeners (p,p'-DDT, o,p'-DDT, p,p'-DDD, o,p'-DDD, p,p'-DDE and o,p'-DDE) in filets from three species of fish. As of 2013, these concentrations met the regulatory requirements of 5 mg/kg or less total DDT for each fish tested. The enantiomer fractions (EF) of chiral o,p'-DDD in smallmouth buffalo and channel catfish were always below 0.5, indicating preferential decay of the (+)-enantiomer of this congener; this EF did not change significantly over 15 years. The often-neglected DDT metabolite p,p'-DDA was found at a concentration of about 20 μg/l in the ecosystem water. Published by Elsevier Ltd.

  8. Time series geophysical monitoring of permanganate injections and in situ chemical oxidation of PCE, OU1 area, Savage Superfund Site, Milford, NH, USA

    Science.gov (United States)

    Harte, Philip T.; Smith, Thor E.; Williams, John H.; Degnan, James R.

    2012-01-01

    In situ chemical oxidation (ISCO) treatment with sodium permanganate, an electrically conductive oxidant, provides a strong electrical signal for tracking of injectate transport using time series geophysical surveys including direct current (DC) resistivity and electromagnetic (EM) methods. Effective remediation is dependent upon placing the oxidant in close contact with the contaminated aquifer. Therefore, monitoring tools that provide enhanced tracking capability of the injectate offer considerable benefit to guide subsequent ISCO injections. Time-series geophysical surveys were performed at a superfund site in New Hampshire, USA over a one-year period to identify temporal changes in the bulk electrical conductivity of a tetrachloroethylene (PCE; also called tetrachloroethene) contaminated, glacially deposited aquifer due to the injection of sodium permanganate. The ISCO treatment involved a series of pulse injections of sodium permanganate from multiple injection wells within a contained area of the aquifer. After the initial injection, the permanganate was allowed to disperse under ambient groundwater velocities. Time series geophysical surveys identified the downward sinking and pooling of the sodium permanganate atop of the underlying till or bedrock surface caused by density-driven flow, and the limited horizontal spread of the sodium permanganate in the shallow parts of the aquifer during this injection period. When coupled with conventional monitoring, the surveys allowed for an assessment of ISCO treatment effectiveness in targeting the PCE plume and helped target areas for subsequent treatment.

  9. Time series geophysical monitoring of permanganate injections and in situ chemical oxidation of PCE, OU1 area, Savage Superfund Site, Milford, NH, USA.

    Science.gov (United States)

    Harte, Philip T; Smith, Thor E; Williams, John H; Degnan, James R

    2012-05-01

    In situ chemical oxidation (ISCO) treatment with sodium permanganate, an electrically conductive oxidant, provides a strong electrical signal for tracking of injectate transport using time series geophysical surveys including direct current (DC) resistivity and electromagnetic (EM) methods. Effective remediation is dependent upon placing the oxidant in close contact with the contaminated aquifer. Therefore, monitoring tools that provide enhanced tracking capability of the injectate offer considerable benefit to guide subsequent ISCO injections. Time-series geophysical surveys were performed at a superfund site in New Hampshire, USA over a one-year period to identify temporal changes in the bulk electrical conductivity of a tetrachloroethylene (PCE; also called tetrachloroethene) contaminated, glacially deposited aquifer due to the injection of sodium permanganate. The ISCO treatment involved a series of pulse injections of sodium permanganate from multiple injection wells within a contained area of the aquifer. After the initial injection, the permanganate was allowed to disperse under ambient groundwater velocities. Time series geophysical surveys identified the downward sinking and pooling of the sodium permanganate atop of the underlying till or bedrock surface caused by density-driven flow, and the limited horizontal spread of the sodium permanganate in the shallow parts of the aquifer during this injection period. When coupled with conventional monitoring, the surveys allowed for an assessment of ISCO treatment effectiveness in targeting the PCE plume and helped target areas for subsequent treatment. Published by Elsevier B.V.

  10. Superfund Record of Decision (EPA Region 4): Oak Ridge Reservation (USDOE), (Operable Unit 3), Anderson County, Oak Ridge, TN. (Second remedial action), September 1991

    International Nuclear Information System (INIS)

    1991-01-01

    The Oak Ridge Reservation (ORR) (USDOE) (Operable Unit 3) site is an active nuclear weapons component manufacturing facility located in Oak Ridge, Anderson County, Tennessee. The Y-12 plant, which is addressed as Operable Unit 3, is one of several hundred waste disposal sites or areas of contamination at the ORR site requiring Superfund remedial action. The site occupies the upper reaches of East Fork Poplar Creek (EFPC) in Bear Creek Valley. From 1940 to the present, the Y-12 plant has been used to produce nuclear weapons components. From 1955 to 1963, mercury was used in a column-exchange process to separate lithium isotopes. Testing of the three concrete tanks showed that the tank sediment contained mercury, and that contaminated waste is still being discharged into two of the three tanks. The Record of Decision (ROD) focuses on the contaminated sediment in the sedimentation tanks as an interim action. The primary contaminants of concern affecting the sediment are mercury (a metal) and radioactive materials. The selected interim remedial action for the site is included

  11. PCBs and DDE in tree swallow (Tachycineta bicolor) eggs and nestlings from an estuarine PCB superfund site, New Bedford Harbor, MA, U.S.A.

    Science.gov (United States)

    Jayaraman, Saro; Nacci, Diane E.; Champlin, Denise M.; Pruell, Richard J.; Rocha, Kenneth J.; Custer, Christine M.; Custer, Thomas W.; Cantwell, Mark

    2009-01-01

    While breeding tree swallows (Tachycineta bicolor) have been used as biomonitors for freshwater sites, we report the first use of this species to assess contaminant bioaccumulation from estuarine breeding grounds into these aerial insectivores. Eggs and nestlings were collected from nest boxes in a polychlorinated biphenyl (PCB) contaminated estuary, the New Bedford Harbor Superfund site (NBH, Massachusetts, USA), and a reference salt marsh, Fox Hill (FH, Jamestown, Rhode Island, USA). Sediments, eggs, and nestlings were compared on a ng g−1 wet weight basis for total PCBs and DDE (1,1-bis-(4-chlorophenyl)-2,2-dichloroethene), metabolite of DDT (1,1,1-trichloro-2,2-bis-(p-chlorophenyl)ethane). NBH samples contained high concentrations of PCBs compared to FH for sediment (36,500 and 0.2), eggs (11,200 and 323), and nestlings (16,800 and 26). PCB homologue patterns linked tree swallow contamination to NBH sediment. NBH samples were also contaminated with DDE compared to FH for sediment (207 and 0.9) and nestlings (235 and 30) but not for eggs (526 and 488), suggesting both NBH and nonbreeding ground sources for DDE. The relationships between sediment and tree swallow egg and nestling PCBs were similar to those reported for freshwater sites. Like some highly contaminated freshwater sites, NBH PCB bioaccumulation had little apparent effect on reproductive success.

  12. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM Evaluation of Soil Amendment Technologies at the Crooksville/RosevillePottery Area of Concern Rocky Mountain Remediation ServicesEnvirobond™ Process

    Science.gov (United States)

    RMRS developed the Envirobond™ process to treat heavy metals in soil. This phosphate-based technology consists of a proprietary powder and solution that binds with metals in contaminated waste. RMRS claims that the Envirobond™ process converts metal contaminants from their leach...

  13. SUPERFUND TREATABILITY CLEARINGHOUSE: SOIL STABILIZATION PILOT STUDY, UNITED CHROME NPL SITE PILOT STUDY AND HEALTH AND SAFETY PROGRAM, UNITED CHROME NPL SITE PILOT STUDY

    Science.gov (United States)

    This document is a project plan for a pilot study at the United Chrome NPL site, Corvallis, Oregon and includes the health and safety and quality assurance/quality control plans. The plan reports results of a bench-scale study of the treatment process as measured by the ...

  14. Establishing a regulatory framework for a RCRA corrective action program

    International Nuclear Information System (INIS)

    Krueger, J.W.

    1989-01-01

    Recently, the environmental community has become keenly aware of problems associated with integration of the demanding regulations that apply to environmental restoration activities. One cannot attend an EPA-sponsored conference on Superfund without hearing questions concerning the Resource Conservation and Recovery Act (RCRA) and the applicability of the National Contingency Plan (NCP) to sites that do not qualify for the National Priorities List (NPL). In particular, the U.S. Department of Energy (DOE) has been greatly criticized for its inability to define a comprehensive approach for cleaning up its hazardous waste sites. This article presents two decision flowcharts designed to resolve some of this confusion for DOE. The RCRA/CERCLA integration diagram can help the environmental manager determine which law applies and under what conditions, and the RCRA corrective action decision flowchart can guide the manager in determining which specific sections of RCRA apply to a RCRA-lead environmental restoration program

  15. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  16. Extending Prior Posts in Dyadic Online Text Chat

    Science.gov (United States)

    Tudini, Vincenza

    2015-01-01

    This study explores whether chat users are able to extend prior, apparently completed posts in the dyadic online text chat context. Dyadic text chat has a unique turn-taking system, and most chat software does not permit users to monitor one another's written messages-in-progress. This is likely to impact on their use of online extensions as an…

  17. 13 CFR 305.14 - Occupancy prior to completion.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Occupancy prior to completion. 305.14 Section 305.14 Business Credit and Assistance ECONOMIC DEVELOPMENT ADMINISTRATION, DEPARTMENT OF... the Recipient's risk and must follow the requirements of local and State law. ...

  18. Do managers manipulate earnings prior to management buyouts?

    NARCIS (Netherlands)

    Mao, Yaping; Renneboog, Luc

    2015-01-01

    To address the question as to whether managers intending to purchase their company by means of a levered buyout transaction manipulate earnings in order to buy their firm on the cheap, we study the different types of earnings management prior to the transaction: accrual management, real earnings

  19. Do Managers Manipulate Earnings Prior to Management Buyouts?

    NARCIS (Netherlands)

    Mao, Y.; Renneboog, L.D.R.

    2013-01-01

    Abstract: To address the question as to whether managers manipulate accounting numbers downwards prior to management buyouts (MBOs), we implement an industry-adjusted buyout-specific approach and receive an affirmative answer. In UK buyout companies, negative earnings manipulation (understating the

  20. Recognition of Prior Learning as an integral component of ...

    African Journals Online (AJOL)

    This is irrespective of whether that learning has been acquired through unstructured learning, performance development, off-the-job assessment, or skills and knowledge that meet workplace needs but have been gained through various previous learning experiences. The concept Recognition of Prior Learning (RPL) is ...

  1. Investigation into alternative sludge conditioning prior to dewatering

    CSIR Research Space (South Africa)

    Smollen, M

    1997-01-01

    Full Text Available have proven that the mixture of char and a small quantity of polyelectrolyte (0.5 to 1kg per ton of dry solids), used as a conditioner prior to centrifugation and filtration tests, produced cake solids concentration superior to that obtained by using...

  2. An Adaptively Accelerated Bayesian Deblurring Method with Entropy Prior

    Directory of Open Access Journals (Sweden)

    Yong-Hoon Kim

    2008-05-01

    Full Text Available The development of an efficient, adaptively accelerated iterative deblurring algorithm based on a Bayesian statistical concept is reported. The entropy of the image is used as a “prior” distribution and, instead of the additive form used in conventional acceleration methods, an exponent form of the relaxation constant is used for acceleration. The proposed method is hereafter called adaptively accelerated maximum a posteriori with entropy prior (AAMAPE). Based on empirical observations in different experiments, the exponent is computed adaptively using first-order derivatives of the deblurred image from the previous two iterations. This exponent improves the speed of the AAMAPE method in the early stages and ensures stability at later stages of iteration. The AAMAPE method also enforces the constraints of nonnegativity and flux conservation. The paper discusses the fundamental idea of Bayesian image deblurring with entropy as the prior, and gives an analytical analysis of the superresolution and noise-amplification characteristics of the proposed method. Experimental results show that the proposed AAMAPE method gives lower RMSE and higher SNR in 44% fewer iterations compared to the nonaccelerated maximum a posteriori with entropy prior (MAPE) method. Moreover, AAMAPE followed by wavelet Wiener filtering gives better results than state-of-the-art methods.
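
    The exponent-form acceleration can be sketched on a Richardson-Lucy-type multiplicative update: the usual correction factor is raised to an adaptive power estimated from the last two updates. The entropy-prior term and the exact adaptation rule of AAMAPE are not reproduced here, so this is only an illustration of the acceleration idea under those simplifying assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def accelerated_rl(y, psf, n_iter=30):
    """Richardson-Lucy-type multiplicative deblurring with an exponent-form
    acceleration: the correction factor is raised to a power lam >= 1 that is
    adapted from the correlation of successive log-updates (a stand-in for the
    adaptive exponent used by AAMAPE; the entropy prior is omitted)."""
    x = np.full_like(y, y.mean())
    psf_flip = psf[::-1, ::-1]
    prev_update = None
    lam = 1.0
    for _ in range(n_iter):
        blurred = fftconvolve(x, psf, mode="same")
        correction = fftconvolve(y / np.maximum(blurred, 1e-12), psf_flip, mode="same")
        update = np.log(np.maximum(correction, 1e-12))
        if prev_update is not None:
            # Increase the exponent while successive updates stay correlated.
            g = np.sum(update * prev_update) / (
                np.linalg.norm(update) * np.linalg.norm(prev_update) + 1e-12)
            lam = 1.0 + max(g, 0.0)
        x = np.maximum(x * np.exp(lam * update), 0.0)   # x * correction**lam, clamped
        prev_update = update
    return x
```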

  3. Reference Priors For Non-Normal Two-Sample Problems

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo, 1992) is applied to locationscale models with any regular sampling density. A number of two-sample problems is analyzed in this general context, extending the dierence, ratio and product of Normal means problems outside Normality, while explicitly

  4. Reference Priors for the General Location-Scale Model

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo 1992) is applied to multivariate location-scale models with any regular sampling density, where we establish the irrelevance of the usual assumption of Normal sampling if our interest is in either the location or the scale. This result immediately

  5. Bayesian genomic selection: the effect of haplotype lenghts and priors

    DEFF Research Database (Denmark)

    Villumsen, Trine Michelle; Janss, Luc

    2009-01-01

    Breeding values for animals with marker data are estimated using a genomic selection approach where data is analyzed using Bayesian multi-marker association models. Fourteen model scenarios with varying haplotype lengths, hyper parameter and prior distributions were compared to find the scenario ...

  6. Prior-to-Exam: What Activities Enhance Performance?

    Science.gov (United States)

    Rhoads, C. J.; Healy, Therese

    2013-01-01

    Can instructors impact their students' performance by recommending an activity just prior to taking an exam? In this study, college students were randomly assigned to one of three treatment groups (study, exercise, or meditation) or a control group. Each group was given two different types of tests: a traditional concept exam and a non-traditional…

  7. 5 CFR 6401.103 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    .... 6401.103 Section 6401.103 Administrative Personnel ENVIRONMENTAL PROTECTION AGENCY SUPPLEMENTAL STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE ENVIRONMENTAL PROTECTION AGENCY § 6401.103 Prior approval... her Deputy Ethics Official before engaging in outside employment, with or without compensation, that...

  8. Anxiety and blood pressure prior to dental treatment.

    NARCIS (Netherlands)

    Benjamins, C.; Schuurs, A.H.; Asscheman, H.; Hoogstraten, J.

    1990-01-01

    Assessed dental anxiety and blood pressure immediately prior to a dental appointment in 24 patients attending a university dental clinic or a clinic for anxious dental patients in the Netherlands. Blood pressure was assessed by 2 independent methods, and the interchangeability of the blood-pressure

  9. Morbidity prior to a Diagnosis of Sleep-Disordered Breathing

    DEFF Research Database (Denmark)

    Jennum, Poul; Ibsen, Rikke Falkner; Kjellberg, Jakob

    2013-01-01

    Sleep-disordered breathing (SDB) causes burden to the sufferer, the healthcare system, and society. Most studies have focused on cardiovascular diseases (CVDs) after a diagnosis of obstructive sleep apnea (OSA) or obesity hypoventilation syndrome (OHS); however, the overall morbidity prior...

  10. Source-specific Informative Prior for i-Vector Extraction

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2015-01-01

    An i-vector is a low-dimensional fixed-length representation of a variable-length speech utterance, and is defined as the posterior mean of a latent variable conditioned on the observed feature sequence of an utterance. The assumption is that the prior for the latent variable is non...

  11. Risk for malnutrition in patients prior to vascular surgery

    NARCIS (Netherlands)

    Beek, Lies Ter; Banning, Louise B D; Visser, Linda; Roodenburg, Jan L N; Krijnen, Wim P; van der Schans, Cees P; Pol, Robert A; Jager-Wittenaar, Harriët

    2017-01-01

    BACKGROUND: Malnutrition is an important risk factor for adverse post-operative outcomes. The prevalence of risk for malnutrition is unknown in patients prior to vascular surgery. We aimed to assess prevalence and associated factors of risk for malnutrition in this patient group. METHODS: Patients

  12. Imprecision and prior-data conflict in generalized Bayesian inference

    NARCIS (Netherlands)

    Walter, Gero; Augustin, T. (Thomas)

    2009-01-01

    A great advantage of imprecise probability models over models based on precise, traditional probabilities is the potential to reflect the amount of knowledge they stand for. Consequently, imprecise probability models promise to offer a vivid tool for handling situations of prior-data conflict in

  13. 5 CFR 7901.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... ETHICAL CONDUCT FOR EMPLOYEES OF THE TENNESSEE VALLEY AUTHORITY § 7901.102 Prior approval for outside... or designee. The written request shall be submitted through the employee's supervisor or human resource office and shall, at a minimum, identify the employer or other person for whom the services are to...

  14. Simultaneous tomographic reconstruction and segmentation with class priors

    DEFF Research Database (Denmark)

    Romanov, Mikhail; Dahl, Anders Bjorholm; Dong, Yiqiu

    2015-01-01

    are combined to produce a reconstruction that is identical to the segmentation. We consider instead a hybrid approach that simultaneously produces both a reconstructed image and segmentation. We incorporate priors about the desired classes of the segmentation through a Hidden Markov Measure Field Model, and we...

  15. 47 CFR 25.118 - Modifications not requiring prior authorization.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Modifications not requiring prior authorization. 25.118 Section 25.118 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES SATELLITE COMMUNICATIONS Applications and Licenses General Application Filing Requirements § 25...

  16. The epizootiology of the highly pathogenic avian influenza prior to ...

    African Journals Online (AJOL)

    The epizootiology of the highly pathogenic avian influenza prior to the anticipated pandemic of the early twenty first century. ... Transmission of highly pathogenic H5N1 from domestic fowls back to migratory waterfowl in western China has increased the geographic spread. This has grave consequences for the poultry ...

  17. 18 CFR 415.51 - Prior non-conforming structures.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Prior non-conforming structures. 415.51 Section 415.51 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION... damaged by any means, including a flood, to the extent of 50 percent or more of its market value at that...

  18. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, Mathijs; Frehen, Rik; Schotman, Peter; Bauer, Rob

    We propose a hybrid approach for estimating beta that shrinks rolling window estimates towards firm-specific priors motivated by economic theory. Our method yields superior forecasts of beta that have important practical implications. First, hybrid betas carry a significant price of risk in the

  19. Estimating Security Betas Using Prior Information Based on Firm Fundamentals

    NARCIS (Netherlands)

    Cosemans, Mathijs; Frehen, Rik; Schotman, Peter; Bauer, Rob

    2016-01-01

    We propose a hybrid approach for estimating beta that shrinks rolling window estimates toward firm-specific priors motivated by economic theory. Our method yields superior forecasts of beta that have important practical implications. First, unlike standard rolling window betas, hybrid betas carry a
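
    The shrinkage idea behind such hybrid estimators can be written as a precision-weighted average of the rolling-window OLS beta and a fundamentals-based prior. The weighting below is a generic illustration, not the authors' estimator, and the prior mean and variance are simply taken as given inputs.

```python
import numpy as np

def hybrid_beta(returns_i, returns_m, beta_prior, prior_var):
    """Shrink an OLS rolling-window beta towards a firm-specific prior,
    weighting each by its precision (1/variance). The prior mean and variance
    would come from firm fundamentals; here they are assumed inputs."""
    x = np.asarray(returns_m)
    y = np.asarray(returns_i)
    x_c = x - x.mean()
    beta_ols = np.dot(x_c, y - y.mean()) / np.dot(x_c, x_c)
    resid = (y - y.mean()) - beta_ols * x_c
    var_ols = resid.var(ddof=2) / np.dot(x_c, x_c)   # sampling variance of the OLS beta
    w = (1.0 / var_ols) / (1.0 / var_ols + 1.0 / prior_var)
    return w * beta_ols + (1.0 - w) * beta_prior
```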

  20. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu Lei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of the large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). The modality's specific contribution is analyzed from the perspective of source reconstruction. For spatial priors, such as EEG-correlated fMRI, temporally coherent networks and resting-state fMRI are systematically introduced in the ESI. Moreover, the fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as the potential priors, which can help to draw inferences about the neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  1. Nonextensive Entropy, Prior PDFs and Spontaneous Symmetry Breaking

    OpenAIRE

    Shafee, Fariel

    2008-01-01

    We show that using nonextensive entropy can lead to spontaneous symmetry breaking when a parameter changes its value from that applicable for a symmetric domain, as in field theory. We give the physical reasons and also show that even for symmetric Dirichlet priors, such a definition of the entropy and the parameter value can lead to asymmetry when entropy is maximized.

  2. Bayesian nonparametric system reliability using sets of priors

    NARCIS (Netherlands)

    Walter, G.M.; Aslett, L.J.M.; Coolen, F.P.A.

    2016-01-01

    An imprecise Bayesian nonparametric approach to system reliability with multiple types of components is developed. This allows modelling partial or imperfect prior knowledge on component failure distributions in a flexible way through bounds on the functioning probability. Given component level test

  3. Effects of prior interpretation on situation assessment in crime analysis

    NARCIS (Netherlands)

    Kerstholt, J.H.; Eikelboom, A.R.

    2007-01-01

    Purpose - To investigate the effects of prior case knowledge on the judgement of crime analysts. Design/methodology/approach - Explains that crime analysts assist when an investigation team has converged/agreed on a probable scenario, attributes this convergence to group-think, but points out this

  4. Prior experience, cognitive perceptions and psychological skills of ...

    African Journals Online (AJOL)

    The objective of this study was to investigate the interaction between the prior experience, cognitive perceptions and psychological skills of senior rugby players in South Africa. The study population included 139 trans-national players, 106 provincial players and 95 club rugby players (N=340). A cross-sectional design was ...

  5. Mountain bike racing - the influence of prior glycogen-inducing ...

    African Journals Online (AJOL)

    Objective. To investigate the effect of pre-exercise glutamine supplementation and the influence of a prior acute bout of glycogen-reducing exercise on the general stress and immune response to acute high-intensity cycling. Design. Randomised, double-blind, cross-over supplementation study. Setting and intervention.

  6. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Directory of Open Access Journals (Sweden)

    Andrea Ohst

    2014-07-01

    Full Text Available Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) ‘incompatible’ with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces.

  7. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Science.gov (United States)

    Ohst, Andrea; Fondu, Béatrice M. E.; Glogger, Inga; Nückles, Matthias; Renkl, Alexander

    2014-01-01

    Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) “incompatible” with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces. PMID:25071638

  8. 5 CFR 6701.106 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE GENERAL SERVICES ADMINISTRATION § 6701.106 Prior approval... to be performed; (4) The name and address of the prospective outside employer for which work will be... affects the outside employer and will disqualify himself from future participation in matters that could...

  9. 5 CFR 7401.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE MERIT SYSTEMS PROTECTION BOARD § 7401.102 Prior approval... written approval from the employee's supervisor and the concurrence of the Designated Agency Ethics... name of the employer or organization; (ii) The nature of the legal activity or other work to be...

  10. 5 CFR 7101.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE NATIONAL LABOR RELATIONS BOARD § 7101.102 Prior approval... forth, at a minimum: (i) The name of the employer; (ii) The nature of the legal activity or other work... designee may consult with the Designated Agency Ethics Official to ensure that the request for outside...

  11. Effects of regularisation priors on dynamic PET Data

    International Nuclear Information System (INIS)

    Caldeira, Liliana; Scheins, Juergen; Silva, Nuno da; Gaens, Michaela; Shah, N Jon

    2014-01-01

    Dynamic PET provides temporal information about tracer uptake. However, each PET frame usually has low statistics, resulting in noisy images. The goal is to study the effects of prior regularisation on dynamic PET data. Quantification and noise in the image domain and time domain, as well as the impact on parametric images, are assessed.

  12. Exploiting prior knowledge of English, Mathematics and Chemistry ...

    African Journals Online (AJOL)

    This paper explores prior knowledge with the view to enhancing the study of French. Juxtaposing sentences in French and English to underscore syntactic differences and similarities, the paper attributes numerical values to nouns and adjectives in French in order to demonstrate the mathematical imbalance and lack of ...

  13. Using Students' Prior Knowledge to Teach Social Penetration Theory

    Science.gov (United States)

    Chornet-Roses, Daniel

    2010-01-01

    Bransford, Brown, and Cocking argue that acknowledging students' prior ideas and beliefs about a subject and incorporating them into the classroom enhances student learning. This article presents an activity which serves to hone three student learning outcomes: analysis of communication, inductive reasoning, and self-reflection. The goal of this…

  14. 40 CFR 266.101 - Management prior to burning.

    Science.gov (United States)

    2010-07-01

    ... storage units that store mixtures of hazardous waste and the primary fuel to the boiler or industrial... MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.101 Management prior to burning. (a) Generators. Generators of hazardous waste that is burned in a boiler or industrial furnace...

  15. The Cost of Prior Restraint: "U. S. v. The Progressive."

    Science.gov (United States)

    Soloski, John; Dyer, Carolyn Stewart

    Increased litigation and rising litigation costs threaten the future of newspapers and magazines. A case study was conducted to determine the costs and effects of "United States v. 'The Progressive,'" a prior restraint case over the publication in 1979 of an article on the hydrogen bomb. "The Progressive," which operates at a…

  16. Class II correction prior to orthodontics with the carriere distalizer.

    Science.gov (United States)

    McFarlane, Bruce

    2013-01-01

    Class II correction is a challenge in orthodontics with many existing devices being complex, too compliance-driven, or too prone to breakage. The Carriere Distalizer allows for straightforward Class II correction prior to orthodontics (fixed or clear aligners) at a time when no other mechanics interfere, and compliance is at its best.

  17. Prior Exposure and Educational Environment towards Entrepreneurial Intention

    Directory of Open Access Journals (Sweden)

    Karla Soria-Barreto

    2017-07-01

    Full Text Available This research is based on the responses to a questionnaire applied to 351 students of business management in Chile and Colombia. Through the analysis of structural equations on Ajzen’s model, we found that entrepreneurial education, the University environment, and the prior entrepreneurial exposure are mediated by the factors of Ajzen’s model to generate entrepreneurial intention in higher education students. The results show that entrepreneurial education strengthens the perceived control of behavior and, with it, albeit in a differentiated way, the entrepreneurial intention of men and women. University environment affects entrepreneurial intention through attitude towards entrepreneurship; and finally, the work experience, used as one of the variables that measure prior entrepreneurial exposure, explains the entrepreneurial intention inversely through the subjective norms. We found that gender has a moderate effect on perceived control of behavior and entrepreneurial education. The scarce studies on the impact of the University environment and the mixed results of the entrepreneurial education and prior entrepreneurial exposure toward entrepreneurial intention show the necessity for further research. A second contribution is the opportunity to present new evidence about the relationship between University environment, entrepreneurial education and prior exposure in developing countries of South America, including the gender effect (moderator) for entrepreneurial intention. It is important to note that most of the research in this area applies to developed countries, and some scholars suggest that extrapolating the results is not appropriate.

  18. Prior implicit knowledge shapes human threshold for orientation noise

    DEFF Research Database (Denmark)

    Christensen, Jeppe H; Bex, Peter J; Fiser, József

    2015-01-01

    , resulting in an image-class-specific threshold that changes the shape and position of the dipper function according to image class. These findings do not fit a filter-based feed-forward view of orientation coding, but can be explained by a process that utilizes an experience-based perceptual prior...

  19. Clinical utility of carotid duplex ultrasound prior to cardiac surgery.

    Science.gov (United States)

    Lin, Judith C; Kabbani, Loay S; Peterson, Edward L; Masabni, Khalil; Morgan, Jeffrey A; Brooks, Sara; Wertella, Kathleen P; Paone, Gaetano

    2016-03-01

    Clinical utility and cost-effectiveness of carotid duplex examination prior to cardiac surgery have been questioned by the multidisciplinary committee creating the 2012 Appropriate Use Criteria for Peripheral Vascular Laboratory Testing. We report the clinical outcomes and postoperative neurologic symptoms in patients who underwent carotid duplex ultrasound prior to open heart surgery at a tertiary institution. Using the combined databases from our clinical vascular laboratory and the Society of Thoracic Surgery, a retrospective analysis of all patients who underwent carotid duplex ultrasound within 13 months prior to open heart surgery from March 2005 to March 2013 was performed. The outcomes between those who underwent carotid duplex scanning (group A) and those who did not (group B) were compared. Among 3233 patients in the cohort who underwent cardiac surgery, 515 (15.9%) patients underwent a carotid duplex ultrasound preoperatively, and 2718 patients did not (84.1%). Among the patients who underwent carotid screening vs no screening, there was no statistically significant difference in the risk factors of cerebrovascular disease (10.9% vs 12.7%; P = .26), prior stroke (8.2% vs 7.2%; P = .41), and prior transient ischemic attack (2.9% vs 3.3%; P = .24). For those undergoing isolated coronary artery bypass grafting (CABG), 306 (17.8%) of 1723 patients underwent preoperative carotid duplex ultrasound. Among patients who had carotid screening prior to CABG, the incidence of carotid disease was low: 249 (81.4%) had minimal or mild stenosis (duplex scanning and those who did not. Primary outcomes of patients who underwent open heart surgery also showed no difference in the perioperative mortality (5.1% vs 6.9%; P = .14) and stroke (2.6% vs 2.4%; P = .85) between patients undergoing preoperative duplex scanning and those who did not. Operative intervention of severe carotid stenosis prior to isolated CABG occurred in 2 of the 17 patients (11.8%) identified who

  20. Current irritability robustly related to current and prior anxiety in bipolar disorder.

    Science.gov (United States)

    Yuen, Laura D; Miller, Shefali; Wang, Po W; Hooshmand, Farnaz; Holtzman, Jessica N; Goffin, Kathryn C; Shah, Saloni; Ketter, Terence A

    2016-08-01

    Although current irritability and current/prior anxiety have been associated in unipolar depression, these relationships are less well understood in bipolar disorder (BD). We investigated relationships between current irritability and current/prior anxiety as well as other current emotions and BD illness characteristics. Outpatients referred to the Stanford Bipolar Disorders Clinic during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD (STEP-BD) Affective Disorders Evaluation. Prevalence and clinical correlates of current irritability and current/prior anxiety and other illness characteristics were examined. Among 497 BD outpatients (239 Type I, 258 Type II; 58.1% female; mean ± SD age 35.6 ± 13.1 years), 301 (60.6%) had baseline current irritability. Patients with versus without current irritability had significantly higher rates of current anxiety (77.1% versus 42.9%, p anxiety disorder (73.1% versus 52.6%, p anxiety than to current anhedonia, sadness, or euphoria (all p anxiety associations persisted across current predominant mood states. Current irritability was more robustly related to past anxiety than to all other assessed illness characteristics, including 1° family history of mood disorder, history of alcohol/substance use disorder, bipolar subtype, and current syndromal/subsyndromal depression (all p anxiety. Further studies are warranted to assess longitudinal clinical implications of relationships between irritability and anxiety in BD. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Vesicles between plasma membrane and cell wall prior to visible senescence of Iris and Dendrobium flowers.

    Science.gov (United States)

    Kamdee, Channatika; Kirasak, Kanjana; Ketsa, Saichol; van Doorn, Wouter G

    2015-09-01

    Cut Iris flowers (Iris x hollandica, cv. Blue Magic) show visible senescence about two days after full opening. Epidermal cells of the outer tepals collapse due to programmed cell death (PCD). Transmission electron microscopy (TEM) showed irregular swelling of the cell walls, starting prior to cell collapse. Compared to cells in flowers that had just opened, wall thickness increased up to tenfold prior to cell death. Fibrils were visible in the swollen walls. After cell death very little of the cell wall remained. Prior to and during visible wall swelling, vesicles (paramural bodies) were observed between the plasma membrane and the cell walls. The vesicles were also found in groups and were accompanied by amorphous substance. They usually showed a single membrane, and had a variety of diameters and electron densities. Cut Dendrobium hybrid cv. Lucky Duan flowers exhibited visible senescence about 14 days after full flower opening. Paramural bodies were also found in Dendrobium tepal epidermis and mesophyll cells, related to wall swelling and degradation. Although alternative explanations remain possible, it is hypothesized that paramural bodies carry enzymes involved in cell wall breakdown. The literature has not yet reported such bodies in association with senescence/PCD. Copyright © 2015 Elsevier GmbH. All rights reserved.

  2. Oropharyngeal colonization by Haemophilus influenzae in healthy children from Taubaté (São Paulo), prior to the Haemophilus influenzae type b vaccination program in Brazil

    Directory of Open Access Journals (Sweden)

    Lucia Ferro Bricks

    2004-01-01

    Full Text Available Haemophilus influenzae is one of the most important bacterial agents of otitis and sinusitis. H. influenzae type b (Hib) is one of the main causes of meningitis, pneumonia, and septicemia in nonvaccinated children under 6 years of age. The aims of this study were to determine the prevalence of H. influenzae and Hib oropharyngeal colonization prior to the onset of the Hib vaccination program in Brazil in previously healthy children and to assess the susceptibility profile of this microorganism to a selected group of antimicrobials that are used to treat acute respiratory infections. METHOD: Cultures of Haemophilus influenzae were made from oropharynx swabs from 987 children under 6 years of age who were enrolled in 29 day-care centers in Taubaté (a city of São Paulo state, Brazil) between July and December 1998. RESULTS: The prevalence of H. influenzae carriers was 17.4%, and only 5.5% of the strains were beta-lactamase producers. The prevalence of Hib carriers was high, 7.3% on average (range, 0.0–33.3%). CONCLUSIONS: The low prevalence of colonization by penicillin-resistant strains indicates that it is not necessary to replace ampicillin or amoxicillin to effectively treat otitis and sinusitis caused by H. influenzae in Taubaté.

  3. Satellite Infrared Radiation Measurements Prior to the Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulintes, S.; Bryant, N.; Taylor, Patrick; Freund, F.

    2005-01-01

    This work describes our search for a relationship between tectonic stresses and increases in mid-infrared (IR) flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of Jan 22, 2003 Colima (M6.7) Mexico, Sept. 28, 2004 near Parkfield (M6.0) in California, and Northern Sumatra (M8.5) Dec. 26, 2004. Previous analysis of earthquake events has indicated the presence of an IR anomaly, where temperatures increased or did not return to their usual nighttime values. Our procedures analyze nighttime satellite data that records the general condition of the ground after sunset. We have found from the MODIS instrument data that five days before the Colima earthquake the IR land surface nighttime temperature rose up to +4 degrees C in a 100 km radius around the epicenter. The IR transient field recorded by MODIS in the vicinity of Parkfield, also with a cloud free environment, was around +1 degree C and is significantly smaller than the IR anomaly around the Colima epicenter. Ground surface temperatures near the Parkfield epicenter four days prior to the earthquake showed a steady increase. However, on the night preceding the quake, a significant drop in relative humidity was indicated, a process similar to those registered prior to the Colima event. Recent analyses of continuous ongoing long-wavelength Earth radiation (OLR) indicate significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of a triggering by an interaction between the lithosphere-hydrosphere and atmosphere related to changes in the near-surface electrical field and/or gas composition prior to the earthquake. The OLR anomaly usually covers large areas surrounding the main epicenter. We have found strong anomalous signals (two sigma) along the epicentral area on Dec 21

  4. Prior image constrained image reconstruction in emerging computed tomography applications

    Science.gov (United States)

    Brunner, Stephen T.

    Advances have been made in computed tomography (CT), especially in the past five years, by incorporating prior images into the image reconstruction process. In this dissertation, we investigate prior image constrained image reconstruction in three emerging CT applications: dual-energy CT, multi-energy photon-counting CT, and cone-beam CT in image-guided radiation therapy. First, we investigate the application of Prior Image Constrained Compressed Sensing (PICCS) in dual-energy CT, which has been called "one of the hottest research areas in CT." Phantom and animal studies are conducted using a state-of-the-art 64-slice GE Discovery 750 HD CT scanner to investigate the extent to which PICCS can enable radiation dose reduction in material density and virtual monochromatic imaging. Second, we extend the application of PICCS from dual-energy CT to multi-energy photon-counting CT, which has been called "one of the 12 topics in CT to be critical in the next decade." Numerical simulations are conducted to generate multiple energy bin images for a photon-counting CT acquisition and to investigate the extent to which PICCS can enable radiation dose efficiency improvement. Third, we investigate the performance of a newly proposed prior image constrained scatter correction technique to correct scatter-induced shading artifacts in cone-beam CT, which, when used in image-guided radiation therapy procedures, can assist in patient localization, and potentially, dose verification and adaptive radiation therapy. Phantom studies are conducted using a Varian 2100 EX system with an on-board imager to investigate the extent to which the prior image constrained scatter correction technique can mitigate scatter-induced shading artifacts in cone-beam CT. Results show that these prior image constrained image reconstruction techniques can reduce radiation dose in dual-energy CT by 50% in phantom and animal studies in material density and virtual monochromatic imaging, can lead to radiation
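    For orientation, the PICCS objective as it is commonly quoted in the CT literature balances sparsity of the difference from a prior image against sparsity of the image itself; the transforms and constraints actually used in the work above may differ.

    ```latex
    % x : image to reconstruct, x_P : prior image, A : system matrix, y : projection data
    % \Psi_1, \Psi_2 : sparsifying transforms (typically spatial gradients / total variation)
    \min_{x} \; \alpha \,\bigl\lVert \Psi_1 (x - x_P) \bigr\rVert_1
            + (1-\alpha)\,\bigl\lVert \Psi_2\, x \bigr\rVert_1
    \quad \text{subject to} \quad A x = y , \qquad 0 \le \alpha \le 1 .
    ```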

  5. Learning Science through Writing: Associations with Prior Conceptions of Writing and Perceptions of a Writing Program

    Science.gov (United States)

    Ellis, Robert A.; Taylor, Charlotte E.; Drury, Helen

    2007-01-01

    Students in a large undergraduate biology course were expected to write a scientific report as a key part of their course design. This study investigates the quality of learning arising from the writing experience and how it relates to the quality of students' preconceptions of learning through writing and their perceptions of their writing…

  6. 77 FR 71687 - Federal Employees' Group Life Insurance Program: Court Orders Prior to July 22, 1998

    Science.gov (United States)

    2012-12-04

    ... regulations regarding the effect of any court decree of divorce, annulment, or legal separation, or any court-approved property settlement agreement incident to any court decree of divorce, annulment, or legal... court decision and adds little substantive interpretation of the law. For the foregoing reasons, OPM...

  7. Five Apollo astronauts with Lunar Module at ASVC prior to grand opening

    Science.gov (United States)

    1997-01-01

    Some of the former Apollo program astronauts observe a Lunar Module and Moon mockup during a tour of the new Apollo/Saturn V Center (ASVC) at KSC prior to the gala grand opening ceremony for the facility that was held Jan. 8, 1997. The astronauts were invited to participate in the event, which also featured NASA Administrator Dan Goldin and KSC Director Jay Honeycutt. Some of the visiting astronauts were (from left): Apollo 10 Lunar Module Pilot and Apollo 17 Commander Eugene A. Cernan; Apollo 9 Lunar Module Pilot Russell L. Schweikart; Apollo 10 Command Module Pilot and Apollo 16 Commander John W. Young; Apollo 10 Commander Thomas P. Stafford; and Apollo 11 Lunar Module Pilot Edwin E. 'Buzz' Aldrin, Jr. The ASVC also features several other Apollo program spacecraft components, multimedia presentations and a simulated Apollo/Saturn V liftoff. The facility will be a part of the KSC bus tour that embarks from the KSC Visitor Center.

  8. Post-prior equivalence for transfer reactions with complex potentials

    Science.gov (United States)

    Lei, Jin; Moro, Antonio M.

    2018-01-01

    In this paper, we address the problem of the post-prior equivalence in the calculation of inclusive breakup and transfer cross sections. For that, we employ the model proposed by Ichimura et al. [Phys. Rev. C 32, 431 (1985), 10.1103/PhysRevC.32.431], conveniently generalized to include the part of the cross section corresponding to the transfer to bound states. We pay particular attention to the case in which the unobserved particle is left in a bound state of the residual nucleus, in which case the theory prescribes the use of a complex potential, responsible for the spreading width of the populated single-particle states. We see that the introduction of this complex potential gives rise to an additional term in the prior cross-section formula, not present in the usual case of real binding potentials. The equivalence is numerically tested for the 58Ni(d, pX) reaction.

  9. Bayesian Image Segmentations by Potts Prior and Loopy Belief Propagation

    Science.gov (United States)

    Tanaka, Kazuyuki; Kataoka, Shun; Yasuda, Muneki; Waizumi, Yuji; Hsu, Chiou-Ting

    2014-12-01

    This paper presents a Bayesian image segmentation model based on a Potts prior and loopy belief propagation. The proposed Bayesian model involves several terms, including the pairwise interactions of Potts models, and the mean vectors and covariance matrices of Gaussian distributions in color image modeling. These terms are often referred to as hyperparameters in statistical machine learning theory. In order to determine these hyperparameters, we propose a new scheme for hyperparameter estimation based on conditional maximization of entropy in the Potts prior. The algorithm is given based on loopy belief propagation. In addition, we compare our conditional maximum entropy framework with the conventional maximum likelihood framework, and also clarify how the first-order phase transitions in loopy belief propagation for Potts models influence our hyperparameter estimation procedures.
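    A generic form of this model class (the paper's exact parameterization may differ): a Potts prior over the label field combined with per-pixel Gaussian likelihoods for the observed colour vectors,

    ```latex
    % x = (x_i) : label field, (i,j) : neighbouring pixel pairs, y_i : observed colour vector
    P(x) \;\propto\; \exp\!\Bigl( \beta \sum_{(i,j)} \delta(x_i, x_j) \Bigr),
    \qquad
    P(x \mid y) \;\propto\; P(x)\, \prod_i \mathcal{N}\!\bigl(y_i \mid \mu_{x_i}, \Sigma_{x_i}\bigr),
    ```

    where the coupling β and the class means μ_k and covariances Σ_k are the hyperparameters that the proposed conditional maximum entropy scheme estimates.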

  10. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...

  11. The Effect of Prior Knowledge and Gender on Physics Achievement

    Science.gov (United States)

    Stewart, John; Henderson, Rachel

    2017-01-01

    Gender differences on the Conceptual Survey in Electricity and Magnetism (CSEM) have been extensively studied. Ten semesters (N=1621) of CSEM data is presented showing male students outperform female students on the CSEM posttest by 5 % (p qualitative in-semester test questions by 3 % (p = . 004), but no significant difference between male and female students was found on quantitative test questions. Male students enter the class with superior prior preparation in the subject and score 4 % higher on the CSEM pretest (p questions correctly (N=822), male and female differences on the CSEM and qualitative test questions cease to be significant. This suggests no intrinsic gender bias exists in the CSEM itself and that gender differences are the result of prior preparation measured by CSEM pretest score. Gender differences between male and female students increase with pretest score. Regression analyses are presented to further explore interactions between preparation, gender, and achievement.

  12. Shocking Path of Least Resistance Shines Light on Subsurface by Revealing the Paths of Water and the Presence of Faults: Stacked EM Case Studies over Barite Hills Superfund Site in South Carolina

    Science.gov (United States)

    Haggar, K. S.; Nelson, H. R., Jr.; Berent, L. J.

    2017-12-01

    The Barite Hills/Nevada Gold Fields mines are in Late Proterozoic and early Paleozoic rocks of the gold- and iron-sulfide-rich Carolina slate belt. The mines were active from 1989 to 1995. EPA and USGS site investigations in 2003 resulted in the declaration of the waste pit areas as a Superfund site. The USGS and private consulting firms have evaluated subsurface water flow paths, faults & other groundwater-related features at this Superfund site utilizing 2-D conductivity & 3-D electromagnetic (EM) surveys. The USGS employed conductivity to generate instantaneous 2-D profiles to evaluate shallow groundwater patterns. Porous regolith sediments, contaminated water & mine debris have high conductivity whereas bedrock is identified by its characteristic low conductivity readings. Consulting contractors integrated EM technology, magnetic & shallow well data to generate 3-D images of groundwater flow paths at given depths across the Superfund site. In so doing, several previously undetected faults were identified. Lightning strike data was integrated with the previously evaluated electrical and EM data to determine whether this form of natural-sourced EM data could complement and supplement the more traditional geophysical data described above. Several lightning attributes derived from 3-D lightning volumes were found to correlate to various features identified in the previous geophysical studies. Specifically, the attributes Apparent Resistivity, Apparent Permittivity, Peak Current & Tidal Gravity provided the deepest structural geological framework & provided insights into rock properties & earth tides. Most significantly, Peak Current showed remarkable coincidence with the preferred groundwater flow map identified by one of the contractors utilizing EM technology. This study demonstrates the utility of robust integrated EM technology applications for projects focused on hydrology, geohazards to dams, levees, and structures, as well as mineral and hydrocarbon exploration.

  13. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one which has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most of the cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from the simple information of the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
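    One commonly used form of such a model-relative entropy (the exact functional adopted by the authors may differ) is

    ```latex
    % \rho : density to reconstruct, m : non-uniform prior model
    S[\rho, m] \;=\; \int \Bigl[\, \rho(\vec r) - m(\vec r)
          - \rho(\vec r)\,\ln\!\frac{\rho(\vec r)}{m(\vec r)} \,\Bigr]\, d^3 r ,
    ```

    which is maximal, with S = 0, exactly when ρ(r) = m(r), so that any departure from the model in the final map must be paid for by the data.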

  14. Incorporating outcome uncertainty and prior outcome beliefs in stated preferences

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Jacobsen, Jette Bredahl; Hanley, Nick

    2015-01-01

    Stated preference studies tell respondents that policies create environmental changes with varying levels of uncertainty. However, respondents may include their own a priori assessments of uncertainty when making choices among policy options. Using a choice experiment eliciting respondents’ preferences for conservation policies under climate change, we find that higher outcome uncertainty reduces utility. When accounting for endogeneity, we find that prior beliefs play a significant role in this cost of uncertainty. Thus, merely stating “objective” levels of outcome uncertainty...

  15. Morbidity in early Parkinson's disease and prior to diagnosis

    DEFF Research Database (Denmark)

    Frandsen, Rune; Kjellberg, Jakob; Ibsen, Rikke

    2014-01-01

    BACKGROUND: Nonmotor symptoms are probably present prior to, early on, and following a diagnosis of Parkinson's disease. Nonmotor symptoms may hold important information about the progression of Parkinson's disease. OBJECTIVE: To evaluate the total early and prediagnostic morbidities in the 3..., poisoning and certain other external causes, and other factors influencing health status and contact with health services. It was negatively associated with neoplasm, cardiovascular, and respiratory diseases. CONCLUSIONS: Patients with a diagnosis of Parkinson's disease present significant differences...

  16. Anticipatory parental care: acquiring resources for offspring prior to conception.

    OpenAIRE

    Boutin, S; Larsen, K W; Berteaux, D

    2000-01-01

    Many organisms acquire and defend resources outside the breeding season and this is thought to be for immediate survival and reproductive benefits. Female red squirrels (Tamiasciurus hudsonicus) acquire traditional food cache sites up to four months prior to the presence of any physiological or behavioural cues associated with mating or offspring dependency. They subsequently relinquish these resources to one of their offspring at independence (ten months later). We experimentally show that a...

  17. Rapid sampling of molecular motions with prior information constraints.

    Science.gov (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
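    A bare-bones RRT skeleton with a pluggable validity check, which is where prior-information constraints of the kind described above would enter, might look like the sketch below; this is not PathRover or Rosetta code, and every function name is an illustrative assumption.

    ```python
    import numpy as np

    def rrt_with_constraints(start, sample_fn, steer_fn, is_valid, goal_test,
                             max_iter=5000, step=0.1):
        """Generic RRT: grow a tree of valid conformations until one satisfies goal_test.

        sample_fn()       -> random configuration (numpy array)
        steer_fn(a, b, s) -> point at most distance s from a, in the direction of b
        is_valid(x)       -> True if x is low-energy/clash-free and meets all prior constraints
        goal_test(x)      -> True if x satisfies the target predicate
        """
        nodes, parents = [np.asarray(start, dtype=float)], {0: None}
        for _ in range(max_iter):
            target = sample_fn()
            # nearest existing node to the random target
            i_near = min(range(len(nodes)),
                         key=lambda i: np.linalg.norm(nodes[i] - target))
            x_new = steer_fn(nodes[i_near], target, step)
            if not is_valid(x_new):
                continue  # reject clashes / constraint violations
            nodes.append(np.asarray(x_new, dtype=float))
            parents[len(nodes) - 1] = i_near
            if goal_test(x_new):
                # walk parent pointers back to the root to recover the pathway
                path, i = [], len(nodes) - 1
                while i is not None:
                    path.append(nodes[i])
                    i = parents[i]
                return path[::-1]
        return None
    ```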

  18. Real earnings management activities prior to bond issuance

    Directory of Open Access Journals (Sweden)

    Cristhian Mellado-Cid

    2017-07-01

    Full Text Available We examine real activities manipulation by firms prior to their debt issuances and how such manipulation activities affect bond yield spreads. We find that bond-issuing firms increase their real activities manipulation in the five quarters leading to a bond issuance. We document an inverse association between yield spread and pre-issue real activities manipulation, i.e., firms engaged in abnormally high levels of real activities manipulation are associated with subsequent lower cost of debt.

  19. Astronauts Parise and Jernigan check helmets prior to training session

    Science.gov (United States)

    1994-01-01

    Attired in training versions of the Shuttle partial-pressure launch and entry suits, payload specialist Dr. Ronald A. Parise (left) and astronaut Tamara E. Jernigan, payload commander, check over their helmets prior to a training session. Holding the helmets is suit expert Alan M. Rochford of NASA. The two were about to join their crew mates in a session of emergency bailout training at JSC's Weightless Environment Training Facility (WETF).

  20. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

    Full Text Available Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  1. Hysteresis as an Implicit Prior in Tactile Spatial Decision Making

    Science.gov (United States)

    Thiel, Sabrina D.; Bitzer, Sebastian; Nierhaus, Till; Kalberlah, Christian; Preusser, Sven; Neumann, Jane; Nikulin, Vadim V.; van der Meer, Elke; Villringer, Arno; Pleger, Burkhard

    2014-01-01

    Perceptual decisions not only depend on the incoming information from sensory systems but constitute a combination of current sensory evidence and internally accumulated information from past encounters. Although recent evidence emphasizes the fundamental role of prior knowledge for perceptual decision making, only a few studies have quantified the relevance of such priors on perceptual decisions and examined their interplay with other decision-relevant factors, such as the stimulus properties. In the present study we asked whether hysteresis, describing the stability of a percept despite a change in stimulus property and known to occur at perceptual thresholds, also acts as a form of implicit prior in tactile spatial decision making, supporting the stability of a decision across successively presented random stimuli (i.e., decision hysteresis). We applied a variant of the classical 2-point discrimination task and found that hysteresis influenced perceptual decision making: Participants were more likely to decide ‘same’ rather than ‘different’ on successively presented pin distances. In a direct comparison between the influence of applied pin distances (explicit stimulus property) and hysteresis, we found that on average, stimulus property explained significantly more variance of participants’ decisions than hysteresis. However, when focusing on pin distances at threshold, we found a trend for hysteresis to explain more variance. Furthermore, the less variance was explained by the pin distance on a given decision, the more variance was explained by hysteresis, and vice versa. Our findings suggest that hysteresis acts as an implicit prior in tactile spatial decision making that becomes increasingly important when explicit stimulus properties provide decreasing evidence. PMID:24587045

  2. Analysis of Extracting Prior BRDF from MODIS BRDF Data

    OpenAIRE

    Hu Zhang; Ziti Jiao; Yadong Dong; Peng Du; Yang Li; Yi Lian; Tiejun Cui

    2016-01-01

    Many previous studies have attempted to extract prior reflectance anisotropy knowledge from the historical MODIS Bidirectional Reflectance Distribution Function (BRDF) product based on land cover or Normalized Difference Vegetation Index (NDVI) data. In this study, the feasibility of the method is discussed based on MODIS data and archetypal BRDFs. The BRDF is simplified into six archetypal BRDFs that represent different reflectance anisotropies. Five-year time series of MODIS BRDF data over ...

  3. 24 CFR 570.426 - Program income.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 3 2010-04-01 2010-04-01 false Program income. 570.426 Section 570... in Hawaii and Insular Areas Programs § 570.426 Program income. (a) The provisions of § 570.504(b) apply to all program income generated by a specific grant and received prior to grant closeout. (b) If...

  4. A Single Image Dehazing Method Using Average Saturation Prior

    Directory of Open Access Journals (Sweden)

    Zhenfei Gu

    2017-01-01

    Full Text Available Outdoor images captured in bad weather are prone to yield poor visibility, which is a fatal problem for most computer vision applications. The majority of existing dehazing methods rely on an atmospheric scattering model and therefore share a common limitation; that is, the model is only valid when the atmosphere is homogeneous. In this paper, we propose an improved atmospheric scattering model to overcome this inherent limitation. By adopting the proposed model, a corresponding dehazing method is also presented. In this method, we first create a haze density distribution map of a hazy image, which enables us to segment the hazy image into scenes according to the haze density similarity. Then, in order to improve the atmospheric light estimation accuracy, we define an effective weight assignment function to locate a candidate scene based on the scene segmentation results and therefore avoid most potential errors. Next, we propose a simple but powerful prior named the average saturation prior (ASP), which is a statistic of extensive high-definition outdoor images. Using this prior combined with the improved atmospheric scattering model, we can directly estimate the scene atmospheric scattering coefficient and restore the scene albedo. The experimental results verify that our model is physically valid, and the proposed method outperforms several state-of-the-art single image dehazing methods in terms of both robustness and effectiveness.
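    For context, the classical homogeneous-atmosphere scattering model that the record says existing methods rely on, and which the proposed model generalizes, is usually written as

    ```latex
    % I : observed hazy image, J : scene radiance, A : global atmospheric light,
    % t(x) : transmission, \beta : scattering coefficient, d(x) : scene depth
    I(x) \;=\; J(x)\, t(x) \;+\; A\,\bigl(1 - t(x)\bigr),
    \qquad t(x) \;=\; e^{-\beta\, d(x)} .
    ```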

  5. The search for Infrared radiation prior to major earthquakes

    Science.gov (United States)

    Ouzounov, D.; Taylor, P.; Pulinets, S.

    2004-12-01

    This work describes our search for a relationship between tectonic stresses and electro-chemical and thermodynamic processes in the Earth and increases in mid-IR flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. Recent analysis of continuous ongoing long-wavelength Earth radiation (OLR) indicates significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of a triggering by an interaction between the lithosphere-hydrosphere and atmosphere related to changes in the near-surface electrical field and gas composition prior to the earthquake. The OLR anomaly covers large areas surrounding the main epicenter. We have used the NOAA IR data to differentiate between the global and seasonal variability and these transient local anomalies. Indeed, on the basis of a temporal and spatial distribution analysis, an anomaly pattern is found to occur several days prior to some major earthquakes. The significance of these observations was explored using data sets of some recent worldwide events.

  6. Generalized species sampling priors with latent Beta reinforcements

    Science.gov (United States)

    Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele

    2014-01-01

    Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet process and the two-parameter Poisson-Dirichlet process. The proposed construction provides a complete characterization of the joint process, unlike existing work. We then propose the use of such a process as a prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov Chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet process mixtures and Hidden Markov Models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
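    For comparison, the exchangeable predictive rule of the Dirichlet process (the Chinese restaurant process), which the record uses as a reference point, can be sampled as follows; the paper's own non-exchangeable, Beta-reinforced construction is not reproduced here, and the function name is an assumption.

    ```python
    import numpy as np

    def crp_labels(n, alpha, seed=None):
        """Sample n cluster labels from a Chinese restaurant process with concentration alpha."""
        rng = np.random.default_rng(seed)
        labels, counts = [], []
        for i in range(n):
            # join existing cluster k with prob counts[k]/(i + alpha),
            # open a new cluster with prob alpha/(i + alpha)
            probs = np.array(counts + [alpha], dtype=float) / (i + alpha)
            k = rng.choice(len(probs), p=probs)
            if k == len(counts):
                counts.append(1)
            else:
                counts[k] += 1
            labels.append(k)
        return labels
    ```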

  7. Evaluation of the macula prior to cataract surgery.

    Science.gov (United States)

    McKeague, Marta; Sharma, Priya; Ho, Allen C

    2018-01-01

    To describe recent evidence regarding methods of evaluation of retinal structure and function prior to cataract surgery. Studies in patients with cataract but no clinically detectable retinal disease have shown that routine use of optical coherence tomography (OCT) prior to cataract surgery can detect subtle macular disease, which may alter the course of treatment or lead to modification of consent. The routine use of OCT has been especially useful in patients being considered for advanced-technology intraocular lenses (IOLs) as subtle macular disease can be a contraindication to the use of these lenses. The cost-effectiveness of routine use of OCT prior to cataract surgery has not been studied. Other technologies that assess retinal function rather than structure, such as microperimetry and the electroretinogram (ERG), need further study to determine whether they can predict retinal potential in cataract patients. There is growing evidence for the importance of more detailed retinal evaluation of cataract patients even with a clinically normal exam. OCT has been the most established and studied method for retinal evaluation in cataract patients, but other technologies such as microperimetry and ERG are beginning to be studied.

  8. Natural priors, CMSSM fits and LHC weather forecasts

    International Nuclear Information System (INIS)

    Allanach, Benjamin C.; Cranmer, Kyle; Lester, Christopher G.; Weber, Arne M.

    2007-01-01

    Previous LHC forecasts for the constrained minimal supersymmetric standard model (CMSSM), based on current astrophysical and laboratory measurements, have used priors that are flat in the parameter tan β, while being constrained to postdict the central experimental value of M_Z. We construct a different, new and more natural prior with a measure in μ and B (the more fundamental MSSM parameters from which tan β and M_Z are actually derived). We find that as a consequence this choice leads to a well-defined fine-tuning measure in the parameter space. We investigate the effect of this choice on global CMSSM fits to indirect constraints, providing posterior probability distributions for Large Hadron Collider (LHC) sparticle production cross sections. The change in priors has a significant effect, strongly suppressing the pseudoscalar Higgs boson dark matter annihilation region and diminishing the probable values of sparticle masses. We also show how to interpret fit information from a Markov Chain Monte Carlo in a frequentist fashion, namely by using the profile likelihood. Bayesian and frequentist interpretations of CMSSM fits are compared and contrasted

  9. Hazardous materials emergency response training program at Texas A&M University

    International Nuclear Information System (INIS)

    Stirling, A.G.

    1989-01-01

    The Texas Engineering Extension Service (TEEX), as the engineering vocational training arm of the Texas A&M University system, has conducted oil-spill, hazardous-material, and related safety training for industry since 1976 and fire suppression training since 1931. In 1987 TEEX conducted training for some 66,000 persons, of which some 6000 were in hazardous-materials safety training and 22,000 in fire suppression or related fields. Various laws and regulations exist relative to employee training at an industrial facility, such as the Hazard Communication Act, the Resource Conservation and Recovery Act (RCRA), the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA, or more commonly Superfund), the Community Right to Know Law, and the Superfund Amendments and Reauthorization Act (SARA), Titles I and III. The TEEX programs developed on this foundation emphasize the hands-on approach (60% field exercises) to provide a comprehensive training curriculum resulting in regulatory compliance, an effective emergency response capability, a prepared community, and a safe work environment

  10. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {p_θ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao’s Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper rigorously defines the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.
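    For reference, the Riemannian structure behind Rao's distance on the univariate normal family is the Fisher-Rao metric; in (μ, σ) coordinates it reads as follows, together with the prior density restated from the abstract. This is standard background, not the paper's derivation.

    ```latex
    % Fisher--Rao metric on N(\mu, \sigma^2) and the Gaussian prior of the abstract
    ds^2 \;=\; \frac{d\mu^2}{\sigma^2} \;+\; \frac{2\, d\sigma^2}{\sigma^2},
    \qquad
    p\bigl(\theta \mid \bar\theta, \gamma\bigr) \;\propto\;
       \exp\!\Bigl( -\, \frac{d^{\,2}(\theta, \bar\theta)}{2\gamma^2} \Bigr)
       \quad \text{w.r.t.\ Riemannian volume.}
    ```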

  11. TNS Program

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    The fusion program plan is briefly reviewed and the role of the prototype experimental power reactor, thought of as The Next Step (TNS), is discussed. The required device capabilities and basic reactor concepts for a TNS fusion electric plant are given. A detailed discussion of the physics considerations for the Power Generating Fusion Reactor (PGFR), including plasma heating, MHD equilibrium and stability, burn control resulting from toroidal field ripple, fueling, and boundary effects, is presented. Engineering considerations of the major PGFR systems, as well as diagnostics, instrumentation, control, and programmatic issues are also considered in detail. It is concluded that TNS design studies have established the existence of a technical basis for constructing a long pulse, D-T burning tokamak to be operational prior to 1990

  12. Prior and present evidence: how prior experience interacts with present information in a perceptual decision making task.

    Directory of Open Access Journals (Sweden)

    Muhsin Karim

    Full Text Available Vibrotactile discrimination tasks have been used to examine decision making processes in the presence of perceptual uncertainty, induced by barely discernible frequency differences between paired stimuli or by the presence of embedded noise. One lesser known property of such tasks is that decisions made on a single trial may be biased by information from prior trials. An example is the time-order effect, whereby the presentation order of paired stimuli may introduce differences in accuracy. Subjects perform better when the first stimulus lies between the second stimulus and the global mean of all stimuli on the judged dimension ("preferred" time-orders) compared to the alternative presentation order ("nonpreferred" time-orders). This has been conceptualised as a "drift" of the first stimulus representation towards the global mean of the stimulus-set (an internal standard). We describe the influence of prior information in relation to the more traditionally studied factors of interest in a classic discrimination task. Sixty subjects performed a vibrotactile discrimination task with different levels of uncertainty parametrically induced by increasing task difficulty, adding aperiodic stimulus noise, and changing the task instructions whilst maintaining identical stimulus properties (the "context"). The time-order effect had a greater influence on task performance than two of the explicit factors, task difficulty and noise, but not context. The influence of prior information increased with the distance of the first stimulus from the global mean, suggesting that the "drift" velocity of the first stimulus towards the global mean representation was greater for these trials. Awareness of the time-order effect and prior information in general is essential when studying perceptual decision making tasks. Implicit mechanisms may have a greater influence than the explicit factors under study. It also affords valuable insights into basic mechanisms of information

  13. Transcriptomic assessment of resistance to effects of an aryl hydrocarbon receptor (AHR) agonist in embryos of Atlantic killifish (Fundulus heteroclitus) from a marine Superfund site

    Directory of Open Access Journals (Sweden)

    Franks Diana G

    2011-05-01

    Full Text Available Abstract Background Populations of Atlantic killifish (Fundulus heteroclitus) have evolved resistance to the embryotoxic effects of polychlorinated biphenyls (PCBs) and other halogenated and nonhalogenated aromatic hydrocarbons that act through an aryl hydrocarbon receptor (AHR)-dependent signaling pathway. The resistance is accompanied by reduced sensitivity to induction of cytochrome P450 1A (CYP1A), a widely used biomarker of aromatic hydrocarbon exposure and effect, but whether the reduced sensitivity is specific to CYP1A or reflects a genome-wide reduction in responsiveness to all AHR-mediated changes in gene expression is unknown. We compared gene expression profiles and the response to 3,3',4,4',5-pentachlorobiphenyl (PCB-126) exposure in embryos (5 and 10 dpf) and larvae (15 dpf) from F. heteroclitus populations inhabiting the New Bedford Harbor, Massachusetts (NBH) Superfund site (PCB-resistant) and a reference site, Scorton Creek, Massachusetts (SC; PCB-sensitive). Results Analysis using a 7,000-gene cDNA array revealed striking differences in responsiveness to PCB-126 between the populations; the differences occur at all three stages examined. There was a sizeable set of PCB-responsive genes in the sensitive SC population, a much smaller set of PCB-responsive genes in NBH fish, and few similarities in PCB-responsive genes between the two populations. Most of the array results were confirmed, and additional PCB-regulated genes identified, by RNA-Seq (deep pyrosequencing). Conclusions The results suggest that NBH fish possess a gene regulatory defect that is not specific to one target gene such as CYP1A but rather lies in a regulatory pathway that controls the transcriptional response of multiple genes to PCB exposure. The results are consistent with genome-wide disruption of AHR-dependent signaling in NBH fish.

  14. Radiochemical Analyses of the Filter Cake, Granular Activated Carbon, and Treated Ground Water from the DTSC Stringfellow Superfund Site Pretreatment Plant

    International Nuclear Information System (INIS)

    Esser, B K; McConachie, W; Fischer, R; Sutton, M; Szechenyi, S

    2005-01-01

    The Department of Toxic Substance Control (DTSC) requested that Lawrence Livermore National Laboratory (LLNL) evaluate the treatment process currently employed at the Department's Stringfellow Superfund Site Pretreatment Plant (PTP) to determine whether wastes originating from the site were properly managed with regard to their radioactivity. In order to evaluate the current management strategy, LLNL suggested that DTSC characterize the effluents from the waste treatment system for radionuclide content. A sampling plan was developed; samples were collected and analyzed for radioactive constituents. The following is a brief summary of those results and the implications that may be drawn for waste characterization. (1) The sampling and analysis provide strong evidence that the radionuclides present are Naturally Occurring Radioactive Material (NORM). (2) The greatest source of radioactivity in the samples was naturally occurring uranium. The sample results indicate that the uranium concentration in the filter cake is higher than in the Granular Activated Carbon (GAC) samples (11-14 and 2-6 ppm, respectively). (3) No radiologic background for geologic materials has been established for the Stringfellow site, and comprehensive testing of the process stream has not been conducted. Without site-specific testing of geologic materials and waste process streams, it is not possible to conclude whether the filter cake and spent GAC samples contain radioactivity concentrated above natural background levels, or whether radionuclides are being concentrated by the waste treatment process. Recommendation: The regulation of Technologically Enhanced, Naturally Occurring Radioactive Materials (T-NORM) is complex. Since the results of this study do not conclusively demonstrate that natural radioactive materials have not been concentrated by the treatment process, it is recommended that the DTSC consult with the Department of Health Services (DHS) Radiological Health Branch to determine if any further action is

  15. Quantitative analysis of the extent of heavy-metal contamination in soils near Picher, Oklahoma, within the Tar Creek Superfund Site.

    Science.gov (United States)

    Beattie, Rachelle E; Henke, Wyatt; Davis, Conor; Mottaleb, M Abdul; Campbell, James H; McAliley, L Rex

    2017-04-01

    The Tri-State Mining District of Missouri, Kansas and Oklahoma was the site of large-scale mining operations, primarily for lead and zinc, until the mid-1950s. Although mining across the area has ceased, high concentrations of heavy metals remain in the region's soil and water systems. The town of Picher, Ottawa County, OK, lies within this district and was included in the Tar Creek Superfund Site by the U.S. Environmental Protection Agency in 1980 due to extensive contamination. To elucidate the extent of heavy-metal contamination, a soil-chemistry survey of the town of Picher was conducted. Samples (n = 111) were collected from mine tailings, locally known as chat, in Picher and along cardinal-direction transects within an 8.05-km radius of the town in August 2015. Samples were analyzed for soil pH, moisture, and metal content. Inductively Coupled Plasma Optical Emission Spectrometry (ICP-OES) analyses of 20 metals showed high concentrations of lead (>1000 ppm), cadmium (>40 ppm) and zinc (>4000 ppm) throughout the sampled region. Soil moisture content ranged from 0.30 to 35.9%, and pH values ranged from 5.14 to 7.42. MANOVA of metal profiles determined that soils collected from the north transect and chat were significantly different (p < 0.05), and lead, cadmium, and zinc concentrations were correlated with one another. These data show an unequal distribution of contamination surrounding the Picher mining site. Mapping heavy-metal contamination in these soils represents the first step in understanding the distribution of these contaminants at the Picher mining site. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Superfund record of decision (EPA Region 7): Weldon Spring Quarry/Plant/Pits (USDOE), St. Charles, MO, September 30, 1998

    International Nuclear Information System (INIS)

    1999-03-01

    The Weldon Spring Quarry is one of two noncontiguous areas that constitute the US Department of Energy's (DOE) Weldon Spring site. The main area of the site is the chemical plant. Both areas are located in St. Charles County, Missouri, about 48 km (30 mi) west of St. Louis. The US Environmental Protection Agency (EPA) listed the quarry on the National Priorities List (NPL) in 1987, and the chemical plant area was added to the list in 1989. The quarry is about 6.4 km (4 mi) south-southwest of the chemical plant area; it is accessible from State Route 94 and is currently fenced and closed to the public. The quarry is approximately 300 m (1,000 ft) long by 140 m (450 ft) wide and covers an area of approximately 3.6 ha (9 acres). The quarry was used by the Army for disposal of chemically contaminated (explosive) materials in the 1940s and was later used for the disposal of radioactively contaminated material by the Atomic Energy Commission (AEC) in the 1960s. Approximately 110,000 m³ (144,000 yd³) of soil and waste material was removed from the quarry and transported to the chemical plant area as part of completing the remedial action stipulated in the Record of Decision (ROD) for the Quarry Bulk Waste Operable Unit (DOE 1990). Bulk waste removal was completed in October 1995. These wastes have been placed in the disposal cell at the chemical plant. Prior to bulk waste removal, contaminated water contained in the quarry pond was also removed; approximately 170 million liters (44 million gal) have been treated as of March 1998.

  17. Prior video game exposure does not enhance robotic surgical performance.

    Science.gov (United States)

    Harper, Jonathan D; Kaiser, Stefan; Ebrahimi, Kamyar; Lamberton, Gregory R; Hadley, H Roger; Ruckle, Herbert C; Baldwin, D Duane

    2007-10-01

    Prior research has demonstrated that counterintuitive laparoscopic surgical skills are enhanced by experience with video games. A similar relation with robotic surgical skills has not been tested. The purpose of this study was to determine whether prior video-game experience enhances the acquisition of robotic surgical skills. A series of 242 preclinical medical students completed a self-reported video-game questionnaire detailing the frequency, duration, and peak playing time. The 10 students with the highest and lowest video-game exposure completed a follow-up questionnaire further quantifying video game, sports, musical instrument, and craft and hobby exposure. Each subject viewed a training video demonstrating the use of the da Vinci surgical robot in tying knots, followed by 3 minutes of proctored practice time. Subjects then tied knots for 5 minutes while an independent blinded observer recorded the number of knots tied, missed knots, frayed sutures, broken sutures, and mechanical errors. The mean playing time for the 10 game players was 15,136 total hours (range 5,840-30,000 hours). Video-game players tied fewer knots than nonplayers (5.8 v 9.0; P = 0.04). Subjects who had played sports for at least 4 years had fewer mechanical errors (P = 0.04), broke fewer sutures (P = 0.01), and committed fewer total errors (P = 0.01). Similarly, those playing musical instruments longer than 5 years missed fewer knots (P = 0.05). In the extremes of video-game experience tested in this study, game playing was inversely correlated with the ability to learn robotic suturing. This study suggests that advanced surgical skills such as robotic suturing may be learned more quickly by athletes and musicians. Prior extensive video-game exposure had a negative impact on robotic performance.

  18. Prior knowledge of category size impacts visual search.

    Science.gov (United States)

    Wu, Rachel; McGee, Brianna; Echiverri, Chelsea; Zinszer, Benjamin D

    2018-03-30

    Prior research has shown that category search can be similar to one-item search (as measured by the N2pc ERP marker of attentional selection) for highly familiar, smaller categories (e.g., letters and numbers) because the finite set of items in a category can be grouped into one unit to guide search. Other studies have shown that larger, more broadly defined categories (e.g., healthy food) also can elicit N2pc components during category search, but the amplitude of these components is typically attenuated. Two experiments investigated whether the perceived size of a familiar category impacts category and exemplar search. We presented participants with 16 familiar company logos: 8 from a smaller category (social media companies) and 8 from a larger category (entertainment/recreation manufacturing companies). The ERP results from Experiment 1 revealed that, in a two-item search array, search was more efficient for the smaller category of logos compared to the larger category. In a four-item search array (Experiment 2), where two of the four items were placeholders, search was largely similar between the category types, but there was more attentional capture by nontarget members from the same category as the target for smaller rather than larger categories. These results support a growing literature on how prior knowledge of categories affects attentional selection and capture during visual search. We discuss the implications of these findings in relation to assessing cognitive abilities across the lifespan, given that prior knowledge typically increases with age. © 2018 Society for Psychophysiological Research.

  19. Shape prior modeling using sparse representation and online dictionary learning.

    Science.gov (United States)

    Zhang, Shaoting; Zhan, Yiqiang; Zhou, Yan; Uzunbas, Mustafa; Metaxas, Dimitris N

    2012-01-01

    The recently proposed sparse shape composition (SSC) opens a new avenue for shape prior modeling. Instead of assuming any parametric model of shape statistics, SSC incorporates shape priors on-the-fly by approximating a shape instance (usually derived from appearance cues) by a sparse combination of shapes in a training repository. Theoretically, one can increase the modeling capability of SSC by including as many training shapes as possible in the repository. However, this strategy confronts two limitations in practice. First, since SSC involves an iterative sparse optimization at run-time, the more shape instances contained in the repository, the less run-time efficiency SSC has. Therefore, a compact and informative shape dictionary is preferred to a large shape repository. Second, in medical imaging applications, training shapes seldom come in one batch. It is very time consuming and sometimes infeasible to reconstruct the shape dictionary every time new training shapes appear. In this paper, we propose an online learning method to address these two limitations. Our method starts from constructing an initial shape dictionary using the K-SVD algorithm. When new training shapes come, instead of reconstructing the dictionary from the ground up, we update the existing one using a block-coordinate descent approach. Using the dynamically updated dictionary, sparse shape composition can be gracefully scaled up to model shape priors from a large number of training shapes without sacrificing run-time efficiency. Our method is validated on lung localization in X-ray and cardiac segmentation in MRI time series. Compared to the original SSC, it shows comparable performance while being significantly more efficient.
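
    The two ingredients described above, sparse coding of a shape against a dictionary and online updates when new training shapes arrive, can be sketched with scikit-learn as a stand-in for the authors' K-SVD/block-coordinate implementation; the array shapes and parameters below are illustrative.

```python
# Hedged sketch of sparse shape composition with online dictionary updates.
# Shapes are assumed to be vectorized landmark coordinates of equal length.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

n_shapes, n_dims, n_atoms = 200, 128, 32
rng = np.random.default_rng(0)
training_shapes = rng.normal(size=(n_shapes, n_dims))   # stand-in for real shapes

# Initial dictionary (the paper uses K-SVD; mini-batch learning is a stand-in)
learner = MiniBatchDictionaryLearning(n_components=n_atoms, batch_size=16,
                                      random_state=0)
learner.fit(training_shapes)

# Sparse shape composition: approximate a new shape by a few dictionary atoms
new_shape = rng.normal(size=(1, n_dims))
codes = sparse_encode(new_shape, learner.components_,
                      algorithm='omp', n_nonzero_coefs=5)
reconstruction = codes @ learner.components_

# Online update when a new batch of training shapes arrives
new_batch = rng.normal(size=(16, n_dims))
learner.partial_fit(new_batch)          # updates the dictionary incrementally
```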

  20. Mission Specialist Pedro Duque undergoes equipment check prior to launch

    Science.gov (United States)

    1998-01-01

    In the Operations and Checkout Building, STS-95 Mission Specialist Pedro Duque of Spain, with the European Space Agency, gets help with his suit from suit technician Tommy McDonald. The STS-95 crew were conducting flight crew equipment fit checks prior to launch on Oct. 29. STS-95 is expected to launch at 2 p.m. EST on Oct. 29, last 8 days, 21 hours and 49 minutes, and land at 11:49 a.m. EST on Nov. 7.

  1. Challenges in relation to assessment of prior learning

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    2013-01-01

    The paper presents preliminary results from an on-going project: "From unskilled worker to skilled worker in record time". The aim of the project is to qualify unskilled workers for skilled positions in record time by drafting up a plan for the training based on assessment of the students' (the ...). Observations and/or interviews with the students about the students' workplace-based experiences and learning and 2. Drafting up an individual study plan based on the individual student's prior learning....

  2. Online persuasion process: a critical literature review of prior research

    OpenAIRE

    Poorrezaei, M

    2013-01-01

    In this paper, some of the limitations of prior research on the online persuasion process are highlighted. To do this, two main approaches which have been considered to study the online persuasion process in the context of social media are identified. Then, this study discusses the limitations and gaps of each approach. This paper is a part of the author's PhD dissertation, which is being conducted to examine how different online behaviours are persuaded in online brand communities. The r...

  3. Optimal design of priors constrained by external predictors

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2017-01-01

    Roč. 84, č. 1 (2017), s. 150-158 ISSN 0888-613X R&D Projects: GA ČR(CZ) GA16-09848S Institutional support: RVO:67985556 Keywords: Fully probabilistic design * Parameter prior * External predictive distribution * Bayesian transfer learning * Kullback–Leibler divergence Subject RIV: BC - Control Systems Theory OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 2.845, year: 2016 http://library.utia.cas.cz/separaty/2017/AS/guy-0473911.pdf

  4. Technology Corner: Dating of Electronic Hardware for Prior Art Investigations

    Directory of Open Access Journals (Sweden)

    Sellam Ismail

    2012-03-01

    Full Text Available In many legal matters, specifically patent litigation, determining and authenticating the date of computer hardware or other electronic products or components is often key to establishing the item as legitimate evidence of prior art. Such evidence can be used to buttress claims of technologies available or of events transpiring by or at a particular date. In 1945, the Electronics Industry Association published a standard, EIA 476-A, standardized in the reference Source and Date Code Marking (Electronic Industries Association, 1988). (See PDF for the full Tech Corner.)

  5. Prognostic impact of physical activity prior to myocardial infarction

    DEFF Research Database (Denmark)

    Ejlersen, Hanne; Andersen, Zorana Jovanovic; von Euler-Chelpin, My Catarina

    2017-01-01

    the course of myocardial infarction by reducing case fatality and the subsequent risk of heart failure and mortality. Methods: A total of 14,223 participants in the Copenhagen City Heart Study were assessed at baseline in 1976-1978; 1,664 later developed myocardial infarction (mean age at myocardial...... estimated by logistic and Cox proportional hazards regression models, adjusted for age at myocardial infarction and other potential confounders. Results: A total of 425 (25.5%) myocardial infarctions were fatal. Higher levels of LTPA prior to myocardial infarction were associated with lower case fatality...

  6. Number of patients studied prior to approval of new medicines

    DEFF Research Database (Denmark)

    Duijnhoven, Ruben G; Straus, Sabine M J M; Raine, June M

    2013-01-01

    length of time), whereas 67 (79.8%) of the medicines met the criteria for 12-mo patient exposure (at least 100 participants studied for 12 mo). CONCLUSIONS: For medicines intended for chronic use, the number of patients studied before marketing is insufficient to evaluate safety and long-term efficacy....... Both safety and efficacy require continued study after approval. New epidemiologic tools and legislative actions necessitate a review of the requirements for the number of patients studied prior to approval, particularly for chronic use, and adequate use of post-marketing studies. Please see later...

  7. Prior childhood sexual abuse in mothers of sexually abused children.

    Science.gov (United States)

    Oates, R K; Tebbutt, J; Swanston, H; Lynch, D L; O'Toole, B I

    1998-11-01

    To see if mothers who were sexually abused in their own childhood are at increased risk of their children being sexually abused and to see if prior sexual abuse in mothers affects their parenting abilities. Sixty-seven mothers whose children had been sexually abused by others and 65 control mothers were asked about sexual abuse in their own childhood. The sexually abused children of mothers who had been sexually abused in their own childhood were compared with the sexually abused children of mothers who had not suffered child sexual abuse as children. Comparisons were made on self-esteem, depression and behavior in the children. Thirty-four percent of mothers of sexually abused children gave a history of sexual abuse in their own childhoods, compared with 12% of control mothers. Assessment of the sexually abused children for self-esteem, depression and behavior at the time of diagnosis, after 18 months and after 5 years showed no difference in any of these measures at any of the three time intervals between those whose mothers had suffered child sexual abuse and those whose mothers had not been abused. In this study, sexual abuse in a mother's own childhood was related to an increased risk of sexual abuse occurring in the next generation, although prior maternal sexual abuse did not affect outcome in children who were sexually abused.

  8. Prior storm experience moderates water surge perception and risk.

    Directory of Open Access Journals (Sweden)

    Gregory D Webster

    Full Text Available BACKGROUND: How accurately do people perceive extreme water speeds and how does their perception affect perceived risk? Prior research has focused on the characteristics of moving water that can reduce human stability or balance. The current research presents the first experiment on people's perceptions of risk and moving water at different speeds and depths. METHODS: Using a randomized within-person 2 (water depth: 0.45, 0.90 m) × 3 (water speed: 0.4, 0.8, 1.2 m/s) experiment, we immersed 76 people in moving water and asked them to estimate water speed and the risk they felt. RESULTS: Multilevel modeling showed that people increasingly overestimated water speeds as actual water speeds increased or as water depth increased. Water speed perceptions mediated the direct positive relationship between actual water speeds and perceptions of risk; the faster the moving water, the greater the perceived risk. Participants' prior experience with rip currents and tropical cyclones moderated the strength of the actual-perceived water speed relationship; consequently, mediation was stronger for people who had experienced no rip currents or fewer storms. CONCLUSIONS: These findings provide a clearer understanding of water speed and risk perception, which may help communicate the risks associated with anticipated floods and tropical cyclones.

  9. Science Literacy and Prior Knowledge of Astronomy MOOC Students

    Science.gov (United States)

    Impey, Chris David; Buxner, Sanlyn; Wenger, Matthew; Formanek, Martin

    2018-01-01

    Many of the science classes offered on Coursera fall into the category of general education or general interest classes for lifelong learners, including our own, Astronomy: Exploring Time and Space. Very little is known about the backgrounds and prior knowledge of these students. In this talk we present the results of a survey of our Astronomy MOOC students. We also compare these results to our previous work on undergraduate students in introductory astronomy courses. Survey questions examined student demographics and motivations as well as their science and information literacy (including basic science knowledge, interest, attitudes and beliefs, and where they get their information about science). We found that our MOOC students are different from the undergraduate students in more ways than demographics. Many MOOC students demonstrated high levels of science and information literacy. With a more comprehensive understanding of our students' motivations and prior knowledge about science and how they get their information about science, we will be able to develop more tailored learning experiences for these lifelong learners.

  10. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    Science.gov (United States)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
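
    The block-wise thresholding and interpolation step described above can be sketched as follows; the per-block rule here is plain Otsu for brevity, whereas the paper uses saliency-weighted foreground/background histograms, so this is only a structural illustration with assumed block size and helper names.

```python
# Hedged sketch: compute one threshold per block, interpolate block thresholds
# to a per-pixel threshold map, then compare the image against that map.
import numpy as np
from scipy.ndimage import zoom
from skimage.filters import threshold_otsu

def blockwise_threshold_map(image, block=64):
    h, w = image.shape
    nby, nbx = int(np.ceil(h / block)), int(np.ceil(w / block))
    block_t = np.empty((nby, nbx), dtype=float)
    for by in range(nby):
        for bx in range(nbx):
            patch = image[by*block:(by+1)*block, bx*block:(bx+1)*block]
            block_t[by, bx] = threshold_otsu(patch) if np.ptp(patch) > 0 else patch.mean()
    # Interpolate the coarse block grid up to a full-resolution threshold image
    t_map = zoom(block_t, (h / nby, w / nbx), order=1)
    return t_map[:h, :w]

# Foreground mask: compare the original image with the threshold image
# image = ...  # 2-D grayscale array
# mask = image > blockwise_threshold_map(image)
```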

  11. Lymphography prior to laparoscopic Palomo varicocelectomy to prevent postoperative hydrocele.

    Science.gov (United States)

    Chiarenza, Salvatore F; D'Agostino, Sergio; Scarpa, Mariagrazia; Fabbro, Mariangelica; Costa, Lorenzo; Musi, Luciano

    2006-08-01

    We report our experience with preoperative lymphography to identify the lymphatic vessels and preserve them during ligature, in order to reduce the incidence of postoperative testicular hydrocele in patients undergoing laparoscopic Palomo varicocelectomy for adolescent varicocele. Twenty-seven consecutive patients with varicocele had preoperative lymphography. The mean age was 13.5 years (range, 8-18 years) and the mean grade of varicocele was III. We performed lymphography with intrascrotal isosulfan blue. The laparoscopic Palomo procedure was successfully carried out in all patients. In 17 patients (63%) we were able to identify and conserve the lymphatic vessels by lymphography. Mean follow-up was 9.5 months (range, 6-24 months). None of the 27 patients had a recurrence. None of the 17 patients with positive lymphography had a testicular hydrocele. One of the 10 remaining patients developed a sizable hydrocele. Preoperative lymphography prior to laparoscopic Palomo varicocelectomy is a simple and feasible method for preventing testicular hydrocele. However, the method should be standardized to identify the exact site, the correct level of injection of blue dye, and to determine the optimal time to perform lymphography prior to the procedure.

  12. Bayesian inference from count data using discrete uniform priors.

    Directory of Open Access Journals (Sweden)

    Federico Comoglio

    Full Text Available We consider a set of sample counts obtained by sampling arbitrary fractions of a finite volume containing an homogeneously dispersed population of identical objects. We report a Bayesian derivation of the posterior probability distribution of the population size using a binomial likelihood and non-conjugate, discrete uniform priors under sampling with or without replacement. Our derivation yields a computationally feasible formula that can prove useful in a variety of statistical problems involving absolute quantification under uncertainty. We implemented our algorithm in the R package dupiR and compared it with a previously proposed Bayesian method based on a Gamma prior. As a showcase, we demonstrate that our inference framework can be used to estimate bacterial survival curves from measurements characterized by extremely low or zero counts and rather high sampling fractions. All in all, we provide a versatile, general purpose algorithm to infer population sizes from count data, which can find application in a broad spectrum of biological and physical problems.
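
    A hedged sketch of the kind of inference described above (not the dupiR implementation): the posterior over a population size N given a count k observed in a sampled fraction f, with a binomial likelihood and a discrete uniform prior. The truncation point n_max and the example numbers are assumptions.

```python
# Posterior over population size N under sampling with replacement:
# p(N | k) ∝ Uniform(N) * Binomial(k | N, f)
import numpy as np
from scipy.stats import binom

def posterior_population_size(k, f, n_max=10_000):
    n = np.arange(k, n_max + 1)              # N cannot be smaller than the count
    prior = np.ones_like(n, dtype=float)     # discrete uniform prior
    like = binom.pmf(k, n, f)                # P(k | N, f)
    post = prior * like
    return n, post / post.sum()

n, post = posterior_population_size(k=12, f=0.05)
print(n[np.argmax(post)])                    # posterior mode, roughly k / f
```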

  13. Prior oral conditions in patients undergoing heart valve surgery.

    Science.gov (United States)

    Silvestre, Francisco-Javier; Gil-Raga, Irene; Martinez-Herrera, Mayte; Lauritano, Dorina; Silvestre-Rangil, Javier

    2017-11-01

    Patients scheduled for heart valve surgery should be free of any oral infectious disorders that might pose a risk in the postoperative period. Few studies have been made on the dental conditions of such patients prior to surgery. The present study describes the most frequent prior oral diseases in this population group. A prospective, observational case-control study was designed involving 60 patients (30 with heart valve disease and 30 controls, with a mean age of 71 years in both groups). A dental exploration was carried out, with calculation of the DMFT (decayed, missing and filled teeth) index and recording of the periodontal parameters (plaque index, gingival bleeding index, periodontal pocket depth, and attachment loss). The oral mucosa was also examined, and panoramic X-rays were used to identify possible intrabony lesions. Significant differences in bacterial plaque index were observed between the two groups (p < 0.05), with higher scores in the patients with valve disease. Probing depth and the presence of moderate pockets were also greater in the patients with valve disease than among the controls (p < 0.01). Sixty percent of the patients with valve disease presented periodontitis. Patients scheduled for heart valve surgery should be examined for possible active periodontitis before the operation. Those individuals found to have periodontal disease should receive adequate periodontal treatment before heart surgery. Key words: Valve disease, aortic, mitral, heart surgery, periodontitis.

  14. Weakly supervised semantic segmentation using fore-background priors

    Science.gov (United States)

    Han, Zheng; Xiao, Zhitao; Yu, Mingjun

    2017-07-01

    Weakly-supervised semantic segmentation is a challenge in the field of computer vision. Most previous works utilize the labels of the whole training set and thereby need to construct a relationship graph over image labels, which results in expensive computation. In this study, we tackle this problem from a different perspective. We propose a novel semantic segmentation algorithm based on background priors, which avoids the construction of a huge graph over the whole training dataset. Specifically, a random forest classifier is trained using weakly supervised training data. Then semantic texton forest (STF) features are extracted from image superpixels. Finally, a CRF-based optimization algorithm is proposed, in which the unary potential is derived from the output probability of the random forest classifier and from a robust saliency map serving as the background prior. Experiments on the MSRC21 dataset show that the new algorithm outperforms some previous influential weakly-supervised segmentation algorithms. Furthermore, the use of an efficient decision forest classifier and parallel computation of the saliency map significantly accelerates the implementation.
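
    The unary-potential construction described above can be illustrated with a short sketch that blends classifier probabilities with a saliency map used as a background prior; the blending weight, the negative-log form, and the function names are assumptions for illustration, not the paper's exact energy.

```python
# Hedged sketch: CRF unary energies from classifier probabilities plus a
# saliency-based background prior.
import numpy as np

def unary_potentials(class_probs, saliency, bg_label=0, w_saliency=0.5, eps=1e-6):
    """class_probs: (H, W, C) softmax-like scores; saliency: (H, W) in [0, 1]."""
    unary = -np.log(class_probs + eps)                     # data term per class
    # Low saliency suggests background: cheapen the background label there,
    # and penalize foreground labels where saliency is low.
    unary[..., bg_label] += w_saliency * -np.log(1.0 - saliency + eps)
    for c in range(class_probs.shape[-1]):
        if c != bg_label:
            unary[..., c] += w_saliency * -np.log(saliency + eps)
    return unary        # feed into a CRF solver as the unary energy
```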

  15. Physiology declines prior to death in Drosophila melanogaster.

    Science.gov (United States)

    Shahrestani, Parvin; Tran, Xuan; Mueller, Laurence D

    2012-10-01

    For a period of 6-15 days prior to death, the fecundity and virility of Drosophila melanogaster fall significantly below those of same-aged flies that are not near death. It is likely that other aspects of physiology may decline during this period. This study attempts to document changes in two physiological characteristics prior to death: desiccation resistance and time-in-motion. Using individual fecundity estimates and previously described models, it is possible to accurately predict which flies in a population are near death at any given age; these flies are said to be in the "death spiral". In this study of approximately 7,600 females, we used cohort mortality data and individual fecundity estimates to dichotomize each of five replicate populations of same-aged D. melanogaster into "death spiral" and "non-spiral" groups. We then compared these groups for two physiological characteristics that decline during aging. We describe the statistical properties of a new multivariate test statistic that allows us to compare the desiccation resistance and time-in-motion for two populations chosen on the basis of their fecundity. This multivariate representation of the desiccation resistance and time-in-motion of spiral and non-spiral females was shown to be significantly different with the spiral females characterized by lower desiccation resistance and time spent in motion. Our results suggest that D. melanogaster may be used as a model organism to study physiological changes that occur when death is imminent.

  16. Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2012-01-01

    We investigate posterior contraction rates for priors on multivariate functions that are constructed using tensor-product B-spline expansions. We prove that using a hierarchical prior with an appropriate prior distribution on the partition size and Gaussian prior weights on the B-spline
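
    A hedged sketch in the spirit of this construction: draw a random surface from a tensor-product B-spline expansion with Gaussian weights. The knot layout, spline degree, and basis size are illustrative choices, not the ones analyzed in the paper.

```python
# Tensor-product B-spline expansion with Gaussian prior weights on the coefficients.
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_basis, degree=3):
    """Evaluate a clamped (open-uniform) B-spline basis on [0, 1] at points x."""
    n_knots = n_basis + degree + 1
    inner = np.linspace(0.0, 1.0, n_knots - 2 * degree)
    knots = np.concatenate([np.zeros(degree), inner, np.ones(degree)])
    # Each basis function = BSpline with a one-hot coefficient vector
    return np.column_stack([BSpline(knots, np.eye(n_basis)[j], degree)(x)
                            for j in range(n_basis)])

rng = np.random.default_rng(1)
n1 = n2 = 8
x = y = np.linspace(0.0, 1.0, 50)
B1, B2 = bspline_basis(x, n1), bspline_basis(y, n2)
weights = rng.normal(size=(n1, n2))          # Gaussian prior weights
f = B1 @ weights @ B2.T                      # random surface f(x_i, y_j)
```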

  17. Validation of Ulchin Units 1, 2 CONTEMPT Model Prior to the Production of EQ Envelope Curve

    International Nuclear Information System (INIS)

    Hwang, Su Hyun; Kim, Min Ki; Hong, Soon Joon; Lee, Byung Chul; Suh, Jeong Kwan; Lee, Jae Yong; Song, Dong Soo

    2010-01-01

    The Ulchin Units 1, 2 will be refurbished with RSG (Replacement of Steam Generator) and PU (Power Uprate). The current EQ (Environmental Qualification) envelope curve should be modified according to RSG and PU. The containment P/T (Pressure/Temperature) analysis in the Ulchin Units 1, 2 FSAR was done using the EDF computer program PAREO6. PAREO6 uses the same assumptions as the US NRC CONTEMPT program, and the results given by both programs are in good agreement. It is utilized to determine pressure and temperature variations in a PWR containment subsequent to a reactor coolant or secondary system pipe break. However, PAREO6 is not available for the production of the EQ envelope curve, so the CONTEMPT code should be used instead of PAREO6. It is essential to validate the CONTEMPT OSG (Original Steam Generator) model prior to the production of the EQ envelope curve considering RSG and PU. This study has been performed to validate the CONTEMPT model of Ulchin Units 1, 2 by comparing the CONTEMPT results with the PAREO6 results in the Ulchin Units 1, 2 FSAR.

  18. Validation of Ulchin Units 1, 2 CONTEMPT Model Prior to the Production of EQ Envelope Curve

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Su Hyun; Kim, Min Ki; Hong, Soon Joon; Lee, Byung Chul [FNC Technology Co., SNU, Seoul (Korea, Republic of); Suh, Jeong Kwan; Lee, Jae Yong; Song, Dong Soo [KEPCO Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The Ulchin Units 1, 2 will be refurbished with RSG (Replacement of Steam Generator) and PU (Power Uprate). The current EQ (Environmental Qualification) envelope curve should be modified according to RSG and PU. The containment P/T (Pressure/Temperature) analysis in the Ulchin Units 1, 2 FSAR was done using the EDF computer program PAREO6. PAREO6 uses the same assumptions as the US NRC CONTEMPT program, and the results given by both programs are in good agreement. It is utilized to determine pressure and temperature variations in a PWR containment subsequent to a reactor coolant or secondary system pipe break. However, PAREO6 is not available for the production of the EQ envelope curve, so the CONTEMPT code should be used instead of PAREO6. It is essential to validate the CONTEMPT OSG (Original Steam Generator) model prior to the production of the EQ envelope curve considering RSG and PU. This study has been performed to validate the CONTEMPT model of Ulchin Units 1, 2 by comparing the CONTEMPT results with the PAREO6 results in the Ulchin Units 1, 2 FSAR.

  19. Bayesian Model Comparison With the g-Prior

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan

    2014-01-01

    ’s asymptotic MAP rule was an improvement, and in this paper we extend the work by Djuric in several ways. Specifically, we consider the elicitation of proper prior distributions, treat the case of real- and complex-valued data simultaneously in a Bayesian framework similar to that considered by Djuric......, and develop new model selection rules for a regression model containing both linear and non-linear parameters. Moreover, we use this framework to give a new interpretation of the popular information criteria and relate their performance to the signal-to-noise ratio of the data. By use of simulations, we also...... demonstrate that our proposed model comparison and selection rules outperform the traditional information criteria both in terms of detecting the true model and in terms of predicting unobserved data. The simulation code is available online....

  20. Negotiating Multicollinearity with Spike-and-Slab Priors.

    Science.gov (United States)

    Ročková, Veronika; George, Edward I

    2014-08-01

    In multiple regression under the normal linear model, the presence of multicollinearity is well known to lead to unreliable and unstable maximum likelihood estimates. This can be particularly troublesome for the problem of variable selection where it becomes more difficult to distinguish between subset models. Here we show how adding a spike-and-slab prior mitigates this difficulty by filtering the likelihood surface into a posterior distribution that allocates the relevant likelihood information to each of the subset model modes. For identification of promising high posterior models in this setting, we consider three EM algorithms, the fast closed form EMVS version of Rockova and George (2014) and two new versions designed for variants of the spike-and-slab formulation. For a multimodal posterior under multicollinearity, we compare the regions of convergence of these three algorithms. Deterministic annealing versions of the EMVS algorithm are seen to substantially mitigate this multimodality. A single simple running example is used for illustration throughout.
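
    For illustration, the continuous spike-and-slab prior underlying EMVS-style formulations can be written as a two-component normal mixture; the variances and mixing weight below are placeholders, and the error variance is fixed at one for simplicity, so this is a hedged sketch rather than the paper's exact specification.

```python
# Spike-and-slab prior: p(beta) = (1 - theta) N(0, v0) + theta N(0, v1), v0 << v1.
import numpy as np
from scipy.stats import norm

def spike_and_slab_pdf(beta, theta=0.5, v0=0.01, v1=10.0):
    spike = norm.pdf(beta, loc=0.0, scale=np.sqrt(v0))
    slab = norm.pdf(beta, loc=0.0, scale=np.sqrt(v1))
    return (1.0 - theta) * spike + theta * slab

def inclusion_probability(beta, theta=0.5, v0=0.01, v1=10.0):
    """E-step-style weight: probability the coefficient came from the slab."""
    slab = theta * norm.pdf(beta, 0.0, np.sqrt(v1))
    spike = (1.0 - theta) * norm.pdf(beta, 0.0, np.sqrt(v0))
    return slab / (slab + spike)

print(inclusion_probability(np.array([0.01, 0.5, 2.0])))   # grows with |beta|
```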

  1. Weight reduction intervention for obese infertile women prior to IVF

    DEFF Research Database (Denmark)

    Einarsson, Snorri; Bergh, Christina; Friberg, Britt

    2017-01-01

    in the weight reduction group reaching BMI ≤ 25 kg/m2 or reaching a weight loss of at least five BMI units to the IVF only group. No statistical differences in live birth rates between the groups in either subgroup analysis were found. LIMITATIONS, REASON FOR CAUTION: The study was not powered to detect a small......STUDY QUESTION: Does an intensive weight reduction programme prior to IVF increase live birth rates for infertile obese women? SUMMARY ANSWER: An intensive weight reduction programme resulted in a large weight loss but did not substantially affect live birth rates in obese women scheduled for IVF...... in infertile obese women. STUDY DESIGN, SIZE, DURATION: A prospective, multicentre, randomized controlled trial was performed between 2010 and 2016 in the Nordic countries. In total, 962 women were assessed for eligibility and 317 women were randomized. Computerized randomization with concealed allocation...

  2. Creating illusions of knowledge: learning errors that contradict prior knowledge.

    Science.gov (United States)

    Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J

    2013-02-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors. 2013 APA, all rights reserved

  3. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick breaking prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…
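
    The Chinese restaurant process that the treeCRP extends can be sketched directly: customers are seated at existing tables with probability proportional to table size, or at a new table with probability proportional to the concentration parameter. This is the plain CRP, not the tree-structured extension or the split-merge updates proposed in the paper.

```python
# Hedged sketch of CRP cluster assignments for n items with concentration alpha.
import numpy as np

def sample_crp(n, alpha, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    assignments, table_sizes = [], []
    for _ in range(n):
        probs = np.array(table_sizes + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(table_sizes):
            table_sizes.append(1)          # open a new cluster
        else:
            table_sizes[k] += 1
        assignments.append(k)
    return assignments

clusters = sample_crp(n=100, alpha=1.0)
print(len(set(clusters)))                  # number of clusters grows ~ alpha * log(n)
```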

  4. Was the Universe actually radiation dominated prior to nucleosynthesis?

    Science.gov (United States)

    Giblin, John T.; Kane, Gordon; Nesbit, Eva; Watson, Scott; Zhao, Yue

    2017-08-01

    Maybe not. String theory approaches to both beyond the Standard Model and inflationary model building generically predict the existence of scalars (moduli) that are light compared to the scale of quantum gravity. These moduli become displaced from their low energy minima in the early Universe and lead to a prolonged matter-dominated epoch prior to big bang nucleosynthesis (BBN). In this paper, we examine whether nonperturbative effects such as parametric resonance or tachyonic instabilities can shorten, or even eliminate, the moduli condensate and matter-dominated epoch. Such effects depend crucially on the strength of the couplings, and we find that unless the moduli become strongly coupled, the matter-dominated epoch is unavoidable. In particular, we find that in string and M-theory compactifications where the lightest moduli are near the TeV scale, a matter-dominated epoch will persist until the time of big bang nucleosynthesis.

  5. Visibility Restoration for Single Hazy Image Using Dual Prior Knowledge

    Directory of Open Access Journals (Sweden)

    Mingye Ju

    2017-01-01

    Full Text Available Single image haze removal has been a challenging task due to its severely ill-posed nature. In this paper, we propose a novel single image algorithm that improves the detail and color of such degraded images. More concretely, we redefine a more reliable atmospheric scattering model (ASM) based on our previous work and the atmospheric point spread function (APSF). Further, by taking the haze density spatial feature into consideration, we design a scene-wise APSF kernel prediction mechanism to eliminate the multiple-scattering effect. With the redefined ASM and designed APSF, combined with the existing prior knowledge, the complex dehazing problem can be subtly converted into a one-dimensional searching problem, which allows us to directly obtain the scene transmission and thereby recover visually realistic results via the proposed ASM. Experimental results verify that our algorithm outperforms several state-of-the-art dehazing techniques in terms of robustness, effectiveness, and efficiency.
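
    For context, the classical atmospheric scattering model that such dehazing methods start from, I(x) = J(x)t(x) + A(1 − t(x)), can be inverted in a few lines once a transmission map and airlight estimate are available. This is the textbook ASM, not the redefined model or APSF mechanism proposed in the paper; the clipping floor is an assumption.

```python
# Hedged sketch: recover scene radiance J from a hazy image I given an
# estimated transmission map t and airlight A.
import numpy as np

def recover_radiance(hazy, transmission, airlight, t_min=0.1):
    """hazy: (H, W, 3) in [0, 1]; transmission: (H, W); airlight: (3,)."""
    t = np.clip(transmission, t_min, 1.0)[..., None]   # avoid division blow-up
    radiance = (hazy - airlight) / t + airlight
    return np.clip(radiance, 0.0, 1.0)
```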

  6. The effect of prior transfusion history on blood donor anti-hepatitis C virus antibody.

    Science.gov (United States)

    Mazda, T; Nakata, K; Ota, K; Kaminuma, Y; Katayama, T

    1993-01-01

    In Japan, the major transfusion-associated disease is non-A, non-B hepatitis. We studied the relationship between transfusion history and blood donor antibodies to hepatitis C virus (HCV). The positive rate of antibodies to the HCV nonstructural protein (c100-3) depended on age and the time elapsed since transfusion. The anti-c100-3 ratio for subjects transfused more than 20 years earlier was high. One quarter century ago, a change occurred in national blood policy from paid to non-paid voluntary donations. We have also studied the anti-HCV positive rate among donors with prior transfusion using a second generation HCV test kit which includes anti-HCV core antibody detection. The anti-HCV positive rate for the second generation test was higher than that for the anti-c100-3 test. Introduction of the second generation test is therefore more useful for screening in blood programs than the anti-c100-3 test.

  7. Algorithms and tools for system identification using prior knowledge

    International Nuclear Information System (INIS)

    Lindskog, P.

    1994-01-01

    One of the hardest problems in system identification is that of model structure selection. In this thesis two different kinds of a priori process knowledge are used to address this fundamental problem. Concentrating on linear model structures, the first prior taken advantage of is knowledge about the system's dominating time constants and resonance frequencies. The idea is to generalize FIR modelling by replacing the usual delay operator with discrete so-called Laguerre or Kautz filters. The generalization is such that stability, the linear regression structure and the approximation ability of the FIR model structure are retained, whereas the prior is used to reduce the number of parameters needed to arrive at a reasonable model. Tailored and efficient system identification algorithms for these model structures are detailed in this work. The usefulness of the proposed methods is demonstrated through concrete simulation and application studies. The other approach is referred to as semi-physical modelling. The main idea is to use simple physical insight into the application, often in terms of a set of unstructured equations, in order to come up with suitable nonlinear transformations of the raw measurements, so as to allow for a good model structure. Semi-physical modelling is less 'ambitious' than physical modelling in that no complete physical structure is sought, just combinations of inputs and outputs that can be subjected to more or less standard model structures, such as linear regressions. The suggested modelling procedure starts with a step where symbolic computations are employed to determine a suitable model structure: a set of regressors. We show how constructive methods from commutative and differential algebra can be applied for this. Subsequently, different numerical schemes for finding a subset of 'good' regressors and for estimating the corresponding linear-in-the-parameters model are discussed. 107 refs, figs, tabs
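
    The Laguerre-basis idea can be sketched as follows: filter the input through a Laguerre filter bank whose pole encodes the prior on the dominating time constant, and use the filter outputs as regressors in an ordinary least-squares fit. Conventions for the basis differ slightly between texts, so this is an illustrative variant rather than the thesis' exact algorithms.

```python
# Hedged sketch: discrete Laguerre filter bank as regressors for a linear model.
import numpy as np
from scipy.signal import lfilter

def laguerre_regressors(u, a, n_filters):
    """Filter input u through a Laguerre filter bank with pole a (|a| < 1)."""
    regs = []
    # First filter: sqrt(1 - a^2) * z^-1 / (1 - a z^-1)
    x = lfilter([0.0, np.sqrt(1.0 - a**2)], [1.0, -a], u)
    regs.append(x)
    for _ in range(1, n_filters):
        # Each further filter adds an all-pass section (z^-1 - a)/(1 - a z^-1)
        x = lfilter([-a, 1.0], [1.0, -a], x)
        regs.append(x)
    return np.column_stack(regs)

# Least-squares fit of an output y on the Laguerre regressors (illustrative):
# Phi = laguerre_regressors(u, a=0.8, n_filters=6)
# theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```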

  8. Global tractography with embedded anatomical priors for quantitative connectivity analysis

    Directory of Open Access Journals (Sweden)

    Alia eLemkaddem

    2014-11-01

    Full Text Available The main assumption of fiber-tracking algorithms is that fiber trajectories are represented by paths of highest diffusion, which is usually accomplished by following the principal diffusion directions estimated in every voxel from the measured diffusion MRI data. The state-of-the-art approaches, known as global tractography, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The tractograms obtained with these algorithms outperform any previous technique but, unfortunately, the price to pay is an increased computational cost which is not suitable in many practical settings, both in terms of time and memory requirements. Furthermore, existing global tractography algorithms suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are used during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the white matter. This not only unnecessarily slows down the estimation procedure and potentially biases any subsequent analysis but also, most importantly, prevents the de facto quantification of brain connectivity. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications by explicitly enforcing anatomical priors of the tracts in the optimization and considering the effective contribution of each of them, i.e. volume, to the acquired diffusion MRI image. We evaluated our approach on both a realistic diffusion MRI phantom and in-vivo data, and also compared its performance to existing tractography algorithms.

  9. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-08

    In this study, we optimize the experimental setup computationally by optimal experimental design (OED) in a Bayesian framework. We approximate the posterior probability density functions (pdf) using truncated Gaussian distributions in order to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method, more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate, and the covariance is chosen as the negative inverse of the Hessian of the misfit function at the MAP estimate. The model-related entities are obtained from a polynomial surrogate. The optimality, quantified by the information gain measures, can be estimated efficiently by a rejection sampling algorithm against the underlying Gaussian probability distribution, rather than against the true posterior. This approach offers a significant error reduction when the magnitude of the invariants of the posterior covariance is comparable to the size of the bounded domain of the prior. We demonstrate the accuracy and superior computational efficiency of our method for shock-tube experiments aiming to measure the model parameters of a key reaction which is part of the complex kinetic network describing the hydrocarbon oxidation. In the experiments, the initial temperature and fuel concentration are optimized with respect to the expected information gain in the estimation of the parameters of the target reaction rate. We show that the expected information gain surface can change its shape dramatically according to the level of noise introduced into the synthetic data. The information that can be extracted from the data saturates as a logarithmic function of the number of experiments, and few experiments are needed when they are conducted at the optimal experimental design conditions.

  10. Targeted Memory Reactivation during Sleep Depends on Prior Learning.

    Science.gov (United States)

    Creery, Jessica D; Oudiette, Delphine; Antony, James W; Paller, Ken A

    2015-05-01

    When sounds associated with learning are presented again during slow-wave sleep, targeted memory reactivation (TMR) can produce improvements in subsequent location recall. Here we used TMR to investigate memory consolidation during an afternoon nap as a function of prior learning. Twenty healthy individuals (8 male, 19-23 y old). Participants learned to associate each of 50 common objects with a unique screen location. When each object appeared, its characteristic sound was played. After electroencephalography (EEG) electrodes were applied, location recall was assessed for each object, followed by a 90-min interval for sleep. During EEG-verified slow-wave sleep, half of the sounds were quietly presented over white noise. Recall was assessed 3 h after initial learning. A beneficial effect of TMR was found in the form of higher recall accuracy for cued objects compared to uncued objects when pre-sleep accuracy was used as an explanatory variable. An analysis of individual differences revealed that this benefit was greater for participants with higher pre-sleep recall accuracy. In an analysis for individual objects, cueing benefits were apparent as long as initial recall was not highly accurate. Sleep physiology analyses revealed that the cueing benefit correlated with delta power and fast spindle density. These findings substantiate the use of targeted memory reactivation (TMR) methods for manipulating consolidation during sleep. TMR can selectively strengthen memory storage for object-location associations learned prior to sleep, except for those near-perfectly memorized. Neural measures found in conjunction with TMR-induced strengthening provide additional evidence about mechanisms of sleep consolidation. © 2015 Associated Professional Sleep Societies, LLC.

  11. Forecasting economy with Bayesian autoregressive distributed lag model: choosing optimal prior in economic downturn

    OpenAIRE

    Bušs, Ginters

    2009-01-01

    Bayesian inference requires an analyst to set priors. Setting the right prior is crucial for precise forecasts. This paper analyzes how optimal prior changes when an economy is hit by a recession. For this task, an autoregressive distributed lag (ADL) model is chosen. The results show that a sharp economic slowdown changes the optimal prior in two directions. First, it changes the structure of the optimal weight prior, setting smaller weight on the lagged dependent variable compared to varia...

  12. 5 CFR 7701.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... facilities not available to the general public will be used in connection with the outside employment; and (4... teaching a course which is part of the established curriculum of an accredited institution of higher education, secondary school, elementary school, or an education or training program sponsored by a Federal...

  13. Combining prior day contours to improve automated prostate segmentation

    International Nuclear Information System (INIS)

    Godley, Andrew; Sheplan Olsen, Lawrence J.; Stephans, Kevin; Zhao Anzi

    2013-01-01

    Purpose: To improve the accuracy of automatically segmented prostate, rectum, and bladder contours required for online adaptive therapy. The contouring accuracy on the current image guidance [image guided radiation therapy (IGRT)] scan is improved by combining contours from earlier IGRT scans via the simultaneous truth and performance level estimation (STAPLE) algorithm. Methods: Six IGRT prostate patients treated with daily kilo-voltage (kV) cone-beam CT (CBCT) had their original plan CT and nine CBCTs contoured by the same physician. Three types of automated contours were produced for analysis. (1) Plan: By deformably registering the plan CT to each CBCT and then using the resulting deformation field to morph the plan contours to match the CBCT anatomy. (2) Previous: The contour set drawn by the physician on the previous day CBCT is similarly deformed to match the current CBCT anatomy. (3) STAPLE: The contours drawn by the physician, on each prior CBCT and the plan CT, are deformed to match the CBCT anatomy to produce multiple contour sets. These sets are combined using the STAPLE algorithm into one optimal set. Results: Compared to plan and previous, STAPLE improved the average Dice's coefficient (DC) with the original physician drawn CBCT contours to a DC as follows: Bladder: 0.81 ± 0.13, 0.91 ± 0.06, and 0.92 ± 0.06; Prostate: 0.75 ± 0.08, 0.82 ± 0.05, and 0.84 ± 0.05; and Rectum: 0.79 ± 0.06, 0.81 ± 0.06, and 0.85 ± 0.04, respectively. The STAPLE results are within intraobserver consistency, determined by the physician blindly recontouring a subset of CBCTs. Comparing plans recalculated using the physician and STAPLE contours showed an average disagreement less than 1% for prostate D98 and mean dose, and 5% and 3% for bladder and rectum mean dose, respectively. One scan takes an average of 19 s to contour. Using five scans plus STAPLE takes less than 110 s on a 288 core graphics processor unit. Conclusions: Combining the plan and all prior days via
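
    To make the record's evaluation metric concrete, the short Python sketch below computes Dice's coefficient between binary masks and fuses several noisy contour sets. Simple majority voting is used here only as a stand-in for the full STAPLE EM algorithm mentioned above, and the masks and noise level are synthetic inventions for illustration.

      import numpy as np

      def dice(a, b):
          """Dice's coefficient DC = 2|A n B| / (|A| + |B|) for boolean masks."""
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      def majority_vote(masks):
          """Fuse several binary segmentations: keep voxels most raters marked."""
          stack = np.stack([m.astype(np.uint8) for m in masks])
          return stack.sum(axis=0) >= (len(masks) + 1) // 2

      rng = np.random.default_rng(1)
      truth = np.zeros((64, 64), bool)
      truth[20:44, 18:46] = True                    # stand-in "physician" mask
      # Stand-ins for deformed prior-day contours: the truth with random boundary noise
      noisy = [np.logical_xor(truth, rng.random(truth.shape) < 0.05) for _ in range(5)]

      fused = majority_vote(noisy)
      print("single prior-day DC:", round(dice(noisy[0], truth), 3))
      print("fused (5 scans)  DC:", round(dice(fused, truth), 3))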

  14. Iterative CT shading correction with no prior information

    Science.gov (United States)

    Wu, Pengwei; Sun, Xiaonan; Hu, Hongjie; Mao, Tingyu; Zhao, Wei; Sheng, Ke; Cheung, Alice A.; Niu, Tianye

    2015-11-01

    Shading artifacts in CT images are caused by scatter contamination, beam-hardening effects and other non-ideal imaging conditions. The purpose of this study is to propose a novel and general correction framework to eliminate low-frequency shading artifacts in CT images (e.g. cone-beam CT, low-kVp CT) without relying on prior information. The method is based on the general knowledge of the relatively uniform CT number distribution in one tissue component. The CT image is first segmented to construct a template image where each structure is filled with the same CT number of a specific tissue type. Then, by subtracting the ideal template from the CT image, the residual image arising from the various error sources is generated. Since forward projection is an integration process, non-continuous shading artifacts in the image become continuous signals in a line integral. Thus, the residual image is forward projected and its line integral is low-pass filtered in order to estimate the error that causes shading artifacts. A compensation map is reconstructed from the filtered line integral error using a standard FDK algorithm and added back to the original image for shading correction. As the segmented image does not accurately depict a shaded CT image, the proposed scheme is iterated until the variation of the residual image is minimized. The proposed method is evaluated using cone-beam CT images of a Catphan 600 phantom and a pelvis patient, and low-kVp CT angiography images for carotid artery assessment. Compared with the CT image without correction, the proposed method reduces the overall CT number error from over 200 HU to less than 30 HU and increases the spatial uniformity by a factor of 1.5. Low-contrast objects are faithfully retained after the proposed correction. An effective iterative algorithm for shading correction in CT imaging is proposed that is only assisted by general anatomical information without relying on prior knowledge. The proposed method is thus practical
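
    The iterative template-and-residual idea can be illustrated with a drastically simplified, image-domain Python sketch. Here the low-frequency error is estimated by heavy Gaussian smoothing of the residual image, standing in for the forward-projection, low-pass filtering and FDK reconstruction steps of the paper; the phantom, class means, threshold and iteration count are invented for the example and this is not the authors' implementation.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(0)
      n = 128
      yy, xx = np.mgrid[:n, :n]
      ideal = np.where((xx - n/2)**2 + (yy - n/2)**2 < (n/3)**2, 50.0, -1000.0)  # HU
      # Synthetic low-frequency shading plus noise
      shading = 150.0 * np.exp(-((xx - n/4)**2 + (yy - n/4)**2) / (2 * (n/2)**2))
      img = ideal + shading + rng.normal(0, 5, ideal.shape)

      class_means = {"air": -1000.0, "tissue": 50.0}     # assumed tissue classes
      corrected = img.copy()
      for _ in range(5):                                 # iterate until the residual stabilizes
          # 1) segment and build the ideal template
          template = np.where(corrected > -500.0, class_means["tissue"], class_means["air"])
          # 2) residual between current image and template
          residual = corrected - template
          # 3) keep only the low-frequency part of the residual (shading estimate)
          low_freq_error = gaussian_filter(residual, sigma=n / 8)
          # 4) compensate
          corrected = corrected - low_freq_error

      print("mean |error| before:", np.abs(img - ideal).mean().round(1), "HU")
      print("mean |error| after :", np.abs(corrected - ideal).mean().round(1), "HU")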

  15. Surgical excision of eroded mesh after prior abdominal sacrocolpopexy.

    Science.gov (United States)

    South, Mary M T; Foster, Raymond T; Webster, George D; Weidner, Alison C; Amundsen, Cindy L

    2007-12-01

    We previously described an endoscopic-assisted transvaginal mesh excision technique. This study compares surgical outcomes after transvaginal mesh excision vs endoscopic-assisted transvaginal mesh excision. In addition, we reviewed our postoperative outcomes with excision via laparotomy. This was an inclusive retrospective analysis of patients presenting to our institution from 1997 to 2006 for surgical management of vaginal erosion of permanent mesh after sacrocolpopexy. Three techniques were utilized: transvaginal, endoscopic-assisted transvaginal, and laparotomy. For the patients undergoing transvaginal excision, data recorded included number and type of excisions performed, number of prior excisions performed at outside facilities, intraoperative and postoperative complications (including blood transfusions, pelvic abscess, or bowel complications), use of postoperative antibiotics, persistent symptoms of vaginal bleeding and discharge at follow-up, and demographic characteristics. The intraoperative and postoperative complications and the postoperative symptoms were recorded for the laparotomy cases. Thirty-one patients underwent transvaginal mesh excision during this time period: 17 endoscopic-assisted transvaginal and 14 transvaginal without endoscope assistance. In addition, a total of 7 patients underwent abdominal excision via laparotomy. Comparison of the 2 vaginal methods revealed no difference in the demographics or success rate, with success defined as no symptoms at follow-up. Endoscopic-assisted transvaginal excision was successful in 7 of 17 patients and transvaginal without endoscopic assistance in 9 of 13 patients (1 patient excluded for lack of follow-up data) for a total vaginal success rate of 53.3%. No intraoperative and only minor postoperative complications occurred with either vaginal method. Three patients underwent 3 vaginal attempts to achieve complete symptom resolution. The average follow-up time for the entire vaginal group was 14

  16. Construction and test of the PRIOR proton microscope

    International Nuclear Information System (INIS)

    Lang, Philipp-Michael

    2015-01-01

    The study of High Energy Density Matter (HEDM) in the laboratory makes great demands on the diagnostics because these states can usually only be created for a short time, and usual diagnostic techniques with visible light or X-rays reach their limits because of the high density. The high-energy proton radiography technique that was developed in the 1990s at the Los Alamos National Laboratory is a very promising possibility to overcome those limits, so that one can measure the density of HEDM with high spatial and time resolution. For this purpose the proton microscope PRIOR (Proton Radiography for FAIR) was set up at GSI, which not only reproduces the image but also magnifies it by a factor of 4.2 and thereby penetrates matter with a density up to 20 g/cm². A spatial resolution of less than 30 μm and a time resolution on the nanosecond scale were achieved straightaway. This work describes details of the principle, design and construction of the proton microscope as well as first measurements and simulations of essential components such as magnetic lenses, a collimator and a scintillator screen. For the latter it was possible to show that plastic scintillators can be used as converters as an alternative to the slower but more radiation-resistant crystals, so that a time resolution of 10 ns can be reached. Moreover, the characteristics of the system were investigated at its commissioning in April 2014. The changes in the magnetic field due to radiation damage were also studied. Besides that, an overview of future applications is given. First experiments with Warm Dense Matter created by using a pulsed power setup have already been performed. Furthermore, the promising concept of combining proton radiography with particle therapy has been investigated in the context of the PaNTERA project. An outlook on the possibilities with future experiments at the FAIR accelerator facility is given as well. Because of higher beam intensity and energy one can expect even

  17. Geochemical Characterization of Mine Waste, Mine Drainage, and Stream Sediments at the Pike Hill Copper Mine Superfund Site, Orange County, Vermont

    Science.gov (United States)

    Piatak, Nadine M.; Seal, Robert R.; Hammarstrom, Jane M.; Kiah, Richard G.; Deacon, Jeffrey R.; Adams, Monique; Anthony, Michael W.; Briggs, Paul H.; Jackson, John C.

    2006-01-01

    The Pike Hill Copper Mine Superfund Site in the Vermont copper belt consists of the abandoned Smith, Eureka, and Union mines, all of which exploited Besshi-type massive sulfide deposits. The site was listed on the U.S. Environmental Protection Agency (USEPA) National Priorities List in 2004 due to aquatic ecosystem impacts. This study was intended to be a precursor to a formal remedial investigation by the USEPA, and it focused on the characterization of mine waste, mine drainage, and stream sediments. A related study investigated the effects of the mine drainage on downstream surface waters. The potential for mine waste and drainage to have an adverse impact on aquatic ecosystems, on drinking-water supplies, and on human health was assessed on the basis of mineralogy, chemical concentrations, acid generation, and potential for metals to be leached from mine waste and soils. The results were compared to those from analyses of other Vermont copper belt Superfund sites, the Elizabeth Mine and Ely Copper Mine, to evaluate whether the waste material at the Pike Hill Copper Mine was sufficiently similar to that of the other mine sites for USEPA to streamline the evaluation of remediation technologies. Mine-waste samples consisted of oxidized and unoxidized sulfidic ore and waste rock, and flotation-mill tailings. These samples contained as much as 16 weight percent sulfides that included chalcopyrite, pyrite, pyrrhotite, and sphalerite. During oxidation, sulfides weather and may release potentially toxic trace elements and may produce acid. In addition, soluble efflorescent sulfate salts were identified at the mines; during rain events, the dissolution of these salts contributes acid and metals to receiving waters. Mine waste contained concentrations of cadmium, copper, and iron that exceeded USEPA Preliminary Remediation Goals. The concentrations of selenium in mine waste were higher than the average composition of eastern United States soils. Most mine waste was

  18. Changes in Groundwater Flow and Volatile Organic Compound Concentrations at the Fischer and Porter Superfund Site, Warminster Township, Bucks County, Pennsylvania, 1993-2009

    Science.gov (United States)

    Sloto, Ronald A.

    2010-01-01

    The 38-acre Fischer and Porter Company Superfund Site is in Warminster Township, Bucks County, Pa. Historically, as part of the manufacturing process, trichloroethylene (TCE) degreasers were used for parts cleaning. In 1979, the Bucks County Health Department detected TCE and other volatile organic compounds (VOCs) in water from the Fischer and Porter on-site supply wells and nearby public-supply wells. The Fischer and Porter Site was designated as a Superfund Site and placed on the National Priorities List in September 1983. A 1984 Record of Decision for the site required the Fischer and Porter Company to pump and treat groundwater contaminated by VOCs from three on-site wells at a combined rate of 75 gallons per minute to contain groundwater contamination on the property. Additionally, the Record of Decision recognized the need for treatment of the water from two nearby privately owned supply wells operated by the Warminster Heights Home Ownership Association. In 2004, the Warminster Heights Home Ownership Association sold its water distribution system, and both wells were taken out of service. The report describes changes in groundwater levels and contaminant concentrations and migration caused by the shutdown of the Warminster Heights supply wells and presents a delineation of the off-site groundwater-contamination plume. The U.S. Geological Survey (USGS) conducted this study (2006-09) in cooperation with the U.S. Environmental Protection Agency (USEPA). The Fischer and Porter Site and surrounding area are underlain by sedimentary rocks of the Stockton Formation of Late Triassic age. The rocks are chiefly interbedded arkosic sandstone and siltstone. The Stockton aquifer system is comprised of a series of gently dipping lithologic units with different hydraulic properties. A three-dimensional lithostratigraphic model was developed for the site on the basis of rock cores and borehole geophysical logs. The model was simplified by combining individual lithologic

  19. Assessment of prior learning in adult vocational education and training

    DEFF Research Database (Denmark)

    Aarkrog, Vibe; Wahlgren, Bjarne

    2015-01-01

    In the programs for gastronomes and child care assistants, respectively, the article discusses two issues in relation to APL: the encounter of practical experience and school-based knowledge, and the validity and reliability of the assessment procedures. Through focusing on the students’ knowing that and knowing why ... the assessment is based on a scholastic perception of the students’ needs for training, reflecting one of the most important challenges in APL: how can practical experience be transformed into credits for the knowledge parts of the programs? The study shows that by combining several APL methods and comparing ... the teachers’ assessments, the teachers respond to the issues of validity and reliability. However, validity and reliability might be even further strengthened if the competencies are well defined and if the education system is aware of securing a reasonable balance between knowing how, knowing that, and knowing ...

  20. An Analysis of Naval Officer Accession Programs

    National Research Council Canada - National Science Library

    Lehner, William D

    2008-01-01

    This thesis conducts an extensive literature review of prior studies on the three major commissioning programs for United States naval officers: the United States Naval Academy, Naval Reserve Officers...

  1. Review and Implementation Status of Prior Defense Business Board Recommendations

    Science.gov (United States)

    2007-04-01

    Key recommendation to appoint a single Fund Manager – Improved training programs, including eLearning, underway – Capabilities-Based Budgeting allowed ... framework focused on et al. enterprise enhancements, support for the Global War on Terror, progress on implementation of the Quadrennial Defense ... Improve Global Mail Operations – MPSA in receipt of all industry responses by Aug 2006 – On December 12th MPSA will brief the Postal Oversight Board on

  2. Estimating kinetic mechanisms with prior knowledge I: Linear parameter constraints.

    Science.gov (United States)

    Salari, Autoosa; Navarro, Marco A; Milescu, Mirela; Milescu, Lorin S

    2018-02-05

    To understand how ion channels and other proteins function at the molecular and cellular levels, one must decrypt their kinetic mechanisms. Sophisticated algorithms have been developed that can be used to extract kinetic parameters from a variety of experimental data types. However, formulating models that not only explain new data, but are also consistent with existing knowledge, remains a challenge. Here, we present a two-part study describing a mathematical and computational formalism that can be used to enforce prior knowledge into the model using constraints. In this first part, we focus on constraints that enforce explicit linear relationships involving rate constants or other model parameters. We develop a simple, linear algebra-based transformation that can be applied to enforce many types of model properties and assumptions, such as microscopic reversibility, allosteric gating, and equality and inequality parameter relationships. This transformation converts the set of linearly interdependent model parameters into a reduced set of independent parameters, which can be passed to an automated search engine for model optimization. In the companion article, we introduce a complementary method that can be used to enforce arbitrary parameter relationships and any constraints that quantify the behavior of the model under certain conditions. The procedures described in this study can, in principle, be coupled to any of the existing methods for solving molecular kinetics for ion channels or other proteins. These concepts can be used not only to enforce existing knowledge but also to formulate and test new hypotheses. © 2018 Salari et al.
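
    The general idea of converting linearly interdependent parameters into a reduced set of independent ones can be sketched with a null-space re-parameterization, shown below in Python. The constraint matrix, rate values and toy objective are invented for the example; this is not the authors' formalism or code.

      # Enforce linear equality constraints A @ k = b on a rate-constant vector k by
      # re-parameterizing k = k0 + N @ u, where k0 is any particular solution and the
      # columns of N span the null space of A. The optimizer then works on the
      # unconstrained, reduced vector u.
      import numpy as np
      from scipy.linalg import null_space, lstsq
      from scipy.optimize import minimize

      # Example constraints on four rate constants k1..k4 (invented):
      #   k1 - k2 = 0          (two rates forced equal)
      #   k3 - 2*k4 = 0        (fixed ratio, e.g. from an allosteric assumption)
      A = np.array([[1.0, -1.0, 0.0,  0.0],
                    [0.0,  0.0, 1.0, -2.0]])
      b = np.zeros(2)

      k0 = lstsq(A, b)[0]      # particular solution (here: zeros)
      N = null_space(A)        # basis of the feasible directions, shape (4, 2)

      def to_full(u):
          """Map reduced parameters u to a constraint-satisfying rate vector k."""
          return k0 + N @ u

      # Toy objective standing in for a data misfit over the rate constants
      k_target = np.array([2.0, 2.0, 4.0, 2.0])        # satisfies the constraints
      objective = lambda u: np.sum((to_full(u) - k_target) ** 2)

      res = minimize(objective, x0=np.zeros(N.shape[1]))
      k_fit = to_full(res.x)
      print("fitted rates:", k_fit.round(3))
      print("max constraint residual:", np.abs(A @ k_fit - b).max())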

  3. An Ensemble Approach to Building Mercer Kernels with Prior Information

    Science.gov (United States)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2005-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite, dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using pre-defined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. Specifically, we demonstrate the use of the algorithm in situations with extremely small samples of data. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS) and demonstrate the method's superior performance against standard methods. The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code.
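
    The ensemble idea can be sketched in a few lines of Python: fit several mixture-density models to bootstrap resamples of the data and define the kernel as the averaged inner product of posterior membership vectors, which is symmetric and positive semi-definite by construction. The data, ensemble size and component count below are invented, and this sketch is not the AUTOBAYES-generated code of the cited paper.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(-2.0, 0.5, (30, 2)),    # small two-cluster toy set
                     rng.normal(+2.0, 0.5, (30, 2))])

      def mixture_density_kernel(X, n_models=10, n_components=3, seed=0):
          rng = np.random.default_rng(seed)
          K = np.zeros((len(X), len(X)))
          for m in range(n_models):
              boot = rng.integers(0, len(X), size=len(X))        # bootstrap resample
              gmm = GaussianMixture(n_components=n_components,
                                    random_state=m).fit(X[boot])
              R = gmm.predict_proba(X)                           # posterior memberships
              K += R @ R.T                                       # PSD Gram contribution
          return K / n_models

      K = mixture_density_kernel(X)
      eigvals = np.linalg.eigvalsh(K)
      print("kernel shape:", K.shape, " smallest eigenvalue:", round(float(eigvals.min()), 6))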

  4. Prolonged Instability Prior to a Regime Shift

    Science.gov (United States)

    Regime shifts are generally defined as the point of ‘abrupt’ change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher Information and multivariate time series modeling showed that there was a ~2000-year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system is undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia. This manuscript explores various methods of assessing the transition between alternative states in an ecological system described by a long-term high-resolution paleoecological dataset.

  5. Autoshaping as a function of prior food presentations

    Science.gov (United States)

    Downing, Kevin; Neuringer, Allen

    1976-01-01

    Young chickens were given 1, 10, 100, or 1000 presentations of grain in a hopper. Subsequently, the key was illuminated before each presentation of grain to study autoshaping of the key-peck response. The number of keylight-grain pairings before a bird first pecked the lighted key was found to be a U-shaped function of the number of prior food-only presentations, with pecks occurring significantly sooner after 100 food-only trials than after any of the other values. Two of five chicks at the 100-trial value pecked on the first illumination of the key. Experiment II showed further that when a series of food-only trials (no keylight) preceded keylight-only trials (no food) 30% of the chicks pecked the illuminated key. Experiment III extended the generality of first-trial pecking to pigeons. After preliminary training with food-only, two of five pigeons pecked on the first illumination of a key. The results suggest a close relationship between autoshaping and pseudo-conditioning. PMID:16811961

  6. Autoshaping as a function of prior food presentations.

    Science.gov (United States)

    Downing, K; Neuringer, A

    1976-11-01

    Young chickens were given 1, 10, 100, or 1000 presentations of grain in a hopper. Subsequently, the key was illuminated before each presentation of grain to study autoshaping of the key-peck response. The number of keylight-grain pairings before a bird first pecked the lighted key was found to be a U-shaped function of the number of prior food-only presentations, with pecks occurring significantly sooner after 100 food-only trials than after any of the other values. Two of five chicks at the 100-trial value pecked on the first illumination of the key. Experiment II showed further that when a series of food-only trials (no keylight) preceded keylight-only trials (no food) 30% of the chicks pecked the illuminated key. Experiment III extended the generality of first-trial pecking to pigeons. After preliminary training with food-only, two of five pigeons pecked on the first illumination of a key. The results suggest a close relationship between autoshaping and pseudo-conditioning.

  7. Cryopreservation of human colorectal carcinomas prior to xenografting

    International Nuclear Information System (INIS)

    Linnebacher, Michael; Maletzki, Claudia; Ostwald, Christiane; Klier, Ulrike; Krohn, Mathias; Klar, Ernst; Prall, Friedrich

    2010-01-01

    Molecular heterogeneity of colorectal carcinoma (CRC) is well recognized, forming the rationale for molecular tests required before administration of some of the novel targeted therapies that now are rapidly entering the clinics. For clinical research at least, but possibly even for future individualized tumor treatment on a routine basis, propagation of patients' CRC tissue may be highly desirable for detailed molecular, biochemical or functional analyses. However, complex logistics requiring close liaison between surgery, pathology, laboratory researchers and animal care facilities are a major drawback in this. We here describe and evaluate a very simple cryopreservation procedure for colorectal carcinoma tissue prior to xenografting that will considerably reduce this logistic complexity. Forty-eight CRC collected ad hoc were xenografted subcutaneously into immunodeficient mice either fresh from surgery (N = 23) or after cryopreservation (N = 31; up to 643 days). Take rates after cryopreservation were satisfactory (71%), though somewhat lower than with tumor tissues fresh from surgery (74%); this difference was not statistically significant. Re-transplantation of cryopreserved established xenografts (N = 11) was always successful. Of note, in this series, all of the major molecular types of CRC were xenografted successfully, even after cryopreservation. Our procedure facilitates collection, long-term storage and propagation of clinical CRC specimens (even from different centres) for (pre)clinical studies of novel therapies or for basic research.

  8. Simultaneous tensor decomposition and completion using factor priors.

    Science.gov (United States)

    Chen, Yi-Lei; Hsu, Chiou-Ting; Liao, Hong-Yuan Mark

    2014-03-01

    The success of research on matrix completion is evident in a variety of real-world applications. Tensor completion, which is a high-order extension of matrix completion, has also generated a great deal of research interest in recent years. Given a tensor with incomplete entries, existing methods use either factorization or completion schemes to recover the missing parts. However, as the number of missing entries increases, factorization schemes may overfit the model because of incorrectly predefined ranks, while completion schemes may fail to interpret the model factors. In this paper, we introduce a novel concept: complete the missing entries and simultaneously capture the underlying model structure. To this end, we propose a method called simultaneous tensor decomposition and completion (STDC) that combines a rank minimization technique with Tucker model decomposition. Moreover, as the model structure is implicitly included in the Tucker model, we use factor priors, which are usually known a priori in real-world tensor objects, to characterize the underlying joint-manifold drawn from the model factors. By exploiting this auxiliary information, our method leverages two classic schemes and accurately estimates the model factors and missing entries. We conducted experiments to empirically verify the convergence of our algorithm on synthetic data and evaluate its effectiveness on various kinds of real-world data. The results demonstrate the efficacy of the proposed method and its potential usage in tensor-based applications. It also outperforms state-of-the-art methods on multilinear model analysis and visual data completion tasks.
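
    A much-simplified relative of this idea is sketched below in Python: missing entries are imputed by alternating between a truncated higher-order SVD (a Tucker-like low-rank fit) and re-filling the unobserved entries from that fit. It omits the rank-minimization and factor-prior machinery of STDC; the tensor size, multilinear ranks and missing rate are invented for the example.

      import numpy as np

      def unfold(T, mode):
          return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

      def fold(M, mode, shape):
          shape = list(shape)
          return np.moveaxis(M.reshape([shape[mode]] + shape[:mode] + shape[mode+1:]),
                             0, mode)

      def mode_product(T, U, mode):
          shape = list(T.shape); shape[mode] = U.shape[0]
          return fold(U @ unfold(T, mode), mode, shape)

      def hosvd_lowrank(T, ranks):
          """Project T onto a Tucker model of the given multilinear ranks."""
          Us = []
          for n, r in enumerate(ranks):
              U, _, _ = np.linalg.svd(unfold(T, n), full_matrices=False)
              Us.append(U[:, :r])
          approx = T
          for n, U in enumerate(Us):          # core = T x_n U^T for each mode
              approx = mode_product(approx, U.T, n)
          for n, U in enumerate(Us):          # expand the core back to full size
              approx = mode_product(approx, U, n)
          return approx

      rng = np.random.default_rng(0)
      shape, ranks = (20, 20, 20), (3, 3, 3)
      A, B, C = (rng.standard_normal((s, r)) for s, r in zip(shape, ranks))
      core = rng.standard_normal(ranks)
      truth = mode_product(mode_product(mode_product(core, A, 0), B, 1), C, 2)

      observed = rng.random(shape) > 0.6       # keep roughly 40% of the entries
      X = np.where(observed, truth, truth[observed].mean())
      for _ in range(50):                      # EM-style imputation loop
          X = np.where(observed, truth, hosvd_lowrank(X, ranks))

      err = np.linalg.norm((X - truth)[~observed]) / np.linalg.norm(truth[~observed])
      print("relative error on missing entries:", round(float(err), 3))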

  9. Full system decontamination. AREVA's experience in decontamination prior to decommissioning

    International Nuclear Information System (INIS)

    Topf, Christian

    2010-01-01

    Minimizing collective radiation exposure and producing free-release material are two of the highest priorities in the decommissioning of a Nuclear Power Plant (NPP). Full System Decontamination (FSD) is the most effective measure to reduce the source term and remove oxide-layer contamination within the plant systems. An FSD is typically a decontamination of the primary coolant circuit and the auxiliary systems. In recent years AREVA NP has performed several FSDs in PWRs and BWRs prior to decommissioning by applying the proprietary CORD family and AMDA technology. Chemical Oxidation Reduction Decontamination, or CORD, is the chemical decontamination process, while AMDA stands for Automated Mobile Decontamination Appliance, AREVA NP's decontamination equipment. Described herein are the excellent results achieved for the FSDs applied at the German PWRs Stade in 2004 and Obrigheim in 2007 and for the FSDs performed at the Swedish BWRs Barsebaeck Unit 1 in 2007 and Barsebaeck Unit 2 in 2008. All four FSDs were performed using the AREVA NP CORD family decontamination technology in combination with the AREVA NP decontamination equipment, AMDA. (orig.)

  10. Optimization of Evacuation Warnings Prior to a Hurricane Disaster

    Directory of Open Access Journals (Sweden)

    Dian Sun

    2017-11-01

    The key purpose of this paper is to demonstrate that optimization of evacuation warnings by time period and impacted zone is crucial for efficient evacuation of an area impacted by a hurricane. We assume that people behave in a manner consistent with the warnings they receive. By optimizing the issuance of hurricane evacuation warnings, one can control the number of evacuees at different time intervals to avoid congestion in the process of evacuation. The warning optimization model is applied to a case study of Hurricane Sandy using the study region of Brooklyn. We first develop a model for shelter assignment and then use this outcome to model hurricane evacuation warning optimization, which prescribes an evacuation plan that maximizes the number of evacuees. A significant technical contribution is the development of an iterative greedy heuristic procedure for the nonlinear formulation, which is shown to be optimal for the case of a single evacuation zone with a single evacuee type, while it does not guarantee optimality for multiple zones under unusual circumstances. A significant applied contribution is the demonstration of an interface of the evacuation warning method with a public transportation scheme to facilitate evacuation of a car-less population. The heuristic we employ can be readily adapted to the case where the response rate is a function of the evacuation numbers in prior periods and other variable factors. This element is also explored in the context of our experiment.
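
    A toy greedy scheduler in the spirit of staggered evacuation warnings is sketched below in Python: zones are warned period by period so that expected departures never exceed a per-period network capacity. All zone populations, the response rate and the capacity are invented, and this is not the paper's optimization model.

      population = {"zone_A": 12000, "zone_B": 8000, "zone_C": 15000}   # evacuees remaining
      response_rate = 0.6              # assumed fraction leaving in the period they are warned
      capacity_per_period = 9000       # assumed road/transit throughput per period
      n_periods = 6

      plan = []                                        # (period, zone, expected departures)
      for t in range(1, n_periods + 1):
          remaining_capacity = capacity_per_period
          # Greedy rule: warn the largest still-populated zones first in each period
          for zone in sorted(population, key=population.get, reverse=True):
              if remaining_capacity <= 0 or population[zone] == 0:
                  continue
              departures = min(int(population[zone] * response_rate), remaining_capacity)
              if departures == 0:
                  continue
              plan.append((t, zone, departures))
              population[zone] -= departures
              remaining_capacity -= departures

      for t, zone, departures in plan:
          print(f"period {t}: warn {zone}, expect ~{departures} evacuees")
      print("not yet evacuated:", population)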

  11. Ionospheric characteristics prior to the greatest earthquake in recorded history

    Science.gov (United States)

    Villalobos, C. U.; Bravo, M. A.; Ovalle, E. M.; Foppiano, A. J.

    2016-03-01

    Although several reports on the variations of some radio-observed ionospheric properties prior to the very large Chile earthquakes of 21-22 May 1960 have been published, no one up to now has reported on the variations of simultaneous E- and F-region characteristics observed at Concepción (36.8°S; 73.0°W) using a ground-based ionosonde. This paper analyses values of NmE, NmEs, h'E, NmF2, h'F, M3000F2 and fmin. Possible solar and geomagnetic activity effects are first identified and then anomalies are calculated for all characteristics using reference values (15-day running medians ± interquartile range). Occasions when anomalies exceed an upper threshold or fall below a lower threshold are discussed and compared, whenever possible, with other published studies. Further study is suggested to unambiguously associate some of the possible Es-layer and M3000F2 anomalies found here with very strong earthquakes.
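
    The reference-value idea (a 15-day running median with an interquartile-range band) can be sketched in a few lines of Python on a synthetic daily series; the foF2-like values, injected anomaly and window settings are invented and this is not the authors' processing chain.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(2)
      days = pd.date_range("1960-03-01", periods=120, freq="D")
      values = 8.0 + 0.8 * np.sin(np.arange(120) / 12.0) + rng.normal(0, 0.25, 120)
      values[75:78] += 2.5                              # injected "anomalous" days
      series = pd.Series(values, index=days, name="foF2_MHz")

      window = 15
      roll = lambda q: series.rolling(window, center=True, min_periods=8).quantile(q)
      median = series.rolling(window, center=True, min_periods=8).median()
      iqr = roll(0.75) - roll(0.25)

      upper, lower = median + iqr, median - iqr         # reference band
      anomalous = (series > upper) | (series < lower)
      print(series[anomalous].round(2))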

  12. 38 CFR 41.520 - Major program determination.

    Science.gov (United States)

    2010-07-01

    ... major programs. This risk-based approach shall include consideration of: Current and prior audit... of prior audit findings under § 41.510(a)(7) do not preclude the Type A program from being low-risk... this section), except this paragraph (e)(2)(i)(A) does not require the auditor to audit more high-risk...

  13. A new effective method for estimating missing values in the sequence data prior to phylogenetic analysis

    Directory of Open Access Journals (Sweden)

    Abdoulaye Baniré Diallo

    2006-01-01

    In this article we address the problem of phylogenetic inference from nucleic acid data containing missing bases. We introduce a new effective approach, called “Probabilistic estimation of missing values” (PEMV), allowing one to estimate unknown nucleotides prior to computing the evolutionary distances between sequences. We show that the new method improves the accuracy of phylogenetic inference compared to the existing methods “Ignoring Missing Sites” (IMS) and “Proportional Distribution of Missing and Ambiguous Bases” (PDMAB) included in the PAUP software [26]. The proposed strategy for estimating missing nucleotides is based on probabilistic formulae developed in the framework of the Jukes-Cantor [10] and Kimura 2-parameter [11] models. The relative performance of the new method was assessed through simulations carried out with the SeqGen program [20], for data generation, and the BioNJ method [7], for inferring phylogenies. We also compared the new method to the DNAML program [5] and “Matrix Representation using Parsimony” (MRP) [13], [19], considering an example of 66 eutherian mammals originally analyzed in [17].
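
    Only the probabilistic flavor of such an estimate is illustrated by the Python sketch below: given the nucleotide observed at the same site in a related sequence and a Jukes-Cantor (JC69) distance, the missing base is assigned JC69 transition probabilities rather than being ignored. The sequences, distance and helper functions are invented for illustration; this is not the PEMV algorithm of the article.

      import math

      BASES = "ACGT"

      def jc69_profile(observed_base, d):
          """P(missing base = x | base observed in a relative at JC69 distance d)."""
          p_same = 0.25 + 0.75 * math.exp(-4.0 * d / 3.0)
          p_diff = 0.25 - 0.25 * math.exp(-4.0 * d / 3.0)
          return {x: (p_same if x == observed_base else p_diff) for x in BASES}

      def expected_mismatch(profile_a, profile_b):
          """Expected probability that two (possibly uncertain) sites differ."""
          return 1.0 - sum(profile_a[x] * profile_b[x] for x in BASES)

      # Site missing in sequence A; its closest relative shows 'G' at JC69 distance 0.10
      profile_missing = jc69_profile("G", d=0.10)
      profile_known = {x: 1.0 if x == "A" else 0.0 for x in BASES}   # 'A' in sequence B

      print("estimated base distribution:",
            {k: round(v, 3) for k, v in profile_missing.items()})
      print("expected mismatch A vs B at this site:",
            round(expected_mismatch(profile_missing, profile_known), 3))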

  14. Cost estimate and economic issues associated with the MOX option (prior to DOE's record of decision)

    International Nuclear Information System (INIS)

    Reid, R.L.; Miller, J.W.

    1997-04-01

    Before the January 1997 Record of Decision (ROD), the U.S. Department of Energy Office of Fissile Materials Disposition (DOE-MD) evaluated three technologies for the disposition of ∼50 MT of surplus plutonium from defense-related programs: reactors, immobilization, and deep boreholes. As part of the process supporting the ROD, a comprehensive assessment of technical viability, cost, and schedule was conducted by DOE-MD and its national laboratory contractors. Oak Ridge National Laboratory managed and coordinated the life-cycle cost (LCC) assessment effort for this program. This report discusses the economic analysis methodology and the results for the reactor options considered prior to the ROD. A secondary intent of the report is to discuss major technical and economic issues that impact cost and schedule. To evaluate the economics of the reactor option and other technologies on an equitable basis, a set of cost-estimating guidelines and a common cost-estimating format were utilized by all three technology teams. This report includes the major economic analysis assumptions and the comparative constant-dollar and discounted-dollar LCCs for all nine reactor scenarios.
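
    The difference between constant-dollar and discounted-dollar life-cycle cost comes down to simple discounting arithmetic, sketched below in Python for an invented annual cost stream; the costs, time horizon and 5% real discount rate are illustrative and are not taken from the report.

      annual_costs_musd = [120, 260, 310, 180, 90, 90, 90, 90]   # M$/yr, years 0..7 (invented)
      discount_rate = 0.05                                       # assumed real discount rate

      constant_dollar_lcc = sum(annual_costs_musd)
      discounted_lcc = sum(cost / (1 + discount_rate) ** year
                           for year, cost in enumerate(annual_costs_musd))

      print(f"constant-dollar LCC: {constant_dollar_lcc:.0f} M$")
      print(f"discounted LCC (5%): {discounted_lcc:.0f} M$")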

  15. Evaluation of stream crossing methods prior to gas pipeline construction

    International Nuclear Information System (INIS)

    Murphy, M.H.; Rogers, J.S.; Ricca, A.

    1995-01-01

    Stream surveys are conducted along proposed gas pipeline routes prior to construction to assess potential impacts to stream ecosystems and to recommend preferred crossing methods. Recently, there has been a high level of scrutiny from the Public Service Commission (PSC) to conduct these stream crossings with minimal effects on the aquatic community. PSC's main concern is the effect of sediment on aquatic biota. Smaller, low-flowing or intermittent streams are generally crossed using a wet crossing technique. This technique involves digging a trench for the pipeline while the stream is flowing. Sediment control measures are used to reduce sediment loads downstream. Wider, faster-flowing, or protected streams are typically crossed with a dry crossing technique. A dry crossing involves placing a barrier upstream of the crossing and diverting the water around the crossing location. The pipeline trench is then dug in the dry area. O'Brien and Gere and NYSEG have jointly designed a modified wet crossing for crossing streams that exceed maximum flows for a dry crossing, and are too wide for a typical wet crossing. This method diverts water around the crossing using a pumping system, instead of constructing a dam. The trench is similar to a wet crossing, with sediment control devices in place upstream and downstream. If streams are crossed during low-flow periods, the pumping system will be able to remove the majority of the water flow and volume from the crossing area, thereby reducing ecological impacts. Evaluation of the effects of this crossing type on the stream biota is currently proposed and may proceed when construction begins.

  16. MANU. Handling of bentonite prior buffer block manufacturing

    International Nuclear Information System (INIS)

    Laaksonen, R.

    2010-01-01

    The aim of this study is to describe the entire bentonite handling process, starting with freight from the harbour to the storage facility and ending with the filling of the bentonite block moulds for manufacturing. This work describes the bentonite handling prior to the process in which bentonite blocks are manufactured in great quantities. This work included a study of relevant Nordic and international well-documented cases of storage, processing and techniques involving bentonite material. Information about storage and handling processes from producers or re-sellers of bentonite was collected while keeping in mind the requirements coming from the Posiva side. A limited experiment on the humidification of different material types was also carried out. This work includes a detailed description of methods and equipment needed for bentonite storage and processing. Posiva Oy used Jauhetekniikka Oy as a consultant to prepare handling process flow charts for bentonite. Jauhetekniikka Oy also evaluated the content of this report. The handling of bentonite was based on the assumption that bentonite process work is done in one factory for 11 months of work time while the weekly volume is around 41-45 tons. Storage space needed in this case is about 300 tons of bentonite, which equals about seven weeks of raw material consumption. This work concluded that several things must be carefully considered: sampling at various phases of the process, the air quality at the production/storage facilities (humidity and temperature), the level of automation/process control of the manufacturing process and the means of producing/saving data from different phases of the process. (orig.)

  17. Oak Ridge Reservation Site Management Plan for the Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    1991-09-01

    This site management plan for the Environmental Restoration (ER) Program implements the Oak Ridge Reservation (ORR) Federal Facility Agreement (FFA) (EPA 1990), also known as an Interagency Agreement (IAG), hereafter referred to as "the Agreement." The Department of Energy (DOE), the US Environmental Protection Agency (EPA), and the Tennessee Department of Environment and Conservation (TDEC), hereafter known as "the Parties," entered into this Agreement for the purpose of coordinating remediation activities undertaken on the ORR to comply with the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) as amended by the Superfund Amendments, the Resource Conservation and Recovery Act (RCRA), and the National Environmental Policy Act (NEPA). 7 refs., 17 figs.

  18. Functional Programming

    OpenAIRE

    Chitil, Olaf

    2009-01-01

    Functional programming is a programming paradigm like object-oriented programming and logic programming. Functional programming comprises both a specific programming style and a class of programming languages that encourage and support this programming style. Functional programming enables the programmer to describe an algorithm on a high-level, in terms of the problem domain, without having to deal with machine-related details. A program is constructed from functions that only map inputs to ...

  19. Graphite electrode DC arc technology program for buried waste treatment

    International Nuclear Information System (INIS)

    Wittle, J.K.; Hamilton, R.A.; Cohn, D.R.; Woskov, P.P.; Thomas, P.; Surma, J.E.; Titus, C.H.

    1994-01-01

    The goal of the program is to apply EPI's Arc Furnace to the processing of Subsurface Disposal Area (SDA) waste from Idaho National Engineering Laboratory. This is being facilitated through the Department of Energy's Buried Waste Integrated Demonstration (BWID) program. A second objective is to apply the diagnostics capability of MIT's Plasma Fusion Center to the understanding of the high temperature processes taking place in the furnace. This diagnostics technology has promise for being applicable in other thermal treatment processes. The program has two parts, a test series in an engineering-scale DC arc furnace which was conducted in an EPI furnace installed at the Plasma Fusion Center and a pilot-scale unit which is under construction at MIT. This pilot-scale furnace will be capable of operating in a continuous feed and continuous tap mode. Included in this work is the development and implementation of diagnostics to evaluate high temperature processes such as DC arc technology. This technology can be used as an effective stabilization process for Superfund wastes

  20. Bioaccumulation of polychlorinated biphenyls and organochlorine pesticides in young-of-the-year bluefish (Pomatomus saltatrix) in the vicinity of a Superfund Site in New Bedford Harbor, Massachusetts, and in the adjacent waters.

    Science.gov (United States)

    Deshpande, Ashok D; Dockum, Bruce W; Cleary, Thomas; Farrington, Cameron; Wieczorek, Daniel

    2013-07-15

    Spatial gradients of polychlorinated biphenyls (PCBs) and organochlorine pesticides were examined in the young-of-the-year (YOY) bluefish (Pomatomus saltatrix) in the vicinity of a PCB Superfund Site in New Bedford Harbor, Massachusetts, and in the adjacent waters. PCB concentrations in bluefish varied between different locations, and also among fish from a given location. A generally decreasing gradient in PCB concentrations was evident as the bluefish were collected farther from the Superfund Site. The average sum of PCB concentrations was highest for bluefish collected in the Upper Harbor between Interstate-195 Bridge and Coggeshall Street Bridge (Upper Harbor), followed by bluefish in Lower Harbor from north of Popes Island Bridge (Lower Harbor), and bluefish from Outer Harbor south of Hurricane Barrier (Outer Harbor). The levels of PCBs in bluefish from Clarks Cove and PCBs in bluefish from Buzzards Bay were similar and lowest among all bluefish specimens analyzed in the present study. Pesticide concentrations were about one order of magnitude (or more) lower than the PCB concentrations, and the gradient of pesticide concentrations generally followed the gradient of PCB concentrations. Some of the commonly detected pesticides, in the order of decreasing concentrations, included DDTs and metabolites, heptachlor epoxide, endosulfan sulfate, and α-chlordane. The distribution of PCBs and organochlorine pesticides was examined in the tissues of YOY bluefish from Clarks Cove. PCBs and lipids in the brain samples of YOY bluefish were generally numerically greater than PCBs in the liver samples, but these differences were not statistically significant. PCBs and lipids in hypaxial muscle samples were numerically greater than PCBs in epaxial muscle samples, although these two groups of tissues were not statistically different. Despite the higher susceptibility of lighter PCB homologs to geophysical and biogeochemical weathering processes, the relative dominance of lighter homologs