WorldWideScience

Sample records for llnl genomic assessment

  1. LLNL Genomic Assessment: Viral and Bacterial Sequencing Needs for TMTI, Tier 1 Report

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T; Borucki, M; Lenhoff, R; Vitalis, E

    2009-09-29

    The Lawrence Livermore National Laboratory Bioinformatics group has recently taken on a role in DTRA's Transformational Medical Technologies Initiative (TMTI). The high-level goal of TMTI is to accelerate the development of broad-spectrum countermeasures. To achieve those goals, TMTI has a near-term need to obtain more sequence information across a large range of pathogens and near neighbors, and across a broad geographical and host range. Our role in this project is to research available sequence data for the organisms of interest and identify critical microbial sequence and knowledge gaps that need to be filled to meet TMTI objectives. This effort includes: (1) assessing current genomic sequence for each agent, including phylogenetic and geographical diversity, host range, date-of-isolation range, virulence, sequence availability of key near neighbors, and other characteristics; (2) identifying Subject Matter Experts (SMEs) and potential holders of isolate collections, and contacting appropriate SMEs with known expertise and isolate collections to obtain information on isolate availability and specific recommendations; (3) identifying sequence as well as knowledge gaps (e.g., virulence, host range, and antibiotic resistance determinants); (4) providing specific recommendations as to the most valuable strains to be placed on the DTRA sequencing queue. We acknowledge that criteria for prioritization of isolates for sequencing fall into two categories, aligning with priority queues 1 and 2 as described in the summary. (Priority queue 0 relates to DTRA operational isolates whose availability is not predictable in advance.) 1. Selection of isolates that appear likely to provide information on virulence and antibiotic resistance. This will include sequencing of known virulent strains. Particularly valuable would be virulent strains that have genetically similar yet avirulent, or non-human-transmissible, counterparts that can be used for comparison to help

  2. LLNL Genomic Assessment: Viral and Bacterial Sequencing Needs for TMTI, Task 1.4.2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T; Borucki, M; Lam, M; Lenhoff, R; Vitalis, E

    2010-01-26

    Good progress has been made on both bacterial and viral sequencing by the TMTI centers. While access to appropriate samples is a limiting factor to throughput, excellent progress has been made with respect to getting agreements in place with key sources of relevant materials. Sharing of sequenced genomes funded by TMTI has been extremely limited to date. The April 2010 exercise should force a resolution to this, but additional managerial pressure may be needed to ensure that rapid sharing of TMTI-funded sequencing occurs, regardless of collaborator constraints concerning ultimate publication(s). Policies to permit TMTI-internal rapid sharing of sequenced genomes should be written into all TMTI agreements with collaborators now being negotiated. TMTI needs to establish a Web-based system for tracking samples destined for sequencing. This includes metadata on sample origins and contributors, information on sample shipment/receipt, prioritization by TMTI, assignment to one or more sequencing centers (including possible TMTI-sponsored sequencing at a contributor site), and the status history of the sample sequencing effort. While this system could be a component of the AFRL system, it is not part of any current development effort. Policy and standardized procedures are needed to ensure appropriate verification of all TMTI samples prior to the investment in sequencing. PCR, arrays, and classical biochemical tests are examples of potential verification methods. Verification is needed to detect mislabeled, degraded, mixed, or contaminated samples. Regular QC exercises are needed to ensure that the TMTI-funded centers are meeting all standards for producing quality genomic sequence data.

  3. Joint FAM/Line Management Assessment Report on LLNL Machine Guarding Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-07-19

    The LLNL Safety Program for Machine Guarding is implemented to comply with requirements in the ES&H Manual Document 11.2, "Hazards-General and Miscellaneous," Section 13 Machine Guarding (Rev 18, issued Dec. 15, 2015). The primary goal of this LLNL Safety Program is to ensure that LLNL operations involving machine guarding are managed so that workers, equipment and government property are adequately protected. This means that all such operations are planned and approved using the Integrated Safety Management System to provide the most cost effective and safest means available to support the LLNL mission.

  4. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity: Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  5. Assessment of the proposed decontamination and waste treatment facility at LLNL

    International Nuclear Information System (INIS)

    Cohen, J.J.

    1987-01-01

    To provide a centralized decontamination and waste treatment facility (DWTF) at LLNL, the construction of a new installation has been planned. The objectives for this new facility were to replace obsolete, structurally and environmentally sub-marginal liquid and solid waste processing facilities and a decontamination facility, and to bring these facilities into compliance with existing federal, state, and local regulations as well as DOE orders. In a previous study, SAIC conducted a preliminary review and evaluation of existing facilities at LLNL and the cost effectiveness of the proposed DWTF. This document reports on a detailed review of specific aspects of the proposed DWTF.

  6. Dispersion of Radionuclides and Exposure Assessment in Urban Environments: A Joint CEA and LLNL Report

    Energy Technology Data Exchange (ETDEWEB)

    Glascoe, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gowardhan, Akshay [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lennox, Kristin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Simpson, Matthew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Yu, Kristen [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Armand, Patrick [Alternative Energies and Atomic Energy Commission (CEA), Paris (France); Duchenne, Christophe [Alternative Energies and Atomic Energy Commission (CEA), Paris (France); Mariotte, Frederic [Alternative Energies and Atomic Energy Commission (CEA), Paris (France); Pectorin, Xavier [Alternative Energies and Atomic Energy Commission (CEA), Paris (France)

    2014-12-19

    In the interest of promoting the international exchange of technical expertise, the US Department of Energy’s Office of Emergency Operations (NA-40) and the French Commissariat à l'Energie Atomique et aux énergies alternatives (CEA) requested that the National Atmospheric Release Advisory Center (NARAC) of Lawrence Livermore National Laboratory (LLNL) in Livermore, California host a joint tabletop exercise with experts in emergency management and atmospheric transport modeling. In this tabletop exercise, LLNL and CEA compared each other’s flow and dispersion models. The goal of the comparison was to facilitate the exchange of knowledge, capabilities, and practices, and to demonstrate the utility of modeling dispersal at different levels of computational fidelity. Two modeling approaches were examined: a regional-scale approach, appropriate for simple terrain and/or very large releases, and an urban-scale approach, appropriate for small releases in a city environment. This report is a summary of LLNL and CEA modeling efforts from this exercise. Two different types of LLNL and CEA models were employed in the analysis: urban-scale models (Aeolus CFD at LLNL/NARAC and Parallel-Micro-SWIFT-SPRAY (PMSS) at CEA) for analysis of a 5,000 Ci radiological release, and Lagrangian particle dispersion models (LODI at LLNL/NARAC and PSPRAY at CEA) for analysis of a much larger (500,000 Ci) regional radiological release. Two densely populated urban locations were chosen: Chicago, with its high-rise skyline and gridded street network, and Paris, with its more consistent, lower building height and complex unaligned street network. Each location was considered under early-summer daytime and nighttime conditions. Different levels of fidelity were chosen for each scale: (1) lower-fidelity mass-consistent diagnostic models, intermediate-fidelity Navier-Stokes RANS models, and higher-fidelity Navier-Stokes LES for urban-scale analysis, and (2) lower-fidelity single
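
    A rough sense of what the lower-fidelity end of this modeling hierarchy computes can be given by a bare-bones Gaussian plume sketch in Python. This is illustrative only: the source strength, wind speed, and Briggs open-country neutral-stability dispersion coefficients below are generic textbook values, not parameters of LODI, PSPRAY, Aeolus, or PMSS.

```python
import math

def plume_conc(q_ci_s, u, x, y, z, h):
    """Time-averaged concentration (Ci/m^3) from a continuous point release.

    q_ci_s : source strength in Ci/s (hypothetical value)
    u      : mean wind speed, m/s
    x,y,z  : downwind, crosswind, and vertical receptor coordinates, m
    h      : release height, m

    The spread parameters use the Briggs open-country power-law fits for
    neutral (class D) stability, an assumed choice for this sketch.
    """
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)  # crosswind spread, m
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)  # vertical spread, m
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Second exponential is the image source representing ground reflection.
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q_ci_s * lateral * vertical / (2 * math.pi * u * sigma_y * sigma_z)

# Ground-level centerline concentration falls off rapidly with distance:
c_near = plume_conc(1.0, 5.0, 100.0, 0.0, 0.0, 0.0)
c_far = plume_conc(1.0, 5.0, 1000.0, 0.0, 0.0, 0.0)
```

Higher-fidelity RANS and LES approaches replace these closed-form spread parameters with an explicitly resolved flow field around buildings, which is why they are reserved for the urban-scale releases described above.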

  7. Dispersion of Radionuclides and Exposure Assessment in Urban Environments: A Joint CEA and LLNL Report

    International Nuclear Information System (INIS)

    Glascoe, Lee; Gowardhan, Akshay; Lennox, Kristin; Simpson, Matthew; Yu, Kristen; Armand, Patrick; Duchenne, Christophe; Mariotte, Frederic; Pectorin, Xavier

    2014-01-01

    In the interest of promoting the international exchange of technical expertise, the US Department of Energy’s Office of Emergency Operations (NA-40) and the French Commissariat à l'Energie Atomique et aux énergies alternatives (CEA) requested that the National Atmospheric Release Advisory Center (NARAC) of Lawrence Livermore National Laboratory (LLNL) in Livermore, California host a joint tabletop exercise with experts in emergency management and atmospheric transport modeling. In this tabletop exercise, LLNL and CEA compared each other's flow and dispersion models. The goal of the comparison was to facilitate the exchange of knowledge, capabilities, and practices, and to demonstrate the utility of modeling dispersal at different levels of computational fidelity. Two modeling approaches were examined: a regional-scale approach, appropriate for simple terrain and/or very large releases, and an urban-scale approach, appropriate for small releases in a city environment. This report is a summary of LLNL and CEA modeling efforts from this exercise. Two different types of LLNL and CEA models were employed in the analysis: urban-scale models (Aeolus CFD at LLNL/NARAC and Parallel-Micro-SWIFT-SPRAY (PMSS) at CEA) for analysis of a 5,000 Ci radiological release, and Lagrangian particle dispersion models (LODI at LLNL/NARAC and PSPRAY at CEA) for analysis of a much larger (500,000 Ci) regional radiological release. Two densely populated urban locations were chosen: Chicago, with its high-rise skyline and gridded street network, and Paris, with its more consistent, lower building height and complex unaligned street network. Each location was considered under early-summer daytime and nighttime conditions. Different levels of fidelity were chosen for each scale: (1) lower-fidelity mass-consistent diagnostic models, intermediate-fidelity Navier-Stokes RANS models, and higher-fidelity Navier-Stokes LES for urban-scale analysis, and (2) lower-fidelity single

  8. A probabilistic risk assessment of the LLNL Plutonium facility's evaluation basis fire operational accident

    International Nuclear Information System (INIS)

    Brumburgh, G.

    1994-01-01

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  9. A probabilistic risk assessment of the LLNL Plutonium Facility's evaluation basis fire operational accident. Revision 1

    International Nuclear Information System (INIS)

    Brumburgh, G.P.

    1995-01-01

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of improved and/or unique fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed in July 1994 to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  10. Assessment and cleanup of the Taxi Strip waste storage area at LLNL [Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Buerer, A.

    1983-01-01

    In September 1982 the Hazards Control Department of the Lawrence Livermore National Laboratory (LLNL) began a final radiological survey of a former low-level radioactive waste storage area called the Taxi Strip so that the area could be released for construction of an office building. Collection of soil samples at the location of a proposed sewer line led to the discovery of an old disposal pit containing soil contaminated with low-level radioactive waste and organic solvents. The Taxi Strip area was excavated, leading to the discovery of three additional small pits. The clean-up of Pit No. 1 is considered to be complete for radioactive contamination. The results from the chlorinated solvent analysis of the borehole samples and the limited number of samples analyzed by gas chromatography/mass spectrometry indicate that solvent clean-up at this pit is complete. This is being verified by gas chromatography/mass spectrometry analysis of a few additional soil samples from the bottom, sides, and ends of the pit. As a precaution, samples are also being analyzed for metals to determine if further excavation is necessary. Clean-up of Pits No. 2 and No. 3 is considered to be complete for radioactive and solvent contamination. Results of the analysis for metals will determine if excavation is complete. Excavation of Pit No. 4, which resulted from surface leakage of radioactive contamination from an evaporation tray, is complete.

  11. Seismic hazard for the Savannah River Site: A comparative evaluation of the EPRI and LLNL assessments

    International Nuclear Information System (INIS)

    Wingo, H.E.

    1992-01-01

    This study was conducted to: (1) develop an understanding of the causes of the vast differences between the two comprehensive studies, and (2) using a methodology consistent with the reconciled methods employed in the two studies, develop a single seismic hazard assessment for the Savannah River Site suitable for use in seismic probabilistic risk assessments, with emphasis on the K Reactor. Results are presented for a rock site, which is atypical, because detailed evaluations of soil characteristics at the K Reactor that account for the effects of a soil-stabilizing grouting program are still in progress. When the soils analysis is completed, the effects of soils can be included in this analysis with the addition of a single factor that will slightly decrease the seismic hazard relative to that of a rock site.

  12. LLNL 1981: technical horizons

    International Nuclear Information System (INIS)

    1981-07-01

    Research programs at LLNL for 1981 are described in broad terms. In his annual State of the Laboratory address, Director Roger Batzel projected a $481 million operating budget for fiscal year 1982, up nearly 13% from last year. In projects for the Department of Energy and the Department of Defense, the Laboratory applies its technical facilities and capabilities to nuclear weapons design and development and to other areas of defense research that include inertial confinement fusion, nonnuclear ordnance, and particle-beam technology. LLNL is also applying its unique experience and capabilities to a variety of projects that will help the nation meet its energy needs in an environmentally acceptable manner. A sampling of recent achievements by LLNL support organizations indicates their diversity.

  13. The LLNL AMS facility

    International Nuclear Information System (INIS)

    Roberts, M.L.; Bench, G.S.; Brown, T.A.

    1996-05-01

    The AMS facility at Lawrence Livermore National Laboratory (LLNL) routinely measures the isotopes 3H, 7Be, 10Be, 14C, 26Al, 36Cl, 41Ca, 59,63Ni, and 129I. During the past two years, over 30,000 research samples have been measured. Of these samples, approximately 30% were for 14C bioscience tracer studies, 45% were 14C samples for archaeology and the geosciences, and the other isotopes constitute the remaining 25%. During the past two years at LLNL, a significant amount of work has gone into the development of the Projectile X-ray AMS (PXAMS) technique. PXAMS uses induced characteristic x-rays to discriminate against competing atomic isobars. PXAMS has been most fully developed for 63Ni but shows promise for the measurement of several other long-lived isotopes. During the past year LLNL has also conducted a 129I interlaboratory comparison exercise. Recent hardware changes at the LLNL AMS facility include the installation and testing of a new thermal emission ion source, a new multianode gas ionization detector for general AMS use, re-alignment of the vacuum tank of the first of the two magnets that make up the high-energy spectrometer, and a new cryo-vacuum system for the AMS ion source. In addition, design studies and tests have begun for a new high-resolution injector and a new beamline for heavy-element AMS.

  14. LLNL Site 200 Risk Management Plan

    International Nuclear Information System (INIS)

    Pinkston, D.; Johnson, M.

    2008-01-01

    It is the Lawrence Livermore National Laboratory's (LLNL) policy to perform work in a manner that protects the health and safety of employees and the public, preserves the quality of the environment, and prevents property damage, using the Integrated Safety Management System. The environment, safety, and health are to take priority in the planning and execution of work activities at the Laboratory. Furthermore, it is the policy of LLNL to comply with applicable ES&H laws, regulations, and requirements (LLNL Environment, Safety and Health Manual, Document 1.2, ES&H Policies of LLNL). The program and policies that improve LLNL's ability to prevent or mitigate accidental releases are described in the LLNL Environment, Safety and Health Manual, which is available to the public. The Laboratory uses an emergency management system known as the Incident Command System, in accordance with the California Standardized Emergency Management System (SEMS), to respond to Operational Emergencies and to mitigate consequences resulting from them. Operational Emergencies are defined as unplanned, significant events or conditions that require time-urgent response from outside the immediate area of the incident and that could seriously impact the safety or security of the public, LLNL's employees, its facilities, or the environment. The Emergency Plan contains LLNL's Operational Emergency response policies, commitments, and institutional responsibilities for managing and recovering from emergencies. It is not possible to list in the Emergency Plan all events that could occur during any given emergency situation. However, a combination of hazard assessments, an effective Emergency Plan, and Emergency Plan Implementing Procedures (EPIPs) can provide the framework for responses to postulated emergency situations. Revision 7, 2004 of the above-mentioned LLNL Emergency Plan is available to the public. The most recent revision of the LLNL Emergency Plan is LLNL-AM-402556, Revision 11, March

  15. Evaluation of LLNL BSL-3 Maximum Credible Event Potential Consequence to the General Population and Surrounding Environment

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-08-16

    The purpose of this evaluation is to establish reproducibility of the analysis and consequence results to the general population and surrounding environment in the LLNL Biosafety Level 3 Facility Environmental Assessment (LLNL 2008).

  16. LLNL NESHAPs, 1993 annual report

    International Nuclear Information System (INIS)

    Harrach, R.J.; Surano, K.A.; Biermann, A.H.; Gouveia, F.J.; Fields, B.C.; Tate, P.J.

    1994-06-01

    The standard defined in NESHAPs, 40 CFR 61.92, limits the emission of radionuclides to the ambient air from DOE facilities to those that would cause any member of the public to receive in any year an effective dose equivalent of 10 mrem. In August 1993 DOE and EPA signed a Federal Facility Compliance Agreement which established a schedule of work for LLNL to perform to demonstrate compliance with NESHAPs, 40 CFR Part 61, Subpart H. The progress in LLNL's NESHAPs program - evaluations of all emission points for the Livermore site and Site 300, of collective EDEs for populations within 80 km of each site, status with regard to continuous monitoring requirements and periodic confirmatory measurements, improvements in the sampling and monitoring systems, and progress on a NESHAPs quality assurance program - is described in this annual report. In April 1994 the EPA notified DOE and LLNL that all requirements of the FFCA had been met, and that LLNL was in compliance with the NESHAPs regulations.

  17. LLNL's Regional Seismic Discrimination Research

    International Nuclear Information System (INIS)

    Hanley, W; Mayeda, K; Myers, S; Pasyanos, M; Rodgers, A; Sicherman, A; Walter, W

    1999-01-01

    As part of the Department of Energy's research and development effort to improve the monitoring capability of the planned Comprehensive Nuclear-Test-Ban Treaty international monitoring system, Lawrence Livermore National Laboratory (LLNL) is testing and calibrating regional seismic discrimination algorithms in the Middle East, North Africa, and the western former Soviet Union. The calibration process consists of a number of steps: (1) populating the database with independently identified regional events; (2) developing regional boundaries and pre-identifying severe regional phase blockage zones; (3) measuring and calibrating coda-based magnitude scales; (4a) measuring regional amplitudes and making magnitude and distance amplitude corrections (MDAC); (4b) applying the DOE modified kriging methodology to MDAC results using the regionalized background model; (5) determining the thresholds of detectability of regional phases as a function of phase type and frequency; (6) evaluating regional phase discriminant performance both singly and in combination; (7) combining steps 1-6 to create a calibrated discrimination surface for each station; (8) assessing progress and iterating. We have now developed this calibration procedure to the point where it is fairly straightforward to apply earthquake-explosion discrimination in regions with ample empirical data. Several of the steps outlined above are discussed in greater detail in other DOE papers in this volume or in recent publications. Here we emphasize the results of the above process: station correction surfaces and their improvement to discrimination results compared with simpler calibration methods. Some of the outstanding discrimination research issues involve cases in which there is little or no empirical data. For example, in many cases there is no regional nuclear explosion data at IMS stations or nearby surrogates. We have taken two approaches to this problem, first finding and using mining explosion data when available, and
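
    The flavor of steps (4a)-(6) can be sketched in a few lines of Python: apply a magnitude and distance amplitude correction (MDAC) to each regional phase amplitude, then form a corrected P/S ratio as a discriminant. The functional form and coefficients below are toy placeholders, not LLNL's calibrated values, and the kriged station correction surfaces of step (4b) are omitted entirely.

```python
import math

def mdac_correct(log_amp, mw, dist_km, s=0.5, b=0.0012):
    """Toy magnitude and distance amplitude correction (MDAC):
    remove a linear magnitude term, geometric spreading, and a simple
    anelastic attenuation term. Coefficients s and b are placeholders."""
    return log_amp - s * mw + math.log10(dist_km) + b * dist_km

def p_over_s_discriminant(log_pn, log_lg, mw, dist_km):
    """Corrected log(Pn/Lg) amplitude ratio. After correction, explosions
    tend toward larger values than earthquakes of the same magnitude."""
    return (mdac_correct(log_pn, mw, dist_km)
            - mdac_correct(log_lg, mw, dist_km))

# Equal Pn and Lg amplitudes at the same magnitude and distance give a
# ratio of zero; a relatively strong Pn pushes the discriminant positive.
score = p_over_s_discriminant(1.5, 1.0, 4.0, 300.0)
```

In the calibrated system, the decision threshold on such a ratio would itself vary spatially via the station correction surfaces described above.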

  18. LLNL NESHAPs 2014 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bertoldo, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gallegos, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); MacQueen, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wegrecki, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-01

    Lawrence Livermore National Security, LLC operates facilities at Lawrence Livermore National Laboratory (LLNL) where radionuclides are handled and stored. These facilities are subject to the U.S. Environmental Protection Agency (EPA) National Emission Standards for Hazardous Air Pollutants (NESHAPs) in Code of Federal Regulations (CFR) Title 40, Part 61, Subpart H, which regulates radionuclide emissions to air from Department of Energy (DOE) facilities. Specifically, NESHAPs limits the emission of radionuclides to the ambient air to levels resulting in an annual effective dose equivalent of 10 mrem (100 μSv) to any member of the public. Using measured and calculated emissions, and building-specific and common parameters, LLNL personnel applied the EPA-approved computer code, CAP88-PC, Version 4.0.1.17, to calculate the dose to the maximally exposed individual member of the public for the Livermore Site and Site 300.
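
    The compliance test described above reduces to comparing a summed annual dose against the 10 mrem (100 μSv) standard. The sketch below shows only that bookkeeping step: the per-source dose values are hypothetical, and summing them conservatively assumes a single maximally exposed individual receives the dose from every emission point (CAP88-PC itself models each emission point and receptor location in detail).

```python
# 40 CFR 61.92 standard: 10 mrem/yr (100 uSv/yr) to any member of the public.
NESHAPS_LIMIT_MREM = 10.0

def total_dose_mrem(per_source_doses_mrem):
    """Sum annual effective dose equivalents (mrem) over emission points,
    conservatively attributed to one maximally exposed individual.
    Inputs would come from per-source CAP88-PC runs; values are hypothetical."""
    return sum(per_source_doses_mrem)

def is_compliant(per_source_doses_mrem):
    """True if the summed annual dose is at or below the NESHAPs standard."""
    return total_dose_mrem(per_source_doses_mrem) <= NESHAPS_LIMIT_MREM

# Example with made-up per-building doses, well under the standard:
ok = is_compliant([0.02, 0.005, 0.1])
```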

  19. LLNL Chemical Kinetics Modeling Group

    Energy Technology Data Exchange (ETDEWEB)

    Pitz, W J; Westbrook, C K; Mehl, M; Herbinet, O; Curran, H J; Silke, E J

    2008-09-24

    The LLNL chemical kinetics modeling group has been responsible for much progress in the development of chemical kinetic models for practical fuels. The group began its work in the early 1970s, developing chemical kinetic models for methane, ethane, ethanol and halogenated inhibitors. Most recently, it has been developing chemical kinetic models for large n-alkanes, cycloalkanes, hexenes, and large methyl esters. These component models are needed to represent gasoline, diesel, jet, and oil-sand-derived fuels.

  20. LLNL pure positron plasma program

    International Nuclear Information System (INIS)

    Hartley, J.H.; Beck, B.R.; Cowan, T.E.; Howell, R.H.; McDonald, J.L.; Rohatgi, R.R.; Fajans, J.; Gopalan, R.

    1995-01-01

    Assembly and initial testing of the Positron Time-of-Flight Trap at the Lawrence Livermore National Laboratory (LLNL) Intense Pulsed Positron Facility has been completed. The goal of the project is to accumulate a high-density positron plasma in only a few seconds, in order to facilitate studies that may require destructive diagnostics. To date, densities of at least 6 x 10^6 positrons per cm^3 have been achieved.

  1. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-01-01

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs

  2. LLNL Livermore site Groundwater Surveillance Plan

    International Nuclear Information System (INIS)

    1992-04-01

    Department of Energy (DOE) Order 5400.1 establishes environmental protection program requirements, authorities, and responsibilities for DOE operations to assure compliance with federal, state, and local environmental protection laws and regulations; Federal Executive Orders; and internal DOE policies. The DOE Order contains requirements and guidance for environmental monitoring programs, the objectives of which are to demonstrate compliance with legal and regulatory requirements imposed by federal, state, and local agencies; confirm adherence to DOE environmental protection policies; and support environmental management decisions. The environmental monitoring programs consist of two major activities: (1) measurement and monitoring of effluents from DOE operations, and (2) surveillance through measurement, monitoring, and calculation of the effects of those operations on the environment and public health. The latter concern, that of assessing the effects, if any, of Lawrence Livermore National Laboratory (LLNL) operations and activities on on-site and off-site surface waters and groundwaters, is addressed by an Environmental Surveillance Program being developed by LLNL. The Groundwater Surveillance Plan presented here has been developed on a site-specific basis, taking into consideration facility characteristics, applicable regulations, hazard potential, quantities and concentrations of materials released, the extent and use of local water resources, and specific local public interest and concerns.

  3. Status of LLNL granite projects

    International Nuclear Information System (INIS)

    Ramspott, L.D.

    1980-01-01

    The status of LLNL Projects dealing with nuclear waste disposal in granitic rocks is reviewed. This review covers work done subsequent to the June 1979 Workshop on Thermomechanical Modeling for a Hardrock Waste Repository and is prepared for the July 1980 Workshop on Thermomechanical-Hydrochemical Modeling for a Hardrock Waste Repository. Topics reviewed include laboratory determination of thermal, mechanical, and transport properties of rocks at conditions simulating a deep geologic repository, and field testing at the Climax granitic stock at the USDOE Nevada Test Site

  4. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-05-01

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). New legislation is being introduced at the federal level; its passage will result in new EPA regulations and new DOE orders. At the state level, the Hazardous Waste Reduction and Management Review Act of 1989 was signed by the Governor, and DHS is currently promulgating regulations to implement the new law. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements.

  5. 2016 LLNL Nuclear Forensics Summer Program

    Energy Technology Data Exchange (ETDEWEB)

    Zavarin, Mavrik [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-15

    The Lawrence Livermore National Laboratory (LLNL) Nuclear Forensics Summer Program is designed to give graduate students an opportunity to come to LLNL for 8–10 weeks for a hands-on research experience. Students conduct research under the supervision of a staff scientist, attend a weekly lecture series, interact with other students, and present their work in poster format at the end of the program. Students also have the opportunity to meet staff scientists one-on-one, participate in LLNL facility tours (e.g., the National Ignition Facility and Center for Accelerator Mass Spectrometry), and gain a better understanding of the various science programs at LLNL.

  6. 2017 LLNL Nuclear Forensics Summer Internship Program

    Energy Technology Data Exchange (ETDEWEB)

    Zavarin, Mavrik [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-12-13

    The Lawrence Livermore National Laboratory (LLNL) Nuclear Forensics Summer Internship Program (NFSIP) is designed to give graduate students an opportunity to come to LLNL for 8-10 weeks of hands-on research. Students conduct research under the supervision of a staff scientist, attend a weekly lecture series, interact with other students, and present their work in poster format at the end of the program. Students can also meet staff scientists one-on-one, participate in LLNL facility tours (e.g., the National Ignition Facility and Center for Accelerator Mass Spectrometry), and gain a better understanding of the various science programs at LLNL.

  7. 2016 LLNL Nuclear Forensics Summer Program

    International Nuclear Information System (INIS)

    Zavarin, Mavrik

    2016-01-01

    The Lawrence Livermore National Laboratory (LLNL) Nuclear Forensics Summer Program is designed to give graduate students an opportunity to come to LLNL for 8-10 weeks for a hands-on research experience. Students conduct research under the supervision of a staff scientist, attend a weekly lecture series, interact with other students, and present their work in poster format at the end of the program. Students also have the opportunity to meet staff scientists one-on-one, participate in LLNL facility tours (e.g., the National Ignition Facility and Center for Accelerator Mass Spectrometry), and gain a better understanding of the various science programs at LLNL.

  8. Quality Assessment of Domesticated Animal Genome Assemblies

    DEFF Research Database (Denmark)

    Seemann, Stefan E; Anthon, Christian; Palasca, Oana

    2015-01-01

    affected by the lack of genomic sequence. Herein, we quantify the quality of the genome assemblies of 20 domesticated animals and related species by assessing a range of measurable parameters, and we show that there is a positive correlation between the fraction of mappable reads from RNAseq data...... domesticated animal genomes still need to be sequenced deeper in order to produce high-quality assemblies. In the meanwhile, ironically, the extent to which RNAseq and other next-generation data is produced frequently far exceeds that of the genomic sequence. Furthermore, basic comparative analysis is often...
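    The record above correlates assembly quality with the fraction of mappable RNAseq reads. That fraction can be estimated directly from alignment records; the sketch below is a pure-Python stand-in for a real SAM/BAM parser such as pysam, with hypothetical flag values, not the authors' pipeline:

```python
# SAM FLAG bit 0x4 marks a read whose segment is unmapped.
UNMAPPED = 0x4

def mapped_fraction(flags):
    """Fraction of reads whose SAM FLAG lacks the 'unmapped' bit.

    `flags` is an iterable of integer SAM FLAG values (column 2 of a
    SAM record); in practice these would come from a BAM parser.
    """
    flags = list(flags)
    if not flags:
        return 0.0
    mapped = sum(1 for f in flags if not f & UNMAPPED)
    return mapped / len(flags)

# Hypothetical flags: 0 = mapped forward, 16 = mapped reverse, 4 = unmapped.
print(mapped_fraction([0, 16, 4, 0]))  # → 0.75
```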

  9. Laser wakefields at UCLA and LLNL

    International Nuclear Information System (INIS)

    Mori, W.B.; Clayton, C.E.; Joshi, C.; Dawson, J.M.; Decker, C.B.; Marsh, K.; Katsouleas, T.; Darrow, C.B.; Wilks, S.C.

    1991-01-01

    The authors report on recent progress at UCLA and LLNL on the nonlinear laser wakefield scheme. They find advantages to operating in the limit where the laser pulse is narrow enough to expel all the plasma electrons from the focal region. A description of the experimental program for the new short pulse 10 TW laser facility at LLNL is also presented

  10. Fire science at LLNL: A review

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, H.K. (ed.)

    1990-03-01

    This fire sciences report from LLNL includes topics on: fire spread in trailer complexes, properties of welding blankets, validation of sprinkler systems, fire and smoke detectors, fire modeling, and other fire engineering and safety issues. (JEF)

  11. Compilation of LLNL CUP-2 Data

    Energy Technology Data Exchange (ETDEWEB)

    Eppich, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kips, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lindvall, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-07-31

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration, and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an “as-received” basis; all other analytes (besides moisture content) were reported on a “dry-weight” basis. A bottle of 25g of CUP-2 UOC standard as described above was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy were used to guide the performance of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data. The results

  12. QUAST: quality assessment tool for genome assemblies.

    Science.gov (United States)

    Gurevich, Alexey; Saveliev, Vladislav; Vyahhi, Nikolay; Tesler, Glenn

    2013-04-15

    Limitations of genome sequencing techniques have led to dozens of assembly algorithms, none of which is perfect. A number of methods for comparing assemblers have been developed, but none is yet a recognized benchmark. Further, most existing methods for comparing assemblies are only applicable to new assemblies of finished genomes; the problem of evaluating assemblies of previously unsequenced species has not been adequately considered. Here, we present QUAST-a quality assessment tool for evaluating and comparing genome assemblies. This tool improves on leading assembly comparison software with new ideas and quality metrics. QUAST can evaluate assemblies both with a reference genome, as well as without a reference. QUAST produces many reports, summary tables and plots to help scientists in their research and in their publications. In this study, we used QUAST to compare several genome assemblers on three datasets. QUAST tables and plots for all of them are available in the Supplementary Material, and interactive versions of these reports are on the QUAST website. http://bioinf.spbau.ru/quast . Supplementary data are available at Bioinformatics online.
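    Among the quality metrics such tools report, N50 is the most common; a minimal sketch of computing it from contig lengths (the function name and toy data are illustrative, not QUAST's API):

```python
def n50(contig_lengths):
    """N50: the largest contig length L such that contigs of length >= L
    together cover at least half of the total assembly length."""
    total = sum(contig_lengths)
    covered = 0
    for length in sorted(contig_lengths, reverse=True):
        covered += length
        if 2 * covered >= total:
            return length
    return 0

# Toy assembly (lengths in kb): total 20; the 8 and 4 kb contigs
# together cover 12 kb, which is at least half of 20 kb.
print(n50([8, 4, 3, 2, 2, 1]))  # → 4
```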

  13. LLNL Mercury Project Trinity Open Science Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Brantley, Patrick [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dawson, Shawn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McKinley, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); O' Brien, Matt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Peters, Doug [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pozulp, Mike [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Becker, Greg [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count to assess the current simulation methodology. For this application space, and with broader applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the use of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.

  14. FY16 LLNL Omega Experimental Programs

    International Nuclear Information System (INIS)

    Heeter, R. F.; Ali, S. J.; Benstead, J.; Celliers, P. M.; Coppari, F.; Eggert, J.; Erskine, D.; Panella, A. F.; Fratanduono, D. E.; Hua, R.; Huntington, C. M.; Jarrott, L. C.; Jiang, S.; Kraus, R. G.; Lazicki, A. E.; LePape, S.; Martinez, D. A.; McNaney, J. M.; Millot, M. A.; Moody, J.; Pak, A. E.; Park, H. S.; Ping, Y.; Pollock, B. B.; Rinderknecht, H.; Ross, J. S.; Rubery, M.; Sio, H.; Smith, R. F.; Swadling, G. F.; Wehrenberg, C. E.; Collins, G. W.; Landen, O. L.; Wan, A.; Hsing, W.

    2016-01-01

    In FY16, LLNL's High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall, these LLNL programs led 430 target shots in FY16, with 304 shots using just the OMEGA laser system, and 126 shots using just the EP laser system. Approximately 21% of the total number of shots (77 OMEGA shots and 14 EP shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 79% (227 OMEGA shots and 112 EP shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports. In addition to these experiments, LLNL Principal Investigators led a variety of Laboratory Basic Science campaigns using OMEGA and EP, including 81 target shots using just OMEGA and 42 shots using just EP. The highlights of these are also summarized, following the ICF and HED campaigns. Overall, LLNL PIs led a total of 553 shots at LLE in FY 2016. In addition, LLNL PIs also supported 57 NLUF shots on Omega and 31 NLUF shots on EP, in collaboration with the academic community.

  15. FY16 LLNL Omega Experimental Programs

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, R. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ali, S. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Benstead, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Celliers, P. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Coppari, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Eggert, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Erskine, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Panella, A. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fratanduono, D. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hua, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Huntington, C. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jarrott, L. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jiang, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kraus, R. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lazicki, A. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); LePape, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Martinez, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McNaney, J. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Millot, M. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moody, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pak, A. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Park, H. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ping, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pollock, B. B. 
[Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rinderknecht, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ross, J. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rubery, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sio, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, R. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Swadling, G. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wehrenberg, C. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Collins, G. W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Landen, O. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsing, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-12-01

    In FY16, LLNL’s High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall, these LLNL programs led 430 target shots in FY16, with 304 shots using just the OMEGA laser system, and 126 shots using just the EP laser system. Approximately 21% of the total number of shots (77 OMEGA shots and 14 EP shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 79% (227 OMEGA shots and 112 EP shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports. In addition to these experiments, LLNL Principal Investigators led a variety of Laboratory Basic Science campaigns using OMEGA and EP, including 81 target shots using just OMEGA and 42 shots using just EP. The highlights of these are also summarized, following the ICF and HED campaigns. Overall, LLNL PIs led a total of 553 shots at LLE in FY 2016. In addition, LLNL PIs also supported 57 NLUF shots on Omega and 31 NLUF shots on EP, in collaboration with the academic community.

  16. LLNL NESHAPs 2015 Annual Report - June 2016

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, K. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gallegos, G. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); MacQueen, D. H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wegrecki, A. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-06-01

    Lawrence Livermore National Security, LLC operates facilities at Lawrence Livermore National Laboratory (LLNL) in which radionuclides are handled and stored. These facilities are subject to the U.S. Environmental Protection Agency (EPA) National Emission Standards for Hazardous Air Pollutants (NESHAPs) in Code of Federal Regulations (CFR) Title 40, Part 61, Subpart H, which regulates radionuclide emissions to air from Department of Energy (DOE) facilities. Specifically, NESHAPs limits the emission of radionuclides to the ambient air to levels resulting in an annual effective dose equivalent of 10 mrem (100 μSv) to any member of the public. Using measured and calculated emissions, and building-specific and common parameters, LLNL personnel applied the EPA-approved computer code, CAP88-PC, Version 4.0.1.17, to calculate the dose to the maximally exposed individual member of the public for the Livermore Site and Site 300.
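    The 10 mrem (100 μSv) limit quoted above follows from the rem-to-sievert conversion (1 rem = 0.01 Sv, so 1 mrem = 10 μSv); a one-line check (illustrative helper, not part of CAP88-PC):

```python
def mrem_to_microsievert(dose_mrem):
    """Convert millirem to microsievert: 1 rem = 0.01 Sv,
    so 1 mrem = 10 microsievert."""
    return dose_mrem * 10.0

print(mrem_to_microsievert(10))  # → 100.0, matching the NESHAPs limit
```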

  17. High intensity positron program at LLNL

    International Nuclear Information System (INIS)

    Asoka-Kumar, P.; Howell, R.; Stoeffl, W.; Carter, D.

    1999-01-01

    Lawrence Livermore National Laboratory (LLNL) is the home of the world's highest current beam of keV positrons. The potential for establishing a national center for materials analysis using positron annihilation techniques around this capability is being actively pursued. The high LLNL beam current will enable investigations in several new areas. We are developing a positron microprobe that will produce a pulsed, focused positron beam for 3-dimensional scans of defect size and concentration with submicron resolution. Below we summarize the important design features of this microprobe. Several experimental end stations will be available that can utilize the high current beam with a time distribution determined by the electron linac pulse structure, quasi-continuous, or bunched at 20 MHz, and can operate in an electrostatic or (and) magnetostatic environment. Some of the planned early experiments are: two-dimensional angular correlation of annihilation radiation of thin films and buried interfaces, positron diffraction holography, positron induced desorption, and positron induced Auger spectroscopy

  18. High intensity positron program at LLNL

    International Nuclear Information System (INIS)

    Asoka-Kumar, P.; Howell, R.H.; Stoeffl, W.

    1998-01-01

    Lawrence Livermore National Laboratory (LLNL) is the home of the world's highest current beam of keV positrons. The potential for establishing a national center for materials analysis using positron annihilation techniques around this capability is being actively pursued. The high LLNL beam current will enable investigations in several new areas. We are developing a positron microprobe that will produce a pulsed, focused positron beam for 3-dimensional scans of defect size and concentration with submicron resolution. Below we summarize the important design features of this microprobe. Several experimental end stations will be available that can utilize the high current beam with a time distribution determined by the electron linac pulse structure, quasi-continuous, or bunched at 20 MHz, and can operate in an electrostatic or (and) magnetostatic environment. Some of the planned early experiments are: two-dimensional angular correlation of annihilation radiation of thin films and buried interfaces, positron diffraction holography, positron induced desorption, and positron induced Auger spectra

  19. LLNL high-field coil program

    International Nuclear Information System (INIS)

    Miller, J.R.

    1986-01-01

    An overview is presented of the LLNL High-Field Superconducting Magnet Development Program wherein the technology is being developed for producing fields in the range of 15 T and higher for both mirror and tokamak applications. Applications requiring less field will also benefit from this program. In addition, recent results on the thermomechanical performance of cable-in-conduit conductor systems are presented and their importance to high-field coil design discussed

  20. LIFTIRS-hyperspectral imaging at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Fields, D. [Lawrence Livermore National Lab., CA (United States); Bennett, C.; Carter, M.

    1994-11-15

    LIFTIRS, the Livermore Imaging Fourier Transform InfraRed Spectrometer, recently developed at LLNL, is an instrument which enables extremely efficient collection and analysis of hyperspectral imaging data. LIFTIRS produces a spatial format of 128x128 pixels, with spectral resolution arbitrarily variable up to a maximum of 0.25 inverse centimeters. Time resolution and spectral resolution can be traded off for each other with great flexibility. We will discuss recent measurements made with this instrument, and present typical images and spectra.

  1. Probabilistic Seismic Hazards Update for LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Menchawi, O. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fernandez, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-30

    Fugro Consultants, Inc. (FCL) completed the Probabilistic Seismic Hazard Analysis (PSHA) performed for Building 332 at the Lawrence Livermore National Laboratory (LLNL), near Livermore, CA. The study performed for the LLNL site includes a comprehensive review of recent information relevant to the LLNL regional tectonic setting and regional seismic sources in the vicinity of the site and development of seismic wave transmission characteristics. The Seismic Source Characterization (SSC), documented in Project Report No. 2259-PR-02 (FCL, 2015b), and Ground Motion Characterization (GMC), documented in Project Report No. 2259-PR-06 (FCL, 2015a), were developed in accordance with ANS/ANSI 2.29-2008 Level 2 PSHA guidelines. The ANS/ANSI 2.29-2008 Level 2 PSHA framework is documented in Project Report No. 2259-PR-05 (FCL, 2016a). The Hazard Input Document (HID) for input into the PSHA developed from the SSC and GMC is presented in Project Report No. 2259-PR-04 (FCL, 2016b). The site characterization used as input for development of the idealized site profiles, including epistemic uncertainty and aleatory variability, is presented in Project Report No. 2259-PR-03 (FCL, 2015c). The PSHA results are documented in Project Report No. 2259-PR-07 (FCL, 2016c).

  2. Comprehensive Angular Response Study of LLNL Panasonic Dosimeter Configurations and Artificial Intelligence Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Stone, D. K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-30

    In April of 2016, the Lawrence Livermore National Laboratory External Dosimetry Program underwent a Department of Energy Laboratory Accreditation Program (DOELAP) on-site assessment. The assessment reported a concern that the 2013 study, Angular Dependence Study Panasonic UD-802 and UD-810 Dosimeters LLNL Artificial Intelligence Algorithm, was incomplete: only the responses at ±60° and 0° were evaluated, and independent dosimeter data were not used to evaluate the algorithm. Additionally, other configurations of LLNL dosimeters were not considered in that study, including nuclear accident dosimeters (NADs), which are placed in the wells surrounding the TLD in the dosimeter holder.

  3. The LLNL portable tritium processing system

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    The end of the Cold War significantly reduced the need for facilities to handle radioactive materials for the US nuclear weapons program. The LLNL Tritium Facility was among those slated for decommissioning. The plans for the facility have since been reversed, and it remains open. Nevertheless, in the early 1990s, the cleanup (the Tritium Inventory Removal Project) was undertaken. However, removing the inventory of tritium within the facility and cleaning up any pockets of high-level residual contamination required that we design a system adequate to the task and meeting today's stringent standards of worker and environmental protection. In collaboration with Sandia National Laboratory and EG ampersand G Mound Applied Technologies, we fabricated a three-module Portable Tritium Processing System (PTPS) that meets current glovebox standards, is operated from a portable console, and is movable from laboratory to laboratory for performing the basic tritium processing operations: pumping and gas transfer, gas analysis, and gas-phase tritium scrubbing. The Tritium Inventory Removal Project is now in its final year, and the portable system continues to be the workhorse. To meet a strong demand for tritium services, the LLNL Tritium Facility will be reconfigured to provide state-of-the-art tritium and radioactive decontamination research and development. The PTPS will play a key role in this new facility

  4. High intensity positron program at LLNL

    International Nuclear Information System (INIS)

    Asoka-Kumar, P.; Howell, R.; Stoeffl, W.; Carter, D.

    1999-01-01

    Lawrence Livermore National Laboratory (LLNL) is the home of the world's highest current beam of keV positrons. The potential for establishing a national center for materials analysis using positron annihilation techniques around this capability is being actively pursued. The high LLNL beam current will enable investigations in several new areas. We are developing a positron microprobe that will produce a pulsed, focused positron beam for 3-dimensional scans of defect size and concentration with submicron resolution. Below we summarize the important design features of this microprobe. Several experimental end stations will be available that can utilize the high current beam with a time distribution determined by the electron linac pulse structure, quasi-continuous, or bunched at 20 MHz, and can operate in an electrostatic or (and) magnetostatic environment. Some of the planned early experiments are: two-dimensional angular correlation of annihilation radiation of thin films and buried interfaces, positron diffraction holography, positron induced desorption, and positron induced Auger spectroscopy. Copyright 1999 American Institute of Physics

  5. Evaluation of LLNL's Nuclear Accident Dosimeters at the CALIBAN Reactor September 2010

    International Nuclear Information System (INIS)

    Hickman, D.P.; Wysong, A.R.; Heinrichs, D.P.; Wong, C.T.; Merritt, M.J.; Topper, J.D.; Gressmann, F.A.; Madden, D.J.

    2011-01-01

    participants were limited in what they were allowed to do at the Caliban and Silene exercises and testing of various elements of the nuclear accident dosimetry programs cannot always be performed as guests at other sites, it has become evident that DOE needs its own capability to test nuclear accident dosimeters. Angular dependence determination and correction factors for NADs desperately need testing as well as more evaluation regarding the correct determination of gamma doses. It will be critical to properly design any testing facility so that the necessary experiments can be performed by DOE laboratories as well as guest laboratories. Alternate methods of dose assessment such as using various metals commonly found in pockets and clothing have yet to be evaluated. The DOE is planning to utilize the Godiva or Flattop reactor for testing nuclear accident dosimeters. LLNL has been assigned the primary operational authority for such testing. Proper testing of nuclear accident dosimeters will require highly specific characterization of the pulse fields. Just as important as the characterization of the pulsed fields will be the design of facilities used to process the NADs. Appropriate facilities will be needed to allow for early access to dosimeters to test and develop quick sorting techniques. These facilities will need appropriate laboratory preparation space and an area for measurements. Finally, such a facility will allow greater numbers of LLNL and DOE laboratory personnel to train on the processing and interpretation of nuclear accident dosimeters and results. Until this facility is fully operational for test purposes, DOE laboratories may need to continue periodic testing as guests of other reactor facilities such as Silene and Caliban.

  6. Proposed LLNL electron beam ion trap

    International Nuclear Information System (INIS)

    Marrs, R.E.; Egan, P.O.; Proctor, I.; Levine, M.A.; Hansen, L.; Kajiyama, Y.; Wolgast, R.

    1985-01-01

    The interaction of energetic electrons with highly charged ions is of great importance to several research fields such as astrophysics, laser fusion, and magnetic fusion. In spite of this importance, there are almost no measurements of electron interaction cross sections for ions more than a few times ionized. To address this problem, an electron beam ion trap (EBIT) is being developed at LLNL. The device is essentially an EBIS except that it is not intended as a source of extracted ions. Instead the (variable energy) electron beam interacting with the confined ions will be used to obtain measurements of ionization cross sections, dielectronic recombination cross sections, radiative recombination cross sections, energy levels, and oscillator strengths. Charge-exchange recombination cross sections with neutral gases could also be measured. The goal is to produce and study elements in many different charge states up to He-like xenon and Ne-like uranium. 5 refs., 2 figs

  7. FY14 LLNL OMEGA Experimental Programs

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, R. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fournier, K. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Baker, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Barrios, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bernstein, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brown, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Celliers, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Coppari, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fratanduono, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Johnson, M. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Huntington, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jenei, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kraus, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Martinez, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McNabb, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Millot, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moore, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Nagel, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Park, H. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Patel, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Perez, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ping, Y. [Lawrence Livermore National Lab. 
(LLNL), Livermore, CA (United States); Pollock, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ross, J. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rygg, J. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zylstra, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Collins, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Landen, O. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsing, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-10-13

    In FY14, LLNL’s High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall these LLNL programs led 324 target shots in FY14, with 246 shots using just the OMEGA laser system, 62 shots using just the EP laser system, and 16 Joint shots using OMEGA and EP together. Approximately 31% of the total number of shots (62 OMEGA shots and 42 EP shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 69% (200 OMEGA shots and 36 EP shots, including the 16 Joint shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports.

  8. FY15 LLNL OMEGA Experimental Programs

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, R. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Baker, K. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Barrios, M. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Beckwith, M. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Casey, D. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Celliers, P. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Coppari, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fournier, K. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fratanduono, D. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Frenje, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Huntington, C. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kraus, R. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lazicki, A. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Martinez, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McNaney, J. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Millot, M. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pak, A. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Park, H. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ping, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pollock, B. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, R. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wehrenberg, C. E. [Lawrence Livermore National Lab. 
(LLNL), Livermore, CA (United States); Widmann, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Collins, G. W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Landen, O. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsing, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-04

    In FY15, LLNL’s High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall these LLNL programs led 468 target shots in FY15, with 315 shots using just the OMEGA laser system, 145 shots using just the EP laser system, and 8 Joint shots using OMEGA and EP together. Approximately 25% of the total number of shots (56 OMEGA shots and 67 EP shots, including the 8 Joint shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 75% (267 OMEGA shots and 86 EP shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports.

  9. Challenges in biotechnology at LLNL: from genes to proteins; TOPICAL

    International Nuclear Information System (INIS)

    Albala, J S

    1999-01-01

    This effort has undertaken the task of developing a link between the genomics, DNA repair, and structural biology efforts within the Biology and Biotechnology Research Program (BBRP) at LLNL. Through the advent of the I.M.A.G.E. (Integrated Molecular Analysis of Genomes and their Expression) Consortium, a world-wide effort to catalog the largest public collection of genes, accepted and maintained within BBRP, it is now possible to systematically express the protein complement of these genes to further elucidate novel gene function and structure. The work has proceeded in four phases: (1) gene and system selection; (2) protein expression and purification; (3) structural analysis; and (4) biological integration. Proteins to be expressed have been those of high programmatic interest. This includes, in particular, proteins involved in the maintenance of genome integrity, particularly those involved in the repair of DNA damage, including ERCC1, ERCC4, XRCC2, XRCC3, XRCC9, HEX1, APN1, p53, RAD51B, RAD51C, and RAD51. Full-length cDNA cognates of selected genes were isolated and cloned into baculovirus-based expression vectors. The baculoviral expression system for protein over-expression is now well-established in the Albala laboratory. Procedures have been successfully optimized for full-length cDNA cloning into expression vectors for protein expression from recombinant constructs. This includes the reagents, cell lines, and techniques necessary for expression of recombinant baculoviral constructs in Spodoptera frugiperda (Sf9) cells. The laboratory has also generated a high-throughput baculoviral expression paradigm for large-scale expression and purification of human recombinant proteins amenable to automation.

  10. The new LLNL AMS sample changer

    International Nuclear Information System (INIS)

    Roberts, M.L.; Norman, P.J.; Garibaldi, J.L.; Hornady, R.S.

    1993-01-01

    The Center for Accelerator Mass Spectrometry at LLNL has installed a new 64-position AMS sample changer on our spectrometer. This new sample changer can be controlled manually by an operator or automatically by the AMS data acquisition computer. Automatic control of the sample changer by the data acquisition system is a necessary step towards unattended AMS operation in our laboratory. The sample changer uses a fiber optic shaft encoder for rough rotational indexing of the sample wheel and a series of sequenced pneumatic cylinders for final mechanical indexing of the wheel and insertion and retraction of samples. Transit time from sample to sample varies from 4 s to 19 s, depending on distance moved. Final sample location can be set to within 50 microns on the x and y axes and within 100 microns on the z axis. Changing sample wheels on the new sample changer is also easier and faster than was possible on our previous sample changer and does not require the use of any tools.

  11. Nuclear physics and heavy element research at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Stoyer, M A; Ahle, L E; Becker, J A; Bernstein, L A; Bleuel, D L; Burke, J T; Dashdorj, D; Henderson, R A; Hurst, A M; Kenneally, J M; Lesher, S R; Moody, K J; Nelson, S L; Norman, E B; Pedretti, M; Scielzo, N D; Shaughnessy, D A; Sheets, S A; Stoeffl, W; Stoyer, N J; Wiedeking, M; Wilk, P A; Wu, C Y

    2009-05-11

    This paper highlights some of the current basic nuclear physics research at Lawrence Livermore National Laboratory (LLNL). The work at LLNL concentrates on investigating nuclei at the extremes. The Experimental Nuclear Physics Group performs research to improve our understanding of nuclei, nuclear reactions, nuclear decay processes and nuclear astrophysics; an expertise utilized for important laboratory national security programs and for world-class peer-reviewed basic research.

  12. Development of positron diffraction and holography at LLNL

    International Nuclear Information System (INIS)

    Hamza, A.; Asoka-Kumar, P.; Stoeffl, W.; Howell, R.; Miller, D.; Denison, A.

    2003-01-01

    A low-energy positron diffraction and holography spectrometer is currently being constructed at the Lawrence Livermore National Laboratory (LLNL) to study surfaces and adsorbed structures. This instrument will operate in conjunction with the LLNL intense positron beam produced by the 100 MeV LINAC allowing data to be acquired in minutes rather than days. Positron diffraction possesses certain advantages over electron diffraction which are discussed. Details of the instrument based on that of low-energy electron diffraction are described

  13. LLNL/JNC repository collaboration interim progress report

    International Nuclear Information System (INIS)

    Bourcier, W.L.; Couch, R.G.; Gansemer, J.; Halsey, W.G.; Palmer, C.E.; Sinz, K.H.; Stout, R.B.; Wijesinghe, A.; Wolery, T.J.

    1999-01-01

    Under this Annex, a research program on near-field performance assessment related to the geological disposal of radioactive waste will be carried out at the Lawrence Livermore National Laboratory (LLNL) in close collaboration with the Power Reactor and Nuclear Fuel Development Corporation of Japan (PNC). This program will focus on activities that provide direct support for PNC's near-term and long-term needs and that will, in turn, utilize and further strengthen US capabilities for radioactive waste management. The work scope for two years will be designed based on PNC's priorities for its second progress report (the H12 report) of research and development for high-level radioactive waste disposal and on the interests and capabilities of LLNL. The work will focus on chemical modeling of the near-field environment and long-term mechanical modeling of the engineered barrier system as it evolves. Certain activities in this program will provide for a final iteration of analyses to provide additional technical basis prior to the year 2000, as determined in discussions with PNC's technical coordinator. The work for two years will include the following activities: Activity 1: Chemical Modeling of EBS Materials Interactions--Task 1.1 Chemical Modeling of Iron Effects on Borosilicate Glass Durability; and Task 1.2 Changes in Overpack and Bentonite Properties Due to Metal, Bentonite and Water Interactions. Activity 2: Thermodynamic Database Validation and Comparison--Task 2.1 Set up EQ3/6 to Run with the Pitzer-based PNC Thermodynamic Data Base; Task 2.2 Provide Expert Consultation on the Thermodynamic Data Base; and Task 2.3 Provide Analysis of Likely Solubility Controls on Selenium.
Activity 3: Engineered Barrier Performance Assessment of the Unsaturated, Oxidizing Transient--Task 3.1 Apply YMIM to PNC Transient EBS Performance; Task 3.2 Demonstrate Methods for Modeling the Return to Reducing Conditions; and Task 3.3 Evaluate the Potential for Stress Corrosion

  14. nGASP - the nematode genome annotation assessment project

    Energy Technology Data Exchange (ETDEWEB)

    Coghlan, A; Fiedler, T J; McKay, S J; Flicek, P; Harris, T W; Blasiar, D; Allen, J; Stein, L D

    2008-12-19

    While the C. elegans genome is extensively annotated, relatively little information is available for other Caenorhabditis species. The nematode genome annotation assessment project (nGASP) was launched to objectively assess the accuracy of protein-coding gene prediction software in C. elegans, and to apply this knowledge to the annotation of the genomes of four additional Caenorhabditis species and other nematodes. Seventeen groups worldwide participated in nGASP, and submitted 47 prediction sets for 10 Mb of the C. elegans genome. Predictions were compared to reference gene sets consisting of confirmed or manually curated gene models from WormBase. The most accurate gene-finders were 'combiner' algorithms, which made use of transcript- and protein-alignments and multi-genome alignments, as well as gene predictions from other gene-finders. Gene-finders that used alignments of ESTs, mRNAs and proteins came in second place. There was a tie for third place between gene-finders that used multi-genome alignments and ab initio gene-finders. The median gene-level sensitivity of combiners was 78% and their specificity was 42%, which is nearly the same accuracy as reported for combiners in the human genome. C. elegans genes with exons of unusual hexamer content, as well as those with many exons, short exons, long introns, a weak translation start signal, weak splice sites, or poorly conserved orthologs were the most challenging for gene-finders.
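    The gene-level metrics quoted in this record (sensitivity 78%, specificity 42% for combiners) have simple definitions: sensitivity is the fraction of reference genes predicted exactly, and specificity is the fraction of predictions that exactly match a reference model. A minimal sketch with hypothetical data (this is not the nGASP evaluation code; gene models are reduced to exon-coordinate tuples for illustration):

    ```python
    # Sketch of gene-level sensitivity/specificity, as used to score
    # gene-finders against a curated reference set (hypothetical data).

    def gene_level_accuracy(reference, predicted):
        """Both arguments are sets of gene models, represented here as
        hashable tuples of coordinates; a prediction counts as correct
        only if it matches a reference model exactly."""
        correct = reference & predicted
        sensitivity = len(correct) / len(reference)   # found / annotated
        specificity = len(correct) / len(predicted)   # correct / predicted
        return sensitivity, specificity

    # Toy example: 4 reference genes, 5 predictions, 3 exact matches.
    ref  = {(100, 200), (300, 450), (500, 620), (700, 900)}
    pred = {(100, 200), (300, 450), (500, 620), (710, 900), (950, 999)}
    sn, sp = gene_level_accuracy(ref, pred)
    print(sn, sp)  # 0.75 0.6
    ```

    With the toy data, 3 of 4 reference genes are recovered (sensitivity 0.75) and 3 of 5 predictions are correct (specificity 0.6), mirroring how the 78%/42% figures above are read.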

  15. DOE/LLNL verification symposium on technologies for monitoring nuclear tests related to weapons proliferation

    International Nuclear Information System (INIS)

    Nakanishi, K.K.

    1993-01-01

    The rapidly changing world situation has raised concerns regarding the proliferation of nuclear weapons and the ability to monitor a possible clandestine nuclear testing program. To address these issues, Lawrence Livermore National Laboratory's (LLNL) Treaty Verification Program sponsored a symposium funded by the US Department of Energy's (DOE) Office of Arms Control, Division of Systems and Technology. The DOE/LLNL Symposium on Technologies for Monitoring Nuclear Tests Related to Weapons Proliferation was held at the DOE's Nevada Operations Office in Las Vegas, May 6-7, 1992. This volume is a collection of several papers presented at the symposium. Several experts in monitoring technology presented invited talks assessing the status of monitoring technology, with emphasis on the deficient areas requiring more attention in the future. In addition, several speakers discussed proliferation monitoring technologies being developed by the DOE's weapons laboratories.

  16. LLNL/YMP Waste Container Fabrication and Closure Project

    International Nuclear Information System (INIS)

    1990-10-01

    The Department of Energy's Office of Civilian Radioactive Waste Management (OCRWM) Program is studying Yucca Mountain, Nevada as a suitable site for the first US high-level nuclear waste repository. Lawrence Livermore National Laboratory (LLNL) has the responsibility for designing and developing the waste package for the permanent storage of high-level nuclear waste. This report is a summary of the technical activities for the LLNL/YMP Nuclear Waste Disposal Container Fabrication and Closure Development Project. Candidate welding closure processes were identified in the Phase 1 report. This report discusses Phase 2. Phase 2 of this effort involved laboratory studies to determine the optimum fabrication and closure processes. Because of budget limitations, LLNL narrowed the materials for evaluation in Phase 2 from the original six to four: Alloy 825, CDA 715, CDA 102 (or CDA 122) and CDA 952. Phase 2 studies focused on evaluation of candidate material in conjunction with fabrication and closure processes

  17. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    International Nuclear Information System (INIS)

    Bergman, W.; Elliott, J.; Wilson, K.

    1995-01-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3% ± 1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system.
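    The significance judgment in this record can be illustrated by comparing each post-earthquake leak rate to the 1980-1993 baseline of 3.3% ± 1.7%: a rate more than about two standard deviations above the mean stands out. The report does not state which statistical test was actually used, so the sketch below is an illustrative assumption, not the authors' analysis:

    ```python
    # Compare post-earthquake HEPA leak rates to the 1980-1993 baseline
    # of 3.3% +/- 1.7% using a simple z-score; rates more than ~2 standard
    # deviations above the mean are flagged. Illustrative test only.

    def z_score(rate, mean=3.3, sd=1.7):
        return (rate - mean) / sd

    for year, rate in [(1980, 8.0), (1989, 4.1)]:
        z = z_score(rate)
        verdict = "significant" if z > 2 else "not significant"
        print(year, round(z, 2), verdict)
    # 1980 is ~2.76 sd above the mean; 1989 only ~0.47 sd above it,
    # consistent with the report's conclusion.
    ```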

  18. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Elliott, J.; Wilson, K. [Lawrence Livermore National Laboratory, CA (United States)

    1995-02-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3% ± 1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system.

  19. Diversification and strategic management of LLNL's R&D portfolio

    International Nuclear Information System (INIS)

    Glinsky, M.E.

    1994-12-01

    Strategic management of LLNL's research effort is addressed. A general framework is established by presenting the McKinsey/BCG Matrix Analysis as it applies to the research portfolio. The framework is used to establish the need for the diversification into new attractive areas of research and for the improvement of the market position of existing research in those attractive areas. With the need for such diversification established, attention is turned to optimizing it. There are limited resources available. It is concluded that LLNL should diversify into only a few areas and try to obtain full market share as soon as possible

  20. Thermochemical hydrogen production studies at LLNL: a status report

    International Nuclear Information System (INIS)

    Krikorian, O.H.

    1982-01-01

    Currently, studies are underway at the Lawrence Livermore National Laboratory (LLNL) on thermochemical hydrogen production based on magnetic fusion energy (MFE) and solar central receivers as heat sources. These areas of study were described earlier at the previous IEA Annex I Hydrogen Workshop (Juelich, West Germany, September 23-25, 1981), and a brief update will be given here. Some basic research has also been underway at LLNL on the electrolysis of water from fused phosphate salts, but there are no current results in that area, and the work is being terminated

  1. Spill exercise 1980: an LLNL emergency training exercise

    International Nuclear Information System (INIS)

    Morse, J.L.; Gibson, T.A.; Vance, W.F.

    1981-01-01

    An emergency training exercise at Lawrence Livermore National Laboratory (LLNL) demonstrated that off-hours emergency personnel can respond promptly and effectively to an emergency situation involving radiation, hazardous chemicals, and injured persons. The exercise simulated an explosion in a chemistry laboratory and a subsequent toxic-gas release.

  2. Capabilities required to conduct the LLNL plutonium mission

    International Nuclear Information System (INIS)

    Kass, J.; Bish, W.; Copeland, A.; West, J.; Sack, S.; Myers, B.

    1991-01-01

    This report outlines the LLNL plutonium-related mission anticipated over the next decade and defines the capabilities required to meet that mission wherever the Plutonium Facility is located. If plutonium work is relocated to a place where the facility is shared, then some capabilities can be commonly used by the sharing parties. However, it is essential that LLNL independently control about 20,000 sq ft of net lab space, filled with LLNL-controlled equipment and staffed by LLNL employees. It is estimated that the cost to construct this facility should range from $140M to $200M. Purchase and installation of equipment to replace that already in Bldg 332, along with additional equipment identified as being needed to meet the mission for the next ten to fifteen years, is estimated to cost $118M. About $29M of the equipment could be shared. The Hardened Engineering Test Building (HETB), with its additional 8,000 sq ft of unique test capability, must also be replaced. The fully equipped replacement cost is estimated to be about $10M. About 40,000 sq ft of setup and support space are needed, along with office and related facilities for a 130-person resident staff. The setup space is estimated to cost $8M. The annual cost of a 130-person resident staff (100 programmatic and 30 facility operation) is estimated to be $20M.
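    Summing the one-time cost estimates quoted above gives the rough scale of the initial investment. The abstract does not state a total; this sketch simply adds the quoted figures (in $M):

    ```python
    # Rough total of the one-time cost estimates quoted in the abstract
    # (in $M). The total is not stated in the source; it is computed
    # here for illustration only.
    construction = (140, 200)   # new Plutonium Facility, low/high estimate
    equipment    = 118          # replacement + additional equipment
    hetb         = 10           # Hardened Engineering Test Building
    setup        = 8            # setup and support space

    low  = construction[0] + equipment + hetb + setup
    high = construction[1] + equipment + hetb + setup
    print(f"initial investment: ${low}M-${high}M, plus $20M/yr staff")
    # → initial investment: $276M-$336M, plus $20M/yr staff
    ```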

  3. Proceedings of the LLNL Technical Women's Symposium

    Energy Technology Data Exchange (ETDEWEB)

    von Holtz, E. [ed.]

    1993-12-31

    This report documents events of the LLNL Technical Women's Symposium. Topics include: the future of computer systems, environmental technology, defense and space, Nova Inertial Confinement Fusion Target Physics, technical communication, tools and techniques for biology in the 1990s, automation and robotics, software applications, materials science, atomic vapor laser isotope separation, technology transfer, and professional development workshops.

  4. Proceedings of the LLNL Technical Women's Symposium

    Energy Technology Data Exchange (ETDEWEB)

    von Holtz, E. [ed.]

    1994-12-31

    Women from institutions such as LLNL, LBL, Sandia, and SLAC presented papers at this conference. The papers deal with many aspects of global security, global ecology, and bioscience; they also reflect the challenges faced in improving business practices, communicating effectively, and expanding collaborations in the industrial world. Approximately 87 abstracts are included in six sessions; more are included in the addendum.

  5. The design and implementation of the LLNL gigabit testbed

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, D. [Lawrence Livermore National Labs., CA (United States)

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of, and the need for, the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  6. LLNL X-ray Calibration and Standards Laboratory

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    The LLNL X-ray Calibration and Standards Laboratory is a unique facility for developing and calibrating x-ray sources, detectors, and materials, and for conducting x-ray physics research in support of our weapon and fusion-energy programs

  7. Experimental assessment of the accuracy of genomic selection in sugarcane.

    Science.gov (United States)

    Gouy, M; Rousselle, Y; Bastianelli, D; Lecomte, P; Bonnal, L; Roques, D; Efile, J-C; Rocher, S; Daugrois, J; Toubi, L; Nabeneza, S; Hervouet, C; Telismart, H; Denis, M; Thong-Chane, A; Glaszmann, J C; Hoarau, J-Y; Nibouche, S; Costet, L

    2013-10-01

    Sugarcane cultivars are interspecific hybrids with an aneuploid, highly heterozygous polyploid genome. The complexity of the sugarcane genome is the main obstacle to the use of marker-assisted selection in sugarcane breeding. Given the promising results of recent studies of plant genomic selection, we explored the feasibility of genomic selection in this complex polyploid crop. Genetic values were predicted in two independent panels, each composed of 167 accessions representing sugarcane genetic diversity worldwide. Accessions were genotyped with 1,499 DArT markers. One panel was phenotyped in Reunion Island and the other in Guadeloupe. Ten traits concerning sugar and bagasse contents, digestibility and composition of the bagasse, plant morphology, and disease resistance were used. We used four statistical predictive models: bayesian LASSO, ridge regression, reproducing kernel Hilbert space, and partial least square regression. The accuracy of the predictions was assessed through the correlation between observed and predicted genetic values by cross validation within each panel and between the two panels. We observed equivalent accuracy among the four predictive models for a given trait, and marked differences were observed among traits. Depending on the trait concerned, within-panel cross validation yielded median correlations ranging from 0.29 to 0.62 in the Reunion Island panel and from 0.11 to 0.5 in the Guadeloupe panel. Cross validation between panels yielded correlations ranging from 0.13 for smut resistance to 0.55 for brix. This level of correlations is promising for future implementations. Our results provide the first validation of genomic selection in sugarcane.
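    Of the four predictive models compared, ridge regression is the simplest to sketch. The snippet below uses synthetic stand-in data for the real panels (167 accessions, 1,499 DArT markers) and is not the authors' code; it shows the core of genomic prediction as described above: estimate marker effects on a training set, predict genetic values for held-out accessions, and score accuracy as the correlation between observed and predicted values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for a DArT marker matrix: 167 accessions x 1499
    # binary markers, with a trait controlled by a subset of markers.
    n, p = 167, 1499
    X = rng.integers(0, 2, size=(n, p)).astype(float)
    beta = np.zeros(p)
    beta[:50] = rng.normal(size=50)          # 50 causal markers
    y = X @ beta + rng.normal(scale=2.0, size=n)

    # Split into a training panel and a validation panel.
    train, test = np.arange(120), np.arange(120, n)

    # Ridge regression: solve (X'X + lambda*I) b = X'y on the training set.
    lam = 100.0
    Xt = X[train]
    b = np.linalg.solve(Xt.T @ Xt + lam * np.eye(p), Xt.T @ y[train])

    # Accuracy = correlation between observed and predicted genetic values.
    pred = X[test] @ b
    accuracy = np.corrcoef(y[test], pred)[0, 1]
    print(round(accuracy, 2))
    ```

    Cross-validation as used in the study repeats this split many times and reports the median correlation; the 0.11-0.62 range quoted above is the trait-dependent spread of that statistic.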

  8. The impact of the human genome project on risk assessment

    International Nuclear Information System (INIS)

    Katarzyna Doerffer; Paul Unrau.

    1996-01-01

    The radiation protection approach to risk assessment assumes that cancer induction following radiation exposure is purely random. Present risk assessment methods derive risk from cancer incidence frequencies in exposed populations and associate disease outcomes totally with the level of exposure to ionizing radiation. Exposure defines a risk factor that affects the probability of the disease outcome. But cancer risk can be affected by other risk factors, such as underlying genetic factors (predisposition) of the exposed organism. These genetic risk factors are now becoming available for incorporation into ionizing radiation risk assessment. Progress in the Human Genome Project (HGP) will lead to direct assays to measure the effects of genetic risk determinants in disease outcomes. When all genetic risk determinants are known and incorporated into risk assessment, it will be possible to reevaluate the role of ionizing radiation in the causation of cancer. (author)

  9. Collembase: a repository for springtail genomics and soil quality assessment

    Directory of Open Access Journals (Sweden)

    Klein-Lankhorst Rene M

    2007-09-01

    Abstract. Background: Environmental quality assessment is traditionally based on responses of reproduction and survival of indicator organisms. For soil assessment the springtail Folsomia candida (Collembola) is an accepted standard test organism. We argue that environmental quality assessment using gene expression profiles of indicator organisms exposed to test substrates is more sensitive, more toxicant specific and significantly faster than current risk assessment methods. To apply this species as a genomic model for soil quality testing we conducted an EST sequencing project and developed an online database. Description: Collembase is a web-accessible database comprising springtail (F. candida) genomic data. Presently, the database contains information on 8686 ESTs that are assembled into 5952 unique gene objects. Of those gene objects ~40% showed homology to other protein sequences available in GenBank (blastx analysis; non-redundant (nr) database; expect-value < e-5). Software was applied to infer protein sequences. The putative peptides, which had an average length of 115 amino acids (ranging between 23 and 440), were annotated with Gene Ontology (GO) terms. In total 1025 peptides (~17% of the gene objects) were assigned at least one GO term (expect-value < e-25). Within Collembase searches can be conducted based on BLAST and GO annotation, cluster name or using a BLAST server. The system furthermore enables easy sequence retrieval for functional genomic and Quantitative-PCR experiments. Sequences are submitted to GenBank (Accession numbers: EV473060 – EV481745). Conclusion: Collembase http://www.collembase.org is a resource of sequence data on the springtail F. candida. The information within the database will be linked to a custom-made microarray, based on the Agilent platform, which can be applied for soil quality testing. In addition, Collembase supplies information that is valuable for related scientific disciplines such as molecular ecology

  10. Genome Assembly Forensics: Metrics for Assessing Assembly Correctness (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Pop, Mihai

    2011-10-13

    University of Maryland's Mihai Pop on Genome Assembly Forensics: Metrics for Assessing Assembly Correctness at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  11. Joint research and development on toxic-material emergency response between ENEA and LLNL. 1982 progress report

    International Nuclear Information System (INIS)

    Gudiksen, P.; Lange, R.; Dickerson, M.; Sullivan, T.; Rosen, L.; Walker, H.; Boeri, G.B.; Caracciolo, R.; Fiorenza, R.

    1982-11-01

    A summary is presented of current and future cooperative studies between ENEA and LLNL researchers designed to develop improved real-time emergency response capabilities for assessing the environmental consequences resulting from an accidental release of toxic materials into the atmosphere. These studies include development and evaluation of atmospheric transport and dispersion models, interfacing of data processing and communications systems, supporting meteorological field experiments, and integration of radiological measurements and model results into real-time assessments

  12. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  13. Hazardous-waste analysis plan for LLNL operations

    International Nuclear Information System (INIS)

    Roberts, R.S.

    1982-01-01

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  14. Lawrence Livermore National Laboratory (LLNL) Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    Heckman, R.A.; Tang, W.R.

    1989-01-01

    This Program Plan document describes the background of the waste minimization field at Lawrence Livermore National Laboratory (LLNL) and refers to the significant studies that have influenced legislative efforts at both the federal and state levels. A short history of formal LLNL waste minimization efforts is provided. Also included are general findings from analysis of work to date, with emphasis on source reduction findings. A short summary is provided of current regulations and probable future legislation that may affect waste minimization methodology. The LLNL Waste Minimization Program Plan is designed to be dynamic and flexible, so as to meet current regulations while remaining able to respond to an ever-changing regulatory environment. 19 refs., 12 figs., 8 tabs

  15. Seismic evaluation of the LLNL plutonium facility (Building 332)

    International Nuclear Information System (INIS)

    Hall, W.J.; Sozen, M.A.

    1982-03-01

    The expected performance of the Lawrence Livermore National Laboratory (LLNL) Plutonium Facility (Building 332) subjected to earthquake ground motion has been evaluated. Anticipated behavior of the building, glove boxes, ventilation system and other systems critical for containment of plutonium is described for three severe postulated earthquake excitations. Based upon this evaluation, some damage to the building, glove boxes and ventilation system would be expected but no collapse of any structure is anticipated as a result of the postulated earthquake ground motions

  16. Probabilistic Seismic Hazards Update for LLNL: PSHA Results Report

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Alfredo [Fugro Consultants, Inc., Houston, TX (United States); Altekruse, Jason [Fugro Consultants, Inc., Houston, TX (United States); Menchawi, Osman El [Fugro Consultants, Inc., Houston, TX (United States)

    2016-03-11

    This report presents the Probabilistic Seismic Hazard Analysis (PSHA) performed for Building 332 at the Lawrence Livermore National Laboratory (LLNL), near Livermore, CA, by Fugro Consultants, Inc. (FCL). This report is specific to Building 332 only and not to other portions of the Laboratory. The study performed for the LLNL site includes a comprehensive review of recent information relevant to the LLNL regional tectonic setting and regional seismic sources in the vicinity of the site, and development of seismic wave transmission characteristics. The Seismic Source Characterization (SSC), documented in Project Report No. 2259-PR-02 (FCL, 2015a), and Ground Motion Characterization (GMC), documented in Project Report No. 2259-PR-06 (FCL, 2015c), were developed in accordance with ANSI/ANS-2.29-2008 Level 2 PSHA guidelines. The ANSI/ANS-2.29-2008 Level 2 PSHA framework is documented in Project Report No. 2259-PR-05 (FCL, 2016a). The Hazard Input Document (HID) for input into the PSHA developed from the SSC is presented in Project Report No. 2259-PR-04 (FCL, 2016b). The site characterization used as input for development of the idealized site profiles, including epistemic uncertainty and aleatory variability, is presented in Project Report No. 2259-PR-03 (FCL, 2015b).

  17. GAMA-LLNL Alpine Basin Special Study: Scope of Work

    Energy Technology Data Exchange (ETDEWEB)

    Singleton, M J; Visser, A; Esser, B K; Moran, J E

    2011-12-12

    For this task, LLNL will examine the vulnerability of drinking water supplies in foothill and higher-elevation areas to climate change impacts on recharge. Recharge locations and vulnerability will be determined through examination of groundwater ages and noble gas recharge temperatures in high-elevation basins. LLNL will determine whether short residence times are common in one or more subalpine basins. LLNL will measure groundwater ages, recharge temperatures, hydrogen and oxygen isotopes, major anions, and carbon isotope compositions on up to 60 samples from monitoring wells and production wells in these basins. In addition, a small number of carbon isotope analyses will be performed on surface water samples. The deliverable for this task will be a technical report that provides the measured data and an interpretation of the data from one or more subalpine basins. Data interpretation will: (1) consider climate change impacts to recharge and their effect on water quality; (2) determine primary recharge locations and their vulnerability to climate change; and (3) delineate the most vulnerable areas and describe the likely impacts to recharge.

  18. Assessing the origin of species in the genomic era

    OpenAIRE

    Moyle, Leonie C

    2005-01-01

    Advances in genomics have rapidly accelerated research into the genetics of species differences, reproductive isolating barriers, and hybrid incompatibility. Recent genomic analyses in Drosophila species suggest that modified olfactory cues are involved in discrimination that is reinforced by natural selection.

  19. Assessment of genomic relationship between Oryza sativa and ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-03-01

    Mar 1, 2010 ... For genomic in situ hybridization, genomic DNA from O. australiensis was used as probe for the mitotic and meiotic ... Wide hybridization is one of the plant breeding approaches ..... Disease and insect resistance in rice.

  20. CheckM: assessing the quality of microbial genomes recovered from isolates, single cells, and metagenomes

    Science.gov (United States)

    Parks, Donovan H.; Imelfort, Michael; Skennerton, Connor T.; Hugenholtz, Philip; Tyson, Gene W.

    2015-01-01

    Large-scale recovery of genomes from isolates, single cells, and metagenomic data has been made possible by advances in computational methods and substantial reductions in sequencing costs. Although this increasing breadth of draft genomes is providing key information regarding the evolutionary and functional diversity of microbial life, it has become impractical to finish all available reference genomes. Making robust biological inferences from draft genomes requires accurate estimates of their completeness and contamination. Current methods for assessing genome quality are ad hoc and generally make use of a limited number of “marker” genes conserved across all bacterial or archaeal genomes. Here we introduce CheckM, an automated method for assessing the quality of a genome using a broader set of marker genes specific to the position of a genome within a reference genome tree and information about the collocation of these genes. We demonstrate the effectiveness of CheckM using synthetic data and a wide range of isolate-, single-cell-, and metagenome-derived genomes. CheckM is shown to provide accurate estimates of genome completeness and contamination and to outperform existing approaches. Using CheckM, we identify a diverse range of errors currently impacting publicly available isolate genomes and demonstrate that genomes obtained from single cells and metagenomic data vary substantially in quality. In order to facilitate the use of draft genomes, we propose an objective measure of genome quality that can be used to select genomes suitable for specific gene- and genome-centric analyses of microbial communities. PMID:25977477
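
    The marker-gene logic described above can be illustrated with a minimal sketch: completeness as the fraction of expected single-copy marker sets found at least once, and contamination as the rate of surplus copies. This is a simplified illustration on hypothetical counts, not the actual CheckM implementation (which uses lineage-specific marker sets and gene collocation information).

    ```python
    # Illustrative marker-set bookkeeping in the spirit of CheckM.
    # completeness: fraction of expected marker sets observed at least once.
    # contamination: extra marker copies per expected marker set.
    # The copy numbers below are hypothetical.

    def genome_quality(marker_counts):
        """marker_counts: one copy number per expected single-copy marker set."""
        n = len(marker_counts)
        completeness = 100.0 * sum(1 for c in marker_counts if c >= 1) / n
        contamination = 100.0 * sum(c - 1 for c in marker_counts if c > 1) / n
        return completeness, contamination

    comp, cont = genome_quality([1, 1, 2, 0, 1, 1, 1, 3, 1, 1])
    print(f"completeness = {comp:.1f}%, contamination = {cont:.1f}%")
    ```

    For the counts above this reports 90% completeness (nine of ten marker sets present) and 30% contamination (three surplus copies across ten sets), the kind of estimate used to decide whether a draft genome is suitable for downstream analyses.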

  1. Global assessment of genomic variation in cattle by genome resequencing and high-throughput genotyping

    DEFF Research Database (Denmark)

    Zhan, Bujie; Fadista, João; Thomsen, Bo

    2011-01-01

    Background Integration of genomic variation with phenotypic information is an effective approach for uncovering genotype-phenotype associations. This requires an accurate identification of the different types of variation in individual genomes. Results We report the integration of the whole genome...... of split-read and read-pair approaches proved to be complementary in finding different signatures. CNVs were identified on the basis of the depth of sequenced reads, and by using SNP and CGH arrays. Conclusions Our results provide high resolution mapping of diverse classes of genomic variation...

  2. Description and application of the AERIN Code at LLNL

    International Nuclear Information System (INIS)

    King, W.C.

    1986-01-01

    The AERIN code was written at the Lawrence Livermore National Laboratory in 1976 to compute the organ burdens and absorbed doses resulting from a chronic or acute inhalation of transuranic isotopes. The code was revised in 1982 to reflect the concepts of ICRP-30. This paper describes the AERIN code and how it has been used at LLNL to study more than 80 cases of internal deposition and obtain estimates of internal dose. The computed committed organ doses are compared with ICRP-30 values. The benefits of using the code are described. 3 refs., 3 figs., 6 tabs

  3. Final report on the LLNL compact torus acceleration project

    International Nuclear Information System (INIS)

    Eddleman, J.; Hammer, J.; Hartman, C.; McLean, H.; Molvik, A.

    1995-01-01

    In this report, we summarize recent work at LLNL on the compact torus (CT) acceleration project. The CT accelerator is a novel technique for projecting plasmas to high velocities and reaching high energy density states. The accelerator exploits magnetic confinement in the CT to stably transport plasma over large distances and to directed kinetic energies large in comparison with the CT internal and magnetic energy. Applications range from heating and fueling magnetic fusion devices, generation of intense pulses of x-rays or neutrons for weapons effects and high energy-density fusion concepts

  4. Report on the B-Fields at NIF Workshop Held at LLNL October 12-13, 2015

    International Nuclear Information System (INIS)

    Fournier, K. B.; Moody, J. D.

    2015-01-01

    A national ICF laboratory workshop on requirements for a magnetized target capability on NIF was held by NIF at LLNL on October 12 and 13, attended by experts from LLNL, SNL, LLE, LANL, GA, and NRL. Advocates for indirect drive (LLNL), magnetic (Z) drive (SNL), polar direct drive (LLE), and basic science needing applied B fields (many institutions) presented and discussed requirements for the magnetized target capabilities they would like to see. A 30 T capability was most frequently requested. A phased operation, increasing the field in steps experimentally, can be envisioned. The NIF management will take the inputs from the scientific community represented at the workshop and recommend pulse-powered magnet parameters for NIF that best meet the collective user requests. In parallel, LLNL will continue investigating magnets for future generations that might be powered by compact laser-B-field generators (Moody, Fujioka, Santos, Woolsey, Pollock). The NIF facility engineers will start to analyze compatibility of the recommended pulsed magnet parameters (size, field, rise time, materials) with NIF chamber constraints, diagnostic access, and final optics protection against debris in FY16. The objective of this assessment will be to develop a schedule for achieving an initial B-field capability. Based on an initial assessment, room-temperature magnetized gas capsules will be fielded on NIF first. Magnetized cryo-ice-layered targets will take longer (more compatibility issues). Magnetized wetted-foam DT targets (Olson) may have somewhat fewer compatibility issues, making them a more likely choice for the first cryo-ice-layered target fielded with an applied Bz.

  5. A Novel Approach to Semantic and Coreference Annotation at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Firpo, M

    2005-02-04

    A case is made for the importance of high-quality semantic and coreference annotation. The challenges of providing such annotation are described. Asperger's Syndrome is introduced, and connections are drawn between the needs of text annotation and the abilities of persons with Asperger's Syndrome to meet those needs. Finally, a pilot program is recommended wherein semantic annotation is performed by people with Asperger's Syndrome. The primary points embodied in this paper are as follows: (1) Document annotation is essential to the Natural Language Processing (NLP) projects at Lawrence Livermore National Laboratory (LLNL); (2) LLNL does not currently have a system in place to meet its need for text annotation; (3) Text annotation is challenging for a variety of reasons, many related to its very rote nature; (4) Persons with Asperger's Syndrome are particularly skilled at rote verbal tasks, and behavioral experts agree that they would excel at text annotation; and (5) A pilot study is recommended in which two to three people with Asperger's Syndrome annotate documents and then the quality and throughput of their work is evaluated relative to that of their neuro-typical peers.

  6. LLNL (Lawrence Livermore National Laboratory) research on cold fusion

    Energy Technology Data Exchange (ETDEWEB)

    Thomassen, K I; Holzrichter, J F [eds.]

    1989-09-14

    With the appearance of reports on "Cold Fusion," scientists at the Lawrence Livermore National Laboratory (LLNL) began a series of increasingly sophisticated experiments and calculations to explain these phenomena. These experiments can be categorized as follows: (a) simple experiments to replicate the Utah results, (b) more sophisticated experiments to place lower bounds on the generation of heat and production of nuclear products, (c) a collaboration with Texas A&M University to analyze electrodes and electrolytes for fusion by-products in a cell producing 10% excess heat (we found no by-products), and (d) attempts to replicate the Frascati experiment that first found neutron bursts when high-pressure deuterium gas in a cylinder with Ti chips was temperature-cycled. We failed in categories (a) and (b) to replicate either the Pons/Fleischmann or the Jones phenomena. We have seen phenomena similar to the Frascati results (d), but these low-level burst signals may not be coming from neutrons generated in the Ti chips. Summaries of our experiments are described in Section II, as is a theoretical effort based on cosmic ray muons to describe low-level neutron production. Details of the experimental groups' work are contained in the six appendices. At LLNL, independent teams were spontaneously formed in response to the early announcements on cold fusion. This report's format follows this organization.

  7. An Assessment of Different Genomic Approaches for Inferring Phylogeny of Listeria monocytogenes

    DEFF Research Database (Denmark)

    Henri, Clementine; Leekitcharoenphon, Pimlapas; Carleton, Heather A.

    2017-01-01

    Background/objectives: Whole genome sequencing (WGS) has proven to be a powerful subtyping tool for foodborne pathogenic bacteria like L. monocytogenes. The interests of genome-scale analysis for national surveillance, outbreak detection or source tracking has been largely documented. The genomic......MLPPST) or pan genome (wgMLPPST). Currently, there are little comparisons studies of these different analytical approaches. Our objective was to assess and compare different genomic methods that can be implemented in order to cluster isolates of L monocytogenes.Methods: The clustering methods were evaluated...... on a collection of 207 L. monocytogenes genomes of food origin representative of the genetic diversity of the Anses collection. The trees were then compared using robust statistical analyses.Results: The backward comparability between conventional typing methods and genomic methods revealed a near...

  8. Genomes

    National Research Council Canada - National Science Library

    Brown, T. A. (Terence A.)

    2002-01-01

    ... of genome expression and replication processes, and transcriptomics and proteomics. This text is richly illustrated with clear, easy-to-follow, full color diagrams, which are downloadable from the book's website...

  9. The Contribution of Health Technology Assessment, Health Needs Assessment, and Health Impact Assessment to the Assessment and Translation of Technologies in the Field of Public Health Genomics

    DEFF Research Database (Denmark)

    Rosenkotter, N.; Vondeling, H.; Blancquaert, I.

    2011-01-01

    contribute to the systematic translation and assessment of genomic health applications by focussing at population level and on public health policy making. It is shown to what extent HTA, HNA and HIA contribute to translational research by using the continuum of translational research (T1-T4) in genomic...... into the impact on public health and health care practice of those technologies that are actually introduced. This paper aims to give an overview of the major assessment instruments in public health [ health technology assessment (HTA), health needs assessment (HNA) and health impact assessment (HIA)] which could...... medicine as an analytic framework. The selected assessment methodologies predominantly cover 2 to 4 phases within the T1-T4 system. HTA delivers the most complete set of methodologies when assessing health applications. HNA can be used to prioritize areas where genomic health applications are needed...

  10. Genotyping-by-sequencing for Populus population genomics: an assessment of genome sampling patterns and filtering approaches.

    Directory of Open Access Journals (Sweden)

    Martin P Schilling

    Continuing advances in nucleotide sequencing technology are inspiring a suite of genomic approaches in studies of natural populations. Researchers are faced with data management and analytical scales that are increasing by orders of magnitude. With such dramatic advances comes a need to understand biases and error rates, which can be propagated and magnified in large-scale data acquisition and processing. Here we assess genomic sampling biases and the effects of various population-level data filtering strategies in a genotyping-by-sequencing (GBS) protocol. We focus on data from two species of Populus, because this genus has a relatively small genome and is emerging as a target for population genomic studies. We estimate the proportions and patterns of genomic sampling by examining the Populus trichocarpa genome (Nisqually-1), and demonstrate a pronounced bias towards coding regions when using the methylation-sensitive ApeKI restriction enzyme in this species. Using population-level data from a closely related species (P. tremuloides), we also investigate various approaches for filtering GBS data to retain high-depth, informative SNPs that can be used for population genetic analyses. We find that a data filter that includes the designation of ambiguous alleles results in metrics of population structure and Hardy–Weinberg equilibrium that are most consistent with previous studies of the same populations based on other genetic markers. Analyses of the filtered data (27,910 SNPs) also resulted in patterns of heterozygosity and population structure similar to a previous study using microsatellites. Our application demonstrates that technically and analytically simple approaches can readily be developed for population genomics of natural populations.
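
    The population-level depth filtering described above can be sketched as follows; the `min_depth` and `min_fraction` thresholds and the input format are hypothetical, not the study's actual pipeline.

    ```python
    # Sketch of a population-level depth filter for GBS-derived SNPs:
    # keep a site only if enough samples reach a minimum read depth.
    # Thresholds and the data structure are illustrative.

    def filter_snps(sites, min_depth=8, min_fraction=0.8):
        """sites: list of {'id': str, 'depths': per-sample read depths}."""
        kept = []
        for site in sites:
            depths = site["depths"]
            covered = sum(1 for d in depths if d >= min_depth)
            if covered / len(depths) >= min_fraction:
                kept.append(site)
        return kept

    sites = [
        {"id": "snp1", "depths": [12, 9, 15, 8]},  # well covered in all samples
        {"id": "snp2", "depths": [3, 20, 2, 1]},   # low depth in most samples
    ]
    print([s["id"] for s in filter_snps(sites)])
    ```

    Only snp1 survives this filter: all four samples meet the depth cutoff, whereas snp2 is adequately covered in a single sample. Retaining only such high-depth sites is what makes downstream estimates of heterozygosity and population structure trustworthy.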

  11. OMICRON, LLNL ENDL Charged Particle Data Library Processing

    International Nuclear Information System (INIS)

    Mengoni, A.; Panini, G.C.

    2002-01-01

    1 - Description of program or function: The program has been designed to read the Evaluated Charged Particle Library (ECPL) of the LLNL Evaluated Nuclear Data Library (ENDL) and generate output in various forms: interpreted listing, ENDF format, and graphs. 2 - Method of solution: A file containing ECPL in card-image transmittal format is scanned to retrieve the requested reactions from the requested materials; in addition, selections can be made by data type or incident particle. 3 - Restrictions on the complexity of the problem: The reaction property designator I determines the type of data in the ENDL library (e.g. cross sections, angular distributions, Maxwellian averages, etc.); the program does not take into account the data for I=3,4 (energy-angle distributions) since there are no such data in the current ECPL version

  12. Release isentrope measurements with the LLNL electric gun

    Energy Technology Data Exchange (ETDEWEB)

    Gathers, G.R.; Osher, J.E.; Chau, H.H.; Weingart, R.C.; Lee, C.G.; Diaz, E.

    1987-06-01

    The liquid-vapor coexistence boundary is not well known for most metals because the extreme conditions near the critical point create severe experimental difficulties. The isentropes passing through the liquid-vapor region typically begin from rather large pressures on the Hugoniot. We are attempting to use the high velocities achievable with the Lawrence Livermore National Laboratory (LLNL) electric gun to obtain these extreme states in aluminum and to measure the release isentropes by releasing into a series of calibrated standards with known Hugoniots. To achieve the large pressure drops needed to explore the liquid-vapor region, we use argon gas, for which Hugoniots have been calculated using the ACTEX code, as one of the release materials.

  13. Results of LLNL investigation of NYCT data sets

    International Nuclear Information System (INIS)

    Sale, K; Harrison, M; Guo, M; Groza, M

    2007-01-01

    Upon examination, we have concluded that none of the alarms indicate the presence of a real threat. A brief history and results from our examination of the NYCT ASP occupancy data sets, dated from 2007-05-14 19:11:07 to 2007-06-20 15:46:15, are presented in this letter report. When the ASP data collection campaign at NYCT was completed, rather than being shut down, the Canberra ASP annunciator box was unplugged, leaving the data acquisition system running. By the time it was discovered that the ASP was still acquiring data, about 15,000 occupancies had been recorded. Among these were about 500 alarms (classified by the ASP analysis system as either Threat Alarms or Suspect Alarms). At your request, these alarms have been investigated. Our conclusion is that none of the alarm data sets indicate the presence of a real threat (within statistics). The data sets (ICD1 and ICD2 files with concurrent JPEG pictures) were delivered to LLNL on a removable hard drive labeled FOUO. The contents of the data disk amounted to 53.39 GB, requiring over two days for the standard LLNL virus-checking software to scan before work could begin. Our first step was to walk through the directory structure of the disk and create a database of occupancies. For each occupancy, the database was populated with the occupancy date and time, occupancy number, file path to the ICD1 data, and the alarm classification ('No Alarm', 'Suspect Alarm' or 'Threat Alarm') from the ICD2 file, along with some other incidental data. To gain a global understanding of the data, we investigated the occupancy information. The occupancy date/time and alarm type were binned into one-hour counts. These data are shown in Figures 1 and 2.
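
    The hourly binning step described above can be sketched with the Python standard library; the timestamps and alarm labels below are invented for illustration.

    ```python
    # Sketch: bin occupancy records into one-hour counts by alarm type,
    # as described for the NYCT occupancy review. Data are invented.
    from collections import Counter
    from datetime import datetime

    occupancies = [
        ("2007-05-14 19:11:07", "No Alarm"),
        ("2007-05-14 19:40:02", "Suspect Alarm"),
        ("2007-05-14 20:05:33", "No Alarm"),
    ]

    counts = Counter()
    for stamp, alarm in occupancies:
        t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
        counts[(t.strftime("%Y-%m-%d %H:00"), alarm)] += 1  # truncate to the hour

    for (hour, alarm), n in sorted(counts.items()):
        print(hour, alarm, n)
    ```

    Plotting such hourly counts per alarm class is what makes traffic patterns and alarm clusters visible at a glance, as in the figures the report refers to.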

  14. Assessment of whole genome amplification-induced bias through high-throughput, massively parallel whole genome sequencing

    Directory of Open Access Journals (Sweden)

    Plant Ramona N

    2006-08-01

    Background: Whole genome amplification is an increasingly common technique through which minute amounts of DNA can be multiplied to generate quantities suitable for genetic testing and analysis. Questions of amplification-induced error and template bias generated by these methods have previously been addressed through either small-scale (SNP) or large-scale (CGH array, FISH) methodologies. Here we utilized whole genome sequencing to assess amplification-induced bias in both coding and non-coding regions of two bacterial genomes. DNA from Halobacterium species NRC-1 and Campylobacter jejuni was amplified by several common, commercially available protocols: multiple displacement amplification, primer extension pre-amplification, and degenerate oligonucleotide primed PCR. The amplification-induced bias of each method was assessed by sequencing both genomes in their entirety using the 454 Sequencing System technology and comparing the results with those obtained from unamplified controls. Results: All amplification methodologies induced statistically significant bias relative to the unamplified control. For the Halobacterium species NRC-1 genome, assessed at 100-base resolution, the D-statistics from GenomiPhi-amplified material were 119 times greater than those from unamplified material, 164.0 times greater for Repli-G, 165.0 times greater for PEP-PCR, and 252.0 times greater than the unamplified controls for DOP-PCR. For Campylobacter jejuni, also analyzed at 100-base resolution, the D-statistics from GenomiPhi-amplified material were 15 times greater than those from unamplified material, 19.8 times greater for Repli-G, 61.8 times greater for PEP-PCR, and 220.5 times greater than the unamplified controls for DOP-PCR. Conclusion: Of the amplification methodologies examined in this paper, multiple displacement amplification generated the least bias and produced significantly higher yields of amplified DNA.
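
    A distance statistic between amplified and unamplified coverage distributions, in the spirit of the D-statistics reported above, can be sketched with a two-sample Kolmogorov-Smirnov D on per-window read depths. The data here are synthetic, and the study's exact statistic may differ.

    ```python
    # Sketch: quantify coverage bias with a two-sample Kolmogorov-Smirnov D
    # between per-window read-depth distributions (synthetic data).
    import bisect
    import random

    def ks_d(a, b):
        """Maximum gap between the empirical CDFs of samples a and b."""
        a, b = sorted(a), sorted(b)
        def ecdf(xs, v):
            return bisect.bisect_right(xs, v) / len(xs)  # fraction of xs <= v
        return max(abs(ecdf(a, v) - ecdf(b, v)) for v in a + b)

    random.seed(0)
    unamplified = [random.gauss(30, 5) for _ in range(1000)]  # fairly even coverage
    amplified = [random.gauss(30, 15) for _ in range(1000)]   # over/under-represented windows
    print(f"D = {ks_d(unamplified, amplified):.3f}")
    ```

    Two identical samples give D = 0; a broader, distorted coverage distribution in the amplified library pushes D upward, which is the sense in which larger D-statistics in the study indicate stronger amplification-induced bias.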

  15. Assessing Predictive Properties of Genome-Wide Selection in Soybeans

    Directory of Open Access Journals (Sweden)

    Alencar Xavier

    2016-08-01

    Many economically important traits in plant breeding have low heritability or are difficult to measure. For these traits, genomic selection has attractive features and may boost genetic gains. Our goal was to evaluate alternative scenarios for implementing genomic selection for yield components in soybean (Glycine max L. Merr.). We used a nested association panel with cross-validation to evaluate the impacts of training population size, genotyping density, and prediction model on the accuracy of genomic prediction. Our results indicate that training population size was the factor most relevant to improvement in genome-wide prediction, with the greatest improvement observed in training sets of up to 2000 individuals. We discuss assumptions that influence the choice of the prediction model. Although alternative models had minor impacts on prediction accuracy, the most robust prediction model was the combination of reproducing kernel Hilbert space regression and BayesB. Higher genotyping density marginally improved accuracy. Our study finds that breeding programs seeking efficient genomic selection in soybeans would best allocate resources by investing in a representative training set.

  16. Assessing Predictive Properties of Genome-Wide Selection in Soybeans.

    Science.gov (United States)

    Xavier, Alencar; Muir, William M; Rainey, Katy Martin

    2016-08-09

    Many economically important traits in plant breeding have low heritability or are difficult to measure. For these traits, genomic selection has attractive features and may boost genetic gains. Our goal was to evaluate alternative scenarios for implementing genomic selection for yield components in soybean (Glycine max L. Merr.). We used a nested association panel with cross-validation to evaluate the impacts of training population size, genotyping density, and prediction model on the accuracy of genomic prediction. Our results indicate that training population size was the factor most relevant to improvement in genome-wide prediction, with the greatest improvement observed in training sets of up to 2000 individuals. We discuss assumptions that influence the choice of the prediction model. Although alternative models had minor impacts on prediction accuracy, the most robust prediction model was the combination of reproducing kernel Hilbert space regression and BayesB. Higher genotyping density marginally improved accuracy. Our study finds that breeding programs seeking efficient genomic selection in soybeans would best allocate resources by investing in a representative training set. Copyright © 2016 Xavier et al.
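
    The cross-validation scheme this abstract describes (fit a marker-based prediction model, then correlate predicted with observed phenotypes in held-out individuals) can be sketched with ridge regression as a stand-in for the GBLUP/BayesB-type models the study compares. All data below are synthetic.

    ```python
    # Sketch: 5-fold cross-validated genomic prediction on synthetic marker data,
    # using ridge regression as a stand-in for GBLUP/BayesB-type models.
    import numpy as np

    rng = np.random.default_rng(42)
    n, m = 200, 500                                      # individuals, markers
    X = rng.integers(0, 3, size=(n, m)).astype(float)    # 0/1/2 genotype calls
    true_effects = rng.normal(0.0, 0.1, size=m)
    y = X @ true_effects + rng.normal(0.0, 1.0, size=n)  # phenotype = signal + noise

    def ridge_fit(X, y, lam=10.0):
        """Solve (X'X + lam*I) b = X'y for the marker effects b."""
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    idx = rng.permutation(n)
    folds = np.array_split(idx, 5)
    accs = []
    for k in range(5):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(5) if j != k])
        b = ridge_fit(X[train], y[train])
        accs.append(np.corrcoef(X[test] @ b, y[test])[0, 1])  # predictive accuracy r
    print(f"mean predictive accuracy r = {np.mean(accs):.2f}")
    ```

    Accuracy here is the correlation between predicted and observed phenotypes in held-out folds, a common criterion in genomic selection; enlarging `n` relative to `m` in this sketch illustrates the training-population-size effect the study reports.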

  17. A chromosomal genomics approach to assess and validate the desi and kabuli draft chickpea genome assemblies

    Czech Academy of Sciences Publication Activity Database

    Ruperao, P.; Chan, C.K.K.; Azam, S.; Karafiátová, Miroslava; Hayashi, S.; Čížková, Jana; Šimková, Hana; Vrána, Jan; Doležel, Jaroslav; Varshney, R.K.; Edwards, D.

    2014-01-01

    Roč. 12, č. 6 (2014), s. 778-786 ISSN 1467-7644 R&D Projects: GA ČR GBP501/12/G090; GA MŠk(CZ) LO1204 Institutional support: RVO:61389030 Keywords : chickpea * genome assembly * cytogenetics Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 5.752, year: 2014

  18. BYSTANDER EFFECTS, GENOMIC INSTABILITY, ADAPTIVE RESPONSE AND CANCER RISK ASSESSMENT FOR RADIATION AND CHEMICAL EXPOSURES

    Science.gov (United States)

    BYSTANDER EFFECTS, GENOMIC INSTABILITY, ADAPTIVE RESPONSE AND CANCER RISK ASSESSMENT FOR RADIATION AND CHEMICAL EXPOSURES. R. Julian Preston, Environmental Carcinogenesis Division, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711, USA. There ...

  19. Functional assessment of human enhancer activities using whole-genome STARR-sequencing.

    Science.gov (United States)

    Liu, Yuwen; Yu, Shan; Dhiman, Vineet K; Brunetti, Tonya; Eckart, Heather; White, Kevin P

    2017-11-20

    Genome-wide quantification of enhancer activity in the human genome has proven to be a challenging problem. Recent efforts have led to the development of powerful tools for enhancer quantification. However, because of genome size and complexity, these tools have yet to be applied to the whole human genome. In the current study, we use a human prostate cancer cell line, LNCaP, as a model to perform whole-human-genome STARR-seq (WHG-STARR-seq) to reliably assess enhancer activity. This approach builds upon the previously developed STARR-seq technique in the fly genome and CapSTARR-seq in targeted human genomic regions. With an improved library preparation strategy, our approach greatly increases library complexity per unit of starting material, which makes it feasible and cost-effective to explore the landscape of regulatory activity in the much larger human genome. In addition to identifying active, accessible enhancers located in open chromatin regions, we can also detect sequences with the potential for enhancer activity that are located in inaccessible, closed chromatin regions. When cells are treated with the histone deacetylase inhibitor Trichostatin A, genes near this latter class of enhancers are up-regulated, demonstrating the potential for endogenous functionality of these regulatory elements. WHG-STARR-seq improves upon current pipelines for the analysis of high-complexity genomes and yields a better understanding of the intricacies of transcriptional regulation.

  20. Emergency Response Capability Baseline Needs Assessment - Requirements Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2016-10-04

    This document was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by LLNL Emergency Management Department Head James Colson. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response.

  1. Summary Statistics for Homemade "Play Dough" -- Data Acquired at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Martz, A; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T

    2010-03-11

    Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough(TM)-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU_D at 100 kVp to a low of about 1200 LMHU_D at 300 kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z_eff, to be near 10. LLNL prepared about 50 mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference
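The first-order statistics described above (mean, standard deviation, and a KDE-based entropy of the voxel population) can be sketched in a few lines. This is an illustrative reconstruction on synthetic data, not the report's IMGREC pipeline; the Silverman bandwidth rule and the convention of differential entropy in bits are assumptions.

```python
import numpy as np

def kde_first_order_stats(voxels, grid_size=512):
    """Mean, standard deviation, and KDE-based differential entropy (bits)
    of a population of voxel values (e.g. linear attenuation coefficients)."""
    v = np.asarray(voxels, dtype=float)
    mean, std = v.mean(), v.std()
    # Silverman's rule-of-thumb bandwidth for a Gaussian kernel (assumption)
    h = 1.06 * std * len(v) ** (-0.2)
    grid = np.linspace(v.min() - 3 * h, v.max() + 3 * h, grid_size)
    # Evaluate the Gaussian KDE on the grid
    z = (grid[:, None] - v[None, :]) / h
    pdf = np.exp(-0.5 * z ** 2).sum(axis=1) / (len(v) * h * np.sqrt(2 * np.pi))
    # Riemann-sum approximation of -integral(p * log2 p)
    dx = grid[1] - grid[0]
    mask = pdf > 0
    entropy = -np.sum(pdf[mask] * np.log2(pdf[mask]) * dx)
    return mean, std, entropy
```

For a Gaussian population the KDE entropy estimate lands close to the closed-form value 0.5*log2(2*pi*e*sigma^2), which is a quick sanity check on the implementation.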

  2. LLNL medical and industrial laser isotope separation: large volume, low cost production through advanced laser technologies

    International Nuclear Information System (INIS)

    Comaskey, B.; Scheibner, K. F.; Shaw, M.; Wilder, J.

    1998-01-01

    The goal of this LDRD project was to demonstrate the technical and economic feasibility of applying laser isotope separation technology to the commercial enrichment (>1 kg/y) of stable isotopes. A successful demonstration would well position the laboratory to make a credible case for the creation of an ongoing medical and industrial isotope production and development program at LLNL. Such a program would establish LLNL as a center for advanced medical isotope production, successfully leveraging previous LLNL research and development hardware, facilities, and knowledge.

  3. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Background: Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results: Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquirement. Conclusion: The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well-known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
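The Monte-Carlo test described above can be illustrated with a minimal sketch that scores a candidate segment's GC content against randomly placed segments of the same length. This is not the Design-Island implementation; the two-sided empirical p-value and the GC-content statistic are assumed simplifications.

```python
import random

def gc_content(seq):
    """GC fraction of a DNA string."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def monte_carlo_gc_pvalue(genome, start, length, n_trials=2000, seed=0):
    """Two-sided empirical p-value for a candidate segment's GC content,
    with the null distribution built from randomly placed segments of the
    same length elsewhere on the chromosome."""
    rng = random.Random(seed)
    observed = gc_content(genome[start:start + length])
    null = []
    for _ in range(n_trials):
        s = rng.randrange(len(genome) - length)
        null.append(gc_content(genome[s:s + length]))
    center = sum(null) / n_trials
    extreme = sum(abs(g - center) >= abs(observed - center) for g in null)
    # +1 correction keeps the empirical p-value strictly positive
    return (extreme + 1) / (n_trials + 1)
```

A genuinely GC-anomalous island yields a small p-value, while an ordinary background window does not, which mirrors the "putative GI" screening idea of the first phase.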

  4. Test results from the LLNL 250 GHz CARM experiment

    International Nuclear Information System (INIS)

    Kulke, B.; Caplan, M.; Bubp, D.; Houck, T.; Rogers, D.; Trimble, D.; VanMaren, R.; Westenskow, G.; McDermott, D.B.; Luhmann, N.C. Jr.; Danly, B.

    1991-01-01

    The authors have completed the initial phase of a 250 GHz CARM experiment, driven by the 2 MeV, 1 kA, 30 ns induction linac at the LLNL ARC facility. A non-Brillouin, solid, electron beam is generated from a flux-threaded, thermionic cathode. As the beam traverses a 10 kG plateau produced by a superconducting magnet, ten percent of the beam energy is converted into rotational energy in a bifilar helix wiggler that produces a spiraling, 50 G, transverse magnetic field. The beam is then compressed to a 5 mm diameter as it drifts into a 30 kG plateau. For the present experiment, the CARM interaction region consisted of a single Bragg section resonator, followed by a smooth-bore amplifier section. Using high-pass filters, they have observed broadband output signals estimated to be at the several megawatt level in the range 140 to over 230 GHz. This is consistent with operation as a superradiant amplifier. Simultaneously, they also observed Ka-band power levels near 3 MW.

  5. Test results from the LLNL 250 GHz CARM experiment

    International Nuclear Information System (INIS)

    Kulke, B.; Caplan, M.; Bubp, D.; Houck, T.; Rogers, D.; Trimble, D.; VanMaren, R.; Westenskow, G.; McDermott, D.B.; Luhmann, N.C. Jr.; Danly, B.

    1991-05-01

    We have completed the initial phase of a 250 GHz CARM experiment, driven by the 2 MeV, 1 kA, 30 ns induction linac at the LLNL ARC facility. A non-Brillouin, solid, electron beam is generated from a flux-threaded, thermionic cathode. As the beam traverses a 10 kG plateau produced by a superconducting magnet, ten percent of the beam energy is converted into rotational energy in a bifilar helix wiggler that produces a spiraling, 50 G, transverse magnetic field. The beam is then compressed to a 5 mm diameter as it drifts into a 30 kG plateau. For the present experiment, the CARM interaction region consisted of a single Bragg section resonator, followed by a smooth-bore amplifier section. Using high-pass filters, we have observed broadband output signals estimated to be at the several megawatt level in the range 140 to over 230 GHz. This is consistent with operation as a superradiant amplifier. Simultaneously, we also observed Ka-band power levels near 3 MW.

  6. Net Weight Issue LLNL DOE-STD-3013 Containers

    International Nuclear Information System (INIS)

    Wilk, P

    2008-01-01

    The following position paper describes DOE-STD-3013 container sets No. L000072 and No. L000076, and how they are compliant with DOE-STD-3013-2004. All masses of accountable nuclear materials are measured on LLNL-certified balances maintained under an MC&A Program approved by DOE/NNSA LSO. All accountability balances are recalibrated annually and checked to be within calibration on each day that the balance is used for accountability purposes. A statistical analysis of the historical calibration checks from the last seven years indicates that the full-range Limit of Error (LoE, 95% confidence level) for the balance used to measure the mass of the contents of the above indicated 3013 containers is 0.185 g. If this error envelope, at the 95% confidence level, were used to generate an upper limit on the measured weight of containers No. L000072 and No. L000076, the error envelope would extend beyond the 5.0 kg 3013-standard limit on the package contents by less than 0.3 g. However, this is still well within the intended safety bounds of DOE-STD-3013-2004.

  7. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment

    Science.gov (United States)

    Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955
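A toy version of the per-sample mutagenesis-efficiency calculation that a tool like BATCH-GE reports can be sketched as follows. This is only a proxy: the real tool aligns NGS reads and calls indels explicitly, whereas this sketch just checks each read for loss of the intact wild-type target site. All function names and sample data are illustrative.

```python
def editing_efficiency(reads, wt_site):
    """Fraction of reads that no longer contain the intact wild-type
    target site -- a crude stand-in for indel-carrying reads."""
    if not reads:
        return 0.0
    edited = sum(wt_site not in read for read in reads)
    return edited / len(reads)

def batch_efficiencies(samples, wt_site):
    """Apply the per-sample calculation across many samples at once,
    mirroring the batch-oriented reporting of the tool."""
    return {name: editing_efficiency(reads, wt_site)
            for name, reads in samples.items()}
```

For example, a sample in which half the amplicon reads have lost the target motif reports an efficiency of 0.5.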

  8. Molecular Heterogeneity in Primary Breast Carcinomas and Axillary Lymph Node Metastases Assessed by Genomic Fingerprinting Analysis

    Science.gov (United States)

    Ellsworth, Rachel E; Toro, Allyson L; Blackburn, Heather L; Decewicz, Alisha; Deyarmin, Brenda; Mamula, Kimberly A; Costantino, Nicholas S; Hooke, Jeffrey A; Shriver, Craig D; Ellsworth, Darrell L

    2015-01-01

    Molecular heterogeneity within primary breast carcinomas and among axillary lymph node (LN) metastases may impact diagnosis and confound treatment. In this study, we used short tandem repeated sequences to assess genomic heterogeneity and to determine hereditary relationships among primary tumor areas and regional metastases from 30 breast cancer patients. We found that primary carcinomas were genetically heterogeneous and sampling multiple areas was necessary to adequately assess genomic variability. LN metastases appeared to originate at different time periods during disease progression from different sites of the primary tumor and the extent of genomic divergence among regional metastases was associated with a less favorable patient outcome (P = 0.009). In conclusion, metastasis is a complex process influenced by primary tumor heterogeneity and variability in the timing of dissemination. Genomic variation in primary breast tumors and regional metastases may negatively impact clinical diagnostics and contribute to therapeutic resistance. PMID:26279627

  9. Molecular Heterogeneity in Primary Breast Carcinomas and Axillary Lymph Node Metastases Assessed by Genomic Fingerprinting Analysis

    Directory of Open Access Journals (Sweden)

    Rachel E. Ellsworth

    2015-01-01

    Molecular heterogeneity within primary breast carcinomas and among axillary lymph node (LN) metastases may impact diagnosis and confound treatment. In this study, we used short tandem repeated sequences to assess genomic heterogeneity and to determine hereditary relationships among primary tumor areas and regional metastases from 30 breast cancer patients. We found that primary carcinomas were genetically heterogeneous and sampling multiple areas was necessary to adequately assess genomic variability. LN metastases appeared to originate at different time periods during disease progression from different sites of the primary tumor, and the extent of genomic divergence among regional metastases was associated with a less favorable patient outcome (P = 0.009). In conclusion, metastasis is a complex process influenced by primary tumor heterogeneity and variability in the timing of dissemination. Genomic variation in primary breast tumors and regional metastases may negatively impact clinical diagnostics and contribute to therapeutic resistance.

  10. The contribution of health technology assessment, health needs assessment, and health impact assessment to the assessment and translation of technologies in the field of public health genomics.

    Science.gov (United States)

    Rosenkötter, N; Vondeling, H; Blancquaert, I; Mekel, O C L; Kristensen, F B; Brand, A

    2011-01-01

    The European Union has named genomics as one of the promising research fields for the development of new health technologies. Major concerns with regard to these fields are, on the one hand, the rather slow and limited translation of new knowledge and, on the other hand, missing insights into the impact on public health and health care practice of those technologies that are actually introduced. This paper aims to give an overview of the major assessment instruments in public health [health technology assessment (HTA), health needs assessment (HNA) and health impact assessment (HIA)] which could contribute to the systematic translation and assessment of genomic health applications by focussing at population level and on public health policy making. It is shown to what extent HTA, HNA and HIA contribute to translational research by using the continuum of translational research (T1-T4) in genomic medicine as an analytic framework. The selected assessment methodologies predominantly cover 2 to 4 phases within the T1-T4 system. HTA delivers the most complete set of methodologies when assessing health applications. HNA can be used to prioritize areas where genomic health applications are needed or to identify infrastructural needs. HIA delivers information on the impact of technologies in a wider scope and promotes informed decision making. HTA, HNA and HIA provide a partly overlapping and partly unique set of methodologies and infrastructure for the translation and assessment of genomic health applications. They are broad in scope and go beyond the continuum of T1-T4 translational research regarding policy translation. Copyright © 2010 S. Karger AG, Basel.

  11. Assessment of a Competency-Based Undergraduate Course on Genetics and Genomics.

    Science.gov (United States)

    Kronk, Rebecca; Colbert, Alison; Lengetti, Evelyn

    2017-08-24

    In response to new demands in the nursing profession, an innovative undergraduate genetics course was designed based on the Essential Nursing Competencies and Curricula Guidelines for Genetics and Genomics. Reflective journaling and storytelling were used as major pedagogies, alongside more traditional approaches. Thematic content analysis of student reflections revealed transformational learning as the major theme emerging from genomic and genetic knowledge acquisition. Quantitative analyses of precourse/postcourse student self-assessments of competencies revealed significant findings.

  12. Training the Masses -- Web-based Laser Safety Training at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Sprague, D D

    2004-12-17

    The LLNL work smart standard requires us to provide ongoing laser safety training for a large number of persons on a three-year cycle. In order to meet the standard, it was necessary to find a cost- and performance-effective method to deliver this training. This paper discusses the scope of the training problem, specific LLNL training needs, various training methods used at LLNL, the advantages and disadvantages of these methods, and the rationale for selecting web-based laser safety training. The tools and costs involved in developing web-based training courses are also discussed, in addition to conclusions drawn from our operating experience. The ILSC lecture presentation contains a short demonstration of the LLNL web-based laser safety training course.

  13. LLNL Compliance Plan for TRUPACT-2 Authorized Methods for Payload Control

    International Nuclear Information System (INIS)

    1995-03-01

    This document describes payload control at LLNL to ensure that all shipments of CH-TRU waste in the TRUPACT-II (Transuranic Package Transporter-II) meet the requirements of the TRUPACT-II SARP (Safety Analysis Report for Packaging). This document also provides specific instructions for the selection of authorized payloads once individual payload containers are qualified for transport. The physical assembly of the qualified payload and operating procedures for the use of the TRUPACT-II, including loading and unloading operations, are described in HWM Procedure No. 204, based on the information in the TRUPACT-II SARP. The LLNL TRAMPAC, along with the TRUPACT-II operating procedures contained in HWM Procedure No. 204, meets the documentation needs for the use of the TRUPACT-II at LLNL. Table 14-1 provides a summary of the LLNL waste generation and certification procedures as they relate to TRUPACT-II payload compliance.

  14. Proposals for ORNL [Oak Ridge National Laboratory] support to Tiber LLNL [Lawrence Livermore National Laboratory]

    International Nuclear Information System (INIS)

    Berry, L.A.; Rosenthal, M.W.; Saltmarsh, M.J.; Shannon, T.E.; Sheffield, J.

    1987-01-01

    This document describes the interests and capabilities of Oak Ridge National Laboratory in their proposals to support the Lawrence Livermore National Laboratory (LLNL) Engineering Test Reactor (ETR) project. Five individual proposals are cataloged separately. (FI)

  15. LLNL Center of Excellence Work Items for Q9-Q10 period

    Energy Technology Data Exchange (ETDEWEB)

    Neely, J. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-09-02

    This work plan encompasses a slice of effort going on within the ASC program and, for projects utilizing COE vendor resources, describes work that will be performed by LLNL staff and COE vendor staff collaboratively.

  16. An Assessment of Different Genomic Approaches for Inferring Phylogeny of Listeria monocytogenes

    Directory of Open Access Journals (Sweden)

    Clémentine Henri

    2017-11-01

    Background/objectives: Whole genome sequencing (WGS) has proven to be a powerful subtyping tool for foodborne pathogenic bacteria like L. monocytogenes. The interest of genome-scale analysis for national surveillance, outbreak detection or source tracking has been largely documented. The genomic data, however, can be exploited with many different bioinformatics methods, such as single nucleotide polymorphism (SNP) analysis, core-genome multilocus sequence typing (cgMLST), whole-genome multilocus sequence typing (wgMLST) or multilocus predicted protein sequence typing (MLPPST) on either the core genome (cgMLPPST) or the pan genome (wgMLPPST). Currently, there are few studies comparing these different analytical approaches. Our objective was to assess and compare different genomic methods that can be implemented in order to cluster isolates of L. monocytogenes. Methods: The clustering methods were evaluated on a collection of 207 L. monocytogenes genomes of food origin representative of the genetic diversity of the Anses collection. The trees were then compared using robust statistical analyses. Results: The backward comparability between conventional typing methods and genomic methods revealed a near-perfect concordance. The importance of selecting a proper reference when calling SNPs was highlighted, although distances between strains remained identical. The analysis also revealed that the topologies of the phylogenetic trees from wgMLST and cgMLST were remarkably similar. The comparison between SNP and cgMLST or SNP and wgMLST approaches showed that the topologies of the phylogenetic trees were statistically similar, with an almost equivalent clustering. Conclusion: Our study revealed high concordance between wgMLST, cgMLST, and SNP approaches, which are all suitable for typing of L. monocytogenes. The comparable clustering is an important observation, considering that these approaches have been variously implemented among reference laboratories.
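One common way to quantify the "almost equivalent clustering" reported above is to compare the strain partitions produced by two typing methods with an adjusted Rand index. The sketch below is a generic implementation of that index, not the statistical machinery used in the study; the example labels are invented.

```python
from collections import Counter
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Chance-corrected agreement between two clusterings of the same
    isolates (e.g. SNP-based vs cgMLST-based); 1.0 = identical partitions."""
    n = len(labels_a)
    # Pair counts within the contingency table and within each partition
    sum_ij = sum(comb(c, 2) for c in Counter(zip(labels_a, labels_b)).values())
    sum_a = sum(comb(c, 2) for c in Counter(labels_a).values())
    sum_b = sum(comb(c, 2) for c in Counter(labels_b).values())
    total = comb(n, 2)
    expected = sum_a * sum_b / total
    max_index = (sum_a + sum_b) / 2
    if max_index == expected:  # degenerate case: all-singleton or one-cluster
        return 1.0
    return (sum_ij - expected) / (max_index - expected)
```

Two methods that split the isolates the same way score 1.0 regardless of cluster names, while discordant partitions score below 1 (down to slightly negative values).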

  17. Review of LLNL Mixed Waste Streams for the Application of Potential Waste Reduction Controls

    International Nuclear Information System (INIS)

    Belue, A; Fischer, R P

    2007-01-01

    In July 2004, LLNL adopted the International Standard ISO 14001 as a Work Smart Standard in lieu of DOE Order 450.1. In support of this new requirement the Director issued a new environmental policy that was documented in Section 3.0 of Document 1.2, "ES and H Policies of LLNL", in the ES and H Manual. In recent years the Environmental Management System (EMS) process has become formalized as LLNL adopted ISO 14001 as part of the contract under which the laboratory is operated for the Department of Energy (DOE). On May 9, 2005, LLNL revised its Integrated Safety Management System Description to enhance existing environmental requirements to meet ISO 14001. Effective October 1, 2005, each new project or activity is required to be evaluated from an environmental aspect, particularly if a potential exists for significant environmental impacts. Authorizing organizations are required to consider the management of all environmental aspects, the applicable regulatory requirements, and reasonable actions that can be taken to reduce negative environmental impacts. During 2006, LLNL worked to implement the corrective actions addressing the deficiencies identified in the DOE/LSO audit. LLNL has begun to update the present EMS to meet the requirements of ISO 14001:2004. The EMS commits LLNL, and each employee, to responsible stewardship of all the environmental resources in our care. The generation of mixed radioactive waste was identified as a significant environmental aspect. Mixed waste for the purposes of this report is defined as waste materials containing both hazardous chemical and radioactive constituents. Significant environmental aspects require that an Environmental Management Plan (EMP) be developed. The objective of the EMP developed for mixed waste (EMP-005) is to evaluate options for reducing the amount of mixed waste generated. This document presents the findings of the evaluation of mixed waste generated at LLNL and a proposed plan for reduction.

  18. The National Ignition Facility (NIF) and High Energy Density Science Research at LLNL (Briefing Charts)

    Science.gov (United States)

    2013-06-21

    The National Ignition Facility (NIF) and High Energy Density Science Research at LLNL. Presentation to: IEEE Pulsed Power and Plasma Science Conference. C. J. Keane, Director, NIF User Office, June 21, 2013.

  19. 76 FR 38399 - Assessing the Current Research, Policy, and Practice Environment in Public Health Genomics

    Science.gov (United States)

    2011-06-30

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES, Centers for Disease Control and Prevention [Docket Number CDC-2011-0008]: Assessing the Current Research, Policy, and Practice Environment in Public Health Genomics ... information helpful to assess the current research, policy, and practice environment in public health genomics ...

  20. Human Genome Editing in the Clinic: New Challenges in Regulatory Benefit-Risk Assessment.

    Science.gov (United States)

    Abou-El-Enein, Mohamed; Cathomen, Toni; Ivics, Zoltán; June, Carl H; Renner, Matthias; Schneider, Christian K; Bauer, Gerhard

    2017-10-05

    As genome editing rapidly progresses toward the realization of its clinical promise, assessing the suitability of current tools and processes used for its benefit-risk assessment is critical. Although current regulations may initially provide an adequate regulatory framework, improvements are recommended to overcome several existing technology-based safety and efficacy issues. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Linear collider research and development at SLAC, LBL and LLNL

    International Nuclear Information System (INIS)

    Mattison, T.S.

    1988-10-01

    The study of electron-positron (e+e-) annihilation in storage ring colliders has been very fruitful. It is by now well understood that the optimized cost and size of e+e- storage rings scale as E_cm^2, due to the need to replace energy lost to synchrotron radiation in the ring bending magnets. Linear colliders, using the beams from linear accelerators, evade this scaling law. The study of e+e- collisions at TeV energy will require linear colliders. The luminosity requirements for a TeV linear collider are set by the physics. Advanced accelerator research and development at SLAC is focused toward a TeV Linear Collider (TLC) of 0.5--1 TeV in the center of mass, with a luminosity of 10^33--10^34. The goal is a design for two linacs of less than 3 km each, each requiring less than 100 MW of power. With a 1 km final focus, the TLC could be fit on Stanford University land (although not entirely within the present SLAC site). The emphasis is on technologies feasible for a proposal to be framed in 1992. Linear collider development work is progressing on three fronts: delivering electrical energy to a beam, delivering a focused high quality beam, and system optimization. Sources of high peak microwave radio frequency (RF) power to drive the high gradient linacs are being developed in collaboration with Lawrence Berkeley Laboratory (LBL) and Lawrence Livermore National Laboratory (LLNL). Beam generation, beam dynamics and final focus work has been done at SLAC and in collaboration with KEK. Both the accelerator physics and the utilization of TeV linear colliders were topics at the 1988 Snowmass Summer Study. 14 refs., 4 figs., 1 tab.

  2. Progress in AMS measurements at the LLNL spectrometer

    International Nuclear Information System (INIS)

    Southon, J.R.; Vogel, J.S.; Trumbore, S.E.; Davis, J.C.; Roberts, M.L.; Caffee, M.; Finkel, R.; Proctor, I.D.; Heikkinen, D.W.; Berno, A.J.; Hornady, R.S.

    1991-06-01

    The AMS measurement program at LLNL began in earnest in late 1989, and has initially concentrated on 14C measurements for biomedical and geoscience applications. We have now begun measurements on 10Be and 36Cl, are presently testing the spectrometer performance for 26Al and 3H, and will begin tests on 7Be, 41Ca and 129I within the next few months. Our laboratory has a strong biomedical AMS program of 14C tracer measurements involving large numbers of samples (sometimes hundreds in a single experiment) at 14C concentrations which are typically 0.5--5 times Modern, but are occasionally highly enriched. The sample preparation techniques required for high throughput and low cross-contamination for this work are discussed elsewhere. Similar demands are placed on the AMS measurement system, and in particular on the ion source. Modifications to our GIC 846 ion source, described below, allow us to run biomedical and geoscience or archaeological samples in the same source wheel with no adverse effects. The source has a capacity for 60 samples (about 45 unknowns) in a single wheel and provides currents of 30--60 microamps of C- from hydrogen-reduced graphite. These currents and sample capacity provide high throughput for both biomedical and other measurements: the AMS system can be started up, tuned, and a wheel of carbon samples measured to 1--1.5% in under a day; and two biomedical wheels can be measured per day without difficulty. We report on the present status of the Lawrence Livermore AMS spectrometer, including sample throughput and progress towards routine 1% measurement capability for 14C, first results on other isotopes, and experience with a multi-sample high intensity ion source. 5 refs.

  3. An assessment on epitope prediction methods for protozoa genomes

    Directory of Open Access Journals (Sweden)

    Resende Daniela M

    2012-11-01

    Background: Epitope prediction using computational methods represents one of the most promising approaches to vaccine development. Reduction of time and cost, and the availability of completely sequenced genomes, are key points and highly motivating regarding the use of reverse vaccinology. Parasites of the genus Leishmania are widely spread and are the etiologic agents of leishmaniasis. Currently, there is no efficient vaccine against this pathogen and the drug treatment is highly toxic. The lack of sufficiently large datasets of experimentally validated parasite epitopes represents a serious limitation, especially for trypanosomatid genomes. In this work we highlight the predictive performances of several algorithms that were evaluated through the development of a MySQL database built with the purpose of: (a) evaluating individual algorithms' prediction performances, and their combination, for CD8+ T cell epitopes, B-cell epitopes and subcellular localization, by means of AUC (Area Under Curve) performance and a threshold-dependent method that employs a confusion matrix; (b) integrating data from experimentally validated and in silico predicted epitopes; and (c) integrating the subcellular localization predictions and experimental data. NetCTL, NetMHC, BepiPred, BCPred12, and AAP12 algorithms were used for in silico epitope prediction, and WoLF PSORT, Sigcleave and TargetP for in silico subcellular localization prediction, against trypanosomatid genomes. Results: A database-driven epitope prediction method was developed with built-in functions that were capable of: (a) removing experimental data redundancy; (b) parsing algorithm predictions and storing experimentally validated and predicted data; and (c) evaluating algorithm performances. Results show that a better performance is achieved when the combined prediction is considered. This is particularly true for B cell epitope predictors, where the combined prediction of AAP12 and BCPred12 reached an AUC value
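The AUC and threshold-dependent confusion-matrix evaluation mentioned above can be sketched generically: AUC via the Mann-Whitney formulation (probability that a random positive outscores a random negative), and a confusion matrix at a fixed score threshold. The function names and example scores are illustrative, not the paper's database code.

```python
def confusion_matrix(scores, labels, threshold):
    """(TP, FP, FN, TN) for binary labels at a fixed score threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and not y)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and not y)
    return tp, fp, fn, tn

def auc(scores, labels):
    """Threshold-free AUC: fraction of positive/negative pairs in which
    the positive example scores higher (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfectly separating predictor scores AUC 1.0; combining predictors (e.g. averaging their scores) is then assessed by recomputing the same statistic on the combined score.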

  4. A Critical Analysis of Assessment Quality in Genomics and Bioinformatics Education Research

    Science.gov (United States)

    Campbell, Chad E.; Nehm, Ross H.

    2013-01-01

    The growing importance of genomics and bioinformatics methods and paradigms in biology has been accompanied by an explosion of new curricula and pedagogies. An important question to ask about these educational innovations is whether they are having a meaningful impact on students' knowledge, attitudes, or skills. Although assessments are…

  5. Prospects for introgressing tomato chromosomes into the potato genome: An assessment through GISH analysis

    NARCIS (Netherlands)

    Garriga Calderé, F.; Huigen, D.J.; Jacobsen, E.; Ramanna, M.S.

    1999-01-01

    With a view to assess the possibility of homoeologous pairing and crossing-over between the chromosomes of potato (Solanum tuberosum) and tomato (Lycopersicon esculentum), a somatic fusion hybrid and two monosomic alien tomato addition genotypes were investigated through genomic in situ

  6. Genomic diversity among Danish field strains of Mycoplasma hyosynoviae assessed by amplified fragment length polymorphism analysis

    DEFF Research Database (Denmark)

    Kokotovic, Branko; Friis, Niels F.; Nielsen, Elisabeth O.

    2002-01-01

    Genomic diversity among strains of Mycoplasma hyosynoviae isolated in Denmark was assessed using amplified fragment length polymorphism (AFLP) analysis. Ninety-six strains, obtained from different specimens and geographical locations over a 30-year period, and the type strain of M. hyosynoviae S16(T) were concurrently examined for variance in BglII-MfeI and EcoRI-Csp6I-A AFLP markers. A total of 56 different genomic fingerprints having an overall similarity between 77 and 96% were detected. No correlation between AFLP variability and period of isolation or anatomical site of isolation could...

  7. Reporting of Human Genome Epidemiology (HuGE) association studies: An empirical assessment

    Directory of Open Access Journals (Sweden)

    Gwinn Marta

    2008-05-01

    Abstract Background Several thousand human genome epidemiology association studies are published every year investigating the relationship between common genetic variants and diverse phenotypes. Transparent reporting of study methods and results allows readers to better assess the validity of study findings. Here, we document reporting practices of human genome epidemiology studies. Methods Articles were randomly selected from a continuously updated database of human genome epidemiology association studies to be representative of genetic epidemiology literature. The main analysis evaluated 315 articles published in 2001–2003. For a comparative update, we evaluated 28 more recent articles published in 2006, focusing on issues that were poorly reported in 2001–2003. Results During both time periods, most studies comprised relatively small study populations and examined one or more genetic variants within a single gene. Articles were inconsistent in reporting the data needed to assess selection bias and the methods used to minimize misclassification (of the genotype, outcome, and environmental exposure) or to identify population stratification. Statistical power, the use of unrelated study participants, and the use of replicate samples were reported more often in articles published during 2006 when compared with the earlier sample. Conclusion We conclude that many items needed to assess error and bias in human genome epidemiology association studies are not consistently reported. Although some improvements were seen over time, reporting guidelines and online supplemental material may help enhance the transparency of this literature.

  8. LLNL/YMP Waste Container Fabrication and Closure Project; GFY technical activity summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-10-01

    The Department of Energy's Office of Civilian Radioactive Waste Management (OCRWM) Program is studying Yucca Mountain, Nevada as a suitable site for the first US high-level nuclear waste repository. Lawrence Livermore National Laboratory (LLNL) has the responsibility for designing and developing the waste package for the permanent storage of high-level nuclear waste. This report is a summary of the technical activities for the LLNL/YMP Nuclear Waste Disposal Container Fabrication and Closure Development Project. Candidate welding closure processes were identified in the Phase 1 report. This report discusses Phase 2, which involved laboratory studies to determine the optimum fabrication and closure processes. Because of budget limitations, LLNL narrowed the materials for evaluation in Phase 2 from the original six to four: Alloy 825, CDA 715, CDA 102 (or CDA 122) and CDA 952. Phase 2 studies focused on evaluation of candidate materials in conjunction with fabrication and closure processes.

  9. Operating characteristics and modeling of the LLNL 100-kV electric gun

    International Nuclear Information System (INIS)

    Osher, J.E.; Barnes, G.; Chau, H.H.; Lee, R.S.; Lee, C.; Speer, R.; Weingart, R.C.

    1989-01-01

    In the electric gun, the explosion of an electrically heated metal foil and the accompanying magnetic forces drive a thin flyer plate up a short barrel. Flyer velocities of up to 18 km/s make the gun useful for hypervelocity impact studies. The authors briefly review the technological evolution of the exploding-metal circuit elements that power the gun, describe the 100-kV electric gun designed at Lawrence Livermore National Laboratory (LLNL) in some detail, and present the general principles of electric gun operation. They compare the experimental performance of the LLNL gun with a simple model and with predictions of a magnetohydrodynamics code

  10. Summary of the LLNL one-dimensional transport-kinetics model of the troposphere and stratosphere: 1981

    International Nuclear Information System (INIS)

    Wuebbles, D.J.

    1981-09-01

    Since the LLNL one-dimensional coupled transport and chemical kinetics model of the troposphere and stratosphere was originally developed in 1972 (Chang et al., 1974), there have been many changes to the model's representation of atmospheric physical and chemical processes. A brief description is given of the current LLNL one-dimensional coupled transport and chemical kinetics model of the troposphere and stratosphere

  11. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Directory of Open Access Journals (Sweden)

    C Victor Jongeneel

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  12. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Science.gov (United States)

    Jongeneel, C Victor; Achinike-Oduaran, Ovokeraye; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Akanle, Bola; Aron, Shaun; Ashano, Efejiro; Bendou, Hocine; Botha, Gerrit; Chimusa, Emile; Choudhury, Ananyo; Donthu, Ravikiran; Drnevich, Jenny; Falola, Oluwadamila; Fields, Christopher J; Hazelhurst, Scott; Hendry, Liesl; Isewon, Itunuoluwa; Khetani, Radhika S; Kumuthini, Judit; Kimuda, Magambo Phillip; Magosi, Lerato; Mainzer, Liudmila Sergeevna; Maslamoney, Suresh; Mbiyavanga, Mamana; Meintjes, Ayton; Mugutso, Danny; Mpangase, Phelelani; Munthali, Richard; Nembaware, Victoria; Ndhlovu, Andrew; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Panji, Sumir; Pillay, Venesa; Rendon, Gloria; Sengupta, Dhriti; Mulder, Nicola

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  13. Sequencing quality assessment tools to enable data-driven informatics for high throughput genomics

    Directory of Open Access Journals (Sweden)

    Richard Mark Leggett

    2013-12-01

    The processes of quality assessment and control are an active area of research at The Genome Analysis Centre (TGAC). Unlike other sequencing centres that often concentrate on a certain species or technology, TGAC applies expertise in genomics and bioinformatics to a wide range of projects, often requiring bespoke wet-lab and in silico workflows. TGAC is fortunate to have access to a diverse range of sequencing and analysis platforms, and we are at the forefront of investigations into library quality and sequence data assessment. We have developed and implemented a number of algorithms, tools, pipelines and packages to ascertain, store, and expose quality metrics across a number of next-generation sequencing platforms, allowing rapid and in-depth cross-platform QC bioinformatics. In this review, we describe these tools as a vehicle for data-driven informatics, offering the potential to provide richer context for downstream analysis and to inform experimental design.

  14. LLNL radioactive waste management plan as per DOE Order 5820.2

    International Nuclear Information System (INIS)

    1984-01-01

    The following aspects of LLNL's radioactive waste management plan are discussed: program administration; description of waste generating processes; radioactive waste collection, treatment, and disposal; sanitary waste management; site 300 operations; schedules and major milestones for waste management activities; and environmental monitoring programs (sampling and analysis)

  15. National Uranium Resource Evaluation Program: the Hydrogeochemical Stream Sediment Reconnaissance Program at LLNL

    International Nuclear Information System (INIS)

    Higgins, G.H.

    1980-08-01

    From early 1975 to mid 1979, Lawrence Livermore National Laboratory (LLNL) participated in the Hydrogeochemical Stream Sediment Reconnaissance (HSSR), part of the National Uranium Resource Evaluation (NURE) program sponsored by the Department of Energy (DOE). The Laboratory was initially responsible for collecting, analyzing, and evaluating sediment and water samples from approximately 200,000 sites in seven western states. Eventually, however, the NURE program redefined its sampling priorities, objectives, schedules, and budgets, with the increasingly obvious result that LLNL objectives and methodologies were not compatible with those of the NURE program office, and the LLNL geochemical studies were not relevant to the program goal. The LLNL portion of the HSSR program was consequently terminated, and all work was suspended by June 1979. Of the 38,000 sites sampled, 30,000 were analyzed by instrumental neutron activation analyses (INAA), delayed neutron counting (DNC), optical emission spectroscopy (OES), and automated chloride-sulfate analyses (SC). Data from about 13,000 sites have been formally reported. From each site, analyses were published of about 30 of the 60 elements observed. Uranium mineralization has been identified at several places which were previously not recognized as potential uranium source areas, and a number of other geochemical anomalies were discovered

  16. LLNL Site plan for a MOX fuel lead assembly mission in support of surplus plutonium disposition

    Energy Technology Data Exchange (ETDEWEB)

    Bronson, M.C.

    1997-10-01

    The principal facilities that LLNL would use to support a MOX Fuel Lead Assembly Mission are Building 332 and Building 334. Both of these buildings are within the security boundary known as the LLNL Superblock. Building 332 is the LLNL Plutonium Facility. As an operational plutonium facility, it has all the infrastructure and support services required for plutonium operations. The LLNL Plutonium Facility routinely handles kilogram quantities of plutonium and uranium. Currently, the building is limited to a plutonium inventory of 700 kilograms and a uranium inventory of 300 kilograms. Process rooms (excluding the vaults) are limited to an inventory of 20 kilograms per room. Ongoing operations include: receiving SSTS, material receipt, storage, metal machining and casting, welding, metal-to-oxide conversion, purification, molten salt operations, chlorination, oxide calcination, cold pressing and sintering, vitrification, encapsulation, chemical analysis, metallography and microprobe analysis, waste material processing, material accountability measurements, packaging, and material shipping. Building 334 is the Hardened Engineering Test Building. This building supports environmental and radiation measurements on encapsulated plutonium and uranium components. Other existing facilities that would be used to support a MOX Fuel Lead Assembly Mission include Building 335 for hardware receiving and storage and TRU and LLW waste storage and shipping facilities, and Building 331 or Building 241 for storage of depleted uranium.

  17. Beam-beam studies for the proposed SLAC/LBL/LLNL B Factory

    International Nuclear Information System (INIS)

    Furman, M.A.

    1991-05-01

    We present a summary of beam-beam dynamics studies that have been carried out to date for the proposed SLAC/LBL/LLNL B Factory. Most of the material presented here is contained in the proposal's Conceptual Design Report, although post-CDR studies are also presented. 15 refs., 6 figs., 2 tabs

  18. LLNL Site plan for a MOX fuel lead assembly mission in support of surplus plutonium disposition

    International Nuclear Information System (INIS)

    Bronson, M.C.

    1997-01-01

    The principal facilities that LLNL would use to support a MOX Fuel Lead Assembly Mission are Building 332 and Building 334. Both of these buildings are within the security boundary known as the LLNL Superblock. Building 332 is the LLNL Plutonium Facility. As an operational plutonium facility, it has all the infrastructure and support services required for plutonium operations. The LLNL Plutonium Facility routinely handles kilogram quantities of plutonium and uranium. Currently, the building is limited to a plutonium inventory of 700 kilograms and a uranium inventory of 300 kilograms. Process rooms (excluding the vaults) are limited to an inventory of 20 kilograms per room. Ongoing operations include: receiving SSTS, material receipt, storage, metal machining and casting, welding, metal-to-oxide conversion, purification, molten salt operations, chlorination, oxide calcination, cold pressing and sintering, vitrification, encapsulation, chemical analysis, metallography and microprobe analysis, waste material processing, material accountability measurements, packaging, and material shipping. Building 334 is the Hardened Engineering Test Building. This building supports environmental and radiation measurements on encapsulated plutonium and uranium components. Other existing facilities that would be used to support a MOX Fuel Lead Assembly Mission include Building 335 for hardware receiving and storage and TRU and LLW waste storage and shipping facilities, and Building 331 or Building 241 for storage of depleted uranium

  19. Attenuation Drift in the Micro-Computed Tomography System at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Dooraghi, Alex A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brown, William [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seetho, Isaac [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kallman, Jeff [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lennox, Kristin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Glascoe, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-12

    The maximum allowable level of drift in the linear attenuation coefficients (μ) for a Lawrence Livermore National Laboratory (LLNL) micro-computed tomography (MCT) system was determined to be 0.1%. After ~100 scans were acquired during the period of November 2014 to March 2015, the drift in μ for a set of six reference materials reached or exceeded 0.1%. Two strategies have been identified to account for or correct the drift. First, normalizing the 160 kV and 100 kV μ data by the μ of water at the corresponding energy, in contrast to conducting normalization at the 160 kV energy only, significantly compensates for measurement drift. Even after the modified normalization, μ of polytetrafluoroethylene (PTFE) increases linearly with scan number at an average rate of 0.00147% per scan. This is consistent with PTFE radiation damage documented in the literature. The second strategy suggested is the replacement of the PTFE reference with fluorinated ethylene propylene (FEP), which has the same effective atomic number (Ze) and electron density (ρe) as PTFE, but is 10 times more radiation resistant. This is important as effective atomic number and electron density are key parameters in analysis. The presence of a material with properties such as PTFE, when taken together with the remaining references, allows for a broad range of the (Ze, ρe) feature space to be used in analysis. While FEP is documented as 10 times more radiation resistant, testing will be necessary to assess how often, if necessary, FEP will need to be replaced. As radiation damage to references has been observed, it will be necessary to monitor all reference materials for radiation damage to ensure consistent x-ray characteristics of the references.
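    The two bookkeeping steps described above (normalizing μ by the water value at the corresponding energy, and tracking a linear drift rate per scan) can be sketched as follows. This is a hypothetical illustration with invented numbers, not LLNL analysis software:

```python
# Hypothetical sketch: water normalization and least-squares drift estimation.
# The simulated series mimics a PTFE-like reference drifting linearly upward.

def normalize(mu, mu_water):
    """Express attenuation relative to water at the corresponding energy."""
    return mu / mu_water

def drift_per_scan(values):
    """Ordinary least-squares slope of values vs. scan number, as % of mean."""
    n = len(values)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, values))
             / sum((x - mx) ** 2 for x in xs))
    return 100.0 * slope / my

# Invented series drifting upward by roughly 0.0015% per scan over 100 scans:
series = [1.0 * (1 + 0.000015 * i) for i in range(100)]
print(round(drift_per_scan(series), 4))  # ≈ 0.0015 (% per scan)
```

    Fitting the normalized μ of each reference material this way would flag any reference (such as a radiation-damaged PTFE standard) whose drift rate approaches the 0.1% tolerance.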

  20. Genomic-based tools for the risk assessment, management, and prevention of type 2 diabetes

    Directory of Open Access Journals (Sweden)

    Johansen Taber KA

    2015-01-01

    Katherine A Johansen Taber, Barry D Dickinson; Department of Science and Biotechnology, American Medical Association, Chicago, IL, USA. Abstract: Type 2 diabetes (T2D) is a common and serious disorder and is a significant risk factor for the development of cardiovascular disease, neuropathy, nephropathy, retinopathy, periodontal disease, and foot ulcers and amputations. The burden of disease associated with T2D has led to an emphasis on early identification of the millions of individuals at high risk so that management and intervention strategies can be effectively implemented before disease progression begins. With increasing knowledge about the genetic basis of T2D, several genomic-based strategies have been tested for their ability to improve risk assessment, management, and prevention. Genetic risk scores have been developed with the intent to more accurately identify those at risk for T2D and to potentially improve motivation and adherence to lifestyle modification programs. In addition, evidence is building that oral antihyperglycemic medications are subject to pharmacogenomic variation in a substantial number of patients, suggesting genomics may soon play a role in determining the most effective therapies. T2D is a complex disease that affects individuals differently, and risk prediction and treatment may be challenging for health care providers. Genomic approaches hold promise for their potential to improve risk prediction, tailor management for individual patients, and contribute to better health outcomes for those with T2D. Keywords: diabetes, genomic, risk prediction, management

  1. Prediction of malting quality traits in barley based on genome-wide marker data to assess the potential of genomic selection.

    Science.gov (United States)

    Schmidt, Malthe; Kollers, Sonja; Maasberg-Prelle, Anja; Großer, Jörg; Schinkel, Burkhard; Tomerius, Alexandra; Graner, Andreas; Korzun, Viktor

    2016-02-01

    Genomic prediction of malting quality traits in barley shows the potential of applying genomic selection to improve selection for malting quality and speed up the breeding process. Genomic selection has been applied to various plant species, mostly for yield or yield-related traits such as grain dry matter yield or thousand kernel weight, and for improvement of resistance against diseases. Quality traits have not been the main scope of analysis for genomic selection, but have rather been addressed by marker-assisted selection. In this study, the potential to apply genomic selection to twelve malting quality traits in two commercial breeding programs of spring and winter barley (Hordeum vulgare L.) was assessed. Phenotypic means were calculated by combining multilocational field trial data from 3 or 4 years, depending on the trait investigated. Three to five locations were available in each of these years. Heritabilities for malting traits ranged between 0.50 and 0.98. Predictive abilities (PA), as derived from cross validation, ranged from 0.14 to 0.58 for spring barley and from 0.40 to 0.80 for winter barley. Small training sets were shown to be sufficient to obtain useful PAs, possibly due to the narrow genetic base in this breeding material. Deployment of genomic selection in malting barley breeding clearly has the potential to reduce cost-intensive phenotyping for quality traits, increase selection intensity, and shorten breeding cycles.
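    The predictive-ability metric reported in this abstract can be made concrete with a deliberately simple sketch. The code below (Python, invented data) uses leave-one-out cross validation with a single-marker mean predictor as a stand-in for a real genomic prediction model such as GBLUP, and reports the Pearson correlation between observed phenotypes and cross-validated predictions:

```python
# Minimal, hypothetical illustration of "predictive ability": the Pearson
# correlation between phenotypes and cross-validated predictions. A trivial
# one-marker model stands in for a real genomic prediction model; all
# genotype and phenotype values are invented.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def loo_predict(genotypes, phenotypes):
    """Leave-one-out: predict each line from the mean phenotype of all
    *other* lines carrying the same marker allele."""
    preds = []
    for i in range(len(genotypes)):
        same = [p for j, (g, p) in enumerate(zip(genotypes, phenotypes))
                if j != i and g == genotypes[i]]
        preds.append(sum(same) / len(same))
    return preds

genotypes  = [0, 0, 1, 1, 0, 1, 0, 1]            # marker allele per line
phenotypes = [78, 80, 85, 88, 79, 86, 81, 84]    # e.g. a malting trait score

pa = pearson(phenotypes, loo_predict(genotypes, phenotypes))
print(round(pa, 2))
```

    Values of this correlation in the ranges quoted above (0.14-0.80) are what the study treats as useful predictive ability; real analyses use thousands of genome-wide markers and shrinkage-based models rather than a single marker.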

  2. A Critical Analysis of Assessment Quality in Genomics and Bioinformatics Education Research

    Science.gov (United States)

    Campbell, Chad E.; Nehm, Ross H.

    2013-01-01

    The growing importance of genomics and bioinformatics methods and paradigms in biology has been accompanied by an explosion of new curricula and pedagogies. An important question to ask about these educational innovations is whether they are having a meaningful impact on students’ knowledge, attitudes, or skills. Although assessments are necessary tools for answering this question, their outputs are dependent on their quality. Our study 1) reviews the central importance of reliability and construct validity evidence in the development and evaluation of science assessments and 2) examines the extent to which published assessments in genomics and bioinformatics education (GBE) have been developed using such evidence. We identified 95 GBE articles (out of 226) that contained claims of knowledge increases, affective changes, or skill acquisition. We found that 1) the purpose of most of these studies was to assess summative learning gains associated with curricular change at the undergraduate level, and 2) a minority (<10%) of studies provided any reliability or validity evidence, and only one study out of the 95 sampled mentioned both validity and reliability. Our findings raise concerns about the quality of evidence derived from these instruments. We end with recommendations for improving assessment quality in GBE. PMID:24006400

  3. Assessing the evolutionary impact of amino acid mutations in the human genome

    DEFF Research Database (Denmark)

    Boyko, Adam R; Williamson, Scott H; Indap, Amit R

    2008-01-01

    Quantifying the distribution of fitness effects among newly arising mutations in the human genome is key to resolving important debates in medical and evolutionary genetics. Here, we present a method for inferring this distribution using Single Nucleotide Polymorphism (SNP) data from a population...... of demographic and selective effects to patterning amino acid variation in the human genome. We find evidence of an ancient population expansion in the sample with African ancestry and a relatively recent bottleneck in the sample with European ancestry. After accounting for these demographic effects, we find...... with non-stationary demographic history (such as that of modern humans). Application of our method to 47,576 coding SNPs found by direct resequencing of 11,404 protein coding-genes in 35 individuals (20 European Americans and 15 African Americans) allows us to assess the relative contribution...

  4. Comparative genomic characterization of three Streptococcus parauberis strains in fish pathogen, as assessed by wide-genome analyses.

    Directory of Open Access Journals (Sweden)

    Seong-Won Nho

    Streptococcus parauberis, which is the main causative agent of streptococcosis among olive flounder (Paralichthys olivaceus) in northeast Asia, can be distinctly divided into two groups (type I and type II) by an agglutination test. Here, the whole genome sequences of two Japanese strains (KRS-02083 and KRS-02109) were determined and compared with the previously determined genome of a Korean strain (KCTC 11537). The genomes of S. parauberis are intermediate in size and have lower GC contents than those of other streptococci. We annotated 2,236 and 2,048 genes in KRS-02083 and KRS-02109, respectively. Our results revealed that the three S. parauberis strains contain different genomic insertions and deletions. In particular, the genomes of the Korean and Japanese strains encode different factors for sugar utilization: the former encodes the phosphotransferase system (PTS) for sorbose, whereas the latter encode proteins for lactose hydrolysis. Notably, the type II strain KRS-02109 was found to be able to resist phage infection through the clustered regularly interspaced short palindromic repeats (CRISPR)/Cas system, which might contribute valuably to the serological distribution. Thus, our genome-wide association study shows that polymorphisms can affect pathogen responses, providing insight into biological/biochemical pathways and phylogenetic diversity.

  5. Joint research and development and exchange of technology on toxic material emergency response between LLNL and ENEA. 1985 progress report

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Caracciolo, R.

    1986-01-01

    For the past six years, the US Department of Energy, LLNL, and the ENEA, Rome, Italy, have participated in cooperative studies for improving a systems approach to an emergency response following nuclear accidents. Technology exchange between LLNL and the ENEA was initially confined to the development, application, and evaluation of atmospheric transport and diffusion models. With the emergence of compatible hardware configurations between LLNL and ENEA, exchanges of technology and ideas for improving the development and implementation of systems are beginning to emerge. This report describes cooperative work that has occurred during the past three years, the present state of each system, and recommendations for future exchanges of technology

  6. Evaluation of Quality Assessment Protocols for High Throughput Genome Resequencing Data.

    Science.gov (United States)

    Chiara, Matteo; Pavesi, Giulio

    2017-01-01

    Large-scale initiatives aiming to recover the complete sequence of thousands of human genomes are currently being undertaken worldwide, concurring to the generation of a comprehensive catalog of human genetic variation. The ultimate and most ambitious goal of human population-scale genomics is the characterization of the so-called human "variome," through the identification of causal mutations or haplotypes. Several research institutions worldwide currently use genotyping assays based on Next-Generation Sequencing (NGS) for diagnostics and clinical screenings, and the widespread application of such technologies promises major revolutions in medical science. Bioinformatic analysis of human resequencing data is one of the main factors limiting the effectiveness and general applicability of NGS for clinical studies. The requirement for multiple tools, to be combined in dedicated protocols in order to accommodate different types of data (gene panels, exomes, or whole genomes), and the high variability of the data make it difficult to establish an ultimate strategy of general use. While several studies have already compared the sensitivity and accuracy of bioinformatic pipelines for the identification of single nucleotide variants from resequencing data, little is known about the impact of quality assessment and read pre-processing strategies. In this work we discuss the major strengths and limitations of the various genome resequencing protocols currently used in molecular diagnostics and for the discovery of novel disease-causing mutations. By taking advantage of publicly available data, we devise and suggest a series of best practices for the pre-processing of the data that consistently improve the outcome of genotyping with minimal impact on computational costs.
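    One widely used read pre-processing step of the kind discussed in this review is 3' quality trimming. The sketch below (Python; the cutoff and read are invented examples, not taken from any specific pipeline in the study) decodes Phred scores from the Sanger/Illumina 1.8+ ASCII encoding and trims low-quality bases from the 3' end:

```python
# Generic sketch of 3'-end quality trimming of a FASTQ read.
# Phred scores use the offset-33 ("Illumina 1.8+") ASCII encoding.

def phred_scores(qual_string, offset=33):
    """Decode an ASCII quality string into integer Phred scores."""
    return [ord(c) - offset for c in qual_string]

def trim_3prime(seq, qual, cutoff=20):
    """Drop bases from the 3' end while their quality is below the cutoff."""
    scores = phred_scores(qual)
    end = len(seq)
    while end > 0 and scores[end - 1] < cutoff:
        end -= 1
    return seq[:end], qual[:end]

seq  = "ACGTACGTAC"
qual = "IIIIIIII#!"            # 'I' = Q40; '#' = Q2 and '!' = Q0 at the 3' end
print(trim_3prime(seq, qual))  # → ('ACGTACGT', 'IIIIIIII')
```

    The review's point is that choices like this cutoff, applied before variant calling, can measurably change genotyping outcomes, which is why pre-processing deserves the same scrutiny as the calling pipeline itself.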

  7. The LLNL Multiuser Tandem Laboratory computer-controlled radiation monitoring system

    International Nuclear Information System (INIS)

    Homann, S.G.

    1992-01-01

    The Physics Department of the Lawrence Livermore National Laboratory (LLNL) recently constructed a Multiuser Tandem Laboratory (MTL) to perform a variety of basic and applied measurement programs. The laboratory and its research equipment were constructed with support from a consortium of LLNL Divisions, Sandia National Laboratories Livermore, and the University of California. Primary design goals for the facility were inexpensive construction and operation, high beam quality at a large number of experimental stations, and versatility in adapting to new experimental needs. To accomplish these goals, our main design decisions were to place the accelerator in an unshielded structure, to make use of reconfigured cyclotrons as effective switching magnets, and to rely on computer control systems for both radiological protection and highly reproducible and well-characterized accelerator operation. This paper addresses the radiological control computer system

  8. GENEPEASE Genomic tools for assessment of pesticide effects on the agricultural soil ecosystem

    DEFF Research Database (Denmark)

    Jacobsen, Carsten Suhr; Feld, Louise; Hjelmsø, Mathis Hjort

    The project focussed on validating RNA-based methods as potential genomic tools in the assessment of agricultural soil ecosystems. It was shown that the mRNA-based technique was very sensitive, and effects were seen in the same situations as when the OECD nitrification assay showed an effect. 16S rRNA-based pyrosequencing of bacterial communities in soil was shown to report differently than DNA-based analysis alone, and indicated, unlike the DNA measurement, that the community was developing. Finally, microarray analysis was compared to traditional tests for toxicity testing of Folsomia candida and showed...

  9. Effects of stratospheric aerosol surface processes on the LLNL two-dimensional zonally averaged model

    International Nuclear Information System (INIS)

    Connell, P.S.; Kinnison, D.E.; Wuebbles, D.J.; Burley, J.D.; Johnston, H.S.

    1992-01-01

    We have investigated the effects of incorporating representations of heterogeneous chemical processes associated with stratospheric sulfuric acid aerosol into the LLNL two-dimensional, zonally averaged model of the troposphere and stratosphere. Using distributions of aerosol surface area and volume density derived from SAGE II satellite observations, we were primarily interested in changes in partitioning within the Cl- and N- families in the lower stratosphere, compared to a model including only gas phase photochemical reactions

  10. Status of the SLAC/LBL/LLNL B-factory and the BABAR detector

    International Nuclear Information System (INIS)

    Oddone, P.

    1994-10-01

    After a brief introduction on the physics reach of the SLAC/LBL/LLNL Asymmetric B-Factory, the author describes the status of the accelerator and the detector as of the end of 1994. At this time, essentially all major decisions have been made, including the choice of particle identification for the detector. The author concludes this report with the description of the schedule for the construction of both accelerator and detector

  11. Evaluation of the neutron dose received by personnel at the LLNL

    International Nuclear Information System (INIS)

    Hankins, D.E.

    1982-01-01

    This report was prepared to document the techniques being used to evaluate the neutron exposures received by personnel at the LLNL. Two types of evaluations are discussed covering the use of the routine personnel dosimeter and of the albedo neutron dosimeter. Included in the report are field survey results which were used to determine the calibration factors being applied to the dosimeter readings. Calibration procedures are discussed and recommendations are made on calibration and evaluation procedures

  12. LLNL Contribution to LLE FY09 Annual Report: NIC and HED Results

    International Nuclear Information System (INIS)

    Heeter, R.F.; Landen, O.L.; Hsing, W.W.; Fournier, K.B.

    2009-01-01

    In FY09, LLNL led 238 target shots on the OMEGA Laser System. Approximately half of these LLNL-led shots supported the National Ignition Campaign (NIC). The remainder was dedicated to high-energy-density stewardship experiments (HEDSE). Objectives of the LLNL-led NIC campaigns at OMEGA included: (1) Laser-plasma interaction studies in physical conditions relevant for the NIF ignition targets; (2) Demonstration of Tr = 100 eV foot symmetry tuning using a reemission sphere; (3) X-ray scattering in support of conductivity measurements of solid density Be plasmas; (4) Experiments to study the physical properties (thermal conductivity) of shocked fusion fuels; (5) High-resolution measurements of velocity nonuniformities created by microscopic perturbations in NIF ablator materials; (6) Development of a novel Compton Radiography diagnostic platform for ICF experiments; and (7) Precision validation of the equation of state for quartz. The LLNL HEDSE campaigns included the following experiments: (1) Quasi-isentropic (ICE) drive used to study material properties such as strength, equation of state, phase, and phase-transition kinetics under high pressure; (2) Development of a high-energy backlighter for radiography in support of material strength experiments using Omega EP and the joint OMEGA-OMEGA-EP configuration; (3) Debris characterization from long-duration, point-apertured, point-projection x-ray backlighters for NIF radiation transport experiments; (4) Demonstration of ultrafast temperature and density measurements with x-ray Thomson scattering from short-pulse laser-heated matter; (5) The development of an experimental platform to study nonlocal thermodynamic equilibrium (NLTE) physics using direct-drive implosions; (6) Opacity studies of high-temperature plasmas under LTE conditions; and (7) Characterization of copper (Cu) foams for HEDSE experiments.

  13. Superconducting magnet development capability of the LLNL [Lawrence Livermore National Laboratory] High Field Test Facility

    International Nuclear Information System (INIS)

    Miller, J.R.; Shen, S.; Summers, L.T.

    1990-02-01

    This paper discusses the following topics: High-Field Test Facility Equipment at LLNL; FENIX Magnet Facility; High-Field Test Facility (HFTF) 2-m Solenoid; Cryogenic Mechanical Test Facility; Electro-Mechanical Conductor Test Apparatus; Electro-Mechanical Wire Test Apparatus; FENIX/HFTF Data System and Network Topology; Helium Gas Management System (HGMS); Airco Helium Liquefier/Refrigerator; CTI 2800 Helium Liquefier; and MFTF-B/ITER Magnet Test Facility

  14. LLNL Containment Program nuclear test effects and geologic data base: glossary and parameter definitions

    International Nuclear Information System (INIS)

    Howard, N.W.

    1983-01-01

    This report lists, defines, and updates Parameters in DBASE, an LLNL test effects data bank in which data are stored from experiments performed at NTS and other test sites. Parameters are listed by subject and by number. Part 2 of this report presents the same information for parameters for which some of the data may be classified; it was issued in 1979 and is not being reissued at this time as it is essentially unchanged

  15. Comparative genomic assessment of Multi-Locus Sequence Typing: rapid accumulation of genomic heterogeneity among clonal isolates of Campylobacter jejuni

    Directory of Open Access Journals (Sweden)

    Nash John HE

    2008-08-01

    Background: Multi-Locus Sequence Typing (MLST) has emerged as a leading molecular typing method owing to its high ability to discriminate among bacterial isolates, the relative ease with which data acquisition and analysis can be standardized, and the high portability of the resulting sequence data. While MLST has been successfully applied to the study of the population structure for a number of different bacterial species, it has also provided compelling evidence for high rates of recombination in some species. We have analyzed a set of Campylobacter jejuni strains using MLST and Comparative Genomic Hybridization (CGH) on a full-genome microarray in order to determine whether recombination and high levels of genomic mosaicism adversely affect the inference of strain relationships based on the analysis of a restricted number of genetic loci. Results: Our results indicate that, in general, there is significant concordance between strain relationships established by MLST and those based on shared gene content as established by CGH. While MLST has significant predictive power with respect to overall genome similarity of isolates, we also found evidence for significant differences in genomic content among strains that would otherwise appear to be highly related based on their MLST profiles. Conclusion: The extensive genomic mosaicism between closely related strains has important implications in the context of establishing strain to strain relationships because it suggests that the exact gene content of strains, and by extension their phenotype, is less likely to be "predicted" based on a small number of typing loci. This in turn suggests that a greater emphasis should be placed on analyzing genes of clinical interest as we forge ahead with the next generation of molecular typing methods.
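    The concordance question in this record (do strains with identical MLST profiles also share gene content?) can be sketched with toy distance calculations; all strain names, allele numbers, and presence/absence calls below are invented for illustration and are not the study's data.

    ```python
    # Sketch: compare MLST-based distance with gene-content (CGH-like) distance.
    # Hypothetical profiles only; real MLST uses 7 housekeeping loci per strain.

    # MLST profile: allele number at each of 7 loci.
    mlst = {
        "ST-21a": [2, 1, 1, 3, 2, 1, 5],
        "ST-21b": [2, 1, 1, 3, 2, 1, 5],   # same sequence type as ST-21a
        "ST-45":  [4, 7, 10, 4, 1, 7, 1],
    }

    # Gene content: presence (1) / absence (0) calls for accessory genes.
    genes = {
        "ST-21a": [1, 1, 1, 0, 1, 0, 1, 1],
        "ST-21b": [1, 0, 1, 0, 0, 0, 1, 1],   # same ST, different gene content
        "ST-45":  [0, 1, 0, 1, 0, 1, 1, 0],
    }

    def mlst_dist(a, b):
        """Number of loci with different alleles (0..7)."""
        return sum(x != y for x, y in zip(mlst[a], mlst[b]))

    def jaccard_dist(a, b):
        """1 - Jaccard similarity of gene-content profiles."""
        u, v = genes[a], genes[b]
        inter = sum(1 for x, y in zip(u, v) if x and y)
        union = sum(1 for x, y in zip(u, v) if x or y)
        return 1 - inter / union

    for a, b in [("ST-21a", "ST-21b"), ("ST-21a", "ST-45")]:
        print(a, b, mlst_dist(a, b), round(jaccard_dist(a, b), 2))
    ```

    A nonzero gene-content distance between the two identical-ST strains mirrors the paper's central finding: identical typing profiles do not guarantee identical gene content.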

  16. Physics of laser fusion. Volume II. Diagnostics of experiments on laser fusion targets at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Ahlstrom, H.G.

    1982-01-01

    These notes present the experimental basis and status for laser fusion as developed at LLNL. There are two other volumes in this series: Vol. I, by C.E. Max, presents the theoretical laser-plasma interaction physics; Vol. III, by J.F. Holzrichter et al., presents the theory and design of high-power pulsed lasers. A fourth volume will present the theoretical implosion physics. The notes consist of six sections. The first, an introductory section, provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLNL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLNL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future.

  17. Physics of laser fusion. Volume II. Diagnostics of experiments on laser fusion targets at LLNL

    International Nuclear Information System (INIS)

    Ahlstrom, H.G.

    1982-01-01

    These notes present the experimental basis and status for laser fusion as developed at LLNL. There are two other volumes in this series: Vol. I, by C.E. Max, presents the theoretical laser-plasma interaction physics; Vol. III, by J.F. Holzrichter et al., presents the theory and design of high-power pulsed lasers. A fourth volume will present the theoretical implosion physics. The notes consist of six sections. The first, an introductory section, provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLNL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLNL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future

  18. Institute of Geophysics and Planetary Physics (IGPP), Lawrence Livermore National Laboratory (LLNL): Quinquennial report, November 14-15, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Tweed, J.

    1996-10-01

    This Quinquennial Review Report of the Lawrence Livermore National Laboratory (LLNL) branch of the Institute for Geophysics and Planetary Physics (IGPP) provides an overview of IGPP-LLNL, its mission, and research highlights of current scientific activities. This report also presents an overview of the University Collaborative Research Program (UCRP), a summary of the UCRP Fiscal Year 1997 proposal process and the project selection list, a funding summary for 1993-1996, seminars presented, and scientific publications. 2 figs., 3 tabs.

  19. New approaches to assessing the effects of mutagenic agents on the integrity of the human genome

    International Nuclear Information System (INIS)

    Elespuru, R.K.; Sankaranarayanan, K.

    2007-01-01

    Heritable genetic alterations, although individually rare, have a substantial collective health impact. Approximately 20% of these are new mutations of unknown cause. Assessment of the effect of exposures to DNA damaging agents, i.e. mutagenic chemicals and radiations, on the integrity of the human genome and on the occurrence of genetic disease remains a daunting challenge. Recent insights may explain why previous examination of human exposures to ionizing radiation, as in Hiroshima and Nagasaki, failed to reveal heritable genetic effects. New opportunities to assess the heritable genetic damaging effects of environmental mutagens are afforded by: (1) integration of knowledge on the molecular nature of genetic disorders and the molecular effects of mutagens; (2) the development of more practical assays for germline mutagenesis; (3) the likely use of population-based genetic screening in personalized medicine

  20. Genome-wide assessment in Escherichia coli reveals time-dependent nanotoxicity paradigms.

    Science.gov (United States)

    Reyes, Vincent C; Li, Minghua; Hoek, Eric M V; Mahendra, Shaily; Damoiseaux, Robert

    2012-11-27

    The use of engineered nanomaterials (eNM) in consumer and industrial products is increasing exponentially. Our ability to rapidly assess their potential effects on human and environmental health is limited by our understanding of nanomediated toxicity. High-throughput screening (HTS) enables the investigation of nanomediated toxicity on a genome-wide level, thus uncovering their novel mechanisms and paradigms. Herein, we investigate the toxicity of zinc-containing nanomaterials (Zn-eNMs) using a time-resolved HTS methodology in an arrayed Escherichia coli genome-wide knockout (KO) library. The library was screened against nanoscale zerovalent zinc (nZn), nanoscale zinc oxide (nZnO), and zinc chloride (ZnCl2) salt as reference. Through sequential screening over 24 h, our method identified 173 sensitive clones from diverse biological pathways, which fell into two general groups: early and late responders. The overlap between these groups was small. Our results suggest that bacterial toxicity mechanisms change from pathways related to general metabolic function, transport, signaling, and metal ion homeostasis to membrane synthesis pathways over time. While all zinc sources shared pathways relating to membrane damage and metal ion homeostasis, Zn-eNMs and ZnCl2 displayed differences in their sensitivity profiles. For example, ZnCl2 and nZnO elicited unique responses in pathways related to two-component signaling and monosaccharide biosynthesis, respectively. Single isolated measurements, such as MIC or IC50, are inadequate, and time-resolved approaches utilizing genome-wide assays are therefore needed to capture this crucial dimension and illuminate the dynamic interplay at the nano-bio interface.
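    The early/late responder split described here can be sketched as a simple time-resolved classification; the clone names, growth ratios, and thresholds below are invented for illustration, not taken from the study.

    ```python
    # Sketch: classify KO clones as early/late responders from hypothetical
    # fractional growth (treated / untreated) over a 24 h screen.
    times = [4, 8, 12, 24]                     # hours after exposure
    growth = {                                 # illustrative values only
        "zntA": [0.4, 0.3, 0.3, 0.2],          # hit early (ion homeostasis)
        "mrcB": [0.9, 0.9, 0.5, 0.3],          # hit late (membrane synthesis)
        "lacZ": [1.0, 0.95, 1.0, 0.9],         # never sensitive
    }
    THRESHOLD, EARLY_CUTOFF = 0.5, 12          # sensitivity and early/late split

    def classify(series):
        """Return 'early', 'late', or 'insensitive' from first sub-threshold time."""
        hits = [t for t, g in zip(times, series) if g < THRESHOLD]
        if not hits:
            return "insensitive"
        return "early" if hits[0] < EARLY_CUTOFF else "late"

    calls = {clone: classify(g) for clone, g in growth.items()}
    print(calls)
    ```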

  1. Quantitative Assessment of Thermodynamic Constraints on the Solution Space of Genome-Scale Metabolic Models

    Science.gov (United States)

    Hamilton, Joshua J.; Dwivedi, Vivek; Reed, Jennifer L.

    2013-01-01

    Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physiochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations. PMID:23870272

  2. Genomic sequencing: assessing the health care system, policy, and big-data implications.

    Science.gov (United States)

    Phillips, Kathryn A; Trosman, Julia R; Kelley, Robin K; Pletcher, Mark J; Douglas, Michael P; Weldon, Christine B

    2014-07-01

    New genomic sequencing technologies enable the high-speed analysis of multiple genes simultaneously, including all of those in a person's genome. Sequencing is a prominent example of a "big data" technology because of the massive amount of information it produces and its complexity, diversity, and timeliness. Our objective in this article is to provide a policy primer on sequencing and illustrate how it can affect health care system and policy issues. Toward this end, we developed an easily applied classification of sequencing based on inputs, methods, and outputs. We used it to examine the implications of sequencing for three health care system and policy issues: making care more patient-centered, developing coverage and reimbursement policies, and assessing economic value. We conclude that sequencing has great promise but that policy challenges include how to optimize patient engagement as well as privacy, develop coverage policies that distinguish research from clinical uses and account for bioinformatics costs, and determine the economic value of sequencing through complex economic models that take into account multiple findings and downstream costs. Project HOPE—The People-to-People Health Foundation, Inc.

  3. Quantitative assessment of thermodynamic constraints on the solution space of genome-scale metabolic models.

    Science.gov (United States)

    Hamilton, Joshua J; Dwivedi, Vivek; Reed, Jennifer L

    2013-07-16

    Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physiochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
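    The core TMFA idea, that an unfavorable estimated reaction free energy removes a flux direction from the feasible space, can be sketched on a toy flux-balance problem. The network, capacities, and ΔG′ values below are invented for illustration (scipy is assumed available); they are not taken from the paper's E. coli reconstruction.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: R1: -> A (uptake), R2: A -> B, R3: A -> B (parallel route),
    # Rbio: B -> (objective). Columns: [R1, R2, R3, Rbio]; rows: [A, B].
    S = np.array([[1, -1, -1,  0],
                  [0,  1,  1, -1]], dtype=float)
    c = [0, 0, 0, -1]                      # maximize Rbio (linprog minimizes)

    def max_biomass(dG_R3):
        """Solve FBA; if the estimated dG' of R3 is positive, its forward flux
        is disallowed, mimicking a TMFA-style thermodynamic constraint."""
        r3_ub = 0.0 if dG_R3 > 0 else 6.0
        bounds = [(0, 10), (0, 6), (0, r3_ub), (0, None)]
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        return -res.fun

    print(max_biomass(dG_R3=-5.0))   # both routes open -> uptake-limited
    print(max_biomass(dG_R3=+3.0))   # R3 blocked -> R2-capacity-limited
    ```

    Blocking the thermodynamically infeasible route shrinks the optimum, which is the flux-space effect the paper quantifies at genome scale.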

  4. Single Amplified Genomes as Source for Novel Extremozymes: Annotation, Expression and Functional Assessment

    KAUST Repository

    Grötzinger, Stefan

    2017-12-01

    Enzymes, as nature’s catalysts, show remarkable abilities that can revolutionize the chemical, biotechnological, bioremediation, agricultural and pharmaceutical industries. However, the narrow range of stability of the majority of described biocatalysts limits their use for many applications. To overcome these restrictions, extremozymes derived from microorganisms thriving under harsh conditions can be used. Extremophiles living in high salinity are especially interesting as they operate at low water activity, which is similar to conditions used in standard chemical applications. Because only about 0.1% of all microorganisms can be cultured, traditional culture-based determination of enzyme function must be circumvented. The rise of high-throughput next-generation-sequencing technologies allows for deep insight into nature’s variety. Single amplified genomes (SAGs) specifically allow for whole genome assemblies from small sample volumes with low cell yields, as are typical for extreme environments. Although these technologies have been available for years, the expected boost in biotechnology has yet to materialize. One of the main reasons is the lack of reliable functional annotation of the genomic data, caused by the low fraction (0.15%) of experimentally described genes. Here, we present a novel annotation algorithm, designed to annotate the enzymatic function of genomes from microorganisms with low homologies to described microorganisms. The algorithm was established on SAGs from the extreme environment of selected hypersaline Red Sea brine pools with 4.3 M salinity and temperatures up to 68°C. Additionally, a novel consensus pattern for the identification of γ-carbonic anhydrases was created and applied in the algorithm. To verify the annotation, selected genes were expressed in the hypersaline expression system Halobacterium salinarum. This expression system was established and optimized in a continuously stirred tank reactor, leading to
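    The consensus-pattern idea can be illustrated with a PROSITE-style motif scanner. The pattern and sequence below are hypothetical stand-ins, since the paper's actual γ-carbonic anhydrase pattern is not reproduced in this abstract.

    ```python
    import re

    # Hypothetical PROSITE-style pattern -- NOT the paper's actual consensus.
    prosite = "C-x(2)-[DE]-x(3)-H"

    def prosite_to_regex(p):
        """Translate a PROSITE-style pattern into a Python regex."""
        p = p.replace("-", "")                     # drop position separators
        p = re.sub(r"x\((\d+)\)", r".{\1}", p)     # x(2) -> .{2}
        p = p.replace("x", ".")                    # lone x -> any residue
        return p

    pattern = re.compile(prosite_to_regex(prosite))
    seq = "MKTAYCLLDAKWHVNN"                       # toy protein sequence
    m = pattern.search(seq)
    print(m.group(0) if m else None)               # -> CLLDAKWH
    ```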

  5. Assessing Student Understanding of the "New Biology": Development and Evaluation of a Criterion-Referenced Genomics and Bioinformatics Assessment

    Science.gov (United States)

    Campbell, Chad Edward

    Over the past decade, hundreds of studies have introduced genomics and bioinformatics (GB) curricula and laboratory activities at the undergraduate level. While these publications have facilitated the teaching and learning of cutting-edge content, there has yet to be an evaluation of these assessment tools to determine if they are meeting the quality control benchmarks set forth by the educational research community. An analysis of these assessment tools indicated that they do not support valid and reliable inferences about student learning. To remedy this situation, the development of a robust GB assessment aligned with the quality control benchmarks was undertaken in order to ensure evidence-based evaluation of student learning outcomes. Content validity is a central piece of construct validity, and it must be used to guide instrument and item development. This study reports on: (1) the correspondence of content validity evidence gathered from independent sources; (2) the process of item development using this evidence; (3) the results from a pilot administration of the assessment; (4) the subsequent modification of the assessment based on the pilot administration results; and (5) the results from the second administration of the assessment. Twenty-nine different subtopics within GB (Appendix B: Genomics and Bioinformatics Expert Survey) were developed based on preliminary GB textbook analyses. These subtopics were analyzed using two methods designed to gather content validity evidence: (1) a survey of GB experts (n=61) and (2) detailed content analyses of GB textbooks (n=6). By including only the subtopics that were shown to have robust support across these sources, 22 GB subtopics were established for inclusion in the assessment. An expert panel subsequently developed, evaluated, and revised two multiple-choice items to align with each of the 22 subtopics, producing a final item pool of 44 items. These items were piloted with student samples of varying content exposure levels

  6. Assessment of Genetic Heterogeneity in Structured Plant Populations Using Multivariate Whole-Genome Regression Models.

    Science.gov (United States)

    Lehermeier, Christina; Schön, Chris-Carolin; de Los Campos, Gustavo

    2015-09-01

    Plant breeding populations exhibit varying levels of structure and admixture; these features are likely to induce heterogeneity of marker effects across subpopulations. Traditionally, structure has been dealt with as a potential confounder, and various methods exist to "correct" for population stratification. However, these methods induce a mean correction that does not account for heterogeneity of marker effects. The animal breeding literature offers a few recent studies that consider modeling genetic heterogeneity in multibreed data, using multivariate models. However, these methods have received little attention in plant breeding where population structure can have different forms. In this article we address the problem of analyzing data from heterogeneous plant breeding populations, using three approaches: (a) a model that ignores population structure [A-genome-based best linear unbiased prediction (A-GBLUP)], (b) a stratified (i.e., within-group) analysis (W-GBLUP), and (c) a multivariate approach that uses multigroup data and accounts for heterogeneity (MG-GBLUP). The performance of the three models was assessed on three different data sets: a diversity panel of rice (Oryza sativa), a maize (Zea mays L.) half-sib panel, and a wheat (Triticum aestivum L.) data set that originated from plant breeding programs. The estimated genomic correlations between subpopulations varied from null to moderate, depending on the genetic distance between subpopulations and traits. Our assessment of prediction accuracy features cases where ignoring population structure leads to a parsimonious, more powerful model as well as others where the multivariate and stratified approaches have higher predictive power. In general, the multivariate approach appeared slightly more robust than either the A- or the W-GBLUP. Copyright © 2015 by the Genetics Society of America.
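    A minimal sketch of the baseline A-GBLUP predictor mentioned in the abstract, using simulated genotypes and a genomic relationship matrix G = ZZ′/p (numpy assumed; the marker coding, variance ratio, and data are illustrative, and the multigroup MG-GBLUP extension is not shown).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated data: n lines, p SNP markers coded {-1, 0, 1}; not real genotypes.
    n, p = 30, 200
    Z = rng.integers(-1, 2, size=(n, p)).astype(float)
    true_effects = rng.normal(0, 0.1, size=p)
    y = Z @ true_effects + rng.normal(0, 0.5, size=n)   # phenotype = genetics + noise

    # A-GBLUP ignoring structure: genomic relationship matrix G = ZZ'/p and
    # BLUP of genetic values with ridge parameter lam = sigma_e^2 / sigma_g^2.
    G = Z @ Z.T / p
    lam = 1.0                                            # assumed variance ratio
    u_hat = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())

    # In-sample accuracy: correlation of predicted and true genetic values.
    accuracy = np.corrcoef(u_hat, Z @ true_effects)[0, 1]
    print(round(accuracy, 2))
    ```

    The W-GBLUP in the abstract would simply repeat this fit within each subpopulation, while MG-GBLUP models correlated marker effects across groups.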

  7. Status of the SLAC/LBL/LLNL B-Factory and the BaBar detector

    International Nuclear Information System (INIS)

    Oddone, P.

    1994-08-01

    The primary motivation of the Asymmetric B-Factory is the study of CP violation. The decay of B mesons and, in particular, the decay of neutral B mesons, offers the possibility of determining conclusively whether CP violation is part and parcel of the Standard Model with three generations of quarks and leptons. Alternatively, the authors may discover that CP violation lies outside the present framework. In this paper the authors briefly describe the physics reach of the SLAC/LBL/LLNL Asymmetric B-Factory, the progress on the machine design and construction, the progress on the detector design, and the schedule to complete both projects

  8. M4FT-15LL0806062-LLNL Thermodynamic and Sorption Data FY15 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Zavarin, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wolery, T. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-31

    This progress report (Milestone Number M4FT-15LL0806062) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within Work Package Number FT-15LL080606. The focus of this research is the thermodynamic modeling of Engineered Barrier System (EBS) materials and properties and development of thermodynamic databases and models to evaluate the stability of EBS materials and their interactions with fluids at various physicochemical conditions relevant to subsurface repository environments. The development and implementation of equilibrium thermodynamic models are intended to describe chemical and physical processes such as solubility, sorption, and diffusion.

  9. Analyses in Support of Z-IFE LLNL Progress Report for FY-05

    International Nuclear Information System (INIS)

    Moir, R W; Abbott, R P; Callahan, D A; Latkowski, J F; Meier, W R; Reyes, S

    2005-01-01

    The FY04 LLNL study of Z-IFE [1] proposed and evaluated a design that deviated from SNL's previous baseline design. The FY04 study included analyses of shock mitigation, stress in the first wall, neutronics and systems studies. In FY05, the subject of this report, we build on our work and the theme of last year. Our emphasis continues to be on alternatives that hold promise of considerable improvements in design and economics compared to the base-line design. Our key results are summarized here

  10. Integrate genome-based assessment of safety for probiotic strains: Bacillus coagulans GBI-30, 6086 as a case study.

    Science.gov (United States)

    Salvetti, Elisa; Orrù, Luigi; Capozzi, Vittorio; Martina, Alessia; Lamontanara, Antonella; Keller, David; Cash, Howard; Felis, Giovanna E; Cattivelli, Luigi; Torriani, Sandra; Spano, Giuseppe

    2016-05-01

    Probiotics are microorganisms that confer beneficial effects on the host; nevertheless, before being allowed for human consumption, their safety must be verified with accurate protocols. In the genomic era, such procedures should take into account the genomic-based approaches. This study aims at assessing the safety traits of Bacillus coagulans GBI-30, 6086 integrating the most updated genomics-based procedures and conventional phenotypic assays. Special attention was paid to putative virulence factors (VF), antibiotic resistance (AR) genes and genes encoding enzymes responsible for harmful metabolites (i.e. biogenic amines, BAs). This probiotic strain was phenotypically resistant to streptomycin and kanamycin, although the genome analysis suggested that the AR-related genes were not easily transferrable to other bacteria, and no other genes with potential safety risks, such as those related to VF or BA production, were retrieved. Furthermore, no unstable elements that could potentially lead to genomic rearrangements were detected. Moreover, a workflow is proposed to allow the proper taxonomic identification of a microbial strain and the accurate evaluation of risk-related gene traits, combining whole genome sequencing analysis with updated bioinformatics tools and standard phenotypic assays. The workflow presented can be generalized as a guideline for the safety investigation of novel probiotic strains to help stakeholders (from scientists to manufacturers and consumers) to meet regulatory requirements and avoid misleading information.

  11. Over Batch Analysis for the LLNL Plutonium Packaging System (PuPS)

    International Nuclear Information System (INIS)

    Riley, D.; Dodson, K.

    2007-01-01

    This document addresses the concern raised in the Savannah River Site (SRS) Acceptance Criteria (Reference 1, Section 6.a.3) about receiving an item that is over batched by 1.0 kg of fissile materials. This document shows that the occurrence of this is incredible. Some of the Department of Energy Standard 3013 (DOE-STD-3013) requirements are described in Section 2.1. The SRS requirement is discussed in Section 2.2. Section 2.3 describes the way fissile materials are handled in the Lawrence Livermore National Laboratory (LLNL) Plutonium Facility (B332). Based on the material handling discussed in Section 2.3, there are only three errors that could result in a shipping container being over batched. These are: incorrect measurement of the item, selecting the wrong item to package, and packaging two items into a single shipping container. The analysis in Section 3 shows that the first two events are incredible because of the controls that exist at LLNL. The third event is physically impossible. Therefore, it is incredible for an item to be shipped to SRS that is more than 1.0 kg of fissile materials over batched

  12. Over Batch Analysis for the LLNL DOE-STD-3013 Packaging System

    International Nuclear Information System (INIS)

    Riley, D.C.; Dodson, K.

    2009-01-01

    This document addresses the concern raised in the Savannah River Site (SRS) Acceptance Criteria about receiving an item that is over batched by 1.0 kg of fissile materials. This document shows that the occurrence of this is incredible. Some of the Department of Energy Standard 3013 (DOE-STD-3013) requirements are described in Section 2.1. The SRS requirement is discussed in Section 2.2. Section 2.3 describes the way fissile materials are handled in the Lawrence Livermore National Laboratory (LLNL) Plutonium Facility (B332). Based on the material handling discussed in Section 2.3, there are only three errors that could result in a shipping container being over batched. These are: incorrect measurement of the item, selecting the wrong item to package, and packaging two items into a single shipping container. The analysis in Section 3 shows that the first two events are incredible because of the controls that exist at LLNL. The third event is physically impossible. Therefore, it is incredible for an item to be shipped to SRS that is more than 1.0 kg of fissile materials over batched.

  13. Implementing necessary and sufficient standards for radioactive waste management at LLNL

    International Nuclear Information System (INIS)

    Sims, J.M.; Ladran, A.; Hoyt, D.

    1995-01-01

    Lawrence Livermore National Laboratory (LLNL) and the U.S. Department of Energy, Oakland Field Office (DOE/OAK), are participating in a pilot program to evaluate the process to develop necessary and sufficient sets of standards for contractor activities. This concept of contractor and DOE jointly and locally deciding on what constitutes the set of standards that are necessary and sufficient to perform work safely and in compliance with federal, state, and local regulations, grew out of DOE's Department Standards Committee (Criteria for the Department's Standards Program, August 1994, DOE/EH/-0416). We have chosen radioactive waste management activities as the pilot program at LLNL. This pilot includes low-level radioactive waste, transuranic (TRU) waste, and the radioactive component of low-level and TRU mixed wastes. Guidance for the development and implementation of the necessary and sufficient set of standards is provided in "The Department of Energy Closure Process for Necessary and Sufficient Sets of Standards," March 27, 1995 (draft).

  14. LLNL Experimental Test Site (Site 300) Potable Water System Operations Plan

    Energy Technology Data Exchange (ETDEWEB)

    Ocampo, R. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bellah, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-14

    The existing Lawrence Livermore National Laboratory (LLNL) Site 300 drinking water system operation schematic is shown in Figures 1 and 2 below. The sources of water are from two Site 300 wells (Well #18 and Well #20) and San Francisco Public Utilities Commission (SFPUC) Hetch-Hetchy water through the Thomas shaft pumping station. Currently, Well #20 with 300 gallons per minute (gpm) pump capacity is the primary source of well water used during the months of September through July, while Well #18 with 225 gpm pump capacity is the source of well water for the month of August. The well water is chlorinated using sodium hypochlorite to provide required residual chlorine throughout Site 300. Well water chlorination is covered in the Lawrence Livermore National Laboratory Experimental Test Site (Site 300) Chlorination Plan (“the Chlorination Plan”; LLNL-TR-642903; current version dated August 2013). The third source of water is the SFPUC Hetch-Hetchy Water System through the Thomas shaft facility with a 150 gpm pump capacity. At the Thomas shaft station the pumped water is treated through SFPUC-owned and operated ultraviolet (UV) reactor disinfection units on its way to Site 300. The Thomas Shaft Hetch-Hetchy water line is connected to the Site 300 water system through the line common to Well pumps #18 and #20 at valve box #1.

  15. Use of Genomic Data in Risk Assessment Case Study: II. Evaluation of the Dibutyl Phthalate Toxicogenomic Dataset

    Science.gov (United States)

    An evaluation of the toxicogenomic data set for dibutyl phthalate (DBP) and male reproductive developmental effects was performed as part of a larger case study to test an approach for incorporating genomic data in risk assessment. The DBP toxicogenomic data set is composed of ni...

  16. Integrated genomic and BMI analysis for type 2 diabetes risk assessment.

    Directory of Open Access Journals (Sweden)

    Dayanara Lebrón-Aldea

    2015-03-01

    Full Text Available Type 2 Diabetes (T2D) is a chronic disease arising from the development of insulin absence or resistance within the body, and a complex interplay of environmental and genetic factors. The incidence of T2D has increased throughout the last few decades, together with the occurrence of the obesity epidemic. The consideration of variants identified by Genome Wide Association Studies (GWAS) in risk assessment models for T2D could aid in the identification of at-risk patients who could benefit from preventive medicine. In this study, we built several risk assessment models and evaluated them with two different classification approaches (Logistic Regression and Neural Networks) to measure the effect of including genetic information in the prediction of T2D. We used data from the Original and the Offspring cohorts of the Framingham Heart Study, which provides phenotypic and genetic information for 5,245 subjects (4,306 controls and 939 cases). Models were built by using several covariates: gender, exposure time, cohort, body mass index (BMI), and 65 established T2D-associated SNPs. We fitted Logistic Regressions and Bayesian Regularized Neural Networks and then assessed their predictive ability by using a ten-fold cross validation. We found that the inclusion of genetic information into the risk assessment models increased the predictive ability by 2%, when compared to the baseline model. Furthermore, the models that included BMI at the onset of diabetes as a possible effector gave an improvement of 6% in the area under the curve derived from the ROC analysis. The highest AUC achieved (0.75) belonged to the model that included BMI and a genetic score based on the 65 established T2D-associated SNPs. Finally, the inclusion of SNPs and BMI raised predictive ability in all models as expected; however, results from the AUC in Neural Networks and Logistic Regression did not differ significantly in their prediction accuracy.
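    The modeling recipe in this abstract (a 65-SNP genetic score combined with BMI, with models compared by AUC) can be sketched on synthetic data. Everything below is invented for illustration: the per-SNP weights, effect sizes, and simulated cohort stand in for the GWAS-derived values and Framingham data the study actually used.

```python
import math
import random

random.seed(7)
N_SNPS, N = 65, 2000

# Hypothetical per-SNP log-odds weights standing in for published GWAS estimates.
weights = [random.uniform(0.02, 0.15) for _ in range(N_SNPS)]

def simulate_subject():
    geno = [random.choice([0, 1, 2]) for _ in range(N_SNPS)]  # risk-allele counts
    bmi = random.gauss(27.0, 4.0)
    score = sum(w * g for w, g in zip(weights, geno))         # genetic risk score
    logit = -10.0 + 0.18 * bmi + 1.0 * score                  # true risk uses both
    case = random.random() < 1.0 / (1.0 + math.exp(-logit))
    return bmi, score, case

subjects = [simulate_subject() for _ in range(N)]

def auc(predictor):
    """Mann-Whitney estimate of the area under the ROC curve."""
    cases = [predictor(b, s) for b, s, c in subjects if c]
    ctrls = [predictor(b, s) for b, s, c in subjects if not c]
    wins = sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in cases for y in ctrls)
    return wins / (len(cases) * len(ctrls))

auc_base = auc(lambda bmi, score: bmi)                     # baseline: BMI only
auc_full = auc(lambda bmi, score: 0.18 * bmi + score)      # BMI + genetic score
print(f"AUC: BMI only = {auc_base:.3f}, BMI + SNPs = {auc_full:.3f}")
```

    Because the simulated risk truly depends on both BMI and the genetic score, the augmented model scores a higher AUC than the BMI-only baseline, mirroring the modest gains the abstract reports.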

  17. Genomic selection needs to be carefully assessed to meet specific requirements in livestock breeding programs

    Directory of Open Access Journals (Sweden)

    Elisabeth Jonas

    2015-02-01

    Full Text Available Genomic selection is a promising development in agriculture, aiming at improved production by exploiting molecular genetic markers to design novel breeding programs and to develop new marker-based models for genetic evaluation. It opens opportunities for research, as novel algorithms and lab methodologies are developed. Genomic selection can be applied in many breeds and species. Further research on the implementation of genomic selection in breeding programs is highly desirable not only for the common good, but also for the private sector (breeding companies). It has been projected that this approach will improve selection routines, especially in species with long reproduction cycles, late or sex-limited or expensive trait recording, and for complex traits. The task of integrating genomic selection into existing breeding programs is, however, not straightforward. Despite successful integration into breeding programs for dairy cattle, it has yet to be shown how much emphasis can be given to the genomic information and how much additional phenotypic information is needed from new selection candidates. Genomic selection is already part of future planning in many breeding companies of pigs and beef cattle among others, but further research is needed to fully estimate how effective the use of genomic information will be for the prediction of the performance of future breeding stock. Genomic prediction of production in crossbreeding and across-breed schemes, costs, and the choice of individuals for genotyping are reasons for a reluctance to fully rely on genomic information for selection decisions. Breeding objectives are highly dependent on the industry, and the additional gain when using genomic information has to be considered carefully. This review synthesizes some of the suggested approaches in selected livestock species including cattle, pig, chicken and fish. It outlines tasks to help in understanding possible consequences when applying genomic information in breeding scenarios.

  18. The role of the LLNL Atmospheric Release Advisory Capability in a FRMAC response to a nuclear power plant incident

    International Nuclear Information System (INIS)

    Baskett, R.L.; Sullivan, T.J.; Ellis, J.S.; Foster, C.S.

    1994-01-01

    The Federal Radiological Emergency Response Plan (FRERP) can provide several emergency response resources in response to a nuclear power plant (NPP) accident if requested by a state or local agency. The primary FRERP technical resources come from the US Department of Energy's (DOE) Federal Radiological Monitoring and Assessment Center (FRMAC). Most of the FRMAC assets are located at the DOE Remote Sensing Laboratory (RSL) at Nellis Air Force Base, Las Vegas, Nevada. In addition, the primary atmospheric dispersion modeling and dose assessment asset, the Atmospheric Release Advisory Capability (ARAC), is located at Lawrence Livermore National Laboratory (LLNL) in Livermore, California. In the early stages of a response, ARAC relies on its automatic worldwide meteorological data acquisition via the Air Force Global Weather Center (AFGWC). The regional airport data are supplemented with data from on-site towers and sodars and the National Oceanic and Atmospheric Administration's (NOAA) field-deployable real-time rawinsonde system. ARAC is prepared with a three-dimensional regional-scale diagnostic dispersion model to simulate the complex mixed fission product release from a reactor accident. The program has been operational for 18 years and is presently developing its third-generation system. The current modernization includes faster central computers, a new site workstation system, improvements in its diagnostic dispersion models, addition of a new hybrid-particle source term, and implementation of a mesoscale prognostic model. As these new capabilities evolve, they will be integrated into the FRMAC's field-deployable assets.

  19. Application of Whole Genome Expression Analysis to Assess Bacterial Responses to Environmental Conditions

    Science.gov (United States)

    Vukanti, R. V.; Mintz, E. M.; Leff, L. G.

    2005-05-01

    Bacterial responses to environmental signals are multifactorial and are coupled to changes in gene expression. An understanding of bacterial responses to environmental conditions is possible using microarray expression analysis. In this study, the utility of microarrays for examining changes in gene expression in Escherichia coli under different environmental conditions was assessed. RNA was isolated, hybridized to Affymetrix E. coli Genome 2.0 chips and analyzed using Affymetrix GCOS and Genespring software. Major limiting factors were obtaining enough quality RNA (10^7–10^8 cells to get 10 μg RNA) and accounting for differences in growth rates under different conditions. Stabilization of RNA prior to isolation and taking extreme precautions while handling RNA were crucial. In addition, use of this method in ecological studies is limited by the availability and cost of commercial arrays; the choice of primers for cDNA synthesis; reproducibility; the complexity of results generated; and the need to validate findings. This method may be more widely applicable with the development of better approaches for RNA recovery from environmental samples and an increased number of available strain-specific arrays. Diligent experimental design and verification of results with real-time PCR or northern blots are needed. Overall, there is great potential for use of this technology to discover mechanisms underlying organisms' responses to environmental conditions.

  20. Integration of HIV in the Human Genome: Which Sites Are Preferential? A Genetic and Statistical Assessment

    Science.gov (United States)

    Gonçalves, Juliana; Moreira, Elsa; Sequeira, Inês J.; Rodrigues, António S.; Rueff, José; Brás, Aldina

    2016-01-01

    Chromosomal fragile sites (FSs) are loci where gaps and breaks may occur and are preferential integration targets for some viruses, for example, Hepatitis B, Epstein-Barr virus, HPV16, HPV18, and MLV vectors. However, the integration of the human immunodeficiency virus (HIV) in Giemsa bands and in FSs is not yet completely clear. This study aimed to assess the integration preferences of HIV in FSs and in Giemsa bands using an in silico study. HIV integration positions from Jurkat cells were used and two nonparametric tests were applied to compare HIV integration in dark versus light bands and in FS versus non-FS (NFSs). The results show that light bands are preferential targets for integration of HIV-1 in Jurkat cells and also that it integrates with equal intensity in FSs and in NFSs. The data indicates that HIV displays different preferences for FSs compared to other viruses. The aim was to develop and apply an approach to predict the conditions and constraints of HIV insertion in the human genome which seems to adequately complement empirical data. PMID:27294106

  1. Genomic selection needs to be carefully assessed to meet specific requirements in livestock breeding programs.

    Science.gov (United States)

    Jonas, Elisabeth; de Koning, Dirk-Jan

    2015-01-01

    Genomic selection is a promising development in agriculture, aiming at improved production by exploiting molecular genetic markers to design novel breeding programs and to develop new marker-based models for genetic evaluation. It opens opportunities for research, as novel algorithms and lab methodologies are developed. Genomic selection can be applied in many breeds and species. Further research on the implementation of genomic selection (GS) in breeding programs is highly desirable not only for the common good, but also for the private sector (breeding companies). It has been projected that this approach will improve selection routines, especially in species with long reproduction cycles, late or sex-limited or expensive trait recording, and for complex traits. The task of integrating GS into existing breeding programs is, however, not straightforward. Despite successful integration into breeding programs for dairy cattle, it has yet to be shown how much emphasis can be given to the genomic information and how much additional phenotypic information is needed from new selection candidates. Genomic selection is already part of future planning in many breeding companies of pigs and beef cattle among others, but further research is needed to fully estimate how effective the use of genomic information will be for the prediction of the performance of future breeding stock. Genomic prediction of production in crossbreeding and across-breed schemes, costs, and the choice of individuals for genotyping are reasons for a reluctance to fully rely on genomic information for selection decisions. Breeding objectives are highly dependent on the industry, and the additional gain when using genomic information has to be considered carefully. This review synthesizes some of the suggested approaches in selected livestock species including cattle, pig, chicken, and fish. It outlines tasks to help in understanding possible consequences when applying genomic information in breeding scenarios.

  2. SeqEntropy: genome-wide assessment of repeats for short read sequencing.

    Directory of Open Access Journals (Sweden)

    Hsueh-Ting Chu

    Full Text Available BACKGROUND: Recent studies on genome assembly from short-read sequencing data reported the limitation of this technology to reconstruct the entire genome even at very high depth coverage. We investigated the limitation from the perspective of information theory to evaluate the effect of repeats on short-read genome assembly using idealized (error-free) reads at different lengths. METHODOLOGY/PRINCIPAL FINDINGS: We define a metric H(k) to be the entropy of sequencing reads at a read length k and use the relative loss of entropy ΔH(k) to measure the impact of repeats for the reconstruction of the whole genome from sequences of length k. In our experiments, we found that entropy loss correlates well with de-novo assembly coverage of a genome, and a score of ΔH(k) > 1% indicates a severe loss in genome reconstruction fidelity. The minimal read lengths to achieve ΔH(k) < 1% are different for various organisms and are independent of the genome size. For example, in order to meet the threshold of ΔH(k) < 1%, a read length of 60 bp is needed for the sequencing of the human genome (3.2×10^9 bp) and 320 bp for the sequencing of fruit fly (1.8×10^8 bp). We also calculated the ΔH(k) scores for 2725 prokaryotic chromosomes and plasmids at several read lengths. Our results indicate that the levels of repeats in different genomes are diverse and the entropy of sequencing reads provides a measurement for the repeat structures. CONCLUSIONS/SIGNIFICANCE: The proposed entropy-based measurement, which can be calculated in seconds to minutes in most cases, provides a rapid quantitative evaluation of the limitation of idealized short-read genome sequencing. Moreover, the calculation can be parallelized to scale up to large eukaryotic genomes. This approach may be useful to tune the sequencing parameters to achieve better genome assemblies when a closely related genome is already available.
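    The entropy metric can be made concrete in a few lines. The sketch below uses one plausible reading of the abstract's definitions: H(k) as the Shannon entropy of the multiset of length-k substrings (idealized error-free reads), and ΔH(k) as the loss relative to the maximum entropy a repeat-free sequence of the same length would have. The function names and toy sequences are mine, not the authors'.

```python
import math
from collections import Counter

def read_entropy(seq, k):
    """H(k): Shannon entropy (bits) of the length-k substrings of seq."""
    n = len(seq) - k + 1
    counts = Counter(seq[i:i + k] for i in range(n))
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def delta_h(seq, k):
    """ΔH(k): relative entropy loss; 0 when every read is unique, larger with repeats."""
    n = len(seq) - k + 1
    h_max = math.log2(n)  # entropy if all n reads were distinct
    return (h_max - read_entropy(seq, k)) / h_max

repetitive = "ACGT" * 16            # heavy repeat structure: only 4 distinct 4-mers
print(f"ΔH(4) = {delta_h(repetitive, 4):.3f}")
```

    On the repetitive toy string the loss is severe (about 0.66), far past the paper's ΔH(k) > 1% warning threshold; a sequence whose reads are all unique scores exactly 0.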

  3. Genomic stability and physiological assessments of live offspring sired by a bull clone, Starbuck II.

    Science.gov (United States)

    Ortegon, H; Betts, D H; Lin, L; Coppola, G; Perrault, S D; Blondin, P; King, W A

    2007-01-01

    It appears that overt phenotypic abnormalities observed in some domestic animal clones are not transmitted to their progeny. The current study monitored Holstein heifers sired by a bull clone, Starbuck II, from weaning to puberty. Genomic stability was assessed by telomere length status and chromosomal analysis. Growth parameters, blood profiles, physical exams and reproductive parameters were assessed for 12 months (and compared to age-matched control heifers). Progeny sired by the clone bull did not differ (P>0.05) in weight, length and height compared to controls. However, progeny had lower heart rates (HR) (P=0.009), respiratory rates (RR) (P=0.007) and body temperature (P=0.03). Hematological profiles were within normal ranges and did not differ (P>0.05) between both groups. External and internal genitalia were normal and both groups reached puberty at expected ages. Progeny had two or three ovarian follicular waves per estrous cycle and serum progesterone concentrations were similar (P=0.99) to controls. Telomere lengths of sperm and blood cells from Starbuck II were not different (P>0.05) than those of non-cloned cattle; telomere lengths of progeny were not different (P>0.05) from age-matched controls. In addition, progeny had normal karyotypes in peripheral blood leukocytes compared to controls (89.1% versus 86.3% diploid, respectively). In summary, heifers sired by a bull clone had normal chromosomal stability, growth, physical, hematological and reproductive parameters, compared to normal heifers. Furthermore, they had moderate stress responses to routine handling and restraint.

  4. Overview of the LBL/LLNL negative-ion-based neutral beam program

    International Nuclear Information System (INIS)

    Pyle, R.V.

    1980-01-01

    The LBL/LLNL negative-ion-based neutral beam development program and status are described. The emphasis has shifted in some details since the first symposium in 1977, but our overall objectives remain the same, namely, the development of megawatt d.c. injection systems. Previous emphasis was on a system in which the negative ions were produced by double charge exchange in sodium vapor. At present, the emphasis is on a self-extraction source in which the negative ions are produced on a biased surface imbedded in a plasma. A one-ampere beam will be accelerated to at least 40 keV next year. Studies of negative-ion formation and interactions help provide a data base for the technology program

  5. Research at Clark in the early '60s and at LLNL in the late '80s

    International Nuclear Information System (INIS)

    Gatrousis, C.

    1993-01-01

    Tom Sugihara's scientific leadership over a period of almost four decades covered many areas. His early research at Clark dealt with fission yield measurements and radiochemical separations of fallout species in the marine environment. Tom pioneered many of the methods for detecting soft beta emitters and low levels of radioactivity. Studies of the behavior of radioactivity in the marine ecosystem were important adjuncts to Tom's nuclear science research at Clark University, which emphasized investigations of nuclear reaction mechanisms. Among Tom's most important contributions while at Clark was his work with Matsuo and Dudey on the interpretation of isomeric yield ratios and fission studies with Noshkin and Baba. Tom's scientific career oscillated between research and administration. During the latter part of his career his great breadth of interests and his scientific "taste" had a profound influence at LLNL in areas that were new to him, materials science and solid state physics.

  6. A historical perspective on fifteen years of laser damage thresholds at LLNL

    International Nuclear Information System (INIS)

    Rainer, F.; De Marco, F.P.; Staggs, M.C.; Kozlowski, M.R.; Atherton, L.J.; Sheehan, L.M.

    1993-01-01

    We have completed a fifteen year, referenced and documented compilation of more than 15,000 measurements of laser-induced damage thresholds (LIDT) conducted at the Lawrence Livermore National Laboratory (LLNL). These measurements cover the spectrum from 248 to 1064 nm with pulse durations ranging from < 1 ns to 65 ns and at pulse-repetition frequencies (PRF) from single shots to 6.3 kHz. We emphasize the changes in LIDTs during the past two years since we last summarized our database. We relate these results to earlier data concentrating on improvements in processing methods, materials, and conditioning techniques. In particular, we highlight the current status of anti-reflective (AR) coatings, high reflectors (HR), polarizers, and frequency-conversion crystals used primarily at 355 nm and 1064 nm

  7. Production of High Harmonic X-Ray Radiation from Non-linear Thomson at LLNL PLEIADES

    CERN Document Server

    Lim, Jae; Betts, Shawn; Crane, John; Doyuran, Adnan; Frigola, Pedro; Gibson, David J; Hartemann, Fred V; Rosenzweig, James E; Travish, Gil; Tremaine, Aaron M

    2005-01-01

    We describe an experiment for production of high harmonic x-ray radiation from Thomson backscattering of an ultra-short high power density laser by a relativistic electron beam at the PLEIADES facility at LLNL. In this scenario, electrons execute a “figure-8” motion under the influence of the high-intensity laser field, where the constant characterizing the field strength is expected to exceed unity: $a_L = eE_L/(mc\omega_L) \geq 1$. With large $a_L$ this motion produces high harmonic x-ray radiation and significant broadening of the spectral peaks. This paper is intended to give a layout of the PLEIADES experiment, along with progress towards experimental goals.

  8. Author Contribution to the Pu Handbook II: Chapter 37 LLNL Integrated Sample Preparation Glovebox (TEM) Section

    International Nuclear Information System (INIS)

    Wall, Mark A.

    2016-01-01

    The development of our Integrated Actinide Sample Preparation Laboratory (IASPL) commenced in 1998 driven by the need to perform transmission electron microscopy studies on naturally aged plutonium and its alloys looking for the microstructural effects of the radiological decay process (1). Remodeling and construction of a laboratory within the Chemistry and Materials Science Directorate facilities at LLNL was required to turn a standard radiological laboratory into a Radiological Materials Area (RMA) and Radiological Buffer Area (RBA) containing type I, II and III workplaces. Two inert atmosphere dry-train glove boxes with antechambers and entry/exit fumehoods (Figure 1), having a baseline atmosphere of 1 ppm oxygen and 1 ppm water vapor, a utility fumehood and a portable, and a third double-walled enclosure have been installed and commissioned. These capabilities, along with highly trained technical staff, facilitate the safe operation of sample preparation processes and instrumentation, and sample handling while minimizing oxidation or corrosion of the plutonium. In addition, we are currently developing the capability to safely transfer small metallographically prepared samples to a mini-SEM for microstructural imaging and chemical analysis. The gloveboxes continue to be the most crucial element of the laboratory allowing nearly oxide-free sample preparation for a wide variety of LLNL-based characterization experiments, which includes transmission electron microscopy, electron energy loss spectroscopy, optical microscopy, electrical resistivity, ion implantation, X-ray diffraction and absorption, magnetometry, metrological surface measurements, high-pressure diamond anvil cell equation-of-state, phonon dispersion measurements, X-ray absorption and emission spectroscopy, and differential scanning calorimetry. The sample preparation and materials processing capabilities in the IASPL have also facilitated experimentation at world-class facilities such as the

  9. Author Contribution to the Pu Handbook II: Chapter 37 LLNL Integrated Sample Preparation Glovebox (TEM) Section

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Mark A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-25

    The development of our Integrated Actinide Sample Preparation Laboratory (IASPL) commenced in 1998 driven by the need to perform transmission electron microscopy studies on naturally aged plutonium and its alloys looking for the microstructural effects of the radiological decay process (1). Remodeling and construction of a laboratory within the Chemistry and Materials Science Directorate facilities at LLNL was required to turn a standard radiological laboratory into a Radiological Materials Area (RMA) and Radiological Buffer Area (RBA) containing type I, II and III workplaces. Two inert atmosphere dry-train glove boxes with antechambers and entry/exit fumehoods (Figure 1), having a baseline atmosphere of 1 ppm oxygen and 1 ppm water vapor, a utility fumehood and a portable, and a third double-walled enclosure have been installed and commissioned. These capabilities, along with highly trained technical staff, facilitate the safe operation of sample preparation processes and instrumentation, and sample handling while minimizing oxidation or corrosion of the plutonium. In addition, we are currently developing the capability to safely transfer small metallographically prepared samples to a mini-SEM for microstructural imaging and chemical analysis. The gloveboxes continue to be the most crucial element of the laboratory allowing nearly oxide-free sample preparation for a wide variety of LLNL-based characterization experiments, which includes transmission electron microscopy, electron energy loss spectroscopy, optical microscopy, electrical resistivity, ion implantation, X-ray diffraction and absorption, magnetometry, metrological surface measurements, high-pressure diamond anvil cell equation-of-state, phonon dispersion measurements, X-ray absorption and emission spectroscopy, and differential scanning calorimetry. The sample preparation and materials processing capabilities in the IASPL have also facilitated experimentation at world-class facilities such as the

  10. Final report for the 1996 DOE grant supporting research at the SLAC/LBNL/LLNL B factory

    International Nuclear Information System (INIS)

    Judd, D.; Wright, D.

    1997-01-01

    This final report discusses Department of Energy-supported research funded through Lawrence Livermore National Laboratory (LLNL) which was performed as part of a collaboration between LLNL and Prairie View A&M University to develop part of the BaBar detector at the SLAC B Factory. This work focuses on the Instrumented Flux Return (IFR) subsystem of BaBar and involves a full range of detector development activities: computer simulations of detector performance, creation of reconstruction algorithms, and detector hardware R and D. Lawrence Livermore National Laboratory has a leading role in the IFR subsystem and has established on-site computing and detector facilities to conduct this research. By establishing ties with the existing LLNL Research Collaboration Program and leveraging LLNL resources, the experienced Prairie View group was able to quickly achieve a more prominent role within the BaBar collaboration and make significant contributions to the detector design. In addition, this work provided the first entry point for Historically Black Colleges and Universities into the B Factory collaboration, and created an opportunity to train a new generation of minority students at the premier electron-positron high energy physics facility in the US

  11. The Contribution of Health Technology Assessment, Health Needs Assessment, and Health Impact Assessment to the Assessment and Translation of Technologies in the Field of Public Health Genomics

    NARCIS (Netherlands)

    Rosenkötter, N.; Vondeling, Hindrik; Blancquaert, I.; Mekel, O.C.L.; Kristensen, F.B.; Brand, A.

    2011-01-01

    The European Union has named genomics as one of the promising research fields for the development of new health technologies. Major concerns with regard to these fields are, on the one hand, the rather slow and limited translation of new knowledge and, on the other hand, missing insights into the

  12. The Use of Non-Variant Sites to Improve the Clinical Assessment of Whole-Genome Sequence Data.

    Directory of Open Access Journals (Sweden)

    Alberto Ferrarini

    Full Text Available Genetic testing, which is now a routine part of clinical practice and disease management protocols, is often based on the assessment of small panels of variants or genes. On the other hand, continuous improvements in the speed and per-base costs of sequencing have now made whole exome sequencing (WES) and whole genome sequencing (WGS) viable strategies for targeted or complete genetic analysis, respectively. Standard WGS/WES data analytical workflows generally rely on calling of sequence variants with respect to the reference genome sequence. However, the reference genome sequence contains a large number of sites represented by rare alleles, by known pathogenic alleles, and by alleles strongly associated with disease by GWAS. It is thus critical, for clinical applications of WGS and WES, to interpret whether non-variant sites are homozygous for the reference allele or whether the corresponding genotype cannot be reliably called. Here we show that an alternative analytical approach based on the analysis of both variant and non-variant sites from WGS data allows genotyping of more than 92% of sites corresponding to known SNPs, compared to 6% genotyped by standard variant analysis. These include homozygous reference sites of clinical interest, thus leading to a broad and comprehensive characterization of the variation necessary for an accurate evaluation of disease risk. Altogether, our findings indicate that characterization of both variant and non-variant clinically informative sites in the genome is necessary to allow an accurate clinical assessment of a personal genome. Finally, we propose a highly efficient extended VCF (eVCF) file format which allows genotype calls for sites of clinical interest to be stored while remaining compatible with current variant interpretation software.
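    The practical distinction this abstract draws — a confident homozygous-reference call versus a site that simply cannot be called — can be illustrated with a toy filter over gVCF-style per-site records. The record fields and depth/quality thresholds below are hypothetical and do not reproduce the paper's pipeline.

```python
# Each record: (position, genotype, read_depth, genotype_quality)
sites = [
    (101, "0/0", 35, 60),   # well-supported homozygous reference
    (102, "0/0",  3, 12),   # hom-ref call, but too little evidence
    (103, "0/1", 40, 80),   # heterozygous variant
    (104, "./.",  0,  0),   # no data at all
]

MIN_DP, MIN_GQ = 10, 20     # hypothetical callability thresholds

def classify(genotype, depth, gq):
    """Label a site as a variant, a confident hom-ref, or a no-call."""
    if depth < MIN_DP or gq < MIN_GQ or genotype == "./.":
        return "no-call"
    return "variant" if genotype != "0/0" else "hom-ref"

calls = {pos: classify(gt, dp, gq) for pos, gt, dp, gq in sites}
print(calls)
```

    Only position 101 counts as a reliable reference genotype; a variant-only workflow would silently lump positions 101, 102, and 104 together, which is exactly the ambiguity the authors target.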

  13. Assessing quality and completeness of human transcriptional regulatory pathways on a genome-wide scale

    Directory of Open Access Journals (Sweden)

    Aifantis Iannis

    2011-02-01

    Full Text Available Abstract Background Pathway databases are becoming increasingly important and almost omnipresent in most types of biological and translational research. However, little is known about the quality and completeness of the pathways stored in these databases. The present study conducts a comprehensive assessment of transcriptional regulatory pathways in humans for seven well-studied transcription factors: MYC, NOTCH1, BCL6, TP53, AR, STAT1, and RELA. The employed benchmarking methodology first involves integrating genome-wide binding with functional gene expression data to derive direct targets of transcription factors. Then the lists of experimentally obtained direct targets are compared with relevant lists of transcriptional targets from 10 commonly used pathway databases. Results The results of this study show that for the majority of pathway databases, the overlap between experimentally obtained target genes and targets reported in transcriptional regulatory pathway databases is surprisingly small and often is not statistically significant. The only exception is the MetaCore pathway database, which yields a statistically significant intersection with experimental results in 84% of cases. Additionally, we suggest that the lists of experimentally derived direct targets obtained in this study can be used to reveal new biological insight into transcriptional regulation and suggest novel putative therapeutic targets in cancer. Conclusions Our study opens a debate on the validity of using many popular pathway databases to obtain transcriptional regulatory targets. We conclude that the choice of pathway databases should be informed by solid scientific evidence and rigorous empirical evaluation. Reviewers This article was reviewed by Prof. Wing Hung Wong, Dr. Thiago Motta Venancio (nominated by Dr. L Aravind), and Prof. Geoff J McLachlan.
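    The significance judgment behind statements like "the overlap ... often is not statistically significant" is typically a hypergeometric (one-sided Fisher) test on the intersection of two gene lists. A stdlib sketch of that test, with invented list sizes rather than the paper's data:

```python
from math import exp, lgamma

def log_comb(n, k):
    # log of the binomial coefficient, numerically stable at genome-scale counts
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def overlap_pvalue(universe, a, b, observed):
    """P(overlap >= observed) when lists of sizes a and b are drawn at random
    from `universe` genes: the hypergeometric upper tail."""
    return sum(
        exp(log_comb(b, i) + log_comb(universe - b, a - i) - log_comb(universe, a))
        for i in range(observed, min(a, b) + 1)
    )

# Hypothetical example: 400 experimentally derived targets, 300 database
# targets, 30 genes in common, out of ~20,000 human genes (expected overlap: 6).
p = overlap_pvalue(20000, 400, 300, 30)
print(f"p = {p:.3g}")   # far below 0.05: overlap larger than chance predicts
```

    An overlap no bigger than the ~6 genes expected by chance would yield a large p-value, which is the "not statistically significant" outcome the study reports for most databases.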

  14. Criticality Safety Support to a Project Addressing SNM Legacy Items at LLNL

    International Nuclear Information System (INIS)

    Pearson, J S; Burch, J G; Dodson, K E; Huang, S T

    2005-01-01

    The programmatic, facility and criticality safety support staffs at the LLNL Plutonium Facility worked together to successfully develop and implement a project to process legacy (DNFSB Recommendation 94-1 and non-Environmental, Safety, and Health (ES&H) labeled) materials in storage. Over many years, material had accumulated in storage that lacked the information needed to adequately characterize it for the current criticality safety controls used in the facility. Generally, the fissionable material mass information was well known, but other information such as form, impurities, internal packaging, and the presence of internal moderating or reflecting materials was not well documented. In many cases, the material was excess to programmatic need, but such a determination was difficult with the little information given on MC&A labels and in the MC&A database. The material was not packaged as efficiently as possible, so it also occupied much more valuable storage space than was necessary. Although safe as stored, the inadequately characterized material posed a risk of criticality safety noncompliances if moved within the facility under current criticality safety controls. A Legacy Item Implementation Plan was developed and implemented to deal with this problem. Reasonable bounding conditions were determined for the material involved, and criticality safety evaluations were completed. Two appropriately designated glove boxes were identified and criticality safety controls were developed to safely inspect the material. Inspecting the material involved identifying containers of legacy material, followed by opening, evaluating, processing if necessary, characterizing and repackaging the material. Material from multiple containers was consolidated more efficiently, thus decreasing the total number of stored items to about one half of the highest count. Current packaging requirements were implemented. Detailed characterization of the material was captured in databases.

  15. Impact of the Revised 10 CFR 835 on the Neutron Dose Rates at LLNL

    International Nuclear Information System (INIS)

    Radev, R.

    2009-01-01

    In June 2007, 10 CFR 835 (1) was revised to include new radiation weighting factors for neutrons, updated dosimetric models, and dose terms consistent with the newer ICRP recommendations. A significant aspect of the revised 10 CFR 835 is the adoption of the recommendations outlined in ICRP-60 (2). The recommended new quantities demand a review of much of the basic data used in protection against exposure to sources of ionizing radiation. The International Commission on Radiation Units and Measurements has defined a number of quantities for use in personnel and area monitoring (3,4,5) including the ambient dose equivalent H*(d) to be used for area monitoring and instrument calibrations. These quantities are used in ICRP-60 and ICRP-74. This report deals only with the changes in the ambient dose equivalent and ambient dose rate equivalent for neutrons as a result of the implementation of the revised 10 CFR 835. In the report, the terms neutron dose and neutron dose rate will be used for convenience for ambient neutron dose and ambient neutron dose rate unless otherwise stated. This report provides a qualitative and quantitative estimate of how much the neutron dose rates at LLNL will change with the implementation of the revised 10 CFR 835. Neutron spectra and dose rates from selected locations at the LLNL were measured with a high resolution spectroscopic neutron dose rate system (ROSPEC) as well as with a standard neutron rem meter (a.k.a., a remball). The spectra obtained at these locations compare well with the spectra from the Radiation Calibration Laboratory's (RCL) bare californium source that is currently used to calibrate neutron dose rate instruments. The measurements obtained from the high resolution neutron spectrometer and dose meter ROSPEC and the NRD dose meter compare within the range of ±25%. When the new radiation weighting factors are adopted with the implementation of the revised 10 CFR 835, the measured dose rates will increase by up to 22%. 

  16. Assessment of Ploidy and Genome Constitution of Some Musa balbisiana Cultivars using DArT Markers

    Czech Academy of Sciences Publication Activity Database

    Sales, E. K.; Butardo, N. G.; Paniagua, H. G.; Jansen, H.; Doležel, Jaroslav

    2011-01-01

    Roč. 36, č. 1 (2011), s. 11-18 ISSN 0115-463X Institutional research plan: CEZ:AV0Z50380511 Keywords : DArT * genome * Musa balbisiana Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 0.075, year: 2011 http://home.ueb.cas.cz/publikace/2011_Sales_PHILIPPINE_JOURNAL_OF_CROP_SCIENCE_11.pdf

  17. LLNL-G3Dv3: Global P wave tomography model for improved regional and teleseismic travel time prediction: LLNL-G3DV3---GLOBAL P WAVE TOMOGRAPHY

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, N. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Myers, S. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Johannesson, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matzel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2012-10-06

    We develop a global-scale P wave velocity model (LLNL-G3Dv3) designed to accurately predict seismic travel times at regional and teleseismic distances simultaneously. The model provides a new image of Earth's interior, but the underlying practical purpose of the model is to provide enhanced seismic event location capabilities. The LLNL-G3Dv3 model is based on ∼2.8 million P and Pn arrivals that are re-processed using our global multiple-event locator called Bayesloc. We construct LLNL-G3Dv3 within a spherical tessellation-based framework, allowing for explicit representation of undulating and discontinuous layers including the crust and transition zone layers. Using a multiscale inversion technique, regional trends as well as fine details are captured where the data allow. LLNL-G3Dv3 exhibits large-scale structures including cratons and superplumes as well as numerous complex details in the upper mantle, including within the transition zone. In particular, the model reveals new details of a vast network of subducted slabs trapped within the transition zone beneath much of Eurasia, including beneath the Tibetan Plateau. We demonstrate the impact of Bayesloc multiple-event location on the resulting tomographic images through comparison with images produced without the benefit of multiple-event constraints (single-event locations). We find that the multiple-event locations allow for better reconciliation of the large set of direct P phases recorded at 0–97° distance and yield a smoother and more continuous image relative to the single-event locations. Travel times predicted from a 3-D model are also found to be strongly influenced by the initial locations of the input data, even when an iterative inversion/relocation technique is employed.

  18. Summary of Environmental Data Analysis and Work Performed by Lawrence Livermore National Laboratory (LLNL) in Support of the Navajo Nation Abandoned Mine Lands Project at Tse Tah, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Taffet, Michael J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Esser, Bradley K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Madrid, Victor M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-05-17

    This report summarizes work performed by Lawrence Livermore National Laboratory (LLNL) under Navajo Nation Services Contract CO9729 in support of the Navajo Abandoned Mine Lands Reclamation Program (NAMLRP). Due to restrictions on access to uranium mine waste sites at Tse Tah, Arizona that developed during the term of the contract, not all of the work scope could be performed. LLNL was able to interpret environmental monitoring data provided by NAMLRP. Summaries of these data evaluation activities are provided in this report. Additionally, during the contract period, LLNL provided technical guidance, instructional meetings, and review of relevant work performed by NAMLRP and its contractors that was not contained in the contract work scope.

  19. Estimating The Reliability of the Lawrence Livermore National Laboratory (LLNL) Flash X-ray (FXR) Machine

    International Nuclear Information System (INIS)

    Ong, M M; Kihara, R; Zentler, J M; Kreitzer, B R; DeHope, W J

    2007-01-01

    At Lawrence Livermore National Laboratory (LLNL), our flash X-ray accelerator (FXR) is used on multi-million dollar hydrodynamic experiments. Because of the importance of the radiographs, FXR must be ultra-reliable. Flash linear accelerators that can generate a 3 kA beam at 18 MeV are very complex. They have thousands, if not millions, of critical components that could prevent the machine from performing correctly. For the last five years, we have quantified and are tracking component failures. From this data, we have determined that the reliability of the high-voltage gas-switches that initiate the pulses, which drive the accelerator cells, dominates the statistics. The failure mode is a single-switch pre-fire that reduces the energy of the beam and degrades the X-ray spot-size. The unfortunate result is a lower resolution radiograph. FXR is a production machine that allows only a modest number of pulses for testing. Therefore, reliability switch testing that requires thousands of shots is performed on our test stand. Study of representative switches has produced pre-fire statistical information and probability distribution curves. This information is applied to FXR to develop test procedures and determine individual switch reliability using a minimal number of accelerator pulses
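Under a simple independence assumption, the machine-level pre-fire risk described in the abstract above is basic binomial arithmetic: if each gas switch pre-fires on a given pulse with a small probability estimated from test-stand shots, the chance that at least one of the machine's switches pre-fires grows with switch count. A hypothetical sketch; the per-switch rate and the switch count below are made-up illustrative numbers, not FXR data:

```python
def machine_prefire_probability(p_switch, n_switches):
    """Probability that at least one of n_switches independent switches
    pre-fires on a pulse, given per-switch pre-fire probability p_switch:
    1 - (1 - p)^n. Toy model assuming independent, identical switches."""
    return 1.0 - (1.0 - p_switch) ** n_switches

# e.g. a hypothetical 1-in-10,000 per-switch pre-fire rate across 48 switches
risk = machine_prefire_probability(1e-4, 48)
```

This is why even a very reliable individual switch can dominate whole-machine reliability statistics, and why per-switch rates must be pinned down on a test stand rather than with accelerator pulses.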

  20. LLNL Underground-Coal-Gasification Project. Quarterly progress report, July-September 1981

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, D.R.; Clements, W. (eds.)

    1981-11-09

    We have continued our laboratory studies of forward gasification in small blocks of coal mounted in 55-gal drums. A steam/oxygen mixture is fed into a small hole drilled longitudinally through the center of the block, the coal is ignited near the inlet and burns toward the outlet, and the product gases come off at the outlet. Various diagnostic measurements are made during the course of the burn, and afterward the coal block is split open so that the cavity can be examined. Development work continues on our mathematical model for the small coal block experiments. Preparations for the large block experiments at a coal outcrop in the Tono Basin of Washington State have required steadily increasing effort with the approach of the scheduled starting time for the experiments (Fall 1981). Also in preparation is the deep gasification experiment, Tono 1, planned for another site in the Tono Basin after the large block experiments have been completed. Wrap-up work continues on our previous gasification experiments in Wyoming. Results of the postburn core-drilling program Hoe Creek 3 are presented here. Since 1976 the Soviets have been granted four US patents on various aspects of the underground coal gasification process. These patents are described here, and techniques of special interest are noted. Finally, we include ten abstracts of pertinent LLNL reports and papers completed during the quarter.

  1. Status of experiments at LLNL on high-power X-band microwave generators

    International Nuclear Information System (INIS)

    Houck, T.L.; Westenskow, G.A.

    1994-01-01

    The Microwave Source Facility at the Lawrence Livermore National Laboratory (LLNL) is studying the application of induction accelerator technology to high-power microwave generators suitable for linear collider power sources. The authors report on the results of two experiments, both using the Choppertron's 11.4 GHz modulator and a 5-MeV, 1-kA induction beam. The first experimental configuration has a single traveling wave output structure designed to produce in excess of 300 MW in a single fundamental waveguide. This output structure consists of 12 individual cells, the first two incorporating de-Q-ing circuits to dampen higher order resonant modes. The second experiment studies the feasibility of enhancing beam to microwave power conversion by accelerating a modulated beam with induction cells. Referred to as the ''Reacceleration Experiment,'' this experiment consists of three traveling-wave output structures designed to produce about 125 MW per output and two induction cells located between the outputs. Status of current and planned experiments are presented

  2. Pleiades: A Sub-picosecond Tunable X-ray Source at the LLNL Electron Linac

    International Nuclear Information System (INIS)

    Slaughter, Dennis; Springer, Paul; Le Sage, Greg; Crane, John; Ditmire, Todd; Cowan, Tom; Anderson, Scott G.; Rosenzweig, James B.

    2002-01-01

    The use of ultrafast laser pulses to generate very high brightness, ultrashort (fs to ps) pulses of x-rays is a topic of great interest to the x-ray user community. In principle, femtosecond-scale pump-probe experiments can be used to temporally resolve structural dynamics of materials on the time scale of atomic motion. The development of sub-ps x-ray pulses will make possible a wide range of materials and plasma physics studies with unprecedented time resolution. A current project at LLNL will provide such a novel x-ray source based on Thomson scattering of high power, short laser pulses from a high peak brightness, relativistic electron bunch. The system is based on a 5 mm-mrad normalized emittance photo-injector, a 100 MeV electron RF linac, and a 300 mJ, 35 fs solid-state laser system. The Thomson x-ray source produces ultrafast pulses with x-ray energies capable of probing into high-Z metals, and a high flux per pulse enabling single-shot experiments. The system will also operate at a high repetition rate (∼10 Hz). (authors)

  3. Summary of LLNL's accomplishments for the FY93 Waste Processing Operations Program

    International Nuclear Information System (INIS)

    Grasz, E.; Domning, E.; Heggins, D.; Huber, L.; Hurd, R.; Martz, H.; Roberson, P.; Wilhelmsen, K.

    1994-04-01

    Under the US Department of Energy's (DOE's) Office of Technology Development (OTD)-Robotic Technology Development Program (RTDP), the Waste Processing Operations (WPO) Program was initiated in FY92 to address the development of automated material handling and automated chemical and physical processing systems for mixed wastes. The Program's mission was to develop a strategy for the treatment of all DOE mixed, low-level, and transuranic wastes. As part of this mission, DOE's Mixed Waste Integrated Program (MWIP) was charged with the development of innovative waste treatment technologies to surmount shortcomings of existing baseline systems. Current technology advancements and applications results from cooperation of private industry, educational institutions, and several national laboratories operated for DOE. This summary document presents the LLNL Environmental Restoration and Waste Management (ER and WM) Automation and Robotics Section's contributions in support of DOE's FY93 WPO Program. This document further describes the technological developments that were integrated in the 1993 Mixed Waste Operations (MWO) Demonstration held at SRTC in November 1993

  4. The EBIT Calorimeter Spectrometer: a new, permanent user facility at the LLNL EBIT

    International Nuclear Information System (INIS)

    Porter, F.S.; Beiersdorfer, P.; Brown, G.V.; Doriese, W.; Gygax, J.; Kelley, R.L.; Kilbourne, C.A.; King, J.; Irwin, K.; Reintsema, C.; Ullom, J.

    2007-01-01

    The EBIT Calorimeter Spectrometer (ECS) is currently being completed and will be installed at the EBIT facility at the Lawrence Livermore National Laboratory in October 2007. The ECS will replace the smaller XRS/EBIT microcalorimeter spectrometer that has been in almost continuous operation since 2000. The XRS/EBIT was based on a spare laboratory cryostat and an engineering model detector system from the Suzaku/XRS observatory program. The new ECS spectrometer was built to be a low maintenance, high performance implanted silicon microcalorimeter spectrometer with 4 eV resolution at 6 keV, 32 detector channels, 10 μs event timing, and capable of uninterrupted acquisition sessions of over 60 hours at 50 mK. The XRS/EBIT program has been very successful, producing many results on topics such as laboratory astrophysics, atomic physics, nuclear physics, and calibration of the spectrometers for the National Ignition Facility. The ECS spectrometer will continue this work into the future with improved spectral resolution, integration times, and ease-of-use. We designed the ECS instrument with TES detectors in mind by using the same highly successful magnetic shielding as our laboratory TES cryostats. This design will lead to a future TES instrument at the LLNL EBIT. Here we discuss the legacy of the XRS/EBIT program, the performance of the new ECS spectrometer, and plans for a future TES instrument.

  5. Overview and applications of the Monte Carlo radiation transport kit at LLNL

    International Nuclear Information System (INIS)

    Sale, K. E.

    1999-01-01

    Modern Monte Carlo radiation transport codes can be applied to model most applications of radiation, from optical to TeV photons, from thermal neutrons to heavy ions. Simulations can include any desired level of detail in three-dimensional geometries using the right level of detail in the reaction physics. The technology areas to which we have applied these codes include medical applications, defense, safety and security programs, nuclear safeguards, and industrial and research system design and control. The main reason such applications are interesting is that by using these tools substantial savings of time and effort (i.e. money) can be realized. In addition it is possible to separate out and computationally investigate effects which cannot be isolated and studied in experiments. In model calculations, just as in real life, one must take care in order to get the correct answer to the right question. Advancing computing technology allows extensions of Monte Carlo applications in two directions. First, as computers become more powerful, more problems can be accurately modeled. Second, as computing power becomes cheaper, Monte Carlo methods become accessible more widely. An overview of the set of Monte Carlo radiation transport tools in use at LLNL will be presented, along with a few examples of applications and future directions.
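To give a flavor of the Monte Carlo method the abstract surveys, here is a toy transport calculation: estimating uncollided photon transmission through a slab by sampling exponential free paths. This illustrates only the core sampling idea; production transport codes of the kind described handle full 3-D geometry, scattering, and detailed reaction physics:

```python
import math
import random

def transmitted_fraction(mu, thickness, n=100_000, seed=1):
    """Toy Monte Carlo: sample free paths ~ Exp(mu) for normally incident,
    non-scattering photons and count those whose first collision lies
    beyond a slab of the given thickness (in the same units as 1/mu).
    The analytic answer is exp(-mu * thickness)."""
    rng = random.Random(seed)
    passed = sum(
        1 for _ in range(n)
        # inverse-CDF sampling; 1 - U keeps the argument of log in (0, 1]
        if -math.log(1.0 - rng.random()) / mu > thickness
    )
    return passed / n

# mu = 0.5 per cm, 2 cm slab: the estimate should converge to exp(-1) ~ 0.368
est = transmitted_fraction(0.5, 2.0)
```

The statistical error shrinks as 1/sqrt(n), which is why such methods trade computer time for geometric and physical generality.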

  6. Assessing genetic diversity among Brettanomyces yeasts by DNA fingerprinting and whole-genome sequencing.

    Science.gov (United States)

    Crauwels, Sam; Zhu, Bo; Steensels, Jan; Busschaert, Pieter; De Samblanx, Gorik; Marchal, Kathleen; Willems, Kris A; Verstrepen, Kevin J; Lievens, Bart

    2014-07-01

    Brettanomyces yeasts, with the species Brettanomyces (Dekkera) bruxellensis being the most important one, are generally reported to be spoilage yeasts in the beer and wine industry due to the production of phenolic off-flavors. However, B. bruxellensis is also known to be a beneficial contributor in certain fermentation processes, such as the production of certain specialty beers. Nevertheless, despite their economic importance, Brettanomyces yeasts remain poorly understood at the genetic and genomic levels. In this study, the genetic relationship between more than 50 Brettanomyces strains from all presently known species and from several sources was studied using a combination of DNA fingerprinting techniques. This revealed an intriguing correlation between the B. bruxellensis fingerprints and the respective isolation source. To further explore this relationship, we sequenced a (beneficial) beer isolate of B. bruxellensis (VIB X9085; ST05.12/22) and compared its genome sequence with the genome sequences of two wine spoilage strains (AWRI 1499 and CBS 2499). ST05.12/22 was found to be substantially different from both wine strains, especially at the level of single nucleotide polymorphisms (SNPs). In addition, there were major differences in the genome structures between the strains investigated, including the presence of large duplications and deletions. Gene content analysis revealed 20 genes that were present in both wine strains but absent in the beer strain, including many genes involved in carbon and nitrogen metabolism; conversely, no genes missing from both AWRI 1499 and CBS 2499 were found in ST05.12/22. Together, this study provides tools to discriminate Brettanomyces strains and provides a first glimpse at the genetic diversity and genome plasticity of B. bruxellensis. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  7. Genome-environment interactions and prospective technology assessment: evolution from pharmacogenomics to nutrigenomics and ecogenomics.

    Science.gov (United States)

    Ozdemir, Vural; Motulsky, Arno G; Kolker, Eugene; Godard, Béatrice

    2009-02-01

    The relationships between food, nutrition science, and health outcomes have been mapped over the past century. Genomic variation among individuals and populations is a new factor that enriches and challenges our understanding of these complex relationships. Hence, the confluence of nutritional science and genomics, nutrigenomics, was the focus of OMICS: A Journal of Integrative Biology in December 2008 (Part 1). The 2009 Special Issue (Part 2) concludes the analysis of nutrigenomics research and innovations. Together, these two issues expand the scope and depth of critical scholarship in nutrigenomics, in keeping with an integrated multidisciplinary analysis across the bioscience, omics technology, social, ethical, intellectual property and policy dimensions. Historically, beginning in the 1950s, the field of pharmacogenetics provided the first examples of specifically identifiable gene variants predisposing to unexpected responses to drugs. Brewer coined the term ecogenetics in 1971 to broaden the concept of gene-environment interactions from drugs and nutrition to include environmental agents in general. In the mid-1990s, the introduction of high-throughput technologies led to the terms pharmacogenomics, nutrigenomics and ecogenomics to describe, respectively, the contribution of genomic variability to differential responses to drugs, food, and environment defined in the broadest sense. The distinctions, if any, between these newer fields (e.g., nutrigenomics) and their predecessors (e.g., nutrigenetics) remain to be delineated. For nutrigenomics, its reliance on genome-wide analyses may lead to detection of new biological mechanisms governing host response to food. Recognizing "genome-environment interactions" as the conceptual thread that connects and runs through pharmacogenomics, nutrigenomics, and ecogenomics may contribute toward anticipatory governance and prospective real-time analysis of these omics fields. Such real-time analysis of omics technologies and

  8. Assessing genome-wide copy number variation in the Han Chinese population.

    Science.gov (United States)

    Lu, Jianqi; Lou, Haiyi; Fu, Ruiqing; Lu, Dongsheng; Zhang, Feng; Wu, Zhendong; Zhang, Xi; Li, Changhua; Fang, Baijun; Pu, Fangfang; Wei, Jingning; Wei, Qian; Zhang, Chao; Wang, Xiaoji; Lu, Yan; Yan, Shi; Yang, Yajun; Jin, Li; Xu, Shuhua

    2017-10-01

    Copy number variation (CNV) is a valuable source of genetic diversity in the human genome and a well-recognised cause of various genetic diseases. However, CNVs have been considerably under-represented in population-based studies, particularly the Han Chinese which is the largest ethnic group in the world. To build a representative CNV map for the Han Chinese population. We conducted a genome-wide CNV study involving 451 male Han Chinese samples from 11 geographical regions encompassing 28 dialect groups, representing a less-biased panel compared with the currently available data. We detected CNVs by using 4.2M NimbleGen comparative genomic hybridisation array and whole-genome deep sequencing of 51 samples to optimise the filtering conditions in CNV discovery. A comprehensive Han Chinese CNV map was built based on a set of high-quality variants (positive predictive value >0.8, with sizes ranging from 369 bp to 4.16 Mb and a median of 5907 bp). The map consists of 4012 CNV regions (CNVRs), and more than half are novel to the 30 East Asian CNV Project and the 1000 Genomes Project Phase 3. We further identified 81 CNVRs specific to regional groups, which was indicative of the subpopulation structure within the Han Chinese population. Our data are complementary to public data sources, and the CNV map may facilitate in the identification of pathogenic CNVs and further biomedical research studies involving the Han Chinese population. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
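A standard first step in building a population CNV map like the one described above is to collapse overlapping per-sample CNV calls into CNV regions (CNVRs). A minimal interval-merge sketch; the tuple layout and the example coordinates are assumptions for illustration, not the study's actual pipeline:

```python
def merge_cnv_calls(calls):
    """Collapse per-sample CNV calls (chrom, start, end) into CNVRs by
    merging overlapping intervals on the same chromosome. Sorting first
    guarantees a single left-to-right sweep suffices."""
    regions = []
    for chrom, start, end in sorted(calls):
        if regions and regions[-1][0] == chrom and start <= regions[-1][2]:
            # Overlaps the current region: extend its right edge.
            prev = regions[-1]
            regions[-1] = (chrom, prev[1], max(prev[2], end))
        else:
            regions.append((chrom, start, end))
    return regions

# Hypothetical calls from two samples; the chr1 calls overlap and merge.
calls = [("chr1", 100, 500), ("chr1", 400, 900), ("chr2", 10, 50)]
cnvrs = merge_cnv_calls(calls)  # → [("chr1", 100, 900), ("chr2", 10, 50)]
```

Region-specific CNVRs of the kind the study reports would then be those whose carrier samples all come from one geographical group.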

  9. Recovery of community genomes to assess subsurface metabolic potential: exploiting the capacity of next generation sequencing-based metagenomics

    Science.gov (United States)

    Wrighton, K. C.; Thomas, B.; Miller, C. S.; Sharon, I.; Wilkins, M. J.; VerBerkmoes, N. C.; Handley, K. M.; Lipton, M. S.; Hettich, R. L.; Williams, K. H.; Long, P. E.; Banfield, J. F.

    2011-12-01

    With the goal of developing a deterministic understanding of the microbiological and geochemical processes controlling subsurface environments, groundwater bacterial communities were collected from the Rifle Integrated Field Research Challenge (IFRC) site. Biomass from three temporal acetate-stimulated groundwater samples was collected during a period of dominant Fe(III) reduction, in a region of the aquifer that had received acetate amendment the year prior. Phylogenetic analysis revealed a diverse bacterial community, notably devoid of Archaea, with 249 taxa from 9 bacterial phyla and dominated by the uncultured candidate divisions BD1-5, OD1, and OP11. We have reconstructed 86 partial to near-complete genomes and have performed a detailed characterization of the underlying metabolic potential of the ecosystem. We assessed the natural variation and redundancy in multi-heme c-type cytochromes, sulfite reductases, and central carbon metabolic pathways. Deep genomic sampling indicated the community contained various metabolic pathways: sulfur oxidation coupled to microaerophilic conditions, nitrate reduction with both acetate and inorganic compounds as donors, carbon and nitrogen fixation, antibiotic warfare, and heavy-metal detoxification. Proteomic investigations using predicted proteins from metagenomics corroborated that acetate oxidation is coupled to reduction of oxygen, sulfur, nitrogen, and iron across the samples. Of particular interest was the detection of acetate-oxidizing and sulfate-reducing proteins from a Desulfotalea-like bacterium in all three time points, suggesting that aqueous sulfide produced by active sulfate-reducing bacteria could contribute to abiotic iron reduction during the dominant iron reduction phase.
Additionally, proteogenomic analysis verified that a large portion of the community, including members of the uncultivated BD1-5, consists of obligate fermenters, characterized by the presence of hydrogen-evolving hydrogenases.

  10. Reference genome-independent assessment of mutation density using restriction enzyme-phased sequencing

    Directory of Open Access Journals (Sweden)

    Monson-Miller Jennifer

    2012-02-01

    Full Text Available Abstract Background The availability of low cost sequencing has spurred its application to the discovery and typing of variation, including variation induced by mutagenesis. Mutation discovery is challenging as it requires a substantial amount of sequencing and analysis to detect very rare changes and distinguish them from noise. Also challenging are the cases when the organism of interest has not been sequenced or is highly divergent from the reference. Results We describe the development of a simple method for reduced representation sequencing. Input DNA was digested with a single restriction enzyme and ligated to Y adapters modified to contain a sequence barcode and to provide a compatible overhang for ligation. We demonstrated the efficiency of this method at SNP discovery using rice and arabidopsis. To test its suitability for the discovery of very rare SNPs, one control and three mutagenized rice individuals (1, 5 and 10 mM sodium azide) were used to prepare genomic libraries for Illumina sequencers by ligating barcoded adapters to NlaIII restriction sites. For genome-dependent discovery, 15-30 million 80-base reads per individual were aligned to the reference sequence, achieving individual sequencing coverage from 7 to 15×. We identified high-confidence base changes by comparing sequences across individuals and identified instances consistent with mutations, i.e. changes that were found in a single treated individual and were solely GC to AT transitions. For genome-independent discovery, 70-mers were extracted from the sequence of the control individual and single-copy sequence was identified by comparing the 70-mers across samples to evaluate copy number and variation. This de novo "genome" was used to align the reads and identify mutations as above. Covering approximately 1/5 of the 380 Mb genome of rice, we detected mutation densities ranging from 0.6 to 4 per Mb of diploid DNA depending on the mutagenic treatment.
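The mutation-calling filter described above (keep base changes private to a single treated individual and matching the azide G:C to A:T spectrum, then normalize by the covered genome fraction) can be sketched as follows. The data structure and the mini-dataset are hypothetical, purely to make the filtering logic concrete:

```python
def mutation_density(variants, covered_bp):
    """Estimate induced mutation density in mutations per Mb of covered
    diploid DNA. `variants` maps position -> {sample: (ref, alt)}; a
    candidate counts only if it is private to one individual and is a
    G->A or C->T change (the two strands of a G:C -> A:T transition)."""
    transitions = {("G", "A"), ("C", "T")}
    n = 0
    for pos, calls in variants.items():
        if len(calls) == 1:                       # private to one individual
            (ref, alt), = calls.values()
            if (ref, alt) in transitions:         # GC -> AT spectrum only
                n += 1
    return n / (covered_bp / 1e6)

# Hypothetical mini-dataset over 2 Mb of covered sequence:
variants = {
    101: {"treated1": ("G", "A")},                        # counted
    202: {"treated2": ("C", "T")},                        # counted
    303: {"treated1": ("A", "G")},                        # wrong spectrum
    404: {"treated1": ("G", "A"), "treated2": ("G", "A")},  # shared: noise
}
density = mutation_density(variants, 2_000_000)  # → 1.0 per Mb
```

Requiring both privacy and the expected mutational spectrum is what lets very rare true mutations be separated from sequencing noise and shared natural variants.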

  11. Genomic assessment of the evolution of the prion protein gene family in vertebrates.

    Science.gov (United States)

    Harrison, Paul M; Khachane, Amit; Kumar, Manish

    2010-05-01

    Prion diseases are devastating neurological disorders caused by the propagation of particles containing an alternative beta-sheet-rich form of the prion protein (PrP). Genes paralogous to PrP, called Doppel and Shadoo, have been identified, that also have neuropathological relevance. To aid in the further functional characterization of PrP and its relatives, we annotated completely the PrP gene family (PrP-GF), in the genomes of 42 vertebrates, through combined strategic application of gene prediction programs and advanced remote homology detection techniques (such as HMMs, PSI-TBLASTN and pGenThreader). We have uncovered several previously undescribed paralogous genes and pseudogenes. We find that current high-quality genomic evidence indicates that the PrP relative Doppel, was likely present in the last common ancestor of present-day Tetrapoda, but was lost in the bird lineage, since its divergence from reptiles. Using the new gene annotations, we have defined the consensus of structural features that are characteristic of the PrP and Doppel structures, across diverse Tetrapoda clades. Furthermore, we describe in detail a transcribed pseudogene derived from Shadoo that is conserved across primates, and that overlaps the meiosis gene, SYCE1, thus possibly regulating its expression. In addition, we analysed the locus of PRNP/PRND for significant conservation across the genomic DNA of eleven mammals, and determined the phylogenetic penetration of non-coding exons. The genomic evidence indicates that the second PRNP non-coding exon found in even-toed ungulates and rodents, is conserved in all high-coverage genome assemblies of primates (human, chimp, orang utan and macaque), and is, at least, likely to have fallen out of use during primate speciation. Furthermore, we have demonstrated that the PRNT gene (at the PRNP human locus) is conserved across at least sixteen mammals, and evolves like a long non-coding RNA, fashioned from fragments of ancient, long

  12. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    Science.gov (United States)

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA), i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence: using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.

  15. Multimedia Presentations on the Human Genome: Implementation and Assessment of a Teaching Program for the Introduction to Genome Science Using a Poster and Animations

    Science.gov (United States)

    Kano, Kei; Yahata, Saiko; Muroi, Kaori; Kawakami, Masahiro; Tomoda, Mari; Miyaki, Koichi; Nakayama, Takeo; Kosugi, Shinji; Kato, Kazuto

    2008-01-01

    Genome science, including topics such as gene recombination, cloning, genetic tests, and gene therapy, is now an established part of our daily lives; thus we need to learn genome science to better equip ourselves for the present day. Learning from topics directly related to the human has been suggested to be more effective than learning from…

  16. The LLNL [Lawrence Livermore National Laboratory] ICF [Inertial Confinement Fusion] Program: Progress toward ignition in the Laboratory

    International Nuclear Information System (INIS)

    Storm, E.; Batha, S.H.; Bernat, T.P.; Bibeau, C.; Cable, M.D.; Caird, J.A.; Campbell, E.M.; Campbell, J.H.; Coleman, L.W.; Cook, R.C.; Correll, D.L.; Darrow, C.B.; Davis, J.I.; Drake, R.P.; Ehrlich, R.B.; Ellis, R.J.; Glendinning, S.G.; Haan, S.W.; Haendler, B.L.; Hatcher, C.W.; Hatchett, S.P.; Hermes, G.L.; Hunt, J.P.; Kania, D.R.; Kauffman, R.L.; Kilkenny, J.D.; Kornblum, H.N.; Kruer, W.L.; Kyrazis, D.T.; Lane, S.M.; Laumann, C.W.; Lerche, R.A.; Letts, S.A.; Lindl, J.D.; Lowdermilk, W.H.; Mauger, G.J.; Montgomery, D.S.; Munro, D.H.; Murray, J.R.; Phillion, D.W.; Powell, H.T.; Remington, B.R.; Ress, D.B.; Speck, D.R.; Suter, L.J.; Tietbohl, G.L.; Thiessen, A.R.; Trebes, J.E.; Trenholme, J.B.; Turner, R.E.; Upadhye, R.S.; Wallace, R.J.; Wiedwald, J.D.; Woodworth, J.G.; Young, P.M.; Ze, F.

    1990-01-01

    The Inertial Confinement Fusion (ICF) Program at the Lawrence Livermore National Laboratory (LLNL) has made substantial progress in target physics, target diagnostics, and laser science and technology. In each area, progress required the development of experimental techniques and computational modeling. The objectives of the target physics experiments in the Nova laser facility are to address and understand critical physics issues that determine the conditions required to achieve ignition and gain in an ICF capsule. The LLNL experimental program primarily addresses indirect-drive implosions, in which the capsule is driven by x rays produced by the interaction of the laser light with a high-Z plasma. Experiments address both the physics of generating the radiation environment in a laser-driven hohlraum and the physics associated with imploding ICF capsules to ignition and high-gain conditions in the absence of alpha deposition. Recent experiments and modeling have established much of the physics necessary to validate the basic concept of ignition and ICF target gain in the laboratory. The rapid progress made in the past several years, and in particular, recent results showing higher radiation drive temperatures and implosion velocities than previously obtained and assumed for high-gain target designs, has led LLNL to propose an upgrade of the Nova laser to 1.5 to 2 MJ (at 0.35 μm) to demonstrate ignition and energy gains of 10 to 20: the Nova Upgrade.

  17. High-confidence assessment of functional impact of human mitochondrial non-synonymous genome variations by APOGEE.

    Directory of Open Access Journals (Sweden)

    Stefano Castellana

    2017-06-01

    There are 24,189 possible non-synonymous amino acid changes potentially affecting the human mitochondrial DNA. Only a tiny subset has been functionally evaluated with certainty so far, while the pathogenicity of the vast majority was assessed only in silico by software predictors. Since these tools proved to be rather incongruent, we have designed and implemented APOGEE, a machine-learning algorithm that outperforms all existing prediction methods in estimating the harmfulness of mitochondrial non-synonymous genome variations. We provide a detailed description of the underlying algorithm, of the selected and manually curated training and test sets of variants, and of its classification ability.
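    The abstract above describes a meta-predictor trained over the outputs of incongruent in-silico tools. As a hedged sketch of that general idea (not APOGEE's actual features, curated training set, or learner; all scores and labels below are simulated), one can combine several upstream predictor scores into a single pathogenicity classifier:

    ```python
    import numpy as np

    # Simulated data: 1 = pathogenic (as if from a curated variant set), and
    # three hypothetical upstream predictor scores with differing noise levels,
    # mimicking predictors that often disagree with one another.
    rng = np.random.default_rng(1)
    n = 300
    labels = rng.integers(0, 2, n).astype(float)
    scores = np.column_stack([labels + rng.normal(0, s, n) for s in (0.8, 1.0, 1.3)])

    def fit_logistic(X, y, lr=0.5, steps=3000):
        """Plain gradient-descent logistic regression (weights incl. intercept)."""
        Xb = np.column_stack([np.ones(len(X)), X])
        w = np.zeros(Xb.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-Xb @ w))
            w -= lr * Xb.T @ (p - y) / len(y)
        return w

    def predict(w, X):
        Xb = np.column_stack([np.ones(len(X)), X])
        return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(float)

    w = fit_logistic(scores[:200], labels[:200])             # train on 200 variants
    acc = np.mean(predict(w, scores[200:]) == labels[200:])  # held-out accuracy
    print(round(acc, 2))
    ```

    The point of the sketch is only that a trained combiner can beat any single disagreeing predictor; the real tool's training and test sets are manually curated, as the abstract notes.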

  18. Quality assessment of buccal versus blood genomic DNA using the Affymetrix 500 K GeneChip

    Directory of Open Access Journals (Sweden)

    Martin Lisa J

    2007-11-01

    Background: With the advent of genome-wide genotyping, the utility of stored buccal brushes for DNA extraction and genotyping has been questioned. We sought to describe the genomic DNA yield and concordance between stored buccal brushes and blood samples from the same individuals in the context of Affymetrix 500 K Human GeneChip genotyping. Results: Buccal cytobrushes stored for ~7 years at -80°C prior to extraction yielded sufficient double-stranded DNA (dsDNA) to be successfully genotyped on the Affymetrix ~262 K NspI chip, with yields between 536 and 1047 ng dsDNA. Using the BRLMM algorithm, genotyping call rates averaged 98.4% for blood samples and 97.8% for buccal samples. Matched blood samples exhibited 99.2% concordance, while matched blood and buccal samples exhibited 98.8% concordance. Conclusion: Buccal cytobrushes stored long-term yield sufficient dsDNA concentrations to achieve high genotyping call rates and concordance with stored blood samples in the context of Affymetrix 500 K SNP genotyping. Thus, given high-quality collection and storage protocols, it is possible to use stored buccal cytobrush samples for genome-wide association studies.
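    The call-rate and concordance figures quoted above reduce to simple set arithmetic over per-SNP genotype calls. A minimal illustrative sketch (toy genotype codings, not the BRLMM caller): calls are coded 0/1/2 for the two homozygotes and the heterozygote, with -1 for "no call".

    ```python
    import numpy as np

    def call_rate(calls):
        """Fraction of SNPs with a successful genotype call (not -1)."""
        calls = np.asarray(calls)
        return np.mean(calls != -1)

    def concordance(calls_a, calls_b):
        """Fraction of SNPs called on BOTH platforms whose calls agree."""
        a, b = np.asarray(calls_a), np.asarray(calls_b)
        both = (a != -1) & (b != -1)
        return np.mean(a[both] == b[both])

    # Hypothetical paired calls for one individual across 8 SNPs
    blood  = np.array([0, 1, 2, 2, -1, 1, 0, 2])
    buccal = np.array([0, 1, 2, -1, 1, 1, 0, 1])

    print(call_rate(blood))            # 7 of 8 SNPs called -> 0.875
    print(concordance(blood, buccal))  # 5 of the 6 jointly called SNPs agree -> ~0.833
    ```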

  19. Carbohydrate-active enzymes from pigmented Bacilli: a genomic approach to assess carbohydrate utilization and degradation

    Directory of Open Access Journals (Sweden)

    Henrissat Bernard

    2011-09-01

    Background: Spore-forming Bacilli are Gram-positive bacteria commonly found in a variety of natural habitats, including soil, water and the gastro-intestinal (GI) tract of animals. Isolates of various Bacillus species produce pigments, mostly carotenoids, with a putative protective role against UV irradiation and oxygen-reactive forms. Results: We report the annotation of carbohydrate-active enzymes (CAZymes) of two pigmented Bacilli isolated from the human GI tract and belonging to the Bacillus indicus and B. firmus species. A high number of glycoside hydrolases (GHs) and carbohydrate-binding modules (CBMs) were found in both isolates. A detailed analysis of CAZyme families was performed and supported by growth data. Carbohydrates able to support growth as the sole carbon source negatively affected carotenoid formation in rich medium, suggesting that a catabolite repression-like mechanism controls carotenoid biosynthesis in both Bacilli. Experimental results on biofilm formation confirmed genomic data on the potential of B. indicus HU36 to produce a levan-based biofilm, while mucin-binding and -degradation experiments supported genomic data suggesting the ability of both Bacilli to degrade mammalian glycans. Conclusions: CAZy analyses of the genomes of the two pigmented Bacilli, compared to other Bacillus species and validated by experimental data on carbohydrate utilization, biofilm formation and mucin degradation, suggest that the two pigmented Bacilli are adapted to the intestinal environment and are suited to grow in and colonize the human gut.

  20. The value of assessments in Lawrence Livermore National Laboratory's Waste Certification Programs

    International Nuclear Information System (INIS)

    Ryan, E.M.

    1995-05-01

    This paper will discuss the value of assessments in Lawrence Livermore National Laboratory's Waste Certification Programs by: introducing the organization and purpose of the LLNL Waste Certification Programs for transuranic, low-level, and hazardous waste; examining the differences in internal assessment/audit requirements for these programs; discussing the values and costs of assessments in a waste certification program; presenting practical recommendations to maximize the value of your assessment programs; and presenting improvements in LLNL's waste certification processes that resulted from assessments.

  1. Strengthening LLNL Missions through Laboratory Directed Research and Development in High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Willis, D. K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-12-01

    High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.

  2. LLNL MOX fuel lead assemblies data report for the surplus plutonium disposition environmental impact statement

    International Nuclear Information System (INIS)

    O'Connor, D.G.; Fisher, S.E.; Holdaway, R.

    1998-08-01

    The purpose of this document is to support the US Department of Energy (DOE) Fissile Materials Disposition Program's preparation of the draft surplus plutonium disposition environmental impact statement. This is one of several responses to data call requests for background information on activities associated with the operation of the lead assembly (LA) mixed-oxide (MOX) fuel fabrication facility. The DOE Office of Fissile Materials Disposition (DOE-MD) has developed a dual-path strategy for disposition of surplus weapons-grade plutonium. One of the paths is to disposition surplus plutonium through irradiation of MOX fuel in commercial nuclear reactors. MOX fuel consists of plutonium and uranium oxides (PuO 2 and UO 2 ), typically containing 95% or more UO 2 . DOE-MD requested that the DOE Site Operations Offices nominate DOE sites that meet established minimum requirements that could produce MOX LAs. LLNL has proposed an LA MOX fuel fabrication approach that would be done entirely inside an S and S Category 1 area. This includes receipt and storage of PuO 2 powder, fabrication of MOX fuel pellets, assembly of fuel rods and bundles, and shipping of the packaged fuel to a commercial reactor site. Support activities will take place within a Category 1 area. Building 332 will be used to receive and store the bulk PuO 2 powder, fabricate MOX fuel pellets, and assemble fuel rods. Building 334 will be used to assemble, store, and ship fuel bundles. Only minor modifications would be required of Building 332: uncontaminated glove boxes would need to be removed, partition walls would need to be removed, and minor modifications to the ventilation system would be required.

  4. Breeding Jatropha curcas by genomic selection: A pilot assessment of the accuracy of predictive models.

    Science.gov (United States)

    Azevedo Peixoto, Leonardo de; Laviola, Bruno Galvêas; Alves, Alexandre Alonso; Rosado, Tatiana Barbosa; Bhering, Leonardo Lopes

    2017-01-01

    Genome-wide selection (GWS) is a promising approach for improving the selection accuracy in plant breeding, particularly in species with long life cycles, such as Jatropha. Therefore, the objectives of this study were to estimate the genetic parameters for grain yield (GY) and the weight of 100 seeds (W100S) using restricted maximum likelihood (REML); to compare the performance of GWS methods to predict GY and W100S; and to estimate how many markers are needed to train the GWS model to obtain the maximum accuracy. Eight GWS models were compared in terms of predictive ability. The impact that the marker density had on the predictive ability was investigated using a varying number of markers, from 2 to 1,248. Because the genetic variance between evaluated genotypes was significant, it was possible to obtain selection gain. All of the GWS methods tested in this study can be used to predict GY and W100S in Jatropha. A training model fitted using 1,000 and 800 markers is sufficient to capture the maximum genetic variance and, consequently, maximum prediction ability of GY and W100S, respectively. This study demonstrated the applicability of genome-wide prediction to identify useful genetic sources of GY and W100S for Jatropha breeding. Further research is needed to confirm the applicability of the proposed approach to other complex traits.
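    As a hedged illustration of the genomic-prediction idea in this abstract (not one of the eight GWS models actually compared; marker dosages, effect sizes, and heritability below are simulated), ridge regression over marker genotypes can be fit in closed form, with "predictive ability" measured as the correlation between predicted and observed phenotypes in a hold-out set:

    ```python
    import numpy as np

    # Simulate 200 genotypes x 1000 markers (0/1/2 allele dosages) and a
    # phenotype driven by a sparse subset of markers plus environmental noise.
    rng = np.random.default_rng(0)
    n_geno, n_markers = 200, 1000
    X = rng.integers(0, 3, size=(n_geno, n_markers)).astype(float)
    true_beta = rng.normal(0, 1, n_markers) * (rng.random(n_markers) < 0.05)
    y = X @ true_beta + rng.normal(0, 2.0, n_geno)

    def ridge_predict(X_tr, y_tr, X_te, lam=10.0):
        """Closed-form ridge: beta = (X'X + lam*I)^-1 X'y, on centered markers."""
        mu = X_tr.mean(axis=0)
        Xc, Xt = X_tr - mu, X_te - mu
        beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(Xc.shape[1]),
                               Xc.T @ (y_tr - y_tr.mean()))
        return y_tr.mean() + Xt @ beta

    train, test = np.arange(150), np.arange(150, 200)
    pred = ridge_predict(X[train], y[train], X[test])
    ability = np.corrcoef(pred, y[test])[0, 1]  # predictive ability in hold-out set
    print(round(ability, 2))
    ```

    Varying `n_markers` in a loop reproduces the kind of marker-density curve the study examined: predictive ability climbs with marker count until the causal variance is captured, then plateaus.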

  6. Functional genomics to assess biological responses to marine pollution at physiological and evolutionary timescales: toward a vision of predictive ecotoxicology.

    Science.gov (United States)

    Reid, Noah M; Whitehead, Andrew

    2016-09-01

    Marine pollution is ubiquitous, and is one of the key factors influencing contemporary marine biodiversity worldwide. To protect marine biodiversity, how do we surveil, document and predict the short- and long-term impacts of pollutants on at-risk species? Modern genomics tools offer high-throughput, information-rich and increasingly cost-effective approaches for characterizing biological responses to environmental stress, and are important tools within an increasingly sophisticated kit for surveilling and assessing impacts of pollutants on marine species. Through the lens of recent research in marine killifish, we illustrate how genomics tools may be useful for screening chemicals and pollutants for biological activity and to reveal specific mechanisms of action. The high dimensionality of transcriptomic responses enables their usage as highly specific fingerprints of exposure, and these fingerprints can be used to diagnose environmental problems. We also emphasize that molecular pathways recruited to respond at physiological timescales are the same pathways that may be targets for natural selection during chronic exposure to pollutants. Gene complement and sequence variation in those pathways can be related to variation in sensitivity to environmental pollutants within and among species. Furthermore, allelic variation associated with evolved tolerance in those pathways could be tracked to estimate the pace of environmental health decline and recovery. We finish by integrating these paradigms into a vision of how genomics approaches could anchor a modernized framework for advancing the predictive capacity of environmental and ecotoxicological science. © The Author 2015. Published by Oxford University Press. All rights reserved.

  7. Interim report on updated microarray probes for the LLNL Burkholderia pseudomallei SNP array

    Energy Technology Data Exchange (ETDEWEB)

    Gardner, S; Jaing, C

    2012-03-27

    The overall goal of this project is to forensically characterize 100 unknown Burkholderia isolates in the US-Australia collaboration. We will identify genome-wide single nucleotide polymorphisms (SNPs) from B. pseudomallei and near neighbor species including B. mallei, B. thailandensis and B. oklahomensis. We will design microarray probes to detect these SNP markers and analyze 100 Burkholderia genomic DNAs extracted from environmental, clinical and near neighbor isolates from Australian collaborators on the Burkholderia SNP microarray. We will analyze the microarray genotyping results to characterize the genetic diversity of these new isolates and triage the samples for whole genome sequencing. In this interim report, we described the SNP analysis and the microarray probe design for the Burkholderia SNP microarray.

  8. Systematic Identification and Assessment of Therapeutic Targets for Breast Cancer Based on Genome-Wide RNA Interference Transcriptomes

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-02-01

    With accumulating public omics data, great efforts have been made to characterize the genetic heterogeneity of breast cancer. However, identifying novel targets and selecting the best from the sizeable lists of candidate targets is still a key challenge for targeted therapy, largely owing to the lack of economical, efficient and systematic discovery and assessment to prioritize potential therapeutic targets. Here, we describe an approach that combines computational evaluation and objective, multifaceted assessment to systematically identify and prioritize targets for biological validation and therapeutic exploration. We first establish the reference gene expression profiles from the breast cancer cell line MCF7 upon genome-wide RNA interference (RNAi) of a total of 3689 genes, and the breast cancer query signatures using RNA-seq data generated from tissue samples of clinical breast cancer patients in The Cancer Genome Atlas (TCGA). Based on gene set enrichment analysis, we identified a set of 510 genes that, when knocked down, could significantly reverse the transcriptome of the breast cancer state. We then perform multifaceted assessment of the gene set to prioritize potential targets for gene therapy. We also propose drug repurposing opportunities and identify potentially druggable proteins that have been poorly explored with regard to the discovery of small-molecule modulators. Finally, we obtained a small list of candidate therapeutic targets for four major breast cancer subtypes, i.e., luminal A, luminal B, HER2+ and triple-negative breast cancer. This RNAi transcriptome-based approach can be a helpful paradigm for related research to identify and prioritize candidate targets for experimental validation.
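    The core "reversal" idea in this abstract can be sketched in miniature (hypothetical log-fold-change vectors, not the study's GSEA pipeline): a knockdown is a candidate target when its expression profile runs opposite to the disease signature, i.e. the rank correlation between the two vectors is strongly negative.

    ```python
    import numpy as np

    def rank(v):
        """Ranks 0..n-1 (no tie handling; adequate for illustration)."""
        return np.argsort(np.argsort(v)).astype(float)

    def reversal_score(knockdown_lfc, disease_lfc):
        """Spearman-style rank correlation; near -1 means the knockdown
        reverses the disease signature over the shared genes."""
        a, b = rank(knockdown_lfc), rank(disease_lfc)
        return np.corrcoef(a, b)[0, 1]

    disease = np.array([2.0, 1.5, 0.5, -0.7, -1.8])   # tumour vs normal, 5 genes
    kd_good = np.array([-1.9, -1.2, -0.1, 0.6, 1.4])  # knockdown that reverses it
    kd_null = np.array([0.1, -0.2, 0.15, -0.05, 0.0]) # unrelated knockdown

    print(reversal_score(kd_good, disease))  # -1.0 (perfect rank reversal)
    print(reversal_score(kd_null, disease))
    ```

    A genome-wide screen would compute such a score for each of the knockdown profiles against each patient-derived query signature, then carry the strongest reversers forward to the multifaceted assessment the authors describe.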

  9. Bifidobacterium Bacteremia: Clinical Characteristics and a Genomic Approach To Assess Pathogenicity

    Science.gov (United States)

    Hjerde, Erik; Cavanagh, Jorunn Pauline; Simonsen, Gunnar Skov; Klingenberg, Claus

    2017-01-01

    Bifidobacteria are commensals that colonize the orogastrointestinal tract and rarely cause invasive human infections. However, an increasing number of bifidobacterial blood culture isolates has lately been observed in Norway. In order to investigate the pathogenicity of the Bifidobacterium species responsible for bacteremia, we studied Bifidobacterium isolates from 15 patients for whom cultures of blood obtained from 2013 to 2015 were positive. We collected clinical data and analyzed phenotypic and genotypic antibiotic susceptibility. All isolates (11 Bifidobacterium longum, 2 B. breve, and 2 B. animalis isolates) were subjected to whole-genome sequencing. The 15 patients were predominantly in the extreme lower or upper age spectrum, many were severely immunocompromised, and 11 of 15 had gastrointestinal tract-related conditions. In two elderly patients, the Bifidobacterium bacteremia caused a sepsis-like picture, interpreted as the cause of death. Most bifidobacterial isolates had low MICs (≤0.5 mg/liter) to beta-lactam antibiotics, vancomycin, and clindamycin and relatively high MICs to ciprofloxacin and metronidazole. We performed a pangenomic comparison of invasive and noninvasive B. longum isolates based on 65 sequences available from GenBank and the sequences of 11 blood culture isolates from this study. Functional annotation identified unique genes among both invasive and noninvasive isolates of Bifidobacterium. Phylogenetic clusters of invasive isolates were identified for a subset of the B. longum subsp. longum isolates. However, there was no difference in the number of putative virulence genes between invasive and noninvasive isolates. In conclusion, Bifidobacterium has an invasive potential in the immunocompromised host and may cause a sepsis-like picture. Using comparative genomics, we could not delineate specific pathogenicity traits characterizing invasive isolates. PMID:28490487

  10. Design and construction of a 208-L drum containing representative LLNL transuranic and low-level wastes

    International Nuclear Information System (INIS)

    Camp, D.C.; Pickering, J.; Martz, H.E.

    1994-01-01

    At the Lawrence Livermore National Laboratory (LLNL), we are developing the nondestructive analysis (NDA) technique of active (A) computed tomography (CT) to measure waste matrix attenuation as a function of gamma-ray energy (ACT), and passive (P) CT to locate and identify all gamma-ray emitting isotopes within a waste container. Coupling the ACT and PCT results will quantify each identified isotope, thereby categorizing the amount of radioactivity within waste drums having volumes up to 416 liters (L), i.e., 110-gallon drums.

  11. Evaluation of dynamic range for LLNL streak cameras using high contrast pulsed and pulse podiatry on the Nova laser system

    International Nuclear Information System (INIS)

    Richards, J.B.; Weiland, T.L.; Prior, J.A.

    1990-01-01

    This paper reports on a standard LLNL streak camera that has been used to analyze high contrast pulses on the Nova laser facility. These pulses have a plateau at their leading edge (foot) with an amplitude which is approximately 1% of the maximum pulse height. Relying on other features of the pulses and on signal multiplexing, we were able to determine how accurately the foot amplitude was being represented by the camera. Results indicate that the useful single channel dynamic range of the instrument approaches 100:1

  12. Report for Detection of Biothreat Agents and Environmental Samples using the LLNL Virulence Array for DHS

    Energy Technology Data Exchange (ETDEWEB)

    Jaing, Crystal [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gardner, Shea [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McLoughlin, Kevin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Thissen, James [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jackson, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-04-18

    The objective of this project is to provide DHS a comprehensive evaluation of the current genomic technologies including genotyping, Taqman PCR, multiple locus variable tandem repeat analysis (MLVA), microarray and high-throughput DNA sequencing in the analysis of biothreat agents from complex environmental samples. This report focuses on the design, testing and results of samples on the Virulence Array.

  13. Thermal safety characterization on PETN, PBX-9407, LX-10-2, LX-17-1 and detonator in LLNL's P-ODTX system

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, P. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Strout, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, J. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kahl, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ellsworth, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Healy, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-21

    Incidents caused by fire and other thermal events can heat energetic materials, which may lead to thermal explosion and result in structural damage and casualties. Thus, it is important to understand the response of energetic materials to thermal insults. The One-Dimensional-Time to Explosion (ODTX) system at the Lawrence Livermore National Laboratory (LLNL) has been used for decades to characterize the thermal safety of energetic materials. In this study, a pressure monitoring element has been integrated into the ODTX system (P-ODTX) to perform thermal explosion (cook-off) experiments (thermal runaway) on PETN powder, PBX-9407, LX-10-2, LX-17-1, and detonator samples (cup tests). The P-ODTX testing generates useful data (thermal explosion temperature, thermal explosion time, and gas pressures) to assist with the thermal safety assessment of relevant energetic materials and components. This report summarizes the results of P-ODTX experiments performed from May 2015 to July 2017. Recent upgrades to the data acquisition system allow for rapid pressure monitoring at microsecond intervals during thermal explosion. These pressure data are also included in the report.

  14. Assessment of genome origins and genetic diversity in the genus Eleusine with DNA markers.

    Science.gov (United States)

    Salimath, S S; de Oliveira, A C; Godwin, I D; Bennetzen, J L

    1995-08-01

    Finger millet (Eleusine coracana), an allotetraploid cereal, is widely cultivated in the arid and semiarid regions of the world. Three DNA marker techniques, restriction fragment length polymorphism (RFLP), randomly amplified polymorphic DNA (RAPD), and inter-simple sequence repeat (ISSR) amplification, were employed to analyze 22 accessions belonging to 5 species of Eleusine. An 8-probe, 3-enzyme RFLP combination, 18 RAPD primers, and 6 ISSR primers revealed 14, 10, and 26% polymorphism, respectively, in 17 accessions of E. coracana from Africa and Asia. These results indicated a very low level of DNA sequence variability in the finger millets but did allow each line to be distinguished. The different Eleusine species could be easily identified by DNA marker technology, and the 16% intraspecific polymorphism exhibited by the two analyzed accessions of E. floccifolia suggested a much higher level of diversity in this species than in E. coracana. Between species, E. coracana and E. indica shared the most markers, while E. indica and E. tristachya shared a considerable number of markers, indicating that these three species form a close genetic assemblage within Eleusine. Eleusine floccifolia and E. compressa were found to be the most divergent among the species examined. Comparison of RFLP, RAPD, and ISSR technologies, in terms of the quantity and quality of data output, indicated that ISSRs are particularly promising for the analysis of plant genome diversity.
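The polymorphism percentages quoted above come from scoring marker bands as present/absent across accessions: a locus counts as polymorphic if both states occur. A minimal sketch of that calculation on a made-up band matrix (the study's actual scoring data are not reproduced here):

```python
def percent_polymorphic(matrix):
    """matrix: list of accessions, each a list of 0/1 band scores.
    A locus is polymorphic if both states occur among accessions."""
    n_loci = len(matrix[0])
    polymorphic = 0
    for locus in range(n_loci):
        states = {row[locus] for row in matrix}
        if len(states) > 1:
            polymorphic += 1
    return 100.0 * polymorphic / n_loci

# Hypothetical band scores for three accessions at five loci.
accessions = [
    [1, 1, 0, 1, 1],
    [1, 1, 1, 1, 1],
    [1, 1, 0, 1, 0],
]
print(round(percent_polymorphic(accessions), 1))  # 40.0
```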

  15. Emergency Response Capability Baseline Needs Assessment - Compliance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, John A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-09-01

    This document was prepared by John A. Sharry, LLNL Fire Marshal and Division Leader for Fire Protection, and was reviewed by LLNL Emergency Management Department Head James Colson. This document is the second part of a two-part analysis of the Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2016 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2016 BNA, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures. The 2013 BNA was approved by NNSA's Livermore Field Office on January 22, 2014.

  16. [Utilizing the ultraintense JanUSP laser at LLNL]. 99-ERD-049 Final LDRD Report

    International Nuclear Information System (INIS)

    Patel, P K; Price, D F; Mackinnon, A J; Springer, P T

    2002-01-01

    Recent advances in laser and optical technologies have enabled the current generation of high-intensity, ultrashort-pulse lasers to achieve focal intensities of 10^20-10^21 W/cm^2 in pulse durations of 100-500 fs. These ultraintense laser pulses are capable of producing highly relativistic plasma states with densities, temperatures, and pressures rivaling those found in the interiors of stars and nuclear weapons. Utilizing the ultraintense 100 TW JanUSP laser at LLNL, we have explored the possibility of ion shock heating small micron-sized plasmas to extremely high energy densities, approaching 1 GJ/g on timescales of a few hundred femtoseconds. The JanUSP laser delivers 10 Joules of energy in a 100 fs pulse in a near diffraction-limited beam, producing intensities on target of up to 10^21 W/cm^2. The electric field of the laser at this intensity ionizes and accelerates electrons to relativistic MeV energies. The sudden ejection of electrons from the focal region produces tremendous electrostatic forces, which in turn accelerate heavier ions to MeV energies. The predicted ion flux of 1 MJ/cm^2 is sufficient to achieve thermal equilibrium conditions at high temperature in solid density targets. Our initial experiments were carried out at the available laser contrast of 10^-7 (i.e., the contrast of the amplified spontaneous emission (ASE) and of the pre-pulses produced in the regenerative amplifier). We used the nuclear photoactivation of Au-197 samples to measure the gamma production above 12 MeV, corresponding to the threshold for the Au-197(γ,n) reaction. Since the predominant mechanism for gamma production is the bremsstrahlung emission of energetic electrons as they pass through the solid target, we were able to infer a conversion yield of several percent of the incident laser energy into electrons with energies >12 MeV. This result is consistent with the interaction of the main pulse with a large pre-formed plasma. The contrast of the laser was improved to

  17. Dielectronic Satellite Spectra of Na-like Mo Ions Benchmarked by LLNL EBIT with Application to HED Plasmas

    Science.gov (United States)

    Stafford, A.; Safronova, A. S.; Kantsyrev, V. L.; Safronova, U. I.; Petkov, E. E.; Shlyaptseva, V. V.; Childers, R.; Shrestha, I.; Beiersdorfer, P.; Hell, H.; Brown, G. V.

    2017-10-01

    Dielectronic recombination (DR) is an important process for astrophysical and laboratory high energy density (HED) plasmas, and the associated satellite lines are frequently used for plasma diagnostics. In particular, K-shell DR satellite lines have been studied in detail in low-Z plasmas. The L-shell Na-like spectral features from Mo X-pinches considered here represent a blend of DR and inner-shell satellites and motivated a detailed study of DR at the EBIT-1 electron beam ion trap at LLNL. In these experiments the beam energy was swept between 0.6 and 2.4 keV to produce resonances at certain electron beam energies. The advantages of using an electron beam ion trap to better understand atomic processes with highly ionized ions in HED Mo plasma are highlighted. This work was supported by NNSA under DOE Grant DE-NA0002954. Work at LLNL was performed under the auspices of the U.S. DOE under Contract No. DE-AC52-07NA27344.

  18. Estimate of aircraft crash hit frequencies on to facilities at the Lawrence Livermore National Laboratory (LLNL) Site 200

    International Nuclear Information System (INIS)

    Kimura, C.Y.

    1997-01-01

    Department of Energy (DOE) nuclear facilities are required by DOE Order 5480.23, Section 8.b.(3)(k), to consider external events as initiating events for accidents within the scope of their Safety Analysis Reports (SAR). One of the external initiating events that should be considered within the scope of a SAR is an aircraft accident, i.e., an aircraft crashing into the nuclear facility with the related impact and fire leading to penetration of the facility and to the release of radioactive and/or hazardous materials. This report presents the results of an aircraft crash frequency analysis performed for the Materials Management Area (MMA) and the National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory (LLNL) Site 200. The analysis estimates only the aircraft crash hit frequency on the analyzed facilities. No initial aircraft crash hit frequency screening, structural response calculations of the facilities to the aircraft impact, or consequence analysis of radioactive/hazardous materials released following the aircraft impact are performed. The method used to estimate the aircraft crash hit frequencies on facilities at LLNL generally follows the procedure given by DOE Standard 3014-96 on Aircraft Crash Analysis. However, certain adjustments were made to the DOE Standard procedure because of the site-specific flight environment or because of facility-specific characteristics.
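DOE-STD-3014 estimates crash hit frequency with a four-factor formula, F = sum over flight sources of N·P·f(x, y)·A, where N is the number of operations per year, P the crash rate per operation, f the crash location probability per unit area at the facility, and A the effective target area. A sketch with illustrative placeholder values, not LLNL site data:

```python
def crash_hit_frequency(sources):
    """Each source: (N ops/yr, P crashes/op, f crash density per mi^2,
    A effective target area in mi^2). Returns expected hits per year."""
    return sum(N * P * f * A for (N, P, f, A) in sources)

sources = [
    (50_000, 4e-7, 1e-2, 5e-4),  # hypothetical general-aviation operations
    (10_000, 1e-8, 5e-3, 5e-4),  # hypothetical commercial overflights
]
F = crash_hit_frequency(sources)
print(f"{F:.2e} expected crash hits per year")
```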

  19. Assessing Genomic Selection Prediction Accuracy in a Dynamic Barley Breeding Population

    Directory of Open Access Journals (Sweden)

    A. H. Sallam

    2015-03-01

    Full Text Available Prediction accuracy of genomic selection (GS) has been previously evaluated through simulation and cross-validation; however, validation based on progeny performance in a plant breeding program has not been investigated thoroughly. We evaluated several prediction models in a dynamic barley breeding population comprised of 647 six-row lines using four traits differing in genetic architecture and 1536 single nucleotide polymorphism (SNP) markers. The breeding lines were divided into six sets designated as one parent set and five consecutive progeny sets comprised of representative samples of breeding lines over a 5-yr period. We used these data sets to investigate the effect of model and training population composition on prediction accuracy over time. We found little difference in prediction accuracy among the models, confirming prior studies that found the simplest model, random regression best linear unbiased prediction (RR-BLUP), to be accurate across a range of situations. In general, we found that using the parent set was sufficient to predict progeny sets, with little to no gain in accuracy from generating larger training populations by combining the parent set with subsequent progeny sets. The prediction accuracy ranged from 0.03 to 0.99 across the four traits and five progeny sets. We explored characteristics of the training and validation populations (marker allele frequency, population structure, and linkage disequilibrium, LD) as well as characteristics of the trait (genetic architecture and heritability). Fixation of markers associated with a trait over time was most clearly associated with reduced prediction accuracy for the mycotoxin trait DON. Higher trait heritability in the training population and simpler trait architecture were associated with greater prediction accuracy.
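The RR-BLUP model named above treats all marker effects as random with a common variance, which is equivalent to ridge regression on the genotype matrix: beta = (X'X + lambda*I)^-1 X'y, with progeny predicted as X_new @ beta. A toy sketch on simulated data; marker counts, sample sizes, and the shrinkage parameter are arbitrary here:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_markers = 100, 50
# Genotypes coded 0/1/2 (allele dosage), simulated uniformly for the sketch.
X = rng.integers(0, 3, size=(n_train, n_markers)).astype(float)
true_beta = rng.normal(0, 0.5, n_markers)
y = X @ true_beta + rng.normal(0, 1.0, n_train)  # phenotype = genetics + noise

lam = 1.0  # shrinkage parameter (tied to heritability in RR-BLUP)
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)

# Prediction accuracy: correlation of predicted and true genetic values
# in an unphenotyped "progeny" set.
X_new = rng.integers(0, 3, size=(20, n_markers)).astype(float)
accuracy = np.corrcoef(X_new @ true_beta, X_new @ beta_hat)[0, 1]
print(round(accuracy, 2))
```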

  20. Coupling of Realistic Rate Estimates with Genomics for Assessing Contaminant Attenuation and Long-Term Plume Containment

    Energy Technology Data Exchange (ETDEWEB)

    Colwell, F. S.; Crawford, R. L.; Sorenson, K.

    2003-06-01

    Dissolved dense nonaqueous-phase liquid plumes are persistent, widespread problems in the DOE complex. While perceived as being difficult to degrade, at the Idaho National Engineering and Environmental Laboratory, dissolved trichloroethylene (TCE) is disappearing from the Snake River Plain aquifer (SRPA) by natural attenuation, a finding that saves significant site restoration costs. Acceptance of monitored natural attenuation as a preferred treatment technology requires direct proof of the process and rate of the degradation. Our proposal aims to provide that proof for one such site by testing two hypotheses. First, we believe that realistic values for in situ rates of TCE cometabolism can be obtained by sustaining the putative microorganisms at the low catabolic activities consistent with aquifer conditions. Second, the patterns of functional gene expression evident in these communities under starvation conditions while carrying out TCE cometabolism can be used to diagnose the cometabolic activity in the aquifer itself. Using the cometabolism rate parameters derived in low-growth bioreactors, we will complete the models that predict the time until background levels of TCE are attained at this location and validate the long term stewardship of this plume. Realistic terms for cometabolism of TCE will provide marked improvements in DOE's ability to predict and monitor natural attenuation of chlorinated organics at other sites, increase the acceptability of this solution, and provide significant economic and health benefits through this noninvasive remediation strategy. Finally, this project will derive valuable genomic information about the functional attributes of subsurface microbial communities upon which DOE must depend to resolve some of its most difficult contamination issues.

  1. BYSTANDER EFFECTS, GENOMIC INSTABILITY, ADAPTIVE RESPONSE AND CANCER RISK ASSESSMENT FOR RADIATION AND CHEMICAL EXPOSURES

    Science.gov (United States)

    There is an increased interest in utilizing mechanistic data in support of the cancer risk assessment process for ionizing radiation and environmental chemical exposures. In this regard the use of biologically based dose-response models is particularly advocated. The aim is to pr...

  2. Functional toxicogenomic assessment of triclosan in human HepG2 cells using genome-wide CRISPR-Cas9 screen

    Science.gov (United States)

    Thousands of chemicals for which limited toxicological data are available are used and then detected in humans and the environment. Rapid and cost-effective approaches for assessing the toxicological properties of chemicals are needed. We used CRISPR-Cas9 functional genomic scree...

  3. Assessment of adaptability of zebu cattle (Bos indicus) breeds in two different climatic conditions: using cytogenetic techniques on genome integrity.

    Science.gov (United States)

    Kumar, Anil; Waiz, Syma Ashraf; Sridhar Goud, T; Tonk, R K; Grewal, Anita; Singh, S V; Yadav, B R; Upadhyay, R C

    2016-06-01

    The aim of this study was to evaluate genome integrity so as to assess the adaptability of three breeds of indigenous cattle reared in arid and semi-arid regions of Rajasthan (Bikaner) and Haryana (Karnal), India. The cattle formed a homogeneous group (same age and sex) of indigenous breeds, viz. Sahiwal, Tharparkar, and Kankrej. A total of 100 animals were selected for this study from both climatic conditions. Sister chromatid exchanges (SCEs), chromosomal gaps, and chromatid breaks were observed in metaphase plates of chromosome preparations obtained from in vitro culture of peripheral blood lymphocytes. The mean numbers of breaks and gaps in Sahiwal and Tharparkar of the semi-arid zone were 8.56 ± 3.16, 6.4 ± 3.39 and 8.72 ± 2.04, 3.52 ± 6.29, respectively. Similarly, the mean numbers of breaks and gaps in Tharparkar and Kankrej cattle of the arid zone were 5.26 ± 1.76, 2.74 ± 1.76 and 5.24 ± 1.84, 2.5 ± 1.26, respectively. The frequency of SCEs was found to be significantly higher (P < 0.05) in one climatic zone, while no significant difference (P > 0.05) was observed within the same zone. The analysis of the frequency of CAs and SCEs revealed significant effects of environmental conditions on the genome integrity of the animals, thereby indicating an association with their adaptability.

  4. From disease association to risk assessment: an optimistic view from genome-wide association studies on type 1 diabetes.

    Directory of Open Access Journals (Sweden)

    Zhi Wei

    2009-10-01

    Full Text Available Genome-wide association studies (GWAS) have been fruitful in identifying disease susceptibility loci for common and complex diseases. A remaining question is whether we can quantify individual disease risk based on genotype data, in order to facilitate personalized prevention and treatment for complex diseases. Previous studies have typically failed to achieve satisfactory performance, primarily due to the use of only a limited number of confirmed susceptibility loci. Here we propose that sophisticated machine-learning approaches with a large ensemble of markers may improve the performance of disease risk assessment. We applied a Support Vector Machine (SVM) algorithm on a GWAS dataset generated on the Affymetrix genotyping platform for type 1 diabetes (T1D) and optimized a risk assessment model with hundreds of markers. We subsequently tested this model on an independent Illumina-genotyped dataset with imputed genotypes (1,008 cases and 1,000 controls), as well as a separate Affymetrix-genotyped dataset (1,529 cases and 1,458 controls), resulting in an area under the ROC curve (AUC) of approximately 0.84 in both datasets. In contrast, poor performance was achieved when limited to dozens of known susceptibility loci in the SVM model or logistic regression model. Our study suggests that improved disease risk assessment can be achieved by using algorithms that take into account interactions between a large ensemble of markers. We are optimistic that genotype-based disease risk assessment may be feasible for diseases where a notable proportion of the risk has already been captured by SNP arrays.
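The evaluation metric here, area under the ROC curve, measures how well a genotype-derived score ranks cases above controls. The sketch below uses a simple per-SNP log-odds score on simulated genotypes instead of the paper's SVM, and, for brevity, derives the weights from the generating allele frequencies (an oracle a real analysis would not have); the AUC computation itself is general:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 400, 200
freqs_case = rng.uniform(0.3, 0.7, m)
freqs_ctrl = freqs_case - rng.uniform(0.0, 0.1, m)  # shifted allele freqs
cases = rng.binomial(2, freqs_case, size=(n, m)).astype(float)
ctrls = rng.binomial(2, freqs_ctrl, size=(n, m)).astype(float)

# Oracle per-SNP weights: log ratio of the generating allele frequencies.
w = np.log((freqs_case + 0.01) / (freqs_ctrl + 0.01))
score_case, score_ctrl = cases @ w, ctrls @ w

def auc(pos, neg):
    """P(random case score > random control score), computed exactly
    over all case/control pairs; ties count half."""
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

print(round(auc(score_case, score_ctrl), 2))
```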

  5. Genome-wide assessment for genetic variants associated with ventricular dysfunction after primary coronary artery bypass graft surgery.

    Directory of Open Access Journals (Sweden)

    Amanda A Fox

    Full Text Available BACKGROUND: Postoperative ventricular dysfunction (VnD) occurs in 9-20% of coronary artery bypass graft (CABG) surgical patients and is associated with increased postoperative morbidity and mortality. Understanding genetic causes of postoperative VnD should enhance patient risk stratification and improve treatment and prevention strategies. We aimed to determine if genetic variants associate with occurrence of in-hospital VnD after CABG surgery. METHODS: A genome-wide association study identified single nucleotide polymorphisms (SNPs) associated with postoperative VnD in male subjects of European ancestry undergoing isolated primary CABG surgery with cardiopulmonary bypass. VnD was defined as the need for ≥2 inotropes or mechanical ventricular support after CABG surgery. Validated SNPs were assessed further in two replication CABG cohorts and meta-analysis was performed. RESULTS: Over 100 SNPs were associated with VnD (P < 10^-4), but replication did not identify any common genetic variant with a large effect (odds ratio > 2.1) on developing in-hospital VnD after CABG surgery. However, three genetic loci identified by meta-analysis were more modestly associated with development of postoperative VnD. Studies of larger cohorts to assess these loci as well as to define other genetic mechanisms and related biology that link genetic variants to postoperative ventricular dysfunction are warranted.

  6. Technology Assessment for Powertrain Components Final Report CRADA No. TC-1124-95

    Energy Technology Data Exchange (ETDEWEB)

    Tokarz, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gough, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-10-19

    LLNL utilized its defense technology assessment methodologies, in combination with its capabilities in the energy, manufacturing, and transportation technologies, to demonstrate a methodology that synthesized available but incomplete information on advanced automotive technologies into a comprehensive framework.

  7. Assessment of heterogeneity between European Populations: a Baltic and Danish replication case-control study of SNPs from a recent European ulcerative colitis genome wide association study

    DEFF Research Database (Denmark)

    Andersen, Vibeke; Ernst, Anja; Sventoraityte, Jurgita

    2011-01-01

    Background: Differences in the genetic architecture of inflammatory bowel disease between different European countries and ethnicities have previously been reported. In the present study, we wanted to assess the role of 11 newly identified UC risk variants, derived from a recent European UC genome-wide association study (GWAS) (Franke et al., 2010), for 1) association with UC in the Nordic countries, 2) population heterogeneity between the Nordic countries and the rest of Europe, and 3) eventually, driving some of the previous findings towards overall genome-wide significance. Results: Evidence of population heterogeneity was observed for three SNPs in the combined Baltic, Danish, and Norwegian panel versus the combined German, British, Belgian, and Greek panel (rs7520292 (P = 0.001), rs12518307 (P = 0.007), and rs2395609 (TCP11) (P = 0.01), respectively). No SNP reached genome-wide significance in the combined analyses of all the panels.

  8. Environmental Assessment for the vacuum process laboratory (VPL) relocation at the Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    1992-04-01

    This Environmental Assessment (EA) evaluates the potential environmental impacts of relocating a vacuum process laboratory (VPL) from Building 321 to Building 2231 at Lawrence Livermore National Laboratory (LLNL). The VPL provides the latest technology in the field of vacuum deposition of coatings onto various substrates for several weapons-related and energy-related programs at LLNL. Operations within the VPL at LLNL will be neither expanded nor reduced by the relocation. No significant environmental impacts are expected as a result of the relocation of the VPL.

  9. Genomic regions under selection in crop-wild hybrids of lettuce: implications for crop breeding and environmental risk assessment

    NARCIS (Netherlands)

    Hartman, Y.

    2012-01-01

    The results of this thesis show that the probability of introgression of a putative transgene to wild relatives indeed depends strongly on the insertion location of the transgene. The study of genomic selection patterns can identify crop genomic regions under negative selection in multiple

  10. First experimental results from IBM/TENN/TULANE/LLNL/LBL undulator beamline at the advanced light source

    International Nuclear Information System (INIS)

    Jia, J.J.; Callcott, T.A.; Yurkas, J.; Ellis, A.W.; Himpsel, F.J.; Samant, M.G.; Stoehr, J.; Ederer, D.L.; Carlisle, J.A.; Hudson, E.A.; Terminello, L.J.; Shuh, D.K.; Perera, R.C.C.

    1995-01-01

    The IBM/TENN/TULANE/LLNL/LBL Beamline 8.0 at the Advanced Light Source, combining a 5.0 cm, 89-period undulator with a high-throughput, high-resolution spherical grating monochromator, provides a powerful excitation source over a spectral range of 70-1200 eV for surface physics and materials science research. The beamline progress and the first experimental results obtained with a fluorescence end station on graphite and titanium oxides are presented here. Dispersive features are observed in the K emission spectra of graphite excited near threshold, and a clear relationship is found between these features and the graphite band structure. The monochromator is operated at a resolving power of roughly 2000, while the spectrometer has a resolving power of 400 for these fluorescence experiments.

  11. Production of High Harmonic X-ray Radiation from Non-linear Thomson Scattering at LLNL PLEIADES

    International Nuclear Information System (INIS)

    Lim, J; Doyuran, A; Frigola, P; Travish, G; Rosenzweig, J; Anderson, S; Betts, S; Crane, J; Gibson, D; Hartemann, F; Tremaine, A

    2005-01-01

    We describe an experiment for production of high harmonic x-ray radiation from Thomson backscattering of an ultra-short, high power density laser pulse by a relativistic electron beam at the PLEIADES facility at LLNL. In this scenario, electrons execute a "figure-8" motion under the influence of the high-intensity laser field, where the constant characterizing the field strength is expected to exceed unity: a_L = eE_L/(m_e c ω_L) ≥ 1. With large a_L this motion produces high harmonic x-ray radiation and significant broadening of the spectral peaks. This paper is intended to give a layout of the PLEIADES experiment, along with progress towards experimental goals.
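The field-strength parameter a_L = eE_L/(m_e c ω_L) can be evaluated directly from the peak intensity via I = (1/2) ε0 c E^2 for linear polarization. The sketch below assumes an 800 nm Ti:sapphire-class wavelength, which is not stated in the record:

```python
import math

e = 1.602176634e-19      # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

def a_L(intensity_W_cm2, wavelength_m):
    """Normalized field-strength parameter a_L = e*E_L/(m_e*c*omega_L)."""
    I = intensity_W_cm2 * 1e4               # W/cm^2 -> W/m^2
    E = math.sqrt(2 * I / (eps0 * c))       # peak electric field, V/m
    omega = 2 * math.pi * c / wavelength_m  # laser angular frequency, rad/s
    return e * E / (m_e * c * omega)

print(round(a_L(1e21, 800e-9), 1))  # ≈ 21.6 at 10^21 W/cm^2, 800 nm
```

Values well above 1, as here, mark the strongly relativistic "figure-8" regime; at 10^18 W/cm^2 the same formula gives a_L below unity.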

  12. Use of genomic data in risk assessment case study: II. Evaluation of the dibutyl phthalate toxicogenomic data set

    Energy Technology Data Exchange (ETDEWEB)

    Euling, Susan Y., E-mail: euling.susan@epa.gov [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, DC (United States); White, Lori D. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Kim, Andrea S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, DC (United States); Sen, Banalata [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Wilson, Vickie S. [National Health and Environmental Effects Research Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Keshava, Channa; Keshava, Nagalakshmi [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, DC (United States); Hester, Susan [National Health and Environmental Effects Research Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Ovacik, Meric A.; Ierapetritou, Marianthi G.; Androulakis, Ioannis P. [National Center for Environmental Research Science to Achieve Results (STAR) Bioinformatics Center, Environmental Bioinformatics and Computational Toxicology Center (ebCTC), Rutgers University and University of Medicine and Dentistry of New Jersey, Piscataway, NJ (United States); Gaido, Kevin W. [Center for Veterinary Medicine, U.S. Food and Drug Administration, Rockville, MD 20855 (United States)

    2013-09-15

    An evaluation of the toxicogenomic data set for dibutyl phthalate (DBP) and male reproductive developmental effects was performed as part of a larger case study to test an approach for incorporating genomic data in risk assessment. The DBP toxicogenomic data set is composed of nine in vivo studies from the published literature that exposed rats to DBP during gestation and evaluated gene expression changes in testes or Wolffian ducts of male fetuses. The exercise focused on qualitative evaluation, based on a lack of available dose–response data, of the DBP toxicogenomic data set to postulate modes and mechanisms of action for the male reproductive developmental outcomes, which occur in the lower dose range. A weight-of-evidence evaluation was performed on the eight DBP toxicogenomic studies of the rat testis at the gene and pathway levels. The results showed relatively strong evidence of DBP-induced downregulation of genes in the steroidogenesis pathway and lipid/sterol/cholesterol transport pathway as well as effects on immediate early gene/growth/differentiation, transcription, peroxisome proliferator-activated receptor signaling and apoptosis pathways in the testis. Since two established modes of action (MOAs), reduced fetal testicular testosterone production and Insl3 gene expression, explain some but not all of the testis effects observed in rats after in utero DBP exposure, other MOAs are likely to be operative. A reanalysis of one DBP microarray study identified additional pathways within cell signaling, metabolism, hormone, disease, and cell adhesion biological processes. These putative new pathways may be associated with DBP effects on the testes that are currently unexplained. This case study on DBP identified data gaps and research needs for the use of toxicogenomic data in risk assessment. Furthermore, this study demonstrated an approach for evaluating toxicogenomic data in human health risk assessment that could be applied to future chemicals

  13. Integrating molecular QTL data into genome-wide genetic association analysis: Probabilistic assessment of enrichment and colocalization.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Wen

    2017-03-01

    Full Text Available We propose a novel statistical framework for integrating the results from molecular quantitative trait loci (QTL) mapping into genome-wide genetic association analysis of complex traits, with the primary objectives of quantitatively assessing the enrichment of the molecular QTLs in complex trait-associated genetic variants and the colocalizations of the two types of association signals. We introduce a natural Bayesian hierarchical model that treats the latent association status of molecular QTLs as SNP-level annotations for candidate SNPs of complex traits. We detail a computational procedure to seamlessly perform enrichment, fine-mapping and colocalization analyses, which is a distinct feature compared to the existing colocalization analysis procedures in the literature. The proposed approach is computationally efficient and requires only summary-level statistics. We evaluate and demonstrate the proposed computational approach through extensive simulation studies and analyses of blood lipid data and the whole blood eQTL data from the GTEx project. In addition, a useful utility from our proposed method enables the computation of expected colocalization signals using simple characteristics of the association data. Using this utility, we further illustrate the importance of enrichment analysis for the ability to discover colocalized signals and the potential limitations of currently available molecular QTL data. The software pipeline that implements the proposed computation procedures, enloc, is freely available at https://github.com/xqwen/integrative.

  14. A Population Genomics Approach to Assessing the Genetic Basis of Within-Host Microevolution Underlying Recurrent Cryptococcal Meningitis Infection

    Directory of Open Access Journals (Sweden)

    Johanna Rhodes

    2017-04-01

    Full Text Available Recurrence of meningitis due to Cryptococcus neoformans after treatment causes substantial mortality in HIV/AIDS patients across sub-Saharan Africa. In order to determine whether recurrence occurred due to relapse of the original infecting isolate or reinfection with a different isolate weeks or months after initial treatment, we used whole-genome sequencing (WGS) to assess the genetic basis of infection in 17 HIV-infected individuals with recurrent cryptococcal meningitis (CM). Comparisons revealed a clonal relationship for 15 pairs of isolates recovered before and after recurrence, showing relapse of the original infection. The two remaining pairs showed high levels of genetic heterogeneity; in one pair we found this to be a result of infection by mixed genotypes, while in the second it was a result of nonsense mutations in the genes encoding the DNA mismatch repair proteins MSH2, MSH5, and RAD5. These nonsense mutations led to a hypermutator state with dramatically elevated rates of synonymous and nonsynonymous substitutions. Hypermutator phenotypes owing to nonsense mutations in these genes have not previously been reported in C. neoformans, and represent a novel pathway for rapid within-host adaptation and evolution of resistance to first-line antifungal drugs.
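
    The relapse-versus-reinfection call in studies like this ultimately rests on a pairwise SNP distance between isolates. A minimal sketch, with an entirely illustrative distance threshold (real analyses calibrate it against within-patient and population-level diversity):

```python
def snp_distance(seq_a, seq_b):
    """Count substitutions between two aligned consensus sequences,
    ignoring positions where either sequence has an ambiguous base 'N'."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    return sum(1 for a, b in zip(seq_a, seq_b)
               if a != b and a != 'N' and b != 'N')

def classify_pair(dist, clonal_threshold=100):
    """Crude call: below the threshold the recurrence is treated as relapse
    of the original isolate; above it, as reinfection, mixed infection, or
    hypermutation. The threshold here is illustrative only."""
    return "relapse" if dist <= clonal_threshold else "reinfection/heterogeneous"

d = snp_distance("ACGTACGT", "ACGTACGA")
print(d, classify_pair(d))  # 1 relapse
```

    A hypermutator pair would stand out in exactly this metric: its SNP distance is inflated far beyond the clonal range even though the isolates share an ancestor within the host.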

  15. Progress Toward Measuring CO2 Isotopologue Fluxes in situ with the LLNL Miniature, Laser-based CO2 Sensor

    Science.gov (United States)

    Osuna, J. L.; Bora, M.; Bond, T.

    2015-12-01

    One method to constrain photosynthesis and respiration independently at the ecosystem scale is to measure the fluxes of CO2 isotopologues. Instrumentation is currently available to make these measurements, but it generally consists of costly, large, bench-top instruments. Here, we present progress toward developing a laser-based sensor that can be deployed directly to a canopy to passively measure CO2 isotopologue fluxes. In this study, we perform initial proof-of-concept and sensor characterization tests in the laboratory and in the field to demonstrate performance of the Lawrence Livermore National Laboratory (LLNL) tunable diode laser flux sensor. The results shown herein demonstrate measurement of bulk CO2 as a first step toward achieving flux measurements of CO2 isotopologues. The sensor uses a Vertical Cavity Surface Emitting Laser (VCSEL) in the 2012 nm range. The laser is mounted in a multi-pass White cell. In order to amplify the absorption signal of CO2 in this range we employ wavelength modulation spectroscopy, introducing an alternating current (AC) bias component at modulation frequency f on the laser drive current, in addition to the direct current (DC) emission scanning component. We observed a strong linear relationship (r2 = 0.998 and r2 = 0.978 at all and low CO2 concentrations, respectively) between the 2f signal and the CO2 concentration in the cell across the range of CO2 concentrations relevant for flux measurements. We use this calibration to interpret the CO2 concentration of a gas flowing through the White cell in the laboratory and deployed over a grassy field. We will discuss sensor performance in the lab and in situ, as well as address steps toward achieving canopy-deployed, passive measurements of CO2 isotopologue fluxes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-675788
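
    The linear 2f-signal-to-concentration calibration described above amounts to a least-squares line fit that is then inverted to read concentration from a new lock-in measurement. A sketch with made-up calibration points (the amplitudes and concentrations below are hypothetical, not the paper's data):

```python
import numpy as np

# Hypothetical calibration: 2f lock-in amplitude (arbitrary units)
# recorded at known CO2 concentrations in the White cell.
ppm = np.array([200.0, 300.0, 400.0, 500.0, 600.0])
sig = np.array([0.41, 0.62, 0.83, 1.02, 1.24])

# Least-squares line sig = m*ppm + b, mirroring the linear 2f-vs-CO2
# relationship reported above.
m, b = np.polyfit(ppm, sig, 1)
r2 = np.corrcoef(ppm, sig)[0, 1] ** 2

def to_ppm(s):
    """Invert the calibration: convert a measured 2f amplitude to ppm CO2."""
    return (s - b) / m

print(round(r2, 4))           # goodness of the linear calibration
print(round(to_ppm(0.93), 1)) # concentration inferred from a new reading
```

    In operation the same inversion runs continuously on the flowing-gas signal, which is how the cell readings over the grassy field are converted to concentration time series.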

  16. Responses of intestinal virome to silver nanoparticles: safety assessment by classical virology, whole-genome sequencing and bioinformatics approaches

    Directory of Open Access Journals (Sweden)

    Gokulan K

    2018-05-01

    Full Text Available Kuppan Gokulan,1,* Aschalew Z Bekele,1,* Kenneth L Drake,2 Sangeeta Khare1 1Division of Microbiology, US Food and Drug Administration, National Center for Toxicological Research, Jefferson, AR, USA; 2Seralogix, Inc., Austin, TX, USA *These authors contributed equally to this work Background: Effects of silver nanoparticles (AgNP) on the intestinal virome/phage community are mostly unknown. The working hypothesis of this study was that exposure of the gastrointestinal tract to pharmaceutical/nanomedicine and other consumer-use materials containing silver ions and nanoparticles may result in disturbance of the beneficial gut viruses/phages. Methods: This study assesses the impact of AgNP on the survival of individual bacteriophages using classical virology cultivation and electron microscopic techniques. Moreover, how the ingested AgNP may affect the intestinal viruses/phages was investigated by conducting whole-genome sequencing (WGS). Results: The viral cultivation methods showed minimal effect on selected viruses during short-term exposure (24 h) to 10 nm AgNP. However, long-term exposure (7 days) resulted in significant reduction in the viral/phage population. Data obtained from WGS were filtered and compared with a nonredundant viral database composed of the complete viral genomes from NCBI using KRAKEN (confidence scoring threshold of 0.5). To compare the relative differential changes, the sequence counts in each treatment group were normalized to account for differences in DNA sequencing library sizes. Bioinformatics techniques were developed to visualize the virome comparative changes in a phylogenetic tree graph. The computed data revealed that AgNP had an impact on several intestinal bacteriophages that prey on the bacterial genera Enterobacteria, Yersinia and Staphylococcus as host species. Moreover, there was an independent effect of nanoparticles and released ions. Conclusion: Overall, this study reveals that the small-size AgNP could lead to

  17. Probabilistic quantitative microbial risk assessment model of norovirus from wastewater irrigated vegetables in Ghana using genome copies and fecal indicator ratio conversion for estimating exposure dose

    DEFF Research Database (Denmark)

    Owusu-Ansah, Emmanuel de-Graft Johnson; Sampson, Angelina; Amponsah, Samuel K.

    2017-01-01

    The need to replace the commonly applied fecal indicator conversion ratio (an assumption of 1:10⁻⁵ virus to fecal indicator organism) in Quantitative Microbial Risk Assessment (QMRA) with models based on quantitative data on the virus of interest has gained prominence due to the different physical and environmental factors that might influence the reliability of using indicator organisms in microbial risk assessment. The challenges facing analytical studies on virus enumeration (genome copies or particles) have contributed to the already existing lack of data in QMRA modelling. This study attempts to fit a QMRA model to genome copies of norovirus data. The model estimates the risk of norovirus infection from the intake of vegetables irrigated with wastewater from different sources. The results were compared to the results of a corresponding model using the fecal indicator conversion ratio...

  18. Probabilistic quantitative microbial risk assessment model of norovirus from wastewater irrigated vegetables in Ghana using genome copies and fecal indicator ratio conversion for estimating exposure dose.

    Science.gov (United States)

    Owusu-Ansah, Emmanuel de-Graft Johnson; Sampson, Angelina; Amponsah, Samuel K; Abaidoo, Robert C; Dalsgaard, Anders; Hald, Tine

    2017-12-01

    The need to replace the commonly applied fecal indicator conversion ratio (an assumption of 1:10⁻⁵ virus to fecal indicator organism) in Quantitative Microbial Risk Assessment (QMRA) with models based on quantitative data on the virus of interest has gained prominence due to the different physical and environmental factors that might influence the reliability of using indicator organisms in microbial risk assessment. The challenges facing analytical studies on virus enumeration (genome copies or particles) have contributed to the already existing lack of data in QMRA modelling. This study attempts to fit a QMRA model to genome copies of norovirus data. The model estimates the risk of norovirus infection from the intake of vegetables irrigated with wastewater from different sources. The results were compared to the results of a corresponding model using the fecal indicator conversion ratio to estimate the norovirus count. In all scenarios of using different water sources, the application of the fecal indicator conversion ratio underestimated the norovirus disease burden, measured by the Disability Adjusted Life Years (DALYs), when compared to results using the genome copies norovirus data. In some cases the difference was >2 orders of magnitude. All scenarios using genome copies met the 10⁻⁴ DALY per person per year threshold for consumption of vegetables irrigated with wastewater, although these results are considered to be highly conservative risk estimates. The fecal indicator conversion ratio model of stream-water and drain-water sources of wastewater achieved the 10⁻⁶ DALY per person per year threshold, which tends to indicate an underestimation of health risk when compared to using genome copies for estimating the dose. Copyright © 2017 Elsevier B.V. All rights reserved.
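
    The underestimation the abstract describes can be reproduced with a toy Monte Carlo QMRA. Everything below is illustrative: the exponential dose-response parameter, concentrations, serving sizes, and the lognormal variability are stand-ins, not the study's fitted values; only the 1:10⁻⁵ virus-to-indicator conversion comes from the text.

```python
import math
import random

random.seed(1)

def p_infection(dose, r=0.01):
    """Exponential dose-response P = 1 - exp(-r*dose); r is purely
    illustrative -- published norovirus models differ."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(mean_conc_per_g, g_per_serving=10.0, servings=10, n=20000):
    """Monte Carlo annual infection risk with lognormal variability in
    the per-serving virus concentration on the vegetables."""
    total = 0.0
    for _ in range(n):
        conc = mean_conc_per_g * random.lognormvariate(0.0, 1.0)
        p_serv = p_infection(conc * g_per_serving)
        total += 1.0 - (1.0 - p_serv) ** servings
    return total / n

# Dose from measured genome copies vs. dose back-calculated from fecal
# indicator counts via the 1:1e-5 conversion assumption (toy values).
risk_genome = annual_risk(mean_conc_per_g=0.5)            # gc/g measured
risk_indicator = annual_risk(mean_conc_per_g=1e4 * 1e-5)  # cfu/g * ratio
print(risk_genome > risk_indicator)  # conversion-ratio model sits lower
```

    Whenever the measured genome-copy concentration exceeds the indicator-derived estimate, the ratio-based model necessarily reports a lower DALY burden, which is the pattern the study documents.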

  19. Tiling array-CGH for the assessment of genomic similarities among synchronous unilateral and bilateral invasive breast cancer tumor pairs

    Directory of Open Access Journals (Sweden)

    Ringnér Markus

    2008-07-01

    Full Text Available Abstract Background Today, no objective criteria exist to differentiate between individual primary tumors and intra- or intermammary dissemination respectively, in patients diagnosed with two or more synchronous breast cancers. To elucidate whether these tumors most likely arise through clonal expansion, or whether they represent individual primary tumors is of tumor biological interest and may have clinical implications. In this respect, high resolution genomic profiling may provide a more reliable approach than conventional histopathological and tumor biological factors. Methods 32K tiling microarray-based comparative genomic hybridization (aCGH) was used to explore the genomic similarities among synchronous unilateral and bilateral invasive breast cancer tumor pairs, and was compared with histopathological and tumor biological parameters. Results Based on global copy number profiles and unsupervised hierarchical clustering, five of ten (p = 1.9 × 10⁻⁵) unilateral tumor pairs displayed similar genomic profiles within the pair, while only one of eight bilateral tumor pairs (p = 0.29) displayed pair-wise genomic similarities. DNA index, histological type and presence of vessel invasion correlated with the genomic analyses. Conclusion Synchronous unilateral tumor pairs are often genomically similar, while synchronous bilateral tumors most often represent individual primary tumors. However, two independent unilateral primary tumors can develop synchronously and contralateral tumor spread can occur. The presence of an intraductal component is not informative when establishing the independence of two tumors, while vessel invasion, the presence of which was found in clustering tumor pairs but not in tumor pairs that did not cluster together, supports the clustering outcome. Our data suggest that genomically similar unilateral tumor pairs may represent a more aggressive disease that requires the addition of more severe treatment modalities, and
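
    The pair-wise grouping above rests on comparing genome-wide copy-number profiles. A minimal sketch of that comparison, with fully synthetic log2-ratio profiles (probe counts, aberration sizes, and noise levels are all invented): a clonally related pair shares aberrations and therefore correlates strongly, while two independent primaries do not.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy log2-ratio aCGH profiles over 200 probes (all values hypothetical).
# Pair 1 shares a clonal 50-probe gain; pair 2 carries unrelated changes.
probes = np.arange(200)
base = rng.normal(0.0, 0.1, 200)        # shared noise floor
clonal = base + (probes < 50) * 1.0     # aberration common to pair 1
t1a = clonal + rng.normal(0.0, 0.1, 200)
t1b = clonal + rng.normal(0.0, 0.1, 200)
t2a = base + (probes > 150) * -1.0 + rng.normal(0.0, 0.1, 200)
t2b = base + ((probes > 80) & (probes < 120)) * 1.0 + rng.normal(0.0, 0.1, 200)

def profile_similarity(x, y):
    """Pearson correlation of two copy-number profiles; hierarchical
    clustering on a matrix of such similarities drives the pair-wise
    grouping described in the abstract."""
    return float(np.corrcoef(x, y)[0, 1])

print(profile_similarity(t1a, t1b) > profile_similarity(t2a, t2b))  # True
```

    In the study the same logic is applied via unsupervised hierarchical clustering over all tumors, so that clonally related pairs fall into the same cluster branch.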

  20. Genome-Wide Association Mapping for Intelligence in Military Working Dogs: Canine Cohort, Canine Intelligence Assessment Regimen, Genome-Wide Single Nucleotide Polymorphism (SNP) Typing, and Unsupervised Classification Algorithm for Genome-Wide Association Data Analysis

    Science.gov (United States)

    2011-09-01

    SNP Array v2. A ‘proof-of-concept’ advanced data mining algorithm for unsupervised analysis of a genome-wide association study (GWAS) dataset was... [cohort roster table omitted: dog ID, name, sex, breed, enrollment status] ... mining of genetic studies in general, and especially GWAS. As a proof-of-concept, a classification analysis of the WG SNP typing dataset of a

  1. Quantitative RNA-Seq analysis in non-model species: assessing transcriptome assemblies as a scaffold and the utility of evolutionary divergent genomic reference species

    Directory of Open Access Journals (Sweden)

    Hornett Emily A

    2012-08-01

    Full Text Available Abstract Background How well does RNA-Seq data perform for quantitative whole gene expression analysis in the absence of a genome? This is one unanswered question facing the rapidly growing number of researchers studying non-model species. Using Homo sapiens data and resources, we compared the direct mapping of sequencing reads to predicted genes from the genome with mapping to de novo transcriptomes assembled from RNA-Seq data. Gene coverage and expression analysis was further investigated in the non-model context by using increasingly divergent genomic reference species to group assembled contigs by unique genes. Results Eight transcriptome sets, composed of varying amounts of Illumina and 454 data, were assembled and assessed. Hybrid 454/Illumina assemblies had the highest transcriptome and individual gene coverage. Quantitative whole gene expression levels were highly similar between using a de novo hybrid assembly and the predicted genes as a scaffold, although mapping to the de novo transcriptome assembly provided data on fewer genes. Using non-target species as reference scaffolds does result in some loss of sequence and expression data, and bias and error increase with evolutionary distance. However, within a 100 million year window these effect sizes are relatively small. Conclusions Predicted gene sets from sequenced genomes of related species can provide a powerful method for grouping RNA-Seq reads and annotating contigs. Gene expression results can be produced that are similar to results obtained using gene models derived from a high quality genome, though biased towards conserved genes. Our results demonstrate the power and limitations of conducting RNA-Seq in non-model species.

  2. Genome-wide assessment of the association of rare and common copy number variations to testicular germ cell cancer

    DEFF Research Database (Denmark)

    Edsgard, Stefan Daniel; Dalgaard, Marlene Danner; Weinhold, Nils

    2013-01-01

    Testicular germ cell cancer (TGCC) is one of the most heritable forms of cancer. Previous genome-wide association studies have focused on single nucleotide polymorphisms, largely ignoring the influence of copy number variants (CNVs). Here we present a genome-wide study of CNV on a cohort of 212...... of rare CNVs related to cell migration (false-discovery rate = 0.021, 1.8% of cases and 1.1% of controls). Dysregulation during migration of primordial germ cells has previously been suspected to be a part of TGCC development and this set of multiple rare variants may thereby have a minor contribution...

  3. Environmental assessment for the proposed construction and operation of a Genome Sequencing Facility in Building 64 at Lawrence Berkeley Laboratory, Berkeley, California

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    This document is an Environmental Assessment (EA) for a proposed project to modify 14,900 square feet of an existing building (Building 64) at Lawrence Berkeley Laboratory (LBL) to operate as a Genome Sequencing Facility. This EA addresses the potential environmental impacts from the proposed modifications to Building 64 and operation of the Genome Sequencing Facility. The proposed action is to modify Building 64 to provide space and equipment allowing LBL to demonstrate that the Directed DNA Sequencing Strategy can be scaled up from the current level of 750,000 base pairs per year to a facility that produces over 6,000,000 base pairs per year, while still retaining its efficiency.

  4. Lawrence Livermore National Laboratory (LLNL) Oxide Material Representation in the Material Identification and Surveillance (MIS) Program, Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Riley, D C; Dodson, K

    2004-06-30

    The Materials Identification and Surveillance (MIS) program was established within the 94-1 R&D Program to confirm the suitability of plutonium-bearing materials for stabilization, packaging, and long-term storage under DOE-STD-3013-2000. Oxide materials from different sites were chemically and physically characterized. The adequacy of the stabilization process parameters of temperature and duration at temperature (950 C and 2 hours) for eliminating chemical reactivity and reducing the moisture content to less than 0.5 weight percent was validated. Studies also include surveillance monitoring to determine the behavior of the oxides and packaging materials under storage conditions. Materials selected for this program were assumed to be representative of the overall inventory for DOE sites. The Quality Assurance section of DOE-STD-3013-2000 required that each site be responsible for assuring that oxides packaged according to this standard are represented by items in the MIS characterization program. The purpose of this document is to define the path for determining if an individual item is "represented" in the MIS Program and to show that oxides being packaged at Lawrence Livermore National Laboratory (LLNL) are considered represented in the MIS program. The methodology outlined in the MIS Representation Document (LA-14016-MS) for demonstrating representation requires concurrence of the MIS Working Group (MIS-WG). The signature page on this document provides for the MIS-WG concurrence.

  5. Population-Based in Vitro Hazard and Concentration–Response Assessment of Chemicals: The 1000 Genomes High-Throughput Screening Study

    Science.gov (United States)

    Abdo, Nour; Xia, Menghang; Brown, Chad C.; Kosyk, Oksana; Huang, Ruili; Sakamuru, Srilatha; Zhou, Yi-Hui; Jack, John R.; Gallins, Paul; Xia, Kai; Li, Yun; Chiu, Weihsueh A.; Motsinger-Reif, Alison A.; Austin, Christopher P.; Tice, Raymond R.

    2015-01-01

    Background: Understanding of human variation in toxicity to environmental chemicals remains limited, so human health risk assessments still largely rely on a generic 10-fold factor (10^(1/2) each for toxicokinetics and toxicodynamics) to account for sensitive individuals or subpopulations. Objectives: We tested a hypothesis that population-wide in vitro cytotoxicity screening can rapidly inform both the magnitude of and molecular causes for interindividual toxicodynamic variability. Methods: We used 1,086 lymphoblastoid cell lines from the 1000 Genomes Project, representing nine populations from five continents, to assess variation in cytotoxic response to 179 chemicals. Analysis included assessments of population variation and heritability, and genome-wide association mapping, with attention to phenotypic relevance to human exposures. Results: For about half the tested compounds, cytotoxic response in the 1% most “sensitive” individual occurred at concentrations within a factor of 10^(1/2) (i.e., approximately 3) of that in the median individual; however, for some compounds, this factor was > 10. Genetic mapping suggested important roles for variation in membrane and transmembrane genes, with a number of chemicals showing association with SNP rs13120371 in the solute carrier SLC7A11, previously implicated in chemoresistance. Conclusions: This experimental approach fills critical gaps unaddressed by recent large-scale toxicity testing programs, providing quantitative, experimentally based estimates of human toxicodynamic variability, and also testable hypotheses about mechanisms contributing to interindividual variation. Citation: Abdo N, Xia M, Brown CC, Kosyk O, Huang R, Sakamuru S, Zhou YH, Jack JR, Gallins P, Xia K, Li Y, Chiu WA, Motsinger-Reif AA, Austin CP, Tice RR, Rusyn I, Wright FA. 2015. Population-based in vitro hazard and concentration–response assessment of chemicals: the 1000 Genomes high-throughput screening study. Environ Health Perspect 123:458
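
    The headline comparison, whether the 1% most sensitive individual falls within a factor of 10^(1/2) of the median, is just a ratio of two quantiles of the per-individual potency distribution. A sketch with simulated EC10 values (the lognormal spread and parameters are assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-individual EC10 values (uM) for one chemical across a
# 1,086-line population panel; the lognormal spread is an assumption.
ec10 = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=1086)

median_ec = np.median(ec10)
sensitive_ec = np.percentile(ec10, 1)  # EC10 of the 1% most sensitive

# Toxicodynamic variability factor: fold-difference between the median
# and the sensitive tail, the quantity the abstract compares to 10^(1/2).
tdvf = median_ec / sensitive_ec
print(round(tdvf, 2))
```

    For a lognormal with sigma = 0.4 this factor lands near exp(0.4 × 2.33) ≈ 2.5, i.e. inside the 10^(1/2) ≈ 3.16 default; chemicals with heavier-tailed response distributions are the ones that exceed it.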

  6. Study of thermal sensitivity and thermal explosion violence of energetic materials in the LLNL ODTX system

    International Nuclear Information System (INIS)

    Hsu, P C; Hust, G; Zhang, M X; Lorenz, T K; Reynolds, J G; Fried, L; Springer, H K; Maienschein, J L

    2014-01-01

    Incidents caused by fire and combat operations can heat energetic materials, which may lead to thermal explosion and result in structural damage and casualties. Some explosives may thermally explode at fairly low temperatures (< 100 °C) and the violence from thermal explosion may cause significant damage. Thus it is important to understand the response of energetic materials to thermal insults. The One Dimensional Time to Explosion (ODTX) system at the Lawrence Livermore National Laboratory has been used for decades to measure times to explosion, threshold thermal explosion temperature, and to determine kinetic parameters of energetic materials. Samples of different configurations (pressed part, powder, paste, and liquid) can be tested in the system. The ODTX testing can also provide useful data for assessing the thermal explosion violence of energetic materials. Recent ODTX experimental data are reported in the paper.

  7. Genic non-coding microsatellites in the rice genome: characterization, marker design and use in assessing genetic and evolutionary relationships among domesticated groups

    Directory of Open Access Journals (Sweden)

    Singh Nagendra

    2009-03-01

    Full Text Available Abstract Background Completely sequenced plant genomes provide scope for designing a large number of microsatellite markers, which are useful in various aspects of crop breeding and genetic analysis. With the objective of developing genic but non-coding microsatellite (GNMS) markers for the rice (Oryza sativa L.) genome, we characterized the frequency and relative distribution of microsatellite repeat-motifs in 18,935 predicted protein coding genes including 14,308 putative promoter sequences. Results We identified 19,555 perfect GNMS repeats with densities ranging from 306.7/Mb in chromosome 1 to 450/Mb in chromosome 12 with an average of 357.5 GNMS per Mb. The average microsatellite density was maximum in the 5' untranslated regions (UTRs) followed by those in introns, promoters, 3'UTRs and minimum in the coding sequences (CDS). Primers were designed for 17,966 (92%) GNMS repeats, including 4,288 (94%) hypervariable class I types, which were bin-mapped on the rice genome. The GNMS markers were most polymorphic in the intronic region (73.3%) followed by markers in the promoter region (53.3%) and least in the CDS (26.6%). The robust polymerase chain reaction (PCR) amplification efficiency and high polymorphic potential of GNMS markers over genic coding and random genomic microsatellite markers suggest their immediate use in efficient genotyping applications in rice. A set of these markers could assess genetic diversity and establish phylogenetic relationships among domesticated rice cultivar groups. We also demonstrated the usefulness of orthologous and paralogous conserved non-coding microsatellite (CNMS) markers, identified in the putative rice promoter sequences, for comparative physical mapping and understanding of evolutionary and gene regulatory complexities among rice and other members of the grass family. The divergence between long-grained aromatics and subspecies japonica was estimated to be more recent (0.004 Mya compared to short
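
    The "perfect microsatellite" counts and per-Mb densities above come from scanning sequence for exact tandem repeats. A minimal regex-based sketch (motif lengths, repeat threshold, and the toy sequence are illustrative; production SSR pipelines also handle homopolymers and overlapping motif classes):

```python
import re

def find_perfect_ssrs(seq, motif_lens=(2, 3), min_repeats=5):
    """Scan for perfect microsatellites: a motif of length k repeated
    tandemly at least min_repeats times. Returns (start, motif, count).
    Illustrative only -- real SSR pipelines are more careful about
    homopolymer runs and equivalent motif classes."""
    hits = []
    for k in motif_lens:
        pat = re.compile(rf"([ACGT]{{{k}}})\1{{{min_repeats - 1},}}")
        for m in pat.finditer(seq):
            hits.append((m.start(), m.group(1), len(m.group(0)) // k))
    return hits

def density_per_mb(n_hits, seq_len_bp):
    """Repeat density as reported above: hits per megabase of sequence."""
    return n_hits / (seq_len_bp / 1e6)

seq = "ATGC" + "AG" * 8 + "TTACG" + "CTT" * 6 + "GGA"
print(find_perfect_ssrs(seq))  # [(4, 'AG', 8), (25, 'CTT', 6)]
```

    Running such a scan separately over 5'UTR, intron, promoter, 3'UTR and CDS annotations, then normalizing with `density_per_mb`, yields region-wise densities like the 306.7-450/Mb range quoted above.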

  8. Rumen microbial genomics

    International Nuclear Information System (INIS)

    Morrison, M.; Nelson, K.E.

    2005-01-01

    Improving microbial degradation of plant cell wall polysaccharides remains one of the highest priority goals for all livestock enterprises, including the cattle herds and draught animals of developing countries. The North American Consortium for Genomics of Fibrolytic Ruminal Bacteria was created to promote the sequencing and comparative analysis of rumen microbial genomes, offering the potential to assess their genetic capabilities in a functional and comparative fashion. It has been found that the Fibrobacter succinogenes genome encodes many more endoglucanases and cellodextrinases than had previously been isolated, and several new processive endoglucanases have been identified by genome and proteomic analysis of Ruminococcus albus, in addition to a variety of strategies for its adhesion to fibre. The ramifications of acquiring genome sequence data for rumen microorganisms are profound, including the potential to elucidate and overcome the biochemical, ecological or physiological processes that are rate limiting for ruminal fibre degradation. (author)

  9. Microbial Genomes Multiply

    Science.gov (United States)

    Doolittle, Russell F.

    2002-01-01

    The publication of the first complete sequence of a bacterial genome in 1995 was a signal event, underscored by the fact that the article has been cited more than 2,100 times during the intervening seven years. It was a marvelous technical achievement, made possible by automatic DNA-sequencing machines. The feat is the more impressive in that complete genome sequencing has now been adopted in many different laboratories around the world. Four years ago in these columns I examined the situation after a dozen microbial genomes had been completed. Now, with upwards of 60 microbial genome sequences determined and twice that many in progress, it seems reasonable to assess just what is being learned. Are new concepts emerging about how cells work? Have there been practical benefits in the fields of medicine and agriculture? Is it feasible to determine the genomic sequence of every bacterial species on Earth? The answers to these questions may be Yes, Perhaps, and No, respectively.

  10. Extreme genomes

    OpenAIRE

    DeLong, Edward F

    2000-01-01

    The complete genome sequence of Thermoplasma acidophilum, an acid- and heat-loving archaeon, has recently been reported. Comparative genomic analysis of this 'extremophile' is providing new insights into the metabolic machinery, ecology and evolution of thermophilic archaea.

  11. Grass genomes

    OpenAIRE

    Bennetzen, Jeffrey L.; SanMiguel, Phillip; Chen, Mingsheng; Tikhonov, Alexander; Francki, Michael; Avramova, Zoya

    1998-01-01

    For the most part, studies of grass genome structure have been limited to the generation of whole-genome genetic maps or the fine structure and sequence analysis of single genes or gene clusters. We have investigated large contiguous segments of the genomes of maize, sorghum, and rice, primarily focusing on intergenic spaces. Our data indicate that much (>50%) of the maize genome is composed of interspersed repetitive DNAs, primarily nested retrotransposons that in...

  12. A framework for assessing the concordance of molecular typing methods and the true strain phylogeny of Campylobacter jejuni and C. coli using draft genome sequence data

    Directory of Open Access Journals (Sweden)

    Catherine Dianna Carrillo

    2012-05-01

    Full Text Available Tracking of sources of sporadic cases of campylobacteriosis remains challenging, as commonly used molecular typing methods have limited ability to unambiguously link genetically related strains. Genomics has become increasingly prominent in the public health response to enteric pathogens, as its methods enable characterization of pathogens at an unprecedented level of resolution. However, the cost of sequencing and expertise required for bioinformatic analyses remains prohibitive, and these comprehensive analyses are limited to a few priority strains. Although several molecular typing methods are currently widely used for epidemiological analysis of campylobacters, it is not clear how accurately these methods reflect true strain relationships. To address this, we analyzed 104 publicly available whole genome sequences (WGS) of C. jejuni and C. coli. In addition to in silico determination of multi-locus sequence type (MLST), fla and porA type, as well as comparative genomic fingerprint (CGF), we inferred a reference phylogeny based on conserved core genome elements. Molecular typing data were compared to the reference phylogeny for concordance using the Adjusted Wallace Coefficient (AWC) with confidence intervals. Although MLST targets the sequence variability in core genes and CGF targets insertions/deletions of accessory genes, both methods are based on multilocus analysis and provided better estimates of true phylogeny than methods based on single loci (porA, fla). A more comprehensive WGS dataset including additional genetically related strains, both epidemiologically linked and unlinked, will be necessary to assess performance of methods for outbreak investigations and surveillance activities. Analyses of the strengths and weaknesses of widely used typing methodologies in inferring true strain relationships will provide guidance in the interpretation of this data for epidemiological purposes.
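
    The concordance metric used here is built on the Wallace coefficient: among isolate pairs that a typing method calls identical, what fraction are also grouped together by the reference phylogeny? A sketch of the unadjusted coefficient with made-up type assignments (the Adjusted Wallace additionally corrects for agreement expected by chance, which is omitted here):

```python
from itertools import combinations

def wallace(partition_a, partition_b):
    """Wallace coefficient W(A->B): of isolate pairs that share a type
    under method A, the fraction that also share a type under method B.
    The Adjusted Wallace further corrects for chance agreement."""
    same_a = same_both = 0
    for i, j in combinations(range(len(partition_a)), 2):
        if partition_a[i] == partition_a[j]:
            same_a += 1
            if partition_b[i] == partition_b[j]:
                same_both += 1
    return same_both / same_a if same_a else float('nan')

# Six hypothetical isolates: MLST types vs clades from a core-genome tree.
mlst = ["ST21", "ST21", "ST45", "ST45", "ST21", "ST50"]
clade = ["c1", "c1", "c2", "c3", "c1", "c2"]
print(wallace(mlst, clade))  # 0.75: one ST45 pair splits across clades
```

    Computing W(method → reference phylogeny) for MLST, CGF, porA and fla in this way is how multilocus methods can be shown to track the core-genome tree better than single-locus ones.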

  13. Cancer genomics

    DEFF Research Database (Denmark)

    Norrild, Bodil; Guldberg, Per; Ralfkiær, Elisabeth Methner

    2007-01-01

    Almost all cells in the human body contain a complete copy of the genome with an estimated number of 25,000 genes. The sequences of these genes make up about three percent of the genome and comprise the inherited set of genetic information. The genome also contains information that determines whe...

  14. Foreign Travel Trip Report for LLNL travel with DOE FES funding, May 19th-30th, 2012

    International Nuclear Information System (INIS)

    Joseph, I.

    2012-01-01

    I attended the 20th biennial International Conference on Plasma Surface Interaction (PSI) in Fusion Devices in Aachen, Germany, hosted this year by the Forschungszentrum Julich (FZJ) research center. The PSI conference is one of the main international forums for the presentation and discussion of results on plasma surface interactions and edge plasma physics relevant to magnetic confinement fusion devices. I disseminated the recent results of FESP/LLNL tokamak research by presenting three posters on: (i) understanding reconnection and controlling edge localized modes (ELMs) using the BOUT++ code, (ii) simulation of resistive ballooning mode turbulence, and (iii) innovative design of Snowflake divertors. I learned of many new and recent results from international tokamak facilities and had the opportunity for discussion of these topics with other scientists at the poster sessions, conference lunches/receptions, etc. Some of the major highlights of the PSI conference topics were: (1) Review of the progress in using metallic tungsten and beryllium (ITER-like) walls at international tokamak facilities: JET (Culham, UK), TEXTOR (FZJ, Germany) and Alcator CMOD (MIT, USA). Results included: effect of small and large-area melting on plasma impurity content and recovery, expected reduction in retention of hydrogenic species, increased heat load during disruptions and need for mitigation with massive gas injection. (2) A review of ELM control in general (T. Evans, GA) and recent results of ELM control using n=2 external magnetic perturbations on ASDEX-Upgrade (MPI-Garching, Germany). (3) General agreement among the international tokamak database that, along the outer midplane of a low collisionality tokamak, the SOL power width in current experiments varies inversely with respect to plasma current (Ip), roughly as 1/Ip, with little dependence on other plasma parameters. 
This would imply roughly a factor of 1/4 of the width that was assumed for the design of the ITER tokamak

  15. Foreign Travel Trip Report for LLNL travel with DOE FES funding,May 19th-30th, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, I

    2012-07-05

    I attended the 20th biennial International Conference on Plasma Surface Interaction (PSI) in Fusion Devices in Aachen, Germany, hosted this year by the Forschungszentrum Jülich (FZJ) research center. The PSI conference is one of the main international forums for the presentation and discussion of results on plasma surface interactions and edge plasma physics relevant to magnetic confinement fusion devices. I disseminated the recent results of FESP/LLNL tokamak research by presenting three posters on: (i) understanding reconnection and controlling edge localized modes (ELMs) using the BOUT++ code, (ii) simulation of resistive ballooning mode turbulence, and (iii) innovative design of Snowflake divertors. I learned of many new and recent results from international tokamak facilities and had the opportunity to discuss these topics with other scientists at the poster sessions, conference lunches/receptions, etc. Some of the major highlights of the PSI conference were: (1) Review of the progress in using metallic tungsten and beryllium (ITER-like) walls at international tokamak facilities: JET (Culham, UK), TEXTOR (FZJ, Germany), and Alcator C-Mod (MIT, USA). Results included the effect of small- and large-area melting on plasma impurity content and recovery, the expected reduction in retention of hydrogenic species, and the increased heat load during disruptions and the need for its mitigation with massive gas injection. (2) A review of ELM control in general (T. Evans, GA) and recent results of ELM control using n=2 external magnetic perturbations on ASDEX Upgrade (MPI-Garching, Germany). (3) General agreement across the international tokamak database that, along the outer midplane of a low-collisionality tokamak, the SOL power width in current experiments varies inversely with plasma current (Ip), roughly as 1/Ip, with little dependence on other plasma parameters. This would imply a width roughly one quarter of that assumed in the design of the ITER tokamak.
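The inverse-current scaling quoted above can be illustrated with simple arithmetic. The sketch below uses a made-up coefficient (not the actual multi-machine regression constant); only the ratio matters, showing why a width scaling as 1/Ip implies roughly a quarter of the previously assumed value at higher current:

```python
# Illustrative check of the lambda_q ~ 1/Ip scaling for the SOL power
# width. The coefficient c is hypothetical; the point is that
# quadrupling the plasma current quarters the predicted width.

def sol_power_width_mm(ip_ma, c=5.0):
    """Hypothetical SOL power width (mm) for plasma current Ip in MA."""
    return c / ip_ma

w_low = sol_power_width_mm(1.0)   # a present-day machine at 1 MA
w_high = sol_power_width_mm(4.0)  # a machine at 4x the current
print(w_high / w_low)  # 0.25: one quarter of the width
```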

  16. A re-sequencing based assessment of genomic heterogeneity and fast neutron-induced deletions in a common bean cultivar

    Directory of Open Access Journals (Sweden)

    Jamie A. O'Rourke

    2013-06-01

    A small fast neutron mutant population has been established from Phaseolus vulgaris cv. Red Hawk. We leveraged the available P. vulgaris genome sequence and high-throughput next-generation DNA sequencing to examine the genomic structure of five Phaseolus vulgaris cv. Red Hawk fast neutron mutants with striking visual phenotypes. Analysis of these genomes identified three classes of structural variation: between-cultivar variation, natural variation within the fast neutron mutant population, and fast neutron-induced mutagenesis. Our analyses focused on the latter two classes. We identified 23 large deletions (>40 bp) common to multiple individuals, illustrating residual heterogeneity and regions of structural variation within the common bean cv. Red Hawk. An additional 18 large deletions were identified in individual mutant plants. These deletions, ranging in size from 40 bp to 43,000 bp, are potentially the result of fast neutron mutagenesis. Six of the 18 deletions lie near or within gene coding regions, identifying potential candidate genes causing the mutant phenotype.
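Large deletions of the kind described above are commonly detected from resequencing data as runs of near-zero read depth. The sketch below is a minimal illustration of that idea, not the authors' actual pipeline; the depth array, thresholds, and window length are all assumed:

```python
# Sketch: call candidate deletions from per-base read depth by finding
# runs of near-zero coverage at least min_len bases long (here 40 bp,
# matching the paper's size cutoff). Real pipelines also use split
# reads and discordant pairs to refine breakpoints.

def call_deletions(depth, max_depth=2, min_len=40):
    """Return (start, end) intervals where depth stays <= max_depth."""
    calls, start = [], None
    for i, d in enumerate(depth):
        if d <= max_depth:
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_len:
                calls.append((start, i))
            start = None
    if start is not None and len(depth) - start >= min_len:
        calls.append((start, len(depth)))
    return calls

# Synthetic example: 30x coverage with a 60 bp dropout at position 100.
depth = [30] * 100 + [0] * 60 + [30] * 100
print(call_deletions(depth))  # [(100, 160)]
```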

  17. Normalized Tritium Quantification Approach (NoTQA) a Method for Quantifying Tritium Contaminated Trash and Debris at LLNL

    International Nuclear Information System (INIS)

    Dominick, J.L.; Rasmussen, C.L.

    2008-01-01

    Several facilities and many projects at LLNL work exclusively with tritium. These operations have the potential to generate large quantities of Low-Level Radioactive Waste (LLW) with the same or similar radiological characteristics. A standardized, documented approach to characterizing these waste materials for disposal as radioactive waste will enhance the ability of the Laboratory to manage them in an efficient and timely manner while ensuring compliance with all applicable regulatory requirements. This standardized characterization approach couples documented process knowledge with analytical verification and is very conservative, overestimating the radioactivity concentration of the waste. The characterization approach documented here is the Normalized Tritium Quantification Approach (NoTQA). This document will serve as a Technical Basis Document which can be referenced in radioactive waste characterization documentation packages such as the Information Gathering Document. In general, radiological characterization of waste consists of both developing an isotopic breakdown (distribution) of radionuclides contaminating the waste and using an appropriate method to quantify the radionuclides in the waste. Characterization approaches require varying degrees of rigor depending upon the radionuclides contaminating the waste and the concentration of the radionuclide contaminants as related to regulatory thresholds. Generally, as activity levels in the waste approach a regulatory or disposal facility threshold, the degree of required precision and accuracy, and therefore the level of rigor, increases. In the case of tritium, thresholds of concern for control, contamination, transportation, and waste acceptance are relatively high. Due to the benign nature of tritium and the resulting higher regulatory thresholds, this less rigorous yet conservative characterization approach is appropriate. The scope of this document is to define an appropriate and acceptable
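The excerpt does not give NoTQA's actual parameters, but a conservative, process-knowledge-plus-measurement bounding estimate of the kind described typically scales a removable-contamination survey result up to a total activity and divides by item mass. The sketch below is purely illustrative; every number and the fixed-to-removable multiplier are assumptions, not NoTQA values:

```python
# Hypothetical bounding calculation in the spirit of a conservative
# characterization approach: scale a measured removable-tritium smear
# result by an assumed fixed-to-removable ratio over an assumed area,
# then divide by item mass. All values are illustrative only.

SMEAR_DPM = 5.0e4           # removable activity, dpm per 100 cm^2 smear
FIXED_TO_REMOVABLE = 100.0  # conservative multiplier (assumed)
AREA_CM2 = 1.0e4            # contaminated surface area assumed, cm^2
DPM_PER_CI = 2.22e12        # disintegrations/min per curie
MASS_G = 2.0e4              # item mass in grams (20 kg)

total_dpm = SMEAR_DPM * (AREA_CM2 / 100.0) * FIXED_TO_REMOVABLE
curies = total_dpm / DPM_PER_CI
conc_ci_per_g = curies / MASS_G
print(f"{conc_ci_per_g:.2e} Ci/g")
```

The result would then be compared against the relevant waste-acceptance threshold to decide the required rigor of further characterization.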

  18. Experiment designs offered for discussion preliminary to an LLNL field scale validation experiment in the Yucca Mountain Exploratory Shaft Facility

    International Nuclear Information System (INIS)

    Lowry, B.; Keller, C.

    1988-01-01

    It has been proposed ('Progress Report on Experiment Rationale for Validation of LLNL Models of Ground Water Behavior Near Nuclear Waste Canisters,' Keller and Lowry, Dec. 7, 1988) that a heat-generating spent fuel canister emplaced in unsaturated tuff, in a ventilated hole, will cause a net flux of water into the borehole during the heating cycle of the spent fuel. Accompanying this mass flux will be the formation of mineral deposits near the borehole wall as the water evaporates and leaves behind its dissolved solids. The net effect of this process upon the containment of radioactive wastes is a function of (1) where and how much solid material is deposited in the tuff matrix and cracks, and (2) the resultant effect on the medium's flow characteristics. The experimental concepts described in this report are designed to quantify the magnitude and relative location of solid mineral deposit formation due to a heated and vented borehole environment. The simplest tests address matrix effects only; after the process is understood in the homogeneous matrix, fracture effects would be investigated. Three experiment concepts have been proposed. Each has unique advantages and allows investigation of specific aspects of the precipitate formation process. All could be done in reasonable time (less than a year) and none of them is extremely expensive (the most expensive is probably the structurally loaded block test). The calculational ability exists to analyze the 'real' situation and each of the experiment designs, and produce a credible series of tests. None of the designs requires the acquisition of material property data beyond current capabilities. The tests could be extended, if our understanding is consistent with the data produced, to analyze fracture effects. 7 figs

  19. Genus-Wide Assessment of Lignocellulose Utilization in the Extremely Thermophilic Genus Caldicellulosiruptor by Genomic, Pangenomic, and Metagenomic Analyses.

    Science.gov (United States)

    Lee, Laura L; Blumer-Schuette, Sara E; Izquierdo, Javier A; Zurawski, Jeffrey V; Loder, Andrew J; Conway, Jonathan M; Elkins, James G; Podar, Mircea; Clum, Alicia; Jones, Piet C; Piatek, Marek J; Weighill, Deborah A; Jacobson, Daniel A; Adams, Michael W W; Kelly, Robert M

    2018-05-01

    Metagenomic data from Obsidian Pool (Yellowstone National Park, USA) and 13 genome sequences were used to reassess genus-wide biodiversity for the extremely thermophilic Caldicellulosiruptor. The updated core genome contains 1,401 ortholog groups (average genome size for 13 species = 2,516 genes). The pangenome, which remains open with a revised total of 3,493 ortholog groups, encodes a variety of multidomain glycoside hydrolases (GHs). These include three cellulases with GH48 domains that are colocated in the glucan degradation locus (GDL) and are specific determinants for microcrystalline cellulose utilization. Three recently sequenced species, Caldicellulosiruptor sp. strain Rt8.B8 (renamed here Caldicellulosiruptor morganii), Thermoanaerobacter cellulolyticus strain NA10 (renamed here Caldicellulosiruptor naganoensis), and Caldicellulosiruptor sp. strain Wai35.B1 (renamed here Caldicellulosiruptor danielii), degraded Avicel and lignocellulose (switchgrass). C. morganii was more efficient than Caldicellulosiruptor bescii in this regard and differed from the other 12 species examined, both based on genome content and organization and in the specific domain features of conserved GHs. Metagenomic analysis of lignocellulose-enriched samples from Obsidian Pool revealed limited new information on genus biodiversity. Enrichments yielded genomic signatures closely related to that of Caldicellulosiruptor obsidiansis, but there was also evidence for other thermophilic fermentative anaerobes (Caldanaerobacter, Fervidobacterium, Caloramator, and Clostridium). One enrichment, containing 89.8% Caldicellulosiruptor and 9.7% Caloramator, had a capacity for switchgrass solubilization comparable to that of C. bescii. These results refine the known biodiversity of Caldicellulosiruptor and indicate that microcrystalline cellulose degradation at temperatures above 70°C, based on current information, is limited to certain members of this genus that produce GH48 domain
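The core-genome and pangenome counts quoted above follow directly from set operations over per-genome ortholog groups: the core is the intersection across all genomes, the pangenome the union. A minimal sketch with toy ortholog sets (not the actual 13-genome Caldicellulosiruptor tables):

```python
# Core genome = ortholog groups present in every genome;
# pangenome = union of ortholog groups across all genomes.
# Genome contents below are invented for illustration.

genomes = {
    "sp_A": {"og1", "og2", "og3", "og5"},
    "sp_B": {"og1", "og2", "og4", "og5"},
    "sp_C": {"og1", "og2", "og5", "og6"},
}

core = set.intersection(*genomes.values())
pan = set.union(*genomes.values())
print(sorted(core))  # ['og1', 'og2', 'og5']
print(len(pan))      # 6
```

A pangenome is "open" when adding further genomes keeps growing this union.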

  20. Bacterial whole genome-based phylogeny: construction of a new benchmarking dataset and assessment of some existing methods.

    Science.gov (United States)

    Ahrenfeldt, Johanne; Skaarup, Carina; Hasman, Henrik; Pedersen, Anders Gorm; Aarestrup, Frank Møller; Lund, Ole

    2017-01-05

    Whole genome sequencing (WGS) is increasingly used in diagnostics and surveillance of infectious diseases. A major application for WGS is to use the data for identifying outbreak clusters, and there is therefore a need for methods that can accurately and efficiently infer phylogenies from sequencing reads. In the present study we describe a new dataset that we have created for the purpose of benchmarking such WGS-based methods for epidemiological data, and also present an analysis where we use the data to compare the performance of some current methods. Our aim was to create a benchmark data set that mimics sequencing data of the sort that might be collected during an outbreak of an infectious disease. This was achieved by letting an E. coli hypermutator strain grow in the lab for 8 consecutive days, each day splitting the culture in two while also collecting samples for sequencing. The result is a data set consisting of 101 whole genome sequences with known phylogenetic relationships. Among the sequenced samples, 51 correspond to internal nodes in the phylogeny because they are ancestral, while the remaining 50 correspond to leaves. We also used the newly created data set to compare three different online available methods that infer phylogenies from whole-genome sequencing reads: NDtree, CSI Phylogeny and REALPHY. One complication when comparing the output of these methods with the known phylogeny is that phylogenetic methods typically build trees where all observed sequences are placed as leaves, even though some of them are in fact ancestral. We therefore devised a method for post-processing the inferred trees by collapsing short branches (thus relocating some leaves to internal nodes), and also present two new measures of tree similarity that take into account the identity of both internal and leaf nodes. Based on this analysis we find that, among the investigated methods, CSI Phylogeny had the best performance, correctly identifying 73% of all branches in the
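The post-processing idea described above can be sketched simply: a sampled sequence attached to the tree by an effectively zero-length branch is really its own ancestor, so collapsing such branches relocates the leaf onto the internal node. The toy representation below (a child-to-parent map with branch lengths) is an illustration, not the authors' implementation:

```python
# Collapse branches shorter than a threshold: each node is mapped to
# the ancestor it merges into (or to itself if its branch is "real").
# Tree encoded as {child: (parent, branch_length)}; toy data.

def collapse_short_branches(tree, threshold=1e-6):
    """Map each node to the ancestor it collapses into (or itself)."""
    placement = {}
    for node in tree:
        anc = node
        # walk upward while the connecting branch is effectively zero
        while anc in tree and tree[anc][1] < threshold:
            anc = tree[anc][0]
        placement[node] = anc
    return placement

tree = {
    "s1": ("n1", 0.0),    # sampled ancestor: zero-length branch
    "s2": ("n1", 3e-4),   # genuine leaf
    "n1": ("root", 2e-4),
}
print(collapse_short_branches(tree))
# {'s1': 'n1', 's2': 's2', 'n1': 'n1'}
```

After collapsing, node identities (internal as well as leaf) can be compared directly against the known phylogeny.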

  1. Environmental assessment for construction and operation of a Human Genome Laboratory at Lawrence Berkeley Laboratory, Berkeley, California

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-01

    Lawrence Berkeley Laboratory (LBL) proposes to construct and operate a new laboratory for consolidation of current and future activities of the Human Genome Center (HGC). This document addresses the potential direct, indirect, and cumulative environmental and human-health effects of the proposed facility construction and operation. This document was prepared in accordance with the National Environmental Policy Act of 1969 (United States Code, 42 USC 4321-4347) (NEPA) and the US Department of Energy's (DOE) Final Rule for NEPA Implementing Procedures (Code of Federal Regulations, 10 CFR 1021).

  2. Meeting Report. Assessing Human Germ-Cell Mutagenesis in the Post-Genome Era: A Celebration of the Legacy of William Lawson (Bill) Russell

    Energy Technology Data Exchange (ETDEWEB)

    Wyrobek, Andrew J.; Mulvihill, John J.; Wassom, John S.; Malling, Heinrich V.; Shelby, Michael D.; Lewis, Susan E.; Witt, Kristine L.; Preston, R. Julian; Perreault-Darney, Sally; Allen, James W.; DeMarini, David M.; Woychik, Richard P.; Bishop, Jack B.; Workshop Presenters

    2006-04-18

    Although numerous germ-cell mutagens have been identified in animal model systems, to date, no human germ-cell mutagens have been confirmed. Because the genomic integrity of our germ cells is essential for the continuation of the human species, a resolution of this enduring conundrum is needed. To facilitate such a resolution, we organized a workshop at The Jackson Laboratory in Bar Harbor, Maine on September 28-30, 2004. This interactive workshop brought together scientists from a wide range of disciplines to assess the applicability of emerging molecular methods for genomic analysis to the field of human germ-cell mutagenesis. Participants recommended that focused, coordinated human germ-cell mutation studies be conducted in relation to important societal exposures. Because cancer survivors represent a unique cohort with well-defined exposures, there was a consensus that studies should be designed to assess the mutational impact on children born to parents who had received certain types of mutagenic cancer chemotherapy prior to conceiving their children. Within this high-risk cohort, parents and children could be evaluated for inherited changes in (a) gene sequences and chromosomal structure, (b) repeat sequences and minisatellite regions, and (c) global gene expression and chromatin. Participants also recommended studies to examine trans-generational effects in humans involving mechanisms such as changes in imprinting and methylation patterns, expansion of nucleotide repeats, or induction of mitochondrial DNA mutations. Workshop participants advocated establishment of a bio-bank of human tissue samples that could be used to conduct a multiple-endpoint, comprehensive, and collaborative effort to detect exposure-induced heritable alterations in the human genome. Appropriate animal models of human germ-cell mutagenesis should be used in parallel with human studies to provide insights into the mechanisms of mammalian germ-cell mutagenesis. Finally, participants recommended that

  3. Magnesium, Iron and Aluminum in LLNL Air Particulate and Rain Samples with Reference to Magnesium in Industrial Storm Water

    Energy Technology Data Exchange (ETDEWEB)

    Esser, Bradley K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bibby, Richard K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fish, Craig [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-25

    Storm water runoff from the Lawrence Livermore National Laboratory's (LLNL's) main site and Site 300 periodically exceeds the Discharge Permit Numeric Action Level (NAL) for Magnesium (Mg) under the Industrial General Permit (IGP) Order No. 2014-0057-DWQ. Of particular interest is the source of magnesium in storm water runoff from the site. This special study compares new metals data from air particulate and precipitation samples from the LLNL main site and Site 300 to previous metals data for storm water from the main site and Site 300, and alluvial sediment from the main site, to investigate the potential source of elevated Mg in storm water runoff. Data for three metals (Mg, Iron (Fe), and Aluminum (Al)) were available from all media; data for additional metals, such as Europium (Eu), were available from rain, air particulates, and alluvial sediment. To attribute source, this study compared metals concentration data (for Mg, Al, and Fe) in storm water and rain; metal-metal correlations (Mg with Fe, Mg with Al, Al with Fe, Mg with Eu, Eu with Fe, and Eu with Al) in storm water, rain, air particulates, and sediments; and metal-metal ratios (Mg/Fe, Mg/Al, Al/Fe, Mg/Eu, Eu/Fe, and Eu/Al) in storm water, rain, air particulates, and sediments. The results presented in this study are consistent with a simple conceptual model in which the source of Mg in storm water runoff is air particulate matter that has dry-deposited on impervious surfaces and is subsequently entrained in runoff during precipitation events. Such a conceptual model is consistent with (1) higher concentrations of metals in storm water runoff than in precipitation, (2) the strong correlation of Mg with Aluminum (Al) and Iron (Fe) in both storm water and air particulates, and (3) the similarity in metal mass ratios between storm water and air particulates, in contrast to the dissimilarity of metal mass ratios between storm water and precipitation or alluvial sediment. The strong correlation of Mg with Fe and Al
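The ratio-matching logic behind this attribution can be sketched numerically: compute element mass ratios in the storm water and in each candidate source, and see which source's ratios lie closest. All concentrations below are invented for illustration, not the report's data:

```python
# Illustrative source attribution by metal mass ratios: the candidate
# source whose Mg/Al and Mg/Fe ratios best match storm water is the
# most plausible origin of the Mg. Numbers are hypothetical.

samples = {
    "storm_water":     {"Mg": 12.0, "Al": 12.0, "Fe": 17.0},
    "air_particulate": {"Mg": 1.0,  "Al": 1.0,  "Fe": 1.4},
    "rain":            {"Mg": 0.05, "Al": 0.01, "Fe": 0.01},
}

def ratios(m):
    return (m["Mg"] / m["Al"], m["Mg"] / m["Fe"])

target = ratios(samples["storm_water"])
for name in ("air_particulate", "rain"):
    r = ratios(samples[name])
    dist = sum(abs(a - b) for a, b in zip(target, r))
    print(name, round(dist, 3))  # smaller = closer ratio match
```

With these toy numbers the air-particulate ratios match storm water far better than rain does, mirroring the report's conclusion.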

  4. Genome-wide assessment of the carriers involved in the cellular uptake of drugs: a model system in yeast.

    Science.gov (United States)

    Lanthaler, Karin; Bilsland, Elizabeth; Dobson, Paul D; Moss, Harry J; Pir, Pınar; Kell, Douglas B; Oliver, Stephen G

    2011-10-24

    The uptake of drugs into cells has traditionally been considered to be predominantly via passive diffusion through the bilayer portion of the cell membrane. The recent recognition that drug uptake is mostly carrier-mediated raises the question of which drugs use which carriers. To answer this, we have constructed a chemical genomics platform built upon the yeast gene deletion collection, using competition experiments in batch fermenters and robotic automation of cytotoxicity screens, including protection by 'natural' substrates. Using these, we tested 26 different drugs and identified the carriers required for 18 of the drugs to gain entry into yeast cells. As well as providing a useful platform technology, these results further substantiate the notion that the cellular uptake of pharmaceutical drugs normally occurs via carrier-mediated transport and indicate that establishing the identity and tissue distribution of such carriers should be a major consideration in the design of safe and effective drugs.

  5. Evaluation of dynamic range for LLNL streak cameras using high contrast pulses and 'pulse podiatry' on the Nova laser system

    Energy Technology Data Exchange (ETDEWEB)

    Richards, J.B.; Weiland, T.L.; Prior, J.A.

    1990-07-01

    A standard LLNL streak camera has been used to analyze high contrast pulses on the Nova laser facility. These pulses have a plateau at their leading edge (foot) with an amplitude which is approximately 1% of the maximum pulse height. Relying on other features of the pulses and on signal multiplexing, we were able to determine how accurately the foot amplitude was being represented by the camera. Results indicate that the useful single channel dynamic range of the instrument approaches 100:1. 1 ref., 4 figs., 1 tab.
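The connection between the 1% foot plateau and the quoted 100:1 figure is simple arithmetic: resolving a feature at 1% of peak height requires a usable dynamic range of at least peak/foot. A one-line check:

```python
# The arithmetic behind the ~100:1 figure: a foot plateau at ~1% of
# the maximum pulse height is resolvable only if the camera's usable
# dynamic range reaches peak/foot = 1/0.01 = 100.

foot_fraction = 0.01  # foot amplitude relative to peak
required_range = 1.0 / foot_fraction
print(f"{required_range:.0f}:1")  # 100:1
```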

  6. Functional Toxicogenomic Assessment of Triclosan in Human HepG2 Cells Using Genome-Wide CRISPR-Cas9 Screening.

    Science.gov (United States)

    Xia, Pu; Zhang, Xiaowei; Xie, Yuwei; Guan, Miao; Villeneuve, Daniel L; Yu, Hongxia

    2016-10-04

    There are thousands of chemicals used by humans and detected in the environment for which limited or no toxicological data are available. Rapid and cost-effective approaches for assessing the toxicological properties of chemicals are needed. We used CRISPR-Cas9 functional genomic screening to identify the potential molecular mechanism of the widely used antimicrobial triclosan (TCS) in HepG2 cells. Resistant genes at the IC50 (the concentration causing a 50% reduction in cell viability) were significantly enriched in the adherens junction pathway, MAPK signaling pathway, and PPAR signaling pathway, suggesting a potential role in the molecular mechanism of TCS-induced cytotoxicity. Evaluation of the top-ranked resistant genes, FTO (encoding an mRNA demethylase) and MAP2K3 (a MAP kinase kinase family gene), revealed that their loss conferred resistance to TCS. In contrast, sensitive genes at IC10 and IC20 were specifically enriched in pathways involved in immune responses, which was concordant with transcriptomic profiling of TCS at low concentrations. The CRISPR-Cas9 fingerprint may reveal the patterns of TCS toxicity at low concentration levels. Moreover, we retrieved the potential connection between the CRISPR-Cas9 fingerprint and the disease terms obesity and breast cancer from an existing chemical-gene-disease database. Overall, CRISPR-Cas9 functional genomic screening offers an alternative approach for chemical toxicity testing.
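The pathway-enrichment step reported above is typically a hypergeometric over-representation test: given the screen's hit list, is a pathway's gene set over-represented among the hits? The sketch below uses stdlib `math.comb` and invented counts; it illustrates the statistic, not the authors' exact software:

```python
# Hypergeometric over-representation test for pathway enrichment.
# N genes screened, n resistant hits, a pathway of K genes of which
# k are hits; P(X >= k) is the enrichment p-value. Toy counts.

from math import comb

def hypergeom_sf(k, N, K, n):
    """P(X >= k) when drawing n genes from N containing K pathway genes."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# 20,000 genes screened, 200 resistant hits, 50-gene pathway with
# 8 hits (expected overlap by chance is only 200*50/20000 = 0.5).
p = hypergeom_sf(8, 20000, 50, 200)
print(f"p = {p:.2e}")
```

A small p-value here would flag the pathway as enriched among resistant genes; in practice one would also correct for testing many pathways.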

  7. Lawrence Livermore National Laboratory (LLNL) Experimental Test Site (Site 300) Salinity Evaluation and Minimization Plan for Cooling Towers and Mechanical Equipment Discharges

    Energy Technology Data Exchange (ETDEWEB)

    Daily III, W D

    2010-02-24

    This document was created to comply with the Central Valley Regional Water Quality Control Board (CVRWQCB) Waste Discharge Requirement (Order No. 98-148). This order established new requirements to assess the effect of and effort required to reduce salts in process water discharged to the subsurface. This includes the review of technical, operational, and management options available to reduce total dissolved solids (TDS) concentrations in cooling tower and mechanical equipment water discharges at Lawrence Livermore National Laboratory's (LLNL's) Experimental Test Site (Site 300) facility. It was observed that for the six cooling towers currently in operation, the total volume of groundwater used as make-up water is about 27 gallons per minute and the discharge to the subsurface via percolation pits is 13 gallons per minute. The extracted groundwater has a TDS concentration of 700 mg/L. The cooling tower discharge concentrations range from 700 to 1,400 mg/L. There is also a small volume of mechanical equipment effluent being discharged to percolation pits, with a TDS range from 400 to 3,300 mg/L. The cooling towers and mechanical equipment are maintained and operated in a satisfactory manner. No major leaks were identified. Currently, there are no re-use options being employed. Several approaches known to reduce the blowdown flow rate and/or TDS concentration being discharged to the percolation pits and septic systems were reviewed for technical feasibility and cost efficiency. These options range from efforts as simple as eliminating leaks to implementing advanced and innovative treatment methods. The various options considered, and their anticipated effect on water consumption, discharge volumes, and reduced concentrations are listed and compared in this report. Based on the assessment, it was recommended that there is enough variability in equipment usage, chemistry, flow rate, and discharge configurations that each discharge location at Site 300
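The reported flows and concentrations are mutually consistent under a simple steady-state mass balance: dissolved solids enter with the make-up water and leave only in the blowdown, so the discharge TDS concentrates by the ratio of make-up to blowdown flow. A quick check using the report's own numbers:

```python
# Steady-state cooling-tower mass balance: 27 gpm make-up at 700 mg/L
# TDS, 13 gpm blowdown to percolation pits. Evaporation removes water
# but not salts, so blowdown TDS = make-up TDS x (make-up/blowdown).

makeup_gpm, blowdown_gpm = 27.0, 13.0
makeup_tds = 700.0  # mg/L

cycles = makeup_gpm / blowdown_gpm          # cycles of concentration
blowdown_tds = makeup_tds * cycles          # predicted discharge TDS
evaporation_gpm = makeup_gpm - blowdown_gpm
print(round(cycles, 2), round(blowdown_tds), evaporation_gpm)
# ~2.08 cycles -> ~1454 mg/L, matching the observed 700-1,400 mg/L range
```

This also shows why reducing blowdown flow alone raises the discharge concentration: fewer gallons must carry the same salt load.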

  8. Lawrence Livermore National Laboratory (LLNL) Experimental Test Site (Site 300) Salinity Evaluation and Minimization Plan for Cooling Towers and Mechanical Equipment Discharges

    International Nuclear Information System (INIS)

    Daily, W.D. III

    2010-01-01

    This document was created to comply with the Central Valley Regional Water Quality Control Board (CVRWQCB) Waste Discharge Requirement (Order No. 98-148). This order established new requirements to assess the effect of and effort required to reduce salts in process water discharged to the subsurface. This includes the review of technical, operational, and management options available to reduce total dissolved solids (TDS) concentrations in cooling tower and mechanical equipment water discharges at Lawrence Livermore National Laboratory's (LLNL's) Experimental Test Site (Site 300) facility. It was observed that for the six cooling towers currently in operation, the total volume of groundwater used as make-up water is about 27 gallons per minute and the discharge to the subsurface via percolation pits is 13 gallons per minute. The extracted groundwater has a TDS concentration of 700 mg/L. The cooling tower discharge concentrations range from 700 to 1,400 mg/L. There is also a small volume of mechanical equipment effluent being discharged to percolation pits, with a TDS range from 400 to 3,300 mg/L. The cooling towers and mechanical equipment are maintained and operated in a satisfactory manner. No major leaks were identified. Currently, there are no re-use options being employed. Several approaches known to reduce the blowdown flow rate and/or TDS concentration being discharged to the percolation pits and septic systems were reviewed for technical feasibility and cost efficiency. These options range from efforts as simple as eliminating leaks to implementing advanced and innovative treatment methods. The various options considered, and their anticipated effect on water consumption, discharge volumes, and reduced concentrations are listed and compared in this report. Based on the assessment, it was recommended that there is enough variability in equipment usage, chemistry, flow rate, and discharge configurations that each discharge location at Site 300 should be

  9. Genome Imprinting

    Indian Academy of Sciences (India)

    the cell nucleus (mitochondrial and chloroplast genomes), and (3) traits governed ... tively good embryonic development but very poor development of membranes and ... Human homologies for the type of situation described above are naturally ... imprint; (b) New modifications of the paternal genome in germ cells of each ...

  10. Baculovirus Genomics

    NARCIS (Netherlands)

    Oers, van M.M.; Vlak, J.M.

    2007-01-01

    Baculovirus genomes are covalently closed circles of double-stranded DNA varying in size between 80 and 180 kilobase pairs. The genomes of more than forty-one baculoviruses have been sequenced to date. The majority of these (37) are pathogenic to lepidopteran hosts; three infect sawflies

  11. Genomic Testing

    Science.gov (United States)

    ... this database. Evaluation of Genomic Applications in Practice and Prevention (EGAPP™) In 2004, the Centers for Disease Control and Prevention launched the EGAPP initiative to establish and test a ... and other applications of genomic technology that are in transition from ...

  12. Ancient genomes

    OpenAIRE

    Hoelzel, A Rus

    2005-01-01

    Ever since its invention, the polymerase chain reaction has been the method of choice for work with ancient DNA. In an application of modern genomic methods to material from the Pleistocene, a recent study has instead undertaken to clone and sequence a portion of the ancient genome of the cave bear.

  13. A genome-wide assessment of stages of elevational parapatry in Bornean passerine birds reveals no introgression: implications for processes and patterns of speciation

    Directory of Open Access Journals (Sweden)

    Robert G. Moyle

    2017-05-01

    Topographically complex regions often contain the close juxtaposition of closely related species along elevational gradients. The evolutionary causes of these elevational replacements, and thus the origin and maintenance of a large portion of species diversity along elevational gradients, are usually unclear because ecological differentiation along a gradient or secondary contact following allopatric diversification can produce the same pattern. We used reduced-representation genomic sequencing to assess genetic relationships and gene flow between three parapatric pairs of closely related songbird taxa (Arachnothera spiderhunters, Chloropsis leafbirds, and Enicurus forktails) along an elevational gradient in Borneo. Each taxon pair presents a different elevational range distribution across the island, yet results were uniform: little or no gene flow was detected in any pairwise comparisons. These results are congruent with an allopatric “species-pump” model for generation of species diversity and elevational parapatry of congeners on Borneo, rather than in situ generation of species by “ecological speciation” along an elevational gradient.

  14. Genomic research perspectives in Kazakhstan

    Directory of Open Access Journals (Sweden)

    Ainur Akilzhanova

    2014-01-01

    Introduction: Technological advancements rapidly propel the field of genome research. Advances in genetics and genomics, such as the sequence of the human genome, the human haplotype map, open-access databases, cheaper genotyping, and chemical genomics, have transformed basic and translational biomedical research. Several projects in the field of genomic and personalized medicine have been conducted at the Center for Life Sciences at Nazarbayev University. The prioritized areas of research include: genomics of multifactorial diseases, cancer genomics, bioinformatics, genetics of infectious diseases, and population genomics. At present, DNA-based risk assessment for common complex diseases, application of molecular signatures for cancer diagnosis and prognosis, genome-guided therapy, and dose selection of therapeutic drugs are the important issues in personalized medicine. Results: To further develop genomic and biomedical projects at the Center for Life Sciences, the development of bioinformatics research and infrastructure and the establishment of new collaborations in the field are essential. Widespread use of genetic tools will allow the identification of diseases before the onset of clinical symptoms and the individualization of drug treatment, and could induce individual behavioral changes on the basis of calculated disease risk. However, many challenges remain for the successful translation of genomic knowledge and technologies into health advances, such as medicines and diagnostics. It is important to integrate research and education in the fields of genomics, personalized medicine, and bioinformatics, which will be possible with the opening of the new Medical Faculty at Nazarbayev University. People in practice and training need to be educated about the key concepts of genomics and engaged so they can effectively apply their knowledge in a manner that will bring the era of genomic medicine to patient care. This requires the development of well

  15. Development and Use of Integrated Microarray-Based Genomic Technologies for Assessing Microbial Community Composition and Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    J. Zhou; S.-K. Rhee; C. Schadt; T. Gentry; Z. He; X. Li; X. Liu; J. Liebich; S.C. Chong; L. Wu

    2004-03-17

    To effectively monitor microbial populations involved in various important processes, a 50-mer-based oligonucleotide microarray was developed based on known genes and pathways involved in: biodegradation, metal resistance and reduction, denitrification, nitrification, nitrogen fixation, methane oxidation, methanogenesis, carbon polymer decomposition, and sulfate reduction. This array contains approximately 2000 unique and group-specific probes with <85% similarity to their non-target sequences. Based on artificial probes, our results showed that at hybridization conditions of 50 °C and 50% formamide, the 50-mer microarray hybridization can differentiate sequences having <88% similarity. Specificity tests with representative pure cultures indicated that the designed probes on the arrays appeared to be specific to their corresponding target genes. Detection limits were about 5-10 ng genomic DNA in the absence of background DNA, and 50-100 ng (≈1.3 × 10^7 cells) in the presence of background DNA. Strong linear relationships between signal intensity and target DNA and RNA concentration were observed (r^2 = 0.95-0.99). Application of this microarray to naphthalene-amended enrichments and soil microcosms demonstrated that the composition of the microflora varied depending on incubation conditions. While the naphthalene-degrading genes from Rhodococcus-type microorganisms were dominant in enrichments, the genes involved in naphthalene degradation from Gram-negative microorganisms such as Ralstonia, Comamonas, and Burkholderia were most abundant in the soil microcosms (as well as those for polyaromatic hydrocarbon and nitrotoluene degradation). Although naphthalene degradation is widely known and studied in Pseudomonas, Pseudomonas genes were not detected in either system. Real-time PCR analysis of 4 representative genes was consistent with microarray-based quantification (r^2 = 0.95). Currently, we are also applying this microarray to the study of several
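
The linearity claim above (signal intensity versus target concentration, r^2 = 0.95-0.99) amounts to fitting a least-squares line and reporting its coefficient of determination. A minimal sketch; the dilution-series numbers are hypothetical placeholders, not data from the study:

```python
# Sketch: coefficient of determination (r^2) of a simple least-squares line,
# as used above to summarize signal-vs-concentration linearity.
# All data values are hypothetical, not from the study.

def r_squared(x, y):
    """r^2 of the least-squares line y ~ a + b*x (equals Pearson r squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy ** 2 / (sxx * syy)

# Hypothetical dilution series: target DNA (ng) vs. hybridization signal
dna_ng = [5, 10, 25, 50, 100]
signal = [210, 400, 1050, 1980, 4100]
print(r_squared(dna_ng, signal))
```

A nearly proportional series like this one yields r^2 close to 1, in the range the abstract reports.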

  16. Imputation across genotyping arrays for genome-wide association studies: assessment of bias and a correction strategy.

    Science.gov (United States)

    Johnson, Eric O; Hancock, Dana B; Levy, Joshua L; Gaddis, Nathan C; Saccone, Nancy L; Bierut, Laura J; Page, Grier P

    2013-05-01

    A great promise of publicly sharing genome-wide association data is the potential to create composite sets of controls. However, studies often use different genotyping arrays, and imputation to a common set of SNPs has shown substantial bias: a problem which has no broadly applicable solution. Based on the idea that using differing genotyped SNP sets as inputs creates differential imputation errors and thus bias in the composite set of controls, we examined the degree to which each of the following occurs: (1) imputation based on the union of genotyped SNPs (i.e., SNPs available on one or more arrays) results in bias, as evidenced by spurious associations (type 1 error) between imputed genotypes and arbitrarily assigned case/control status; (2) imputation based on the intersection of genotyped SNPs (i.e., SNPs available on all arrays) does not evidence such bias; and (3) imputation quality varies by the size of the intersection of genotyped SNP sets. Imputations were conducted in European Americans and African Americans with reference to HapMap phase II and III data. Imputation based on the union of genotyped SNPs across the Illumina 1M and 550v3 arrays showed spurious associations for 0.2 % of SNPs: ~2,000 false positives per million SNPs imputed. Biases remained problematic for very similar arrays (550v1 vs. 550v3) and were substantial for dissimilar arrays (Illumina 1M vs. Affymetrix 6.0). In all instances, imputing based on the intersection of genotyped SNPs (as few as 30 % of the total SNPs genotyped) eliminated such bias while still achieving good imputation quality.
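
The study's central design contrast is between imputing from the union of SNPs genotyped on any array and imputing from the intersection genotyped on all arrays. A small sketch in set terms; the array names and rs IDs are hypothetical placeholders:

```python
# Sketch of the union-vs-intersection contrast examined above.
# Array contents are hypothetical placeholders, not real array manifests.

array_snps = {
    "array_1M":  {"rs1", "rs2", "rs3", "rs4", "rs5"},
    "array_550": {"rs1", "rs2", "rs3"},
}

union = set().union(*array_snps.values())              # genotyped on >= 1 array
intersection = set.intersection(*array_snps.values())  # genotyped on all arrays

# Per the abstract: imputing from `union` feeds differing genotyped inputs
# into the composite control set and produced spurious associations, while
# restricting the input to `intersection` eliminated that bias.
print(sorted(union), sorted(intersection))
```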

  17. Effects of temperature on gene expression patterns in Leptospira interrogans serovar Lai as assessed by whole-genome microarrays.

    Science.gov (United States)

    Lo, Miranda; Bulach, Dieter M; Powell, David R; Haake, David A; Matsunaga, James; Paustian, Michael L; Zuerner, Richard L; Adler, Ben

    2006-10-01

    Leptospirosis is an important zoonosis of worldwide distribution. Humans become infected via exposure to pathogenic Leptospira spp. from infected animals or contaminated water or soil. The availability of genome sequences for Leptospira interrogans, serovars Lai and Copenhageni, has opened up opportunities to examine global transcription profiles using microarray technology. Temperature is a key environmental factor known to affect leptospiral protein expression. Leptospira spp. can grow in artificial media at a range of temperatures reflecting conditions found in the environment and the mammalian host. Therefore, transcriptional changes were compared between cultures grown at 20 degrees C, 30 degrees C, 37 degrees C, and 39 degrees C to represent ambient temperatures in the environment, growth under laboratory conditions, and temperatures in healthy and febrile hosts. Data from direct pairwise comparisons of the four temperatures were consolidated to examine transcriptional changes at two generalized biological conditions representing mammalian physiological temperatures (37 degrees C and 39 degrees C) versus environmental temperatures (20 degrees C and 30 degrees C). Additionally, cultures grown at 30 degrees C then shifted overnight to 37 degrees C were compared with those grown long-term at 30 degrees C and 37 degrees C to identify genes potentially expressed in the early stages of infection. Comparison of data sets from physiological versus environmental experiments with upshift experiments provided novel insights into possible transcriptional changes at different stages of infection. Changes included differential expression of chemotaxis and motility genes, signal transduction systems, and genes encoding proteins involved in alteration of the outer membrane. These findings indicate that temperature is an important factor regulating expression of proteins that facilitate invasion and establishment of disease.
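
The grouped comparison described above (mammalian 37 °C/39 °C versus environmental 20 °C/30 °C) is conventionally summarized per gene as a log2 fold change between the two condition averages. A minimal sketch; the expression values are hypothetical and the gene name is only a placeholder:

```python
# Sketch: per-gene log2 fold change between mammalian-temperature and
# environmental-temperature condition groups, as in the comparison above.
# Expression values are hypothetical, not from the study.
import math

def log2_fold_change(expr, gene):
    """log2 of mean expression at 37/39 C over mean expression at 20/30 C."""
    mammalian = (expr[gene][37] + expr[gene][39]) / 2
    environmental = (expr[gene][20] + expr[gene][30]) / 2
    return math.log2(mammalian / environmental)

expr = {"geneX": {20: 800.0, 30: 900.0, 37: 400.0, 39: 350.0}}
print(log2_fold_change(expr, "geneX"))  # negative: down-regulated at host temperature
```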

  18. Coupling of Realistic Rate Estimates with Genomics for Assessing Contaminant Attenuation and Long-Term Plume Containment

    Energy Technology Data Exchange (ETDEWEB)

    Colwell, F. S.; Crawford, R. L.; Sorenson, K.

    2005-06-01

    Dissolved dense nonaqueous-phase liquid plumes are persistent, widespread problems in the DOE complex. At the Idaho National Engineering and Environmental Laboratory, dissolved trichloroethylene (TCE) is disappearing from the Snake River Plain aquifer (SRPA) by natural attenuation, a finding that saves significant site restoration costs. Acceptance of monitored natural attenuation as a preferred treatment technology requires direct evidence of the processes and rates of the degradation. Our proposal aims to provide that evidence for one such site by testing two hypotheses. First, we believe that realistic values for in situ rates of TCE cometabolism can be obtained by sustaining the putative microorganisms at the low catabolic activities consistent with aquifer conditions. Second, the patterns of functional gene expression evident in these communities under starvation conditions while carrying out TCE cometabolism can be used to diagnose the cometabolic activity in the aquifer itself. Using the cometabolism rate parameters derived in low-growth bioreactors, we will complete the models that predict the time until background levels of TCE are attained at this location and validate the long-term stewardship of this plume. Realistic terms for cometabolism of TCE will provide marked improvements in DOE's ability to predict and monitor natural attenuation of chlorinated organics at other sites, increase the acceptability of this solution, and provide significant economic and health benefits through this noninvasive remediation strategy. Finally, this project aims to derive valuable genomic information about the functional attributes of subsurface microbial communities upon which DOE must depend to resolve some of its most difficult contamination issues.

  19. Herbarium genomics

    DEFF Research Database (Denmark)

    Bakker, Freek T.; Lei, Di; Yu, Jiaying

    2016-01-01

    Herbarium genomics is proving promising as next-generation sequencing approaches are well suited to deal with the usually fragmented nature of archival DNA. We show that routine assembly of partial plastome sequences from herbarium specimens is feasible, from total DNA extracts and with specimens...... up to 146 years old. We use genome skimming and an automated assembly pipeline, Iterative Organelle Genome Assembly, that assembles paired-end reads into a series of candidate assemblies, the best one of which is selected based on likelihood estimation. We used 93 specimens from 12 different...... correlation between plastome coverage and nuclear genome size (C value) in our samples, but the range of C values included is limited. Finally, we conclude that routine plastome sequencing from herbarium specimens is feasible and cost-effective (compared with Sanger sequencing or plastome...

  20. Assessment of heterogeneity between European Populations: a Baltic and Danish replication case-control study of SNPs from a recent European ulcerative colitis genome wide association study.

    Science.gov (United States)

    Andersen, Vibeke; Ernst, Anja; Sventoraityte, Jurgita; Kupcinskas, Limas; Jacobsen, Bent A; Krarup, Henrik B; Vogel, Ulla; Jonaitis, Laimas; Denapiene, Goda; Kiudelis, Gediminas; Balschun, Tobias; Franke, Andre

    2011-10-13

    Differences in the genetic architecture of inflammatory bowel disease between different European countries and ethnicities have previously been reported. In the present study, we wanted to assess the role of 11 newly identified UC risk variants, derived from a recent European UC genome wide association study (GWAS) (Franke et al., 2010), for 1) association with UC in the Nordic countries, 2) for population heterogeneity between the Nordic countries and the rest of Europe, and, 3) eventually, to drive some of the previous findings towards overall genome-wide significance. Eleven SNPs were replicated in a Danish sample consisting of 560 UC patients and 796 controls and nine missing SNPs of the German GWAS study were successfully genotyped in the Baltic sample comprising 441 UC cases and 1156 controls. The independent replication data was then jointly analysed with the original data and systematic comparisons of the findings between ethnicities were made. Pearson's χ2, Breslow-Day (BD) and Cochran-Mantel-Haenszel (CMH) tests were used for association analyses and heterogeneity testing. The rs5771069 (IL17REL) SNP was not associated with UC in the Danish panel. The rs5771069 (IL17REL) SNP was significantly associated with UC in the combined Baltic, Danish and Norwegian UC study sample, driven by the Norwegian panel (OR = 0.89, 95% CI: 0.79-0.98, P = 0.02). No association was found between rs7809799 (SMURF1/KPNA7) and UC (OR = 1.20, 95% CI: 0.95-1.52, P = 0.10) or between UC and all other remaining SNPs. We had a 94% chance of detecting an association for rs7809799 (SMURF1/KPNA7) in the combined replication sample, whereas the power was 55% or lower for the remaining SNPs. Statistically significant P_BD was found for OR heterogeneity between the combined Baltic, Danish, and Norwegian panel versus the combined German, British, Belgian, and Greek panel (rs7520292 (P = 0.001), rs12518307 (P = 0.007), and rs2395609 (TCP11) (P = 0.01), respectively).No SNP reached genome
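
The Cochran-Mantel-Haenszel test used for the pooled analyses above combines 2×2 case/control tables across strata (here, national panels) into a single 1-df chi-square. A minimal pure-Python sketch without continuity correction; the panel counts are hypothetical, not from the study:

```python
# Minimal Cochran-Mantel-Haenszel chi-square for K stratified 2x2 tables,
# of the kind used above to pool association evidence across study panels.
# No continuity correction; example counts are hypothetical.

def cmh_statistic(tables):
    """tables: list of 2x2 tables [[a, b], [c, d]] (rows: case/control,
    columns: carrier/non-carrier). Returns the CMH chi-square (1 df)."""
    num = 0.0
    var = 0.0
    for (a, b), (c, d) in tables:
        n = a + b + c + d
        expected_a = (a + b) * (a + c) / n
        num += a - expected_a
        var += (a + b) * (c + d) * (a + c) * (b + d) / (n * n * (n - 1))
    return num ** 2 / var

# Two hypothetical panels with the same direction of effect
panels = [[[30, 70], [20, 80]], [[45, 55], [30, 70]]]
print(cmh_statistic(panels))  # compare against chi-square(1 df), e.g. 3.84 at P = 0.05
```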

  1. Assessment of heterogeneity between European Populations: a Baltic and Danish replication case-control study of SNPs from a recent European ulcerative colitis genome wide association study

    Directory of Open Access Journals (Sweden)

    Jonaitis Laimas

    2011-10-01

    Full Text Available Abstract Background Differences in the genetic architecture of inflammatory bowel disease between different European countries and ethnicities have previously been reported. In the present study, we wanted to assess the role of 11 newly identified UC risk variants, derived from a recent European UC genome wide association study (GWAS) (Franke et al., 2010), for 1) association with UC in the Nordic countries, 2) for population heterogeneity between the Nordic countries and the rest of Europe, and, 3) eventually, to drive some of the previous findings towards overall genome-wide significance. Methods Eleven SNPs were replicated in a Danish sample consisting of 560 UC patients and 796 controls and nine missing SNPs of the German GWAS study were successfully genotyped in the Baltic sample comprising 441 UC cases and 1156 controls. The independent replication data was then jointly analysed with the original data and systematic comparisons of the findings between ethnicities were made. Pearson's χ2, Breslow-Day (BD) and Cochran-Mantel-Haenszel (CMH) tests were used for association analyses and heterogeneity testing. Results The rs5771069 (IL17REL) SNP was not associated with UC in the Danish panel. The rs5771069 (IL17REL) SNP was significantly associated with UC in the combined Baltic, Danish and Norwegian UC study sample, driven by the Norwegian panel (OR = 0.89, 95% CI: 0.79-0.98, P = 0.02). No association was found between rs7809799 (SMURF1/KPNA7) and UC (OR = 1.20, 95% CI: 0.95-1.52, P = 0.10) or between UC and all other remaining SNPs. We had a 94% chance of detecting an association for rs7809799 (SMURF1/KPNA7) in the combined replication sample, whereas the power was 55% or lower for the remaining SNPs. Statistically significant P_BD was found for OR heterogeneity between the combined Baltic, Danish, and Norwegian panel versus the combined German, British, Belgian, and Greek panel (rs7520292 (P = 0.001), rs12518307 (P = 0.007), and rs2395609 (TCP11) (P = 0

  2. Coupling of Realistic Rate Estimates with Genomics for Assessing Contaminant Attenuation and Long-Term Plume Containment

    Energy Technology Data Exchange (ETDEWEB)

    Colwell, F. S.; Crawford, R. L.; Sorenson, K.

    2005-09-01

    monitor natural attenuation of chlorinated organics, increase the acceptability of this solution, and provide significant economic and health benefits through this noninvasive remediation strategy. This project also aims to derive valuable genomic information about the functional attributes of subsurface microbial communities upon which DOE must depend to resolve some of its most difficult contamination issues.

  3. Waste Isolation Pilot Plant Technical Assessment Team Report

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-03-17

    This report provides the results of the Waste Isolation Pilot Plant (WIPP) technical assessment led by the Savannah River National Laboratory and conducted by a team of experts in pertinent disciplines from SRNL and Lawrence Livermore National Laboratory (LLNL), Oak Ridge National Laboratory (ORNL), Pacific Northwest National Laboratory (PNNL), and Sandia National Laboratories (SNL).

  4. Exploring genomic dark matter: A critical assessment of the performance of homology search methods on noncoding RNA

    DEFF Research Database (Denmark)

    Freyhult, E.; Bollback, J. P.; Gardner, P. P.

    2006-01-01

    Homology search is one of the most ubiquitous bioinformatic tasks, yet it is unknown how effective the currently available tools are for identifying noncoding RNAs (ncRNAs). In this work, we use reliable ncRNA data sets to assess the effectiveness of methods such as BLAST, FASTA, HMMer, and Infernal. Surprisingly, the most popular homology search methods are often the least accurate. As a result, many studies have used inappropriate tools for their analyses. On the basis of our results, we suggest homology search strategies using the currently available tools and some directions for future...

  5. Summary of photochemical and radiative data used in the LLNL one-dimensional transport-kinetics model of the troposphere and stratosphere: 1982

    International Nuclear Information System (INIS)

    Connell, P.S.; Wuebbles, D.J.

    1983-01-01

    This report summarizes the contents and sources of the photochemical and radiative segment of the LLNL one-dimensional transport-kinetics model of the troposphere and stratosphere. Data include the solar flux incident at the top of the atmosphere, absorption spectra for O2, O3 and NO2, and effective absorption coefficients for about 40 photolytic processes as functions of wavelength and, in a few cases, temperature and pressure. The current data set represents understanding of atmospheric photochemical processes as of late 1982 and relies largely on NASA Evaluation Number 5 of Chemical Kinetics and Photochemical Data for Use in Stratospheric Modeling, JPL Publication 82-57 (DeMore et al., 1982). Implementation in the model, including the treatment of multiple scattering and cloud cover, is discussed in Wuebbles (1981).
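
The wavelength-dependent fluxes and absorption coefficients tabulated in such a data set enter the model through photolysis rate coefficients (j-values), computed as a discrete sum over wavelength bins of flux × cross section × quantum yield. A minimal quadrature sketch; all numbers are illustrative, not values from the report:

```python
# Sketch: photolysis rate coefficient (j-value) as a discrete wavelength sum,
#     j = sum over bins of F(lambda) * sigma(lambda) * phi(lambda) * d_lambda
# where F is actinic flux, sigma an absorption cross section, phi a quantum
# yield. All numbers below are illustrative, not from the report.

def j_value(flux, sigma, phi, d_lambda):
    """Discrete wavelength-bin quadrature of the photolysis integral (s^-1)."""
    return sum(f * s * p * d_lambda for f, s, p in zip(flux, sigma, phi))

flux  = [1.0e14, 2.0e14, 3.0e14]    # photons cm^-2 s^-1 nm^-1 (illustrative)
sigma = [5.0e-19, 3.0e-19, 1.0e-19] # cm^2 per molecule (illustrative)
phi   = [1.0, 0.9, 0.8]             # dimensionless quantum yield
print(j_value(flux, sigma, phi, d_lambda=5.0))  # s^-1
```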

  6. LLNL Radiation Protection Program (RPP) Rev 9.2, Implementation of 10 CFR 835, 'Occupational Radiation Protection'

    Energy Technology Data Exchange (ETDEWEB)

    Shingleton, K. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-06-15

    The Department of Energy (DOE) originally issued 10 CFR Part 835, Occupational Radiation Protection, on January 1, 1994. This regulation, hereafter referred to as “the Rule”, required DOE contractors to develop and maintain a DOE-approved Radiation Protection Program (RPP); DOE approved the initial Lawrence Livermore National Laboratory (LLNL) RPP (Rev 2) on 6/29/95. DOE issued a revision to the Rule on December 4, 1998 and approved LLNL’s revised RPP (Rev 7.1) on 11/18/99. DOE issued a second Rule revision on June 8, 2007 (effective July 9, 2007) and on June 13, 2008 approved LLNL’s RPP (Rev 9.0), which contained plans and measures for coming into compliance with the 2007 Rule changes. DOE issued a correction to the Rule on April 21, 2009.

  7. Historical Doses from Tritiated Water and Tritiated Hydrogen Gas Released to the Atmosphere from Lawrence Livermore National Laboratory (LLNL). Part 5. Accidental Releases

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, S

    2007-08-15

    Over the course of fifty-three years, LLNL had six acute releases of tritiated hydrogen gas (HT) and one acute release of tritiated water vapor (HTO) that were too large relative to the annual releases to be included as part of the annual releases from normal operations detailed in Parts 3 and 4 of the Tritium Dose Reconstruction (TDR). Sandia National Laboratories/California (SNL/CA) had one such release of HT and one of HTO. Doses to the maximally exposed individual (MEI) for these accidents have been modeled using an equation derived from the time-dependent tritium model, UFOTRI, and parameter values based on expert judgment. All of these acute releases are described in this report. Doses that could not have been exceeded from the large HT releases of 1965 and 1970 were calculated to be 43 µSv (4.3 mrem) and 120 µSv (12 mrem) to an adult, respectively. Two published sets of dose predictions for the accidental HT release in 1970 are compared with the dose predictions of this TDR. The highest predicted dose was for an acute release of HTO in 1954. For this release, the dose that could not have been exceeded was estimated to have been 2 mSv (200 mrem), although, because of the high uncertainty about the predictions, the likely dose may have been as low as 360 µSv (36 mrem) or less. The estimated maximum exposures from the accidental releases were such that no adverse health effects would be expected. Appendix A lists all accidents and large routine puff releases that have occurred at LLNL and SNL/CA between 1953 and 2005. Appendix B describes the processes unique to tritium that must be modeled after an acute release, some of the time-dependent tritium models being used today, and the results of tests of these models.
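
The report quotes each dose in both SI and conventional units; since 1 Sv = 100 rem, 1 µSv corresponds to 0.1 mrem. A quick consistency check of the paired values quoted above:

```python
# Consistency check of the paired uSv/mrem dose values quoted above,
# using the standard conversion 1 Sv = 100 rem (so 1 uSv = 0.1 mrem).

def usv_to_mrem(usv):
    """Convert microsieverts to millirem: 1 uSv = 1e-6 Sv = 1e-4 rem = 0.1 mrem."""
    return usv * 0.1

# (uSv, mrem) pairs as quoted in the abstract; 2 mSv is written as 2000 uSv
for usv, mrem in [(43, 4.3), (120, 12), (2000, 200), (360, 36)]:
    assert abs(usv_to_mrem(usv) - mrem) < 1e-9
print("all paired uSv/mrem values are consistent")
```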

  8. Summary of International Waste Management Programs (LLNL Input to SNL L3 MS: System-Wide Integration and Site Selection Concepts for Future Disposition Options for HLW)

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, Harris R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Blink, James A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Halsey, William G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sutton, Mark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-08-11

    The Used Fuel Disposition Campaign (UFDC) within the Department of Energy’s Office of Nuclear Energy (DOE-NE) Fuel Cycle Technology (FCT) program has been tasked with investigating the disposal of the nation’s spent nuclear fuel (SNF) and high-level nuclear waste (HLW) for a range of potential waste forms and geologic environments. This Lessons Learned task is part of a multi-laboratory effort, with this LLNL report providing input to a Level 3 SNL milestone (System-Wide Integration and Site Selection Concepts for Future Disposition Options for HLW). The work package number is: FTLL11UF0328; the work package title is: Technical Bases / Lessons Learned; the milestone number is: M41UF032802; and the milestone title is: “LLNL Input to SNL L3 MS: System-Wide Integration and Site Selection Concepts for Future Disposition Options for HLW”. The system-wide integration effort will integrate all aspects of waste management and disposal, integrating the waste generators, interim storage, transportation, and ultimate disposal at a repository site. The review of international experience in these areas is required to support future studies that address all of these components in an integrated manner. Note that this report is a snapshot of nuclear power infrastructure and international waste management programs that is current as of August 2011, with one notable exception. No attempt has been made to discuss the currently evolving world-wide response to the tragic consequences of the earthquake and tsunami that devastated Japan on March 11, 2011, leaving more than 15,000 people dead and more than 8,000 people missing, and severely damaging the Fukushima Daiichi nuclear power complex. Continuing efforts in FY 2012 will update the data, and summarize it in an Excel spreadsheet for easy comparison and assist in the knowledge management of the study cases.

  9. Cephalopod genomics

    DEFF Research Database (Denmark)

    Albertin, Caroline B.; Bonnaud, Laure; Brown, C. Titus

    2012-01-01

    The Cephalopod Sequencing Consortium (CephSeq Consortium) was established at a NESCent Catalysis Group Meeting, ``Paths to Cephalopod Genomics-Strategies, Choices, Organization,'' held in Durham, North Carolina, USA on May 24-27, 2012. Twenty-eight participants representing nine countries (Austria, Australia, China, Denmark, France, Italy, Japan, Spain and the USA) met to address the pressing need for genome sequencing of cephalopod mollusks. This group, drawn from cephalopod biologists, neuroscientists, developmental and evolutionary biologists, materials scientists, bioinformaticians and researchers active in sequencing, assembling and annotating genomes, agreed on a set of cephalopod species of particular importance for initial sequencing and developed strategies and an organization (CephSeq Consortium) to promote this sequencing. The conclusions and recommendations of this meeting are described...

  10. Environmental management assessment of the Lawrence Livermore National Laboratory Livermore, California

    International Nuclear Information System (INIS)

    1994-06-01

    This report documents the results of the Environmental Management Assessment performed at the Lawrence Livermore National Laboratory (LLNL), Livermore, CA. LLNL is operated by the University of California (UC) under contract with the U.S. Department of Energy (DOE). Major programs at LLNL include research, development, and test activities associated with the nuclear design aspects of the nuclear weapons life cycle and related national security tasks; inertial confinement fusion; magnetic fusion energy; biomedical and environmental research; laser isotope separation; energy-related research; beam research physics; and support to a variety of Defense and other Federal agencies. During this assessment, activities and records were reviewed and interviews were conducted with personnel from the management and operating contractor, Lawrence Livermore National Laboratory; DOE Oakland Operations Office; and DOE Headquarters Program Offices, including the Office of Defense Programs, Office of Environmental Management, the Office of Nuclear Energy, and the Office of Energy Research. The onsite portion was conducted in June 1994 by the DOE Office of Environmental Audit. The goal of EH-24 is enhancement of environmental protection and minimization of risk to public health and the environment. EH-24 accomplishes its mission using systematic and periodic evaluations of DOE's environmental programs within line organizations, and through use of supplemental activities that strengthen self-assessment and oversight functions within program, field, and contractor organizations. The Environmental Management Assessment of LLNL revealed that LLNL's environmental program is exemplary within the DOE complex and that all levels of LLNL management and staff consistently exhibit a high level of commitment to achieve environmental excellence.

  11. Whole community genome amplification (WCGA) leads to compositional bias in methane oxidizing communities as assessed by pmoA based microarray analyses and QPCR

    NARCIS (Netherlands)

    Bodelier, P.L.E.; Kamst, M.; Meima-Franke, M.; Stralis-Pavese, N.; Bodrossy, L.

    2009-01-01

    Whole-genome amplification (WGA) using multiple displacement amplification (MDA) has recently been introduced to the field of environmental microbiology. The amplification of single-cell genomes or whole-community metagenomes decreases the minimum amount of DNA needed for subsequent molecular

  12. Genome Sequencing

    DEFF Research Database (Denmark)

    Sato, Shusei; Andersen, Stig Uggerhøj

    2014-01-01

    The current Lotus japonicus reference genome sequence is based on a hybrid assembly of Sanger TAC/BAC, Sanger shotgun and Illumina shotgun sequencing data generated from the Miyakojima-MG20 accession. It covers nearly all expressed L. japonicus genes and has been annotated mainly based on transcr...

  13. Source attribution of human campylobacteriosis at the point of exposure by combining comparative exposure assessment and subtype comparison based on comparative genomic fingerprinting.

    Directory of Open Access Journals (Sweden)

    André Ravel

    Full Text Available Human campylobacteriosis is a common zoonosis with a significant burden in many countries. Its prevention is difficult because humans can be exposed to Campylobacter through various exposures: foodborne, waterborne or by contact with animals. This study aimed at attributing campylobacteriosis to sources at the point of exposure. It combined comparative exposure assessment and microbial subtype comparison with subtypes defined by comparative genomic fingerprinting (CGF). It used isolates from clinical cases and from eight potential exposure sources (chicken, cattle and pig manure, retail chicken, beef, pork and turkey meat, and surface water) collected within a single sentinel site of an integrated surveillance system for enteric pathogens in Canada. Overall, 1518 non-human isolates and 250 isolates from domestically-acquired human cases were subtyped and their subtype profiles analyzed for source attribution using two attribution models modified to include exposure. Exposure values were obtained from a concurrent comparative exposure assessment study undertaken in the same area. Based on CGF profiles, attribution was possible for 198 (79%) human cases. Both models provide comparable figures: chicken meat was the most important source (65-69% of attributable cases) whereas exposure to cattle (manure) ranked second (14-19% of attributable cases), the other sources being minor (including beef meat). In comparison with other attributions conducted at the point of production, the study highlights the fact that Campylobacter transmission from cattle to humans is rarely meat borne, calling for a closer look at local transmission from cattle to prevent campylobacteriosis, in addition to increasing safety along the chicken supply chain.

  14. Source attribution of human campylobacteriosis at the point of exposure by combining comparative exposure assessment and subtype comparison based on comparative genomic fingerprinting.

    Science.gov (United States)

    Ravel, André; Hurst, Matt; Petrica, Nicoleta; David, Julie; Mutschall, Steven K; Pintar, Katarina; Taboada, Eduardo N; Pollari, Frank

    2017-01-01

    Human campylobacteriosis is a common zoonosis with a significant burden in many countries. Its prevention is difficult because humans can be exposed to Campylobacter through various exposures: foodborne, waterborne or by contact with animals. This study aimed at attributing campylobacteriosis to sources at the point of exposure. It combined comparative exposure assessment and microbial subtype comparison with subtypes defined by comparative genomic fingerprinting (CGF). It used isolates from clinical cases and from eight potential exposure sources (chicken, cattle and pig manure, retail chicken, beef, pork and turkey meat, and surface water) collected within a single sentinel site of an integrated surveillance system for enteric pathogens in Canada. Overall, 1518 non-human isolates and 250 isolates from domestically-acquired human cases were subtyped and their subtype profiles analyzed for source attribution using two attribution models modified to include exposure. Exposure values were obtained from a concurrent comparative exposure assessment study undertaken in the same area. Based on CGF profiles, attribution was possible for 198 (79%) human cases. Both models provide comparable figures: chicken meat was the most important source (65-69% of attributable cases) whereas exposure to cattle (manure) ranked second (14-19% of attributable cases), the other sources being minor (including beef meat). In comparison with other attributions conducted at the point of production, the study highlights the fact that Campylobacter transmission from cattle to humans is rarely meat borne, calling for a closer look at local transmission from cattle to prevent campylobacteriosis, in addition to increasing safety along the chicken supply chain.
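
As a toy illustration of the exposure-weighted subtype attribution described above (a drastic simplification of the two models actually used in the study), each human case carrying a subtype can be apportioned among sources in proportion to the subtype's frequency in each source, weighted by that source's exposure. All source names, subtype labels, counts, and weights below are hypothetical:

```python
# Toy exposure-weighted subtype attribution: a simplified sketch of the idea
# behind the models above, NOT the study's actual models. All counts,
# subtype labels, and exposure weights are hypothetical.
from collections import Counter

def attribute(case_subtypes, source_profiles, exposure):
    """Return expected attributable case counts per source."""
    totals = Counter()
    for t in case_subtypes:
        # exposure-weighted frequency of subtype t in each source
        w = {s: exposure[s] * profile.get(t, 0)
             for s, profile in source_profiles.items()}
        z = sum(w.values())
        if z == 0:
            continue  # subtype seen in no source: unattributable case
        for s, ws in w.items():
            totals[s] += ws / z
    return dict(totals)

sources = {"chicken": {"A": 8, "B": 1}, "cattle": {"A": 1, "B": 6}}
exposure = {"chicken": 0.7, "cattle": 0.3}  # relative exposure weights
cases = ["A", "A", "B", "C"]                # subtype "C" is unattributable
print(attribute(cases, sources, exposure))
```

The attributable cases (3 of 4 here) are split fractionally, mirroring how the study reports percentage shares per source rather than whole-case assignments.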

  15. Comparative Genomics

    Indian Academy of Sciences (India)

    Comparative Genomics - A Powerful New Tool in Biology. Anand K Bachhawat. General Article, Resonance – Journal of Science Education, Volume 11, Issue 8, August 2006, pp. 22-40.

  16. Personal genomics services: whose genomes?

    Science.gov (United States)

    Gurwitz, David; Bregman-Eschet, Yael

    2009-07-01

    New companies offering personal whole-genome information services over the internet are dynamic and highly visible players in the personal genomics field. For fees currently ranging from US$399 to US$2500 and a vial of saliva, individuals can now purchase online access to their individual genetic information regarding susceptibility to a range of chronic diseases and phenotypic traits based on a genome-wide SNP scan. Most of the companies offering such services are based in the United States, but their clients may come from nearly anywhere in the world. Although the scientific validity, clinical utility and potential future implications of such services are being hotly debated, several ethical and regulatory questions related to direct-to-consumer (DTC) marketing strategies of genetic tests have not yet received sufficient attention. For example, how can we prevent unauthorized third parties from submitting other people's DNA for testing? Another pressing question concerns the ownership of (genotypic and phenotypic) information, as well as the unclear legal status of customers regarding their own personal information. Current legislation in the US and Europe falls short of providing clear answers to these questions. Until the regulation of personal genomics services catches up with the technology, we call upon commercial providers to self-regulate and coordinate their activities to minimize potential risks to individual privacy. We also point out some specific steps, along the lines of the trustee model, that providers of DTC personal genomics services as well as regulators and policy makers could consider for addressing some of the concerns raised below.

  17. Comprehensive genetic assessment of the human embryo: can empiric application of microarray comparative genomic hybridization reduce multiple gestation rate by single fresh blastocyst transfer?

    Science.gov (United States)

    Sills, Eric Scott; Yang, Zhihong; Walsh, David J; Salem, Shala A

    2012-09-01

    The unacceptable multiple gestation rate currently associated with in vitro fertilization (IVF) would be substantially alleviated if the routine practice of transferring more than one embryo were reconsidered. While transferring a single embryo is an effective method to reduce the clinical problem of multiple gestation, rigid adherence to this approach has been criticized for negatively impacting clinical pregnancy success in IVF. In general, single embryo transfer is viewed cautiously by IVF patients although greater acceptance would result from a more effective embryo selection method. Selection of one embryo for fresh transfer on the basis of chromosomal normalcy should achieve the dual objective of maintaining satisfactory clinical pregnancy rates and minimizing the multiple gestation problem, because embryo aneuploidy is a major contributing factor in implantation failure and miscarriage in IVF. The initial techniques for preimplantation genetic screening unfortunately lacked sufficient sensitivity and did not yield the expected results in IVF. However, newer molecular genetic methods could be incorporated with standard IVF to bring the goal of single embryo transfer within reach. Aiming to make multiple embryo transfers obsolete and unnecessary, and recognizing that array comparative genomic hybridization (aCGH) will typically require an additional 12 h of laboratory time to complete, we propose adopting aCGH for mainstream use in clinical IVF practice. As aCGH technology continues to develop and becomes increasingly available at lower cost, it may soon be considered unusual for IVF laboratories to select a single embryo for fresh transfer without regard to its chromosomal competency. In this report, we provide a rationale supporting aCGH as the preferred methodology to provide a comprehensive genetic assessment of the single embryo before fresh transfer in IVF. The logistics and cost of integrating aCGH with IVF to enable fresh embryo transfer are also

  18. Identification of genomic biomarkers for anthracycline-induced cardiotoxicity in human iPSC-derived cardiomyocytes: an in vitro repeated exposure toxicity approach for safety assessment.

    Science.gov (United States)

    Chaudhari, Umesh; Nemade, Harshal; Wagh, Vilas; Gaspar, John Antonydas; Ellis, James K; Srinivasan, Sureshkumar Perumal; Spitkovski, Dimitry; Nguemo, Filomain; Louisse, Jochem; Bremer, Susanne; Hescheler, Jürgen; Keun, Hector C; Hengstler, Jan G; Sachinidis, Agapios

    2016-11-01

    The currently available techniques for the safety evaluation of candidate drugs are usually cost-intensive and time-consuming and are often insufficient to predict human relevant cardiotoxicity. The purpose of this study was to develop an in vitro repeated exposure toxicity methodology allowing the identification of predictive genomics biomarkers of functional relevance for drug-induced cardiotoxicity in human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs). The hiPSC-CMs were incubated with 156 nM doxorubicin, which is a well-characterized cardiotoxicant, for 2 or 6 days followed by washout of the test compound and further incubation in compound-free culture medium until day 14 after the onset of exposure. An xCELLigence Real-Time Cell Analyser was used to monitor doxorubicin-induced cytotoxicity while also monitoring functional alterations of cardiomyocytes by counting of the beating frequency of cardiomyocytes. Unlike single exposure, repeated doxorubicin exposure resulted in long-term arrhythmic beating in hiPSC-CMs accompanied by significant cytotoxicity. Global gene expression changes were studied using microarrays and bioinformatics tools. Analysis of the transcriptomic data revealed early expression signatures of genes involved in formation of sarcomeric structures, regulation of ion homeostasis and induction of apoptosis. Eighty-four significantly deregulated genes related to cardiac functions, stress and apoptosis were validated using real-time PCR. The expression of the 84 genes was further studied by real-time PCR in hiPSC-CMs incubated with daunorubicin and mitoxantrone, further anthracycline family members that are also known to induce cardiotoxicity. A panel of 35 genes was deregulated by all three anthracycline family members and can therefore be expected to predict the cardiotoxicity of compounds acting by similar mechanisms as doxorubicin, daunorubicin or mitoxantrone. The identified gene panel can be applied in the safety
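
    The real-time PCR validation mentioned above is conventionally quantified with the Livak 2^(-ΔΔCt) method; the following is a generic sketch of that calculation, with illustrative Ct values that are not from the study.

```python
# Standard 2^(-delta-delta-Ct) relative-expression calculation (Livak method),
# shown as a generic sketch. Ct values below are illustrative only.
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                 # relative to untreated control
    return 2.0 ** (-dd_ct)

# A target gene whose normalized Ct drops by 2 cycles is ~4-fold upregulated.
print(fold_change(24.0, 18.0, 26.0, 18.0))  # → 4.0
```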

  19. Visualization for genomics: the Microbial Genome Viewer.

    NARCIS (Netherlands)

    Kerkhoven, R.; Enckevort, F.H.J. van; Boekhorst, J.; Molenaar, D; Siezen, R.J.

    2004-01-01

    SUMMARY: A Web-based visualization tool, the Microbial Genome Viewer, is presented that allows the user to combine complex genomic data in a highly interactive way. This Web tool enables the interactive generation of chromosome wheels and linear genome maps from genome annotation data stored in a

  20. Ancient genomics

    DEFF Research Database (Denmark)

    Der Sarkissian, Clio; Allentoft, Morten Erik; Avila Arcos, Maria del Carmen

    2015-01-01

    …throughput of next generation sequencing platforms and the ability to target short and degraded DNA molecules. Many ancient specimens previously unsuitable for DNA analyses because of extensive degradation can now successfully be used as source materials. Additionally, the analytical power obtained… by increasing the number of sequence reads to billions effectively means that contamination issues that have haunted aDNA research for decades, particularly in human studies, can now be efficiently and confidently quantified. At present, whole genomes have been sequenced from ancient anatomically modern humans…

  1. Marine genomics

    DEFF Research Database (Denmark)

    Oliveira Ribeiro, Ângela Maria; Foote, Andrew David; Kupczok, Anne

    2017-01-01

    Marine ecosystems occupy 71% of the surface of our planet, yet we know little about their diversity. Although the inventory of species is continually increasing, as registered by the Census of Marine Life program, only about 10% of the estimated two million marine species are known. This lag… High-throughput sequencing approaches have been helping to improve our knowledge of marine biodiversity, from the rich microbial biota that forms the base of the tree of life to a wealth of plant and animal species. In this review, we present an overview of the applications of genomics to the study of marine life, from…

  2. Emergency Response Capability Baseline Needs Assessment Compliance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, John A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2013-09-16

    This document is the second of a two-part analysis of the Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.

  3. Genome Improvement at JGI-HAGSC

    Energy Technology Data Exchange (ETDEWEB)

    Grimwood, Jane; Schmutz, Jeremy J.; Myers, Richard M.

    2012-03-03

    Since the completion of the sequencing of the human genome, the Joint Genome Institute (JGI) has rapidly expanded its scientific goals in several DOE mission-relevant areas. At the JGI-HAGSC, we have kept pace with this rapid expansion of projects with our focus on assessing, assembling, improving and finishing eukaryotic whole genome shotgun (WGS) projects for which the shotgun sequence is generated at the Production Genomic Facility (JGI-PGF). We follow this by combining the draft WGS with genomic resources generated at JGI-HAGSC or in collaborator laboratories (including BAC end sequences, genetic maps and FLcDNA sequences) to produce an improved draft sequence. For eukaryotic genomes important to the DOE mission, we then add further information from directed experiments to produce reference genomic sequences that are publicly available for any scientific researcher. Also, we have continued our program for producing BAC-based finished sequence, both for adding information to JGI genome projects and for small BAC-based sequencing projects proposed through any of the JGI sequencing programs. We have now built our computational expertise in WGS assembly and analysis and have moved eukaryotic genome assembly from the JGI-PGF to JGI-HAGSC. We have concentrated our assembly development work on large plant genomes and complex fungal and algal genomes.
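
    Assembly assessment and improvement work of the kind described is commonly summarized with contiguity metrics such as N50; a minimal generic sketch (not JGI-HAGSC's actual tooling):

```python
# N50: the length L such that contigs of length >= L together cover at least
# half of the total assembly. A standard assembly-contiguity metric.
def n50(contig_lengths):
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
    return 0

print(n50([80, 70, 50, 40, 30, 20]))  # → 70
```

Improvement and finishing raise N50 by joining contigs, which is why the metric is tracked before and after adding maps, BAC ends, and cDNA evidence.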

  4. Using proteomic data to assess a genome-scale "in silico" model of metal reducing bacteria in the simulation of field-scale uranium bioremediation

    Science.gov (United States)

    Yabusaki, S.; Fang, Y.; Wilkins, M. J.; Long, P.; Rifle IFRC Science Team

    2011-12-01

    A series of field experiments in a shallow alluvial aquifer at a former uranium mill tailings site have demonstrated that indigenous bacteria can be stimulated with acetate to catalyze the conversion of hexavalent uranium in a groundwater plume to immobile solid-associated uranium in the +4 oxidation state. While this bioreduction of uranium has been shown to lower groundwater concentrations below actionable standards, a viable remediation methodology will need a mechanistic, predictive and quantitative understanding of the microbially-mediated reactions that catalyze the reduction of uranium in the context of site-specific processes, properties, and conditions. At the Rifle IFRC site, we are investigating the impacts on uranium behavior of pulsed acetate amendment, acetate-oxidizing iron and sulfate reducing bacteria, seasonal water table variation, spatially-variable physical (hydraulic conductivity, porosity) and geochemical (reactive surface area) material properties. The simulation of three-dimensional, variably saturated flow and biogeochemical reactive transport during a uranium bioremediation field experiment includes a genome-scale in silico model of Geobacter sp. to represent the Fe(III) terminal electron accepting process (TEAP). The Geobacter in silico model of cell-scale physiological metabolic pathways is comprised of hundreds of intra-cellular and environmental exchange reactions. One advantage of this approach is that the TEAP reaction stoichiometry and rate are now functions of the metabolic status of the microorganism. The linkage of in silico model reactions to specific Geobacter proteins has enabled the use of groundwater proteomic analyses to assess the accuracy of the model under evolving hydrologic and biogeochemical conditions. In this case, the largest predicted fluxes through in silico model reactions generally correspond to high abundances of proteins linked to those reactions (e.g. the condensation reaction catalyzed by the protein
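
    Comparing predicted in silico reaction fluxes against measured protein abundances, as described above, is essentially a rank-agreement question. The study's actual statistics are not specified in the record; the following is a stdlib sketch of Spearman's rho on invented data.

```python
# Rank correlation (Spearman's rho, with average ranks for ties) between
# predicted fluxes and protein abundances. Toy data, not the study's.
def _ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for a tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    rx, ry = _ranks(xs), _ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

fluxes = [12.0, 5.0, 0.8, 30.0]        # predicted reaction fluxes (arbitrary)
proteins = [200.0, 90.0, 10.0, 700.0]  # abundances of the linked proteins
print(spearman(fluxes, proteins))  # → 1.0 (identical ordering)
```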

  5. Comparing Mycobacterium tuberculosis genomes using genome topology networks.

    Science.gov (United States)

    Jiang, Jianping; Gu, Jianlei; Zhang, Liang; Zhang, Chenyi; Deng, Xiao; Dou, Tonghai; Zhao, Guoping; Zhou, Yan

    2015-02-14

    Over the last decade, emerging research methods, such as comparative genomic analysis and phylogenetic study, have yielded new insights into genotypes and phenotypes of closely related bacterial strains. Several findings have revealed that genomic structural variations (SVs), including gene gain/loss, gene duplication and genome rearrangement, can lead to different phenotypes among strains, and an investigation of genes affected by SVs may extend our knowledge of the relationships between SVs and phenotypes in microbes, especially in pathogenic bacteria. In this work, we introduce a 'Genome Topology Network' (GTN) method based on gene homology and gene locations to analyze genomic SVs and perform phylogenetic analysis. Furthermore, the concept of 'unfixed ortholog' has been proposed, whose members are affected by SVs in genome topology among close species. To improve the precision of 'unfixed ortholog' recognition, a strategy to detect annotation differences and complete gene annotation was applied. To assess the GTN method, a set of thirteen complete M. tuberculosis genomes was analyzed as a case study. GTNs with two different gene homology-assigning methods were built, the Clusters of Orthologous Groups (COG) method and the orthoMCL clustering method, and two phylogenetic trees were constructed accordingly, which may provide additional insights into whole genome-based phylogenetic analysis. We obtained 24 unfixable COG groups, of which most members were related to immunogenicity and drug resistance, such as PPE-repeat proteins (COG5651) and transcriptional regulator TetR gene family members (COG1309). The GTN method has been implemented in PERL and released on our website. The tool can be downloaded from http://homepage.fudan.edu.cn/zhouyan/gtn/, and allows re-annotating the 'lost' genes among closely related genomes, analyzing genes affected by SVs, and performing phylogenetic analysis. With this tool, many immunogenic-related and drug resistance-related genes
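
    The GTN idea, nodes as homology groups and edges as gene adjacency on each genome, can be sketched on toy data. Gene and group names below are hypothetical; this is not the authors' PERL implementation.

```python
# Sketch of a 'genome topology network': nodes are homology groups, edges
# connect groups whose member genes are adjacent on a genome. Toy data only.
def topology_edges(genome, gene_to_group):
    """genome: ordered list of gene IDs; returns adjacent group pairs."""
    edges = set()
    for a, b in zip(genome, genome[1:]):
        ga, gb = gene_to_group[a], gene_to_group[b]
        if ga != gb:
            edges.add(frozenset((ga, gb)))
    return edges

groups = {"g1": "COG_A", "g2": "COG_B", "g3": "COG_C", "g4": "COG_B"}
strain1 = ["g1", "g2", "g3"]   # gene order A-B-C
strain2 = ["g1", "g3", "g4"]   # gene order A-C-B (rearranged)
e1 = topology_edges(strain1, groups)
e2 = topology_edges(strain2, groups)
# Edges present in only one network point at structural variation.
print(sorted(sorted(e) for e in e1 ^ e2))
```

Comparing such edge sets across strains highlights groups whose genomic neighborhood is not fixed, which is the intuition behind the 'unfixed ortholog' concept.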

  6. Molecular cytogenetic and genomic analyses reveal new insights into the origin of the wheat B genome.

    Science.gov (United States)

    Zhang, Wei; Zhang, Mingyi; Zhu, Xianwen; Cao, Yaping; Sun, Qing; Ma, Guojia; Chao, Shiaoman; Yan, Changhui; Xu, Steven S; Cai, Xiwen

    2018-02-01

    This work pinpointed the goatgrass chromosomal segment in the wheat B genome using modern cytogenetic and genomic technologies, and provided novel insights into the origin of the wheat B genome. Wheat is a typical allopolyploid with three homoeologous subgenomes (A, B, and D). The donors of the subgenomes A and D had been identified, but not for the subgenome B. The goatgrass Aegilops speltoides (genome SS) has been controversially considered a possible candidate for the donor of the wheat B genome. However, the relationship of the Ae. speltoides S genome with the wheat B genome remains largely obscure. The present study assessed the homology of the B and S genomes using an integrative cytogenetic and genomic approach, and revealed the contribution of Ae. speltoides to the origin of the wheat B genome. We discovered noticeable homology between wheat chromosome 1B and Ae. speltoides chromosome 1S, but not between other chromosomes in the B and S genomes. An Ae. speltoides-originated segment spanning a genomic region of approximately 10.46 Mb was detected on the long arm of wheat chromosome 1B (1BL). The Ae. speltoides-originated segment on 1BL was found to co-evolve with the rest of the B genome. Evidently, Ae. speltoides had been involved in the origin of the wheat B genome, but should not be considered an exclusive donor of this genome. The wheat B genome might have a polyphyletic origin with multiple ancestors involved, including Ae. speltoides. These novel findings will facilitate genome studies in wheat and other polyploids.

  7. MEETING REPORT ASSESSING HUMAN GERM-CELL MUTAGENESIS IN THE POST-GENOME ERA: A CELEBRATION OF THE LEGACY OF WILLIAM LAWSON (BILL) RUSSELL

    Science.gov (United States)

    Although numerous germ-cell mutagens have been identified in animal model systems, to date, no human germ-cell mutagens have been confirmed. Because the genomic integrity of our germ cells is essential for the continuation of the human species, a resolution of this enduring conu...

  8. The role of vitamin D in reducing gastrointestinal disease risk and assessment of individual dietary intake needs: Focus on genetic and genomic technologies.

    Science.gov (United States)

    Ferguson, Lynnette R; Laing, Bobbi; Marlow, Gareth; Bishop, Karen

    2016-01-01

    With the endogenous formation of vitamin D being significantly curtailed because of public awareness of skin cancer dangers, attention is turning to dietary sources. Cumulative evidence has implicated vitamin D deficiency in increasing susceptibility to various gastrointestinal disorders, including colorectal cancer, inflammatory bowel diseases, diverticulitis, and irritable bowel syndrome. There is also reason to suggest adjunct vitamin D therapy for such diseases. Although there is justification for increasing vitamin D intake overall, optimal intakes will vary among individuals. Genomic technologies have revealed several hundreds of genes associated with vitamin D actions. The nature of these genes emphasizes the potentially negative implications of modulating vitamin D intakes in the absence of complementary human genetic and genomic data, including information on the gut microbiome. However, we are not yet in a position to apply this information. Genomic data (transcriptomics, metabolomics, proteomics, and metagenomics) could provide evidence that vitamin D sufficiency has been achieved. We suggest that there is an increasingly strong case for considering the more widespread use of vitamin D fortified foods and/or dietary supplements to benefit gastrointestinal health. However, intake levels might beneficially be informed by personalized genetic and genomic information, for optimal disease prevention and maintenance of remission. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. GenHtr: a tool for comparative assessment of genetic heterogeneity in microbial genomes generated by massive short-read sequencing

    Directory of Open Access Journals (Sweden)

    Yu GongXin

    2010-10-01

    Abstract. Background: Microevolution is the study of short-term changes of alleles within a population and their effects on the phenotype of organisms. The result of this below-species-level evolution is heterogeneity, where populations consist of subpopulations with a large number of structural variations. Heterogeneity analysis is thus essential to our understanding of how selective and neutral forces shape bacterial populations over a short period of time. The Solexa Genome Analyzer, a next-generation sequencing platform, allows millions of short sequencing reads to be obtained with great accuracy, allowing for the ability to study the dynamics of the bacterial population at the whole genome level. The tool referred to as GenHtr was developed for genome-wide heterogeneity analysis. Results: For particular bacterial strains, GenHtr relies on a set of Solexa short reads on given bacterial pathogens and their isogenic reference genome to identify heterogeneity sites, the chromosomal positions with multiple variants of genes in the bacterial population, and variations that occur in large gene families. GenHtr accomplishes this by building and comparatively analyzing genome-wide heterogeneity genotypes for both the newly sequenced genomes (using massive short-read sequencing) and their isogenic reference (using simulated data). As proof of the concept, this approach was applied to SRX007711, the Solexa sequencing data for a newly sequenced Staphylococcus aureus subsp. USA300 cell line, and demonstrated that it could predict such multiple variants. They include multiple variants of genes critical in pathogenesis, e.g. genes encoding a LysR family transcriptional regulator, 23S ribosomal RNA, and DNA mismatch repair protein MutS. The heterogeneity results in non-synonymous and nonsense mutations, leading to truncated proteins for both LysR and MutS. Conclusion: GenHtr was developed for genome-wide heterogeneity analysis. Although it is much more time
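
    The core of heterogeneity-site detection, flagging positions whose read pileups carry a frequent minor variant, can be sketched as follows. The thresholds and pileup data are hypothetical, not GenHtr's actual parameters or algorithm.

```python
# Sketch of heterogeneity-site detection: pile up base calls per genomic
# position and flag positions where a minor variant exceeds a frequency
# threshold at sufficient depth. Hypothetical thresholds and data.
from collections import Counter

def heterogeneity_sites(pileup, min_freq=0.2, min_depth=10):
    """pileup: {position: list of base calls from aligned reads}."""
    sites = {}
    for pos, bases in pileup.items():
        if len(bases) < min_depth:
            continue  # too shallow to call a subpopulation
        ordered = Counter(bases).most_common()
        minor = sum(n for _, n in ordered[1:])
        if minor / len(bases) >= min_freq:
            sites[pos] = dict(ordered)
    return sites

pileup = {
    101: ["A"] * 12,             # homogeneous position
    205: ["G"] * 7 + ["T"] * 5,  # mixed subpopulations
}
print(heterogeneity_sites(pileup))  # flags position 205 only
```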

  10. Genomic technologies in neonatology

    Directory of Open Access Journals (Sweden)

    L. N. Chernova

    2017-01-01

    In recent years, there has been a tremendous trend toward personalized medicine. Advances in the field have forced clinicians, including neonatologists, to take a fresh look at the prevention, management, and therapy of various diseases. At the center of attention of foreign, and increasingly Russian, researchers and doctors are individual genomic data, which allow not only assessment of the risk of some forms of pathology but also the successful application of personalized strategies of prediction, prevention, and targeted treatment. This article provides a brief review of the latest achievements of genomic technologies in newborns and examines the problems and potential applications of genomics in promoting the concept of personalized medicine in neonatology. The increasing amount of personalized data is impossible to analyze by the human mind alone, so the need for computers and bioinformatics is obvious. The article describes the role of translational bioinformatics in analyzing and integrating the results of accumulated fundamental research into complete clinical decisions. The latest advances in neonatal translational bioinformatics, such as clinical decision support systems, are considered: these help monitor vital parameters of newborns that influence the course of a particular disease, calculate increased risks of the development of various pathologies, and select appropriate drugs.

  11. Value-based genomics.

    Science.gov (United States)

    Gong, Jun; Pan, Kathy; Fakih, Marwan; Pal, Sumanta; Salgia, Ravi

    2018-03-20

    Advancements in next-generation sequencing have greatly enhanced the development of biomarker-driven cancer therapies. The affordability and availability of next-generation sequencers have allowed for the commercialization of next-generation sequencing platforms that have found widespread use for clinical-decision making and research purposes. Despite the greater availability of tumor molecular profiling by next-generation sequencing at our doorsteps, the achievement of value-based care, or improving patient outcomes while reducing overall costs or risks, in the era of precision oncology remains a looming challenge. In this review, we highlight available data through a pre-established and conceptualized framework for evaluating value-based medicine to assess the cost (efficiency), clinical benefit (effectiveness), and toxicity (safety) of genomic profiling in cancer care. We also provide perspectives on future directions of next-generation sequencing from targeted panels to whole-exome or whole-genome sequencing and describe potential strategies needed to attain value-based genomics.

  12. One bacterial cell, one complete genome.

    Directory of Open Access Journals (Sweden)

    Tanja Woyke

    2010-04-01

    While the bulk of the finished microbial genomes sequenced to date are derived from cultured bacterial and archaeal representatives, the vast majority of microorganisms elude current culturing attempts, severely limiting the ability to recover complete or even partial genomes from these environmental species. Single cell genomics is a novel culture-independent approach, which enables access to the genetic material of an individual cell. No single cell genome has to our knowledge been closed and finished to date. Here we report the completed genome from an uncultured single cell of Candidatus Sulcia muelleri DMIN. Digital PCR on single symbiont cells isolated from the bacteriome of the green sharpshooter Draeculacephala minerva allowed us to assess that this bacterium is polyploid with genome copies ranging from approximately 200-900 per cell, making it a most suitable target for single cell finishing efforts. For single cell shotgun sequencing, an individual Sulcia cell was isolated and whole genome amplified by multiple displacement amplification (MDA). Sanger-based finishing methods allowed us to close the genome. To verify the correctness of our single cell genome and exclude MDA-derived artifacts, we independently shotgun sequenced and assembled the Sulcia genome from pooled bacteriomes using a metagenomic approach, yielding a nearly identical genome. Four variations we detected appear to be genuine biological differences between the two samples. Comparison of the single cell genome with bacteriome metagenomic sequence data detected two single nucleotide polymorphisms (SNPs), indicating extremely low genetic diversity within a Sulcia population. This study demonstrates the power of single cell genomics to generate a complete, high quality, non-composite reference genome within an environmental sample, which can be used for population genetic analyses.
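
    The verification step described, comparing the single-cell assembly against the independent metagenomic assembly to count SNPs, reduces to mismatch counting over aligned sequence. A toy sketch (sequences invented, and alignment assumed already done):

```python
# Count SNPs between two assemblies of the same genome, assuming they are
# already aligned end-to-end. Toy sequences for illustration only.
def snps(seq_a, seq_b):
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    return [(i, a, b) for i, (a, b) in enumerate(zip(seq_a, seq_b)) if a != b]

single_cell = "ATGGCGTTAC"
metagenome  = "ATGGTGTTAC"  # one substitution at position 4
print(snps(single_cell, metagenome))  # → [(4, 'C', 'T')]
```

On real assemblies a whole-genome aligner would produce the pairwise alignment first; the comparison logic afterwards is this simple.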

  13. One Bacterial Cell, One Complete Genome

    Energy Technology Data Exchange (ETDEWEB)

    Woyke, Tanja; Tighe, Damon; Mavrommatis, Konstantinos; Clum, Alicia; Copeland, Alex; Schackwitz, Wendy; Lapidus, Alla; Wu, Dongying; McCutcheon, John P.; McDonald, Bradon R.; Moran, Nancy A.; Bristow, James; Cheng, Jan-Fang

    2010-04-26

    While the bulk of the finished microbial genomes sequenced to date are derived from cultured bacterial and archaeal representatives, the vast majority of microorganisms elude current culturing attempts, severely limiting the ability to recover complete or even partial genomes from these environmental species. Single cell genomics is a novel culture-independent approach, which enables access to the genetic material of an individual cell. No single cell genome has to our knowledge been closed and finished to date. Here we report the completed genome from an uncultured single cell of Candidatus Sulcia muelleri DMIN. Digital PCR on single symbiont cells isolated from the bacteriome of the green sharpshooter Draeculacephala minerva allowed us to assess that this bacterium is polyploid with genome copies ranging from approximately 200–900 per cell, making it a most suitable target for single cell finishing efforts. For single cell shotgun sequencing, an individual Sulcia cell was isolated and whole genome amplified by multiple displacement amplification (MDA). Sanger-based finishing methods allowed us to close the genome. To verify the correctness of our single cell genome and exclude MDA-derived artifacts, we independently shotgun sequenced and assembled the Sulcia genome from pooled bacteriomes using a metagenomic approach, yielding a nearly identical genome. Four variations we detected appear to be genuine biological differences between the two samples. Comparison of the single cell genome with bacteriome metagenomic sequence data detected two single nucleotide polymorphisms (SNPs), indicating extremely low genetic diversity within a Sulcia population. This study demonstrates the power of single cell genomics to generate a complete, high quality, non-composite reference genome within an environmental sample, which can be used for population genetic analyses.

  14. Ensembl Genomes 2016: more genomes, more complexity.

    Science.gov (United States)

    Kersey, Paul Julian; Allen, James E; Armean, Irina; Boddu, Sanjay; Bolt, Bruce J; Carvalho-Silva, Denise; Christensen, Mikkel; Davis, Paul; Falin, Lee J; Grabmueller, Christoph; Humphrey, Jay; Kerhornou, Arnaud; Khobova, Julia; Aranganathan, Naveen K; Langridge, Nicholas; Lowy, Ernesto; McDowall, Mark D; Maheswari, Uma; Nuhn, Michael; Ong, Chuang Kee; Overduin, Bert; Paulini, Michael; Pedro, Helder; Perry, Emily; Spudich, Giulietta; Tapanari, Electra; Walts, Brandon; Williams, Gareth; Tello-Ruiz, Marcela; Stein, Joshua; Wei, Sharon; Ware, Doreen; Bolser, Daniel M; Howe, Kevin L; Kulesha, Eugene; Lawson, Daniel; Maslen, Gareth; Staines, Daniel M

    2016-01-04

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species, complementing the resources for vertebrate genomics developed in the context of the Ensembl project (http://www.ensembl.org). Together, the two resources provide a consistent set of programmatic and interactive interfaces to a rich range of data including reference sequence, gene models, transcriptional data, genetic variation and comparative analysis. This paper provides an update to the previous publications about the resource, with a focus on recent developments. These include the development of new analyses and views to represent polyploid genomes (of which bread wheat is the primary exemplar); and the continued up-scaling of the resource, which now includes over 23 000 bacterial genomes, 400 fungal genomes and 100 protist genomes, in addition to 55 genomes from invertebrate metazoa and 39 genomes from plants. This dramatic increase in the number of included genomes is one part of a broader effort to automate the integration of archival data (genome sequence, but also associated RNA sequence data and variant calls) within the context of reference genomes and make it available through the Ensembl user interfaces. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Rodent malaria parasites : genome organization & comparative genomics

    NARCIS (Netherlands)

    Kooij, Taco W.A.

    2006-01-01

    The aim of the studies described in this thesis was to investigate the genome organization of rodent malaria parasites (RMPs) and compare the organization and gene content of the genomes of RMPs and the human malaria parasite P. falciparum. The release of the complete genome sequence of P.

  16. Population Genomics of Paramecium Species.

    Science.gov (United States)

    Johri, Parul; Krenek, Sascha; Marinov, Georgi K; Doak, Thomas G; Berendonk, Thomas U; Lynch, Michael

    2017-05-01

    Population-genomic analyses are essential to understanding factors shaping genomic variation and lineage-specific sequence constraints. The dearth of such analyses for unicellular eukaryotes prompted us to assess genomic variation in Paramecium, one of the most well-studied ciliate genera. The Paramecium aurelia complex consists of ∼15 morphologically indistinguishable species that diverged subsequent to two rounds of whole-genome duplications (WGDs, as long ago as 320 MYA) and possess extremely streamlined genomes. We examine patterns of both nuclear and mitochondrial polymorphism, by sequencing whole genomes of 10-13 worldwide isolates of each of three species belonging to the P. aurelia complex: P. tetraurelia, P. biaurelia, P. sexaurelia, as well as two outgroup species that do not share the WGDs: P. caudatum and P. multimicronucleatum. An apparent absence of global geographic population structure suggests continuous or recent dispersal of Paramecium over long distances. Intergenic regions are highly constrained relative to coding sequences, especially in P. caudatum and P. multimicronucleatum that have shorter intergenic distances. Sequence diversity and divergence are reduced up to ∼100-150 bp both upstream and downstream of genes, suggesting strong constraints imposed by the presence of densely packed regulatory modules. In addition, comparison of sequence variation at non-synonymous and synonymous sites suggests similar recent selective pressures on paralogs within and orthologs across the deeply diverging species. This study presents the first genome-wide population-genomic analysis in ciliates and provides a valuable resource for future studies in evolutionary and functional genetics in Paramecium. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
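
    A basic statistic behind such diversity analyses is nucleotide diversity (π), the average number of pairwise differences per site among sampled sequences. A minimal sketch with toy haplotypes (not the study's data):

```python
# Nucleotide diversity (pi): average pairwise differences per site across
# sampled, aligned sequences. Toy haplotypes for illustration only.
from itertools import combinations

def nucleotide_diversity(seqs):
    length = len(seqs[0])
    pairs = list(combinations(seqs, 2))
    diffs = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs)
    return diffs / (len(pairs) * length)

isolates = ["ATGCA", "ATGCA", "ATGTA", "TTGCA"]
print(nucleotide_diversity(isolates))  # → 0.2
```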

  17. Funding Opportunity: Genomic Data Centers

    Science.gov (United States)

    Funding opportunity announcement (RFA) from the Center for Cancer Genomics (CCG): genomic data analysis network centers.

  18. Development, characterization and use of genomic SSR markers for assessment of genetic diversity in some Saudi date palm (Phoenix dactylifera L.) cultivars

    Directory of Open Access Journals (Sweden)

    Sulieman A. Al-Faifi

    2016-05-01

    Conclusions: The developed microsatellite markers add value to the date palm characterization toolkit and can be used by researchers in population genetics, cultivar identification, and genetic resource exploration and management. The tested cultivars exhibited a significant amount of genetic diversity and could be suitable for a successful breeding program. Genomic sequences generated from this study are available at the National Center for Biotechnology Information (NCBI) Sequence Read Archive (accession number LIBGSS_039019).

  19. An assessment of time involved in pre-test case review and counseling for a whole genome sequencing clinical research program.

    Science.gov (United States)

    Williams, Janet L; Faucett, W Andrew; Smith-Packard, Bethanny; Wagner, Monisa; Williams, Marc S

    2014-08-01

    Whole genome sequencing (WGS) is being used for evaluation of individuals with undiagnosed disease of suspected genetic origin. Implementing WGS into clinical practice will place an increased burden upon care teams with regard to pre-test patient education and counseling about results. To quantitate the time needed for appropriate pre-test evaluation of participants in WGS testing, we documented the time spent by our clinical research group on various activities related to program preparation, participant screening, and consent prior to WGS. Participants were children or young adults with autism, intellectual or developmental disability, and/or congenital anomalies, who had remained undiagnosed despite previous evaluation, and their biologic parents. Results showed that significant time was spent in securing allocation of clinical research space to counsel participants and families, and in acquisition and review of participants' medical records. Pre-enrollment chart review identified two individuals with existing diagnoses, resulting in savings of $30,000 for the genome sequencing alone, as well as saving hours of personnel time for genome interpretation and communication of WGS results. New WGS programs should plan for costs associated with the additional pre-test administrative planning and patient evaluation time that will be required to provide high-quality care.

  20. Fisher: a program for the detection of H/ACA snoRNAs using MFE secondary structure prediction and comparative genomics - assessment and update.

    Science.gov (United States)

    Freyhult, Eva; Edvardsson, Sverker; Tamas, Ivica; Moulton, Vincent; Poole, Anthony M

    2008-07-21

    The H/ACA family of small nucleolar RNAs (snoRNAs) plays a central role in guiding the pseudouridylation of ribosomal RNA (rRNA). In an effort to systematically identify the complete set of rRNA-modifying H/ACA snoRNAs from the genome sequence of the budding yeast, Saccharomyces cerevisiae, we developed a program - Fisher - and previously presented several candidate snoRNAs based on our analysis [1]. In this report, we provide a brief update of this work, which was halted after the publication of experimentally identified snoRNAs [2] identical to candidates we had identified bioinformatically using Fisher. Our motivation for revisiting this work is, first, to report on the status of the candidate snoRNAs described in [1], and secondly, to report that a modified version of Fisher together with the available multiple yeast genome sequences was able to correctly identify several H/ACA snoRNAs for modification sites not identified by the snoGPS program [3]. While we are no longer developing Fisher, we briefly consider the merits of the Fisher algorithm relative to snoGPS, which may be of use for workers considering pursuing a similar search strategy for the identification of small RNAs. The modified source code for Fisher is made available as supplementary material. Our results confirm the validity of using minimum free energy (MFE) secondary structure prediction to guide comparative genomic screening for RNA families with few sequence constraints.

  1. Exploring Other Genomes: Bacteria.

    Science.gov (United States)

    Flannery, Maura C.

    2001-01-01

    Points out the importance of genomes other than the human genome and provides information on the sequenced genomes of bacterial pathogens, including Pseudomonas aeruginosa and the agents of leprosy, cholera, meningitis, tuberculosis, and bubonic plague, as well as plant pathogens. Considers the computer's use in genome studies. (Contains 14 references.) (YDS)

  2. Genome plasticity and systems evolution in Streptomyces

    Science.gov (United States)

    2012-01-01

    Background Streptomycetes are filamentous soil-dwelling bacteria. They are best known as the producers of a great variety of natural products such as antibiotics, antifungals, antiparasitics, and anticancer agents and the decomposers of organic substances for carbon recycling. They are also model organisms for the studies of gene regulatory networks, morphological differentiation, and stress response. The availability of sets of genomes from closely related Streptomyces strains makes it possible to assess the mechanisms underlying genome plasticity and systems adaptation. Results We present the results of a comprehensive analysis of the genomes of five Streptomyces species with distinct phenotypes. These streptomycetes have a pan-genome comprised of 17,362 orthologous families which includes 3,096 components in the core genome, 5,066 components in the dispensable genome, and 9,200 components that are uniquely present in only one species. The core genome makes up about 33%-45% of each genome repertoire. It contains important genes for Streptomyces biology including those involved in gene regulation, secretion, secondary metabolism and morphological differentiation. Abundant duplicate genes have been identified, with 4%-11% of the whole genomes composed of lineage-specific expansions (LSEs), suggesting that frequent gene duplication or lateral gene transfer events play a role in shaping the genome diversification within this genus. Two patterns of expansion, single gene expansion and chromosome block expansion, are observed, representing different scales of duplication. Conclusions Our results provide a catalog of genome components and their potential functional roles in gene regulatory networks and metabolic networks. The core genome components reveal the minimum requirement for streptomycetes to sustain a successful lifecycle in the soil environment, reflecting the effects of both genome evolution and environmental stress acting upon the expressed phenotypes.
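The core/dispensable/unique partition described above reduces, at its simplest, to a set operation over gene-family presence across genomes. A minimal sketch (the family names and species sets below are hypothetical toy data, not the actual Streptomyces families):

```python
# Sketch: classify orthologous gene families into core, dispensable and
# species-unique pan-genome components, given which species carry each family.

def partition_pan_genome(families, n_species):
    """families: dict mapping family id -> set of species carrying it."""
    core, dispensable, unique = [], [], []
    for fam, species in families.items():
        if len(species) == n_species:
            core.append(fam)          # present in every genome
        elif len(species) == 1:
            unique.append(fam)        # present in only one species
        else:
            dispensable.append(fam)   # present in some, but not all
    return core, dispensable, unique

# Hypothetical three-species example.
families = {
    "fam1": {"A", "B", "C"},   # core
    "fam2": {"A"},             # unique to A
    "fam3": {"A", "B"},        # dispensable
}
core, dispensable, unique = partition_pan_genome(families, n_species=3)
print(len(core), len(dispensable), len(unique))  # -> 1 1 1
```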

  3. LLNL superconducting magnets test facility

    Energy Technology Data Exchange (ETDEWEB)

    Manahan, R; Martovetsky, N; Moller, J; Zbasnik, J

    1999-09-16

    The FENIX facility at Lawrence Livermore National Laboratory was upgraded and refurbished in 1996-1998 for testing CICC superconducting magnets. The FENIX facility was used for high-current, short-sample tests of superconductors for fusion programs in the late 1980s and early 1990s. The new facility includes a 4-m diameter vacuum vessel, two refrigerators, a 40 kA, 42 V computer-controlled power supply, a new switchyard with a dump resistor, a new helium distribution valve box, several sets of power leads, a data acquisition system and other auxiliary systems, which together provide considerable flexibility for testing a wide variety of superconducting magnets over a wide range of parameters. The detailed parameters and capabilities of this test facility and its systems are described in the paper.

  4. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Genomics is the study of genomes, which generates large amounts of data and therefore requires substantial storage and computational power. These needs are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage in the hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin...

  5. Genome Maps, a new generation genome browser.

    Science.gov (United States)

    Medina, Ignacio; Salavert, Francisco; Sanchez, Rubén; de Maria, Alejandro; Alonso, Roberto; Escobar, Pablo; Bleda, Marta; Dopazo, Joaquín

    2013-07-01

    Genome browsers have gained importance as more genomes and related genomic information become available. However, the increase of information brought about by new generation sequencing technologies is, at the same time, causing a subtle but continuous decrease in the efficiency of conventional genome browsers. Here, we present Genome Maps, a genome browser that implements an innovative model of data transfer and management. The program uses highly efficient technologies from the new HTML5 standard, such as scalable vector graphics, that optimize workloads at both server and client sides and ensure future scalability. Thus, data management and representation are entirely carried out by the browser, without the need of any Java Applet, Flash or other plug-in technology installation. Relevant biological data on genes, transcripts, exons, regulatory features, single-nucleotide polymorphisms, karyotype and so forth, are imported from web services and are available as tracks. In addition, several DAS servers are already included in Genome Maps. As a novelty, this web-based genome browser allows the local upload of huge genomic data files (e.g. VCF or BAM) that can be dynamically visualized in real time at the client side, thus facilitating the management of medical data affected by privacy restrictions. Finally, Genome Maps can easily be integrated in any web application by including only a few lines of code. Genome Maps is an open source collaborative initiative available in the GitHub repository (https://github.com/compbio-bigdata-viz/genome-maps). Genome Maps is available at: http://www.genomemaps.org.

  6. Fisher: a program for the detection of H/ACA snoRNAs using MFE secondary structure prediction and comparative genomics - assessment and update

    Directory of Open Access Journals (Sweden)

    Tamas Ivica

    2008-07-01

    Background The H/ACA family of small nucleolar RNAs (snoRNAs) plays a central role in guiding the pseudouridylation of ribosomal RNA (rRNA). In an effort to systematically identify the complete set of rRNA-modifying H/ACA snoRNAs from the genome sequence of the budding yeast, Saccharomyces cerevisiae, we developed a program - Fisher - and previously presented several candidate snoRNAs based on our analysis [1]. Findings In this report, we provide a brief update of this work, which was halted after the publication of experimentally identified snoRNAs [2] identical to candidates we had identified bioinformatically using Fisher. Our motivation for revisiting this work is, first, to report on the status of the candidate snoRNAs described in [1], and secondly, to report that a modified version of Fisher together with the available multiple yeast genome sequences was able to correctly identify several H/ACA snoRNAs for modification sites not identified by the snoGPS program [3]. While we are no longer developing Fisher, we briefly consider the merits of the Fisher algorithm relative to snoGPS, which may be of use for workers considering pursuing a similar search strategy for the identification of small RNAs. The modified source code for Fisher is made available as supplementary material. Conclusion Our results confirm the validity of using minimum free energy (MFE) secondary structure prediction to guide comparative genomic screening for RNA families with few sequence constraints.

  7. Evaluation of Genomic Instability in the Abnormal Prostate

    National Research Council Canada - National Science Library

    Haaland-Pullus, Christina; Griffith, Jeffrey K

    2006-01-01

    ...: prognosis and diagnosis. Several tools are being used to investigate this effect, specifically the assessment of telomere length, allelic imbalance, and methylation status, all markers of genomic instability...

  8. Evaluation of Genomic Instability in the Abnormal Prostate

    National Research Council Canada - National Science Library

    Haaland-Pullus, Christina; Griffith, Jeffrey K

    2008-01-01

    ...: prognosis and diagnosis. Several tools are being used to investigate this effect, specifically the assessment of telomere length, allelic imbalance, and methylation status, all markers of genomic instability...

  9. JGI Fungal Genomics Program

    Energy Technology Data Exchange (ETDEWEB)

    Grigoriev, Igor V.

    2011-03-14

    Genomes of fungi relevant to energy and the environment are the focus of the Fungal Genomics Program at the US Department of Energy Joint Genome Institute (JGI). Its key project, the Genomics Encyclopedia of Fungi, targets fungi related to plant health (symbionts, pathogens, and biocontrol agents) and biorefinery processes (cellulose degradation, sugar fermentation, industrial hosts), and explores fungal diversity by means of genome sequencing and analysis. Over 50 fungal genomes have been sequenced by JGI to date and released through MycoCosm (www.jgi.doe.gov/fungi), a fungal web portal, which integrates sequence and functional data with genome analysis tools for the user community. Sequence analysis supported by functional genomics leads to the development of parts lists for complex systems ranging from ecosystems of biofuel crops to biorefineries. Recent examples of such 'parts' suggested by comparative genomics and functional analysis in these areas are presented here.

  10. Genomic Encyclopedia of Fungi

    Energy Technology Data Exchange (ETDEWEB)

    Grigoriev, Igor

    2012-08-10

    Genomes of fungi relevant to energy and the environment are the focus of the Fungal Genomics Program at the US Department of Energy Joint Genome Institute (JGI). Its key project, the Genomics Encyclopedia of Fungi, targets fungi related to plant health (symbionts, pathogens, and biocontrol agents) and biorefinery processes (cellulose degradation, sugar fermentation, industrial hosts), and explores fungal diversity by means of genome sequencing and analysis. Over 150 fungal genomes have been sequenced by JGI to date and released through MycoCosm (www.jgi.doe.gov/fungi), a fungal web portal, which integrates sequence and functional data with genome analysis tools for the user community. Sequence analysis supported by functional genomics leads to the development of parts lists for complex systems ranging from ecosystems of biofuel crops to biorefineries. Recent examples of such parts suggested by comparative genomics and functional analysis in these areas are presented here.

  11. Assessment of the contribution of cocoa-derived strains of Acetobacter ghanensis and Acetobacter senegalensis to the cocoa bean fermentation process through a genomic approach.

    Science.gov (United States)

    Illeghems, Koen; Pelicaen, Rudy; De Vuyst, Luc; Weckx, Stefan

    2016-09-01

    Acetobacter ghanensis LMG 23848(T) and Acetobacter senegalensis 108B are acetic acid bacteria that originate from a spontaneous cocoa bean heap fermentation process and that have been characterised as strains with interesting functionalities through metabolic and kinetic studies. As there is currently little genetic information available for these species, whole-genome sequencing of A. ghanensis LMG 23848(T) and A. senegalensis 108B and subsequent data analysis was performed. This approach not only revealed characteristics such as the metabolic potential and genomic architecture, but also allowed us to identify genetic adaptations related to the cocoa bean fermentation process. Indeed, evidence was found that both species possessed the genetic ability to be involved in citrate assimilation and displayed adaptations in their respiratory chain that might improve their competitiveness during the cocoa bean fermentation process. In contrast, other properties such as the dependence on glycerol or on mannitol and lactate as energy sources, or a less efficient acid stress response, may explain their low competitiveness. The presence of a gene coding for a proton-translocating transhydrogenase in A. ghanensis LMG 23848(T) and the genes involved in two aromatic compound degradation pathways in A. senegalensis 108B indicate that these strains have an extended functionality compared to Acetobacter species isolated from other ecosystems.

  12. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Genomics is the study of genomes, which generates large amounts of data and therefore requires substantial storage and computational power. These needs are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage in the hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing brings to genomics include easy access to and sharing of data, data security, and lower cost for resources; drawbacks remain, however, such as the long time needed to transfer data and limited network bandwidth.

  13. Assessment of genetic diversity, population structure and relationships in Indian and non-Indian genotypes of finger millet (Eleusine coracana (L.) Gaertn) using genomic SSR markers.

    Science.gov (United States)

    Ramakrishnan, M; Antony Ceasar, S; Duraipandiyan, V; Al-Dhabi, N A; Ignacimuthu, S

    2016-01-01

    We evaluated the genetic variation and population structure in Indian and non-Indian genotypes of finger millet using 87 genomic SSR primers. The 128 finger millet genotypes were collected and genomic DNA was isolated. Eighty-seven genomic SSR primers with 60-70 % GC contents were used for PCR analysis of 128 finger millet genotypes. The PCR products were separated and visualized on a 6 % polyacrylamide gel followed by silver staining. The data were used to estimate major allele frequency using Power Marker v3.0. Dendrograms were constructed based on the Jaccard's similarity coefficient. Statistical fitness and population structure analyses were performed to find the genetic diversity. The mean major allele frequency was 0.92; the means of polymorphic alleles were 2.13 per primer and 1.45 per genotype; the average polymorphism was 59.94 % per primer and average PIC value was 0.44 per primer. Indian genotypes produced, on average, 0.21 more alleles than non-Indian genotypes. Gene diversity was in the range from 0.02 to 0.35. The average heterozygosity was 0.11, close to 100 % homozygosity. The highest inbreeding coefficient was observed with SSR marker UGEP67. The Jaccard's similarity coefficient value ranged from 0.011 to 0.836. The highest similarity value was 0.836 between genotypes DPI009-04 and GPU-45. Indian genotypes were placed in Eleusine coracana major cluster (EcMC) 1 along with 6 non-Indian genotypes. AMOVA showed that molecular variance in genotypes from various geographical regions was 4 %; among populations it was 3 % and within populations it was 93 %. PCA scatter plot analysis showed that GPU-28, GPU-45 and DPI009-04 were closely dispersed in the first component axis. In structural analysis, the genotypes were divided into three subpopulations (SP1, SP2 and SP3). All three subpopulations had an admixture of alleles and no pure line was observed. These analyses confirmed that all the genotypes were genetically diverse and had been grouped based on
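Two of the per-marker statistics reported above, gene diversity and PIC, are computed directly from allele frequencies at a locus. A sketch using the standard Botstein et al. PIC formula; the frequencies below are toy values, not from the finger millet dataset:

```python
# Sketch: gene diversity (expected heterozygosity, He) and polymorphism
# information content (PIC) from allele frequencies at one marker.

def gene_diversity(freqs):
    """He = 1 - sum(p_i^2) over allele frequencies p_i."""
    return 1.0 - sum(p * p for p in freqs)

def pic(freqs):
    """Botstein et al. PIC = 1 - sum(p_i^2) - sum_{i<j} 2 p_i^2 p_j^2."""
    homozygosity = sum(p * p for p in freqs)
    cross = sum(2 * (freqs[i] ** 2) * (freqs[j] ** 2)
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1.0 - homozygosity - cross

freqs = [0.5, 0.3, 0.2]  # hypothetical allele frequencies at one SSR locus
print(round(gene_diversity(freqs), 3))  # -> 0.62
print(round(pic(freqs), 4))             # -> 0.5478
```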

  14. Comparative Genome Analysis and Genome Evolution

    NARCIS (Netherlands)

    Snel, Berend

    2002-01-01

    This thesis describes a collection of bioinformatic analyses of complete genome sequence data. We studied the evolution of gene content and found that vertical inheritance dominates over horizontal gene transfer, even to the extent that we can use gene content to build genome phylogenies.

  15. Assessing the genome level diversity of Listeria monocytogenes from contaminated ice cream and environmental samples linked to a listeriosis outbreak in the United States.

    Directory of Open Access Journals (Sweden)

    Yi Chen

    A listeriosis outbreak in the United States implicated contaminated ice cream produced by one company, which operated 3 facilities. We performed single nucleotide polymorphism (SNP)-based whole genome sequencing (WGS) analysis on Listeria monocytogenes from food, environmental and clinical sources, identifying two clusters and a single branch, belonging to PCR serogroup IIb and genetic lineage I. WGS Cluster I, representing one outbreak strain, contained 82 food and environmental isolates from Facility I and 4 clinical isolates. These isolates differed by up to 29 SNPs, exhibited 9 pulsed-field gel electrophoresis (PFGE) profiles and multilocus sequence typing (MLST) sequence type (ST) 5 of clonal complex 5 (CC5). WGS Cluster II contained 51 food and environmental isolates from Facility II, 4 food isolates from Facility I and 5 clinical isolates. Among them the isolates from Facility II and clinical isolates formed a clade and represented another outbreak strain. Isolates in this clade differed by up to 29 SNPs, exhibited 3 PFGE profiles and ST5. The only isolate collected from Facility III belonged to singleton ST489, which was in a single branch separate from Clusters I and II, and was not associated with the outbreak. WGS analyses clustered together outbreak-associated isolates exhibiting multiple PFGE profiles, while differentiating them from epidemiologically unrelated isolates that exhibited outbreak PFGE profiles. The complete genome of a Cluster I isolate allowed the identification and analyses of putative prophages, revealing that Cluster I isolates differed by the gain or loss of three putative prophages, causing the banding pattern differences among all 3 AscI-PFGE profiles observed in Cluster I isolates. WGS data suggested that certain ice cream varieties and/or production lines might have contamination sources unique to them. The SNP-based analysis was able to distinguish CC5 as a group from non-CC5 isolates and differentiate among CC5

  16. Molecular epidemiology of Staphylococcus aureus bacteremia in a single large Minnesota medical center in 2015 as assessed using MLST, core genome MLST and spa typing.

    Directory of Open Access Journals (Sweden)

    Kyung-Hwa Park

    Staphylococcus aureus is a leading cause of bacteremia in hospitalized patients. Whether or not S. aureus bacteremia (SAB) is associated with clonality, implicating potential nosocomial transmission, has not, however, been investigated. Herein, we examined the epidemiology of SAB using whole genome sequencing (WGS). 152 SAB isolates collected over the course of 2015 at a single large Minnesota medical center were studied. Staphylococcus protein A (spa) typing was performed by PCR/Sanger sequencing; multilocus sequence typing (MLST) and core genome MLST (cgMLST) types were determined by WGS. Forty-eight isolates (32%) were methicillin-resistant S. aureus (MRSA). The isolates encompassed 66 spa types, clustered into 11 spa clonal complexes (CCs) and 10 singleton types. 88% of the 48 MRSA isolates belonged to spa CC-002 or -008. Methicillin-susceptible S. aureus (MSSA) isolates were more genotypically diverse, with 61% distributed across four spa CCs (CC-002, CC-012, CC-008 and CC-084). By MLST, there were 31 sequence types (STs), including 18 divided into 6 CCs and 13 singleton STs. Amongst MSSA isolates, the common MLST clones were CC5 (23%), CC30 (19%), CC8 (15%) and CC15 (11%). Common MRSA clones were CC5 (67%) and CC8 (25%); there were no MRSA isolates in CC45 or CC30. By cgMLST analysis, there were 9 allelic differences between two isolates, with the remaining 150 isolates differing from each other by over 40 alleles. The two isolates were retroactively epidemiologically linked by medical record review. Overall, cgMLST analysis resulted in higher-resolution epidemiological typing than did multilocus sequence or spa typing.
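The cgMLST distance used above is simply the number of core-genome loci at which two isolates carry different allele numbers. A minimal sketch with hypothetical locus names and allele calls (real cgMLST schemes compare thousands of loci):

```python
# Sketch: count allelic differences between two cgMLST profiles,
# the distance that flagged the pair of isolates 9 alleles apart.

def allelic_distance(profile_a, profile_b):
    """Count loci with differing allele calls; loci missing from either
    profile are skipped, as is common practice for cgMLST comparisons."""
    shared = set(profile_a) & set(profile_b)
    return sum(1 for locus in shared if profile_a[locus] != profile_b[locus])

# Hypothetical three-locus profiles (locus -> allele number).
iso1 = {"locus1": 4, "locus2": 12, "locus3": 7}
iso2 = {"locus1": 4, "locus2": 15, "locus3": 7}
print(allelic_distance(iso1, iso2))  # -> 1
```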

  17. Assessing the genome level diversity of Listeria monocytogenes from contaminated ice cream and environmental samples linked to a listeriosis outbreak in the United States.

    Science.gov (United States)

    Chen, Yi; Luo, Yan; Curry, Phillip; Timme, Ruth; Melka, David; Doyle, Matthew; Parish, Mickey; Hammack, Thomas S; Allard, Marc W; Brown, Eric W; Strain, Errol A

    2017-01-01

    A listeriosis outbreak in the United States implicated contaminated ice cream produced by one company, which operated 3 facilities. We performed single nucleotide polymorphism (SNP)-based whole genome sequencing (WGS) analysis on Listeria monocytogenes from food, environmental and clinical sources, identifying two clusters and a single branch, belonging to PCR serogroup IIb and genetic lineage I. WGS Cluster I, representing one outbreak strain, contained 82 food and environmental isolates from Facility I and 4 clinical isolates. These isolates differed by up to 29 SNPs, exhibited 9 pulsed-field gel electrophoresis (PFGE) profiles and multilocus sequence typing (MLST) sequence type (ST) 5 of clonal complex 5 (CC5). WGS Cluster II contained 51 food and environmental isolates from Facility II, 4 food isolates from Facility I and 5 clinical isolates. Among them the isolates from Facility II and clinical isolates formed a clade and represented another outbreak strain. Isolates in this clade differed by up to 29 SNPs, exhibited 3 PFGE profiles and ST5. The only isolate collected from Facility III belonged to singleton ST489, which was in a single branch separate from Clusters I and II, and was not associated with the outbreak. WGS analyses clustered together outbreak-associated isolates exhibiting multiple PFGE profiles, while differentiating them from epidemiologically unrelated isolates that exhibited outbreak PFGE profiles. The complete genome of a Cluster I isolate allowed the identification and analyses of putative prophages, revealing that Cluster I isolates differed by the gain or loss of three putative prophages, causing the banding pattern differences among all 3 AscI-PFGE profiles observed in Cluster I isolates. WGS data suggested that certain ice cream varieties and/or production lines might have contamination sources unique to them. The SNP-based analysis was able to distinguish CC5 as a group from non-CC5 isolates and differentiate among CC5 isolates from
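Grouping isolates into WGS clusters as described above can be sketched as single-linkage clustering over pairwise SNP distances: any pair closer than a threshold joins the same cluster. The threshold and the 0/1 SNP profiles below are illustrative toy data, not the outbreak pipeline actually used:

```python
# Sketch: single-linkage SNP clustering via union-find.

def snp_distance(a, b):
    """Hamming distance between two equal-length SNP call vectors."""
    return sum(x != y for x, y in zip(a, b))

def single_linkage_clusters(profiles, threshold):
    """Merge any pair of isolates within `threshold` SNPs of each other."""
    parent = list(range(len(profiles)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(profiles)):
        for j in range(i + 1, len(profiles)):
            if snp_distance(profiles[i], profiles[j]) <= threshold:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(len(profiles)):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

profiles = [
    [0, 0, 1, 0],  # isolate 0
    [0, 1, 1, 0],  # isolate 1: 1 SNP from isolate 0
    [1, 1, 0, 1],  # isolate 2: distant from both
]
print(single_linkage_clusters(profiles, threshold=2))  # -> [[0, 1], [2]]
```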

  18. Genomic Data Commons launches

    Science.gov (United States)

    The Genomic Data Commons (GDC), a unified data system that promotes sharing of genomic and clinical data between researchers, launched today with a visit from Vice President Joe Biden to the operations center at the University of Chicago.

  19. Rat Genome Database (RGD)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Rat Genome Database (RGD) is a collaborative effort between leading research institutions involved in rat genetic and genomic research to collect, consolidate,...

  20. Visualization for genomics: the Microbial Genome Viewer.

    Science.gov (United States)

    Kerkhoven, Robert; van Enckevort, Frank H J; Boekhorst, Jos; Molenaar, Douwe; Siezen, Roland J

    2004-07-22

    A Web-based visualization tool, the Microbial Genome Viewer, is presented that allows the user to combine complex genomic data in a highly interactive way. This Web tool enables the interactive generation of chromosome wheels and linear genome maps from genome annotation data stored in a MySQL database. The generated images are in scalable vector graphics (SVG) format, which is suitable for creating high-quality scalable images and dynamic Web representations. Gene-related data such as transcriptome and time-course microarray experiments can be superimposed on the maps for visual inspection. The Microbial Genome Viewer 1.0 is freely available at http://www.cmbi.kun.nl/MGV
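Since the viewer emits scalable vector graphics, a linear genome map is ultimately just an SVG string with a backbone line and one rectangle per gene. A toy generator under that assumption (gene names and coordinates are hypothetical; the real tool renders tracks from MySQL-stored annotation):

```python
# Sketch: render a minimal linear genome map as an SVG string.

def linear_map_svg(genome_length, genes, width=500, height=40):
    """genes: list of (name, start, end) tuples in genome coordinates."""
    scale = width / genome_length
    parts = [
        f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">',
        f'<line x1="0" y1="20" x2="{width}" y2="20" stroke="black"/>',  # backbone
    ]
    for name, start, end in genes:
        x, w = start * scale, (end - start) * scale
        parts.append(
            f'<rect x="{x:.1f}" y="12" width="{w:.1f}" height="16" fill="steelblue">'
            f'<title>{name}</title></rect>'  # <title> gives a hover tooltip
        )
    parts.append('</svg>')
    return "".join(parts)

# Hypothetical 10 kb genome with two genes.
svg = linear_map_svg(10000, [("geneA", 500, 1500), ("geneB", 4000, 6500)])
print(svg[:60])
```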

  1. Baseline frequency of chromosomal aberrations and sister chromatid exchanges in peripheral blood lymphocytes of healthy individuals living in Turin (North-Western Italy): assessment of the effects of age, sex and GSTs gene polymorphisms on the levels of genomic damage.

    Science.gov (United States)

    Santovito, Alfredo; Cervella, Piero; Delpero, Massimiliano

    2016-05-01

    The increased exposure to environmental pollutants has led to awareness of the need for constant monitoring of human populations, especially those living in urban areas. This study evaluated the background levels of genomic damage in a sample of healthy subjects living in the urban area of Turin (Italy). The association of DNA damage with age, sex and GSTs polymorphisms was assessed. One hundred and one individuals were randomly sampled. Sister Chromatid Exchange (SCE) and Chromosomal Aberration (CA) assays, as well as genotyping of the GSTT1 and GSTM1 genes, were performed. Mean values of SCEs and CAs were 5.137 ± 0.166 and 0.018 ± 0.002, respectively. Results showed that age and gender were associated with higher frequencies of these two cytogenetic markers. The eldest subjects (51-65 years) showed significantly higher levels of genomic damage than younger individuals. GSTs polymorphisms did not appear to significantly influence the frequencies of either marker. The CAs background frequency observed in this study is one of the highest reported among European populations. Turin is one of the most polluted cities in Europe in terms of airborne fine particulate matter (PM10) and ozone, and the clastogenic potential of these pollutants may explain the high frequencies of chromosomal rearrangements reported here.

  2. Genomic prediction using subsampling

    OpenAIRE

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-01-01

    Background Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in a Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chains in genomic prediction. This method consists of fitting whole-genome regression models by subsampling observations in each rou...

  3. Translational genomics

    Directory of Open Access Journals (Sweden)

    Martin Kussmann

    2014-09-01

    The term “Translational Genomics” reflects both the title and the mission of this new journal. “Translational” has traditionally been understood as “applied research” or “development”, different from or even opposed to “basic research”. Recent scientific and societal developments have triggered a re-assessment of the connotation that “translational” and “basic” are either/or activities: translational research nowadays aims at feeding the best science into applications and solutions for human society. We therefore argue that basic science should be challenged and leveraged for its relevance to human health and societal benefits. This more recent approach and attitude are catalyzed by four trends or developments: evidence-based solutions; large-scale, high-dimensional data; consumer/patient empowerment; and systems-level understanding.

  4. Aligning the unalignable: bacteriophage whole genome alignments.

    Science.gov (United States)

    Bérard, Sèverine; Chateau, Annie; Pompidor, Nicolas; Guertin, Paul; Bergeron, Anne; Swenson, Krister M

    2016-01-13

    In recent years, many studies have focused on the description and comparison of large sets of related bacteriophage genomes. Due to the peculiar mosaic structure of these genomes, few informative approaches for comparing whole genomes exist: dot plot diagrams give a mostly qualitative assessment of the similarity/dissimilarity between two or more genomes, and clustering techniques are used to classify genomes. Multiple alignments are conspicuously absent from this scene. Indeed, whole genome aligners interpret lack of similarity between sequences as an indication of rearrangements, insertions, or losses. This behavior makes them ill-prepared to align bacteriophage genomes, where even closely related strains can accomplish the same biological function with highly dissimilar sequences. In this paper, we propose a multiple alignment strategy that exploits functional collinearity shared by related strains of bacteriophages, and uses partial orders to capture the mosaicism of sets of genomes. As classical alignments do, the computed alignments can be used to predict that genes have the same biological function, even in the absence of detectable similarity. The Alpha aligner implements these ideas in visual interactive displays, and is used to compute several examples of alignments of Staphylococcus aureus and Mycobacterium bacteriophages, involving up to 29 genomes. Using these datasets, we show that Alpha alignments are at least as good as those computed by standard aligners. Comparison with the progressive Mauve aligner - which implements a partial order strategy, but whose alignments are linearized - shows a greatly improved interactive graphic display, while avoiding misalignments. Multiple alignments of whole bacteriophage genomes work, and will become an important conceptual and visual tool in comparative genomics of sets of related strains.
A python implementation of Alpha, along with installation instructions for Ubuntu and OSX, is available on bitbucket (https://bitbucket.org/thekswenson/alpha).

  5. Ebolavirus comparative genomics

    DEFF Research Database (Denmark)

    Jun, Se-Ran; Leuze, Michael R.; Nookaew, Intawat

    2015-01-01

    The 2014 Ebola outbreak in West Africa is the largest documented for this virus. To examine the dynamics of this genome, we compare more than 100 currently available ebolavirus genomes to each other and to other viral genomes. Based on oligomer frequency analysis, the family Filoviridae forms...
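    Oligomer frequency analysis of the kind mentioned above compares genomes by their k-mer composition rather than by alignment. A minimal sketch on toy sequences (not actual ebolavirus data):

```python
from collections import Counter
import math

def kmer_profile(seq, k=4):
    """Normalized k-mer (oligomer) frequency vector for a DNA sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def profile_distance(p, q):
    """Euclidean distance between two k-mer profiles."""
    keys = set(p) | set(q)
    return math.sqrt(sum((p.get(x, 0.0) - q.get(x, 0.0)) ** 2 for x in keys))

# Toy genomes: b is a near neighbor of a; c is compositionally distinct.
a = "ATGCGATACGCTTGCGATCGAAGT" * 40
b = "ATGCGATACGCTAGCGATCGAAGT" * 40
c = "GGGGCCCCAAAATTTTGGCCAATT" * 40

d_ab = profile_distance(kmer_profile(a), kmer_profile(b))
d_ac = profile_distance(kmer_profile(a), kmer_profile(c))
print(d_ab < d_ac)   # near neighbors have more similar oligomer profiles
```

    Because no alignment is required, such profiles can place a family like Filoviridae relative to other viral genomes even when sequences are too divergent to align.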

  6. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    Energy Technology Data Exchange (ETDEWEB)

    Stroud, Mary Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Prime, Michael Bruce [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Veirs, Douglas Kirk [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Berg, John M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Clausen, Bjorn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Worl, Laura Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); DeWald, Adrian T. [Hill Engineering, LLC, Rancho Cordova, CA (United States)

    2015-12-08

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. It discusses residual stresses in the 3013 outer container, the SRS/Hanford and RFETS/LLNL inner containers, and the teardrop samples used in studies to assess the potential for stress corrosion cracking (SCC) in 3013 containers. Residual tensile stresses in the heat-affected zones of the closure welds are of particular concern.

  7. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    International Nuclear Information System (INIS)

    Stroud, Mary Ann; Prime, Michael Bruce; Veirs, Douglas Kirk; Berg, John M.; Clausen, Bjorn; Worl, Laura Ann; DeWald, Adrian T.

    2015-01-01

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. It discusses residual stresses in the 3013 outer container, the SRS/Hanford and RFETS/LLNL inner containers, and the teardrop samples used in studies to assess the potential for stress corrosion cracking (SCC) in 3013 containers. Residual tensile stresses in the heat-affected zones of the closure welds are of particular concern.

  8. Genomics-assisted breeding in fruit trees.

    Science.gov (United States)

    Iwata, Hiroyoshi; Minamikawa, Mai F; Kajiya-Kanegae, Hiromi; Ishimori, Motoyuki; Hayashi, Takeshi

    2016-01-01

    Recent advancements in genomic analysis technologies have opened up new avenues to promote the efficiency of plant breeding. Novel genomics-based approaches for plant breeding and genetics research, such as genome-wide association studies (GWAS) and genomic selection (GS), are useful, especially in fruit tree breeding. The breeding of fruit trees is hindered by their long generation time, large plant size, long juvenile phase, and the necessity to wait for the physiological maturity of the plant to assess the marketable product (fruit). In this article, we describe the potential of genomics-assisted breeding, which uses these novel genomics-based approaches, to break through these barriers in conventional fruit tree breeding. We first introduce the molecular marker systems and whole-genome sequence data that are available for fruit tree breeding. Next we introduce the statistical methods for biparental linkage and quantitative trait locus (QTL) mapping as well as GWAS and GS. We then review QTL mapping, GWAS, and GS studies conducted on fruit trees. We also review novel technologies for rapid generation advancement. Finally, we note the future prospects of genomics-assisted fruit tree breeding and the problems that need to be overcome in breeding.

  9. Modifications to LLNL Plutonium Packaging Systems (PuPS) to achieve ASME VIII UW-13.2(d) Requirements for the DOE Standard 3013-00 Outer Can Weld

    International Nuclear Information System (INIS)

    Riley, D; Dodson, K

    2001-01-01

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Packaging System (PuPS) prepares packages to meet DOE Standard 3013 (Reference 1). The PuPS equipment was supplied by British Nuclear Fuels Limited (BNFL). DOE Standard 3013 requires that the welding of the Outer Can meet ASME Section VIII Division 1 (Reference 2). ASME Section VIII refers to ASME Section IX (Reference 3) for most of the welding requirements, but UW-13.2(d) of Section VIII additionally requires a certain depth and width of the weld. In this document the UW-13.2(d) requirement is expressed as the (a+b)/2t_s ratio, which must be greater than or equal to one to meet UW-13.2(d). The Outer Can welds had not been meeting this requirement. Three methods are being pursued to resolve this issue: (1) modify the welding parameters to achieve the requirement, (2) submit a weld case to ASME for review and approval that changes the UW-13.2(d) requirement, and (3) change the requirements in DOE-STD-3013. This report addresses how the first method was applied to the LLNL PuPS. The experimental work involved adjusting the Outer Can rotational speed and the power applied to the can. These adjustments made it possible to achieve the ASME VIII UW-13.2(d) requirement.
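    The acceptance criterion described above reduces to a single ratio check. A minimal sketch, where a, b, and t_s follow the report's shorthand for the weld dimensions and section thickness, and the numeric values are purely illustrative rather than actual weld measurements:

```python
def uw_13_2d_ratio(a_mm: float, b_mm: float, t_s_mm: float) -> float:
    """(a + b) / (2 * t_s): the report's shorthand for the UW-13.2(d)
    weld-dimension check; the ratio must be >= 1.0 to comply."""
    return (a_mm + b_mm) / (2.0 * t_s_mm)

def weld_acceptable(a_mm: float, b_mm: float, t_s_mm: float) -> bool:
    return uw_13_2d_ratio(a_mm, b_mm, t_s_mm) >= 1.0

# Illustrative measurements in millimetres (hypothetical values):
print(weld_acceptable(0.6, 0.5, 0.5))   # ratio 1.1 -> True (compliant)
print(weld_acceptable(0.4, 0.4, 0.5))   # ratio 0.8 -> False (rejected)
```

    In practice the inputs would come from metallographic cross-sections of sample welds, so the check is applied per qualification run rather than per production can.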

  10. The Sequenced Angiosperm Genomes and Genome Databases.

    Science.gov (United States)

    Chen, Fei; Dong, Wei; Zhang, Jiawei; Guo, Xinyue; Chen, Junhao; Wang, Zhengjia; Lin, Zhenguo; Tang, Haibao; Zhang, Liangsheng

    2018-01-01

    Angiosperms, the flowering plants, provide resources essential for human life, such as food, energy, oxygen, and materials, and they have shaped the evolution of humans, animals, and the planet itself. Despite numerous genome reports and advances in sequencing technologies, no review has covered all the released angiosperm genomes and the genome databases available for data sharing. Based on the rapid advances and innovations in database construction in the last few years, here we provide a comprehensive review of three major types of angiosperm genome databases: databases for a single species, for a specific angiosperm clade, and for multiple angiosperm species. The scope, tools, and data of each type of database and their features are concisely discussed. Genome databases for a single species or a clade of species are especially popular with specific groups of researchers, while a regularly updated comprehensive database is more powerful for addressing major scientific questions at the genome scale. Considering the low coverage of flowering plants in any available database, we propose the construction of a comprehensive database to facilitate large-scale comparative studies of angiosperm genomes and to promote collaborative studies of important questions in plant biology.

  11. Multiplexed precision genome editing with trackable genomic barcodes in yeast.

    Science.gov (United States)

    Roy, Kevin R; Smith, Justin D; Vonesch, Sibylle C; Lin, Gen; Tu, Chelsea Szu; Lederer, Alex R; Chu, Angela; Suresh, Sundari; Nguyen, Michelle; Horecka, Joe; Tripathi, Ashutosh; Burnett, Wallace T; Morgan, Maddison A; Schulz, Julia; Orsley, Kevin M; Wei, Wu; Aiyar, Raeka S; Davis, Ronald W; Bankaitis, Vytas A; Haber, James E; Salit, Marc L; St Onge, Robert P; Steinmetz, Lars M

    2018-07-01

    Our understanding of how genotype controls phenotype is limited by the scale at which we can precisely alter the genome and assess the phenotypic consequences of each perturbation. Here we describe a CRISPR-Cas9-based method for multiplexed accurate genome editing with short, trackable, integrated cellular barcodes (MAGESTIC) in Saccharomyces cerevisiae. MAGESTIC uses array-synthesized guide-donor oligos for plasmid-based high-throughput editing and features genomic barcode integration to prevent plasmid barcode loss and to enable robust phenotyping. We demonstrate that editing efficiency can be increased more than fivefold by recruiting donor DNA to the site of breaks using the LexA-Fkh1p fusion protein. We performed saturation editing of the essential gene SEC14 and identified amino acids critical for chemical inhibition of lipid signaling. We also constructed thousands of natural genetic variants, characterized guide mismatch tolerance at the genome scale, and ascertained that cryptic Pol III termination elements substantially reduce guide efficacy. MAGESTIC will be broadly useful to uncover the genetic basis of phenotypes in yeast.

  12. First step in using molecular data for microbial food safety risk assessment; hazard identification of Escherichia coli O157:H7 by coupling genomic data with in vitro adherence to human epithelial cells.

    Science.gov (United States)

    Pielaat, Annemarie; Boer, Martin P; Wijnands, Lucas M; van Hoek, Angela H A M; Bouw, El; Barker, Gary C; Teunis, Peter F M; Aarts, Henk J M; Franz, Eelco

    2015-11-20

    The potential for using whole genome sequencing (WGS) data in microbiological risk assessment (MRA) has been discussed on several occasions since the beginning of this century. Still, the proposed heuristic approaches have never been applied in a practical framework. This is due to the non-trivial problem of mapping microbial information consisting of thousands of loci onto a probabilistic scale for risks. The paradigm change for MRA involves translation of multidimensional microbial genotypic information to much reduced (integrated) phenotypic information and onwards to a single measure of human risk (i.e. probability of illness). In this paper a first approach in methodology development is described for the application of WGS data in MRA; this is supported by a practical example. That is, combining genetic data (single nucleotide polymorphisms; SNPs) for Shiga toxin-producing Escherichia coli (STEC) O157 with phenotypic data (in vitro adherence to epithelial cells as a proxy for virulence) leads to hazard identification in a Genome Wide Association Study (GWAS). This application revealed practical implications when using SNP data for MRA. These can be summarized by considering the following main issues: optimum sample size for valid inference on population level, correction for population structure, quantification and calibration of results, reproducibility of the analysis, links with epidemiological data, anchoring and integration of results into a systems biology approach for the translation of molecular studies to human health risk. Future developments in genetic data analysis for MRA should aim at resolving the mapping problem of processing genetic sequences to come to a quantitative description of risk. The development of a clustering scheme focusing on biologically relevant information of the microbe involved would be a useful approach in molecular data reduction for risk assessment. Copyright © 2015. Published by Elsevier B.V.
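    The hazard-identification step described above couples a genetic marker (SNP presence) with a phenotype (in vitro adherence). A minimal stand-in for the per-SNP association test is a 2x2 chi-square statistic; the isolate counts below are hypothetical, and the study's actual GWAS additionally corrects for population structure:

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    e.g. SNP allele present/absent (rows) vs. high/low in-vitro
    adherence to epithelial cells (columns)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical isolate counts for two SNPs:
snp_associated = chi2_2x2(18, 2, 5, 15)    # allele enriched in high-adherence isolates
snp_unassociated = chi2_2x2(10, 10, 9, 11) # roughly even split across phenotypes
print(snp_associated > snp_unassociated)
```

    Ranking thousands of SNPs by such a statistic (with multiple-testing correction) is what turns whole-genome data into a short list of candidate hazard markers for the risk assessment.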

  13. Gene conversion in the rice genome

    DEFF Research Database (Denmark)

    Xu, Shuqing; Clark, Terry; Zheng, Hongkun

    2008-01-01

    -chromosomal conversions distributed between chromosome 1 and 5, 2 and 6, and 3 and 5 are more frequent than genome average (Z-test, P ... is not tightly linked to natural selection in the rice genome. To assess the contribution of segmental duplication on gene conversion statistics, we determined locations of conversion partners with respect to inter-chromosomal segment duplication. The number of conversions associated with segmentation is less...... involved in conversion events. CONCLUSION: The evolution of gene families in the rice genome may have been accelerated by conversion with pseudogenes. Our analysis suggests a possible role for gene conversion in the evolution of pathogen-response genes....
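    The Z-test comparison above, an observed conversion frequency against the genome-wide average, can be sketched as a one-sample test of proportion. The counts and the 3% reference rate below are hypothetical, not figures from the study:

```python
import math

def proportion_z(successes: int, trials: int, p0: float) -> float:
    """Z statistic for an observed proportion vs. a reference proportion p0,
    as in testing whether conversions between a chromosome pair exceed the
    genome-average conversion rate."""
    p_hat = successes / trials
    se = math.sqrt(p0 * (1.0 - p0) / trials)
    return (p_hat - p0) / se

# Hypothetical: 30 conversion events among 500 gene pairs on one chromosome
# pair, tested against an assumed genome-average rate of 3%.
z = proportion_z(30, 500, 0.03)
print(round(z, 2))   # large positive z -> frequency above genome average
```

    A z value well above ~1.96 corresponds to P < 0.05 (two-sided), matching the kind of inter-chromosomal enrichment the abstract reports.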

  14. Genome-wide analysis of basic/helix-loop-helix gene family in peanut and assessment of its roles in pod development.

    Directory of Open Access Journals (Sweden)

    Chao Gao

    Full Text Available The basic/helix-loop-helix (bHLH) proteins constitute a superfamily of transcription factors known to play a range of regulatory roles in eukaryotes. Over the past few decades, many bHLH family genes have been well characterized in model plants such as Arabidopsis, rice, and tomato. However, the bHLH protein family in peanut has not yet been systematically identified and characterized. Here, 132 and 129 bHLH proteins were identified from the two wild ancestral diploid subgenomes of cultivated tetraploid peanut, Arachis duranensis (AA) and Arachis ipaensis (BB), respectively. Phylogenetic analysis indicated that these bHLHs could be classified into 19 subfamilies. Distribution mapping showed that peanut bHLH genes were randomly and unevenly distributed across the 10 AA chromosomes and 10 BB chromosomes. In addition, 120 bHLH gene pairs between the AA subgenome and BB subgenome were found to be orthologous, and 101 of these pairs were highly syntenic in the AA and BB chromosomes. Furthermore, we confirmed that 184 bHLH genes were expressed in different tissues, 22 of which exhibited tissue-specific expression. Meanwhile, we identified 61 bHLH genes potentially involved in peanut-specific subterranean pod development. Our comprehensive genomic analysis provides a foundation for future functional dissection and understanding of the regulatory mechanisms of bHLH transcription factors in peanuts.

  15. Closing the gap between knowledge and clinical application: challenges for genomic translation.

    Science.gov (United States)

    Burke, Wylie; Korngiebel, Diane M

    2015-01-01

    Despite early predictions and rapid progress in research, the introduction of personal genomics into clinical practice has been slow. Several factors contribute to this translational gap between knowledge and clinical application. The evidence available to support genetic test use is often limited, and implementation of new testing programs can be challenging. In addition, the heterogeneity of genomic risk information points to the need for strategies to select and deliver the information most appropriate for particular clinical needs. Accomplishing these tasks also requires recognition that some expectations for personal genomics are unrealistic, notably expectations concerning the clinical utility of genomic risk assessment for common complex diseases. Efforts are needed to improve the body of evidence addressing clinical outcomes for genomics, apply implementation science to personal genomics, and develop realistic goals for genomic risk assessment. In addition, translational research should emphasize the broader benefits of genomic knowledge, including applications of genomic research that provide clinical benefit outside the context of personal genomic risk.

  16. Selection of assessment methods for evaluating banana weevil Cosmopolites sordidus (Coleoptera: Curculionidae) damage on highland cooking banana (Musa spp., genome group AAA-EA).

    Science.gov (United States)

    Gold, C S; Ragama, P E; Coe, R; Rukazambuga, N D T M

    2005-04-01

    Cosmopolites sordidus (Germar) is an important pest on bananas and plantains. Population build-up is slow and damage becomes increasingly important in successive crop cycles (ratoons). Yield loss results from plant loss, mat disappearance and reduced bunch size. Damage assessment requires destructive sampling and is most often done on corms of recently harvested plants. A wide range of damage assessment methods exist and there are no agreed protocols. It is critical to know what types of damage best reflect C. sordidus pest status through their relationships with yield loss. Multiple damage assessment parameters (i.e. for the corm periphery, cortex and central cylinder) were employed in two yield loss trials and a cultivar-screening trial in Uganda. Damage to the central cylinder had a greater effect on plant size and yield loss than damage to the cortex or corm periphery. In some cases, a combined assessment of damage to the central cylinder and cortex showed a better relationship with yield loss than an assessment of the central cylinder alone. Correlation, logistic and linear regression analyses showed weak to modest correlations between damage to the corm periphery and damage to the central cylinder. Thus, damage to the corm periphery is not a strong predictor of the more important damage to the central cylinder. Therefore, C. sordidus damage assessment should target the central cylinder and cortex.

  17. A plant pathology perspective of fungal genome sequencing.

    Science.gov (United States)

    Aylward, Janneke; Steenkamp, Emma T; Dreyer, Léanne L; Roets, Francois; Wingfield, Brenda D; Wingfield, Michael J

    2017-06-01

    The majority of plant pathogens are fungi and many of these adversely affect food security. This mini-review aims to provide an analysis of the plant pathogenic fungi for which genome sequences are publicly available, to assess their general genome characteristics, and to consider how genomics has impacted plant pathology. A list of sequenced fungal species was assembled, the taxonomy of all species verified, and the potential reason for sequencing each of the species considered. The genomes of 1090 fungal species are currently (October 2016) in the public domain and this number is rapidly rising. Pathogenic species comprised the largest category (35.5 %) and, amongst these, plant pathogens are predominant. Of the 191 plant pathogenic fungal species with available genomes, 61.3 % cause diseases on food crops, more than half of which are staple crops. The genomes of plant pathogens are slightly larger than those of other fungal species sequenced to date and they contain fewer coding sequences in relation to their genome size. Both of these factors can be attributed to the expansion of repeat elements. Sequenced genomes of plant pathogens provide blueprints from which potential virulence factors were identified and from which genes associated with different pathogenic strategies could be predicted. Genome sequences have also made it possible to evaluate the adaptability of pathogen genomes and the genomic regions that experience selection pressure. Some genomic patterns, however, remain poorly understood, and plant pathogen genomes alone are not sufficient to unravel complex pathogen-host interactions. Genomes, therefore, cannot replace experimental studies, which can be complex and tedious. Ultimately, the most promising application lies in using fungal plant pathogen genomics to inform disease management and risk assessment strategies. This will ultimately minimize the risks of future disease outbreaks and assist in preparation for emerging pathogen outbreaks.

  18. Mining genome sequencing data to identify the genomic features linked to breast cancer histopathology

    Science.gov (United States)

    Ping, Zheng; Siegal, Gene P.; Almeida, Jonas S.; Schnitt, Stuart J.; Shen, Dejun

    2014-01-01

    Background: Genetics and genomics have radically altered our understanding of breast cancer progression. However, the genomic basis of various histopathologic features of breast cancer is not yet well-defined. Materials and Methods: The Cancer Genome Atlas (TCGA) is an international database containing a large collection of human cancer genome sequencing data. cBioPortal is a web tool developed for mining these sequencing data. We performed mining of TCGA sequencing data in an attempt to characterize the genomic features correlated with breast cancer histopathology. We first assessed the quality of the TCGA data using a group of genes with known alterations in various cancers. Both genome-wide gene mutation and copy number changes as well as a group of genes with a high frequency of genetic changes were then correlated with various histopathologic features of invasive breast cancer. Results: Validation of TCGA data using a group of genes with known alterations in breast cancer suggests that the TCGA has accurately documented the genomic abnormalities of multiple malignancies. Further analysis of TCGA breast cancer sequencing data shows that accumulation of specific genomic defects is associated with higher tumor grade, larger tumor size and receptor negativity. Distinct groups of genomic changes were found to be associated with the different grades of invasive ductal carcinoma. The mutator role of the TP53 gene was validated by genomic sequencing data of invasive breast cancer and TP53 mutation was found to play a critical role in defining high tumor grade. Conclusions: Data mining of the TCGA genome sequencing data is an innovative and reliable method to help characterize the genomic abnormalities associated with histopathologic features of invasive breast cancer. PMID:24672738

  19. Mining genome sequencing data to identify the genomic features linked to breast cancer histopathology

    Directory of Open Access Journals (Sweden)

    Zheng Ping

    2014-01-01

    Full Text Available Background: Genetics and genomics have radically altered our understanding of breast cancer progression. However, the genomic basis of various histopathologic features of breast cancer is not yet well-defined. Materials and Methods: The Cancer Genome Atlas (TCGA) is an international database containing a large collection of human cancer genome sequencing data. cBioPortal is a web tool developed for mining these sequencing data. We performed mining of TCGA sequencing data in an attempt to characterize the genomic features correlated with breast cancer histopathology. We first assessed the quality of the TCGA data using a group of genes with known alterations in various cancers. Both genome-wide gene mutation and copy number changes as well as a group of genes with a high frequency of genetic changes were then correlated with various histopathologic features of invasive breast cancer. Results: Validation of TCGA data using a group of genes with known alterations in breast cancer suggests that the TCGA has accurately documented the genomic abnormalities of multiple malignancies. Further analysis of TCGA breast cancer sequencing data shows that accumulation of specific genomic defects is associated with higher tumor grade, larger tumor size and receptor negativity. Distinct groups of genomic changes were found to be associated with the different grades of invasive ductal carcinoma. The mutator role of the TP53 gene was validated by genomic sequencing data of invasive breast cancer and TP53 mutation was found to play a critical role in defining high tumor grade. Conclusions: Data mining of the TCGA genome sequencing data is an innovative and reliable method to help characterize the genomic abnormalities associated with histopathologic features of invasive breast cancer.

  20. Historic Context and Building Assessments for the Lawrence Livermore National Laboratory Built Environment

    Energy Technology Data Exchange (ETDEWEB)

    Ullrich, R. A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sullivan, M. A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2007-09-14

    This document was prepared to support U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) compliance with Sections 106 and 110 of the National Historic Preservation Act (NHPA). Lawrence Livermore National Laboratory (LLNL) is a DOE/NNSA laboratory and is engaged in determining the historic status of its properties at both its main site in Livermore, California, and Site 300, its test site located eleven miles from the main site. LLNL contracted with the authors via Sandia National Laboratories (SNL) to prepare a historic context statement for properties at both sites and to provide assessments of those properties of potential historic interest. The report contains an extensive historic context statement and the assessments of individual properties and groups of properties determined, via criteria established in the context statement, to be of potential interest. The historic context statement addresses the four contexts within which LLNL falls: Local History, World War II (WWII) History, Cold War History, and Post-Cold War History. Appropriate historic preservation themes relevant to LLNL's history are delineated within each context. In addition, thresholds are identified for historic significance within each of the contexts based on the explication and understanding of the Secretary of the Interior's Guidelines for determining eligibility for the National Register of Historic Places. The report identifies specific research areas and events in LLNL's history that are of interest and the portions of the built environment in which they occurred. Based on that discussion, properties of potential interest are identified and assessments of them are provided. Twenty individual buildings and three areas of potential historic interest were assessed. The final recommendation is that, of these, LLNL has five individual historic buildings, two sets of historic objects, and two historic districts eligible for the National Register. All are

  1. Genome-wide characterization of the MADS-box gene family in radish (Raphanus sativus L.) and assessment of its roles in flowering and floral organogenesis

    Directory of Open Access Journals (Sweden)

    Chao Li

    2016-09-01

    Full Text Available The MADS-box gene family is an important transcription factor (TF) family involved in various aspects of plant growth and development, especially flowering time and floral organogenesis. Although it has been reported in many plant species, systematic identification and characterization of the MADS-box TF family is still limited in radish (Raphanus sativus L.). In the present study, a comprehensive analysis of MADS-box genes was performed, and a total of 144 MADS-box family members were identified from the whole radish genome. Meanwhile, a detailed list of MADS-box genes from 28 other plant species was also compiled. Through phylogenetic analysis between radish and Arabidopsis thaliana, all the RsMADS genes were classified into two groups: 68 type I (31 Mα, 12 Mβ, and 25 Mγ) and 76 type II (70 MIKCC and 6 MIKC*). Among them, 41 (28.47%) RsMADS genes were located in nine linkage groups of radish, from R1 to R9. Moreover, homologous MADS-box gene pairs were identified among radish, A. thaliana, Chinese cabbage, and rice. Additionally, the expression profiles of RsMADS genes were systematically investigated in different tissues and growth stages. Furthermore, quantitative real-time PCR analysis was employed to validate the expression patterns of some crucial RsMADS genes. These results could provide a valuable resource for exploring the potential functions of RsMADS genes in radish, and facilitate dissecting the MADS-box gene-mediated molecular mechanisms underlying flowering and floral organogenesis in root vegetable crops.

  2. Genomic comparison of Escherichia coli serotype O103:H2 isolates with and without verotoxin genes: implications for risk assessment of strains commonly found in ruminant reservoirs

    Directory of Open Access Journals (Sweden)

    Robert Söderlund

    2016-02-01

    Full Text Available Introduction: Escherichia coli O103:H2 occurs as verotoxigenic E. coli (VTEC) carrying only vtx1 or vtx2 or both variants, but also as vtx-negative atypical enteropathogenic E. coli (aEPEC). The majority of E. coli O103:H2 identified from cases of human disease are of the VTEC form. If aEPEC strains frequently acquire verotoxin genes and become VTEC, they must be considered a significant public health concern. In this study, we have characterized and compared aEPEC and VTEC isolates of E. coli O103:H2 from Swedish cattle. Methods: Fourteen isolates of E. coli O103:H2 with and without verotoxin genes were collected from samples of cattle feces taken during a nationwide cattle prevalence study in 2011-2012. Isolates were sequenced with a 2×100 bp setup on a HiSeq2500 instrument, producing >100× coverage per isolate. Single-nucleotide polymorphism (SNP) typing was performed using the Genome Analysis Toolkit (GATK). Virulence genes and other regions of interest were detected. Susceptibility to transduction by two verotoxin-encoding phages was investigated for one representative aEPEC O103:H2 isolate. Results and Discussion: This study shows that aEPEC O103:H2 is more commonly found (64%) than VTEC O103:H2 (36%) in the Swedish cattle reservoir. The only verotoxin gene variant identified was vtx1a. Phylogenetic comparison by SNP analysis indicates that while certain subgroups of aEPEC and VTEC are closely related and have otherwise near-identical virulence gene repertoires, they belong to separate lineages. This indicates that the uptake or loss of verotoxin genes is a rare event in the natural cattle environment of these bacteria. However, a representative of a VTEC-like aEPEC O103:H2 subgroup could be stably lysogenized by a vtx-encoding phage in vitro.

  3. Use of genomic data in risk assessment case study: I. Evaluation of the dibutyl phthalate male reproductive development toxicity data set

    Energy Technology Data Exchange (ETDEWEB)

    Makris, Susan L., E-mail: makris.susan@epa.gov [U.S. Environmental Protection Agency, National Center for Environmental Assessment, Office of Research and Development, (Mail code 8623P), 1200 Pennsylvania Ave., NW, Washington, DC 20460 (United States); Euling, Susan Y. [U.S. Environmental Protection Agency, National Center for Environmental Assessment, Office of Research and Development, (Mail code 8623P), 1200 Pennsylvania Ave., NW, Washington, DC 20460 (United States); Gray, L. Earl [U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Office of Research and Development, (MD-72), Highway 54, Research Triangle Park, NC 27711 (United States); Benson, Robert [U.S. Environmental Protection Agency, Region 8, (Mail code 8P-W), 1595 Wynkoop Street, Denver, CO 80202 (United States); Foster, Paul M.D. [National Toxicology Program, National Institute of Environmental Health Sciences, P.O. Box 12233 (MD K2-12), Research Triangle Park, NC 27709 (United States)

    2013-09-15

    A case study was conducted, using dibutyl phthalate (DBP), to explore an approach to using toxicogenomic data in risk assessment. The toxicity and toxicogenomic data sets relative to DBP-related male reproductive developmental outcomes were considered conjointly to derive information about mode and mechanism of action. In this manuscript, we describe the case study evaluation of the toxicological database for DBP, focusing on identifying the full spectrum of male reproductive developmental effects. The data were assessed to 1) evaluate low dose and low incidence findings and 2) identify male reproductive toxicity endpoints without well-established modes of action (MOAs). These efforts led to the characterization of data gaps and research needs for the toxicity and toxicogenomic studies in a risk assessment context. Further, the identification of endpoints with unexplained MOAs in the toxicity data set was useful in the subsequent evaluation of the mechanistic information that the toxicogenomic data set evaluation could provide. The extensive analysis of the toxicology data set within the MOA context provided a resource of information for DBP in attempts to hypothesize MOAs (for endpoints without a well-established MOA) and to phenotypically anchor toxicogenomic and other mechanistic data both to toxicity endpoints and to available toxicogenomic data. This case study serves as an example of the steps that can be taken to develop a toxicological data source for a risk assessment, both in general and especially for risk assessments that include toxicogenomic data.

  4. Life-Course Genome-wide Association Study Meta-analysis of Total Body BMD and Assessment of Age-Specific Effects

    NARCIS (Netherlands)

    Medina-Gomez, Carolina; Kemp, John P.; Trajanoska, Katerina; Luan, Jian'an; Chesi, Alessandra; Ahluwalia, Tarunveer S.; Mook-Kanamori, Dennis O.; Ham, Annelies; Hartwig, Fernando P.; Evans, Daniel S.; Joro, Raimo; Nedeljkovic, Ivana; Zheng, Hou-Feng; Zhu, Kun; Atalay, Mustafa; Liu, Ching-Ti; Nethander, Maria; Broer, Linda; Porleifsson, Gudmar; Mullin, Benjamin H.; Handelman, Samuel K.; Nalls, Mike A.; Jessen, Leon E.; Heppe, Denise H. M.; Richards, J. Brent; Wang, Carol; Chawes, Bo; Schraut, Katharina E.; Amin, Najaf; Wareham, Nick; Karasik, David; van der Velde, Nathalie; Ikram, M. Arfan; Zemel, Babette S.; Zhou, Yanhua; Carlsson, Christian J.; Liu, Yongmei; McGuigan, Fiona E.; Boer, Cindy G.; Bønnelykke, Klaus; Ralston, Stuart H.; Robbins, John A.; Walsh, John P.; Zillikens, M. Carola; Langenberg, Claudia; Li-Gao, Ruifang; Williams, Frances M. K.; Harris, Tamara B.; Akesson, Kristina; Jackson, Rebecca D.; Sigurdsson, Gunnar; den Heijer, Martin; van der Eerden, Bram C. J.; van de Peppel, Jeroen; Spector, Timothy D.; Pennell, Craig; Horta, Bernardo L.; Felix, Janine F.; Zhao, Jing Hua; Wilson, Scott G.; de Mutsert, Renée; Bisgaard, Hans; Styrkársdóttir, Unnur; Jaddoe, Vincent W.; Orwoll, Eric; Lakka, Timo A.; Scott, Robert; Grant, Struan F. A.; Lorentzon, Mattias; van Duijn, Cornelia M.; Wilson, James F.; Stefansson, Kari; Psaty, Bruce M.; Kiel, Douglas P.; Ohlsson, Claes; Ntzani, Evangelia; van Wijnen, Andre J.; Forgetta, Vincenzo; Ghanbari, Mohsen; Logan, John G.; Williams, Graham R.; Bassett, J. H. Duncan; Croucher, Peter I.; Evangelou, Evangelos; Uitterlinden, Andre G.; Ackert-Bicknell, Cheryl L.; Tobias, Jonathan H.; Evans, David M.; Rivadeneira, Fernando

    2018-01-01

    Bone mineral density (BMD) assessed by DXA is used to evaluate bone health. In children, total body (TB) measurements are commonly used; in older individuals, BMD at the lumbar spine (LS) and femoral neck (FN) is used to diagnose osteoporosis. To date, genetic variants in more than 60 loci have been

  5. Life-Course Genome-wide Association Study Meta-analysis of Total Body BMD and Assessment of Age-Specific Effects

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P.; Trajanoska, Katerina

    2018-01-01

    Bone mineral density (BMD) assessed by DXA is used to evaluate bone health. In children, total body (TB) measurements are commonly used; in older individuals, BMD at the lumbar spine (LS) and femoral neck (FN) is used to diagnose osteoporosis. To date, genetic variants in more than 60 loci have b...

  6. Bioinformatics decoding the genome

    CERN Multimedia

    CERN. Geneva; Deutsch, Sam; Michielin, Olivier; Thomas, Arthur; Descombes, Patrick

    2006-01-01

    Extracting the fundamental genomic sequence from the DNA. From Genome to Sequence: Biology in the early 21st century has been radically transformed by the availability of the full genome sequences of an ever-increasing number of life forms, from bacteria to major crop plants and to humans. The lecture will concentrate on the computational challenges associated with the production, storage and analysis of genome sequence data, with an emphasis on mammalian genomes. The quality and usability of genome sequences is increasingly conditioned by the careful integration of strategies for data collection and computational analysis, from the construction of maps and libraries to the assembly of raw data into sequence contigs and chromosome-sized scaffolds. Once the sequence is assembled, a major challenge is the mapping of biologically relevant information onto this sequence: promoters, introns and exons of protein-encoding genes, regulatory elements, functional RNAs, pseudogenes, transposons, etc. The methodological ...

  7. Genomic research in Eucalyptus.

    Science.gov (United States)

    Poke, Fiona S; Vaillancourt, René E; Potts, Brad M; Reid, James B

    2005-09-01

    Eucalyptus L'Hérit. is a genus comprising more than 700 species that is of vital importance ecologically to Australia and to the forestry industry worldwide, being grown in plantations for the production of solid wood products as well as pulp for paper. With the sequencing of the genomes of Arabidopsis thaliana and Oryza sativa and the recent completion of the first tree genome sequence, Populus trichocarpa, attention has turned to the current status of genomic research in Eucalyptus. For several eucalypt species, large segregating families have been established, high-resolution genetic maps constructed and large EST databases generated. Collaborative efforts have been initiated for the integration of diverse genomic projects and will provide the framework for future research, including exploiting the sequence of the entire eucalypt genome, which is currently being sequenced. This review summarises the current position of genomic research in Eucalyptus and discusses the direction of future research.

  8. Evolution of a seismic risk assessment technique

    International Nuclear Information System (INIS)

    Wells, J.E.; Cummings, G.E.

    1985-01-01

    To assist the NRC in its licensing evaluation role, the Seismic Safety Margins Research Program (SSMRP) was started at LLNL in 1978. Its goal was to develop tools and data bases to evaluate the probability of earthquake-caused radioactive releases from commercial nuclear power plants. The methodology was finalized in 1982 and a seismic risk assessment of the Zion Nuclear Power Plant was finished in 1983. Work continues on the study of the LaSalle Boiling Water Reactor. This paper will discuss some of the effects of the assumptions made during development of the systems analysis techniques used in SSMRP in light of the results obtained on studies to date. 5 refs

  9. Genome packaging in viruses

    OpenAIRE

    Sun, Siyang; Rao, Venigalla B.; Rossmann, Michael G.

    2010-01-01

    Genome packaging is a fundamental process in a viral life cycle. Many viruses assemble preformed capsids into which the genomic material is subsequently packaged. These viruses use a packaging motor protein that is driven by the hydrolysis of ATP to condense the nucleic acids into a confined space. How these motor proteins package viral genomes had been poorly understood until recently, when a few X-ray crystal structures and cryo-electron microscopy structures became available. Here we discu...

  10. Between Two Fern Genomes

    Science.gov (United States)

    2014-01-01

    Ferns are the only major lineage of vascular plants not represented by a sequenced nuclear genome. This lack of genome sequence information significantly impedes our ability to understand and reconstruct genome evolution not only in ferns, but across all land plants. Azolla and Ceratopteris are ideal and complementary candidates to be the first ferns to have their nuclear genomes sequenced. They differ dramatically in genome size, life history, and habit, and thus represent the immense diversity of extant ferns. Together, this pair of genomes will facilitate myriad large-scale comparative analyses across ferns and all land plants. Here we review the unique biological characteristics of ferns and describe a number of outstanding questions in plant biology that will benefit from the addition of ferns to the set of taxa with sequenced nuclear genomes. We explain why the fern clade is pivotal for understanding genome evolution across land plants, and we provide a rationale for how knowledge of fern genomes will enable progress in research beyond the ferns themselves. PMID:25324969

  11. Causes of genome instability

    DEFF Research Database (Denmark)

    Langie, Sabine A S; Koppen, Gudrun; Desaulniers, Daniel

    2015-01-01

    Genome instability is a prerequisite for the development of cancer. It occurs when genome maintenance systems fail to safeguard the genome's integrity, whether as a consequence of inherited defects or induced via exposure to environmental agents (chemicals, biological agents and radiation). Thus ... function, chromosome segregation, telomere length). The purpose of this review is to describe the crucial aspects of genome instability, to outline the ways in which environmental chemicals can affect this cancer hallmark and to identify candidate chemicals for further study. The overall aim is to make ...

  12. Fungal Genomics Program

    Energy Technology Data Exchange (ETDEWEB)

    Grigoriev, Igor

    2012-03-12

    The JGI Fungal Genomics Program aims to scale up sequencing and analysis of fungal genomes to explore the diversity of fungi important for energy and the environment, and to promote functional studies on a system level. Combining new sequencing technologies and comparative genomics tools, JGI is now leading the world in fungal genome sequencing and analysis. Over 120 sequenced fungal genomes with analytical tools are available via MycoCosm (www.jgi.doe.gov/fungi), a web-portal for fungal biologists. Our model of interacting with user communities, unique among other sequencing centers, helps organize these communities, improves genome annotation and analysis work, and facilitates new larger-scale genomic projects. This resulted in 20 high-profile papers published in 2011 alone and contributing to the Genomics Encyclopedia of Fungi, which targets fungi related to plant health (symbionts, pathogens, and biocontrol agents) and biorefinery processes (cellulose degradation, sugar fermentation, industrial hosts). Our next grand challenges include larger scale exploration of fungal diversity (1000 fungal genomes), developing molecular tools for DOE-relevant model organisms, and analysis of complex systems and metagenomes.

  13. MIPS plant genome information resources.

    Science.gov (United States)

    Spannagl, Manuel; Haberer, Georg; Ernst, Rebecca; Schoof, Heiko; Mayer, Klaus F X

    2007-01-01

    The Munich Information Center for Protein Sequences (MIPS) has been involved in maintaining plant genome databases since the Arabidopsis thaliana genome project. Genome databases and analysis resources have focused on individual genomes and aim to provide flexible and maintainable data sets for model plant genomes as a backbone against which experimental data, for example from high-throughput functional genomics, can be organized and evaluated. In addition, model genomes also form a scaffold for comparative genomics, and much can be learned from genome-wide evolutionary studies.

  14. Lawrence Livermore National Laboratory- Completing the Human Genome Project and Triggering Nearly $1 Trillion in U.S. Economic Activity

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    The success of the Human Genome Project is already nearing $1 trillion in U.S. economic activity. Lawrence Livermore National Laboratory (LLNL) was a co-leader in one of the biggest biological research efforts in history, the Human Genome Project. This ambitious research effort set out to sequence the approximately 3 billion nucleotides in the human genome, an effort many thought was nearly impossible. Deoxyribonucleic acid (DNA) was discovered in 1869, and by 1943 came the discovery that DNA is the molecule that encodes the genetic instructions used in the development and functioning of living organisms and many viruses. To make full use of this information, scientists needed first to sequence the billions of nucleotides to begin linking them to genetic traits and illnesses, and eventually to more effective treatments. New medical discoveries and improved agricultural productivity were some of the expected benefits. While the potential benefits were vast, the timeline (over a decade) and cost ($3.8 billion) exceeded what the private sector would normally attempt, especially when this would only be the first phase on the path to new discoveries and market opportunities. The Department of Energy believed its best research laboratories could meet this Grand Challenge and soon convinced the National Institutes of Health to formally propose the Human Genome Project to the federal government. The U.S. government accepted the risk and challenge to potentially create new healthcare and food discoveries that could benefit the world and U.S. industry.

  15. Evidence that personal genome testing enhances student learning in a course on genomics and personalized medicine.

    Directory of Open Access Journals (Sweden)

    Keyan Salari

    Full Text Available An emerging debate in academic medical centers is not about the need for providing trainees with fundamental education on genomics, but rather the most effective educational models that should be deployed. At Stanford School of Medicine, a novel hands-on genomics course was developed in 2010 that provided students the option to undergo personal genome testing as part of the course curriculum. We hypothesized that use of personal genome testing in the classroom would enhance the learning experience of students. No data currently exist on how such methods impact student learning; thus, we surveyed students before and after the course to determine its impact. We analyzed responses using paired statistics from the 31 medical and graduate students who completed both pre-course and post-course surveys. Participants were stratified by those who did (N = 23) or did not (N = 8) undergo personal genome testing. In reflecting on the experience, 83% of students who underwent testing stated that they were pleased with their decision compared to 12.5% of students who decided against testing (P = 0.00058). Seventy percent of those who underwent personal genome testing self-reported a better understanding of human genetics on the basis of having undergone testing. Further, students who underwent personal genome testing demonstrated an average 31% increase in pre- to post-course scores on knowledge questions (P = 3.5×10(-6)); this was significantly higher (P = 0.003) than students who did not undergo testing, who showed a non-significant improvement. Undergoing personal genome testing and using personal genotype data in the classroom enhanced students' self-reported and assessed knowledge of genomics, and did not appear to cause significant anxiety. At least for self-selected students, the incorporation of personal genome testing can be an effective educational tool to teach important concepts of clinical genomic testing.
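    The paired pre/post comparison described above can be reproduced in outline with a paired t statistic on within-student score differences. A stdlib-only sketch; the scores below are hypothetical, since the study's raw data are not reproduced here:

    ```python
    import math
    import statistics

    def paired_t(pre: list[float], post: list[float]) -> float:
        """Paired t statistic: mean within-student difference over its standard error."""
        diffs = [b - a for a, b in zip(pre, post)]
        n = len(diffs)
        return statistics.fmean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

    # Hypothetical knowledge scores (% correct) for six students, before and after the course
    pre = [55, 60, 50, 65, 58, 62]
    post = [75, 78, 70, 80, 76, 79]
    t = paired_t(pre, post)  # a large positive t indicates improvement unlikely under the null
    ```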

  16. A practice guideline from the American College of Medical Genetics and Genomics and the National Society of Genetic Counselors: referral indications for cancer predisposition assessment.

    Science.gov (United States)

    Hampel, Heather; Bennett, Robin L; Buchanan, Adam; Pearlman, Rachel; Wiesner, Georgia L

    2015-01-01

    The practice guidelines of the American College of Medical Genetics and Genomics (ACMG) and the National Society of Genetic Counselors (NSGC) are developed by members of the ACMG and NSGC to assist medical geneticists, genetic counselors, and other health-care providers in making decisions about appropriate management of genetic concerns, including access to and/or delivery of services. Each practice guideline focuses on a clinical or practice-based issue and is the result of a review and analysis of current professional literature believed to be reliable. As such, information and recommendations within the ACMG and NSGC joint practice guidelines reflect the current scientific and clinical knowledge at the time of publication, are current only as of their publication date, and are subject to change without notice as advances emerge. In addition, variations in practice, which take into account the needs of the individual patient and the resources and limitations unique to the institution or type of practice, may warrant approaches, treatments, and/or procedures that differ from the recommendations outlined in this guideline. Therefore, these recommendations should not be construed as dictating an exclusive course of management, nor does the use of such recommendations guarantee a particular outcome. Genetic counseling practice guidelines are never intended to displace a health-care provider's best medical judgment based on the clinical circumstances of a particular patient or patient population. Practice guidelines are published by the ACMG or the NSGC for educational and informational purposes only, and neither the ACMG nor the NSGC "approve" or "endorse" any specific methods, practices, or sources of information.Cancer genetic consultation is an important aspect of the care of individuals at increased risk of a hereditary cancer syndrome. 
Yet several patient, clinician, and system-level barriers hinder identification of individuals appropriate for cancer genetics

  17. Computational genomics of hyperthermophiles

    NARCIS (Netherlands)

    Werken, van de H.J.G.

    2008-01-01

    With the ever increasing number of completely sequenced prokaryotic genomes and the subsequent use of functional genomics tools, e.g. DNA microarray and proteomics, computational data analysis and the integration of microbial and molecular data is inevitable. This thesis describes the computational

  18. Safeguarding genome integrity

    DEFF Research Database (Denmark)

    Sørensen, Claus Storgaard; Syljuåsen, Randi G

    2012-01-01

    Mechanisms that preserve genome integrity are highly important during the normal life cycle of human cells. Loss of genome protective mechanisms can lead to the development of diseases such as cancer. Checkpoint kinases function in the cellular surveillance pathways that help cells to cope with D...

  19. Human genome I

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    An international conference, Human Genome I, was held Oct. 2-4, 1989 in San Diego, Calif. Selected speakers discussed: Current Status of the Genome Project; Technique Innovations; Interesting Regions; Applications; and Organization - Different Views of Current and Future Science and Procedures. Posters, consisting of 119 presentations, were displayed during the sessions; all 119 were indexed for inclusion in the Energy Data Base.

  20. Genome-wide study of correlations between genomic features and their relationship with the regulation of gene expression.

    Science.gov (United States)

    Kravatsky, Yuri V; Chechetkin, Vladimir R; Tchurikov, Nikolai A; Kravatskaya, Galina I

    2015-02-01

    A broad class of tasks in genetics and epigenetics can be reduced to the study of various features that are distributed over the genome (genome tracks). The rapid and efficient processing of the huge amount of data stored in genome-scale databases cannot be achieved without software packages based on analytical criteria. However, the strong inhomogeneity of genome tracks hampers the development of relevant statistics. We developed criteria for the assessment of genome track inhomogeneity and of correlations between two genome tracks. We also developed a software package, Genome Track Analyzer, based on this theory. The theory and software were tested on simulated data and were applied to the study of correlations between CpG islands and transcription start sites in the Homo sapiens genome, between profiles of protein-binding sites in chromosomes of Drosophila melanogaster, and between DNA double-strand breaks and histone marks in the H. sapiens genome. Significant correlations between transcription start sites on the forward and the reverse strands were observed in the genomes of D. melanogaster, Caenorhabditis elegans, Mus musculus, H. sapiens, and Danio rerio. The observed correlations may be related to the regulation of gene expression in eukaryotes. Genome Track Analyzer is freely available at http://ancorr.eimb.ru/. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
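    The core operation the abstract describes, correlating two genome tracks, can be sketched by binning feature positions along the genome and computing a Pearson correlation of per-bin counts. A minimal stdlib illustration (requires Python ≥3.10 for statistics.correlation); the positions are toy data, and Genome Track Analyzer itself applies its own inhomogeneity-aware criteria rather than this plain Pearson r:

    ```python
    import statistics

    def track_correlation(track_a, track_b, genome_length, bin_size):
        """Pearson correlation of two genome tracks, compared as per-bin feature counts."""
        n_bins = (genome_length + bin_size - 1) // bin_size

        def binned(positions):
            counts = [0] * n_bins
            for pos in positions:
                counts[pos // bin_size] += 1
            return counts

        return statistics.correlation(binned(track_a), binned(track_b))

    # Toy example: two tracks whose features cluster in the same regions correlate highly
    tss = [12, 15, 18, 410, 415, 900]   # hypothetical transcription start sites
    cpg = [10, 14, 400, 418, 905, 910]  # hypothetical CpG island midpoints
    r = track_correlation(tss, cpg, genome_length=1000, bin_size=100)
    ```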

  1. Musa as a Genome Model [Musa sebagai Model Genom]

    Directory of Open Access Journals (Sweden)

    RITA MEGIA

    2005-12-01

    Full Text Available During the meeting in Arlington, USA in 2001, the scientists grouped in PROMUSA agreed to the launching of the Global Musa Genomics Consortium. The Consortium aims to apply genomics technologies to the improvement of this important crop. These genome projects put banana forward as the third model species, after Arabidopsis and rice, to be analyzed and sequenced. Compared to Arabidopsis and rice, the banana genome provides a unique and powerful insight into structural and functional genomics that could not be found in those two species. This paper discusses these subjects, including the importance of banana as the fourth main food in the world, and the evolution and biodiversity of this genetic resource and its parasite.

  2. The genome editing revolution

    DEFF Research Database (Denmark)

    Stella, Stefano; Montoya, Guillermo

    2016-01-01

    In the last 10 years, we have witnessed a blooming of targeted genome editing systems and applications. The area was revolutionized by the discovery and characterization of the transcription activator-like effector proteins, which are easier to engineer to target new DNA sequences than ... sequence). This ribonucleoprotein complex protects bacteria from invading DNAs, and it was adapted to be used in genome editing. The CRISPR ribonucleic acid (RNA) molecule guides the Cas9 nuclease to the specific DNA site to cleave the DNA target. Two years and more than 1000 publications later, the CRISPR-Cas system has become the main tool for genome editing in many laboratories. Currently the targeted genome editing technology is used in many fields and may be a possible approach for human gene therapy. Furthermore, it can also be used to modify the genomes of model organisms for studying human ...

  3. Phytozome Comparative Plant Genomics Portal

    Energy Technology Data Exchange (ETDEWEB)

    Goodstein, David; Batra, Sajeev; Carlson, Joseph; Hayes, Richard; Phillips, Jeremy; Shu, Shengqiang; Schmutz, Jeremy; Rokhsar, Daniel

    2014-09-09

    The Dept. of Energy Joint Genome Institute is a genomics user facility supporting DOE mission science in the areas of Bioenergy, Carbon Cycling, and Biogeochemistry. The Plant Program at the JGI applies genomic, analytical, computational and informatics platforms and methods to: (1) understand and accelerate the improvement (domestication) of bioenergy crops; (2) characterize and moderate plant response to climate change; (3) use comparative genomics to identify constrained elements and infer gene function; (4) build high-quality genomic resource platforms of JGI Plant Flagship genomes for functional and experimental work; and (5) expand functional genomic resources for Plant Flagship genomes.

  4. A comprehensive genomic history of extinct and living elephants

    DEFF Research Database (Denmark)

    Palkopoulou, Eleftheria; Lipson, Mark; Mallick, Swapan

    2018-01-01

    Elephantids are the world's most iconic megafaunal family, yet there is no comprehensive genomic assessment of their relationships. We report a total of 14 genomes, including 2 from the American mastodon, which is an extinct elephantid relative, and 12 spanning all three extant and three extinct...

  5. Building a model: developing genomic resources for common milkweed (Asclepias syriaca) with low coverage genome sequencing

    Directory of Open Access Journals (Sweden)

    Weitemier Kevin

    2011-05-01

    Full Text Available Abstract Background Milkweeds (Asclepias L.) have been extensively investigated in diverse areas of evolutionary biology and ecology; however, there are few genetic resources available to facilitate and complement these studies. This study explored how low coverage genome sequencing of the common milkweed (Asclepias syriaca L.) could be useful in characterizing the genome of a plant without prior genomic information and for development of genomic resources as a step toward further developing A. syriaca as a model in ecology and evolution. Results A 0.5× genome of A. syriaca was produced using Illumina sequencing. A virtually complete chloroplast genome of 158,598 bp was assembled, revealing few repeats and loss of three genes: accD, clpP, and ycf1. A nearly complete rDNA cistron (18S-5.8S-26S; 7,541 bp) and 5S rDNA (120 bp) sequence were obtained. Assessment of polymorphism revealed that the rDNA cistron and 5S rDNA had 0.3% and 26.7% polymorphic sites, respectively. A partial mitochondrial genome sequence (130,764 bp), with identical gene content to tobacco, was also assembled. An initial characterization of repeat content indicated that Ty1/copia-like retroelements are the most common repeat type in the milkweed genome. At least one A. syriaca microread hit 88% of Catharanthus roseus (Apocynaceae) unigenes (median coverage of 0.29×) and 66% of single copy orthologs (COSII) in asterids (median coverage of 0.14×). From this partial characterization of the A. syriaca genome, markers for population genetics (microsatellites) and phylogenetics (low-copy nuclear genes) studies were developed. Conclusions The results highlight the promise of next generation sequencing for development of genomic resources for any organism. Low coverage genome sequencing allows characterization of the high copy fraction of the genome and exploration of the low copy fraction of the genome, which facilitates the development of molecular tools for further study of a target species.

  6. Building a model: developing genomic resources for common milkweed (Asclepias syriaca) with low coverage genome sequencing.

    Science.gov (United States)

    Straub, Shannon C K; Fishbein, Mark; Livshultz, Tatyana; Foster, Zachary; Parks, Matthew; Weitemier, Kevin; Cronn, Richard C; Liston, Aaron

    2011-05-04

    Milkweeds (Asclepias L.) have been extensively investigated in diverse areas of evolutionary biology and ecology; however, there are few genetic resources available to facilitate and complement these studies. This study explored how low coverage genome sequencing of the common milkweed (Asclepias syriaca L.) could be useful in characterizing the genome of a plant without prior genomic information and for development of genomic resources as a step toward further developing A. syriaca as a model in ecology and evolution. A 0.5× genome of A. syriaca was produced using Illumina sequencing. A virtually complete chloroplast genome of 158,598 bp was assembled, revealing few repeats and loss of three genes: accD, clpP, and ycf1. A nearly complete rDNA cistron (18S-5.8S-26S; 7,541 bp) and 5S rDNA (120 bp) sequence were obtained. Assessment of polymorphism revealed that the rDNA cistron and 5S rDNA had 0.3% and 26.7% polymorphic sites, respectively. A partial mitochondrial genome sequence (130,764 bp), with identical gene content to tobacco, was also assembled. An initial characterization of repeat content indicated that Ty1/copia-like retroelements are the most common repeat type in the milkweed genome. At least one A. syriaca microread hit 88% of Catharanthus roseus (Apocynaceae) unigenes (median coverage of 0.29×) and 66% of single copy orthologs (COSII) in asterids (median coverage of 0.14×). From this partial characterization of the A. syriaca genome, markers for population genetics (microsatellites) and phylogenetics (low-copy nuclear genes) studies were developed. The results highlight the promise of next generation sequencing for development of genomic resources for any organism. Low coverage genome sequencing allows characterization of the high copy fraction of the genome and exploration of the low copy fraction of the genome, which facilitate the development of molecular tools for further study of a target species and its relatives. This study represents a first
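    Why a 0.5× genome still yields essentially complete organellar and rDNA sequences follows from Poisson coverage: at mean depth c, only 1 − e^(−c) of single-copy sequence is hit by at least one read, whereas high-copy fractions are sequenced at many multiples of the nuclear depth. A short sketch; the ~500 plastid copies per cell is an illustrative assumption, not a figure from the study:

    ```python
    import math

    def fraction_covered(mean_depth: float) -> float:
        """Poisson (Lander-Waterman) fraction of positions hit by at least one read."""
        return 1.0 - math.exp(-mean_depth)

    print(round(fraction_covered(0.5), 3))        # 0.393: ~39% of single-copy sequence touched
    print(round(fraction_covered(0.5 * 500), 3))  # 1.0: a ~500-copy plastid is covered completely
    ```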

  7. Genome-derived vaccines.

    Science.gov (United States)

    De Groot, Anne S; Rappuoli, Rino

    2004-02-01

    Vaccine research entered a new era when the complete genome of a pathogenic bacterium was published in 1995. Since then, more than 97 bacterial pathogens have been sequenced and at least 110 additional projects are now in progress. Genome sequencing has also dramatically accelerated: high-throughput facilities can draft the sequence of an entire microbe (two to four megabases) in 1 to 2 days. Vaccine developers are using microarrays, immunoinformatics, proteomics and high-throughput immunology assays to reduce the truly unmanageable volume of information available in genome databases to a manageable size. Vaccines composed of novel antigens discovered from genome mining are already in clinical trials. Within 5 years we can expect to see a novel class of vaccines composed of genome-predicted, assembled and engineered T- and B-cell epitopes. This article addresses the convergence of three forces--microbial genome sequencing, computational immunology and new vaccine technologies--that are shifting genome mining for vaccines onto the forefront of immunology research.

  8. The Banana Genome Hub

    Science.gov (United States)

    Droc, Gaëtan; Larivière, Delphine; Guignon, Valentin; Yahiaoui, Nabila; This, Dominique; Garsmeur, Olivier; Dereeper, Alexis; Hamelin, Chantal; Argout, Xavier; Dufayard, Jean-François; Lengelle, Juliette; Baurens, Franc-Christophe; Cenci, Alberto; Pitollat, Bertrand; D’Hont, Angélique; Ruiz, Manuel; Rouard, Mathieu; Bocs, Stéphanie

    2013-01-01

    Banana is one of the world’s favorite fruits and one of the most important crops for developing countries. The banana reference genome sequence (Musa acuminata) was recently released. Given the taxonomic position of Musa, the completed genomic sequence has particular comparative value to provide fresh insights about the evolution of the monocotyledons. The study of the banana genome has been enhanced by a number of tools and resources that allow harnessing its sequence. First, we set up essential tools such as a Community Annotation System, phylogenomics resources and metabolic pathways. Then, to support post-genomic efforts, we improved existing banana systems (e.g. web front end, query builder), we integrated available Musa data into generic systems (e.g. markers and genetic maps, synteny blocks), we made other existing systems containing Musa data (e.g. transcriptomics, rice reference genome, workflow manager) interoperable with the banana hub, and finally, we generated new results from sequence analyses (e.g. SNP and polymorphism analysis). Several use cases illustrate how the Banana Genome Hub can be used to study gene families. Overall, with this collaborative effort, we discuss the importance of interoperability toward data integration between existing information systems. Database URL: http://banana-genome.cirad.fr/ PMID:23707967

  9. Genomic instability following irradiation

    International Nuclear Information System (INIS)

    Hacker-Klom, U.B.; Goehde, W.

    2001-01-01

    Ionising irradiation may induce genomic instability. The broad spectrum of stress reactions in eukaryotic cells to irradiation complicates the discovery of cellular targets and pathways inducing genomic instability. Irradiation may initiate genomic instability by deletion of genes controlling stability, by induction of genes stimulating instability and/or by activating endogenous cellular viruses. Alternatively or additionally, it is discussed that the initiation of genomic instability may be a consequence of radiation or other agents independently of DNA damage, implying non-nuclear targets, e.g. signal cascades. As a further mechanism possibly involved, our own results may suggest radiation-induced changes in chromatin structure. Once initiated, the process of genomic instability is probably perpetuated by endogenous processes necessary for proliferation. Genomic instability may be a cause or a consequence of the neoplastic phenotype. As a conclusion from the data available up to now, a new interpretation of low-level radiation effects for radiation protection and in radiotherapy appears useful. The detection of the molecular mechanisms of genomic instability will be important in this context and may contribute to a better understanding of phenomena occurring at low doses (<10 cSv) which are not well understood up to now. (orig.)

  10. Transportation System Risk Assessment on DOE Defense Program shipments

    International Nuclear Information System (INIS)

    Brumburgh, G.P.; Kimura, C.Y.; Alesso, H.P.; Prassinos, P.G.

    1992-01-01

    Substantial effort has been expended concerning the level of safety provided to persons, property, and the environment from the hazards associated with transporting radioactive material. This work provided an impetus for the Department of Energy to investigate the use of probabilistic risk assessment techniques to supplement the deterministic approach to transportation safety. The DOE recently decided to incorporate the methodologies associated with PRAs in the process for authorizing the transportation of nuclear components, special assemblies, and radioactive materials affiliated with the DOE Defense Program. Accordingly, the LLNL, sponsored by the DOE/AL, is tasked with developing a safety guide series to provide guidance to preparers performing a transportation system risk assessment.

  11. Draft genome of the gayal, Bos frontalis

    Science.gov (United States)

    Wang, Ming-Shan; Zeng, Yan; Wang, Xiao; Nie, Wen-Hui; Wang, Jin-Huan; Su, Wei-Ting; Xiong, Zi-Jun; Wang, Sheng; Qu, Kai-Xing; Yan, Shou-Qing; Yang, Min-Min; Wang, Wen; Dong, Yang; Zhang, Ya-Ping

    2017-01-01

    Abstract Gayal (Bos frontalis), also known as mithan or mithun, is a large endangered semi-domesticated bovine that has a limited geographical distribution in the hill-forests of China, Northeast India, Bangladesh, Myanmar, and Bhutan. Many questions about the gayal such as its origin, population history, and genetic basis of local adaptation remain largely unresolved. De novo sequencing and assembly of the whole gayal genome provides an opportunity to address these issues. We report high-depth sequencing, de novo assembly, and annotation of a female Chinese gayal genome. Based on the Illumina genomic sequencing platform, we have generated 350.38 Gb of raw data from 16 different insert-size libraries. A total of 276.86 Gb of clean data is retained after quality control. The assembled genome is about 2.85 Gb with scaffold and contig N50 sizes of 2.74 Mb and 14.41 kb, respectively. Repetitive elements account for 48.13% of the genome. Gene annotation has yielded 26 667 protein-coding genes, of which 97.18% have been functionally annotated. BUSCO assessment shows that our assembly captures 93% (3183 of 4104) of the core eukaryotic genes and 83.1% of vertebrate universal single-copy orthologs. We provide the first comprehensive de novo genome of the gayal. This genetic resource is integral for investigating the origin of the gayal and performing comparative genomic studies to improve understanding of the speciation and divergence of bovine species. The assembled genome could be used as a reference in future population genetic studies of gayal. PMID:29048483
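    Assembly metrics like the scaffold and contig N50 values quoted above are straightforward to compute from a list of sequence lengths. A minimal sketch (the lengths below are toy values, not the gayal data):

```python
def n50(lengths):
    """N50: the length L such that sequences of length >= L
    together cover at least half of the total assembly size."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
    return 0

# Toy scaffold lengths: total 300 bp; half is covered once we
# have taken the 100 bp and 80 bp scaffolds, so N50 = 80.
print(n50([100, 80, 60, 40, 20]))  # 80
```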

  12. Lawrence Livermore National Laboratory low-level waste systems performance assessment

    International Nuclear Information System (INIS)

    1990-11-01

    This Low-Level Radioactive Waste (LLW) Systems Performance Assessment (PA) presents a systematic analysis of the potential risks posed by the Lawrence Livermore National Laboratory (LLNL) waste management system. Potential risks to the public and environment are compared to established performance objectives as required by DOE Order 5820.2A. The report determines the associated maximum individual committed effective dose equivalent (CEDE) to a member of the public from LLW and mixed waste. A maximum annual CEDE of 0.01 mrem could result from routine radioactive liquid effluents. A maximum annual CEDE of 0.003 mrem could result from routine radioactive gaseous effluents. No other pathways for radiation exposure of the public indicated detectable levels of exposure. The dose rate, monitoring, and waste acceptance performance objectives were found to be adequately addressed by the LLNL Program. 88 refs., 3 figs., 17 tabs

  13. Five-Year NRHP Re-Evaluation of Historic Buildings Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ullrich, R A; Heidecker, K R

    2011-09-12

    The Lawrence Livermore National Laboratory (LLNL) 'Draft Programmatic Agreement among the Department of Energy and the California State Historic Preservation Officer Regarding Operation of Lawrence Livermore National Laboratory' requires a review and re-evaluation of the eligibility of laboratory properties for the National Register of Historic Places (NRHP) every five years. The original evaluation was published in 2005; this report serves as the first five-year re-evaluation. This re-evaluation includes consideration of changes within LLNL to management, to mission, and to the built environment. It also determines the status of those buildings, objects, and districts that were recommended as NRHP-eligible in the 2005 report. Buildings that were omitted from the earlier building list, those that have reached 50 years of age since the original assessment, and new buildings are also addressed in the re-evaluation.

  14. Traditional medicine and genomics

    Directory of Open Access Journals (Sweden)

    Kalpana Joshi

    2010-01-01

    Full Text Available 'Omics' developments in the form of genomics, proteomics and metabolomics have increased the impetus of traditional medicine research. Studies exploring the genomic, proteomic and metabolomic basis of human constitutional types based on Ayurveda and other systems of oriental medicine are becoming popular. Such studies remain important to developing better understanding of human variations and individual differences. Countries like India, Korea, China and Japan are investing in research on evidence-based traditional medicines and scientific validation of fundamental principles. This review provides an account of studies addressing relationships between traditional medicine and genomics.

  15. Traditional medicine and genomics.

    Science.gov (United States)

    Joshi, Kalpana; Ghodke, Yogita; Shintre, Pooja

    2010-01-01

    'Omics' developments in the form of genomics, proteomics and metabolomics have increased the impetus of traditional medicine research. Studies exploring the genomic, proteomic and metabolomic basis of human constitutional types based on Ayurveda and other systems of oriental medicine are becoming popular. Such studies remain important to developing better understanding of human variations and individual differences. Countries like India, Korea, China and Japan are investing in research on evidence-based traditional medicines and scientific validation of fundamental principles. This review provides an account of studies addressing relationships between traditional medicine and genomics.

  16. Bacillus subtilis genome diversity.

    Science.gov (United States)

    Earl, Ashlee M; Losick, Richard; Kolter, Roberto

    2007-02-01

    Microarray-based comparative genomic hybridization (M-CGH) is a powerful method for rapidly identifying regions of genome diversity among closely related organisms. We used M-CGH to examine the genome diversity of 17 strains belonging to the nonpathogenic species Bacillus subtilis. Our M-CGH results indicate that there is considerable genetic heterogeneity among members of this species; nearly one-third of Bsu168-specific genes exhibited variability, as measured by the microarray hybridization intensities. The variable loci include those encoding proteins involved in antibiotic production, cell wall synthesis, sporulation, and germination. The diversity in these genes may reflect this organism's ability to survive in diverse natural settings.

  17. Genomic taxonomy of vibrios

    Directory of Open Access Journals (Sweden)

    Iida Tetsuya

    2009-10-01

    Full Text Available Abstract Background Vibrio taxonomy has been based on a polyphasic approach. In this study, we retrieve useful taxonomic information (i.e. data that can be used to distinguish different taxonomic levels, such as species and genera) from 32 genome sequences of different vibrio species. We use a variety of tools to explore the taxonomic relationship between the sequenced genomes, including Multilocus Sequence Analysis (MLSA), supertrees, Average Amino Acid Identity (AAI), genomic signatures, and Genome BLAST atlases. Our aim is to analyse the usefulness of these tools for species identification in vibrios. Results We have generated four new genome sequences of three Vibrio species, i.e., V. alginolyticus 40B, V. harveyi-like 1DA3, and V. mimicus strains VM573 and VM603, and present a broad analysis of these genomes along with other sequenced Vibrio species. The genome atlas and pangenome plots provide a tantalizing image of the genomic differences that occur between closely related sister species, e.g. V. cholerae and V. mimicus. The vibrio pangenome contains around 26504 genes. The V. cholerae core genome and pangenome consist of 1520 and 6923 genes, respectively. Pangenomes might allow different strains of V. cholerae to occupy different niches. MLSA and supertree analyses resulted in a similar phylogenetic picture, with a clear distinction of four groups (Vibrio core group, V. cholerae-V. mimicus, Aliivibrio spp., and Photobacterium spp.). A Vibrio species is defined as a group of strains that share > 95% DNA identity in MLSA and supertree analysis, > 96% AAI, ≤ 10 genome signature dissimilarity, and > 61% proteome identity. Strains of the same species and species of the same genus will form monophyletic groups on the basis of MLSA and supertree. Conclusion The combination of different analytical and bioinformatics tools will enable the most accurate species identification through genomic computational analysis. This endeavour will culminate in
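    The species definition given in this abstract (> 95% MLSA DNA identity, > 96% AAI, ≤ 10 genome signature dissimilarity, > 61% proteome identity) amounts to a conjunction of four thresholds. A minimal sketch; the function name and example values are illustrative, and real classification would of course start from the underlying alignments:

```python
def same_vibrio_species(mlsa_identity, aai, signature_dissimilarity,
                        proteome_identity):
    """Apply the abstract's proposed species cutoffs: two strains are
    called conspecific only if every criterion is met. Identities are
    percentages (0-100); the signature dissimilarity is a distance."""
    return (mlsa_identity > 95
            and aai > 96
            and signature_dissimilarity <= 10
            and proteome_identity > 61)

print(same_vibrio_species(97.2, 98.1, 4.3, 75.0))  # True
print(same_vibrio_species(93.0, 98.1, 4.3, 75.0))  # False: MLSA too low
```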

  18. Human Genome Project

    Energy Technology Data Exchange (ETDEWEB)

    Block, S. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Cornwall, J. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Dally, W. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Dyson, F. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Fortson, N. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Joyce, G. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Kimble, H. J. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Lewis, N. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Max, C. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Prince, T. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Schwitters, R. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Weinberger, P. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Woodin, W. H. [The MITRE Corporation, McLean, VA (US). JASON Program Office

    1998-01-04

    The study reviews Department of Energy supported aspects of the United States Human Genome Project, the joint National Institutes of Health/Department of Energy program to characterize all human genetic material, to discover the set of human genes, and to render them accessible for further biological study. The study concentrates on issues of technology, quality assurance/control, and informatics relevant to current effort on the genome project and needs beyond it. Recommendations are presented on areas of the genome program that are of particular interest to and supported by the Department of Energy.

  19. Human Genome Program

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The DOE Human Genome program has grown tremendously, as shown by the marked increase in the number of genome-funded projects since the last workshop held in 1991. The abstracts in this book describe the genome research of DOE-funded grantees and contractors and invited guests, and all projects are represented at the workshop by posters. The 3-day meeting includes plenary sessions on ethical, legal, and social issues pertaining to the availability of genetic data; sequencing techniques, informatics support; and chromosome and cDNA mapping and sequencing.

  20. Genomic signal processing

    CERN Document Server

    Shmulevich, Ilya

    2007-01-01

    Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathema

  1. A quantitative account of genomic island acquisitions in prokaryotes

    Directory of Open Access Journals (Sweden)

    Roos Tom E

    2011-08-01

    Full Text Available Abstract Background Microbial genomes do not merely evolve through the slow accumulation of mutations, but also, and often more dramatically, by taking up new DNA in a process called horizontal gene transfer. These innovation leaps in the acquisition of new traits can take place via the introgression of single genes, but also through the acquisition of large gene clusters, which are termed Genomic Islands. Since only a small proportion of all the DNA diversity has been sequenced, it can be hard to find the appropriate donors for acquired genes via sequence alignments from databases. In contrast, relative oligonucleotide frequencies represent a remarkably stable genomic signature in prokaryotes, which facilitates compositional comparisons as an alignment-free alternative for phylogenetic relatedness. In this project, we test whether Genomic Islands identified in individual bacterial genomes have a similar genomic signature, in terms of relative dinucleotide frequencies, and can therefore be expected to originate from a common donor species. Results When multiple Genomic Islands are present within a single genome, we find that up to 28% of these are compositionally very similar to each other, indicative of frequent recurring acquisitions from the same donor to the same acceptor. Conclusions This represents the first quantitative assessment of common directional transfer events in prokaryotic evolutionary history. We suggest that many of the resident Genomic Islands per prokaryotic genome originated from the same source, which may have implications with respect to their regulatory interactions, and for the elucidation of the common origins of these acquired gene clusters.
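    The genomic signature invoked here is conventionally computed from relative dinucleotide abundances. A simplified sketch of the idea (Karlin-style rho values without reverse-complement symmetrization; the actual study may compute the signature differently):

```python
from itertools import product

def dinucleotide_signature(seq):
    """Relative dinucleotide abundances rho_XY = f(XY) / (f(X) * f(Y));
    values near 1 mean the dinucleotide occurs about as often as
    expected from the mononucleotide composition alone."""
    seq = seq.upper()
    n = len(seq)
    mono = {b: seq.count(b) / n for b in "ACGT"}
    sig = {}
    for x, y in product("ACGT", repeat=2):
        f_xy = sum(1 for i in range(n - 1) if seq[i:i + 2] == x + y) / (n - 1)
        denom = mono[x] * mono[y]
        sig[x + y] = f_xy / denom if denom else 0.0
    return sig

def signature_dissimilarity(sig_a, sig_b):
    """Average absolute difference over the 16 dinucleotides; small
    values suggest compositional similarity (e.g. a common donor)."""
    return sum(abs(sig_a[k] - sig_b[k]) for k in sig_a) / 16
```

    Comparing Genomic Islands within one genome is then a matter of computing this dissimilarity for each pair of island sequences.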

  2. Analysis of intra-genomic GC content homogeneity within prokaryotes

    DEFF Research Database (Denmark)

    Bohlin, J; Snipen, L; Hardy, S.P.

    2010-01-01

    Bacterial genomes possess varying GC content (total guanines (Gs) and cytosines (Cs) per total of the four bases within the genome) but within a given genome, GC content can vary locally along the chromosome, with some regions significantly more or less GC rich than on average. We have examined how the GC content varies within microbial genomes to assess whether this property can be associated with certain biological functions related to the organism's environment and phylogeny. We utilize a new quantity, GCVAR, the intra-genomic GC content variability with respect to the average GC content... both aerobic and facultative microbes. Although an association has previously been found between mean genomic GC content and oxygen requirement, our analysis suggests that no such association exists when phylogenetic bias is accounted for. A significant association between GCVAR and mean GC content...
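    GCVAR summarizes how far local GC content strays from the genome-wide mean. A minimal illustration of that local variation using sliding windows (the windowing scheme and the toy sequence are assumptions for demonstration, not the paper's exact method):

```python
def gc_content(seq):
    """Fraction of G+C bases in a sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def windowed_gc(seq, window=5000, step=1000):
    """Local GC content in sliding windows along a chromosome."""
    return [gc_content(seq[i:i + window])
            for i in range(0, len(seq) - window + 1, step)]

genome = "ATGC" * 5000  # toy 20 kb sequence with uniform 50% GC
mean_gc = gc_content(genome)
deviations = [w - mean_gc for w in windowed_gc(genome)]
print(round(mean_gc, 2), round(max(abs(d) for d in deviations), 6))  # 0.5 0.0
```

    A real genome would show nonzero deviations; summarizing their spread relative to the mean is the kind of statistic GCVAR captures.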

  3. Characterizing Phage Genomes for Therapeutic Applications

    Directory of Open Access Journals (Sweden)

    Casandra W. Philipson

    2018-04-01

    Full Text Available Multi-drug resistance is increasing at alarming rates. The efficacy of phage therapy, treating bacterial infections with bacteriophages alone or in combination with traditional antibiotics, has been demonstrated in emergency cases in the United States and in other countries; however, it remains to be approved for widespread use in the US. One limiting factor is a lack of guidelines for assessing the genomic safety of phage candidates. We present the phage characterization workflow used by our team to generate data for submitting phages to the Food and Drug Administration (FDA) for authorized use. Essential analysis checkpoints and warnings are detailed for obtaining high-quality genomes, excluding undesirable candidates, rigorously assessing a phage genome for safety, and evaluating sequencing contamination. This workflow has been developed in accordance with community standards for high-throughput sequencing of viral genomes as well as principles for ideal phages used for therapy. The feasibility and utility of the pipeline is demonstrated on two new phage genomes that meet all safety criteria. We propose these guidelines as a minimum standard for phages being submitted to the FDA for review as investigational new drug candidates.

  4. Characterizing Phage Genomes for Therapeutic Applications.

    Science.gov (United States)

    Philipson, Casandra W; Voegtly, Logan J; Lueder, Matthew R; Long, Kyle A; Rice, Gregory K; Frey, Kenneth G; Biswas, Biswajit; Cer, Regina Z; Hamilton, Theron; Bishop-Lilly, Kimberly A

    2018-04-10

    Multi-drug resistance is increasing at alarming rates. The efficacy of phage therapy, treating bacterial infections with bacteriophages alone or in combination with traditional antibiotics, has been demonstrated in emergency cases in the United States and in other countries; however, it remains to be approved for widespread use in the US. One limiting factor is a lack of guidelines for assessing the genomic safety of phage candidates. We present the phage characterization workflow used by our team to generate data for submitting phages to the Food and Drug Administration (FDA) for authorized use. Essential analysis checkpoints and warnings are detailed for obtaining high-quality genomes, excluding undesirable candidates, rigorously assessing a phage genome for safety, and evaluating sequencing contamination. This workflow has been developed in accordance with community standards for high-throughput sequencing of viral genomes as well as principles for ideal phages used for therapy. The feasibility and utility of the pipeline is demonstrated on two new phage genomes that meet all safety criteria. We propose these guidelines as a minimum standard for phages being submitted to the FDA for review as investigational new drug candidates.

  5. Comparative genomics of the Bifidobacterium breve taxon

    NARCIS (Netherlands)

    Bottacini, Francesca; O'Connell Motherway, Mary; Kuczynski, Justin; O'Connell, Kerry Joan; Serafini, Fausta; Duranti, Sabrina; Milani, Christian; Turroni, Francesca; Lugli, Gabriele Andrea; Zomer, Aldert|info:eu-repo/dai/nl/304642754; Zhurina, Daria; Riedel, Christian; Ventura, Marco; van Sinderen, Douwe

    2014-01-01

    BACKGROUND: Bifidobacteria are commonly found as part of the microbiota of the gastrointestinal tract (GIT) of a broad range of hosts, where their presence is positively correlated with the host's health status. In this study, we assessed the genomes of thirteen representatives of Bifidobacterium

  6. Task 1.5 Genomic Shift and Drift Trends of Emerging Pathogens

    Energy Technology Data Exchange (ETDEWEB)

    Borucki, M

    2010-01-05

    The Lawrence Livermore National Laboratory (LLNL) Bioinformatics group has recently taken on a role in DTRA's Transformation Medical Technologies Initiative (TMTI). The high-level goal of TMTI is to accelerate the development of broad-spectrum countermeasures. To achieve those goals, TMTI has a near term need to conduct analyses of genomic shift and drift trends of emerging pathogens, with a focused eye on select agent pathogens, as well as antibiotic and virulence markers. Most emerging human pathogens are zoonotic viruses with a genome composed of RNA. The high mutation rate of the replication enzymes of RNA viruses contributes to sequence drift and provides one mechanism for these viruses to adapt to diverse hosts (interspecies transmission events) and cause new human and zoonotic diseases. Additionally, new viral pathogens frequently emerge due to genetic shift (recombination and segment reassortment) which allows for dramatic genotypic and phenotypic changes to occur rapidly. Bacterial pathogens also evolve via genetic drift and shift, although sequence drift generally occurs at a much slower rate for bacteria as compared to RNA viruses. However, genetic shift such as lateral gene transfer and inter- and intragenomic recombination enables bacteria to rapidly acquire new mechanisms of survival and antibiotic resistance. New technologies such as rapid whole genome sequencing of bacterial genomes, ultra-deep sequencing of RNA virus populations, metagenomic studies of environments rich in antibiotic resistance genes, and the use of microarrays for the detection and characterization of emerging pathogens provide mechanisms to address the challenges posed by the rapid emergence of pathogens. Bioinformatic algorithms that enable efficient analysis of the massive amounts of data generated by these technologies, as well as computational modeling of protein structures and evolutionary processes, need to be developed to allow the technology to fulfill its potential.

  7. Genomics and fish adaptation

    Directory of Open Access Journals (Sweden)

    Agostinho Antunes

    2015-12-01

    Full Text Available The completion of the human genome sequencing in 2003 opened a new perspective into the importance of whole genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. This voluminous sequencing data generated across multiple organisms also provides the framework to better understand the genetic makeup of such species and related ones, allowing exploration of the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group retrieved from comparative evolutionary genomic analyses of varied fish species will be considered to exemplify how gene novelty and gene enhancement by positive selection might have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.

  8. Lophotrochozoan mitochondrial genomes

    Energy Technology Data Exchange (ETDEWEB)

    Valles, Yvonne; Boore, Jeffrey L.

    2005-10-01

    Progress in both molecular techniques and phylogenetic methods has challenged many of the interpretations of traditional taxonomy. One example is in the recognition of the animal superphylum Lophotrochozoa (annelids, mollusks, echiurans, platyhelminthes, brachiopods, and other phyla), although the relationships within this group and the inclusion of some phyla remain uncertain. While much of this progress in phylogenetic reconstruction has been based on comparing single gene sequences, we are beginning to see the potential of comparing large-scale features of genomes, such as the relative order of genes. Even though tremendous progress is being made on the sequence determination of whole nuclear genomes, the dataset of choice for genome-level characters for many animals across a broad taxonomic range remains mitochondrial genomes. We review here what is known about mitochondrial genomes of the lophotrochozoans and discuss the promise that this dataset will enable insight into their relationships.

  9. Mouse Genome Informatics (MGI)

    Data.gov (United States)

    U.S. Department of Health & Human Services — MGI is the international database resource for the laboratory mouse, providing integrated genetic, genomic, and biological data to facilitate the study of human...

  10. Genomic definition of species

    Energy Technology Data Exchange (ETDEWEB)

    Crkvenjakov, R.; Drmanac, R.

    1991-07-01

    The subject of this paper is the definition of species based on the assumption that the genome is the fundamental level for the origin and maintenance of biological diversity. For this view to be logically consistent, it is necessary to assume the existence and operation of a new law which we call the genome law. For this reason the genome law is included in the explanation of the species phenomenon presented here, even if its precise formulation and elaboration are left for the future. The intellectual underpinnings of this definition can be traced to Goldschmidt. We wish to explore some philosophical aspects of the definition of species in terms of the genome. The point of proposing the definition on these grounds is that any real advance in evolutionary theory has to be correct in both its philosophy and its science.

  11. Structural genomics in endocrinology

    NARCIS (Netherlands)

    Smit, J. W.; Romijn, J. A.

    2001-01-01

    Traditionally, endocrine research evolved from the phenotypical characterisation of endocrine disorders to the identification of underlying molecular pathophysiology. This approach has been, and still is, extremely successful. The introduction of genomics and proteomics has resulted in a reversal of

  12. Epidemiology & Genomics Research Program

    Science.gov (United States)

    The Epidemiology and Genomics Research Program, in the National Cancer Institute's Division of Cancer Control and Population Sciences, funds research in human populations to understand the determinants of cancer occurrence and outcomes.

  13. Annotating individual human genomes.

    Science.gov (United States)

    Torkamani, Ali; Scott-Van Zeeland, Ashley A; Topol, Eric J; Schork, Nicholas J

    2011-10-01

    Advances in DNA sequencing technologies have made it possible to rapidly, accurately and affordably sequence entire individual human genomes. As impressive as this ability seems, however, it will not likely amount to much if one cannot extract meaningful information from individual sequence data. Annotating variations within individual genomes and providing information about their biological or phenotypic impact will thus be crucially important in moving individual sequencing projects forward, especially in the context of the clinical use of sequence information. In this paper we consider the various ways in which one might annotate individual sequence variations and point out limitations in the available methods for doing so. It is arguable that, in the foreseeable future, DNA sequencing of individual genomes will become routine for clinical, research, forensic, and personal purposes. We therefore also consider directions and areas for further research in annotating genomic variants. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. ANNOTATING INDIVIDUAL HUMAN GENOMES*

    Science.gov (United States)

    Torkamani, Ali; Scott-Van Zeeland, Ashley A.; Topol, Eric J.; Schork, Nicholas J.

    2014-01-01

    Advances in DNA sequencing technologies have made it possible to rapidly, accurately and affordably sequence entire individual human genomes. As impressive as this ability seems, however, it will not likely amount to much if one cannot extract meaningful information from individual sequence data. Annotating variations within individual genomes and providing information about their biological or phenotypic impact will thus be crucially important in moving individual sequencing projects forward, especially in the context of the clinical use of sequence information. In this paper we consider the various ways in which one might annotate individual sequence variations and point out limitations in the available methods for doing so. It is arguable that, in the foreseeable future, DNA sequencing of individual genomes will become routine for clinical, research, forensic, and personal purposes. We therefore also consider directions and areas for further research in annotating genomic variants. PMID:21839162

  15. Yeast genome sequencing:

    DEFF Research Database (Denmark)

    Piskur, Jure; Langkjær, Rikke Breinhold

    2004-01-01

    For decades, unicellular yeasts have been general models to help understand the eukaryotic cell and also our own biology. Recently, over a dozen yeast genomes have been sequenced, providing the basis to resolve several complex biological questions. Analysis of the novel sequence data has shown...... of closely related species helps in gene annotation and to answer how many genes there really are within the genomes. Analysis of non-coding regions among closely related species has provided an example of how to determine novel gene regulatory sequences, which were previously difficult to analyse because...... they are short and degenerate and occupy different positions. Comparative genomics helps to understand the origin of yeasts and points out crucial molecular events in yeast evolutionary history, such as whole-genome duplication and horizontal gene transfer(s). In addition, the accumulating sequence data provide...

  16. Genetical Genomics for Evolutionary Studies

    NARCIS (Netherlands)

    Prins, J.C.P.; Smant, G.; Jansen, R.C.

    2012-01-01

    Genetical genomics combines acquired high-throughput genomic data with genetic analysis. In this chapter, we discuss the application of genetical genomics for evolutionary studies, where new high-throughput molecular technologies are combined with mapping quantitative trait loci (QTL) on the genome

  17. The human genome project

    International Nuclear Information System (INIS)

    Worton, R.

    1996-01-01

    The Human Genome Project is a massive international research project, costing 3 to 5 billion dollars and expected to take 15 years, which will identify all the genes in the human genome - i.e. the complete sequence of bases in human DNA. The prize will be the ability to identify genes causing or predisposing to disease, and in some cases the development of gene therapy, but this new knowledge will raise important ethical issues.

  18. Decoding the human genome

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit; Antonerakis, S E

    2002-01-01

    Decoding the Human genome is a very up-to-date topic, raising several questions besides purely scientific ones, in view of the two competing teams (public and private), the ethics of using the results, and the fact that the project went apparently faster and easier than expected. The lecture series will address the following chapters: scientific basis and challenges; ethical and social aspects of genomics.

  19. Molluscan Evolutionary Genomics

    Energy Technology Data Exchange (ETDEWEB)

    Simison, W. Brian; Boore, Jeffrey L.

    2005-12-01

    In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.

  20. Human Germline Genome Editing

    OpenAIRE

    Ormond, Kelly E.; Mortlock, Douglas P.; Scholes, Derek T.; Bombard, Yvonne; Brody, Lawrence C.; Faucett, W. Andrew; Garrison, Nanibaa’ A.; Hercher, Laura; Isasi, Rosario; Middleton, Anna; Musunuru, Kiran; Shriner, Daniel; Virani, Alice; Young, Caroline E.

    2017-01-01

    With CRISPR/Cas9 and other genome-editing technologies, successful somatic and germline genome editing are becoming feasible. To respond, an American Society of Human Genetics (ASHG) workgroup developed this position statement, which was approved by the ASHG Board in March 2017. The workgroup included representatives from the UK Association of Genetic Nurses and Counsellors, Canadian Association of Genetic Counsellors, International Genetic Epidemiology Society, and US National Society of Gen...

  1. Comparative Pan-Genome Analysis of Piscirickettsia salmonis Reveals Genomic Divergences within Genogroups

    Directory of Open Access Journals (Sweden)

    Guillermo Nourdin-Galindo

    2017-10-01

    Full Text Available Piscirickettsia salmonis is the etiological agent of salmonid rickettsial septicemia, a disease that seriously affects the salmonid industry. Despite efforts to genomically characterize P. salmonis, functional information on the life cycle, pathogenesis mechanisms, diagnosis, treatment, and control of this fish pathogen remain lacking. To address this knowledge gap, the present study conducted an in silico pan-genome analysis of 19 P. salmonis strains from distinct geographic locations and genogroups. Results revealed an expected open pan-genome of 3,463 genes and a core-genome of 1,732 genes. Two marked genogroups were identified, as confirmed by phylogenetic and phylogenomic relationships to the LF-89 and EM-90 reference strains, as well as by assessments of genomic structures. Different structural configurations were found for the six identified copies of the ribosomal operon in the P. salmonis genome, indicating translocation throughout the genetic material. Chromosomal divergences in genomic localization and quantity of genetic cassettes were also found for the Dot/Icm type IVB secretion system. To determine divergences between core-genomes, additional pan-genome descriptions were compiled for the so-termed LF and EM genogroups. Open pan-genomes composed of 2,924 and 2,778 genes and core-genomes composed of 2,170 and 2,228 genes were respectively found for the LF and EM genogroups. The core-genomes were functionally annotated using the Gene Ontology, KEGG, and Virulence Factor databases, revealing the presence of several shared groups of genes related to basic function of intracellular survival and bacterial pathogenesis. Additionally, the specific pan-genomes for the LF and EM genogroups were defined, resulting in the identification of 148 and 273 exclusive proteins, respectively. Notably, specific virulence factors linked to adherence, colonization, invasion factors, and endotoxins were established. The obtained data suggest that these

  2. Mitochondrial genomics in the Genus Phytophthora with a focus on Phytophthora ramorum

    Science.gov (United States)

    Frank N. Martin; Paul Richardson

    2008-01-01

    The mitochondrial genomes of Phytophthora infestans, P. ramorum and P. sojae have been sequenced and comparative genomics has provided an opportunity to examine the processes involved with genome evolution in the genus Phytophthora. This approach can also be useful in assessing intraspecific...

  3. RadGenomics project

    Energy Technology Data Exchange (ETDEWEB)

Iwakawa, Mayumi; Imai, Takashi; Harada, Yoshinobu [National Inst. of Radiological Sciences, Chiba (Japan). Frontier Research Center] [and others]

    2002-06-01

    Human health is determined by a complex interplay of factors, predominantly between genetic susceptibility, environmental conditions and aging. The ultimate aim of the RadGenomics (Radiation Genomics) project is to understand the implications of heterogeneity in responses to ionizing radiation arising from genetic variation between individuals in the human population. The rapid progression of the human genome sequencing and the recent development of new technologies in molecular genetics are providing us with new opportunities to understand the genetic basis of individual differences in susceptibility to natural and/or artificial environmental factors, including radiation exposure. The RadGenomics project will inevitably lead to improved protocols for personalized radiotherapy and reductions in the potential side effects of such treatment. The project will contribute to future research into the molecular mechanisms of radiation sensitivity in humans and will stimulate the development of new high-throughput technologies for a broader application of biological and medical sciences. The staff members are specialists in a variety of fields, including genome science, radiation biology, medical science, molecular biology, and informatics, and have joined the RadGenomics project from various universities, companies, and research institutes. The project started in April 2001. (author)

  4. Comparative Genome Viewer

    International Nuclear Information System (INIS)

    Molineris, I.; Sales, G.

    2009-01-01

    The amount of information about genomes, both in the form of complete sequences and annotations, has been exponentially increasing in the last few years. As a result there is the need for tools providing a graphical representation of such information that should be comprehensive and intuitive. Visual representation is especially important in the comparative genomics field since it should provide a combined view of data belonging to different genomes. We believe that existing tools are limited in this respect as they focus on a single genome at a time (conservation histograms) or compress alignment representation to a single dimension. We have therefore developed a web-based tool called Comparative Genome Viewer (Cgv): it integrates a bidimensional representation of alignments between two regions, both at small and big scales, with the richness of annotations present in other genome browsers. We give access to our system through a web-based interface that provides the user with an interactive representation that can be updated in real time using the mouse to move from region to region and to zoom in on interesting details.

  5. Human social genomics.

    Directory of Open Access Journals (Sweden)

    Steven W Cole

    2014-08-01

Full Text Available A growing literature in human social genomics has begun to analyze how everyday life circumstances influence human gene expression. Social-environmental conditions such as urbanity, low socioeconomic status, social isolation, social threat, and low or unstable social status have been found to associate with differential expression of hundreds of gene transcripts in leukocytes and diseased tissues such as metastatic cancers. In leukocytes, diverse types of social adversity evoke a common conserved transcriptional response to adversity (CTRA) characterized by increased expression of proinflammatory genes and decreased expression of genes involved in innate antiviral responses and antibody synthesis. Mechanistic analyses have mapped the neural "social signal transduction" pathways that stimulate CTRA gene expression in response to social threat and may contribute to social gradients in health. Research has also begun to analyze the functional genomics of optimal health and thriving. Two emerging opportunities now stand to revolutionize our understanding of the everyday life of the human genome: network genomics analyses examining how systems-level capabilities emerge from groups of individual socially sensitive genomes and near-real-time transcriptional biofeedback to empirically optimize individual well-being in the context of the unique genetic, geographic, historical, developmental, and social contexts that jointly shape the transcriptional realization of our innate human genomic potential for thriving.

  6. RadGenomics project

    International Nuclear Information System (INIS)

    Iwakawa, Mayumi; Imai, Takashi; Harada, Yoshinobu

    2002-01-01

    Human health is determined by a complex interplay of factors, predominantly between genetic susceptibility, environmental conditions and aging. The ultimate aim of the RadGenomics (Radiation Genomics) project is to understand the implications of heterogeneity in responses to ionizing radiation arising from genetic variation between individuals in the human population. The rapid progression of the human genome sequencing and the recent development of new technologies in molecular genetics are providing us with new opportunities to understand the genetic basis of individual differences in susceptibility to natural and/or artificial environmental factors, including radiation exposure. The RadGenomics project will inevitably lead to improved protocols for personalized radiotherapy and reductions in the potential side effects of such treatment. The project will contribute to future research into the molecular mechanisms of radiation sensitivity in humans and will stimulate the development of new high-throughput technologies for a broader application of biological and medical sciences. The staff members are specialists in a variety of fields, including genome science, radiation biology, medical science, molecular biology, and informatics, and have joined the RadGenomics project from various universities, companies, and research institutes. The project started in April 2001. (author)

  7. Genomic affinity between Oryza sativa and Oryza brachyantha as ...

    African Journals Online (AJOL)

    USER

    2010-05-24

    May 24, 2010 ... distantly related wild species, usually difficult to produce. These hybrids are ... assessment of genomic relationship between these two species by .... closely related and the sterility of the hybrid may be due to genetics and not ...

  8. Ultrafast comparison of personal genomes

    OpenAIRE

    Mauldin, Denise; Hood, Leroy; Robinson, Max; Glusman, Gustavo

    2017-01-01

    We present an ultra-fast method for comparing personal genomes. We transform the standard genome representation (lists of variants relative to a reference) into 'genome fingerprints' that can be readily compared across sequencing technologies and reference versions. Because of their reduced size, computation on the genome fingerprints is fast and requires little memory. This enables scaling up a variety of important genome analyses, including quantifying relatedness, recognizing duplicative s...
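The core idea described above — compressing a genome's variant list into a small, directly comparable summary — can be sketched with a Bloom-filter-style bit vector compared by Jaccard similarity. This is an illustrative stand-in, not the authors' published fingerprint scheme; all variant strings below are invented.

```python
# Hedged sketch of a genome "fingerprint": hash each variant into a
# fixed-size bit vector, then compare genomes via bit-level Jaccard
# similarity instead of aligning full variant lists.
import hashlib

def fingerprint(variants, size=2048):
    """Fold variant strings into a fixed-size bit vector."""
    bits = [False] * size
    for v in variants:
        h = int(hashlib.sha256(v.encode()).hexdigest(), 16)
        bits[h % size] = True
    return bits

def jaccard(a, b):
    """Similarity of two fingerprints: shared set bits / total set bits."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return inter / union

g1 = [f"chr1:{p}:A>G" for p in range(200)]            # genome 1 variants
g2 = g1[:180] + [f"chr2:{p}:C>T" for p in range(20)]  # shares 90% with g1
g3 = [f"chr3:{p}:G>C" for p in range(200)]            # unrelated genome

print(round(jaccard(fingerprint(g1), fingerprint(g2)), 2))  # high
print(round(jaccard(fingerprint(g1), fingerprint(g3)), 2))  # near zero
```

The fixed size is what buys the speed: comparing two fingerprints is O(size) regardless of how many variants each genome carries, and the hashing makes the summary independent of reference version or variant ordering.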

  9. Design of Genomic Signatures of Pathogen Identification & Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T; Gardner, S; Allen, J; Vitalis, E; Jaing, C

    2010-02-09

    This chapter will address some of the many issues associated with the identification of signatures based on genomic DNA/RNA, which can be used to identify and characterize pathogens for biodefense and microbial forensic goals. For the purposes of this chapter, we define a signature as one or more strings of contiguous genomic DNA or RNA bases that are sufficient to identify a pathogenic target of interest at the desired resolution and which could be instantiated with particular detection chemistry on a particular platform. The target may be a whole organism, an individual functional mechanism (e.g., a toxin gene), or simply a nucleic acid indicative of the organism. The desired resolution will vary with each program's goals but could easily range from family to genus to species to strain to isolate. The resolution may not be taxonomically based but rather pan-mechanistic in nature: detecting virulence or antibiotic-resistance genes shared by multiple microbes. Entire industries exist around different detection chemistries and instrument platforms for identification of pathogens, and we will only briefly mention a few of the techniques that we have used at Lawrence Livermore National Laboratory (LLNL) to support our biosecurity-related work since 2000. Most nucleic acid based detection chemistries involve the ability to isolate and amplify the signature target region(s), combined with a technique to detect the amplification. Genomic signature based identification techniques have the advantage of being precise, highly sensitive and relatively fast in comparison to biochemical typing methods and protein signatures. Classical biochemical typing methods were developed long before knowledge of DNA and resulted in dozens of tests (Gram's stain, differential growth characteristics media, etc.) that could be used to roughly characterize the major known pathogens (of course some are uncultivable). These tests could take many days to complete and precise resolution

  10. Genomics using the Assembly of the Mink Genome

    DEFF Research Database (Denmark)

    Guldbrandtsen, Bernt; Cai, Zexi; Sahana, Goutam

    2018-01-01

The American Mink’s (Neovison vison) genome has recently been sequenced. This opens numerous avenues of research, both for studying the basic genetics and physiology of the mink and for genetic improvement in mink. Using genotyping-by-sequencing (GBS) generated marker data for 2,352 Danish farm...... mink, runs of homozygosity (ROH) were detected in mink genomes. Detectable ROH made up on average 1.7% of the genome, indicating the presence of at most a moderate level of genomic inbreeding. The fraction of genome regions found in ROH varied. Ten percent of the included regions were never found in ROH....... The ability to detect ROH in the mink genome also demonstrates the general reliability of the new mink genome assembly. Keywords: american mink, run of homozygosity, genome, selection, genomic inbreeding...
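At their simplest, runs of homozygosity like those in this record are stretches of consecutive homozygous marker calls along a chromosome. A toy sketch (the call encoding and length threshold are illustrative, not the study's actual criteria):

```python
# Minimal ROH scan: find homozygous runs of at least min_len consecutive
# markers and report the fraction of markers covered by such runs.

def find_roh(calls, min_len=5):
    """Return (start, end) index pairs of homozygous runs >= min_len.
    calls: ordered 'hom'/'het' genotype calls along one chromosome."""
    runs, start = [], None
    for i, c in enumerate(calls):
        if c == "hom":
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                runs.append((start, i))
            start = None
    if start is not None and len(calls) - start >= min_len:
        runs.append((start, len(calls)))
    return runs

calls = ["hom"] * 8 + ["het"] + ["hom"] * 3 + ["het"] + ["hom"] * 6
runs = find_roh(calls)
covered = sum(e - s for s, e in runs)
print(runs, round(covered / len(calls), 2))  # [(0, 8), (13, 19)] 0.74
```

Production tools work in physical coordinates and tolerate occasional heterozygous or missing calls inside a run, but the covered-fraction statistic (the study's 1.7% figure) is computed the same way.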

  11. Genome size analyses of Pucciniales reveal the largest fungal genomes.

    Science.gov (United States)

    Tavares, Sílvia; Ramos, Ana Paula; Pires, Ana Sofia; Azinheira, Helena G; Caldeirinha, Patrícia; Link, Tobias; Abranches, Rita; Silva, Maria do Céu; Voegele, Ralf T; Loureiro, João; Talhinhas, Pedro

    2014-01-01

    Rust fungi (Basidiomycota, Pucciniales) are biotrophic plant pathogens which exhibit diverse complexities in their life cycles and host ranges. The completion of genome sequencing of a few rust fungi has revealed the occurrence of large genomes. Sequencing efforts for other rust fungi have been hampered by uncertainty concerning their genome sizes. Flow cytometry was recently applied to estimate the genome size of a few rust fungi, and confirmed the occurrence of large genomes in this order (averaging 225.3 Mbp, while the average for Basidiomycota was 49.9 Mbp and was 37.7 Mbp for all fungi). In this work, we have used an innovative and simple approach to simultaneously isolate nuclei from the rust and its host plant in order to estimate the genome size of 30 rust species by flow cytometry. Genome sizes varied over 10-fold, from 70 to 893 Mbp, with an average genome size value of 380.2 Mbp. Compared to the genome sizes of over 1800 fungi, Gymnosporangium confusum possesses the largest fungal genome ever reported (893.2 Mbp). Moreover, even the smallest rust genome determined in this study is larger than the vast majority of fungal genomes (94%). The average genome size of the Pucciniales is now of 305.5 Mbp, while the average Basidiomycota genome size has shifted to 70.4 Mbp and the average for all fungi reached 44.2 Mbp. Despite the fact that no correlation could be drawn between the genome sizes, the phylogenomics or the life cycle of rust fungi, it is interesting to note that rusts with Fabaceae hosts present genomes clearly larger than those with Poaceae hosts. Although this study comprises only a small fraction of the more than 7000 rust species described, it seems already evident that the Pucciniales represent a group where genome size expansion could be a common characteristic. This is in sharp contrast to sister taxa, placing this order in a relevant position in fungal genomics research.
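The flow-cytometry estimates in this record rest on a simple proportionality: nuclei are co-stained with an internal standard of known genome size, and the unknown size is the standard's size scaled by the ratio of fluorescence peak positions. A sketch with invented peak values (not data from the study):

```python
# Relative-fluorescence genome sizing: unknown size = standard size
# scaled by the ratio of mean fluorescence peaks of co-stained nuclei.

def genome_size_mbp(sample_peak, standard_peak, standard_size_mbp):
    """Estimate genome size (Mbp) from relative fluorescence intensity."""
    return standard_size_mbp * (sample_peak / standard_peak)

rust_peak = 420.0      # fluorescence peak of the rust nuclei (a.u.)
host_peak = 230.0      # peak of the host-plant internal standard (a.u.)
host_size_mbp = 490.0  # assumed known genome size of the standard

estimate = genome_size_mbp(rust_peak, host_peak, host_size_mbp)
print(round(estimate, 1))  # prints 894.8
```

The innovation the abstract describes is isolating rust and host nuclei simultaneously, so both peaks come from one preparation and the ratio is measured under identical staining conditions.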

  12. A comprehensive and quantitative exploration of thousands of viral genomes

    Science.gov (United States)

    Mahmoudabadi, Gita

    2018-01-01

    The complete assembly of viral genomes from metagenomic datasets (short genomic sequences gathered from environmental samples) has proven to be challenging, so there are significant blind spots when we view viral genomes through the lens of metagenomics. One approach to overcoming this problem is to leverage the thousands of complete viral genomes that are publicly available. Here we describe our efforts to assemble a comprehensive resource that provides a quantitative snapshot of viral genomic trends – such as gene density, noncoding percentage, and abundances of functional gene categories – across thousands of viral genomes. We have also developed a coarse-grained method for visualizing viral genome organization for hundreds of genomes at once, and have explored the extent of the overlap between bacterial and bacteriophage gene pools. Existing viral classification systems were developed prior to the sequencing era, so we present our analysis in a way that allows us to assess the utility of the different classification systems for capturing genomic trends. PMID:29624169

  13. Genomes to Proteomes

    Energy Technology Data Exchange (ETDEWEB)

    Panisko, Ellen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Grigoriev, Igor [USDOE Joint Genome Inst., Walnut Creek, CA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webb-Robertson, Bobbie-Jo [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baker, Scott E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2009-03-01

    Biologists are awash with genomic sequence data. In large part, this is due to the rapid acceleration in the generation of DNA sequence that occurred as public and private research institutes raced to sequence the human genome. In parallel with the large human genome effort, mostly smaller genomes of other important model organisms were sequenced. Projects following on these initial efforts have made use of technological advances and the DNA sequencing infrastructure that was built for the human and other organism genome projects. As a result, the genome sequences of many organisms are available in high quality draft form. While in many ways this is good news, there are limitations to the biological insights that can be gleaned from DNA sequences alone; genome sequences offer only a bird's eye view of the biological processes endemic to an organism or community. Fortunately, the genome sequences now being produced at such a high rate can serve as the foundation for other global experimental platforms such as proteomics. Proteomic methods offer a snapshot of the proteins present at a point in time for a given biological sample. Current global proteomics methods combine enzymatic digestion, separations, mass spectrometry and database searching for peptide identification. One key aspect of proteomics is the prediction of peptide sequences from mass spectrometry data. Global proteomic analysis uses computational matching of experimental mass spectra with predicted spectra based on databases of gene models that are often generated computationally. Thus, the quality of gene models predicted from a genome sequence is crucial in the generation of high quality peptide identifications. Once peptides are identified they can be assigned to their parent protein. Proteins identified as expressed in a given experiment are most useful when compared to other expressed proteins in a larger biological context or biochemical pathway. In this chapter we will discuss the automatic

  14. Experimental Induction of Genome Chaos.

    Science.gov (United States)

    Ye, Christine J; Liu, Guo; Heng, Henry H

    2018-01-01

    Genome chaos, or karyotype chaos, represents a powerful survival strategy for somatic cells under high levels of stress/selection. Since the genome context, not the gene content, encodes the genomic blueprint of the cell, stress-induced rapid and massive reorganization of genome topology functions as a very important mechanism for genome (karyotype) evolution. In recent years, the phenomenon of genome chaos has been confirmed by various sequencing efforts, and many different terms have been coined to describe different subtypes of the chaotic genome including "chromothripsis," "chromoplexy," and "structural mutations." To advance this exciting field, we need an effective experimental system to induce and characterize the karyotype reorganization process. In this chapter, an experimental protocol to induce chaotic genomes is described, following a brief discussion of the mechanism and implication of genome chaos in cancer evolution.

  15. Genome Sequences of Oryza Species

    KAUST Repository

    Kumagai, Masahiko; Tanaka, Tsuyoshi; Ohyanagi, Hajime; Hsing, Yue-Ie C.; Itoh, Takeshi

    2018-01-01

    This chapter summarizes recent data obtained from genome sequencing, annotation projects, and studies on the genome diversity of Oryza sativa and related Oryza species. O. sativa, commonly known as Asian rice, is the first monocot species whose complete genome sequence was deciphered based on physical mapping by an international collaborative effort. This genome, along with its accurate and comprehensive annotation, has become an indispensable foundation for crop genomics and breeding. With the development of innovative sequencing technologies, genomic studies of O. sativa have dramatically increased; in particular, a large number of cultivars and wild accessions have been sequenced and compared with the reference rice genome. Since de novo genome sequencing has become cost-effective, the genome of African cultivated rice, O. glaberrima, has also been determined. Comparative genomic studies have highlighted the independent domestication processes of different rice species, but it also turned out that Asian and African rice share a common gene set that has experienced similar artificial selection. An international project aimed at constructing reference genomes and examining the genome diversity of wild Oryza species is currently underway, and the genomes of some species are publicly available. This project provides a platform for investigations such as the evolution, development, polyploidization, and improvement of crops. Studies on the genomic diversity of Oryza species, including wild species, should provide new insights to solve the problem of growing food demands in the face of rapid climatic changes.

  16. Genome Sequences of Oryza Species

    KAUST Repository

    Kumagai, Masahiko

    2018-02-14

    This chapter summarizes recent data obtained from genome sequencing, annotation projects, and studies on the genome diversity of Oryza sativa and related Oryza species. O. sativa, commonly known as Asian rice, is the first monocot species whose complete genome sequence was deciphered based on physical mapping by an international collaborative effort. This genome, along with its accurate and comprehensive annotation, has become an indispensable foundation for crop genomics and breeding. With the development of innovative sequencing technologies, genomic studies of O. sativa have dramatically increased; in particular, a large number of cultivars and wild accessions have been sequenced and compared with the reference rice genome. Since de novo genome sequencing has become cost-effective, the genome of African cultivated rice, O. glaberrima, has also been determined. Comparative genomic studies have highlighted the independent domestication processes of different rice species, but it also turned out that Asian and African rice share a common gene set that has experienced similar artificial selection. An international project aimed at constructing reference genomes and examining the genome diversity of wild Oryza species is currently underway, and the genomes of some species are publicly available. This project provides a platform for investigations such as the evolution, development, polyploidization, and improvement of crops. Studies on the genomic diversity of Oryza species, including wild species, should provide new insights to solve the problem of growing food demands in the face of rapid climatic changes.

  17. Genome position specific priors for genomic prediction

    DEFF Research Database (Denmark)

    Brøndum, Rasmus Froberg; Su, Guosheng; Lund, Mogens Sandø

    2012-01-01

causal mutation is different between the populations but affects the same gene. Proportions of a four-distribution mixture for SNP effects in segments of fixed size along the genome are derived from one population and set as location specific prior proportions of distributions of SNP effects...... for the target population. The model was tested using dairy cattle populations of different breeds: 540 Australian Jersey bulls, 2297 Australian Holstein bulls and 5214 Nordic Holstein bulls. The traits studied were protein-, fat- and milk yield. Genotypic data were Illumina 777K SNPs, real or imputed. Results......

  18. Genome-wide identification of significant aberrations in cancer genome.

    Science.gov (United States)

    Yuan, Xiguo; Yu, Guoqiang; Hou, Xuchu; Shih, Ie-Ming; Clarke, Robert; Zhang, Junying; Hoffman, Eric P; Wang, Roger R; Zhang, Zhen; Wang, Yue

    2012-07-27

Somatic Copy Number Alterations (CNAs) in human genomes are present in almost all human cancers. Systematic efforts to characterize such structural variants must effectively distinguish significant consensus events from random background aberrations. Here we introduce Significant Aberration in Cancer (SAIC), a new method for characterizing and assessing the statistical significance of recurrent CNA units. Three main features of SAIC include: (1) exploiting the intrinsic correlation among consecutive probes to assign a score to each CNA unit instead of single probes; (2) performing permutations on CNA units that preserve correlations inherent in the copy number data; and (3) iteratively detecting Significant Copy Number Aberrations (SCAs) and estimating an unbiased null distribution by applying an SCA-exclusive permutation scheme. We test and compare the performance of SAIC against four peer methods (GISTIC, STAC, KC-SMART, CMDS) on a large number of simulation datasets. Experimental results show that SAIC outperforms peer methods in terms of larger area under the Receiver Operating Characteristics curve and increased detection power. We then apply SAIC to analyze structural genomic aberrations acquired in four real cancer genome-wide copy number data sets (ovarian cancer, metastatic prostate cancer, lung adenocarcinoma, glioblastoma). When compared with previously reported results, SAIC successfully identifies most SCAs known to be of biological significance and associated with oncogenes (e.g., KRAS, CCNE1, and MYC) or tumor suppressor genes (e.g., CDKN2A/B). Furthermore, SAIC identifies a number of novel SCAs in these copy number data that encompass tumor related genes and may warrant further studies. Supported by a well-grounded theoretical framework, SAIC has been developed and used to identify SCAs in various cancer copy number data sets, providing useful information to study the landscape of cancer genomes. Open-source and platform-independent SAIC software is available.
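The permutation idea in this abstract — scoring CNA units and building a null distribution by permuting units rather than individual probes — can be illustrated in miniature. The toy version below permutes whole segments within each sample and pools the permuted scores into one null; it is a deliberate simplification, not the published SAIC algorithm (which adds unit construction from correlated probes and SCA-exclusive iteration).

```python
# Toy segment-level permutation test for recurrent copy-number gains:
# score each segment by its mean deviation across samples, then compare
# against a null built by shuffling segments within each sample.
import random

def segment_scores(matrix):
    """Mean copy-number deviation per segment (column) across samples."""
    n = len(matrix)
    return [sum(row[j] for row in matrix) / n for j in range(len(matrix[0]))]

def permutation_p_values(matrix, n_perm=2000, seed=1):
    """Empirical p-value per segment against a pooled permutation null."""
    rng = random.Random(seed)
    observed = segment_scores(matrix)
    null = []
    for _ in range(n_perm):
        shuffled = []
        for row in matrix:
            r = list(row)
            rng.shuffle(r)  # permute whole segments within each sample
            shuffled.append(r)
        null.extend(segment_scores(shuffled))
    return [sum(1 for s in null if s >= obs) / len(null) for obs in observed]

# toy data: 8 samples x 6 segments; segment 2 is recurrently amplified
data = [[0.1 * random.Random(i * 7 + j).uniform(-1, 1)
         + (0.8 if j == 2 else 0.0)
         for j in range(6)] for i in range(8)]

pvals = permutation_p_values(data)
print([round(p, 3) for p in pvals])  # segment 2 stands out as significant
```

Permuting whole segments, rather than single probes, is what preserves within-unit correlation in the null, which is the property the abstract highlights as SAIC's second feature.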

  19. Genome-wide identification of significant aberrations in cancer genome

    Directory of Open Access Journals (Sweden)

    Yuan Xiguo

    2012-07-01

Full Text Available Abstract Background Somatic Copy Number Alterations (CNAs) in human genomes are present in almost all human cancers. Systematic efforts to characterize such structural variants must effectively distinguish significant consensus events from random background aberrations. Here we introduce Significant Aberration in Cancer (SAIC), a new method for characterizing and assessing the statistical significance of recurrent CNA units. Three main features of SAIC include: (1) exploiting the intrinsic correlation among consecutive probes to assign a score to each CNA unit instead of single probes; (2) performing permutations on CNA units that preserve correlations inherent in the copy number data; and (3) iteratively detecting Significant Copy Number Aberrations (SCAs) and estimating an unbiased null distribution by applying an SCA-exclusive permutation scheme. Results We test and compare the performance of SAIC against four peer methods (GISTIC, STAC, KC-SMART, CMDS) on a large number of simulation datasets. Experimental results show that SAIC outperforms peer methods in terms of larger area under the Receiver Operating Characteristics curve and increased detection power. We then apply SAIC to analyze structural genomic aberrations acquired in four real cancer genome-wide copy number data sets (ovarian cancer, metastatic prostate cancer, lung adenocarcinoma, glioblastoma). When compared with previously reported results, SAIC successfully identifies most SCAs known to be of biological significance and associated with oncogenes (e.g., KRAS, CCNE1, and MYC) or tumor suppressor genes (e.g., CDKN2A/B). Furthermore, SAIC identifies a number of novel SCAs in these copy number data that encompass tumor related genes and may warrant further studies. Conclusions Supported by a well-grounded theoretical framework, SAIC has been developed and used to identify SCAs in various cancer copy number data sets, providing useful information to study the landscape of cancer genomes

  20. Genomics of Volvocine Algae

    Science.gov (United States)

    Umen, James G.; Olson, Bradley J.S.C.

    2015-01-01

    Volvocine algae are a group of chlorophytes that together comprise a unique model for evolutionary and developmental biology. The species Chlamydomonas reinhardtii and Volvox carteri represent extremes in morphological diversity within the Volvocine clade. Chlamydomonas is unicellular and reflects the ancestral state of the group, while Volvox is multicellular and has evolved numerous innovations including germ-soma differentiation, sexual dimorphism, and complex morphogenetic patterning. The Chlamydomonas genome sequence has shed light on several areas of eukaryotic cell biology, metabolism and evolution, while the Volvox genome sequence has enabled a comparison with Chlamydomonas that reveals some of the underlying changes that enabled its transition to multicellularity, but also underscores the subtlety of this transition. Many of the tools and resources are in place to further develop Volvocine algae as a model for evolutionary genomics. PMID:25883411

  1. Genomics of Preterm Birth

    Science.gov (United States)

    Swaggart, Kayleigh A.; Pavlicev, Mihaela; Muglia, Louis J.

    2015-01-01

    The molecular mechanisms controlling human birth timing at term, or resulting in preterm birth, have been the focus of considerable investigation, but limited insights have been gained over the past 50 years. In part, these processes have remained elusive because of divergence in reproductive strategies and physiology shown by model organisms, making extrapolation to humans uncertain. Here, we summarize the evolution of progesterone signaling and variation in pregnancy maintenance and termination. We use this comparative physiology to support the hypothesis that selective pressure on genomic loci involved in the timing of parturition has shaped human birth timing, and that these loci can be identified with comparative genomic strategies. Previous limitations imposed by divergence of mechanisms provide an important new opportunity to elucidate fundamental pathways of parturition control through increasing availability of sequenced genomes and associated reproductive physiology characteristics across diverse organisms. PMID:25646385

  2. Genomics of Salmonella Species

    Science.gov (United States)

    Canals, Rocio; McClelland, Michael; Santiviago, Carlos A.; Andrews-Polymenis, Helene

    Progress in the study of Salmonella survival, colonization, and virulence has increased rapidly with the advent of complete genome sequencing and higher capacity assays for transcriptomic and proteomic analysis. Although many of these techniques have yet to be used to directly assay Salmonella growth on foods, these assays are currently in use to determine Salmonella factors necessary for growth in animal models including livestock animals and in in vitro conditions that mimic many different environments. As sequencing of the Salmonella genome and microarray analysis have revolutionized genomics and transcriptomics of salmonellae over the last decade, so are new high-throughput sequencing technologies currently accelerating the pace of our studies and allowing us to approach complex problems that were not previously experimentally tractable.

  3. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed

  4. A Web-Based Comparative Genomics Tutorial for Investigating Microbial Genomes

    Directory of Open Access Journals (Sweden)

    Michael Strong

    2009-12-01

    As the number of completely sequenced microbial genomes continues to rise at an impressive rate, it is important to prepare students with the skills necessary to investigate microorganisms at the genomic level. As a part of the core curriculum for first-year graduate students in the biological sciences, we have implemented a web-based tutorial to introduce students to the fields of comparative and functional genomics. The tutorial focuses on recent computational methods for identifying functionally linked genes and proteins on a genome-wide scale and was used to introduce students to the Rosetta Stone, Phylogenetic Profile, conserved Gene Neighbor, and Operon computational methods. Students learned to use a number of publicly available web servers and databases to identify functionally linked genes in the Escherichia coli genome, with emphasis on genome organization and operon structure. The overall effectiveness of the tutorial was assessed based on student evaluations and homework assignments. The tutorial is available to other educators at http://www.doe-mbi.ucla.edu/~strong/m253.php.
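Of the methods the tutorial covers, the Phylogenetic Profile idea is the simplest to sketch: genes whose presence/absence pattern across many genomes matches are predicted to be functionally linked. The gene names and profiles below are invented for illustration; real profiles come from homology searches across sequenced genomes.

```python
# Hypothetical presence/absence profiles: 1 if a homolog of the gene is
# found in that genome, 0 if not (gene names are illustrative only).
profiles = {
    "geneA": (1, 0, 1, 1, 0, 1),
    "geneB": (1, 0, 1, 1, 0, 1),   # identical profile -> predicted linked
    "geneC": (0, 1, 0, 0, 1, 1),
}

def hamming(p, q):
    """Number of genomes where two presence/absence profiles disagree."""
    return sum(a != b for a, b in zip(p, q))

def predicted_partners(gene, profiles, max_mismatch=0):
    """Genes whose profile matches `gene`'s within `max_mismatch` genomes."""
    ref = profiles[gene]
    return sorted(g for g, p in profiles.items()
                  if g != gene and hamming(ref, p) <= max_mismatch)
```

Raising `max_mismatch` trades precision for recall, which is the usual tuning knob when profiles are noisy.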

  5. Ebolavirus comparative genomics

    Science.gov (United States)

    Jun, Se-Ran; Leuze, Michael R.; Nookaew, Intawat; Uberbacher, Edward C.; Land, Miriam; Zhang, Qian; Wanchai, Visanu; Chai, Juanjuan; Nielsen, Morten; Trolle, Thomas; Lund, Ole; Buzard, Gregory S.; Pedersen, Thomas D.; Wassenaar, Trudy M.; Ussery, David W.

    2015-01-01

    The 2014 Ebola outbreak in West Africa is the largest documented for this virus. To examine the dynamics of this genome, we compare more than 100 currently available ebolavirus genomes to each other and to other viral genomes. Based on oligomer frequency analysis, the family Filoviridae forms a distinct group from all other sequenced viral genomes. All filovirus genomes sequenced to date encode proteins with similar functions and gene order, although there is considerable divergence in sequences between the three genera Ebolavirus, Cuevavirus and Marburgvirus within the family Filoviridae. Whereas all ebolavirus genomes are quite similar (multiple sequences of the same strain are often identical), variation is most common in the intergenic regions and within specific areas of the genes encoding the glycoprotein (GP), nucleoprotein (NP) and polymerase (L). We predict regions that could contain epitope-binding sites, which might be good vaccine targets. This information, combined with glycosylation sites and experimentally determined epitopes, can identify the most promising regions for the development of therapeutic strategies. This manuscript has been authored by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). PMID:26175035
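Oligomer frequency analysis of the kind used above to separate Filoviridae from other viral genomes can be sketched as follows. The choice of k = 4 and a Euclidean distance are illustrative assumptions, not the study's exact method.

```python
from collections import Counter
from math import sqrt

def kmer_freqs(seq, k=4):
    """Normalised oligomer (k-mer) frequency vector of a genome sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def composition_distance(f, g):
    """Euclidean distance between two frequency vectors; genomes with
    similar oligomer usage score close to zero."""
    kmers = set(f) | set(g)
    return sqrt(sum((f.get(m, 0.0) - g.get(m, 0.0)) ** 2 for m in kmers))
```

Clustering genomes by such distances is what lets a family with distinctive composition form its own group without any alignment step.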

  6. Brief Guide to Genomics: DNA, Genes and Genomes

    Science.gov (United States)

    ... clinic. Most new drugs based on genome-based research are estimated to be at least 10 to 15 years away, though recent genome-driven efforts in lipid-lowering therapy have considerably shortened that interval. According ...

  7. Genomic Prediction in Barley

    DEFF Research Database (Denmark)

    Edriss, Vahid; Cericola, Fabio; Jensen, Jens D

    2015-01-01

    to next generation. The main goal of this study was to assess the potential of using genomic prediction in a commercial barley breeding program. The data used in this study were from the Nordic Seed company, which is located in Denmark. Around 350 advanced lines were genotyped with the Illumina 9K barley chip. ... Traits used in this study were grain yield, plant height and heading date. Heading date is the number of days after 1st June until the plant heads. Heritabilities were 0.33, 0.44 and 0.48 for yield, height and heading, respectively, for the average of nine plots. The GBLUP model was used for genomic...
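A minimal GBLUP sketch follows, assuming a VanRaden-style genomic relationship matrix and a known heritability. The data are simulated and the names are illustrative; this is not the Nordic Seed dataset or pipeline.

```python
import numpy as np

def vanraden_G(M):
    """Genomic relationship matrix from a (lines x markers) 0/1/2 matrix,
    following VanRaden's first method: centre by allele frequencies and
    scale by 2*sum(p*(1-p))."""
    p = M.mean(axis=0) / 2.0
    W = M - 2.0 * p
    c = 2.0 * np.sum(p * (1.0 - p))
    return W @ W.T / c

def gblup(y, G, h2):
    """BLUP of genomic values: u = G (G + lambda*I)^-1 (y - mean),
    with lambda the residual-to-genetic variance ratio implied by h2."""
    lam = (1.0 - h2) / h2
    n = len(y)
    u = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())
    return y.mean() + u
```

In practice h2 (or the variance components) would be estimated, e.g. by REML, rather than supplied; the solve step is the whole of the prediction.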

  8. Comparative Genomics Reveals High Genomic Diversity in the Genus Photobacterium.

    Science.gov (United States)

    Machado, Henrique; Gram, Lone

    2017-01-01

    Vibrionaceae is a large marine bacterial family, which can constitute up to 50% of the prokaryotic population in marine waters. Photobacterium is the second largest genus in the family and we used comparative genomics on 35 strains representing 16 of the 28 species described so far, to understand the genomic diversity present in the Photobacterium genus. Such understanding is important for ecophysiology studies of the genus. We used whole genome sequences to evaluate phylogenetic relationships using several analyses (16S rRNA, MLSA, fur, amino-acid usage, ANI), which allowed us to identify two misidentified strains. Genome analyses also revealed occurrence of higher and lower GC content clades, correlating with phylogenetic clusters. Pan- and core-genome analysis revealed the conservation of 25% of the genome throughout the genus, with a large and open pan-genome. The major source of genomic diversity could be traced to the smaller chromosome and plasmids. Several of the physiological traits studied in the genus did not correlate with phylogenetic data. Since horizontal gene transfer (HGT) is often suggested as a source of genetic diversity and a potential driver of genomic evolution in bacterial species, we looked into evidence of such in Photobacterium genomes. Genomic islands were the source of genomic differences between strains of the same species. Also, we found transposase genes and CRISPR arrays that suggest multiple encounters with foreign DNA. Presence of genomic exchange traits was widespread and abundant in the genus, suggesting a role in genomic evolution. The high genetic variability and indications of genetic exchange make it difficult to elucidate genome evolutionary paths and raise the awareness of the roles of foreign DNA in the genomic evolution of environmental organisms.
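Of the phylogenetic measures listed, average nucleotide identity (ANI) is the easiest to sketch. The toy below assumes pre-aligned equal-length fragment pairs; real ANI pipelines first cut one genome into ~1 kb fragments and align them against the other with BLAST or MUMmer, so this only illustrates the averaging step.

```python
def fragment_identity(a, b):
    """Percent identity of two equal-length aligned fragments."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

def ani(fragment_pairs, min_identity=30.0):
    """Average nucleotide identity over fragment pairs that align at all;
    ~95% ANI is the conventional bacterial species boundary."""
    idents = [fragment_identity(a, b) for a, b in fragment_pairs]
    kept = [i for i in idents if i >= min_identity]
    return sum(kept) / len(kept)
```

The `min_identity` cutoff mimics discarding fragments with no credible alignment, which keeps unalignable regions from dragging the average down.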

  9. Genomic divergences among cattle, dog and human estimated from large-scale alignments of genomic sequences

    Directory of Open Access Journals (Sweden)

    Shade Larry L

    2006-06-01

    Background: Approximately 11 Mb of finished high quality genomic sequences were sampled from cattle, dog and human to estimate genomic divergences and their regional variation among these lineages. Results: Optimal three-way multi-species global sequence alignments for 84 cattle clones or loci (each >50 kb of genomic sequence) were constructed using the human and dog genome assemblies as references. Genomic divergences and substitution rates were examined for each clone and for various sequence classes under different functional constraints. Analysis of these alignments revealed that the overall genomic divergences are relatively constant (0.32–0.37 change/site) for pairwise comparisons among cattle, dog and human; however, substitution rates vary across genomic regions and among different sequence classes. A neutral mutation rate (2.0–2.2 × 10⁻⁹ change/site/year) was derived from ancestral repetitive sequences, whereas the substitution rate in coding sequences (1.1 × 10⁻⁹ change/site/year) was approximately half of the overall rate (1.9–2.0 × 10⁻⁹ change/site/year). Relative rate tests also indicated that cattle have a significantly faster rate of substitution as compared to dog and that this difference is about 6%. Conclusion: This analysis provides a large-scale and unbiased assessment of genomic divergences and regional variation of substitution rates among cattle, dog and human. It is expected that these data will serve as a baseline for future mammalian molecular evolution studies.
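As a back-of-envelope consistency check (not a calculation from the paper): since substitutions accumulate along both lineages after a split, a pairwise divergence d at neutral rate r implies a divergence time t = d / (2r). Using mid-range values from the reported intervals:

```python
divergence = 0.35        # change/site (reported 0.32-0.37)
neutral_rate = 2.1e-9    # change/site/year (reported 2.0-2.2e-9)

# Both lineages accumulate substitutions, hence the factor of two.
t_years = divergence / (2 * neutral_rate)
print(round(t_years / 1e6), "million years")   # prints: 83 million years
```

About 83 million years is the right order of magnitude for the radiation of the placental mammal lineages compared here, so the reported rate and divergence figures are mutually consistent.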

  10. phiGENOME: an integrative navigation throughout bacteriophage genomes.

    Science.gov (United States)

    Stano, Matej; Klucar, Lubos

    2011-11-01

    phiGENOME is a web-based genome browser generating dynamic and interactive graphical representations of phage genomes stored in phiSITE, a database of gene regulation in bacteriophages. phiGENOME is an integral part of the phiSITE web portal (http://www.phisite.org/phigenome) and was optimised for visualisation of phage genomes with the emphasis on gene regulatory elements. phiGENOME consists of three components: (i) a genome map viewer built using Adobe Flash technology, providing dynamic and interactive graphical display of phage genomes; (ii) a sequence browser based on precisely formatted HTML tags, providing detailed exploration of genome features at the sequence level; and (iii) a regulation illustrator, based on Scalable Vector Graphics (SVG) and designed for graphical representation of gene regulation. Bringing together 542 complete genome sequences accompanied by rich annotations and references makes phiGENOME a unique information resource in the field of phage genomics. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Improving Microbial Genome Annotations in an Integrated Database Context

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.; Chu, Ken; Anderson, Iain; Mavromatis, Konstantinos; Kyrpides, Nikos C.; Ivanova, Natalia N.

    2013-01-01

    Effective comparative analysis of microbial genomes requires a consistent and complete view of biological data. Consistency regards the biological coherence of annotations, while completeness regards the extent and coverage of functional characterization for genomes. We have developed tools that allow scientists to assess and improve the consistency and completeness of microbial genome annotations in the context of the Integrated Microbial Genomes (IMG) family of systems. All publicly available microbial genomes are characterized in IMG using different functional annotation and pathway resources, thus providing a comprehensive framework for identifying and resolving annotation discrepancies. A rule based system for predicting phenotypes in IMG provides a powerful mechanism for validating functional annotations, whereby the phenotypic traits of an organism are inferred based on the presence of certain metabolic reactions and pathways and compared to experimentally observed phenotypes. The IMG family of systems are available at http://img.jgi.doe.gov/. PMID:23424620
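The rule-based phenotype prediction described above can be illustrated with a toy rule engine: a phenotype is asserted when every pathway its rule requires is present in the genome's annotations. The rule names and pathway requirements below are invented for illustration and are not IMG's actual rule set.

```python
# Hypothetical rules: phenotype -> set of required annotated pathways.
rules = {
    "aerobe": {"TCA cycle", "cytochrome c oxidase"},
    "nitrogen fixer": {"nitrogenase"},
}

def predict_phenotypes(annotated_pathways, rules):
    """Assert each phenotype whose required pathways are all annotated;
    mismatches against observed phenotypes flag annotation gaps."""
    return sorted(p for p, req in rules.items() if req <= annotated_pathways)
```

The validation loop then compares these inferred traits with experimentally observed ones: a phenotype observed in the lab but not predicted points at missing or inconsistent annotations.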

  12. Improving microbial genome annotations in an integrated database context.

    Directory of Open Access Journals (Sweden)

    I-Min A Chen

    Effective comparative analysis of microbial genomes requires a consistent and complete view of biological data. Consistency regards the biological coherence of annotations, while completeness regards the extent and coverage of functional characterization for genomes. We have developed tools that allow scientists to assess and improve the consistency and completeness of microbial genome annotations in the context of the Integrated Microbial Genomes (IMG) family of systems. All publicly available microbial genomes are characterized in IMG using different functional annotation and pathway resources, thus providing a comprehensive framework for identifying and resolving annotation discrepancies. A rule based system for predicting phenotypes in IMG provides a powerful mechanism for validating functional annotations, whereby the phenotypic traits of an organism are inferred based on the presence of certain metabolic reactions and pathways and compared to experimentally observed phenotypes. The IMG family of systems are available at http://img.jgi.doe.gov/.

  13. Study of laser-generated debris free x-ray sources produced in a high-density linear Ar, Kr, Xe, Kr/Ar and Xe/Kr/Ar mixtures gas jets by 2 ω, sub-ps LLNL Titan laser

    Science.gov (United States)

    Kantsyrev, V. L.; Schultz, K. A.; Shlyaptseva, V. V.; Safronova, A. S.; Cooper, M. C.; Shrestha, I. K.; Petkov, E. E.; Stafford, A.; Moschella, J. J.; Schmidt-Petersen, M. T.; Butcher, C. J.; Kemp, G. E.; Andrews, S. D.; Fournier, K. B.

    2016-10-01

    The study of laser-generated debris-free x-ray sources in an underdense plasma produced in a high-density linear gas-puff jet was carried out at the LLNL Titan laser (2 ω, 45 J, sub-ps) with an intensity in the 10 µm focal spot of 7 × 10¹⁹ W/cm². A linear nozzle with a fast valve was used for the generation of a clusters/gas jet. X-ray diagnostics for the spectral region of 0.7–9 keV include: two spectrometers and pinhole cameras, and 3 groups of fast filtered detectors. Electron beams were measured with the EPPS magnetic spectrometer (>1 MeV) and Faraday cups (>72 keV). Spectralon/spectrometer devices were also used to measure absorption of laser radiation in the jets. New results were obtained on: anisotropic generation of x-rays (the laser-to-x-ray conversion coefficient was >1%) and characteristics of laser-generated electron beams; evolution of x-ray generation with the location of the laser focus in a cluster-gas jet; and observations of a strong x-ray flash in some focusing regimes. Non-LTE kinetic modeling was used to estimate plasma parameters. UNR work supported by the DTRA Basic Research Award # HDTRA1-13-1-0033. Work at LLNL was performed under the auspices of the U.S. DOE by LLNL under Contract DE-AC52-07NA27344.
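The quoted intensity can be checked with a quick order-of-magnitude estimate: intensity = energy / (pulse duration × focal-spot area). The pulse duration is an assumption here ("sub-ps" taken as ~0.7 ps), as is reading the 10 µm figure as the spot diameter.

```python
from math import pi

energy_J = 45.0
duration_s = 0.7e-12          # assumed: "sub-ps" ~ 0.7 ps
spot_diameter_cm = 10e-4      # assumed: 10 um focal-spot diameter

power_W = energy_J / duration_s
area_cm2 = pi * (spot_diameter_cm / 2) ** 2
intensity = power_W / area_cm2
print(f"{intensity:.1e}")     # -> 8.2e+19 (W/cm^2)
```

~8 × 10¹⁹ W/cm² agrees with the quoted 7 × 10¹⁹ W/cm² to within the uncertainty of the assumed pulse length and spot definition.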

  14. Matching phenotypes to whole genomes: Lessons learned from four iterations of the personal genome project community challenges.

    Science.gov (United States)

    Cai, Binghuang; Li, Biao; Kiga, Nikki; Thusberg, Janita; Bergquist, Timothy; Chen, Yun-Ching; Niknafs, Noushin; Carter, Hannah; Tokheim, Collin; Beleva-Guthrie, Violeta; Douville, Christopher; Bhattacharya, Rohit; Yeo, Hui Ting Grace; Fan, Jean; Sengupta, Sohini; Kim, Dewey; Cline, Melissa; Turner, Tychele; Diekhans, Mark; Zaucha, Jan; Pal, Lipika R; Cao, Chen; Yu, Chen-Hsin; Yin, Yizhou; Carraro, Marco; Giollo, Manuel; Ferrari, Carlo; Leonardi, Emanuela; Tosatto, Silvio C E; Bobe, Jason; Ball, Madeleine; Hoskins, Roger A; Repo, Susanna; Church, George; Brenner, Steven E; Moult, John; Gough, Julian; Stanke, Mario; Karchin, Rachel; Mooney, Sean D

    2017-09-01

    The advent of next-generation sequencing has dramatically decreased the cost for whole-genome sequencing and increased the viability for its application in research and clinical care. The Personal Genome Project (PGP) provides unrestricted access to genomes of individuals and their associated phenotypes. This resource enabled the Critical Assessment of Genome Interpretation (CAGI) to create a community challenge to assess the bioinformatics community's ability to predict traits from whole genomes. In the CAGI PGP challenge, researchers were asked to predict whether an individual had a particular trait or profile based on their whole genome. Several approaches were used to assess submissions, including ROC AUC (area under receiver operating characteristic curve), probability rankings, the number of correct predictions, and statistical significance simulations. Overall, we found that prediction of individual traits is difficult, relying on a strong knowledge of trait frequency within the general population, whereas matching genomes to trait profiles relies heavily upon a small number of common traits including ancestry, blood type, and eye color. When a rare genetic disorder is present, profiles can be matched when one or more pathogenic variants are identified. Prediction accuracy has improved substantially over the last 6 years due to improved methodology and a better understanding of features. © 2017 Wiley Periodicals, Inc.
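One of the assessment metrics above, ROC AUC, reduces to the Mann-Whitney probability that a randomly chosen positive case is ranked above a randomly chosen negative one, which can be computed directly without building the curve. This is a generic sketch of the metric, not the CAGI evaluation code.

```python
def roc_auc(labels, scores):
    """AUC as the probability that a positive outranks a negative
    (Mann-Whitney formulation; ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 is chance ranking, 1.0 is a perfect separation of trait carriers from non-carriers.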

  15. Review of the Hatfield and Dawson RF assessment for Bechtel

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Ron J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-10-05

    The construction project at the Russell City Energy Center (RCEC) in Hayward, CA encountered a complication due to RF induction into the construction cranes resulting from operation of the two AM broadcast systems located immediately south of the site. The consulting firm Hatfield and Dawson was contacted by Bechtel for the assessment and mitigation of the induced currents and voltages, and their recommendations were implemented by Bechtel. The staff at the Lawrence Livermore National Laboratory (LLNL) was subsequently asked to review the analysis of the Hatfield and Dawson work, provide an independent assessment and offer further mitigation comments. LLNL has examined the work by Hatfield and Dawson; the numerical analyses of both agree and correlate well with local field measurements. The mitigation efforts follow the OSHA rules and have been adapted to further reduce the possibility of worker injury through specialized training, daily task planning and specific assignments to workers, so as to minimize everyone's exposure to the induced RF currents. LLNL further recommends that Bechtel formalize the RF training to provide additional value to the individual workers, and that Bechtel maintain documentation so that future work can make use of experienced workers. There is a possibility that the RF energy will couple into the actuators and sensors as the facility is built out. The operation of the two transmitters could introduce interference formed from the interaction of the signals in nonlinear circuit responses, producing intermodulation distortion. The result is interference at unexpected frequencies, some of which can be low and not filtered out of the sensors unless specifically identified. Future testing is planned to evaluate the likelihood of RF interference issues.

  16. Illuminating the Druggable Genome (IDG)

    Data.gov (United States)

    Federal Laboratory Consortium — Results from the Human Genome Project revealed that the human genome contains 20,000 to 25,000 genes. A gene contains (encodes) the information that each cell uses...

  17. National Human Genome Research Institute

    Science.gov (United States)

    ... Care Genomic Medicine Working Group New Horizons and Research Patient Management Policy and Ethics Issues Quick Links for Patient Care Education All About the Human Genome Project Fact Sheets Genetic Education Resources for ...

  18. Hospital nursing leadership-led interventions increased genomic awareness and educational intent in Magnet settings.

    Science.gov (United States)

    Calzone, Kathleen A; Jenkins, Jean; Culp, Stacey; Badzek, Laurie

    2017-11-13

    The Precision Medicine Initiative will accelerate genomic discoveries that improve health care, necessitating a genomically competent workforce. This study assessed leadership team (administrator/educator) year-long interventions to improve registered nurses' (RNs) capacity to integrate genomics into practice. We examined genomic competency outcomes in 8,150 RNs. Awareness and intention to learn more increased compared with controls. Findings suggest achieving genomic competency requires a longer intervention and support strategies such as infrastructure and policies. Leadership played a role in mobilizing staff, resources, and supporting infrastructure to sustain a large-scale competency effort on an institutional basis. Results demonstrate genomic workforce competency can be attained with leadership support and sufficient time. Our study provides evidence of the critical role health-care leaders play in facilitating genomic integration into health care to improve patient outcomes. Genomics' impact on quality, safety, and cost indicates that a leader-initiated national competency effort is achievable and warranted. Published by Elsevier Inc.

  19. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in a Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of a subsampling bootstrap Markov chain in genomic prediction. This method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling is an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces the computational burden associated with model fitting, and it may slightly enhance prediction properties.
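The subsampling scheme can be sketched as follows. For brevity this replaces the Gibbs conditional draws of a Bayesian whole-genome regression with a per-round ridge solve on each bootstrap subsample, so it illustrates the resampling idea rather than the authors' sampler; the ~50%-with-replacement fraction follows the proportion the abstract reports as near-optimal.

```python
import numpy as np

rng = np.random.default_rng(2)

def subsampled_ridge_chain(X, y, lam=1.0, frac=0.5, n_iter=200):
    """Each round fits the model on a bootstrap subsample of observations
    (frac*n drawn with replacement); the chain average of the per-round
    solutions is the final estimate, so each round costs a fraction of a
    full-data fit."""
    n, p = X.shape
    draws = np.zeros((n_iter, p))
    for it in range(n_iter):
        idx = rng.choice(n, size=int(frac * n), replace=True)
        Xs, ys = X[idx], y[idx]
        draws[it] = np.linalg.solve(Xs.T @ Xs + lam * np.eye(p), Xs.T @ ys)
    return draws.mean(axis=0)
```

Because every round touches only half the observations, the per-iteration cost drops roughly in proportion, mirroring the halved fitting time reported above.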

  20. The Lotus japonicus genome

    DEFF Research Database (Denmark)

    Fabaceae, groundbreaking genetic and genomic research has established a significant body of knowledge on Lotus japonicus, which was adopted as a model species more than 20 years ago. The diverse nature of legumes means that such research has a wide potential and agricultural impact, for example...

  1. Genomic taxonomy of vibrios

    DEFF Research Database (Denmark)

    Thompson, Cristiane C.; Vicente, Ana Carolina P.; Souza, Rangel C.

    2009-01-01

    BACKGROUND: Vibrio taxonomy has been based on a polyphasic approach. In this study, we retrieve useful taxonomic information (i.e. data that can be used to distinguish different taxonomic levels, such as species and genera) from 32 genome sequences of different vibrio species. We use a variety of...

  2. The Genome Atlas Resource

    DEFF Research Database (Denmark)

    Azam Qureshi, Matloob; Rotenberg, Eva; Stærfeldt, Hans Henrik

    2010-01-01

    with scripts and algorithms developed in a variety of programming languages at the Centre for Biological Sequence Analysis in order to create a three-tier software application for genome analysis. The results are made available via a web interface developed in Java, PHP and Perl CGI. User...

  3. Genomic Signatures of Reinforcement

    Directory of Open Access Journals (Sweden)

    Austin G. Garner

    2018-04-01

    Reinforcement is the process by which selection against hybridization increases reproductive isolation between taxa. Much research has focused on demonstrating the existence of reinforcement, yet relatively little is known about the genetic basis of reinforcement or the evolutionary conditions under which reinforcement can occur. Inspired by reinforcement’s characteristic phenotypic pattern of reproductive trait divergence in sympatry but not in allopatry, we discuss whether reinforcement also leaves a distinct genomic pattern. First, we describe three patterns of genetic variation we expect as a consequence of reinforcement. Then, we discuss a set of alternative processes and complicating factors that may make the identification of reinforcement at the genomic level difficult. Finally, we consider how genomic analyses can be leveraged to inform if and to what extent reinforcement evolved in the face of gene flow between sympatric lineages and between allopatric and sympatric populations of the same lineage. Our major goals are to understand if genome scans for particular patterns of genetic variation could identify reinforcement, isolate the genetic basis of reinforcement, or infer the conditions under which reinforcement evolved.

  4. Better chocolate through genomics

    Science.gov (United States)

    Theobroma cacao, the cacao or chocolate tree, is a tropical understory tree whose seeds are used to make chocolate. And like any important crop, cacao is the subject of much research. On September 15, 2010, scientists publicly released a preliminary sequence of the cacao genome--which contains all o...

  5. Functional genomics of tomato

    Indian Academy of Sciences (India)

    2014-10-20

    Oct 20, 2014 ... 1Repository of Tomato Genomics Resources, Department of Plant Sciences, School .... Due to its position at the crossroads of Sanger's sequencing .... replacement for the microarray-based expression profiling. .... during RNA fragmentation step prior to library construction, ...... tomato pollen as a test case.

  6. Genomic Signatures of Reinforcement

    Science.gov (United States)

    Goulet, Benjamin E.

    2018-01-01

    Reinforcement is the process by which selection against hybridization increases reproductive isolation between taxa. Much research has focused on demonstrating the existence of reinforcement, yet relatively little is known about the genetic basis of reinforcement or the evolutionary conditions under which reinforcement can occur. Inspired by reinforcement’s characteristic phenotypic pattern of reproductive trait divergence in sympatry but not in allopatry, we discuss whether reinforcement also leaves a distinct genomic pattern. First, we describe three patterns of genetic variation we expect as a consequence of reinforcement. Then, we discuss a set of alternative processes and complicating factors that may make the identification of reinforcement at the genomic level difficult. Finally, we consider how genomic analyses can be leveraged to inform if and to what extent reinforcement evolved in the face of gene flow between sympatric lineages and between allopatric and sympatric populations of the same lineage. Our major goals are to understand if genome scans for particular patterns of genetic variation could identify reinforcement, isolate the genetic basis of reinforcement, or infer the conditions under which reinforcement evolved. PMID:29614048

  7. The Nostoc punctiforme Genome

    Energy Technology Data Exchange (ETDEWEB)

    John C. Meeks

    2001-12-31

    Nostoc punctiforme is a filamentous cyanobacterium with extensive phenotypic characteristics and a relatively large genome, approaching 10 Mb. The phenotypic characteristics include a photoautotrophic, diazotrophic mode of growth, but N. punctiforme is also facultatively heterotrophic; its vegetative cells have multiple development alternatives, including terminal differentiation into nitrogen-fixing heterocysts and transient differentiation into spore-like akinetes or motile filaments called hormogonia; and N. punctiforme has broad symbiotic competence with fungi and terrestrial plants, including bryophytes, gymnosperms and an angiosperm. The shotgun-sequencing phase of the N. punctiforme strain ATCC 29133 genome has been completed by the Joint Genome Institute. Annotation of an 8.9 Mb database yielded 7432 open reading frames, 45% of which encode proteins with known or probable known function and 29% of which are unique to N. punctiforme. Comparative analysis of the sequence indicates a genome that is highly plastic and in a state of flux, with numerous insertion sequences and multilocus repeats, as well as genes encoding transposases and DNA modification enzymes. The sequence also reveals the presence of genes encoding putative proteins that collectively define almost all characteristics of cyanobacteria as a group. N. punctiforme has an extensive potential to sense and respond to environmental signals as reflected by the presence of more than 400 genes encoding sensor protein kinases, response regulators and other transcriptional factors. The signal transduction systems and any of the large number of unique genes may play essential roles in the cell differentiation and symbiotic interaction properties of N. punctiforme.

  8. Comparative Genomics of Eukaryotes.

    NARCIS (Netherlands)

    Noort, V. van

    2007-01-01

    This thesis focuses on developing comparative genomics methods in eukaryotes, with an emphasis on applications for gene function prediction and regulatory element detection. In the past, methods have been developed to predict functional associations between gene pairs in prokaryotes. The challenge

  9. Searching for genomic constraints

    Energy Technology Data Exchange (ETDEWEB)

    Lio', P. [Cambridge Univ. (United Kingdom). Genetics Dept.]; Ruffo, S. [Florence Univ. (Italy). Fac. di Ingegneria, Dipt. di Energetica 'S. Stecco']

    1998-01-01

    The authors have analyzed general properties of very long DNA sequences belonging to simple and complex organisms, by using different correlation methods. They have distinguished those base compositional rules that concern the entire genome, which they call 'genomic constraints', from the rules that depend on the 'external natural selection' acting on single genes, i.e. protein-centered constraints. They show that G + C content, purine/pyrimidine distributions and the biological complexity of the organism are the most important factors determining base compositional rules and genome complexity. Three main facts are reported here: bacteria with high G + C content have more restrictions on base composition than those with low G + C content; at constant G + C content, more complex organisms, ranging from prokaryotes to higher eukaryotes (e.g. human), display an increase of repeats 10-20 nucleotides long, which are also partly responsible for long-range correlations; word selection of length 3 to 10 is stronger in human and in bacteria, for two distinct reasons. With respect to previous studies, they have also compared the genomic sequence of the archaeon Methanococcus jannaschii with those of bacteria and eukaryotes: it sometimes shows an intermediate statistical behaviour.

  10. Searching for genomic constraints

    International Nuclear Information System (INIS)

    Lio', P.; Ruffo, S.

    1998-01-01

    The authors have analyzed general properties of very long DNA sequences belonging to simple and complex organisms, by using different correlation methods. They have distinguished those base compositional rules that concern the entire genome, which they call 'genomic constraints', from the rules that depend on the 'external natural selection' acting on single genes, i.e. protein-centered constraints. They show that G + C content, purine/pyrimidine distributions and the biological complexity of the organism are the most important factors determining base compositional rules and genome complexity. Three main facts are reported here: bacteria with high G + C content have more restrictions on base composition than those with low G + C content; at constant G + C content, more complex organisms, ranging from prokaryotes to higher eukaryotes (e.g. human), display an increase of repeats 10-20 nucleotides long, which are also partly responsible for long-range correlations; word selection of length 3 to 10 is stronger in human and in bacteria, for two distinct reasons. With respect to previous studies, they have also compared the genomic sequence of the archaeon Methanococcus jannaschii with those of bacteria and eukaryotes: it sometimes shows an intermediate statistical behaviour.
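
    The base compositional statistics these two records describe (G + C content and the frequencies of short DNA "words") can be computed directly from a sequence. A minimal Python sketch follows; the toy sequence is illustrative only, and real analyses of this kind operate on whole genomes.

    ```python
    from collections import Counter

    def gc_content(seq: str) -> float:
        """Fraction of G+C bases in a DNA sequence."""
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / len(seq)

    def word_counts(seq: str, k: int) -> Counter:
        """Count overlapping DNA 'words' (k-mers) of length k."""
        seq = seq.upper()
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    seq = "ATGCGCGATATATGCGC"  # toy sequence
    print(round(gc_content(seq), 3))           # → 0.529
    print(word_counts(seq, 3).most_common(1))  # → [('GCG', 3)]
    ```

    Comparing such word-count spectra across organisms, and across word lengths from 3 to 10, is the kind of measurement the abstract refers to when it discusses word selection and base compositional rules.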

  11. Genomic sequencing in clinical trials

    OpenAIRE

    Mestan, Karen K; Ilkhanoff, Leonard; Mouli, Samdeep; Lin, Simon

    2011-01-01

    Abstract Human genome sequencing is the process by which the exact order of nucleic acid base pairs in the 24 human chromosomes is determined. Since the completion of the Human Genome Project in 2003, genomic sequencing is rapidly becoming a major part of our translational research efforts to understand and improve human health and disease. This article reviews the current and future directions of clinical research with respect to genomic sequencing, a technology that is just beginning to fin...

  12. Statistical Methods in Integrative Genomics

    Science.gov (United States)

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biology questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions. PMID:27482531

  13. From plant genomes to phenotypes

    OpenAIRE

    Bolger, Marie; Gundlach, Heidrun; Scholz, Uwe; Mayer, Klaus; Usadel, Björn; Schwacke, Rainer; Schmutzer, Thomas; Chen, Jinbo; Arend, Daniel; Oppermann, Markus; Weise, Stephan; Lange, Matthias; Fiorani, Fabio; Spannagl, Manuel

    2017-01-01

    Recent advances in sequencing technologies have greatly accelerated the pace of plant genome research and applied breeding. Despite this trend, plant genomes continue to present numerous difficulties to standard tools and pipelines, not only for genome assembly but also for gene annotation and downstream analysis. Here we give a perspective on the tools, resources and services necessary to assemble and analyze plant genomes and link them to plant phenotypes.

  14. Comparative genomics of the Bifidobacterium breve taxon.

    Science.gov (United States)

    Bottacini, Francesca; O'Connell Motherway, Mary; Kuczynski, Justin; O'Connell, Kerry Joan; Serafini, Fausta; Duranti, Sabrina; Milani, Christian; Turroni, Francesca; Lugli, Gabriele Andrea; Zomer, Aldert; Zhurina, Daria; Riedel, Christian; Ventura, Marco; van Sinderen, Douwe

    2014-03-01

    Bifidobacteria are commonly found as part of the microbiota of the gastrointestinal tract (GIT) of a broad range of hosts, where their presence is positively correlated with the host's health status. In this study, we assessed the genomes of thirteen representatives of Bifidobacterium breve, which is not only a frequently encountered component of the (adult and infant) human gut microbiota, but can also be isolated from human milk and the vagina. In silico analysis of genome sequences from thirteen B. breve strains isolated from different environments (infant and adult faeces, human milk, human vagina) shows that the genetic variability of this species principally consists of hypothetical genes and mobile elements, but, interestingly, also genes correlated with adaptation to the host environment and gut colonization. These latter genes specify the biosynthetic machinery for sortase-dependent pili and exopolysaccharide production, as well as genes that provide protection against invasion of foreign DNA (i.e. CRISPR loci and restriction/modification systems), and genes that encode enzymes responsible for carbohydrate fermentation. Gene-trait matching analysis showed clear correlations between known metabolic capabilities and characterized genes, and it also allowed the identification of a gene cluster involved in the utilization of the alcohol-sugar sorbitol. Genome analysis of thirteen representatives of the B. breve species revealed that the deduced pan-genome is essentially closed. For this reason, our analyses suggest that this number of B. breve representatives is sufficient to fully describe the pan-genome of this species. Comparative genomics also facilitated the genetic explanation for differential carbon source utilization phenotypes previously observed in different strains of B. breve.
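
    The gene-trait matching mentioned in this abstract pairs a gene presence/absence matrix with a phenotype observed per strain. A minimal sketch of the idea follows; the strain names, gene names, and trait assignments are entirely hypothetical, and the real analysis in the paper used statistical association rather than this exact-match shortcut.

    ```python
    # Hypothetical presence/absence data: strain -> set of genes detected.
    presence = {
        "strain1": {"srlA", "pilA", "crispr1"},
        "strain2": {"srlA", "pilA"},
        "strain3": {"pilA", "crispr1"},
        "strain4": {"pilA"},
    }
    # Hypothetical phenotype: strains able to ferment sorbitol.
    sorbitol_positive = {"strain1", "strain2"}

    all_genes = set().union(*presence.values())

    def matches_trait(gene: str) -> bool:
        """True if the gene occurs in exactly the trait-positive strains."""
        carriers = {s for s, genes in presence.items() if gene in genes}
        return carriers == sorbitol_positive

    candidates = sorted(g for g in all_genes if matches_trait(g))
    print(candidates)  # → ['srlA']
    ```

    A perfect correspondence between a gene's distribution and a phenotype, as with the illustrative srlA here, nominates that gene (or its cluster) as the candidate genetic basis of the trait.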

  15. A Thousand Fly Genomes: An Expanded Drosophila Genome Nexus.

    Science.gov (United States)

    Lack, Justin B; Lange, Jeremy D; Tang, Alison D; Corbett-Detig, Russell B; Pool, John E

    2016-12-01

    The Drosophila Genome Nexus is a population genomic resource that provides D. melanogaster genomes from multiple sources. To facilitate comparisons across data sets, genomes are aligned using a common reference alignment pipeline which involves two rounds of mapping. Regions of residual heterozygosity, identity-by-descent, and recent population admixture are annotated to enable data filtering based on the user's needs. Here, we present a significant expansion of the Drosophila Genome Nexus, which brings the current data object to a total of 1,121 wild-derived genomes. New additions include 305 previously unpublished genomes from inbred lines representing six population samples in Egypt, Ethiopia, France, and South Africa, along with another 193 genomes added from recently published data sets. We also provide an aligned D. simulans genome to facilitate divergence comparisons. This improved resource will broaden the range of population genomic questions that can be addressed from multi-population allele frequencies and haplotypes in this model species. The larger set of genomes will also enhance the discovery of functionally relevant natural variation that exists within and between populations. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  16. Genome Editing: A New Approach to Human Therapeutics.

    Science.gov (United States)

    Porteus, Matthew

    2016-01-01

    The ability to manipulate the genome with precise spatial and nucleotide resolution (genome editing) has been a powerful research tool. In the past decade, the tools and expertise for using genome editing in human somatic cells and pluripotent cells have increased to such an extent that the approach is now being developed widely as a strategy to treat human disease. The fundamental process depends on creating a site-specific DNA double-strand break (DSB) in the genome and then allowing the cell's endogenous DSB repair machinery to fix the break such that precise nucleotide changes are made to the DNA sequence. With the development and discovery of several different nuclease platforms and increasing knowledge of the parameters affecting different genome editing outcomes, genome editing frequencies now reach therapeutic relevance for a wide variety of diseases. Moreover, there is a series of complementary approaches to assessing the safety and toxicity of any genome editing process, irrespective of the underlying nuclease used. Finally, the development of genome editing has raised the issue of whether it should be used to engineer the human germline. Although such an approach could clearly prevent the birth of people with devastating and destructive genetic diseases, questions remain about whether human society is morally responsible enough to use this tool.

  17. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2009-12-30

    This revision of the LLNL Fire Protection Baseline Needs Assessment (BNA) was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection, and reviewed by Martin Gresho, Sandia/CA Fire Marshal. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response. The original LLNL BNA was created on April 23, 1997 as a means of collecting all requirements concerning emergency response capabilities at LLNL (including response to emergencies at Sandia/CA) into one BNA document. The original BNA documented the basis for emergency response, emergency personnel staffing, and emergency response equipment over the years. The BNA has been updated and reissued five times since, in 1998, 1999, 2000, 2002, and 2004. A significant format change was performed in the 2004 update of the BNA in that it was 'zero based': starting with the requirement documents, the 2004 BNA evaluated the requirements and determined minimum needs without regard to previous evaluations. This 2010 update maintains the same basic format and requirements as the 2004 BNA. In this 2010 BNA, as in the previous BNA, the document has been intentionally divided into two separate documents - the needs assessment (1) and the compliance assessment (2). The needs assessment will be referred to as the BNA and the compliance assessment will be referred to as the BNA Compliance Assessment. The primary driver for separation is that the needs assessment identifies the detailed applicable regulations (primarily NFPA Standards) for emergency response capabilities based on the hazards present at LLNL and Sandia/CA and the geographical location of the facilities. The needs assessment also identifies areas where the modification of the requirements in the applicable NFPA standards is appropriate, due to the improved fire protection

  18. Rapid and accurate pyrosequencing of angiosperm plastid genomes

    Science.gov (United States)

    Moore, Michael J; Dhingra, Amit; Soltis, Pamela S; Shaw, Regina; Farmerie, William G; Folta, Kevin M; Soltis, Douglas E

    2006-01-01

    Background Plastid genome sequence information is vital to several disciplines in plant biology, including phylogenetics and molecular biology. The past five years have witnessed a dramatic increase in the number of completely sequenced plastid genomes, fuelled largely by advances in conventional Sanger sequencing technology. Here we report a further significant reduction in time and cost for plastid genome sequencing through the successful use of a newly available pyrosequencing platform, the Genome Sequencer 20 (GS 20) System (454 Life Sciences Corporation), to rapidly and accurately sequence the whole plastid genomes of the basal eudicot angiosperms Nandina domestica (Berberidaceae) and Platanus occidentalis (Platanaceae). Results More than 99.75% of each plastid genome was simultaneously obtained during two GS 20 sequence runs, to an average depth of coverage of 24.6× in Nandina and 17.3× in Platanus. The Nandina and Platanus plastid genomes shared essentially identical gene complements and possessed the typical angiosperm plastid structure and gene arrangement. To assess the accuracy of the GS 20 sequence, over 45 kilobases of sequence were generated for each genome using conventional sequencing. Overall error rates of 0.043% and 0.031% were observed in GS 20 sequence for Nandina and Platanus, respectively. More than 97% of all observed errors were associated with homopolymer runs, with ~60% of all errors associated with homopolymer runs of 5 or more nucleotides and ~50% of all errors associated with regions of extensive homopolymer runs. No substitution errors were present in either genome. Error rates were generally higher in the single-copy and noncoding regions of both plastid genomes relative to the inverted repeat and coding regions. Conclusion Highly accurate and essentially complete sequence information was obtained for the Nandina and Platanus plastid genomes using the GS 20 System. More importantly, the high accuracy observed in the GS 20 plastid

  19. Rapid and accurate pyrosequencing of angiosperm plastid genomes

    Directory of Open Access Journals (Sweden)

    Farmerie William G

    2006-08-01

    Background Plastid genome sequence information is vital to several disciplines in plant biology, including phylogenetics and molecular biology. The past five years have witnessed a dramatic increase in the number of completely sequenced plastid genomes, fuelled largely by advances in conventional Sanger sequencing technology. Here we report a further significant reduction in time and cost for plastid genome sequencing through the successful use of a newly available pyrosequencing platform, the Genome Sequencer 20 (GS 20) System (454 Life Sciences Corporation), to rapidly and accurately sequence the whole plastid genomes of the basal eudicot angiosperms Nandina domestica (Berberidaceae) and Platanus occidentalis (Platanaceae). Results More than 99.75% of each plastid genome was simultaneously obtained during two GS 20 sequence runs, to an average depth of coverage of 24.6× in Nandina and 17.3× in Platanus. The Nandina and Platanus plastid genomes shared essentially identical gene complements and possessed the typical angiosperm plastid structure and gene arrangement. To assess the accuracy of the GS 20 sequence, over 45 kilobases of sequence were generated for each genome using conventional sequencing. Overall error rates of 0.043% and 0.031% were observed in GS 20 sequence for Nandina and Platanus, respectively. More than 97% of all observed errors were associated with homopolymer runs, with ~60% of all errors associated with homopolymer runs of 5 or more nucleotides and ~50% of all errors associated with regions of extensive homopolymer runs. No substitution errors were present in either genome. Error rates were generally higher in the single-copy and noncoding regions of both plastid genomes relative to the inverted repeat and coding regions. Conclusion Highly accurate and essentially complete sequence information was obtained for the Nandina and Platanus plastid genomes using the GS 20 System. More importantly, the high accuracy
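
    Both pyrosequencing records attribute most errors to homopolymer runs (stretches of a single repeated base), so locating such runs is a routine step when auditing 454-style data. A minimal Python sketch, using only the standard library and a toy sequence:

    ```python
    import re

    def homopolymer_runs(seq: str, min_len: int = 5):
        """Return (start, length, base) for each homopolymer run of at least min_len."""
        return [(m.start(), len(m.group()), m.group()[0])
                for m in re.finditer(r"A+|C+|G+|T+", seq.upper())
                if len(m.group()) >= min_len]

    seq = "ACGTAAAAAGCCCCCCTG"  # toy sequence
    print(homopolymer_runs(seq))  # → [(4, 5, 'A'), (10, 6, 'C')]
    ```

    With run coordinates in hand, observed errors can be binned by whether they fall inside a run, which is how per-category error fractions like those quoted above (~60% in runs of 5 or more nucleotides) are tallied.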

  20. The perennial ryegrass GenomeZipper: targeted use of genome resources for comparative grass genomics.

    Science.gov (United States)

    Pfeifer, Matthias; Martis, Mihaela; Asp, Torben; Mayer, Klaus F X; Lübberstedt, Thomas; Byrne, Stephen; Frei, Ursula; Studer, Bruno

    2013-02-01

    Whole-genome sequences established for model and major crop species constitute a key resource for advanced genomic research. For outbreeding forage and turf grass species like ryegrasses (Lolium spp.), such resources have yet to be developed. Here, we present a model of the perennial ryegrass (Lolium perenne) genome on the basis of conserved synteny to barley (Hordeum vulgare) and the model grass genome Brachypodium (Brachypodium distachyon) as well as rice (Oryza sativa) and sorghum (Sorghum bicolor). A transcriptome-based genetic linkage map of perennial ryegrass served as a scaffold to establish the chromosomal arrangement of syntenic genes from model grass species. This scaffold revealed a high degree of synteny and macrocollinearity and was then utilized to anchor a collection of perennial ryegrass genes in silico to their predicted genome positions. This resulted in the unambiguous assignment of 3,315 out of 8,876 previously unmapped genes to the respective chromosomes. In total, the GenomeZipper incorporates 4,035 conserved grass gene loci, which were used for the first genome-wide sequence divergence analysis between perennial ryegrass, barley, Brachypodium, rice, and sorghum. The perennial ryegrass GenomeZipper is an ordered, information-rich genome scaffold, facilitating map-based cloning and genome assembly in perennial ryegrass and closely related Poaceae species. It also represents a milestone in describing synteny between perennial ryegrass and fully sequenced model grass genomes, thereby increasing our understanding of genome organization and evolution in the most important temperate forage and turf grass species.
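
    The GenomeZipper approach anchors unmapped genes by exploiting collinearity with a sequenced model genome: genes that are mapped in both species serve as anchors, and an unmapped gene's map position is inferred from its position between flanking anchors in the model genome. A minimal interpolation sketch follows; the coordinates are invented for illustration and are not real ryegrass or model-grass positions.

    ```python
    from bisect import bisect_left

    # Anchor genes mapped in both species:
    # (position in model genome in bp, position on target linkage map in cM).
    # Values are illustrative only.
    anchors = [(100_000, 0.0), (500_000, 12.5), (900_000, 30.0)]

    def anchor_gene(model_pos: float) -> float:
        """Interpolate a gene's map position from its flanking synteny anchors."""
        xs = [a[0] for a in anchors]
        i = bisect_left(xs, model_pos)
        if i == 0:                      # before the first anchor
            return anchors[0][1]
        if i == len(anchors):           # beyond the last anchor
            return anchors[-1][1]
        (x0, y0), (x1, y1) = anchors[i - 1], anchors[i]
        return y0 + (y1 - y0) * (model_pos - x0) / (x1 - x0)

    print(anchor_gene(300_000))  # midway between the first two anchors → 6.25
    ```

    This captures the core assumption of synteny-based scaffolding: that gene order is conserved between the anchors, which held well enough in the study above to place 3,315 previously unmapped genes.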