WorldWideScience

Sample records for bnl plant analyzer

  1. Uncertainty analysis of suppression pool heating during an ATWS in a BWR-5 plant. An application of the CSAU methodology using the BNL engineering plant analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.; Cheng, H.S.; Mallen, A.N. [Brookhaven National Lab., Upton, NY (United States); Johnsen, G.W. [Idaho National Engineering Lab., Idaho Falls, ID (United States); Lellouche, G.S. [Technical Data Services, Chicago, IL (United States)

    1994-03-01

    The uncertainty of predicting the peak temperature in the suppression pool of a BWR power plant undergoing an NRC-postulated Anticipated Transient Without Scram (ATWS) has been estimated. The ATWS is initiated by recirculation-pump trips and then leads to power and flow oscillations such as those that occurred at the LaSalle-2 Power Station in March 1988. After limit-cycle oscillations have been established, the turbines are tripped, but without MSIV closure, allowing steam discharge through the turbine bypass into the condenser. Postulated operator actions, namely to lower the reactor vessel pressure and the level elevation in the downcomer, are simulated by a robot model which accounts for operator uncertainty. All balance-of-plant and control-system modeling uncertainties were part of the statistical uncertainty analysis, which was patterned after the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology. The analysis showed that the predicted suppression-pool peak temperature of 329.3 K (133°F) has a 95-percentile uncertainty of 14.4 K (26°F), and that the size of this uncertainty bracket is dominated by the experimental uncertainty of measuring Safety and Relief Valve mass flow rates under critical-flow conditions. The analysis also showed that the probability of exceeding the suppression-pool temperature limit of 352.6 K (175°F) is most likely zero (it is estimated as < 5×10⁻⁴). The square root of the sum of the squares of all the computed peak pool temperatures is 350.7 K (171.6°F).
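
    To make the combination of uncertainties concrete, the sketch below shows a root-sum-square (RSS) aggregation of independent 95-percentile contributions, which is the usual way such brackets are built in CSAU-style analyses; the contribution names and values are illustrative assumptions, not the inputs actually used in the report.

```python
import math

# Hypothetical 95-percentile contributions (in K) to the peak suppression-pool
# temperature uncertainty.  The dominant term stands in for the SRV critical-flow
# measurement uncertainty mentioned in the abstract; all values are illustrative.
contributions_K = {
    "SRV critical-flow measurement": 12.0,
    "balance-of-plant modeling": 5.0,
    "operator-action timing": 4.0,
    "control-system modeling": 3.0,
}

# Root-sum-square combination, valid if the contributions are independent.
total_K = math.sqrt(sum(c ** 2 for c in contributions_K.values()))
print(f"combined 95-percentile uncertainty: {total_K:.1f} K")
```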

  2. Demonstration of the BNL Continuous Dual Trap Analyzer to Detect Perfluorocarbon Tracers for the Tag, Track and Location Program

    Energy Technology Data Exchange (ETDEWEB)

    Heiser, J.H.; Adams, J.; Dietz, R.; Milian, L.; Watson, T.

    2008-10-07

    instruments that allow detection of up to seven PFTs at part-per-quadrillion levels (10⁻¹⁵) with sample times as short as 60 seconds. The Continuous Dual-Trap Analyzer (CDTA) was developed for leak-hunting applications and can continuously sample the air for PFTs without interruption. Sample time can be as short as 60 seconds. The CDTA has been extensively used in the commercial sector to detect PFTs that have been introduced to leaking buried dielectric fluid-filled cables or leaking subsurface gas lines. The PFTs travel through the cable or pipe until they reach the leak site. PFTs then escape into the surrounding soil and permeate/diffuse to the surface, where they can be detected with the CDTA. Typically a cable is tagged with ppm levels of PFTs, resulting in ppt to ppq concentrations in the air at the leak site. The CDTA is rugged and reliable and has a proven track record of successful leak location. The application of the CDTA to PFT detection for TTL is identical to its application for leak detection. The CDTA operator has a general idea, within a few miles of roadway, of where the leak is located, but no specific knowledge of the location (it can be anywhere along the road). The CDTA is mounted in a Chevy Astro Van and is dispatched to the field. In the field the van is driven at nominally 15 mph along the road. The CDTA continuously samples the air outside the van (via a 1/4-inch plastic sample tube stuck out a side window) until a positive detection occurs. The van then covers the road section where the detection occurred at a slightly slower pace to pinpoint the area where the leak is and to direct soil probe samples. The soil probes take soil gas samples every 10 yards or so and the samples are analyzed on the CDTA. The leak can be located to within a few feet in 95% of the cases. To date the CDTA has been successful in every leak hunt performed by BNL. One interesting case was a leak hunt that resulted in repeated negative detections. The confidence in

  3. NGSPN @ BNL

    Energy Technology Data Exchange (ETDEWEB)

    Pepper, S. E. [Brookhaven National Lab. (BNL), Upton, NY (United States); Bachner, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Gomera, J. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-10-05

    Brookhaven National Laboratory’s (BNL’s) Nonproliferation and National Security Department hosted the Next Generation Safeguards Professional Network (NGSPN) at BNL September 6-9, 2016. Thirteen representatives from seven Department of Energy National Laboratories, including two from BNL, participated in the four-day meeting. The NGSPN meeting was sponsored by the Office of International Nuclear Safeguards (NA-241) of the National Nuclear Security Administration, which provided funding for BNL’s development and conduct of the meeting program and the participants’ labor and travel. NGSPN meetings were previously held at Savannah River National Laboratory, Oak Ridge National Laboratory, Idaho National Laboratory, Sandia National Laboratories, and Los Alamos National Laboratory. The purpose of NGSPN is to provide a forum for early-career international safeguards practitioners to network with their peers, to meet international safeguards experts from other institutions, and to learn about organizations other than their employers that contribute to international safeguards.

  4. Plant-bacterium interactions analyzed by proteomics

    Directory of Open Access Journals (Sweden)

    Amber eAfroz

    2013-02-01

    The evolution of the plant immune response has resulted in a highly effective defense system that is able to resist potential attack by microbial pathogens. The primary immune response is referred to as pathogen-associated molecular pattern triggered immunity and has evolved to recognize common features of microbial pathogens. In response to the delivery of pathogen effector proteins, plants acquired R proteins to fight against pathogen attack. The R-dependent defense response is important, and understanding the biochemical and cellular mechanisms underlying these interactions will enable molecular and transgenic approaches for crops with increased biotic resistance. Proteomic analyses are particularly useful for understanding the mechanisms by which host plants respond to pathogen attack. Recent advances in the field of proteome analysis have initiated a new research area, i.e., the analysis of more complex microbial communities and their interaction with plants. Such areas hold great potential to elucidate not only the interactions between bacteria and their host plants, but also bacteria-bacteria interactions among different bacterial taxa, including symbiotic, pathogenic, and commensal bacteria. During biotic stress, plant hormonal signaling pathways prioritize defense over other cellular functions. Some plant pathogens take advantage of the hormone-dependent regulatory system by mimicking hormones to interfere with host immune responses and promote virulence. In this review, we discuss the crosstalk that plays an important role in the response to pathogens with different infection strategies, using proteomic approaches.

  5. FPC conditioning cart at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Xu, W.; Ben-Zvi, I.; Altinbas, F.Z.; Belomestnykh, S.; Burrill, A.; Cole, M.; Deonarine, J.; Jamilkowski, J.; Kayran, D.; Laloudakis, N.; Masi Jr, L.; McIntyre, G.; Pate, D.; Philips, D.; Seda, T.; Steszyn, A.; Tallerico, T.; Todd, R.; Weiss, D.; White, G.; Zaltsman, A.

    2011-03-28

    The 703 MHz superconducting gun for the BNL Energy Recovery Linac (ERL) prototype has two fundamental power couplers (FPCs), each of which will deliver up to 500 kW of CW RF power. In order to prepare the couplers for high-power RF service and to process multipacting, the FPCs should be conditioned prior to installation into the gun cryomodule. A conditioning-cart-based test stand, which includes a vacuum pumping system, a controllable bake-out system, diagnostics, interlocks, and a data logging system, has been designed, constructed, and commissioned through a collaboration of BNL and AES. This paper presents the FPC conditioning cart systems and the conditioning process.

  6. BNL ALARA Center: ALARA Notes, No. 9

    Energy Technology Data Exchange (ETDEWEB)

    Khan, T.A.; Xie, J.W.; Beckman, M.C. [eds.] [and others]

    1994-02-01

    This issue of the Brookhaven National Laboratory's ALARA Notes includes the agenda for the Third International Workshop on ALARA and specific instructions on the use of the on-line fax-on-demand service provided by BNL. Other topics included in this issue are: (1) A discussion of low-level discharges from Canadian nuclear plants, (2) Safety issues at French nuclear plants, (3) Acoustic emission as a means of leak detection, (4) Replacement of steam generators at Doel-3, Beznau, and North Anna-1, (5) Remote handling equipment at Bruce, (6) EPRI's low level waste program, (7) Radiation protection during concrete repairs at Savannah River, (8) Reactor vessel stud removal/repair at Comanche Peak-1, (9) Rework of reactor coolant pump motors, (10) Restoration of service water at North Anna-1 and -2, (11) Steam generator tubing problems at Mihama-1, (12) Full system decontamination at Indian Point-2, (13) Chemical decontamination at Browns Ferry-2, and (14) Inspection methodology in France and Japan.

  7. BNL ATF II beamlines design

    Energy Technology Data Exchange (ETDEWEB)

    Fedurin, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Jing, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Stratakis, D. [Brookhaven National Lab. (BNL), Upton, NY (United States); Swinson, C. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-05-03

    The Brookhaven National Laboratory Accelerator Test Facility (BNL ATF) is currently undergoing a major upgrade (ATF-II). Together with a new location and much improved facilities, the ATF will see an upgrade in its major capabilities: electron beam energy and quality, and CO2 laser power. The electron beam energy will be increased in stages, first to 100-150 MeV, followed by a further increase to 500 MeV. Combined with the planned increase in CO2 laser power (from 1 to 100 TW), the ATF-II will be a powerful tool for advanced accelerator research. A high-brightness electron beam, produced by a photocathode gun, will be accelerated and optionally delivered to multiple beamlines. Besides the energy range (up to a possible 500 MeV in the final stage), the electron beam can be tailored to each experiment with options such as small transverse beam size (<10 µm), short bunch length (<100 fs), and combined short and small bunch options. This report gives a detailed overview of the ATF-II capabilities and beamline configuration.

  8. 2013 BNL Site Environmental Report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Ratel, K.; Remien, J.; Pohlot, P.; Williams, J.; Green, T.; Paquette, P.; Dorsch, W.; Welty, T.; Burke, J.

    2014-10-01

    A summary of Brookhaven National Laboratory’s (BNL) Site Environmental Report, meant to inform the public, regulators, employees, and other stakeholders about the Laboratory’s environmental performance in and around the site during the calendar year. The report comprises multiple volumes covering environmental data, environmental management performance, and the groundwater status report.

  9. Ensembl Plants: Integrating Tools for Visualizing, Mining, and Analyzing Plant Genomics Data.

    Science.gov (United States)

    Bolser, Dan; Staines, Daniel M; Pritchard, Emily; Kersey, Paul

    2016-01-01

    Ensembl Plants ( http://plants.ensembl.org ) is an integrative resource presenting genome-scale information for a growing number of sequenced plant species (currently 33). The data provided include genome sequence, gene models, functional annotation, and polymorphic loci. Additional information is provided for variation data, including population structure, individual genotypes, linkage, and phenotype data. In each release, comparative analyses are performed on whole genome and protein sequences, and genome alignments and gene trees are made available that show the implied evolutionary history of each gene family. Access to the data is provided through a genome browser incorporating many specialist interfaces for different data types, and through a variety of additional methods for programmatic access and data mining. These access routes are consistent with those offered through the Ensembl interface for the genomes of non-plant species, including those of plant pathogens, pests, and pollinators. Ensembl Plants is updated 4-5 times a year and is developed in collaboration with our international partners in the Gramene ( http://www.gramene.org ) and transPLANT ( http://www.transplantdb.org ) projects.

  10. Development of the nuclear plant analyzer for Korean standard Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Shin Hwan; Kim, Hyeong Heon; Song, In Ho; Hong, Eon Yeong; Oh, Yeong Taek [Korea Power Engineering Company Inc., Yongin (Korea, Republic of)

    2000-12-15

    The purpose of this study is to develop an NPA for Ulchin Nuclear Power Plant Units 3 and 4, the first KSNP-type plant. In this study, the process model simulating the overall plant systems, the GUI, and the simulation executive, which together provide the functions of an engineering simulator, were developed, and the NPA was completed by integrating them. The contents and scope of this study are as follows: the main feedwater system, auxiliary feedwater system, Chemical and Volume Control System (CVCS), Safety Injection System (SIS), Shutdown Cooling System (SCS), electric power supply system, Core Protection Calculator (CPC), and various plant control systems; development of the graphics screens for each system; real-time simulation; simulation control for the enhancement of functional capabilities; a user-friendly GUI; collection of the design and operating data; establishment of the NPA database; integration of the GUI and simulation control program with the process model; collection of the data for the verification and validation of the developed NPA; collection of the plant test data; collection and review of the results of other computer codes; verification of the simulation accuracy by comparing the NPA results with the actual plant data; validation of the simulation capability of the NPA; and comparison against available data from other analyses using different computer codes.

  11. PHENIX Spinfest School 2009 at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Foster, S.P.; Foster, S.; Seidl, R.; Goto, Y.; Okada, K.

    2009-08-07

    Since 2005, the PHENIX Spin Physics Working Group has set aside several weeks each summer for the purposes of training and integrating recent members of the working group as well as coordinating and making rapid progress on support tasks and data analysis. One week is dedicated to more formal didactic lectures by outside speakers. The location has so far alternated between BNL and the RIKEN campus in Wako, Japan, with support provided by RBRC and LANL.

  12. BNL ENVIRONMENTAL MONITORING PLAN TRIENNIAL UPDATE, JANUARY 2003.

    Energy Technology Data Exchange (ETDEWEB)

    BROOKHAVEN NATIONAL LABORATORY

    2003-01-01

    Brookhaven National Laboratory (BNL) is a multi-program national laboratory operated by Brookhaven Science Associates for the U.S. Department of Energy (DOE) and is located on a 5,265-acre site in Suffolk County, Long Island, New York. BNL has a comprehensive Environmental Management System (EMS) in place, which meets the requirements of the International Organization for Standardization 14001 EMS Standard, as described in the BNL EMS Manual. BNL's extensive environmental monitoring program is one component of the EMS, and the BNL Environmental Monitoring Plan (EMP) describes this program in detail. The data derived from systematically monitoring the various environmental media on site enable BNL to make informed decisions concerning the protection of human health and the environment and to be responsive to community concerns.

  13. Plant analyzer for high-speed interactive simulation of BWR power plant transients

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, H.S.; Lekach, S.V.; Mallen, A.N.; Wulff, W.; Cerbone, R.J.

    1984-04-01

    A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transient and severe off-normal events in LWR power plants through on-line simulations at a speed ten times faster than actual process speeds. Results are shown for a BWR plant simulation. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the recirculation loop and feedwater train. Point kinetics incorporate reactivity feedback due to void fraction, fuel temperature, coolant temperature, and boron concentration. Control systems and trip logic are simulated for the nuclear steam supply system. The AD10 of Applied Dynamics International is the special-purpose peripheral processor. It is specifically designed for high-speed digital system simulation, accommodates hardware (instrumentation) in the input/output loop, and operates interactively on-line, like an analog computer. Results are shown to demonstrate computing capacity, accuracy, and speed. Simulation speeds have been achieved which are orders of magnitude faster than those of a CDC-7600 mainframe computer or ten times faster than real-time speed.
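
    The point-kinetics model mentioned above can be written as a small ODE system. The following sketch integrates one-delayed-group point kinetics with a crude lumped fuel-temperature feedback; all parameter values and the feedback form are illustrative assumptions, not the plant analyzer's actual multi-feedback model.

```python
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics with a toy fuel-temperature feedback.
# All parameters are illustrative; the plant analyzer couples point kinetics
# to void, fuel-temperature, coolant-temperature and boron feedback.
beta, lam, Lam = 0.0065, 0.08, 4.0e-5   # delayed fraction, decay const (1/s), generation time (s)
alpha_T = -2.0e-5                       # fuel-temperature coefficient (dk/k per K)
rho_ext = 0.002                         # step reactivity insertion (dk/k)
T0, heat, cool = 550.0, 0.05, 0.01      # initial fuel temp (K), heating and cooling constants

def rhs(t, y):
    n, c, T = y                             # relative power, precursor conc., fuel temperature
    rho = rho_ext + alpha_T * (T - T0)      # net reactivity including feedback
    dn = (rho - beta) / Lam * n + lam * c
    dc = beta / Lam * n - lam * c
    dT = heat * n - cool * (T - T0)         # crude lumped fuel-temperature balance
    return [dn, dc, dT]

y0 = [1.0, beta / (lam * Lam), T0]          # start from steady state at nominal power
sol = solve_ivp(rhs, (0.0, 50.0), y0, method="LSODA", rtol=1e-8, atol=1e-10)
print(f"relative power at t = 50 s: {sol.y[0, -1]:.2f}")
```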

  14. Is water immersion useful for analyzing gravity resistance responses in terrestrial plants?

    Science.gov (United States)

    Ooume, Kentaro; Soga, Kouichi; Wakabayashi, Kazuyuki; Hoson, Takayuki

    2004-11-01

    Water immersion has been used as a simulator of microgravity for analyzing gravity responses in semiaquatic plants such as rice. To examine whether or not water immersion for a short experimental period is a useful microgravity simulator even in terrestrial plants, we analyzed the effects of water immersion on cell wall rigidity and the metabolism of its constituents in azuki bean epicotyls. The cell wall rigidity of epicotyls grown underwater was significantly lower than that in the control. Water immersion also caused a decrease in the molecular mass of xyloglucans as well as the thinning of the cell wall. Such changes in the mechanical and chemical properties of the cell wall underwater were similar to those observed in microgravity conditions in space. These results suggest that water immersion for a short period is a useful system for analyzing gravity resistance responses even in terrestrial plants.

  15. A Series RCL Circuit Theory for Analyzing Non-Steady-State Water Uptake of Maize Plants

    Science.gov (United States)

    Zhuang, Jie; Yu, Gui-Rui; Nakayama, Keiichi

    2014-10-01

    Understanding water uptake and transport through the soil-plant continuum is vital for ecosystem management and agricultural water use. Plant water uptake under natural conditions is a non-steady transient flow controlled by root distribution, plant configuration, soil hydraulics, and climatic conditions. Despite significant progress in model development, a mechanistic description of transient water uptake has not been developed or remains incomplete. Here, based on advanced electrical network theory (RLC circuit theory), we developed a non-steady state biophysical model to mechanistically analyze the fluctuations of uptake rates in response to water stress. We found that the non-steady-state model captures the nature of instantaneity and hysteresis of plant water uptake due to the considerations of water storage in plant xylem and coarse roots (capacitance effect), hydraulic architecture of leaf system (inductance effect), and soil-root contact (fuse effect). The model provides insights into the important role of plant configuration and hydraulic heterogeneity in helping plants survive an adverse environment. Our tests against field data suggest that the non-steady-state model has great potential for being used to interpret the smart water strategy of plants, which is intrinsically determined by stem size, leaf size/thickness and distribution, root system architecture, and the ratio of fine-to-coarse root lengths.
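
    As a rough illustration of the electrical analogy described above, the sketch below integrates a generic series RLC loop driven by a step in the potential difference, with resistance, capacitance, and inductance standing in for hydraulic resistance, plant water storage, and the leaf-system "inductance" effect; the parameter values are assumptions, not the paper's calibration.

```python
from scipy.integrate import solve_ivp

# Generic series RLC loop: L*q'' + R*q' + q/C = V(t).
# Analogy assumed here: V ~ soil-to-leaf water-potential difference, i = dq/dt ~
# uptake/sap-flow rate, R ~ hydraulic resistance, C ~ water storage in stem and
# coarse roots, L ~ the leaf-system "inductance" effect the abstract mentions.
# All values are illustrative, not the paper's parameterization.
R, L, C = 0.5, 0.5, 1.5

def driving(t):
    # Step increase in the potential difference, e.g. midday transpiration demand.
    return 1.0 if t >= 1.0 else 0.0

def rhs(t, y):
    q, i = y                                # stored-water analog, flow analog
    di = (driving(t) - R * i - q / C) / L   # Kirchhoff voltage law for the loop
    return [i, di]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], max_step=0.02)
print(f"peak uptake-rate analog: {sol.y[1].max():.3f} (arbitrary units)")
```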

  16. BNL 703 MHz superconducting RF cavity testing

    Energy Technology Data Exchange (ETDEWEB)

    Sheehy, B.; Altinbas, Z.; Burrill, A.; Ben-Zvi, I.; Gassner, D.; Hahn, H.; Hammons, L.; Jamilkowski, J.; Kayran, D.; Kewisch, J.; Laloudakis, N.; Lederle, D.; Litvinenko, V.; McIntyre, G.; Pate, D.; Phillips, D.; Schultheiss, C.; Seda,T.; Than, R.; Xu, W.; Zaltsman, A.; Schultheiss, T.

    2011-03-28

    The BNL 5-cell, 703 MHz superconducting accelerating cavity has been installed in the high-current ERL experiment. This experiment will function as a proving ground for the development of high-current machines in general and is particularly targeted at beam development for an electron-ion collider (eRHIC). The cavity performed well in vertical tests, demonstrating gradients of 20 MV/m and a Q₀ of 10¹⁰. Here we will present its performance in the horizontal tests, and discuss technical issues involved in its implementation in the ERL.

  17. OVERVIEW ON BNL ASSESSMENT OF SEISMIC ANALYSIS METHODS FOR DEEPLY EMBEDDED NPP STRUCTURES.

    Energy Technology Data Exchange (ETDEWEB)

    XU,J.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2007-04-01

    A study was performed by Brookhaven National Laboratory (BNL) under the sponsorship of the U. S. Nuclear Regulatory Commission (USNRC), to determine the applicability of established soil-structure interaction analysis methods and computer programs to deeply embedded and/or buried (DEB) nuclear power plant (NPP) structures. This paper provides an overview of the BNL study including a description and discussions of analyses performed to assess relative performance of various SSI analysis methods typically applied to NPP structures, as well as the importance of interface modeling for DEB structures. There are four main elements contained in the BNL study: (1) Review and evaluation of existing seismic design practice, (2) Assessment of simplified vs. detailed methods for SSI in-structure response spectrum analysis of DEB structures, (3) Assessment of methods for computing seismic induced earth pressures on DEB structures, and (4) Development of the criteria for benchmark problems which could be used for validating computer programs for computing seismic responses of DEB NPP structures. The BNL study concluded that the equivalent linear SSI methods, including both simplified and detailed approaches, can be extended to DEB structures and produce acceptable SSI response calculations, provided that the SSI response induced by the ground motion is very much within the linear regime or the non-linear effect is not anticipated to control the SSI response parameters. The BNL study also revealed that the response calculation is sensitive to the modeling assumptions made for the soil/structure interface and application of a particular material model for the soil.

  18. Highlights from BNL and RHIC 2015

    CERN Document Server

    Tannenbaum, M J

    2016-01-01

    Highlights of news from Brookhaven National Laboratory (BNL) and results from the Relativistic Heavy Ion Collider (RHIC) in the period July 2014-June 2015 are presented. The news this year was mostly very positive. The major event at BNL was the startup and dedication of the new NSLS II, "the World's brightest Synchrotron Light Source". The operation of RHIC was outstanding with a polarized p+p run at $\sqrt{s}=200$ GeV with integrated luminosity that exceeded the sum of all previous p+p integrated luminosity at this $\sqrt{s}$. For the first time at RHIC asymmetric p+Au and p+Al runs were made but the p+Al run caused damage in the PHENIX forward detectors from quenches that were inadequately shielded for this first p+A run. This was also the 10th anniversary of the 2005 announcement of the Perfect Liquid Quark Gluon Plasma at RHIC and a review is presented of the discoveries leading to this claim. A new result on net-charge fluctuations (with no particle identification) from PHENIX based on previous scans ov...

  19. Synchronized Phasor Data for Analyzing Wind Power Plant Dynamic Behavior and Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Y. H.

    2013-01-01

    The U.S. power industry is undertaking several initiatives that will improve the operations of the power grid. One of those is the implementation of 'wide area measurements' using phasor measurement units (PMUs) to dynamically monitor the operations and the status of the network and provide advanced situational awareness and stability assessment. This project seeks to obtain PMU data from wind power plants and grid reference points and develop software tools to analyze and visualize synchrophasor data for the purpose of better understanding wind power plant dynamic behaviors under normal and contingency conditions.

  20. Development of NPA4K (Nuclear Plant Analyzer for KALIMER) program

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Kim, K. K.; Seo, K. C.; Yoon, J. S.; Hahn, H. D

    2000-05-01

    The NPA4K (Nuclear Plant Analyzer for KALIMER) was developed for the user's convenience in running the SSC-K computer code while displaying the safety-related parameters simultaneously. The Delphi programming language was used to develop the program, which can be run in a PC environment. It consists of KalimerUnit, Read TIUnit, SSCUnit, Choose Unit, GraphUnit, ExtensionUnit, and TFieldUnit. Main control is performed in KalimerUnit.

  1. Heavy Ion results from RHIC-BNL

    Directory of Open Access Journals (Sweden)

    Esumi Shinichi

    2013-05-01

    Recent results from heavy ion collision experiments at RHIC at BNL are presented and discussed in terms of Quark Gluon Plasma properties, such as partonic collectivity and partonic energy loss. The experimental results with direct photons and heavy quarks have given important additional insights into the plasma on top of what has been learned with light hadrons. Higher-order event anisotropies and related results have provided geometrical, temporal, and dynamical information about the plasma. The beam energy dependence of the various measurements could reveal the structure of the QCD phase diagram and possibly the critical point in the diagram, where the properties of the phase transition are expected to change drastically.

  2. Development of Causality Analyzer for Maintenance/Test Tasks in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Gyun Young; Oh, Kye Min; Kim, So Young; Kim, Tae Mi; Ahmed, Rizwan [KyungHee University, Yongin (Korea, Republic of)

    2010-02-15

    The purpose of this project is to propose a causality analyzer for maintenance/test tasks in nuclear power plants, based on fault tree analysis and turbine cycle simulation for the secondary side. In nuclear power plants, many efforts to reduce unanticipated trips caused by maintenance or tests have been made, so many of the trip causalities on the primary side have been eliminated. However, it is still difficult to effectively recognize the causalities for maintenance/test tasks on the secondary side. This study therefore proposes a methodology based on fault tree analysis and derate simulation, which is particularly applicable to the secondary side. Ultimately, it is possible to develop guidelines that warn of vulnerabilities in these tasks by proactively identifying the human errors arising from maintenance or tests. The products of this study can predict the enhancement of plant availability by correlating the human errors resulting from maintenance/tests with various types of plant losses.
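
    As a minimal illustration of the fault-tree side of such an analyzer, the sketch below evaluates the probability of a hypothetical trip-during-test top event from a few basic human-error and equipment events, assuming independence; the gate structure, event names, and probabilities are invented for illustration only.

```python
# Minimal fault-tree evaluation for a hypothetical secondary-side trip.
# Gate structure, event names and probabilities are illustrative assumptions.

def p_or(*probs):
    """OR gate for independent basic events: P = 1 - prod(1 - p_i)."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*probs):
    """AND gate for independent basic events: P = prod(p_i)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical basic events during a feedwater-pump surveillance test.
wrong_valve_lineup  = 1e-3   # human error: wrong train isolated
missed_restoration  = 5e-4   # human error: component left in test position
level_control_drift = 2e-3   # latent equipment fault revealed by the test

# Top event "trip during test": a human error occurs AND the latent drift is
# present to defeat the remaining margin.
top_event = p_and(p_or(wrong_valve_lineup, missed_restoration), level_control_drift)
print(f"top-event probability per test: {top_event:.2e}")
```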

  3. New result on $K^+ \rightarrow \pi^+\nu\bar{\nu}$ from BNL E787

    Energy Technology Data Exchange (ETDEWEB)

    REDLINGER,G.

    1999-06-21

    E787 at BNL has reported evidence for the rare decay $K^+ \rightarrow \pi^+\nu\bar{\nu}$, based on the observation of one candidate event. In this paper, we present the result of analyzing a new dataset of comparable sensitivity to the published result.

  4. Physics of the 1 Teraflop RIKEN-BNL-Columbia QCD project. Proceedings of RIKEN BNL Research Center workshop: Volume 13

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-10-16

    A workshop was held at the RIKEN-BNL Research Center on October 16, 1998, as part of the first anniversary celebration for the center. This meeting brought together the physicists from RIKEN-BNL, BNL and Columbia who are using the QCDSP (Quantum Chromodynamics on Digital Signal Processors) computer at the RIKEN-BNL Research Center for studies of QCD. Many of the talks in the workshop were devoted to domain wall fermions, a discretization of the continuum description of fermions which preserves the global symmetries of the continuum, even at finite lattice spacing. This formulation has been the subject of analytic investigation for some time and has reached the stage where large-scale simulations in QCD seem very promising. With the computational power available from the QCDSP computers, scientists are looking forward to an exciting time for numerical simulations of QCD.

  5. Highlights from BNL and RHIC 2014

    CERN Document Server

    Tannenbaum, M J

    2015-01-01

    Highlights of news from Brookhaven National Laboratory (BNL) and results from the Relativistic Heavy Ion Collider (RHIC) in the period July 2013-June 2014 are presented. It was a busy year for news, most notably a U. S. Government shutdown for 16 days beginning October 1, 2013 due to the lack of an approved budget for FY2014. Even with this unusual government activity, the $\sqrt{s_{NN}}=200$ GeV Au+Au Run14 at RHIC was the best ever with integrated luminosity exceeding the sum of all previous runs. Additionally there was a brief He$^3$+Au run to continue the study of collective flow in small systems which was reinforced by new results presented on identified particle flow in d+Au. The other scientific highlights are also mostly concerned with ``soft (low $p_T$)'' physics complemented by the first preliminary results of reconstructed jets from hard-scattered partons in Au+Au collisions at RHIC. The measurements of transverse energy ($E_T$) spectra in p-p, d+Au and Au+Au collisions, which demonstrated last ye...

  6. Review: BNL graphite blanket design concepts

    Energy Technology Data Exchange (ETDEWEB)

    Fillo, J.A.; Powell, J.R.

    1976-03-01

    A review of the Brookhaven National Laboratory (BNL) minimum activity graphite blanket designs is made. Three designs are identified and discussed in the context of an experimental power reactor (EPR) and commercial power reactor. Basically, the three designs employ a thick graphite screen (typically 30 cm or greater, depending on type as well as application: experimental power reactor or commercial reactor). Bremsstrahlung energy is deposited on the graphite surface and re-radiated away as thermal radiation. Fast neutrons are slowed down in the graphite, depositing most of their energy. This energy is then either radiated to a secondary blanket with coolant tubes, as in types A and B, or is removed by intermittent direct gas cooling (type C). In types A and B, radiation damage to the structural material of the coolant tubes in the secondary blanket is reduced by one or two orders of magnitude by the graphite screen, while in type C, the blanket is only cooled when the reactor is shut down, so that coolant cannot quench the plasma, whatever the degree of radiation damage.

  7. The Future Of Spin Physics At BNL

    Science.gov (United States)

    Aronson, Samuel; Deshpande, Abhay

    2007-06-01

    The Relativistic Heavy Ion Collider (RHIC) at BNL is the world's only polarized proton-proton collider. Collisions at center-of-mass energies up to 500 GeV and beam polarizations approaching 70% (longitudinal or transverse) are provided to two experiments, STAR and PHENIX, at luminosities ≥ 10³² cm⁻² s⁻¹. Transverse polarized beam has also been provided to the BRAHMS experiment. Measurements that bear on the important question of the spin content of the nucleon are beginning to appear. Over the next 10 years, as the performance of polarized proton running at RHIC is further developed, the Spin Physics program at RHIC will provide definitive measurements of the contributions to the proton's spin of the gluon, the sea quarks and the orbital motion of the partons in the proton's wave function. We plan to extend the reach of our study of the role of spin in QCD with the development of "eRHIC," which will provide polarized e-p collisions to a new detector.

  8. Analyzing Effects of Turbulence on Power Generation Using Wind Plant Monitoring Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, J.; Chowdhury, S.; Hodge, B. M.

    2014-01-01

    In this paper, a methodology is developed to analyze how ambient and wake turbulence affects the power generation of a single wind turbine within an array of turbines. Using monitoring data from a wind power plant, we selected two sets of wind and power data for turbines on the edge of the wind plant that resemble (i) an out-of-wake scenario (i.e., when the turbine directly faces incoming winds) and (ii) an in-wake scenario (i.e., when the turbine is under the wake of other turbines). For each set of data, two surrogate models were then developed to represent the turbine power generation (i) as a function of the wind speed; and (ii) as a function of the wind speed and turbulence intensity. Support vector regression was adopted for the development of the surrogate models. Three types of uncertainties in the turbine power generation were also investigated: (i) the uncertainty in power generation with respect to the published/reported power curve, (ii) the uncertainty in power generation with respect to the estimated power response that accounts for only mean wind speed; and (iii) the uncertainty in power generation with respect to the estimated power response that accounts for both mean wind speed and turbulence intensity. Results show that (i) under the same wind conditions, the turbine generates different power between the in-wake and out-of-wake scenarios, (ii) a turbine generally produces more power under the in-wake scenario than under the out-of-wake scenario, (iii) the power generation is sensitive to turbulence intensity even when the wind speed is greater than the turbine rated speed, and (iv) there is relatively more uncertainty in the power generation under the in-wake scenario than under the out-of-wake scenario.
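
    The surrogate models described above can be built with support vector regression; the sketch below fits the two surrogates, one on wind speed alone and one on wind speed plus turbulence intensity, using synthetic stand-in data, since the plant monitoring data, rated power, and SVR hyperparameters are not given in the abstract.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for the monitoring data: mean wind speed (m/s), turbulence
# intensity (-) and turbine power (kW).  Rated power and the toy power curve are
# assumptions for illustration only.
rng = np.random.default_rng(0)
ws = rng.uniform(3.0, 20.0, 500)
ti = rng.uniform(0.05, 0.25, 500)
rated = 1500.0
power = np.clip(rated * (ws / 12.0) ** 3, 0.0, rated) * (1.0 - 0.5 * ti) \
        + rng.normal(0.0, 30.0, 500)

# Surrogate (i): power as a function of mean wind speed only.
m1 = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=5.0))
m1.fit(ws.reshape(-1, 1), power)

# Surrogate (ii): power as a function of wind speed and turbulence intensity.
m2 = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=5.0))
m2.fit(np.column_stack([ws, ti]), power)

print("predicted power at 14 m/s, TI = 0.10:", m2.predict([[14.0, 0.10]])[0])
```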

  9. Analyzing clonal fidelity of micropropagated Psidium guajava L. plants using simple sequence repeat markers

    Science.gov (United States)

    Micropropagation of Psidium guajava L. (guava) is a viable alternative to currently adopted techniques for large-scale plant propagation of commercial cultivars. Assessment of clonal fidelity in micropropagated plants is the first step towards ensuring genetic uniformity in mass production of planti...

  10. The superconducting inflector for the BNL g-2 experiment

    NARCIS (Netherlands)

    Yamamoto, A; Makida, Y; Tanaka, K; Krieman, F; Roberts, BL; Brown, HN; Bunce, G; Danby, GT; G-Perdekamp, M; Hseuh, H; Jia, L.; Lee, YY; Mapes, M; Meng, W; Morse, W; Pai, C; Prigl, R; Sampson, W; Sandberg, J; Suenaga, M; Tallerico, T; Toldo, F; Woodle, K; Green, MA; Itoh, I.; Otsuka, H.; Saito, Y; Ozawa, T; Tachiya, Y; Tanaka, H; Grossmann, A; Jungmann, K; Putlitz, GZ; Deng, H; Dhawan, S; Hughes, Robert E; Kawall, D; Pretz, J; Redin, S; Sichtermann, E; Steinmetz, A

    2002-01-01

    The muon g-2 experiment at Brookhaven National Laboratory (BNL) has the goal of determining the muon anomalous magnetic moment, a(mu) (= (g-2)/2), to the very high precision of 0.35 parts per million and thus requires a storage ring magnet with great stability and homogeneity. A super-ferric storage

  11. Data Model of the BNL Archive and Dissemination System

    Energy Technology Data Exchange (ETDEWEB)

    Heller, J; Osterer, L

    1977-02-01

    The Data Model, i.e., the information content of the data base as it is viewed by the users, of the BNL Archive and Dissemination System is presented. The syntax of the data model is stated in BNF form, and the semantic meaning is discussed. Examples of the use of the data model are given. 3 figs.

  12. Plant Cell Division Analyzed by Transient Agrobacterium-Mediated Transformation of Tobacco BY-2 Cells.

    Science.gov (United States)

    Buschmann, Henrik

    2016-01-01

    The continuing analysis of plant cell division will require additional protein localization studies. This is greatly aided by GFP technology, but plant transformation and the maintenance of transgenic lines can present a significant technical bottleneck. In this chapter I describe a method for the Agrobacterium-mediated genetic transformation of tobacco BY-2 cells. The method allows for the microscopic analysis of fluorescence-tagged proteins in dividing cells within 2 days after starting a coculture. This transient transformation procedure requires only standard laboratory equipment. It is hoped that this rapid method will aid researchers conducting live-cell localization studies in plant mitosis and cytokinesis.

  13. [Usefulness of the Centrifuge Accommodation Module for analyzing gravity responses in plant seedlings].

    Science.gov (United States)

    Hoson, T

    2001-10-01

    Onboard centrifuges are indispensable tools for clarifying the effects of microgravity on various physiological processes in plant seedlings. Centrifuges are basically attached to the incubators designed for the International Space Station (ISS). However, because of size limitations, the centrifuge loaded in the Cell Biology Experiment Facility (CBEF) is usable only for small seedlings such as Arabidopsis. The Centrifuge Accommodation Module (CAM) has great advantages in the size and amount of plant material that can be loaded, the quality of the acceleration produced, and its ease of operation. The CAM is the apparatus that most characterizes the ISS, and its on-schedule construction is highly anticipated.

  14. Life cycle assessment of a HYSOL concentrated solar power plant: Analyzing the effect of geographic location

    NARCIS (Netherlands)

    Corona, B.; Ruiz, Diego; San Miguel, Guillermo

    2016-01-01

    Concentrating Solar Power (CSP) technology is developing in order to achieve higher energy efficiency, reduced economic costs, and improved firmness and dispatchability in the generation of power on demand. To this purpose, a research project titled HYSOL has developed a new power plant, consisting

  15. Analyzing the biomass filter behavior in an anaerobic wastewater treatment plant

    Energy Technology Data Exchange (ETDEWEB)

    Carlos-Hernandez, S.

    2009-07-01

    Nowadays, waste emissions to air, water, and soil must be reduced in order to meet increasingly strict environmental rules. In the case of wastewater, there is great interest in improving treatment plant performance. The paper deals with the analysis, via the phase portrait method, of the behavior of a biomass filter in a completely stirred tank reactor. (Author)

  16. Storing carbon dioxide in saline formations : analyzing extracted water treatment and use for power plant cooling.

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, Brian P.; Heath, Jason E.; Borns, David James; Dewers, Thomas A.; Kobos, Peter Holmes; Roach, Jesse D.; McNemar, Andrea; Krumhansl, James Lee; Klise, Geoffrey T.

    2010-10-01

    In an effort to address the potential to scale up carbon dioxide (CO2) capture and sequestration in United States saline formations, an assessment model is being developed using a national database and modeling tool. This tool builds upon the existing NatCarb database as well as supplemental geological information to address the scale-up potential for carbon dioxide storage within these formations. The focus of the assessment model is to specifically address the question, 'Where are opportunities to couple CO2 storage and extracted water use for existing and expanding power plants, and what are the economic impacts of these systems relative to traditional power systems?' Initial findings indicate that less than 20% of the existing complete saline formation well data points meet the working criteria for combined CO2 storage and extracted water treatment systems. The initial results of the analysis indicate that less than 20% of the existing complete saline formation well data may meet the working depth, salinity, and formation-intersecting criteria. These results were taken from examining updated NatCarb data. This finding, while just an initial result, suggests that the combined use of saline formations for CO2 storage and extracted water use may be limited by the selection criteria chosen. A second preliminary finding of the analysis suggests that some of the necessary data required for this analysis are not present in all of the NatCarb records. This type of analysis represents the beginning of a larger, in-depth study of all existing coal and natural gas power plants and saline formations in the U.S. for the purpose of potential CO2 storage and water reuse for supplemental cooling. Additionally, this allows for potential policy insight when understanding the difficult nature of combined potential institutional (regulatory) and physical (engineered geological sequestration and extracted water
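
    A minimal sketch of the kind of screening implied above is shown below: well records are filtered against depth, salinity, and formation-intersection criteria. The record fields and threshold values are hypothetical placeholders, since the abstract does not state the actual working criteria.

```python
# Toy screening of saline-formation well records against combined-use criteria.
# The record fields and threshold values are hypothetical placeholders; the
# abstract does not state the actual working criteria.
wells = [
    {"id": "A-01", "depth_m": 1500, "tds_mg_L": 40000, "intersects_formation": True},
    {"id": "A-02", "depth_m": 600,  "tds_mg_L": 15000, "intersects_formation": True},
    {"id": "B-07", "depth_m": 2200, "tds_mg_L": 90000, "intersects_formation": False},
    {"id": "C-11", "depth_m": 1800, "tds_mg_L": 30000, "intersects_formation": True},
]

MIN_DEPTH_M = 800      # assumed minimum depth for CO2 storage
MAX_TDS_MG_L = 60000   # assumed maximum salinity for treatable extracted water

eligible = [w for w in wells
            if w["depth_m"] >= MIN_DEPTH_M
            and w["tds_mg_L"] <= MAX_TDS_MG_L
            and w["intersects_formation"]]

print(f"{len(eligible)}/{len(wells)} wells meet all criteria:",
      [w["id"] for w in eligible])
```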

  17. Possible use of a 3-D clinostat to analyze plant growth processes under microgravity conditions.

    Science.gov (United States)

    Hoson, T; Kamisaka, S; Buchen, B; Sievers, A; Yamashita, M; Masuda, Y

    1996-01-01

    A three-dimensional (3-D) clinostat equipped with two rotation axes placed at right angles was constructed, and various growth processes of higher plants grown on this clinostat were compared with ground controls, with plants grown on the conventional horizontal clinostat, and with those under real microgravity in space. On the 3-D clinostat, cress roots developed a normal root cap and the statocytes showed the typical polar organization except a random distribution of statoliths. The structural features of clinostatted statocytes were fundamentally similar to those observed under real microgravity. The graviresponse of cress roots grown on the 3-D clinostat was the same as the control roots. On the 3-D clinostat, shoots and roots exhibited a spontaneous curvature as well as an altered growth direction. Such an automorphogenesis was sometimes exaggerated when plants were subjected to the horizontal rotation, whereas the curvature was suppressed on the vertical rotation. These discrepancies in curvature between the 3-D clinostat and the conventional ones appear to be brought about by the centrifugal force produced. Thus, the 3-D clinostat was proven as a useful device to simulate microgravity.

  18. ELECTRON COOLING AND ELECTRON-ION COLLIDERS AT BNL.

    Energy Technology Data Exchange (ETDEWEB)

    BEN-ZVI,I.

    2007-10-03

    Superconducting Energy Recovery Linacs (ERL) have significant potential uses in various fields, including High Energy Physics and Nuclear Physics. Brookhaven National Laboratory (BNL) is pursuing some of the potential applications in this area and the technology issues that are associated with these applications. The work addressed in this paper is carried out at BNL towards applications in electron cooling of high-energy hadron beams and electron-nucleon colliders. The common issues for these applications are the generation of high currents of polarized or high-brightness unpolarized electrons, high-charge per bunch and high-current. One must address the associated issue of High-Order Modes generation and damping. Superconducting ERLs have great advantages for these applications as will be outlined in the text.

  19. Life Cycle Assessment of a HYSOL Concentrated Solar Power Plant: Analyzing the Effect of Geographic Location

    Directory of Open Access Journals (Sweden)

    Blanca Corona

    2016-05-01

    Concentrating Solar Power (CSP) technology is developing in order to achieve higher energy efficiency, reduced economic costs, and improved firmness and dispatchability in the generation of power on demand. To this purpose, a research project titled HYSOL has developed a new power plant, consisting of a combined cycle configuration with a 100 MWe steam turbine and an 80 MWe gas turbine fed with biomethane. Technological developments must be supported by the identification, quantification, and evaluation of the environmental impacts produced. The aim of this paper is to evaluate the environmental performance of a CSP plant based on HYSOL technology using a Life Cycle Assessment (LCA) methodology while considering different locations. The scenarios investigated include different geographic locations (Spain, Chile, Kingdom of Saudi Arabia, Mexico, and South Africa), an alternative modelling procedure for biomethane, and the use of natural gas as an alternative fuel. Results indicate that the geographic location has a significant influence on the environmental profile of the HYSOL CSP plant. The results obtained for the HYSOL configuration located in different countries presented significant differences (between 35% and 43%, depending on the category), especially in the climate change and water stress categories. The differences are mainly attributable to the local availability of solar and water resources and the composition of the national electricity mix. In addition, HYSOL technology performs significantly better when hybridizing with biomethane instead of natural gas. This evidence is particularly relevant in the climate change category, where biomethane hybridization emits 27.9–45.9 kg CO2 eq per MWh (depending on the biomethane modelling scenario) and the natural gas scenario emits 264 kg CO2 eq/MWh.

  20. BNL ACTIVITIES IN ADVANCED NEUTRON SOURCE DEVELOPMENT: PAST AND PRESENT

    Energy Technology Data Exchange (ETDEWEB)

    HASTINGS,J.B.; LUDEWIG,H.; MONTANEZ,P.; TODOSOW,M.; SMITH,G.C.; LARESE,J.Z.

    1998-06-14

    Brookhaven National Laboratory has been involved in advanced neutron sources almost from its inception in 1947. These efforts have mainly focused on steady state reactors beginning with the construction of the first research reactor for neutron beams, the Brookhaven Graphite Research Reactor. This was followed by the High Flux Beam Reactor that has served as the design standard for all the subsequent high flux reactors constructed worldwide. In parallel with the reactor developments BNL has focused on the construction and use of high energy proton accelerators. The first machine to operate over 1 GeV in the world was the Cosmotron. The machine that followed this, the AGS, is still operating and is the highest intensity proton machine in the world and has nucleated an international collaboration investigating liquid metal targets for next generation pulsed spallation sources. Early work using the Cosmotron focused on spallation product studies for both light and heavy elements into the several GeV proton energy region. These original studies are still important today. In the sections below the authors discuss the facilities and activities at BNL focused on advanced neutron sources. BNL is involved in the proton source for the Spallation Neutron source, spectrometer development at LANSCE, target studies using the AGS and state-of-the-art neutron detector development.

  2. [Geostatistical analysis of the causes of the circular distribution pattern of plant communities in Horqin Sandy Land].

    Science.gov (United States)

    He, Xingdong; Gao, Yubao; Zhao, Wenzhi; Cong, Zili

    2004-09-01

    Investigation results in the present study showed that plant communities formed typical concentric-circle distribution patterns along the habitat gradient from dune top to slope to interdune on a few large fixed dunes in the middle part of Horqin Sandy Land. To explain this phenomenon, the water content and its spatial heterogeneity in the sand layers at different locations on the dunes were analyzed. In these dunes, water contents in the sand layers of the tops were lower than those of the slopes, and both were lower than those of the interdunes. According to the geostatistical analysis, for both shifting and fixed dunes the spatial heterogeneity of water content in the sand layers changed regularly: the ratio between nugget and sill and the range decreased gradually, while the fractal dimension increased gradually. These regular changes indicate that random spatial heterogeneity decreased gradually and autocorrelated spatial heterogeneity increased gradually from the top to the slope to the interdune. The regular changes of water content in the sand layers and their spatial heterogeneity at different locations on the dunes might therefore be an important cause of the formation of the concentric-circle patterns of the plant communities on these fixed dunes.
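
    The nugget, sill, range, and their ratios referred to above come from fitting a semivariogram model to spatially sampled water contents. The sketch below computes an empirical semivariogram for a synthetic transect and fits a spherical model; the data, lag bins, and bounds are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic 1-D transect of sand-layer water content (%) at 1 m spacing, standing
# in for samples along a dune; values are illustrative only.
rng = np.random.default_rng(1)
x = np.arange(0.0, 60.0, 1.0)
z = 4.0 + 1.5 * np.sin(x / 4.0) + rng.normal(0.0, 0.4, x.size)

# Empirical semivariogram: gamma(h) = mean of 0.5 * (z(x+h) - z(x))^2 at lag h.
lags = np.arange(1, 21)
gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

def spherical(h, nugget, sill, a):
    """Spherical semivariogram model with nugget, sill and range a."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

(nugget, sill, a), _ = curve_fit(spherical, lags, gamma,
                                 p0=[0.1, gamma.max(), 10.0],
                                 bounds=([0.0, 0.0, 1.0], [5.0, 10.0, 40.0]))
print(f"nugget/sill ratio: {nugget / sill:.2f}, range: {a:.1f} m")
```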

  3. Evolving technologies for growing, imaging and analyzing 3D root system architecture of crop plants

    Institute of Scientific and Technical Information of China (English)

    Miguel A Pineros; Pierre-Luc Pradier; Nathanael M Shaw; Ithipong Assaranurak; Susan R McCouch; Craig Sturrock; Malcolm Bennett; Leon V Kochian; Brandon G Larson; Jon E Shaff; David J Schneider; Alexandre Xavier Falcao; Lixing Yuan; Randy T Clark; Eric J Craft; Tyler W Davis

    2016-01-01

    A plant’s ability to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, can be strongly influenced by root system architecture (RSA), the three-dimensional distribution of the different root types in the soil. The ability to image, track and quantify these root system attributes in a dynamic fashion is a useful tool in assessing desirable genetic and physiological root traits. Recent advances in imaging technology and phenotyping software have resulted in substantive progress in describing and quantifying RSA. We have designed a hydroponic growth system which retains the three-dimensional RSA of the plant root system, while allowing for aeration, solution replenishment and the imposition of nutrient treatments, as well as high-quality imaging of the root system. The simplicity and flexibility of the system allows for modifications tailored to the RSA of different crop species and improved throughput. This paper details the recent improvements and innovations in our root growth and imaging system which allows for greater image sensitivity (detection of fine roots and other root details), higher efficiency, and a broad array of growing conditions for plants that more closely mimic those found under field conditions.

  4. Determination of Total Phosphorus of Maize Plant Samples by Continuous Flow Analyzer in Comparison with Vanadium Molybdate Yellow Colorimetric Method

    Directory of Open Access Journals (Sweden)

    LIU Yun-xia

    2015-12-01

    The vanadium molybdate yellow colorimetric method (VMYC method) is regarded as one of the conventional methods for determining total phosphorus (P) in plants, but it is a time-consuming procedure. The continuous flow analyzer (CFA) is a fluid-stream segmentation technique with air segments. It is used to measure P concentration based on the molybdate-antimony-ascorbic acid method of Murphy and Riley. Sixty-nine maize plant samples were selected and digested with H2SO4-H2O2. P concentrations in the digests were determined by the CFA and the VMYC method, respectively. A t test found no significant difference between the plant P contents measured by the CFA and the VMYC method. A linear equation best described their relationship: Y(CFA-P) = 0.927X(VMYC-P) - 0.002. The Pearson's correlation coefficient was 0.985 at a significance level of P<0.01 (n=69). The CFA method for plant P measurement had high precision, with a relative standard deviation (RSD) of less than 1.5%. It is suggested that the CFA based on Murphy and Riley colorimetric detection can be used to determine total plant P in digest solutions obtained with H2SO4-H2O2. The CFA method is labor saving and can handle large numbers of samples. Human error in mixing and other operations is greatly reduced.
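
    The comparison workflow described above (paired t test, linear calibration, Pearson correlation, and replicate RSD) can be reproduced in a few lines; the sketch below uses synthetic paired measurements because the actual 69-sample dataset is not given in the abstract.

```python
import numpy as np
from scipy.stats import linregress, ttest_rel

# Synthetic paired P measurements (g/kg) standing in for the 69 maize digests;
# the real data are not given in the abstract, so these values are illustrative.
rng = np.random.default_rng(2)
vmyc = rng.uniform(1.0, 5.0, 69)                 # reference colorimetric method
cfa = vmyc + rng.normal(0.0, 0.08, 69)           # flow-analyzer readings

# Paired t test (the abstract reports no significant difference) and calibration.
t_stat, p_val = ttest_rel(cfa, vmyc)
fit = linregress(vmyc, cfa)
print(f"paired t test: p = {p_val:.3f}")
print(f"CFA = {fit.slope:.3f} * VMYC {fit.intercept:+.3f}, r = {fit.rvalue:.3f}")

# Precision: relative standard deviation of replicate CFA readings of one digest.
replicates = 3.20 + rng.normal(0.0, 0.03, 6)
rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()
print(f"RSD of replicates: {rsd:.2f} %")
```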

  5. Analyzing the possibility of constructing the air heating system for an integrated solid fuel gasification combined-cycle power plant

    Science.gov (United States)

    Mikula, V. A.; Ryzhkov, A. F.; Val'tsev, N. V.

    2015-11-01

    Combined-cycle power plants operating on solid fuel have so far been implemented only in demonstration projects. One possible way to improve such plants is a shift to hybrid process circuits of integrated gasification combined-cycle plants with external firing of solid fuel. A high-temperature air heater serving to heat compressed air is a key element of the hybrid process circuit. The article describes the application of a high-temperature recuperative metal air heater in the process circuit of an integrated gasification combined-cycle (IGCC) power plant. The available experience with high-temperature air heating is considered, and possible air heater layout arrangements are analyzed along with domestically produced heat-resistant grades of steel suitable for manufacturing such an air heater. An alternative to the traditional design is proposed, in which solid fuel is fired in a noncooled furnace extension, the combustion products are mixed with recirculation gases, and the mixture is then fed to a convective air heater. This design makes it possible to achieve considerably smaller capital outlays and operating costs. Data are presented from thermal and aerodynamic calculations of a high-temperature air heater with a thermal capacity of 258 MW that heats air to up to 800°C for use in the hybrid process circuit of a combined-cycle power plant.
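
    A quick energy-balance check of the quoted duty is sketched below: the required air mass flow follows from Q = m·cp·ΔT. The inlet temperature and mean specific heat are assumptions for illustration; the article's detailed thermal and aerodynamic calculations are not reproduced.

```python
# Back-of-the-envelope air mass flow for the 258 MW air heater described above.
# The inlet temperature and mean specific heat are assumptions for illustration.
Q_MW = 258.0           # thermal duty from the abstract
T_in_C = 400.0         # assumed air temperature entering the heater
T_out_C = 800.0        # target air temperature from the abstract
cp_kJ_per_kgK = 1.09   # assumed mean cp of air over 400-800 degC

m_dot_kg_s = Q_MW * 1.0e3 / (cp_kJ_per_kgK * (T_out_C - T_in_C))
print(f"required air mass flow: {m_dot_kg_s:.0f} kg/s")
```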

  6. THE BNL ASTD FIELD LAB - NEAR-REAL-TIME CHARACTERIZATION OF BNL STOCKPILED SOILS TO ACCELERATE COMPLETION OF THE EM CHEMICAL HOLES PROJECT.

    Energy Technology Data Exchange (ETDEWEB)

    BOWERMAN,B.S.; ADAMS,J.W.; HEISER,J.; KALB,P.D.; LOCKWOOD,A.

    2003-04-01

    As of October 2001, approximately 7,000 yd{sup 3} of stockpiled soil remained at Brookhaven National Laboratory (BNL) after the remediation of the BNL Chemical/Animal/Glass Pits disposal area. The soils were originally contaminated with radioactive materials and heavy metals, depending on what materials had been interred in the pits, and how the pits were excavated. During the 1997 removal action, the more hazardous/radioactive materials were segregated, along with chemical liquids and solids, animal carcasses, intact gas cylinders, and a large quantity of metal and glass debris. Nearly all of these materials have been disposed of. In order to ensure that all debris was removed and to characterize the large quantity of heterogeneous soil, BNL initiated an extended sorting, segregation, and characterization project directed at the remaining soil stockpiles. The project was co-funded by the Department of Energy Environmental Management Office (DOE EM) through the BNL Environmental Restoration program and through the DOE EM Office of Science and Technology Accelerated Site Technology Deployment (ASTD) program. The focus was to remove any non-conforming items, and to assure that mercury and radioactive contaminant levels were within acceptable limits for disposal as low-level radioactive waste. Soils with mercury concentrations above allowable levels would be separated for disposal as mixed waste. Sorting and segregation were conducted simultaneously. Large stockpiles (ranging from 150 to 1,200 yd{sup 3}) were subdivided into manageable 20 yd{sup 3} units after powered vibratory screening. The 1/2-inch screen removed almost all non-conforming items (plus some gravel). Non-conforming items were separated for further characterization. Soil that passed through the screen was also visually inspected before being moved to a 20 yd{sup 3} ''subpile.'' Eight samples from each subpile were collected after establishing a grid of four quadrants: north, east

  7. BNL 56 MHz HOM damper prototype fabrication at JLAB

    Energy Technology Data Exchange (ETDEWEB)

    Huque, N.; McIntyre, G.; Daly, E. F.; Clemens, W.; Wu, Q.; Seberg, S.; Bellavia, S.

    2015-05-03

    A prototype Higher-Order Mode (HOM) Damper was fabricated at JLab for the Relativistic Heavy-Ion Collider’s (RHIC) 56 MHz cavity at Brookhaven National Laboratory (BNL). Primarily constructed from high RRR Niobium and Sapphire, the coaxial damper presented significant challenges in electron-beam welding (EBW), brazing and machining via acid etching. The results of the prototype operation brought about changes in the damper design, due to overheating braze alloys and possible multi-pacting. Five production HOM dampers are currently being fabricated at JLab. This paper outlines the challenges faced in the fabrication process, and the solutions put in place.

  8. BNL 56 MHz HOM Damper Prototype Fabrication at JLab

    Energy Technology Data Exchange (ETDEWEB)

    Huque, Naeem A. [Jefferson Lab., Newport News, VA (United States); Daly, Edward F. [Jefferson Lab., Newport News, VA (United States); Clemens, William A. [Jefferson Lab., Newport News, VA (United States); McIntyre, Gary T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Wu, Qiong [Brookhaven National Lab. (BNL), Upton, NY (United States); Seberg, Scott [Brookhaven National Lab. (BNL), Upton, NY (United States); Bellavia, Steve [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-09-01

    A prototype Higher-Order Mode (HOM) Damper was fabricated at JLab for the Relativistic Heavy-Ion Collider's (RHIC) 56 MHz cavity at Brookhaven National Laboratory (BNL). Primarily constructed from high RRR Niobium and Sapphire, the coaxial damper presented significant challenges in electron-beam welding (EBW), brazing and machining via acid etching. The results of the prototype operation brought about changes in the damper design, due to overheating braze alloys and possible multi-pacting. Five production HOM dampers are currently being fabricated at JLab. This paper outlines the challenges faced in the fabrication process, and the solutions put in place.

  9. SERPENTINE COIL TOPOLOGY FOR BNL DIRECT WIND SUPERCONDUCTING MAGNETS.

    Energy Technology Data Exchange (ETDEWEB)

    PARKER, B.; ESCALLIER, J.

    2005-05-16

    Serpentine winding, a recent innovation developed at BNL for direct winding superconducting magnets, allows winding a coil layer of arbitrary multipolarity in one continuous winding process and greatly simplifies magnet design and production compared to the planar patterns used before. Serpentine windings were used for the BEPC-II Upgrade and JPARC magnets and are proposed to make compact final focus magnets for the EC. Serpentine patterns exhibit a direct connection between 2D body harmonics and harmonics derived from the integral fields. Straightforward 2D optimization yields good integral field quality with uniformly spaced (natural) coil ends. This and other surprising features of Serpentine windings are addressed in this paper.
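
    For readers unfamiliar with the term, the "2D body harmonics" mentioned above refer to the standard accelerator-magnet multipole expansion, quoted here in its conventional textbook form; this is generic notation, not a result from the paper.

```latex
% Standard 2D multipole (body-harmonic) expansion for accelerator magnets,
% given only to illustrate what "2D body harmonics" refers to.
\[
  B_y(x,y) + i\,B_x(x,y) \;=\; B_{\mathrm{ref}} \sum_{n=1}^{\infty}
  \left( b_n + i\,a_n \right)
  \left( \frac{x + i y}{R_{\mathrm{ref}}} \right)^{n-1},
\]
% where $b_n$ and $a_n$ are the normal and skew harmonic coefficients at the
% reference radius $R_{\mathrm{ref}}$; integral-field harmonics are obtained by
% replacing $B_x$, $B_y$ with their integrals along the magnet axis.
```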

  10. Results from the experiment E895 at the BNL AGS

    CERN Document Server

    Rai, G; Alexander, J; Anderson, M; Best, D; Brady, F P; Case, T; Caskey, W; Cebra, D A; Chance, J L; Chung, P; Cole, B; Crowe, K; Das, A; Draper, J E; Gilkes, M L; Gushue, S; Heffner, M; Hirsch, A S; Hjort, E L; Huo, L; Justice, M; Kaplan, M; Keane, D; Kintner, J; Klay, J; Krofcheck, D; Lacey, R; Lisa, M A; Liu, H; Liu, Y M; McGrath, R; Milosevich, Z; Odyniec, Grazyna Janina; Olson, D L; Panitkin, S Y; Pinkenburg, C H; Porile, N T; Ritter, H G; Romero, J L; Scharenberg, R P; Schröder, L S; Srivastava, B K; Stone, N T B; Symons, T J M; Wang, S; Wells, R; Whitfield, J; Wienold, T; Witt, R; Wood, L; Yang, X; Zhang, W; Zhang, Y

    1999-01-01

    We present some of the latest results from the E895 experiment conducted at the BNL AGS accelerator. Au+Au collisions were recorded by the EOS Time Projection Chamber (TPC) at beam energies of 2, 4, 6, and 8 A GeV. The TPC detector permitted the reconstruction of individual collision events with almost 4 pi acceptance and good particle identification. This capability allowed E895 to study global observables and two particle correlations with respect to symmetries of the event. Flow excitation functions are examined and discussed in the context of the Nuclear Equation of State.

  11. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER, VOLUME 39, RHIC SPIN COLLABORATION MEETING, VII.

    Energy Technology Data Exchange (ETDEWEB)

    FOX, B.

    2002-04-22

    In the first meeting of this series (which took place at BNL on February 22, 2002), we focused on the upgrades which are expected to be completed prior to the end of this year and thus available for the next run. The two main items are the Spin Rotators in RHIC and the CNI polarimeter for the AGS. In addition, because of the progress on technical issues related to the design of partial snake in the AGS, we also had a presentation on this topic. And, finally, in keeping with a tradition of having some theoretical presentations to accompany the experimental and machine presentations, we had presentations on single spin transverse asymmetries in proton-proton reactions and Coulomb-Nuclear Interference analyzing powers in proton-carbon elastic scattering.

  12. Antagonistic effects of ethyl methanesulfonate and maleic hydrazide in inducing somatic mutations in the stamen hairs of Tradescantia clone BNL 4430

    OpenAIRE

    市川, 定夫

    1998-01-01

    Mutagenic interaction between ethyl methanesulfonate (EMS; a monofunctional alkylating agent) and maleic hydrazide (MH; a promutagen most likely activated into a mutagen in plants by peroxidase) was studied in the stamen hairs of Tradescantia clone BNL 4430, a blue/pink heterozygote. Since EMS has been shown to act synergistically with X rays in inducing mutations, and mutagenic synergisms have also been observed between X rays and MH when exposure to X rays precedes MH treatments, EMS and MH w...

  13. Cooling Scheme for BNL-Built LHC Magnets

    CERN Document Server

    Ostojic, R; Van Weelderen, R; Willen, E H; Wu, K C

    1999-01-01

    Brookhaven National Laboratory (BNL) will provide four types of magnets, identified as D1, D2, D3 and D4, for the Insertion Regions of the Large Hadron Collider (LHC) as part of an international collaboration. These magnets utilize the dipole coil design of the Relativistic Heavy Ion Collider (RHIC) at BNL, for performance, reliability and cost reasons. The magnet cold mass and cryostat have been designed to ensure that these magnets meet all performance requirements in the LHC sloped tunnel using its cryogenic distribution system. D1 is a RHIC arc dipole magnet. D2 and D4 are 2-in-1 magnets, two coils in one cold mass, in a cryostat. D3 is a 1-in-1 magnet, one coil in one cold mass, with two cold masses side by side in a cryostat. D1 and D4 will be cooled by helium II at 1.9 K using a bayonet heat exchanger similar to the main cooling system of LHC. D2 and D3 will be cooled by liquid helium at 4.5 K using a Two-Feed scheme. A detailed description of the cooling scheme for these magnets, their cryostats, spec...

  14. DOE/NORA/BNL oil heat research agenda development

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, R.J. [Brookhaven National Lab., Upton, NY (United States); Batey, J. [Energy Research Center, Easton, CT (United States)

    1996-07-01

    The National Oilheat Research Alliance (NORA) has been formed and is currently working to establish a Congressionally approved oilheat check-off program to provide funding for research, education, training, safety, and marketing to benefit the US oilheat industry. NORA will be presenting this program to the Congress for its consideration and approval in the coming year. It will follow the same path as the National Propane Gas Association, which is currently working on obtaining Congressional approval of a propane check-off program that has already attracted over 120 cosponsors in the House of Representatives. An effort to define the basis of a joint US Department of Energy (DOE) and oilheat industry (marketers) program for future oilheat equipment research and development will be conducted during FY-1996. At the request of NORA representatives, BNL will coordinate the development of a research agenda addressing three categories of activities: research appropriate for DOE support only, research appropriate for NORA support only, and research appropriate for co-funding by both organizations. This will also serve to update a prior oil-fueled research plan developed for DOE ten years ago, which has been the road map for DOE's very successful Oil Heat R&D program at BNL.

  15. Proceedings of RIKEN BNL Research Center workshop on RHIC spin

    Energy Technology Data Exchange (ETDEWEB)

    SOFFER,J.

    1999-10-06

    This RHIC Spin Workshop is the 1999 annual meeting of the RHIC Spin Collaboration, and the second to be hosted at Brookhaven and sponsored by the RIKEN BNL Research Center. The previous meetings were at Brookhaven (1998), Marseille (1996), MIT in 1995, Argonne in 1994, Tucson in 1991, and the Polarized Collider Workshop at Penn State in 1990. As noted last year, the Center provides a home for combined work on spin by theorists, experimenters, and accelerator physicists. This proceedings, as last year, is a compilation of 1-page summaries and 5 selected transparencies for each speaker. It is designed to be available soon after the workshop is completed. Speakers are welcome to include web or other references for additional material. The RHIC spin program and RHIC are rapidly becoming reality. RHIC has completed its first commissioning run, as described here by Steve Peggs. The first Siberian Snake for spin has been completed and is being installed in RHIC. A new polarized source from KEK and TRIUMF with over 1 milliampere of polarized H{sup minus} is being installed, described by Anatoli Zelenski. They have had a successful test of a new polarimeter for RHIC, described by Kazu Kurita and Haixin Huang. Spin commissioning is expected next spring (2000), and the first physics run for spin is anticipated for spring 2001. The purpose of the workshop is to get everyone together about once per year and discuss goals of the spin program, progress, problems, and new ideas. There are also many separate regular forums on spin. There are spin discussion sessions every Tuesday, now organized by Naohito Saito and Werner Vogelsang. The spin discussion schedule and copies of presentations are posted on http://riksg01.rhic.bnl.gov/rsc. Speakers and other spinners are encouraged to come to BNL and to lead a discussion on their favorite ideas. There are also regular polarimeter and snake meetings on alternate Thursdays, led by Bill McGahern, the lead engineer for the accelerator spin

  16. Analyzing the Technology of Using Ash and Slag Waste from Thermal Power Plants in the Production of Building Ceramics

    Science.gov (United States)

    Malchik, A. G.; Litovkin, S. V.; Rodionov, P. V.; Kozik, V. V.; Gaydamak, M. A.

    2016-04-01

    The work describes the problem of impounding and storing ash and slag waste at coal-fired thermal power plants in Russia. Recovery and recycling of ash and slag waste are analyzed. The activity of radionuclides, the chemical composition and the particle sizes of the ash and slag waste were determined; the acidity index, the basicity and the class of material were defined. A technology for making ceramic products with the addition of ash and slag waste is proposed. The dependencies on the percentage of ash and slag waste were established, and the optimal baking parameters determined. The obtained materials were tested for physical and mechanical properties, namely water absorption, thermal conductivity and compression strength. Based on the findings, future prospects for the use of ash and slag waste were identified.

  17. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers

    Energy Technology Data Exchange (ETDEWEB)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional conservation of mass, momentum, and energy equations on the tube side, and the proper accounting for the thermal interaction between shell and tube side through the porous medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications.

  18. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers

    Energy Technology Data Exchange (ETDEWEB)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional equations of conservation of mass, momentum, and energy on the tube side and the proper accounting for the thermal interaction between shell and tube side through the porous-medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications. Volume I (Equations and Numerics) of this report describes in detail the basic equations, formulation, solution procedures, and phenomenological models. Volume II (User's Guide and Manual) contains the input instructions, flow charts, sample problems, and descriptions of available options and boundary conditions.
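
    For context, the tube-side equations referred to above are of the generic single-phase, one-dimensional conservation form sketched below; the actual COMMIX-PPC formulation, including its porous-medium and source terms, may differ in detail.

```latex
% Generic single-phase, one-dimensional conservation equations (illustrative
% form only, not necessarily the exact COMMIX-PPC formulation).
\begin{align*}
  \frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} &= 0, \\
  \frac{\partial (\rho u)}{\partial t} + \frac{\partial (\rho u^2)}{\partial x}
    &= -\frac{\partial p}{\partial x} - F_w, \\
  \frac{\partial (\rho h)}{\partial t} + \frac{\partial (\rho u h)}{\partial x}
    &= q''' ,
\end{align*}
% where $F_w$ is a wall-friction term, $q'''$ the volumetric heat exchange with
% the shell side, and pressure work in the energy equation has been neglected.
```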

  19. Field testing the prototype BNL fan-atomized oil burner

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, R.; Celebi, Y. [Brookhaven National Lab., Upton, NY (United States)

    1995-04-01

    BNL has developed a new oil burner design referred to as the Fan-Atomized Burner System. The primary objective of the field study was to evaluate and demonstrate the reliable operation of the Fan-Atomized Burner. The secondary objective was to establish and validate the ability of a low firing rate burner (0.3-0.4 gph) to fully satisfy the heating and domestic hot water load demands of an average household in a climate zone with over 5,000 heating degree days. The field activity was also used to evaluate the practicality of side-wall venting with the Fan-Atomized Burner with a low stack temperature (300°F) and to illustrate the potential for very high efficiency with an integrated heating system approach based on the Fan-Atomized Burner.

  20. Beam Loss Estimates and Control for the BNL Neutrino Facility

    CERN Document Server

    Weng, Wu-Tsung; Raparia, Deepak; Tsoupas, Nicholaos; Wei, Jie; Yung Lee, Yong; Zhang, S Y

    2005-01-01

    BNL plans to upgrade the AGS proton beam from the current 0.14 MW to higher than 1.0 MW for a very long baseline neutrino oscillation experiment. This increase in beam power is mainly due to the faster repetition rate of the AGS, enabled by a new 1.5 GeV superconducting linac serving as injector and replacing the existing booster. The requirement for low beam loss is very important both to protect the beam components and to make hands-on maintenance possible. In this report, the design considerations for achieving high intensity and low loss will be presented. We start by specifying the beam loss limit at every physical process, followed by the proper design and parameters for realizing the required goals. The processes considered in this paper include the emittance growth in the linac, the H-

  1. Injection and acceleration of Au31+ in the BNL AGS.

    Energy Technology Data Exchange (ETDEWEB)

    Fischer,W.; Ahrens, L.; Brown, K.; Gardner, C.; Glenn, W.; Huang, H.; Mapes, M.; Smart, L.; Thieberger, P.; Tsoupas, N.; Zhang, S.Y.; Zeno, K.; Omet, C.; Spiller, P.

    2008-06-23

    Injection and acceleration of ions in a lower charge state reduces space charge effects, and, if further electron stripping is needed, may allow elimination of a stripping stage and the associated beam losses. The former is of interest to the accelerators in the GSI FAIR complex, the latter for BNL RHIC collider operation at energies lower than the current injection energy. Lower charge state ions, however, have a higher likelihood of electron stripping, which can lead to dynamic pressure rises and subsequent beam losses. We report on experiments in the AGS where Au{sup 31+} ions were injected and accelerated instead of the normally used Au{sup 77+} ions. Beam intensities and the average pressure in the AGS ring are recorded and compared with calculations for dynamic pressures and beam losses. The experimental results will be used to benchmark the StrahlSim dynamic vacuum code and will be incorporated in the GSI FAIR SIS100 design.

  2. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP (VOLUME 64)

    Energy Technology Data Exchange (ETDEWEB)

    KHARZEEV,D.; KRETZER,S.; TEANEY,D.; VENUGOPALAN,R.; VOGELSANG,W.

    2004-09-28

    We are presently in a very exciting and important phase of the RHIC era. A huge body of data has been gathered in heavy-ion collisions that provides very convincing evidence for the formation of a quark-gluon plasma in central collisions. Recently, studies of nuclear modification factors in forward dAu collisions have shown tantalizing signatures that may be understood most naturally in terms of a universal form of matter controlling the high energy limit of strong interactions, the Color Glass Condensate. Finally, important advances have also been made in spin physics, where first measurements of single-transverse and double-longitudinal spin asymmetries have been presented, marking a qualitatively new era in this field. The wealth of new experimental data called for a workshop in which theorists took stock and reviewed in depth what has been achieved, in order to give guidance as to what avenues should be taken from here. This was the idea behind the workshop ''Theory Summer Program on RHIC Physics''. We decided to invite a fairly small number of participants--some world leaders in their field, others only at the beginning of their careers, but all actively involved in RHIC physics. Each one of them stayed over an extended period of time, from two to six weeks. Such long-term stays led to particularly fruitful interactions and collaborations with many members of the BNL theory groups, as well as with experimentalists at BNL. They also were most beneficial for achieving the main goal of this workshop, namely to perform detailed studies.

  3. PHYSICS OF THE 1 TERAFLOP RIKEN-BNL-COLUMBIA QCD PROJECT.

    Energy Technology Data Exchange (ETDEWEB)

    MAWHINNEY,R.

    1998-10-16

    A workshop was held at the RIKEN-BNL Research Center on the afternoon of October 16, 1998, as part of the first anniversary ceremony for the center. Titled ''Workshop on Physics of the 1 Teraflop RIKEN-BNL-Columbia QCD Project'', this meeting brought together the physicists from RIKEN-BNL, BNL and Columbia who are using the QCDSP (Quantum Chromodynamics on Digital Signal Processors) computer at the RIKEN-BNL Research Center for studies of QCD. In addition, Akira Ukawa, a leader of the CP-PACS project at the University of Tsukuba in Japan, attended and gave a talk on the Aoki phase. There were also others in attendance who were interested in more general properties of the QCDSP computer. The QCDSP computer and lattice QCD had been presented during the morning ceremony by Shigemi Ohta of KEK and the RIKEN-BNL Research Center. This was followed by a tour of the QCDSP machine room and a formal unveiling of the computer to the attendees of the anniversary ceremony and the press. The rapid completion of construction of the QCDSP computer was made possible through many factors: (1) the existence of a complete design and working hardware at Columbia when the RIKEN-BNL center was being set up, (2) strong support for the project from RIKEN and the center and (3) aggressive involvement of members of the Computing and Communications Division at BNL. With this powerful new resource, the members of the RIKEN-BNL-Columbia QCD project are looking forward to advances in our understanding of QCD.

  4. PHYSICS OF THE 1 TERAFLOP RIKEN-BNL-COLUMBIA QCD PROJECT.

    Energy Technology Data Exchange (ETDEWEB)

    MAWHINNEY,R.

    1998-10-16

    A workshop was held at the RIKEN-BNL Research Center on the afternoon of October 16, 1998, as part of the first anniversary ceremony for the center. Titled ''Workshop on Physics of the 1 Teraflop RIKEN-BNL-Columbia QCD Project'', this meeting brought together the physicists from RIKEN-BNL, BNL and Columbia who are using the QCDSP (Quantum Chromodynamics on Digital Signal Processors) computer at the RIKEN-BNL Research Center for studies of QCD. In addition, Akira Ukawa, a leader of the CP-PACS project at the University of Tsukuba in Japan, attended and gave a talk on the Aoki phase. There were also others in attendance who were interested in more general properties of the QCDSP computer. The QCDSP computer and lattice QCD had been presented during the morning ceremony by Shigemi Ohta of KEK and the RIKEN-BNL Research Center. This was followed by a tour of the QCDSP machine room and a formal unveiling of the computer to the attendees of the anniversary ceremony and the press. The rapid completion of construction of the QCDSP computer was made possible through many factors: (1) the existence of a complete design and working hardware at Columbia when the RIKEN-BNL center was being set up, (2) strong support for the project from RIKEN and the center and (3) aggressive involvement of members of the Computing and Communications Division at BNL. With this powerful new resource, the members of the RIKEN-BNL-Columbia QCD project are looking forward to advances in our understanding of QCD.

  5. Development of a coal quality analyzer for application to power plants based on laser-induced breakdown spectroscopy

    Science.gov (United States)

    Zhang, Lei; Gong, Yao; Li, Yufang; Wang, Xin; Fan, Juanjuan; Dong, Lei; Ma, Weiguang; Yin, Wangbao; Jia, Suotang

    2015-11-01

    It is vitally important for a power plant to determine coal properties rapidly in order to optimize the combustion process. In this work, a fully software-controlled laser-induced breakdown spectroscopy (LIBS) based coal quality analyzer, comprising a LIBS apparatus, sampling equipment, and a control module, has been designed for possible application to power plants, offering rapid and precise coal quality analysis. A closed-loop feedback pulsed laser energy stabilization technique is proposed to stabilize the Nd:YAG laser output energy within a preset interval by using the detected laser energy signal, so as to enhance the measurement stability; it was applied in a month-long monitoring experiment. The results show that the laser energy fluctuation has been greatly reduced, from ±5.2% to ±1.3%. In order to capture the complex relationship between the concentrations of the analytes of interest and the corresponding plasma spectra, support vector regression (SVR) is employed as a non-linear regression method. It is shown that this SVR method combined with principal component analysis (PCA) enables a significant improvement in cross-validation accuracy on the calibration set of coal samples. The root mean square error for prediction of ash content, volatile matter content, and calorific value decreases from 2.74% to 1.82%, 1.69% to 1.22%, and 1.23 MJ/kg to 0.85 MJ/kg, respectively. Meanwhile, the corresponding average relative error of the predicted samples is reduced from 8.3% to 5.48%, 5.83% to 4.42%, and 5.4% to 3.68%, respectively. The enhanced levels of accuracy obtained with the SVR combined with PCA based calibration models open up avenues for prospective prediction of coal properties.
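
    A minimal sketch of the calibration strategy described here (PCA for dimensionality reduction followed by support vector regression) is given below using scikit-learn; the spectra and reference values are synthetic placeholders, not the authors' data.

```python
# PCA + SVR calibration sketch with cross-validation, loosely following the
# approach described in the abstract. All data below are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
spectra = rng.random((60, 2048))       # 60 coal samples x 2048 spectral channels (synthetic)
ash_content = rng.uniform(5, 25, 60)   # reference ash content, % (synthetic)

model = make_pipeline(StandardScaler(), PCA(n_components=10), SVR(C=10.0, epsilon=0.1))
scores = cross_val_score(model, spectra, ash_content,
                         scoring="neg_root_mean_squared_error", cv=5)
print(f"cross-validated RMSE ~ {-scores.mean():.2f} % ash (synthetic data)")
```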

  6. Changes in soil microbial community structure associated with two types of genetically engineered plants analyzing by PLFA

    Institute of Scientific and Technical Information of China (English)

    XUE Kai; LUO Hai-feng; QI Hong-yan; ZHANG Hong-xun

    2005-01-01

    With the rapid expansion of GEPs (genetically engineered plants), people are more and more concerned about the ecological risks brought by their release. Assessing the effect of GEPs on soil microbial ecology is indispensable for studying these ecological risks. In our study, the phospholipid fatty acid (PLFA) method was used to analyze the microbial community of soil samples collected from fields with two types of GEPs: Bt transgenic corn and PVY (potato virus Y) coat protein gene transgenic potato. Principal components analysis (PCA) showed that all controls were to the right of the related GEPs samples along the PC1 (first principal component) axis, which suggests a decrease of fungi in soils with genetically engineered crops, since most of the PLFAs strongly positively correlated with PC1 represent fungi. For samples collected from the Bt transgenic cornfield, the ratios of gram-positive to gram-negative bacteria were less than those of the controls. For samples from the transgenic potato field, these ratios were lower than those of the controls when soils were collected from the deep layer (20-40 cm), but were higher when soils were collected from the surface layer (0-20 cm). For soils collected from 0-20 cm, the ratios of fungi to bacteria for all GEPs samples were at the same level, as were those for all controls. Changes in the soil microbial community in the two types of GEPs fields were detected in our study, but the causes still need further study.

  7. First test of BNL electron beam ion source with high current density electron beam

    Science.gov (United States)

    Pikin, Alexander; Alessi, James G.; Beebe, Edward N.; Shornikov, Andrey; Mertzig, Robert; Wenander, Fredrik; Scrivens, Richard

    2015-01-01

    A new electron gun with electrostatic compression has been installed at the Electron Beam Ion Source (EBIS) Test Stand at BNL. This is a collaborative effort by BNL and CERN teams with a common goal to study an EBIS with electron beam current up to 10 A, current density up to 10,000 A/cm2 and energy more than 50 keV. Intensive and pure beams of heavy highly charged ions with mass-to-charge ratio < 4.5 are requested by many heavy ion research facilities including NASA Space Radiation Laboratory (NSRL) at BNL and HIE-ISOLDE at CERN. With a multiampere electron gun, the EBIS should be capable of delivering highly charged ions for both RHIC facility applications at BNL and for ISOLDE experiments at CERN. Details of the electron gun simulations and design, and the Test EBIS electrostatic and magnetostatic structures with the new electron gun are presented. The experimental results of the electron beam transmission are given.

  8. First Results from the DUV-FEL Upgrade at BNL

    CERN Document Server

    Wang, Xijie; Murphy, James; Pinayev, Igor; Rakowsky, George; Rose, James; Shaftan, Timur; Sheehy, Brian; Skaritka, John; Wu, Zilu; Yu Li Hua

    2005-01-01

    The DUV-FEL at BNL is the world’s only facility dedicated to laser-seeded FEL R&D and its applications. Tremendous progress was made in both HGHG FEL and its applications in the last couple years.*,** In response to the requests of many users to study chemical science at the facility, the DUV-FEL linac was upgraded from 200 to 300 MeV to enable the HGHG FEL to produce 100 uJ pulses of 100 nm light. This will establish the DUV FEL as a premier user facility for ultraviolet radiation and enable state-of-the-art gas phase photochemistry research. The upgraded facility will also make possible key R&D experiments such as higher harmonic HGHG (n>5) that would lay the groundwork for future X-ray FEL based on HGHG. The upgraded HGHG FEL will operate at the 4th harmonic with the seed laser at either 800 nm or 400nm. The increase of the electron beam energy will be accomplished by installing a 5th linac cavity and two 45 MW klystrons. New HGHG modulator and dispersion sections vacuum chambers w...

  9. The BNL fan-atomized burner system prototype

    Energy Technology Data Exchange (ETDEWEB)

    Butcher, T.A.; Celebi, Y. [Brookhaven National Lab., Upton, NY (United States)

    1995-04-01

    Brookhaven National Laboratory (BNL) has a continuing interest in the development of advanced oil burners which can provide new capabilities not currently available with pressure atomized, retention head burners. Specifically program goals include: the ability to operate at firing rates as low as 0.25 gph; the ability to operate with very low excess air levels for high steady state efficiency and to minimize formation of sulfuric acid and iron sulfate fouling; low emissions of smoke, CO, and NO{sub x} even at very low excess air levels; and the potential for modulation - either staged firing or continuous modulation. In addition any such advanced burner must have production costs which would be sufficiently attractive to allow commercialization. The primary motivation for all work sponsored by the US DOE is, of course, improved efficiency. With existing boiler and furnace models this can be achieved through down-firing and low excess air operation. Also, with low excess air operation fouling and efficiency degradation due to iron-sulfate scale formation are reduced.

  10. The BNL Accelerator Test Facility and experimental program

    Energy Technology Data Exchange (ETDEWEB)

    Ben-Zvi, I. (Brookhaven National Lab., Upton, NY (United States) State Univ. of New York, Stony Brook, NY (United States). Dept. of Physics)

    1991-01-01

    The Accelerator Test Facility (ATF) at BNL is a users' facility for experiments in Accelerator and Beam Physics. The ATF provides high brightness electron beams and high power laser pulses synchronized to the electron beam, suitable for studies of new methods of high gradient acceleration and state of the art free electron lasers. The electrons are produced by a laser photocathode rf gun and accelerated to 50 to 100 MeV by two traveling wave accelerator sections. The lasers include a 10 mJ, 10 ps Nd:YAG laser and a 100 mJ, 10 ps CO{sub 2} laser. A number of users from National Laboratories, universities and industry take part in experiments at the ATF. The experimental program includes various acceleration schemes, Free-Electron Laser experiments and a program on the development of high brightness electron beams. The ATF's experimental program commenced in early 1991 at an energy of about 4 MeV. The full program, with 50 MeV and the high power laser, will begin operation this year. 28 refs., 4 figs.

  11. The BNL Accelerator Test Facility and experimental program

    Energy Technology Data Exchange (ETDEWEB)

    Ben-Zvi, I. [Brookhaven National Lab., Upton, NY (United States)]|[State Univ. of New York, Stony Brook, NY (United States). Dept. of Physics

    1992-09-01

    The Accelerator Test Facility (ATF) at BNL is a users' facility for experiments in Accelerator and Beam Physics. The ATF provides high brightness electron beams and high-power laser pulses synchronized to the electron beam, suitable for studies of new methods of high-gradient acceleration and state-of-the-art Free-Electron Lasers. The electrons are produced by a laser photocathode rf gun and accelerated to 50 MeV by two traveling wave accelerator sections. The lasers include a 10 mJ, 10 ps Nd:YAG laser and a 500 mJ, 10 to 100 ps CO{sub 2} laser. A number of users from National Laboratories, universities and industry take part in experiments at the ATF. The experimental program includes various laser acceleration schemes, Free-Electron Laser experiments and a program on the development of high-brightness electron beams. The ATF's experimental program commenced in early 1991 at an energy of about 4 MeV. The full program, with 50 MeV and the high-power laser, will begin operation this year.

  12. The BNL Accelerator Test Facility and experimental program

    Energy Technology Data Exchange (ETDEWEB)

    Ben-Zvi, I. (Brookhaven National Lab., Upton, NY (United States) State Univ. of New York, Stony Brook, NY (United States). Dept. of Physics)

    1992-01-01

    The Accelerator Test Facility (ATF) at BNL is a users' facility for experiments in Accelerator and Beam Physics. The ATF provides high brightness electron beams and high-power laser pulses synchronized to the electron beam, suitable for studies of new methods of high-gradient acceleration and state-of-the-art Free-Electron Lasers. The electrons are produced by a laser photocathode rf gun and accelerated to 50 MeV by two traveling wave accelerator sections. The lasers include a 10 mJ, 10 ps Nd:YAG laser and a 500 mJ, 10 to 100 ps CO{sub 2} laser. A number of users from National Laboratories, universities and industry take part in experiments at the ATF. The experimental program includes various laser acceleration schemes, Free-Electron Laser experiments and a program on the development of high-brightness electron beams. The ATF's experimental program commenced in early 1991 at an energy of about 4 MeV. The full program, with 50 MeV and the high-power laser, will begin operation this year.

  13. Measuring The Electric-dipole Moment Of The Muon At Bnl E821

    CERN Document Server

    Giron, S O

    2004-01-01

    The muon g-2 experiment at Brookhaven National Lab (BNL E821) improved the measurement of the anomalous magnetic moment of the muon (a_μ = (g-2)/2) by an order of magnitude over the previous measurement made by the CERN collaboration. The experiment used segmented detectors in order to also improve the measurement of the muon electric-dipole moment (EDM) by an order of magnitude. There are several methods available for making such an EDM measurement. Three methods were studied for their sensitivities to an EDM and to systematic biases. The g2geant Monte Carlo program was used to generate over 200 million simulated events so that the studies were not limited by statistical uncertainties. Each method was also used to analyze the 1999 E821 data set, which contains 20 million events. It was found that one method had the least susceptibility to systematic biases with the greatest resolution of the effects of an EDM, and could best discriminate between the effects of a true EDM and those of systematic ...

  14. An image classification approach to analyze the suppression of plant immunity by the human pathogen Salmonella Typhimurium

    Directory of Open Access Journals (Sweden)

    Schikora Marek

    2012-07-01

    Background: The enteric pathogen Salmonella is the causative agent of the majority of food-borne bacterial poisonings. Recent research revealed that colonization of plants by Salmonella is an active infection process: Salmonella changes the metabolism of the plant host and suppresses its defense mechanisms. In this report we developed an automatic algorithm to quantify the symptoms caused by Salmonella infection on Arabidopsis. Results: The algorithm is designed to attribute image pixels to one of two classes: healthy and unhealthy. The task is solved in three steps. First, we perform segmentation to divide the image into foreground and background. In the second step, a support vector machine (SVM) is applied to predict the class of each pixel belonging to the foreground. Finally, we refine the result with a neighborhood check in order to remove falsely classified pixels from the second step. The developed algorithm was tested on infection with the non-pathogenic E. coli and the plant pathogen Pseudomonas syringae, and used to study the interaction between plants and Salmonella wild type and T3SS mutants. We proved that T3SS mutants of Salmonella are unable to suppress the plant defenses. Results obtained through the automatic analyses were further verified at the biochemical and transcriptome levels. Conclusion: This report presents an automatic pixel-based classification method for detecting “unhealthy” regions in leaf images. The proposed method was compared to an existing method and showed higher accuracy. We used this algorithm to study the impact of the human pathogenic bacterium Salmonella Typhimurium on the plant immune system. The comparison between wild type bacteria and T3SS mutants showed similarity of the infection process in animals and in plants. Plant epidemiology is only one possible application of the proposed algorithm; it can be easily extended to other detection tasks which also rely on color information, or
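
    A compact sketch of the three-step pipeline described in this abstract (foreground segmentation, per-pixel SVM classification, neighborhood refinement) could look like the following; the function name, green threshold and median-filter choice are illustrative assumptions, not the authors' implementation.

```python
# Illustrative pixel-classification pipeline: segmentation, SVM prediction of
# healthy (0) vs unhealthy (1) pixels, then a neighborhood-majority refinement.
import numpy as np
from sklearn.svm import SVC
from scipy.ndimage import median_filter

def classify_leaf(image_rgb, train_pixels, train_labels, green_threshold=60):
    # 1) crude foreground segmentation: keep pixels with enough green signal
    foreground = image_rgb[:, :, 1] > green_threshold

    # 2) SVM trained on labelled example pixels (rows of RGB values)
    svm = SVC(kernel="rbf").fit(train_pixels, train_labels)
    labels = np.zeros(image_rgb.shape[:2], dtype=int)
    labels[foreground] = svm.predict(image_rgb[foreground].astype(float))

    # 3) neighborhood check: a median filter removes isolated misclassified pixels
    return median_filter(labels, size=3) * foreground

# usage sketch:
# mask = classify_leaf(img, labelled_rgb_values, labelled_classes)
# unhealthy_fraction = mask.sum() / (img[:, :, 1] > 60).sum()
```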

  15. Red Hat Enterprise Virtualization - KVM-based infrastructure services at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Cortijo, D.

    2011-06-14

    Over the past 18 months, BNL has moved a large percentage of its Linux-based servers and services into a Red Hat Enterprise Virtualization (RHEV) environment. This presentation will address our approach to virtualization, critical decision points, and a discussion of our implementation. Specific topics will include an overview of hardware and software requirements, networking, and storage; discussion of the decision to adopt the Red Hat solution over competing products (VMWare, Xen, etc.); details on some of the features of RHEV, both current and on their roadmap; a review of performance and reliability gains since deployment completion; the path forward for RHEV at BNL; and caveats and potential problems.

  16. Twenty years of space radiation physics at the BNL AGS and NASA Space Radiation Laboratory.

    Science.gov (United States)

    Miller, J; Zeitlin, C

    2016-06-01

    Highly ionizing atomic nuclei (HZE particles) in the galactic cosmic rays (GCR) will be a significant source of radiation exposure for humans on extended missions outside low Earth orbit. Accelerators such as the LBNL Bevalac and the BNL AGS, designed decades ago for fundamental nuclear and particle physics research, subsequently found use as sources of GCR-like particles for ground-based physics and biology research relevant to space flight. The NASA Space Radiation Laboratory at BNL was constructed specifically for space radiation research. Here we review some of the space-related physics results obtained over the first 20 years of NASA-sponsored research at Brookhaven.

  17. Pion-nucleus total cross-section data from LAMPF and BNL. [Neutron and proton radii

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, M.D.

    1976-01-01

    New measurements of pion-nucleus total cross sections were made at LAMPF and BNL. The results from LAMPF include measurement of the difference of the rms neutron and proton radii of {sup 48}Ca to be 0.08 ± 0.02 and that of {sup 18}O to be 0.19 ± 0.02. The BNL measurements provide a new phenomenology on the downshift and spreading of the (3-3) resonance in nuclei from the first data on heavy nuclei. A new technique for handling the Coulomb effects in total cross section measurements is discussed.

  18. Methodology to analyze environmental monitoring reports of desalination plants; Metodologia para el analisis de los documentos de seguimiento ambiental de las instalaciones desaladoras de agua marina

    Energy Technology Data Exchange (ETDEWEB)

    Ruis Arriaga, S.; Orozco Conti, F.; Ubaldi Freda, G. M.; Garau Hernandez, F.; Salguero Martinez, J.; Garcia Sanchez-Colomer, M.

    2010-07-01

    In this paper we propose a methodology, based on checklists, to analyze the form and content of the environmental vigilance programs and the monitoring reports related to desalination plant projects subject to environmental impact assessment. The aim is to obtain a useful and reproducible analysis tool for detecting possible faults in the environmental monitoring reports. The application of this methodology helps simplify and speed up the checking of these documents by the competent authority. (Author) 6 refs.

  19. QUARKONIUM PRODUCTION IN RELATIVISTIC NUCLEAR COLLISIONS. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, VOLUME 12

    Energy Technology Data Exchange (ETDEWEB)

    KHARZEEV,D.

    1999-04-20

    The RIKEN-BNL Workshop on Quarkonium Production in Relativistic Nuclear Collisions was held September 28--October 2, 1998, at Brookhaven National Laboratory. The Workshop brought together about 50 invited participants from around the world and a number of Brookhaven physicists from both particle and nuclear physics communities.

  20. First test of BNL electron beam ion source with high current density electron beam

    Energy Technology Data Exchange (ETDEWEB)

    Pikin, Alexander, E-mail: pikin@bnl.gov; Alessi, James G., E-mail: pikin@bnl.gov; Beebe, Edward N., E-mail: pikin@bnl.gov [Brookhaven National Laboratory, Upton, NY 11973 (United States); Shornikov, Andrey; Mertzig, Robert; Wenander, Fredrik; Scrivens, Richard [CERN, CH-1211 Geneva 23 (Switzerland)

    2015-01-09

    A new electron gun with electrostatic compression has been installed at the Electron Beam Ion Source (EBIS) Test Stand at BNL. This is a collaborative effort by BNL and CERN teams with a common goal to study an EBIS with electron beam current up to 10 A, current density up to 10,000 A/cm{sup 2} and energy more than 50 keV. Intensive and pure beams of heavy highly charged ions with mass-to-charge ratio < 4.5 are requested by many heavy ion research facilities including NASA Space Radiation Laboratory (NSRL) at BNL and HIE-ISOLDE at CERN. With a multiampere electron gun, the EBIS should be capable of delivering highly charged ions for both RHIC facility applications at BNL and for ISOLDE experiments at CERN. Details of the electron gun simulations and design, and the Test EBIS electrostatic and magnetostatic structures with the new electron gun are presented. The experimental results of the electron beam transmission are given.

  1. BNL Building 650 lead decontamination and treatment feasibility study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Kalb, P.D.; Cowgill, M.G.; Milian, L.W. [and others]

    1995-10-01

    Lead has been used extensively at Brookhaven National Laboratory (BNL) for radiation shielding in numerous reactor, accelerator and other research programs. A large inventory of excess lead (estimated at 410,000 kg) in many shapes and sizes is currently being stored. Due to its toxicity, lead and soluble lead compounds are considered hazardous waste by the Environmental Protection Agency. Through use at BNL, some of the lead has become radioactive, either by contamination of the surface or through activation by neutrons or deuterons. This study was conducted at BNL's Environmental and Waste Technology Center for the BNL Safety and Environmental Protection Division to evaluate the feasibility of various treatment options for excess lead currently being stored. The objectives of this effort included investigating potential treatment methods by conducting a review of the literature, developing a means of screening lead waste to determine the radioactive characteristics, examining the feasibility of chemical and physical decontamination technologies, and demonstrating BNL polyethylene macro-encapsulation as a means of treating hazardous or mixed waste lead for disposal. A review and evaluation of the literature indicated that a number of physical and chemical methods are available for decontamination of lead. Many of these techniques have been applied for this purpose with varying degrees of success. Methods that apply mechanical techniques are more appropriate for lead bricks and sheet which contain large smooth surfaces amenable to physical abrasion. Lead wool, turnings, and small irregularly shaped pieces would be treated more effectively by chemical decontamination techniques. Either dry abrasion or wet chemical methods result in production of a secondary mixed waste stream that requires treatment prior to disposal.

  2. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabVIEW that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabVIEW VI (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.
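
    To make the stated proportionality between emitted fluorescence and SO2 concentration concrete, a toy span/zero calibration is sketched below; this is illustrative arithmetic only, not the Model 43i-TLE firmware or the BNL LabVIEW interface.

```python
# Toy two-point (zero/span) calibration: fluorescence counts -> SO2 mixing
# ratio, relying only on the stated linear proportionality.
def so2_ppb(signal_counts, zero_counts, span_counts, span_ppb):
    """Linear interpolation between the zero and span responses."""
    return span_ppb * (signal_counts - zero_counts) / (span_counts - zero_counts)

print(so2_ppb(signal_counts=1530.0, zero_counts=30.0,
              span_counts=5030.0, span_ppb=100.0))  # -> 30.0 ppb
```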

  3. Multipacting simulation and test results of BNL 704 MHz SRF gun

    Energy Technology Data Exchange (ETDEWEB)

    Xu W.; Belomestnykh, S.; Ben-Zvi, I.; Cullen, C. et al

    2012-05-20

    The BNL 704 MHz SRF gun has a grooved choke joint to support the photo-cathode. Due to distortion of the grooves during BCP of the choke joint, several multipacting barriers showed up when the gun was tested with a Nb cathode stalk at JLab. We built a setup using the spare large-grain SRF cavity to test and condition through the multipacting barriers at BNL with various power sources up to 50 kW. The test is carried out in three stages: testing the cavity performance without a cathode, testing the cavity with the Nb cathode stalk that was used at JLab, and testing the cavity with a copper cathode stalk that is based on the design for the SRF gun. This paper summarizes the results of multipacting simulation, and presents the large-grain cavity test setup and the test results.

  4. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  5. NEUTRINO SUPER BEAM FACILITY FOR A LONG BASELINE EXPERIMENT FROM BNL TO HOMESTAKE.

    Energy Technology Data Exchange (ETDEWEB)

    KAHN,S.

    2002-10-21

    An upgrade to the BNL Alternate Gradient Synchrotron (AGS) could produce a very intense proton source at a relatively low cost. Such a proton beam could be used to generate a conventional neutrino beam with a significant flux at large distances from the laboratory. This provides the possibility of a very long baseline neutrino experiment at the Homestake mine. The construction of this facility would allow a program of experiments to study many of the aspects of neutrino oscillations including CP violations. This study examines a 1 MW proton source at BNL and a large 1 megaton detector positioned at the Homestake Mine as the ultimate goal of a staged program to study neutrino oscillations.

  6. STATUS OF HIGH TEMPERATURE SUPERCONDUCTOR MAGNET R AND D AT BNL.

    Energy Technology Data Exchange (ETDEWEB)

    GUPTA,R.; ANERELLA,M.; COZZOLINO,J.; ESCALLIER,J.; GANETIS,G.; GHOSH,A.; ET AL.

    2004-01-22

    We report the status and test results of the High Temperature Superconductor (HTS) cable and magnet R&D at Brookhaven National Laboratory (BNL). If successful, this will enhance the performance and reduce the cost of operation of magnets that must absorb a large amount of energy. The need for developing this technology has been seen in a number of high field magnet applications for high energy colliders, and a medium field application in the proposed Rare Isotope Accelerator (RIA). The likelihood of the future use of HTS is improving because of the availability of longer and more uniform length tapes and cables and because of the ongoing construction and test experience at BNL and elsewhere. The design of a super-ferric quadrupole, that must survive the very high radiation environment of RIA, and operate at 20-40 K, is also presented.

  7. Radar Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory (BNL) Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M. P. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Giangrande, S. E. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Bartholomew, M. J. [Brookhaven National Laboratory (BNL), Upton, NY (United States)

    2016-04-01

    The Radar Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory (BNL) [http://www.arm.gov/campaigns/osc2013rwpcf] campaign was scheduled to take place from 15 July 2013 through 15 July 2015 (or until shipped for the next U.S. Department of Energy Atmospheric Radiation Measurement [ARM] Climate Research Facility first Mobile Facility [AMF1] deployment). The campaign involved the deployment of the AMF1 Scintec 915 MHz Radar Wind Profiler (RWP) at BNL, in conjunction with several other ARM, BNL and National Weather Service (NWS) instruments. The two main scientific foci of the campaign were: 1) To provide profiles of the horizontal wind to be used to test and validate short-term cloud advection forecasts for solar-energy applications and 2) to provide vertical profiling capabilities for the study of dynamics (i.e., vertical velocity) and hydrometeors in winter storms. This campaign was a serendipitous opportunity that arose following the deployment of the RWP at the Two-Column Aerosol Project (TCAP) campaign in Cape Cod, Massachusetts and restriction from participation in the Green Ocean Amazon 2014/15 (GoAmazon 2014/15) campaign due to radio-frequency allocation restriction for international deployments. The RWP arrived at BNL in the fall of 2013, but deployment was delayed until fall of 2014 as work/safety planning and site preparation were completed. The RWP further encountered multiple electrical failures, which eventually required several shipments of instrument power supplies and the final amplifier to the vendor to complete repairs. Data collection began in late January 2015. The operational modes of the RWP were changed such that in addition to collecting traditional profiles of the horizontal wind, a vertically pointing mode was also included for the purpose of precipitation sensing and estimation of vertical velocities. The RWP operated well until the end of the campaign in July 2015 and collected observations for more than 20 precipitation

  8. BNL program in support of LWR degraded-core accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ginsberg, T.; Greene, G.A.

    1982-01-01

    Two major sources of loading on dry water reactor containments are steam generation from core debris-water thermal interactions and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described in this paper. 8 figures.

  9. SynapSense Wireless Environmental Monitoring System of the RHIC & ATLAS Computing Facility at BNL

    Science.gov (United States)

    Casella, K.; Garcia, E.; Hogue, R.; Hollowell, C.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    RHIC & ATLAS Computing Facility (RACF) at BNL is a 15000 sq. ft. facility hosting the IT equipment of the BNL ATLAS WLCG Tier-1 site, offline farms for the STAR and PHENIX experiments operating at the Relativistic Heavy Ion Collider (RHIC), the BNL Cloud installation, various Open Science Grid (OSG) resources, and many other small physics research oriented IT installations. The facility originated in 1990 and grew steadily up to the present configuration with 4 physically isolated IT areas with the maximum rack capacity of about 1000 racks and the total peak power consumption of 1.5 MW. In June 2012 a project was initiated with the primary goal to replace several environmental monitoring systems deployed earlier within RACF with a single commercial hardware and software solution by SynapSense Corporation based on wireless sensor groups and proprietary SynapSense™ MapSense™ software that offers a unified solution for monitoring the temperature and humidity within the rack/CRAC units as well as pressure distribution underneath the raised floor across the entire facility. The deployment was completed successfully in 2013. The new system also supports a set of additional features such as capacity planning based on measurements of total heat load, power consumption monitoring and control, CRAC unit power consumption optimization based on feedback from the temperature measurements and overall power usage efficiency estimations that are not currently implemented within RACF but may be deployed in the future.
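
    As a small illustration of the "power usage efficiency estimations" mentioned above, the conventional PUE metric is simply the ratio of total facility power to IT load; the numbers below are placeholders, not RACF measurements.

```python
# Power usage effectiveness (PUE) = total facility power / IT equipment power.
def pue(total_facility_kw, it_load_kw):
    return total_facility_kw / it_load_kw

print(f"PUE = {pue(total_facility_kw=1500.0, it_load_kw=1000.0):.2f}")  # -> 1.50
```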

  10. Advances in techniques of extracting and analyzing plant terpenoids

    Institute of Scientific and Technical Information of China (English)

    黄艳贞; 栾军波

    2014-01-01

    Solvent extraction, steam distillation, supercritical fluid extraction and headspace analysis are currently the methods most often used for extracting plant terpenoids, and gas chromatography-mass spectrometry is the most common means of terpenoid analysis. In recent years, techniques for isolating and analyzing plant terpenoids have been developing toward higher efficiency, sensitivity, precision and throughput and toward online real-time analysis, markedly improving analytical capacity, simplifying procedures and greatly promoting the isolation and analysis of plant terpenoids. Because these new techniques are still imperfect and need further improvement and development, they should be applied both selectively and in combination.

  11. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers. Volume 2, User's guide and manual

    Energy Technology Data Exchange (ETDEWEB)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional conservation of mass, momentum, and energy equations on the tube side, and the proper accounting for the thermal interaction between shell and tube side through the porous medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications.

  12. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers. Volume 1, Equations and numerics

    Energy Technology Data Exchange (ETDEWEB)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional equations of conservation of mass, momentum, and energy on the tube side and the proper accounting for the thermal interaction between shell and tube side through the porous-medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient, three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications. Volume I (Equations and Numerics) of this report describes in detail the basic equations, formulation, solution procedures, and models for the phenomena. Volume II (User's Guide and Manual) contains the input instructions, flow charts, sample problems, and descriptions of available options and boundary conditions.
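
    For orientation only, a generic single-phase, one-dimensional set of conservation equations of the kind referred to for the tube side can be written as follows; this is an illustrative textbook form, not the actual COMMIX-PPC formulation, which additionally carries the porous-medium and shell-tube interaction terms documented in this volume.

        \[
        \frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} = 0, \qquad
        \frac{\partial (\rho u)}{\partial t} + \frac{\partial (\rho u^{2})}{\partial x}
          = -\frac{\partial p}{\partial x} - F_{w} + \rho g_{x}, \qquad
        \frac{\partial (\rho h)}{\partial t} + \frac{\partial (\rho u h)}{\partial x} = q''',
        \]

    where F_w denotes wall friction per unit volume and q''' a volumetric heat source representing the thermal coupling to the shell side.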

  13. Digital I&C Systems in Nuclear Power Plants. Risk-Screening of Environmental Stressors and a Comparison of Hardware Unavailability With an Existing Analog System

    Science.gov (United States)

    1998-01-01

    NUREG/CR-6579, BNL-NUREG-52536: Digital I&C Systems in Nuclear Power Plants. Risk-Screening of Environmental Stressors and a Comparison of Hardware Unavailability With an Existing Analog System.

  14. Target and orbit feedback simulations of a muSR beamline at BNL

    Energy Technology Data Exchange (ETDEWEB)

    MacKay, W. W. [Residence, 25 Rhododendron Circle, Asheville, NC (United States); Fischer, W. [Brookhaven National Lab. (BNL), Upton, NY (United States); Blaskiewicz, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Pile, P. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-05-03

    Well-polarized positive surface muons are a tool to measure the magnetic properties of materials, since the precession rate of the spin can be determined from the observation of the positron directions when the muons decay. The use of the AGS complex at BNL has been explored for a muSR facility previously. Here we report simulations of a beamline with a target inside a solenoidal field, and of an orbit feedback system with single-muon beam positioning monitors based on technology available today.

  15. Proceedings of RIKEN BNL Research Center Workshop: Progress in High-pT Physics at RHIC

    Energy Technology Data Exchange (ETDEWEB)

    Bazilevsky, A.; Bland, L.; Vogelsang, W.

    2010-03-17

    This volume archives the presentations at the RIKEN BNL Research Center workshop 'Progress in High-PT Physics at RHIC', held at BNL in March 2010. Much has been learned from high-p{sub T} physics after 10 years of RHIC operations for heavy-ion collisions, polarized proton collisions and d+Au collisions. The workshop focused on recent progress in these areas by both theory and experiment. The first morning saw review talks on the theory of RHIC high-p{sub T} physics by G. Sterman and J. Soffer, and on the experimental results by M. Tannenbaum. One of the most exciting recent results from the RHIC spin program is the first observation of W bosons and their associated single-spin asymmetry. The new preliminary data were reported on the first day of our workshop, along with a theoretical perspective. There also were detailed discussions on the global analysis of polarized parton distributions, including the knowledge on gluon polarization and the impact of the W-data. The main topic of the second workshop day were single-transverse spin asymmetries and their analysis in terms of transverse-momentum dependent parton distributions. There is currently much interest in a future Drell-Yan program at RHIC, thanks to the exciting physics opportunities this would offer. This was addressed in some of the talks. There also were presentations on the latest results on transverse-spin physics from HERMES and BELLE. On the final day of the workshop, the focus shifted toward forward and small-x physics at RHIC, which has become a cornerstone of the whole RHIC program. Exciting new data were presented and discussed in terms of their possible implications for our understanding of strong color-field phenomena in QCD. In the afternoon, there were discussions of nuclear parton distributions and jet observables, among them fragmentation. The workshop was concluded with outlooks toward the near-term (LHC, JLab) and longer-term (EIC) future. The workshop has been a great success

  16. US-Japan collaboration in the construction of the BNL superconducting muon storage ring and inflector

    Energy Technology Data Exchange (ETDEWEB)

    Hirabayashi, Hiromi; Yamamoto, Akira [High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan)

    2003-03-01

    The US-Japan collaboration in the construction of a BNL muon storage ring for the g-2 experiment (E821) is described from the viewpoint of Japanese collaborators. Japan has contributed to the production of the pole pieces, made of a vacuum-melted ultra-low-carbon steel, and of the Al-stabilized Nb/Ti superconductors for the superferric storage ring dipole coils, including technology transfer, and to the development of a sophisticated superconducting inflector for muon injection. All of the above items seem to be essential techniques for pursuing accurate and detailed muon g-2 experiments. Recent experimental results are also mentioned in the latter part of this report. (author)

  17. Design and Data Model of the BNL Archive and Dissemination System

    Energy Technology Data Exchange (ETDEWEB)

    Heller, J.; Osterer, L.

    1977-03-01

    The BNL Archive and Dissemination (BNLAD) System was designed to operate on a homogeneous distributed data base in a computer network. Its primary function is to present a uniform logical and physical view of already existing sequential files of data, so that these files can be accessed at any node of a computer network where the BNLAD System is operable. The architecture of the system, based on a subset of PL/I (the host language), is presented. The Data Model, i.e. the information content of the data base as it is viewed by the users, of the BNLAD System is discussed by means of examples. 7 figs.

  18. Analyzing blends of herbivore-induced volatile organic compounds with factor analysis: revisiting "cotton plant, Gossypium hirsutum L., defense in response to nitrogen fertilization".

    Science.gov (United States)

    Chen, Yigen

    2013-04-01

    Many herbivorous, predaceous, and parasitic insects use constitutive and herbivore-induced volatile organic compounds (VOCs) to locate their respective host plant, prey, and hosts. Multivariate statistical tools (e.g., factor analysis) are increasingly recognized as an appropriate approach for analyzing intercorrelated data such as presence/absence or quantities of VOCs. One challenge of implementing factor analysis is determining how many new variables (factors) to retain in the final analysis. I demonstrate a method proposed by Johnson and Wichern to mitigate this problem by using VOC data published in Chen et al. The advantage of using loading (or weight) transformation in interpretation of new variables was also illustrated in the example. Factor analysis found similar nitrogen fertilization effects on VOC production to those in Chen et al. Similarities were 1) nitrogen fertilization interacted with herbivore damage status on VOC production: at the low nitrogen (42 ppm) level, beet armyworm, Spodoptera exigua (Hübner) (Lepidoptera: Noctuidae), damage elicited increases in VOC production, whereas at the high nitrogen (196 ppm) level VOC production was suppressed; 2) nitrogen fertilization did not affect limonene, alpha-pinene, and beta-pinene production. The seven individual VOCs significantly affected by nitrogen fertilization in Chen et al. were (Z)-3-hexenal, (E)-2-hexenal, (E)-beta-farnesene, (E)-4,8-dimethyl-1,3,7-nonatriene (DMNT), alpha-bergamotene, gamma-bisabolene, and bisabolol, of which only three ((E)-beta-farnesene, gamma-bisabolene, and bisabolol) weighed heavily on factor 1 in the current study.
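
    As a rough illustration of the workflow the record describes (deciding how many factors to retain and then inspecting loadings), the sketch below runs an exploratory factor analysis on a samples-by-compounds matrix. The synthetic data, the Kaiser eigenvalue-greater-than-one retention rule and the varimax rotation are illustrative assumptions, not the specific criteria of Johnson and Wichern or of the paper.

        # Minimal factor-analysis sketch for a samples x compounds VOC matrix.
        # Data, compound count, and the retention rule are illustrative only.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis   # scikit-learn >= 0.24 for rotation
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.lognormal(size=(40, 7))          # 40 samples, 7 hypothetical VOCs

        Z = StandardScaler().fit_transform(X)    # work on the correlation scale

        # Choose how many factors to retain, here via Kaiser's rule:
        # keep factors whose correlation-matrix eigenvalues exceed 1.
        eigvals = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
        k = max(1, int(np.sum(eigvals > 1.0)))

        fa = FactorAnalysis(n_components=k, rotation="varimax").fit(Z)
        loadings = fa.components_.T              # compounds x factors weights
        scores = fa.transform(Z)                 # factor scores per sample
        print("retained factors:", k)
        print(np.round(loadings, 2))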

  19. Access to the energy system network simulator (ESNS), via remote computer terminals. [BNL CDC 7600/6600 computer facility

    Energy Technology Data Exchange (ETDEWEB)

    Reisman, A W

    1976-08-15

    The Energy System Network Simulator (ESNS) flow model is installed on the Brookhaven National Laboratory (BNL) CDC 7600/6600 computer facility for access by off-site users. The method of access available to outside users is through a system called CDC-INTERCOM, which allows communication between the BNL machines and remote teletype terminals. This write-up gives a brief description of INTERCOM for users unfamiliar with this system and a step-by-step guide to using INTERCOM in order to access ESNS.

  20. The Cornell-BNL FFAG-ERL Test Accelerator: White Paper

    CERN Document Server

    Bazarov, Ivan; Dunham, Bruce; Hoffstaetter, Georg; Mayes, Christopher; Patterson, Ritchie; Sagan, David; Ben-Zvi, Ilan; Berg, Scott; Blaskiewicz, Michael; Brooks, Stephen; Brown, Kevin; Fischer, Wolfram; Hao, Yue; Meng, Wuzheng; Méot, François; Minty, Michiko; Peggs, Stephen; Ptitsin, Vadim; Roser, Thomas; Thieberger, Peter; Trbojevic, Dejan; Tsoupas, Nick

    2015-01-01

    The Cornell-BNL FFAG-ERL Test Accelerator (C$\\beta$) will comprise the first-ever Energy Recovery Linac (ERL) based on a Fixed Field Alternating Gradient (FFAG) lattice. In particular, we plan to use a Non-Scaling FFAG (NS-FFAG) lattice that is very compact and thus space- and cost-effective, enabling multiple passes of the electron beam in a single recirculation beam line, using the superconducting RF (SRF) linac multiple times. The FFAG-ERL moves the cost-optimized linac and recirculation lattice to a dramatically better optimum. The prime accelerator-science motivation for C$\\beta$ is proving that the FFAG-ERL concept works. This is an important milestone for the Brookhaven National Laboratory (BNL) plans to build a major Nuclear Physics facility, eRHIC, based on producing 21 GeV electron beams to collide with the RHIC ion beams. A consequence of the C$\\beta$ work would be the availability of significantly better, cost-effective, compact CW high-brightness electron beams for a plethora of scientific inves...

  1. PROCEEDINGS OF RIKEN/BNL RESEARCH CENTER WORKSHOP FUTURE TRANSVERSITY MEASUREMENTS (VOLUME 29).

    Energy Technology Data Exchange (ETDEWEB)

    Boer, D.; Grosse Perdekamp, M.

    2001-01-02

    The RIKEN-BNL Research Center workshop on ''Future Transversity Measurements'' was held at BNL on September 18-20, 2000. The main goal of the workshop was to explore future measurements of transversity distributions. This issue is of importance to the RHIC experiments, which will study polarized proton-proton collisions with great precision. One of the workshop's goals was to enhance interactions between the DIS community at HERA and the spin community at RHIC in this field. The workshop was well received by the participants; the 69 registered participants demonstrate broad interest in the workshop's topics. The program contained 35 talks and there was ample time for lively discussions. The program covered all recent work in the field, and in addition some very illuminating educational talks were given. The present status of the field was discussed, and the workshop succeeded in stimulating new experimental and theoretical studies (e.g. model calculations for interference fragmentation functions (IFF), IFF analysis at DELPHI). It also served to focus attention on the open questions that need to be resolved for near-future experiments. In general, the conclusions were optimistic, i.e. measuring the transversity functions seems to be possible, although some new experimental hurdles will have to be overcome.

  2. CHALLENGES ENCOUNTERED DURING THE PROCESSING OF THE BNL ERL 5 CELL ACCELERATING CAVITY

    Energy Technology Data Exchange (ETDEWEB)

    BURRILL,A.

    2007-06-25

    One of the key components for the Energy Recovery Linac being built by the Electron cooling group in the Collider Accelerator Department is the 5 cell accelerating cavity which is designed to accelerate 2 MeV electrons from the gun up to 15-20 MeV, allow them to make one pass through the ring and then decelerate them back down to 2 MeV prior to sending them to the dump. This cavity was designed by BNL and fabricated by AES in Medford, NY. Following fabrication it was sent to Thomas Jefferson Lab in VA for chemical processing, testing and assembly into a string assembly suitable for shipment back to BNL for integration into the ERL. The steps involved in this processing sequence will be reviewed and the deviations from processing of similar SRF cavities will be discussed. The lessons learned from this process are documented to help future projects where the scope is different from that normally encountered.

  3. Advancing the Deployment of Utility-Scale Photovoltaic Plants in the Northeast

    Energy Technology Data Exchange (ETDEWEB)

    Lofaro R.; Villaran, M; Colli, A.

    2012-06-03

    As one of the premier research laboratories operated by the Department of Energy, Brookhaven National Laboratory (BNL) is pursuing an energy research agenda that focuses on renewable energy systems and will help to strengthen the nation's energy security. A key element of the BNL research is the advancement of grid-connected utility-scale solar photovoltaic (PV) plants, particularly in the northeastern part of the country where BNL is located. While a great deal of information has been generated regarding solar PV systems located in the mostly sunny, hot, arid climates of the southwestern US, very little data is available to characterize the performance of these systems in the cool, humid, frequently overcast climates experienced in the northeastern portion of the country. Recognizing that there is both a need and a market for solar PV generation in the northeast, BNL is pursuing research that will advance the deployment of this important renewable energy resource. BNL's research will leverage access to unique time-resolved data sets from the 37 MWp solar array recently developed on its campus. In addition, BNL is developing a separate 1 MWp solar research array on its campus that will allow field testing of new PV system technologies, including solar modules and balance-of-plant equipment, such as inverters, energy storage devices, and control platforms. These research capabilities will form the cornerstone of the new Northeast Solar Energy Research Center (NSERC) being developed at BNL. In this paper, an overview of BNL's energy research agenda is given, along with a description of the 37 MWp solar array and the NSERC.

  4. Analyzing Orientations

    Science.gov (United States)

    Ruggles, Clive L. N.

    Archaeoastronomical field survey typically involves the measurement of structural orientations (i.e., orientations along and between built structures) in relation to the visible landscape and particularly the surrounding horizon. This chapter focuses on the process of analyzing the astronomical potential of oriented structures, whether in the field or as a desktop appraisal, with the aim of establishing the archaeoastronomical "facts". It does not address questions of data selection (see instead Chap. 25, "Best Practice for Evaluating the Astronomical Significance of Archaeological Sites", 10.1007/978-1-4614-6141-8_25) or interpretation (see Chap. 24, "Nature and Analysis of Material Evidence Relevant to Archaeoastronomy", 10.1007/978-1-4614-6141-8_22). The main necessity is to determine the azimuth, horizon altitude, and declination in the direction "indicated" by any structural orientation. Normally, there are a range of possibilities, reflecting the various errors and uncertainties in estimating the intended (or, at least, the constructed) orientation, and in more formal approaches an attempt is made to assign a probability distribution extending over a spread of declinations. These probability distributions can then be cumulated in order to visualize and analyze the combined data from several orientations, so as to identify any consistent astronomical associations that can then be correlated with the declinations of particular astronomical objects or phenomena at any era in the past. The whole process raises various procedural and methodological issues and does not proceed in isolation from the consideration of corroborative data, which is essential in order to develop viable cultural interpretations.
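
    The central reduction step described here, from a measured azimuth and horizon altitude to the declination "indicated" at a site of known latitude, follows the standard spherical-astronomy relation sin(dec) = sin(lat) sin(alt) + cos(lat) cos(alt) cos(az). The function and sample values below are a minimal illustration under that relation alone; refraction and the other corrections a full survey reduction would apply are ignored, and the numbers are hypothetical.

        # Sketch: declination indicated by a structural orientation, given the site
        # latitude, the azimuth (degrees east of true north) and the altitude of the
        # horizon in that direction. Refraction and other survey corrections are
        # deliberately ignored; the example values are hypothetical.
        import numpy as np

        def indicated_declination(lat_deg, azimuth_deg, horizon_alt_deg):
            lat, az, alt = np.radians([lat_deg, azimuth_deg, horizon_alt_deg])
            sin_dec = np.sin(lat) * np.sin(alt) + np.cos(lat) * np.cos(alt) * np.cos(az)
            return np.degrees(np.arcsin(sin_dec))

        # A spread of azimuths reflecting the measurement uncertainty can be pushed
        # through the same formula to build the probability distribution over
        # declination described in the chapter.
        print(indicated_declination(52.0, 230.0, 0.5))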

  5. A silicon multiplicity detector system for an experiment on the interaction of antiprotons with nuclei at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, S.; Bonner, B.E.; Buchanan, J.A.; Clement, J.M.; Empl, A.; Mutchler, G.S.; Toshkov, S. (Rice Univ., Houston, TX (United States). Bonner Nuclear Labs.); Eiseman, S.E.; Etkin, A.; Foley, K.J.; Hackenburg, R.W.; Longacre, R.S.; Love, W.A.; Morris, T.W.; Platner, E.D.; Saulys, A.C. (Brookhaven National Lab., Upton, NY (United States)); Chan, C.S.; Kramer, M.A.; Lindenbaum, S.J. (City Coll., New York, NY (Unit

    1991-01-01

    A Large Angle Multiplicity Detector (LAMD) system has been developed and used at the BNL experiment E854: Antiproton Nucleus Interactions. This system performed well with an energetic antiproton beam. Charged particle multiplicity distributions from pbar annihilations were measured. We discuss the design and performance of the LAMD system in this paper. 6 refs., 10 figs.

  6. A silicon multiplicity detector system for an experiment on the interaction of antiprotons with nuclei at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, S.; Bonner, B.E.; Buchanan, J.A.; Clement, J.M.; Empl, A.; Mutchler, G.S.; Toshkov, S. [Rice Univ., Houston, TX (United States). Bonner Nuclear Labs.; Eiseman, S.E.; Etkin, A.; Foley, K.J.; Hackenburg, R.W.; Longacre, R.S.; Love, W.A.; Morris, T.W.; Platner, E.D.; Saulys, A.C. [Brookhaven National Lab., Upton, NY (United States); Chan, C.S.; Kramer, M.A.; Lindenbaum, S.J. [City Coll., New York, NY (United States); Hallman, T.J.; Madansky, L. [Johns Hopkins Univ., Baltimore, MD (United States); Peaslee, D.C. [Maryland Univ., College Park, MD (United States)

    1991-12-31

    A Large Angle Multiplicity Detector (LAMD) system has been developed and used at the BNL experiment E854: Antiproton Nucleus Interactions. This system performed well with an energetic antiproton beam. Charged particle multiplicity distributions from pbar annihilations were measured. We discuss the design and performance of the LAMD system in this paper. 6 refs., 10 figs.

  7. A silicon multiplicity detector system for an experiment on the interaction of antiprotons with nuclei at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, S.; Bonner, B.E.; Buchanan, J.A.; Clement, J.M.; Empl, A.; Mutchler, G.S.; Toshkov, S. (Rice Univ., Houston, TX (United States). Bonner Nuclear Labs.); Eiseman, S.E.; Etkin, A.; Foley, K.J. (Brookhaven National Lab., Upton, NY (United States))

    1992-08-01

    A Large Angle Multiplicity Detector (LAMD) system has been developed and used at the BNL experiment E854: Antiproton Nucleus Interactions. This system performed well with an energetic antiproton beam. Charged particle multiplicity distributions from [bar p] annihilations were measured. The authors discuss the design and performance of the LAMD system in this paper.

  8. Proceedings of RIKEN BNL Research Center Workshop: Brookhaven Summer Program on Nucleon Spin Physics

    Energy Technology Data Exchange (ETDEWEB)

    Aschenauer, A.; Qiu, Jianwei; Vogelsang, W.; Yuan, F.

    2011-08-02

    Understanding the structure of the nucleon is of fundamental importance in sub-atomic physics. Already the experimental studies on the electro-magnetic form factors in the 1950s showed that the nucleon has a nontrivial internal structure, and the deep inelastic scattering experiments in the 1970s revealed the partonic substructure of the nucleon. Modern research focuses in particular on the spin and the gluonic structure of the nucleon. Experiments using deep inelastic scattering or polarized p-p collisions are carried out in the US at the CEBAF and RHIC facilities, respectively, and there are other experimental facilities around the world. More than twenty years ago, the European Muon Collaboration published their first experimental results on the proton spin structure as revealed in polarized deep inelastic lepton-nucleon scattering, and concluded that quarks contribute very little to the proton's spin. With additional experimental and theoretical investigations and progress in the following years, it is now established that, contrary to naive quark model expectations, quarks and anti-quarks carry only about 30% of the total spin of the proton. Twenty years later, the discovery from the polarized hadron collider at RHIC was equally surprising. For the phase space probed by existing RHIC experiments, gluons do not seem to contribute at all to the proton's spin. To find out what carries the remaining part of the proton's spin is a key focus in current hadronic physics and also a major driving force for the new generation of spin experiments at RHIC and Jefferson Lab and at a future Electron Ion Collider. It is therefore very important and timely to organize a series of annual spin physics meetings to summarize the status of proton spin physics, to focus the effort, and to lay out the future perspectives. This summer program on 'Nucleon Spin Physics' held at Brookhaven National Laboratory (BNL) on July 14-27, 2010 [http://www.bnl.gov/spnsp/] is the

  9. Portable Fuel Quality Analyzer

    Science.gov (United States)

    2014-01-27

    other transportation industries, such as trucking. The PFQA could also be used in fuel blending operations performed at petroleum, ethanol and biodiesel plants.

  10. A combined model for pseudorapidity distributions in Cu-Cu collisions at BNL-RHIC energies

    CERN Document Server

    Jiang, Zhjin; Huang, Yan

    2016-01-01

    The charged particles produced in nucleus-nucleus collisions come from leading particles and from those frozen out from the hot and dense matter created in the collisions. The leading particles are conventionally assumed to have Gaussian rapidity distributions normalized to the number of participants. The hot and dense matter is assumed to expand according to unified hydrodynamics, a hydro model which unifies the features of the Landau and Hwa-Bjorken models, and to freeze out into charged particles from a space-like hypersurface with a proper time of Tau_FO. The rapidity distribution of this part of the charged particles can be derived analytically. The combined contribution from both leading particles and unified hydrodynamics is then compared against the experimental data from the BNL-RHIC-PHOBOS Collaboration for different centrality Cu-Cu collisions at sqrt(s_NN) = 200 and 62.4 GeV, respectively. The model predictions are in good agreement with the experimental measurements.
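
    For orientation, the leading-particle contribution described above is commonly written as a pair of Gaussians centred near the beam rapidities and normalized to the number of participants N_part; a generic form of this kind (the exact parametrization and normalization used by the authors are given in the paper) is

        \[
        \frac{dN_{\mathrm{lead}}}{dy} = \frac{N_{\mathrm{part}}}{2\sqrt{2\pi}\,\sigma}
        \left[ e^{-(y - y_{0})^{2}/(2\sigma^{2})} + e^{-(y + y_{0})^{2}/(2\sigma^{2})} \right],
        \]

    with y_0 near the beam rapidity and sigma the width of the leading-particle distribution; this piece is added to the analytic unified-hydrodynamics contribution before comparison with the PHOBOS data.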

  11. Parameters Optimization for a Novel Vacuum Laser Acceleration Test at BNL-ATF

    CERN Document Server

    Shao, Lei; Zhou, Feng

    2005-01-01

    This paper presents a new VLA theory model which reveals that injection electrons with low energy and a small incident angle relative to the laser beam are captured and significantly accelerated in a strong laser field. As a further step toward verifying the novel VLA mechanism, we propose to use the BNL-ATF terawatt CO2 laser and a high-brightness electron beam to carry out a proof-of-principle beam experiment. The experimental setup, including the laser injection optics, the electron extraction system and the beam diagnostics, is presented. Extensive optimized simulation results with practical ATF parameters are also presented, which show that even when the laser intensity is not very high, a net energy gain can still be clearly observed. This could open the prospect of a significant advance in vacuum laser acceleration.

  12. NRC-BNL Benchmark Program on Evaluation of Methods for Seismic Analysis of Coupled Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chokshi, N.; DeGrassi, G.; Xu, J.

    1999-03-24

    An NRC-BNL benchmark program for evaluation of state-of-the-art analysis methods and computer programs for seismic analysis of coupled structures with non-classical damping is described. The program includes a series of benchmarking problems designed to investigate various aspects of complexities, applications and limitations associated with methods for analysis of non-classically damped structures. Discussions are provided on the benchmarking process, benchmark structural models, and the evaluation approach, as well as benchmarking ground rules. It is expected that the findings and insights, as well as recommendations, from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems.
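
    As a generic illustration of what non-classical damping entails computationally (and explicitly not one of the benchmark methods or structural models), the sketch below forms the state-space matrix of a small coupled system with non-proportional damping and extracts complex eigenvalues, modal frequencies and damping ratios; all matrix values are invented.

        # Generic complex-mode (state-space) eigenanalysis of a small coupled system
        # with non-classical damping; the matrices are arbitrary illustrative values,
        # not one of the NRC-BNL benchmark models.
        import numpy as np

        M = np.diag([2.0, 1.0])                      # mass
        K = np.array([[40.0, -15.0], [-15.0, 15.0]]) # stiffness
        C = np.array([[0.6, -0.1], [-0.1, 0.8]])     # damping (non-proportional)

        n = M.shape[0]
        A = np.block([[np.zeros((n, n)), np.eye(n)],
                      [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])

        lam, phi = np.linalg.eig(A)                  # complex eigenvalues / mode shapes
        omega = np.abs(lam)                          # natural frequencies of the complex modes (rad/s)
        zeta = -lam.real / np.abs(lam)               # modal damping ratios
        print(np.round(omega, 3), np.round(zeta, 4))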

  13. NRC-BNL BENCHMARK PROGRAM ON EVALUATION OF METHODS FOR SEISMIC ANALYSIS OF COUPLED SYSTEMS.

    Energy Technology Data Exchange (ETDEWEB)

    XU,J.

    1999-08-15

    An NRC-BNL benchmark program for evaluation of state-of-the-art analysis methods and computer programs for seismic analysis of coupled structures with non-classical damping is described. The program includes a series of benchmarking problems designed to investigate various aspects of complexities, applications and limitations associated with methods for analysis of non-classically damped structures. Discussions are provided on the benchmarking process, benchmark structural models, and the evaluation approach, as well as benchmarking ground rules. It is expected that the findings and insights, as well as recommendations, from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems.

  14. Performance on the low charge state laser ion source in BNL

    Energy Technology Data Exchange (ETDEWEB)

    Okamura, M.; Alessi, J.; Beebe, E.; Costanzo, M.; DeSanto, L.; Jamilkowski, J.; Kanesue, T.; Lambiase, R.; Lehn, D.; Liaw, C. J.; McCafferty, D.; Morris, J.; Olsen, R.; Pikin, A.; Raparia, D.; Steszyn, A.; Ikeda, S.

    2015-09-07

    In March 2014, a Laser Ion Source (LIS) was commissioned which delivers high-brightness, low-charge-state heavy ions for the hadron accelerator complex at Brookhaven National Laboratory (BNL). Since then, the LIS has successfully provided many heavy ion species. The low-charge-state (mostly singly charged) beams are injected into the Electron Beam Ion Source (EBIS), where the ions are then further ionized to charge states that fit the Q/M acceptance of the downstream accelerators, such as Au32+. Recently we upgraded the LIS to be able to provide two different beams to EBIS on a pulse-to-pulse basis. Now the LIS simultaneously provides beams for both the Relativistic Heavy Ion Collider (RHIC) and the NASA Space Radiation Laboratory (NSRL).

  15. Studies of material properties under irradiation at BNL Linear Isotope Producer (BLIP)

    CERN Document Server

    Simos, N; Ludewig, H; Mokhov, N; Hurh, P; Misek, J

    2012-01-01

    Effects of proton beams irradiating materials considered for targets in high-power accelerator experiments have been under study using Brookhaven National Laboratory's (BNL) 200 MeV Linac. The primary objectives of the study, which includes a wide array of materials and alloys ranging between low and high Z, are to (a) observe changes in physio-mechanical properties which are important in maintaining high-power target functionality, (b) identify possible limits of proton flux or fluence above which certain materials cease to maintain integrity, (c) study the role of material operating temperature in inducing or maintaining radiation damage reversal, and (d) correlate radiation damage effects of different species, such as energetic protons and neutrons, on materials by utilizing reactor and particle accelerator experience data. These objectives are specifically being addressed in the latest material irradiation study linked to the Long Baseline Neutrino Experiment (LBNE). Observations on irradiation effects on m...

  16. Target and orbit feedback simulations of a muSR beam line at BNL

    Energy Technology Data Exchange (ETDEWEB)

    MacKay, W. [Brookhaven National Lab. (BNL), Upton, NY (United States); Blaskiewicz, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Fischer, W. [Brookhaven National Lab. (BNL), Upton, NY (United States); Pile, P. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-07-28

    Well-polarized positive surface muons are a tool to measure the magnetic properties of materials since the precession rate of the spin can be determined from the observation of the positron directions when the muons decay. For a dc beam an ideal µSR flux for surface µ+ should be about 40 kHz/mm2. In this report we show how this flux could be achieved in a beam line using the AGS complex at BNL for a source of protons. We also determined that an orbit feedback system with a pair of thin silicon position monitors and kickers would miss the desired flux by at least an order of magnitude, even with perfect time resolution and no multiple scattering.

  17. Application of the micro-PIXE technique for analyzing arsenic in biomat and lower plants of lichen and mosses around an arsenic mine site, at Gunma, Japan

    Science.gov (United States)

    Ohnuki, T.; Sakamoto, F.; Kozai, N.; Samadfam, M.; Sakai, T.; Kamiya, T.; Satoh, T.; Oikawa, M.

    2002-05-01

    Microhabitats of bacteria (biomat) and lower plants, such as lichen and mosses, are known to accumulate hazardous elements. Since the concentration of hazardous elements in the environment is quite low, we have applied the in-air μ-PIXE (particle induced X-ray emission) system developed at the TIARA facility of JAERI, which has a detection limit at the ppm level, to measure the distributions of As, one of the hazardous elements, in biomat, lichen and mosses observed around an abandoned As mine site in Gunma, Japan, to elucidate the applicability of these biomat and lower plants as bio-indicators of As. Spatial distributions of As, Fe, Si and S in all the biomat, lichen and moss collected within 3 m of the mine entrance indicate that As is localized and is associated with silicate and Fe-containing compounds. In addition, the intensity ratio of the peak areas for As to Fe in the μ-PIXE spectrum of the moss collected from the concrete wall 3 m downstream of the mine water discharge position is different from those of the lower plants on the rock near the closed entrance, but is the same as that of the biomat formed at the mine water discharge position. This indicates that the As trapped by the moss on the concrete wall probably has the same origin as that in the biomat. It is concluded that application of μ-PIXE analysis to the measurement of As in lower plants and biomat gives not only the distribution of the hazardous element As, but also information on its origin.

  18. Performance of microstrip gas chambers in BNL-E885: a search for LAMBDA LAMBDA-hypernuclei

    CERN Document Server

    Landry, M; Davis, C A; Faszer, W; Gan, L; Lee, L; Page, S A; Ramsay, W D; Salomon, M; Oers, W T H

    1999-01-01

    The performance of MicroStrip Gas Chambers (MSGC) in BNL Experiment 885, a search for LAMBDA LAMBDA-hypernuclei, is detailed. Chambers with an active area of 80x50 mm^2 were instrumented and operated as a vertex detector in the experiment. Furthermore, two distinct types of microstrip prints were utilized in these chambers. Prints manufactured with Integrated Circuit (IC) photolithographic technology have fine tolerances and thin minimum trace widths, but can suffer from a high rate of defects per print and are more costly. Prints constructed with Printed Circuit (PC) photolithographic technology have coarser tolerances but relatively few defects per print, and are extremely cost-effective. Results of bench and beam tests of both IC and PC based MSGCs are presented and their performance in BNL-E885 is discussed. E885 marks the first use of PC based MSGCs in an experiment.

  19. Analyzing the Implications of Climate Data on Plant Hardiness Zones for Green Infrastructure Planning: Case Study of Knoxville, Tennessee and Surrounding Region

    Energy Technology Data Exchange (ETDEWEB)

    Sylvester, Linda M [ORNL; Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL

    2016-07-01

    Downscaled climate data for Knoxville, Tennessee and the surrounding region were used to investigate future changes in Plant Hardiness Zones due to climate change. The methodology is the same as that of the US Department of Agriculture (USDA), well known for its standard Plant Hardiness Zone map used by gardeners and planners. The USDA data were calculated from observed daily data for 1976–2005. The modeled climate data for the past are daily data for 1980–2005, and the future data are projected for 2025–2050. The average of all the modeled annual extreme minimums for each time period of interest was calculated, and each 1 km raster cell was placed into a zone category based on temperature, using the same criteria and categories as the USDA. The individual models range from suggesting little change in the Plant Hardiness Zones to suggesting that Knoxville moves up two Hardiness Zones, but overall the models suggest a shift into the next warmer zone. The USDA currently categorizes the Knoxville area as Zone 7a. None of the zones calculated from the modeled climate data placed Knoxville in Zone 7a for the comparable time period: the models placed Knoxville in a cooler Hardiness Zone for the past and projected the area to warm into Zone 7. The modeled temperature data appear to be slightly cooler than the observed temperature data, which may explain the zone discrepancy. Since the modeled data have Knoxville moving, overall, from Zone 6 to Zone 7, it can be inferred that Knoxville, Tennessee may shift from its current Zone 7 to Zone 8.
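
    The zoning step itself is simple enough to sketch: average the annual extreme minimum temperature over the period of interest for a grid cell, then bin it into the standard USDA 10 °F zones with 5 °F half-zones (Zone 7 spans 0-10 °F, with 7a covering 0-5 °F). The example temperatures below are invented, not the downscaled model output for Knoxville.

        # Sketch of the USDA-style zoning step: average the annual extreme minimum
        # temperature for a cell, then bin into 10 F zones with 5 F half-zones
        # (Zone 1 starts at -60 F; Zone 7 spans 0-10 F, 7a = 0-5 F).
        # The example temperatures are illustrative, not the Knoxville model output.
        import numpy as np

        def hardiness_zone(annual_extreme_minima_F):
            t = float(np.mean(annual_extreme_minima_F))      # mean annual extreme minimum
            zone = int(np.floor((t + 60.0) / 10.0)) + 1      # 10 F bands from -60 F
            half = "a" if (t + 60.0) % 10.0 < 5.0 else "b"   # 5 F half-zones
            return f"{zone}{half}"

        print(hardiness_zone([-2.1, 4.0, 1.5, -6.3, 3.2]))   # -> "7a" for these values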

  20. Analyzing the Economy-wide Impact of the Supply Chains Activated by a new Biomass Power Plant. The case of cardoon in Sardinia

    OpenAIRE

    Bonfiglio, Andrea; Esposti, Roberto

    2014-01-01

    This study investigates the impact on the economy of the Italian region of Sardinia of a new biomass power plant that will be fed with locally cultivated cardoon. The cardoon will also serve the production of biopolymers. The impact is assessed at an economy-wide level using a multiregional mixed-variable closed I-O model that takes into account the whole supply chain activated and the cross-regional effects generated by trade across local industries. The effects are com...
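
    The study's multiregional mixed-variable closed I-O model is considerably richer than can be shown here, but the underlying mechanics of an input-output impact assessment (translating a final-demand shock into total, direct plus indirect, output through the Leontief inverse) can be sketched for a single open region with invented coefficients:

        # Basic open Leontief input-output impact calculation: dx = (I - A)^-1 * df.
        # The actual study uses a multiregional mixed-variable *closed* I-O model;
        # this single-region open version with invented coefficients only illustrates
        # the underlying mechanics.
        import numpy as np

        A = np.array([[0.10, 0.05, 0.02],    # technical coefficients (made up):
                      [0.20, 0.15, 0.10],    # agriculture, energy, services
                      [0.05, 0.10, 0.20]])

        delta_f = np.array([5.0, 30.0, 2.0]) # final-demand shock from the new plant

        L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse
        delta_x = L @ delta_f                # total (direct + indirect) output impact
        print(np.round(delta_x, 2), "total impact:", round(float(delta_x.sum()), 2))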

  1. Proceedings of RIKEN BNL Research Center Workshop: Brookhaven Summer Program on Quarkonium Production in Elementary and Heavy Ion Collisions

    Energy Technology Data Exchange (ETDEWEB)

    Dumitru, A.; Lourenco, C.; Petreczky, P.; Qiu, J.; Ruan, L.

    2011-08-03

    Understanding the structure of the hadron is of fundamental importance in subatomic physics. Production of heavy quarkonia is arguably one of the most fascinating subjects in strong interaction physics. It offers unique perspectives into the formation of QCD bound states. Heavy quarkonia are among the most studied particles both theoretically and experimentally. They have been, and continue to be, the focus of measurements in all high energy colliders around the world. Because of their distinct multiple mass scales, heavy quarkonia were suggested as a probe of the hot quark-gluon matter produced in heavy-ion collisions; and their production has been one of the main subjects of the experimental heavy-ion programs at the SPS and RHIC. However, since the discovery of the J/psi at Brookhaven National Laboratory and SLAC National Accelerator Laboratory over 36 years ago, theorists still have not been able to fully understand the production mechanism of heavy quarkonia, although major progress has been made in recent years. With this in mind, a two-week program on quarkonium production was organized at BNL on June 6-17, 2011. Many new experimental data from the LHC and from RHIC were presented during the program, including results from the LHC heavy-ion run. To analyze and correctly interpret these measurements, and in order to quantify properties of the hot matter produced in heavy-ion collisions, it is necessary to improve our theoretical understanding of quarkonium production. Therefore, a wide range of theoretical aspects of the production mechanism in the vacuum as well as in cold nuclear and hot quark-gluon medium were discussed during the program, from controlled calculations in QCD and its effective theories such as NRQCD, to various models, and to first-principles lattice calculations. The scientific program was divided into three major scientific parts: basic production mechanism for heavy quarkonium in vacuum or in high energy elementary collisions; the

  2. Microsatellite markers in plants and insects part II: Databases and in silico tools for microsatellite mining and analyzing population genetic stratification

    Science.gov (United States)

    Nucleotide sequence information available in searchable sequence databases and the free in silico software with which to extract and analyze microsatellite data continues to grow at a rapid rate across eukaryote taxa. The sheer amount of information available means that a comprehensive or exhaustive...
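
    A minimal example of the kind of in silico microsatellite mining the review surveys is a regular-expression scan for perfect short tandem repeats; the motif-length range, repeat threshold and toy sequence below are arbitrary choices, and dedicated SSR-mining tools add compound/imperfect repeats, primer design and much more.

        # Minimal microsatellite (SSR) miner: find perfect 1-6 bp motifs repeated at
        # least `min_repeats` times in a nucleotide sequence. Thresholds and the toy
        # sequence are arbitrary; dedicated SSR-mining tools do far more.
        import re

        def find_ssrs(seq, min_repeats=4, motif_lengths=range(1, 7)):
            seq = seq.upper()
            hits = []
            for k in motif_lengths:
                pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1))
                for m in pattern.finditer(seq):
                    hits.append((m.start(), m.group(1), len(m.group(0)) // k))
            return hits  # (position, motif, repeat count)

        print(find_ssrs("TTAGCACACACACAGGATCTCTCTCTCTAA"))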

  3. A five-watt G-M/J-T refrigerator for LHe target at BNL

    Science.gov (United States)

    Jia, L. X.; Wang, L.; Addessi, L.; Miglionico, G.; Martin, D.; Leskowicz, J.; McNeill, M.; Yatauro, B.; Tallerico, T.

    2002-05-01

    A five-watt G-M/J-T refrigerator was built and installed for high-energy physics research at Brookhaven National Laboratory in 2001. A liquid helium target of 8.25 liters was required for an experiment in the proton beam line at the Alternating Gradient Synchrotron (AGS) of BNL. The large radiation heat load on the target requires a five-watt refrigerator at 4.2 K to support a liquid helium flask 0.2 meter in diameter and 0.3 meter in length, made of Mylar film 0.35 mm in thickness. The liquid helium flask is thermally exposed to the vacuum windows, which are also made of 0.35 mm thick Mylar film at room temperature. The refrigerator uses a two-stage Gifford-McMahon cryocooler for precooling the Joule-Thomson circuit, which consists of five Linde-type heat exchangers. A mass flow rate of 0.8˜1.0 grams per second at 17.7 atm is applied to the refrigerator cold box. The two-phase helium flows between the liquid target and the liquid/gas separator by means of a thermosyphon. This paper presents the system design as well as the test results, including the control of thermal oscillations.

  4. Proceedings of RIKEN BNL Research Center Workshop: The Physics of W and Z Bosons

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, S.; Okada, K.; Patwa, A.; Qiu, J.; Surrow, B.

    2010-06-24

    A two-day workshop on 'The Physics of W and Z Bosons' was held at the RIKEN BNL Research Center at Brookhaven National Laboratory on June 24-25, 2010. With the recent release of the first measurement of W bosons in proton-proton collisions at RHIC and the first observation of W events at the LHC, the workshop was a timely opportunity to bring together experts from both the high energy particle and nuclear physics communities to share their ideas and expertise on the physics of W and Z bosons, with the aim of fully exploring the potential of the W/Z physics programs at RHIC and the LHC. The focus was on the production and measurement of W/Z bosons in both polarized and unpolarized proton-proton collisions, and the role of W/Z production in probing the parton flavor and helicity structure of the colliding protons and in the search for new physics. There were lively discussions about the potential and future prospects of the W/Z programs at RHIC, the Tevatron, and the LHC.

  5. The Upgrade of the DUV-FEL Facility at the BNL

    CERN Document Server

    Wang, Xijie; Murphy, James; Rakowsky, George; Rose, James; Sheehy, Brian; Shen, Yuzhen; Skaritka, John; Wu, Zilu; Yu Li Hua

    2004-01-01

    The DUV-FEL at BNL is the world's only facility dedicated to laser-seeded FEL R&D and its applications. The HGHG at the DUV-FEL reached saturation at 266 nm with 800 nm seeding [1] in 2002. Since then, the first chemical science experiment, ion pair imaging, was successfully completed [2]. The DUV-FEL linac is being upgraded from 200 to 300 MeV to enable the HGHG FEL to produce 100 μJ pulses of 100 nm light. This will establish the DUV-FEL as a premier user facility for XUV radiation. The upgraded facility will also enable several critical R&D efforts for a future X-ray FEL based on HGHG, such as cascaded HGHG and higher-harmonic HGHG (n>5). The upgraded HGHG will operate at the 4th harmonic with the seed laser at 400 nm. The increase of the electron beam energy will be accomplished by installing a 5th linac cavity and two 45 MW klystrons. New modulator- and dispersion-section vacuum chambers will be manufactured to accommodate new matching optics and 8th harmonic HGHG. The status of the DUV-FEL upgra...

  6. Proceedings of RIKEN BNL Research Center Workshop, Volume 91, RBRC Scientific Review Committee Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Samios,N.P.

    2008-11-17

    The ninth evaluation of the RIKEN BNL Research Center (RBRC) took place on Nov. 17-18, 2008, at Brookhaven National Laboratory. The members of the Scientific Review Committee (SRC) were Dr. Wit Busza (Chair), Dr. Miklos Gyulassy, Dr. Akira Masaike, Dr. Richard Milner, Dr. Alfred Mueller, and Dr. Akira Ukawa. We are pleased that Dr. Yasushige Yano, the Director of the Nishina Institute of RIKEN, Japan, participated in this meeting, both informing the committee of the activities of the Nishina Institute and the role of RBRC and serving as an observer of this review. In order to illustrate the breadth and scope of the RBRC program, each member of the Center made a presentation on his or her research efforts. This encompassed three major areas of investigation: theoretical, experimental and computational physics. In addition, the committee met privately with the fellows and postdocs to ascertain their opinions and concerns. Although the main purpose of this review is a report to RIKEN Management (Dr. Ryoji Noyori, RIKEN President) on the health, scientific value, management and future prospects of the Center, the RBRC management felt that a compendium of the scientific presentations is of sufficient quality and interest to warrant a wider distribution. Therefore we have made this compilation and present it to the community for its information and enlightenment.

  7. Distribution and ecotoxicity of chlorotriazines in the Scheldt Estuary (B-Nl)

    Energy Technology Data Exchange (ETDEWEB)

    Noppe, Herlinde [Ghent University, Faculty of Veterinary Medicine, Research group of Veterinary Public Health and Zoonoses, Laboratory of Chemical Analysis, Salisburylaan 133, B-9820 Merelbeke (Belgium)]. E-mail: hubert.debrabander@ugent.be; Ghekiere, An [Ghent University, Faculty of Bioscience Engineering, Laboratory of Environmental Toxicology and Aquatic Ecology, J. Plateaustraat 22, B-9000 Ghent (Belgium); Verslycke, Tim [Woods Hole Oceanographic Institution, Biology Department, MS32, Woods Hole, MA 02543 (United States); Wulf, Eric de [Flemish Environment Agency, Laboratory for Analysis of Organic Micropollutants, Krijgslaan 281-S2, B-9000 Ghent (Belgium); Verheyden, Karolien [Ghent University, Faculty of Veterinary Medicine, Research group of Veterinary Public Health and Zoonoses, Laboratory of Chemical Analysis, Salisburylaan 133, B-9820 Merelbeke (Belgium); Monteyne, Els [Management Unit of the North Sea Mathematical Models, 3e and 23e Linieregimentsplein, B-8400 Ostend (Belgium); Polfliet, Karen [Ghent University, Faculty of Bioscience Engineering, Laboratory of Environmental Toxicology and Aquatic Ecology, J. Plateaustraat 22, B-9000 Ghent (Belgium); Caeter, Peter van [Flemish Environment Agency, Laboratory for Analysis of Organic Micropollutants, Krijgslaan 281-S2, B-9000 Ghent (Belgium); Janssen, Colin R. [Ghent University, Faculty of Bioscience Engineering, Laboratory of Environmental Toxicology and Aquatic Ecology, J. Plateaustraat 22, B-9000 Ghent (Belgium); Brabander, Hubert F. de [Ghent University, Faculty of Veterinary Medicine, Research group of Veterinary Public Health and Zoonoses, Laboratory of Chemical Analysis, Salisburylaan 133, B-9820 Merelbeke (Belgium)]. E-mail: herlinde.noppe@ugent.be

    2007-06-15

    As part of the Endis-Risks project, the current study describes the occurrence of the chlorotriazine pesticides atrazine, simazine and terbutylazine in water, sediment and suspended matter in the Scheldt estuary (B-Nl) from 2002 to 2005 (3 samplings a year, 8 sampling points). Atrazine was found at the highest concentrations, varying from 10 to 736 ng/l in water and from 5 up to 10 ng/g in suspended matter. Simazine and terbutylazine were detected at lower concentrations. Traces of the targeted pesticides were also detected in sediments, but these were below the limit of quantification. As part of an ecotoxicological assessment, we studied the potential effect of atrazine on molting of Neomysis integer (Crustacea:Mysidacea), a resident invertebrate of the Scheldt Estuary and a proposed test organism for the evaluation of endocrine disruption. Following chronic exposure ({approx}3 weeks), atrazine did not significantly affect mysid molting at environmentally relevant concentrations (up to 1 {mu}g/l). - The water of the Scheldt estuary and its associated suspended solids are contaminated with chlorotriazines at concentrations that do not affect mysid molting.

  8. Proposal for Reduction of Transverse Emittance of BNL 200 MeV Linac

    CERN Document Server

    Alessi, J; Raparia, D; Weng, W T

    2004-01-01

    BNL plans to upgrade the AGS proton beam from the current 0.14 MW to more than 1.0 MW for a neutrino facility; the upgrade consists of two major subsystems. The first is a 1.2 GeV super-conducting linac (SCL) to replace the booster as injector for the AGS. The second is a performance upgrade of the AGS itself for higher intensity and repetition rate. For high-intensity proton accelerators, such as the upgraded AGS, there are very stringent limitations on uncontrolled beam losses. A direct effect of the linac beam emittance is halo/tail generation in the circulating beam. Studies show that the estimated halo/tail generation in the beam for the present normalized RMS emittance of the linac beam is unacceptable. To reduce the transverse emittance of the 200 MeV linac, the existing radio frequency quadrupole linac (RFQ) has to be relocated closer to drift tube linac (DTL) tank 1 to meet the emittance requirement for low-loss AGS injection. This paper will present the various options of matching between RFQ and DTL,...

  9. SCIENTIFIC PRESENTATION. 7TH MEETING OF THE MANAGEMENT STEERING COMMITTEE OF THE RIKEN BNL COLLABORATION.

    Energy Technology Data Exchange (ETDEWEB)

    LEE,T.D.

    2001-02-13

    The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkysho,'' (RIKEN) The Institute of Physical and Chemical Research, of Japan. The Center is dedicated to the study of strong interactions, including hard QCD/spin physics, lattice QCD and RHIC (Relativistic Heavy Ion Collider) physics through nurturing of a new generation of young physicists. The Director of RBRC is Professor T. D. Lee. The first years were dedicated to the establishment of a theory group. This has essentially been completed consisting of Fellows, Postdocs, and RHIC Physics/University Fellows, with an active group of consultants. The center also organizes an extensive series of workshops on specific topics in strong interactions with an accompanying series of published proceedings. In addition, a 0.6 teraflop parallel processor computer has been constructed and operational since August 1998. It was awarded the Supercomputer 1998 Gordon Bell Prize for price performance. An active experimental group centered around the spin physics program at RHIC has subsequently also been established at RBRC. It presently consists of five Fellows, one Postdoc and several scientific collaborators with more appointments being expected in the near future. Members and participants of RBRC on occasion will develop articles such as this one, in the nature of a status report or a general review.

  10. SCIENTIFIC PRESENTATION. 7TH MEETING OF THE MANAGEMENT STEERING COMMITTEE OF THE RIKEN BNL COLLABORATION.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T.D.

    2001-02-13

    The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkysho,'' (RIKEN) The Institute of Physical and Chemical Research, of Japan. The Center is dedicated to the study of strong interactions, including hard QCD/spin physics, lattice QCD and RHIC (Relativistic Heavy Ion Collider) physics through nurturing of a new generation of young physicists. The Director of RBRC is Professor T. D. Lee. The first years were dedicated to the establishment of a theory group. This has essentially been completed consisting of Fellows, Postdocs, and RHIC Physics/University Fellows, with an active group of consultants. The center also organizes an extensive series of workshops on specific topics in strong interactions with an accompanying series of published proceedings. In addition, a 0.6 teraflop parallel processor computer has been constructed and operational since August 1998. It was awarded the Supercomputer 1998 Gordon Bell Prize for price performance. An active experimental group centered around the spin physics program at RHIC has subsequently also been established at RBRC. It presently consists of five Fellows, one Postdoc and several scientific collaborators with more appointments being expected in the near future. Members and participants of RBRC on occasion will develop articles such as this one, in the nature of a status report or a general review.

  11. eRHIC Design Study: An Electron-Ion Collider at BNL

    CERN Document Server

    Aschenauer, E C; Bazilevsky, A; Boyle, K; Belomestnykh, S; Ben-Zvi, I; Brooks, S; Brutus, C; Burton, T; Fazio, S; Fedotov, A; Gassner, D; Hao, Y; Jing, Y; Kayran, D; Kiselev, A; Lamont, M A C; Lee, J -H; Litvinenko, V N; Liu, C; Ludlam, T; Mahler, G; McIntyre, G; Meng, W; Meot, F; Miller, T; Minty, M; Parker, B; Pinayev, I; Ptitsyn, V; Roser, T; Stratmann, M; Sichtermann, E; Skaritka, J; Tchoubar, O; Thieberger, P; Toll, T; Trbojevic, D; Tsoupas, N; Tuozzolo, J; Ullrich, T; Wang, E; Wang, G; Wu, Q; Xu, W; Zheng, L

    2014-01-01

    This document presents BNL's plan for an electron-ion collider, eRHIC, a major new research tool that builds on the existing RHIC facility to advance the long-term vision for Nuclear Physics to discover and understand the emergent phenomena of Quantum Chromodynamics (QCD), the fundamental theory of the strong interaction that binds the atomic nucleus. We describe the scientific requirements for such a facility, following up on the community wide 2012 white paper, "Electron-Ion Collider: the Next QCD Frontier", and present a design concept that incorporates new, innovative accelerator techniques to provide a cost-effective upgrade of RHIC with polarized electron beams colliding with the full array of RHIC hadron beams. The new facility will deliver electron-nucleon luminosity of $\\sim10^{33} cm^{-2}sec^{-1}$ for collisions of 15.9 GeV polarized electrons on either 250 GeV polarized protons or 100 GeV/u heavy ion beams. The facility will also be capable of providing an electron beam energy of 21.2 GeV, at reduc...

  12. Malfunction Diagnosis and Analysis for the Reciprocating Compressor of a Dry Gas Purification Ethylene Plant

    Institute of Scientific and Technical Information of China (English)

    刘文涛; 郭洋

    2012-01-01

    Based on the process characteristics of the dry gas purification ethylene plant, the operational failures of the reciprocating compressor, the key piece of equipment in the unit's main process flow, are analyzed in detail. The main causes of the frequent compressor failures are identified as impurities and liquid carried by the process medium, its corrosiveness, and false signals from the instrumentation system. Drawing on actual operating experience, improvement measures addressing the causes of failure are discussed in depth from both the process and the equipment side; the measures already implemented have achieved good results in operation. How to further improve the operating reliability of the unit is also discussed, providing a practical scheme for the stable, long-term operation of the reciprocating compressor in the dry gas plant.

  13. Analyzing and Solving Problems of Centrifugal Pumps Occurring during the Commissioning of Shell Coal Gasification Plants

    Institute of Scientific and Technical Information of China (English)

    尹俊杰; 赵瑞萍

    2012-01-01

    The problems of excessive delivery head, air binding, interlock trips, etc., which occurred with the discharge pump at the Yueyang coal gasification plant, the hot water pump at the Anqing coal gasification plant, and the high-pressure process water pumps at both plants during commissioning, are analyzed, taking the centrifugal pumps of the Yueyang and Anqing Shell gasification plants as examples. Solutions to these problems are proposed based on the theory of fluid-handling machinery, and their implementation verified their correctness and rationality. A method for solving problems with centrifugal pumps in chemical plants is thus summarized and can serve as a reference for other similar chemical plants.

  14. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, RHIC SPIN PHYSICS V, VOLUME 32, FEBRUARY 21, 2001.

    Energy Technology Data Exchange (ETDEWEB)

    BUNCE,G.; SAITO,N.; VIGDOR,S.; ROSER,T.; SPINKA,H.; ENYO,H.; BLAND,L.C.; GURYN,W.

    2001-02-21

    The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkyusho'' (RIKEN, The Institute of Physical and Chemical Research) of Japan. The Center is dedicated to the study of strong interactions, including spin physics, lattice QCD and RHIC physics through the nurturing of a new generation of young physicists. During the first year, the Center had only a Theory Group. In the second year, an Experimental Group was also established at the Center. At present, there are seven Fellows and nine postdocs in these two groups. During the third year, we started a new Tenure Track Strong Interaction Theory RHIC Physics Fellow Program, with six positions in the academic year 1999-2000; this program will increase to include eleven theorists in the next academic year, and, in the year after, also be extended to experimental physics. In addition, the Center has an active workshop program on strong interaction physics, about ten workshops a year, with each workshop focused on a specific physics problem. Each workshop speaker is encouraged to select a few of the most important transparencies from his or her presentation, accompanied by a page of explanation. This material is collected at the end of the workshop by the organizer to form proceedings, which can therefore be available within a short time. The construction of a 0.6 teraflop parallel processor, which was begun at the Center on February 19, 1998, was completed on August 28, 1998.

  15. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP (VOLUME 55) COLLECTIVE FLOW AND QGP PROPERTIES.

    Energy Technology Data Exchange (ETDEWEB)

    BASS, S.; ESUMI, S.; HEINZ, U.; KOLB, P.; SHURYAK, E.; XU, N.

    2003-11-17

    The first three years of RHIC physics, with Au+Au collisions at 65, 130 and 200 GeV per nucleon pair, produced dramatic results, particularly with respect to collective observables such as transverse flow and anisotropies in transverse momentum spectra. It has become clear that the data show very strong rescattering at very early times of the reaction, strong enough in fact to be described by the hydrodynamic limit. Therefore, with today's experiments, we are able to investigate the equation of state of hot quark gluon matter, discuss its thermodynamic properties and relate them to experimental observables. At this workshop we came together to discuss our latest efforts both in the theoretical description of heavy ion collisions and in the most recent experimental results that ultimately allow us to extract information on the properties of RHIC matter. About 50 participants registered for the workshop, but many more dropped in from the offices at BNL. The workshop lasted for three days, each of which was assigned a special topic on which the talks focused. On the first day we dealt with the more general question of what the strong collective phenomena observed in RHIC collisions tell us about the properties and the dynamics of RHIC matter. The second day covered all the different aspects of momentum anisotropies, and interesting new experimental results were presented for the first time. On the third day, we focused on the late fireball dynamics and the breakdown of the assumption of thermalization. New experimental observables were discussed, which will deliver more information on how the expanding fireball breaks up once the frequent interactions cease.

  16. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP ENTITLED - DOMAIN WALL FERMIONS AT TEN YEARS (VOLUME 84)

    Energy Technology Data Exchange (ETDEWEB)

    BLUM,T.; SONI,A.

    2007-03-15

    The workshop was held to mark the 10th anniversary of the first numerical simulations of QCD using domain wall fermions initiated at BNL. It is very gratifying that widespread use of domain wall and overlap fermions has been made in the intervening decade. It therefore seemed appropriate at this stage for some ''communal introspection'' of the progress that has been made, hurdles that need to be overcome, and physics that can and should be done with chiral fermions. The meeting was very well attended, drawing about 60 registered participants primarily from Europe, Japan and the US. It was quite remarkable that pioneers David Kaplan, Herbert Neuberger, Rajamani Narayanan, Yigal Shamir, Sinya Aoki, and Pavlos Vranas all attended the workshop. Comparisons between domain wall and overlap formulations, with their respective advantages and limitations, were discussed at length, and a broad physics program including pion and kaon physics, the epsilon regime, nucleon structure, and topology, among others, emerged. New machines and improved algorithms have played a key role in realizing realistic dynamical fermion lattice simulations (small quark mass, large volume, and so on), so much so, in fact, that measurements themselves are now a comparable cost. Consequently, ways to make the measurements more efficient were also discussed. We were very pleased to see the keen and ever growing interest in chiral fermions in our community and the significant strides our colleagues have made in bringing chiral fermions to the fore of lattice QCD calculations. Their contributions made the workshop a success, and we thank them deeply for sharing their time and ideas. Finally, we must especially acknowledge Norman Christ and Bob Mawhinney for their early and continued collaboration, without which the success of domain wall fermions would not have been possible.

  17. A Brief Analysis of the Current Status of Roof Greening in Lishui City and the Selection of Suitable Plants

    Institute of Scientific and Technical Information of China (English)

    邹晓梅; 刘瑞瑜

    2013-01-01

    With urban development and the emergence of ecological problems, a new form of greening, roof greening, has arisen. This paper introduces the concept and classification of roof greening and analyzes the current status of roof greening in Lishui City. Based on a survey of the plants of Lishui, combined with the selection of roof greening plants at home and abroad and the principles of roof greening plant configuration, plants suitable for roof greening in the Lishui area are identified.

  18. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, VOLUME 77, RBRC SCIENTIFIC REVIEW COMMITTEE MEETING, OCTOBER 10-12, 2005

    Energy Technology Data Exchange (ETDEWEB)

    SAMIOS, N.P.

    2005-10-10

    The eighth evaluation of the RIKEN BNL Research Center (RBRC) took place on October 10-12, 2005, at Brookhaven National Laboratory. The members of the Scientific Review Committee (SRC) were Dr. Jean-Paul Blaizot, Professor Makoto Kobayashi, Dr. Akira Masaike, Professor Charles Young Prescott (Chair), Professor Stephen Sharpe (absent), and Professor Jack Sandweiss. We are grateful to Professor Akira Ukawa who was appointed to the SRC to cover Professor Sharpe's area of expertise. In addition to reviewing this year's program, the committee, augmented by Professor Kozi Nakai, evaluated the RBRC proposal for a five-year extension of the RIKEN BNL Collaboration MOU beyond 2007. Dr. Koji Kaya, Director of the Discovery Research Institute, RIKEN, Japan, presided over the session on the extension proposal. In order to illustrate the breadth and scope of the RBRC program, each member of the Center made a presentation on his or her research efforts. In addition, a special session was held in connection with the RBRC QCDSP and QCDOC supercomputers. Professor Norman H. Christ, a collaborator from Columbia University, gave a presentation on the progress and status of the project, and Professor Frithjof Karsch of BNL presented the first physics results from QCDOC. Although the main purpose of this review is a report to RIKEN Management (Dr. Ryoji Noyori, RIKEN President) on the health, scientific value, management and future prospects of the Center, the RBRC management felt that a compendium of the scientific presentations is of sufficient quality and interest that it warrants a wider distribution. Therefore we have made this compilation and present it to the community for its information and enlightenment.

  19. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, RHIC SPIN COLLABORATION MEETINGS XII AND XIII, SEPTEMBER 16, 2002, OCTOBER 22, 2002.

    Energy Technology Data Exchange (ETDEWEB)

    FOX,B.

    2003-03-06

    Since its inception, the RHIC Spin Collaboration (RSC) has held semi-regular meetings each year to discuss the physics possibilities and the operational details of the program. Having collected our first data sample of polarized proton-proton collisions in Run-02 of RHIC, we are now in the process of examining the performance of both the accelerator and the experiments. During the PAC meeting on August 29, 2002, the beam use proposal with a four-week polarized proton physics run was approved as part of the plan for Run-03. We therefore met at BNL on September 16, 2002, to discuss concrete plans for this proton-proton run.

  20. High-intensity polarized H- (proton), deuteron and 3He++ ion source development at BNL.

    Energy Technology Data Exchange (ETDEWEB)

    Zelenski,A.

    2008-06-23

    New techniques for the production of polarized electron, H{sup -} (proton), D (D+) and {sup 3}He{sup ++} ion beams are discussed. Feasibility studies of these techniques are in progress at BNL. An Optically Pumped Polarized H{sup -} Ion Source (OPPIS) delivers beam for polarization studies in RHIC. A polarized deuteron beam will be required for the deuteron Electric Dipole Moment (EDM) experiment, and the {sup 3}He{sup ++} ion beam is part of the experimental program for the future eRHIC (electron-ion) collider.

  1. Development of a practical training program based on BNL's input to the new NFPA Lined Masonry Chimney Venting Tables

    Energy Technology Data Exchange (ETDEWEB)

    Potter, G. [Agway Energy Products, Tully, NJ (United States)

    1997-09-01

    This paper describes how we developed a practical training program for technicians and sales personnel from the BNL studies that evolved into the Lined Chimney Venting Tables. One of the topics discussed is our search for solutions to the recurring problems associated with flue gas condensation on newly installed oil-fired appliances. The paper also discusses our own experiences in applying the new venting tables and working through the questions that arise when we encounter installations beyond the scope of the present tables.

  2. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, VOLUME 72, RHIC SPIN COLLABORATION MEETINGS XXXI, XXXII, XXXIII.

    Energy Technology Data Exchange (ETDEWEB)

    OGAWA, A.

    2005-04-11

    The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkyusho'' (RIKEN, The Institute of Physical and Chemical Research) of Japan. The Center is dedicated to the study of strong interactions, including spin physics, lattice QCD, and RHIC physics through the nurturing of a new generation of young physicists. The RBRC has both a theory and experimental component. At present the theoretical group has 4 Fellows and 3 Research Associates as well as 11 RHIC Physics/University Fellows (academic year 2003-2004). To date there are approximately 30 graduates from the program, of whom 13 have attained tenured positions at major institutions worldwide. The experimental group is smaller and has 2 Fellows, 3 RHIC Physics/University Fellows and 3 Research Associates, and historically 6 individuals have attained permanent positions. Beginning in 2001 a new RIKEN Spin Program (RSP) category was implemented at RBRC. These appointments are joint positions of RBRC and RIKEN and include the following positions in theory and experiment: RSP Researchers, RSP Research Associates, and Young Researchers, who are mentored by senior RBRC Scientists. A number of RIKEN Jr. Research Associates and Visiting Scientists also contribute to the physics program at the Center. RBRC has an active workshop program on strong interaction physics with each workshop focused on a specific physics problem. Each workshop speaker is encouraged to select a few of the most important transparencies from his or her presentation, accompanied by a page of explanation. This material is collected at the end of the workshop by the organizer to form proceedings, which can therefore be available within a short time. To date there are seventy-two proceedings volumes available. The construction of a 0.6 teraflops parallel processor, dedicated to lattice QCD, begun at the Center on February 19, 1998, was completed on August

  3. PROCEEDINGS FROM RIKEN-BNL RESEARCH CENTER WORKSHOP: PARITY-VIOLATING SPIN ASYMMETRIES AT RHIC.

    Energy Technology Data Exchange (ETDEWEB)

    VOGELSANG,W.; PERDEKAMP, M.; SURROW, B.

    2007-04-26

    Also, new observables, such as jet and W+charm final states and spin asymmetries in Z production, were proposed and discussed. All of the talks attracted much interest and initiated active discussions. This was a very successful workshop. It stimulated many discussions and new collaborations. We are grateful to all participants and speakers for coming to the Center, and for their excellent work. The support provided for this workshop by Dr. N. Samios and his RIKEN-BNL Research Center has been magnificent, and we are very grateful for it. We thank Brookhaven National Laboratory and the U.S. Department of Energy for providing the facilities to hold the workshop. Finally, sincere thanks go to Jane Lysik for her efficient work on organizing and running the workshop.

  4. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, VOLUME 57, HIGH PT PHYSICS AT RHIC, DECEMBER 2-6, 2003

    Energy Technology Data Exchange (ETDEWEB)

    Kretzer, Stefan; Venugopalan, Raju; Vogelsang, Werner

    2004-02-18

    The AuAu, dAu, and pp collision modes of the RHIC collider at BNL have led to the publication of exciting high p{perpendicular} particle production data. There have also been two physics runs with polarized protons, and preliminary results on the double-spin asymmetry for pion production have been presented very recently. The ontological questions behind these measurements are fascinating: Did RHIC collisions create a Quark-Gluon-Plasma phase and did they verify the Color Glass Condensate as the high energy limit of QCD? Will the Spin Crisis finally be resolved in terms of gluon polarization and what new surprises are we yet to meet for Transverse Spin? Phenomena related to sub-microscopic questions as important as these call for interpretations that are grounded in solid theory. At large p{perpendicular}, perturbative concepts are legitimately expected to provide useful approaches. The corresponding hard parton dynamics are, in several ways, key to unraveling the initial or final state and collisional phase of hard scattering events in vacuum as well as in hot or cold nuclear matter. Before the advent of RHIC data, a RIKEN-BNL workshop had been held at BNL in March 1999 on ''Hard Parton Physics in High Energy Nuclear Collisions''. The 2003 workshop on ''High p{perpendicular} Physics at RHIC'' was a logical continuation of this previous workshop. It gave the opportunity to revisit the 1999 expectations in the light of what has been found in the meantime and, at the same time, to critically discuss the underlying theoretical concepts. We brought together theorists who have done seminal work on the foundations of parton phenomenology in field theory with theorists and experimentalists who are presently working on RHIC phenomenology. The participants came from both high-energy physics and nuclear physics backgrounds, and it remains only to be said here that this chemistry worked perfectly and the workshop was a great success.

  5. On-line chemical analyzers for high-purity steam and water applied to steam power plants

    Energy Technology Data Exchange (ETDEWEB)

    Diaz Perez, Ruth [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)

    1989-12-31

    This article presents a general overview of the advances in on-line analyzers of chemical parameters for high-purity water and steam and specifies which ones are commercially available. It also reviews the criteria currently applied for selecting sampling points and the analyses that need to be performed at each point, depending on the type of power plant and the chemical treatment used (coordinated phosphate-pH or AVT treatment).

  6. EVENT DRIVEN AUTOMATIC STATE MODIFICATION OF BNL'S BOOSTER FOR NASA SPACE RADIATION LABORATORY SOLAR PARTICLE SIMULATOR.

    Energy Technology Data Exchange (ETDEWEB)

    BROWN, D.; BINELLO, S.; HARVEY, M.; MORRIS, J.; RUSEK, A.; TSOUPAS, N.

    2005-05-16

    The NASA Space Radiation Laboratory (NSRL) was constructed in collaboration with NASA for the purpose of performing radiation effect studies for the NASA space program. The NSRL makes use of heavy ions in the range of 0.05 to 3 GeV/n slow extracted from BNL's AGS Booster. NASA is interested in reproducing the energy spectrum from a solar flare in the space environment for a single ion species. To do this we have built and tested a set of software tools which allow the state of the Booster and the NSRL beam line to be changed automatically. In this report we will describe the system and present results of beam tests.

  7. Operational tests of the BNL 24.8 kW, 3.8 K helium refrigerator

    Science.gov (United States)

    Brown, D. P.; Farah, Y.; Gibbs, R. J.; Schlafke, A. P.; Sondericker, J. H.; Wu, K. C.; Freeman, M.; Ganni, V.; Kowalski, R.; McWilliams, R.

    1985-06-01

    The BNL 24.8 kW refrigeration system is completely installed and major portions of the acceptance tests have been completed. So far, the equipment tested has performed at or above design levels. The room temperature helium compressor station has been completely tested and accepted. The two-stage oil injected screw compressor system exhibited an isothermal efficiency of 57% while delivering a helium flow in excess of 4400 g/s. Data on the performance of the make-up gas cryogenic purifier is given. The refrigerator turbomachinery, 13 expanders and three cold compressors, has been tested at room temperature for mechanical integrity and control stability. The first cooldown to operating temperature will be attempted in late August, 1985.

  8. Perturbative QCD as a probe of hadron structure: Volume 2. Proceedings of RIKEN BNL Research Center workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    The workshop brought together about thirty invited participants from around the world, and an almost equal number of Brookhaven users and staff, to discuss recent developments and future prospects for hadronic strong interaction studies at high energy, particularly relating to the RHIC project at Brookhaven. RIKEN and Brookhaven have long traditions in and commitments to the study of the strong interactions, and the advent of the RHIC collider will open new opportunities both for relativistic heavy ion and polarized proton-proton studies. Activities at the RIKEN BNL Research Center are intended to focus on physics opportunities stimulated by this new facility. Thus, one of the purposes of the center is to provide a forum where workers in the field can gather to share and develop their ideas in a stimulating environment. The purpose of the workshop was both to delineate theoretical problems and stimulate collaborations to address them. The workshop focused primarily, but not exclusively, on spin and small-x physics.

  9. Proceedings of RIKEN BNL Research Center Workshop: Thermal Photons and Dileptons in Heavy-Ion Collisions. Volume 119

    Energy Technology Data Exchange (ETDEWEB)

    David, G. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Rapp, R. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Ruan, L. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Yee, H-U. [Brookhaven National Laboratory (BNL), Upton, NY (United States)

    2014-09-11

    The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkyusho'' (RIKEN, The Institute of Physical and Chemical Research) of Japan and the U. S. Department of Energy’s Office of Science. The RBRC is dedicated to the study of strong interactions, including spin physics, lattice QCD, and RHIC physics through the nurturing of a new generation of young physicists. The RBRC has theory, lattice gauge computing and experimental components. It is presently exploring the possibility of an astrophysics component being added to the program. The primary theme of this workshop was sharing the latest experimental and theoretical developments in the area of low transverse momentum (pT) dielectrons and photons. All the presentations given at the workshop are included in these proceedings, primarily as PowerPoint presentations.

  10. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, HADRON STRUCTURE FROM LATTICE QCD, MARCH 18 - 22, 2002, BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    BLUM, T.; BOER, D.; CREUTZ, M.; OHTA, S.; ORGINOS, K.

    2002-03-18

    The RIKEN BNL Research Center workshop on ''Hadron Structure from Lattice QCD'' was held at BNL during March 11-15, 2002. Hadron structure has been the subject of many theoretical and experimental investigations, with significant success in understanding the building blocks of matter. The nonperturbative nature of QCD, however, has always been an obstacle to deepening our understanding of hadronic physics. Lattice QCD provides the tool to overcome these difficulties and hence a link can be established between the fundamental theory of QCD and hadron phenomenology. Due to the steady progress in improving lattice calculations over the years, comparison with experimentally measured hadronic quantities has become important. In this respect the workshop was especially timely. By providing an opportunity for experts from the lattice and hadron structure communities to present their latest results, the workshop enhanced the exchange of knowledge and ideas. With a total of 32 registered participants and 26 talks, the interest of a growing community is clearly exemplified. At the workshop Schierholz and Negele presented the current status of lattice computations of hadron structure. Substantial progress has been made during recent years now that the quenched results are well under control and the first dynamical results have appeared. In both the dynamical and the quenched simulations the lattice results, extrapolated to lighter quark masses, seem to disagree with experiment. Melnitchouk presented a possible explanation (chiral logs) for this disagreement. It became clear from these discussions that lattice computations at significantly lighter quark masses need to be performed.

  11. Proceedings of RIKEN BNL Research Center Workshop: Understanding QGP through Spectral Functions and Euclidean Correlators (Volume 89)

    Energy Technology Data Exchange (ETDEWEB)

    Mocsy,A.; Petreczky, P.

    2008-06-27

    In the past two decades, one of the most important goals of the nuclear physics community has been the production and characterization of the new state of matter--Quark-Gluon Plasma (QGP). Understanding how properties of hadrons change in medium, particularly the bound state of a very heavy quark and its antiquark, known as quarkonium, as well as determining the transport coefficients, is crucial for identifying the properties of QGP and for the understanding of the experimental data from RHIC. On April 23rd, more than sixty physicists from twenty-seven institutions gathered for this three-day topical workshop held at BNL to discuss how to understand the properties of the new state of matter obtained in ultra-relativistic heavy ion collisions (particularly at RHIC-BNL) through spectral functions. In-medium properties of the different particle species and the transport properties of the medium are encoded in spectral functions. The former could yield important signatures of deconfinement and chiral symmetry restoration at high temperatures and densities, while the latter are crucial for the understanding of the dynamics of ultra-relativistic heavy ion collisions. Participants at the workshop are experts in various areas of spectral function studies. The workshop encouraged direct exchange of scientific information among experts, as well as between the younger and the more established scientists. The workshop's success is evident from the coherent picture that developed of the current understanding of transport properties and in-medium particle properties, illustrated in the current proceedings. The following pages show calculations of meson spectral functions in lattice QCD, as well as implications of these for quarkonia melting/survival in the quark gluon plasma; lattice calculations of the transport coefficients (shear and bulk viscosities, electric conductivity); calculation of spectral functions and transport coefficients in field theories using weak coupling

  12. Comparative Analysis of the Seismic Response of the Main Building of a Thermal Power Plant under Different Layout Arrangements

    Institute of Scientific and Technical Information of China (English)

    张景瑞; 陈雨; 袁国锋

    2011-01-01

    Large thermal power plants are important lifeline engineering projects. The traditional layout of the main power building is the three-bay (triple-connected) arrangement, whereas a newer layout interleaves the turbine hall and the coal bunker bay in a crossed arrangement. Response-spectrum analyses of the two layouts were performed with finite element software, and the inter-story shear forces, inter-story displacements, and inter-story drift angles of the main building were compared under different seismic intensities. The results indicate that, under the different seismic intensities considered, the mass and stiffness of the main building are distributed more uniformly with the crossed arrangement than with the three-bay arrangement, and the seismic performance of the crossed arrangement is correspondingly better.

  13. Summary of the Mini BNL/LARP/CARE-HHH Workshop on Crab Cavities for the LHC (LHC-CC08)

    Energy Technology Data Exchange (ETDEWEB)

    Ben-Zvi,I.; Calaga, R.; Zimmermann, F.

    2008-05-01

    The first mini-workshop on crab compensation for the LHC luminosity upgrade (LHC-CC08) was held February 24-25, 2008 at Brookhaven National Laboratory. A total of 35 participants from 3 continents and 15 institutions around the world came together to discuss the exciting prospect of a crab scheme for the LHC; if realized, it would be the first such demonstration in a hadron collider. The workshop was organized jointly by BNL, US-LARP and CARE-HHH. The enormous interest in crab cavities for the international linear collider and future light sources has resulted in a large international collaboration exchanging expertise and exploiting synergies. A central repository for this exchange of information, documenting the latest design effort for LHC crab cavities, is consolidated in a wiki page: https://twiki.cern.ch/twiki/bin/view/Main/LHCCrabCavities. The main goal of this workshop was to define a road-map for a prototype crab cavity to be installed in the LHC and to discuss the associated R&D and beam dynamics challenges. The diverse subject of implementing the crab scheme resulted in a scientific program with a wide range of subtopics, which were divided into 8 sessions. Each session was given a list of fundamental questions to be addressed and used as a guideline to steer the discussions.

  14. Proceedings of RIKEN BNL Research Center Workshop: The Approach to Equilibrium in Strongly Interacting Matter. Volume 118

    Energy Technology Data Exchange (ETDEWEB)

    Liao, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Venugopalan, R. [Brookhaven National Lab. (BNL), Upton, NY (United States); Berges, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Blaizot, J. -P. [Brookhaven National Lab. (BNL), Upton, NY (United States); Gelis, F. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2014-04-09

    The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkyusho'' (RIKEN, The Institute of Physical and Chemical Research) of Japan and the U. S. Department of Energy’s Office of Science. The RBRC is dedicated to the study of strong interactions, including spin physics, lattice QCD, and RHIC physics through the nurturing of a new generation of young physicists. The RBRC has theory, lattice gauge computing and experimental components. It is presently exploring the possibility of an astrophysics component being added to the program. The purpose of this Workshop is to critically review the recent progress on the theory and phenomenology of early time dynamics in relativistic heavy ion collisions from RHIC to LHC energies, to examine the various approaches on thermalization and existing issues, and to formulate new research efforts for the future. Topics slated to be covered include experimental evidence for equilibration/isotropization, comparison of various approaches, dependence on the initial conditions and couplings, and turbulent cascades and Bose-Einstein condensation.

  15. Open charm meson production at BNL RHIC within $k_{t}$-factorization approach and revision of their semileptonic decays

    CERN Document Server

    Maciula, Rafal; Luszczak, Marta

    2015-01-01

    We discuss inclusive production of open charm mesons in proton-proton scattering at the BNL RHIC. The calculation is performed in the framework of the $k_t$-factorization approach, which effectively includes higher-order pQCD corrections. Different models of unintegrated gluon distributions (UGDF) from the literature are used. We focus on UGDF models favoured by the LHC data and on new, up-to-date parametrizations based on the HERA collider DIS high-precision data. Results of the $k_t$-factorization approach are compared to next-to-leading order collinear predictions. The hadronization of heavy quarks is done by means of the fragmentation function technique. The theoretical transverse momentum distributions of charmed mesons are compared with recent experimental data of the STAR collaboration at $\sqrt{s} = 200$ and $500$ GeV. Theoretical uncertainties related to the choice of renormalization and factorization scales as well as due to the quark mass are discussed. Very good description of the measured integrated cros...

  16. The Intermodulation Lockin Analyzer

    CERN Document Server

    Tholen, Erik A; Forchheimer, Daniel; Schuler, Vivien; Tholen, Mats O; Hutter, Carsten; Haviland, David B

    2011-01-01

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lock-in analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback and stability in operation. The use of the analyzer is demonstrated for Intermodulation Atomic Force Microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.
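
    As an illustration of the two-tone intermodulation measurement described above, the following sketch drives a toy cubic nonlinearity with two pure tones and demodulates the response at the third-order intermodulation frequencies. It assumes only numpy; it is not the FPGA lock-in instrument of the paper, and the sample rate, tone frequencies and nonlinearity strength are invented for the example.

```python
import numpy as np

# Illustrative only: two pure drive tones through a toy cubic nonlinearity,
# followed by lock-in style demodulation at the third-order intermodulation
# frequencies 2*f1 - f2 and 2*f2 - f1. All numbers are invented.
fs = 1_000_000                      # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)       # 0.1 s record
f1, f2 = 10_000.0, 10_500.0         # drive tone frequencies, Hz

drive = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
response = drive + 0.05 * drive**3  # weak cubic nonlinearity

def lockin_amplitude(signal, freq, t):
    """Magnitude of the single-frequency (lock-in) component of `signal`."""
    ref = np.exp(-2j * np.pi * freq * t)
    return abs(2 * np.mean(signal * ref))

for f in (f1, f2, 2 * f1 - f2, 2 * f2 - f1):
    print(f"{f:9.1f} Hz : {lockin_amplitude(response, f, t):.4f}")
```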

  17. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...
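
    The quantity being measured here, the rate of occurrence of a discrete coincidence event such as a threshold crossing, can be illustrated with a few lines of code. The sketch below counts upward threshold crossings digitally and assumes numpy; the paper's point is that the same rate can be produced continuously by analog circuitry, which this example does not attempt to reproduce.

```python
import numpy as np

# Illustrative only: the rate of a simple coincidence event -- an upward
# crossing of a fixed threshold -- estimated digitally from a sampled record.
def upward_crossing_rate(x, threshold, fs):
    """Return crossings per second where x rises through `threshold`."""
    above = x >= threshold
    crossings = np.count_nonzero(~above[:-1] & above[1:])
    return crossings * fs / x.size

rng = np.random.default_rng(0)
fs = 10_000.0                              # sample rate, Hz (invented)
x = rng.standard_normal(100_000)           # toy noise record
print(upward_crossing_rate(x, 1.0, fs), "crossings per second")
```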

  18. Analyzing binding data.

    Science.gov (United States)

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites and on the affinity and accessibility of these sites for various drugs. This unit explains how to design and analyze such experiments.

  19. Analyzing in the Present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Pedersen, Lene Tanggaard

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts...... of various interviews conveyed diverse significance to the listening researcher at different times became a method of continuously opening up the empirical material in a reflexive, breakdown-oriented process of analysis. We argue that situating analysis in the present of analyzing emphasizes and acknowledges...... the interdependency between researcher and researched. On this basis, we advocate an explicit “open-state-of mind” listening as a key aspect of analyzing qualitative material, often described only as a matter of reading transcribed empirical materials, reading theory, and writing. The article contributes...

  20. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, in situ analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can all be successfully shrunk; for each of them some performance is sacrificed, but one must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  1. Analyzing Microarray Data.

    Science.gov (United States)

    Hung, Jui-Hung; Weng, Zhiping

    2017-03-01

    Because there is no widely used software for analyzing RNA-seq data that has a graphical user interface, this protocol provides an example of analyzing microarray data using Babelomics. This analysis entails performing quantile normalization and then detecting differentially expressed genes associated with the transgenesis of a human oncogene c-Myc in mice. Finally, hierarchical clustering is performed on the differentially expressed genes using the Cluster program, and the results are visualized using TreeView.
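
    The normalization step mentioned in this protocol can be illustrated in isolation. The sketch below is a minimal quantile normalization of a genes-by-samples expression matrix, assuming numpy; it stands in for the corresponding step in Babelomics and is not that tool's implementation (ties are broken by sort order for brevity).

```python
import numpy as np

# Illustrative only: quantile normalization of a genes-by-samples matrix,
# forcing every sample (column) to share one reference distribution.
def quantile_normalize(expr):
    order = np.argsort(expr, axis=0)                 # per-sample ranking
    reference = np.sort(expr, axis=0).mean(axis=1)   # mean value at each rank
    out = np.empty_like(expr, dtype=float)
    for j in range(expr.shape[1]):
        out[order[:, j], j] = reference              # map ranks back to genes
    return out

expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])                   # 4 genes x 3 samples (toy)
print(quantile_normalize(expr))
```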

  2. Human Systems Interface and Plant Modernization Process: Technical Basis and Human Factors Review Guidance

    Science.gov (United States)

    2000-03-01

    NUREG/CR-6637, BNL-NUREG-52567: Human Systems Interface and Plant Modernization Process: Technical Basis and Human Factors Review Guidance, Brookhaven National Laboratory (NRC NUREG-series report).

  3. HOMs of the SRF Electron Gun Cavity in the BNL ERL

    Science.gov (United States)

    Hahn, H.; Ben-Zvi, I.; Belomestnykh, S.; Hammons, L.; Litvinenko, V.; Than, Y. R.; Todd, R.; Weiss, D.; Xu, Wencan

    The Brookhaven Energy Recovery Linac (ERL) is operated as an R&D test bed for high-current, low emittance electron beams. It comprises a superconducting five-cell cavity and a half-cell superconducting RF photo-injector electron gun. The ERL is undergoing commissioning with focus on the performance of the electron gun, not the least on the cavity Higher Order Modes (HOM). Among the various alternative solutions, a beam tube damper based on a layer of ferrite tiles was adopted for the five-cell accelerator cavity. For the gun, a ceramic-ferrite damper consisting of a lossless ceramic cylinder surrounded by damping ferrite tiles has been investigated. This design is innovative in its damper approach and combines a variety of goals including broadband HOM damping and protection of the superconducting cavity vacuum from potential damage by the separately cooled absorber. In this paper the empirical performance of an installed ceramic-ferrite damper is described by the Q reduction of a few selected gun cavity resonances. The theoretical coupling impedance presented to a traversing beam is numerically analyzed in terms of radial waveguide modes in the damper section. Strong damping of the gun cavity HOMs by the fundamental power coupler (FPC) is found and discussed. Finally, the measured Q-values of the operational gun cavity without the ceramic-ferrite damper at superconducting temperatures are presented

  4. Total organic carbon analyzer

    Science.gov (United States)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  5. Advances in hematology analyzers.

    Science.gov (United States)

    DeNicola, Dennis B

    2011-05-01

    The complete blood count is one of the basic building blocks of the minimum database in veterinary medicine. Over the past 20 years, there has been a tremendous advancement in the technology of hematology analyzers and their availability to the general practitioner. There are 4 basic methodologies that can be used to generate data for a complete blood count: manual methods, quantitative buffy coat analysis, automated impedance analysis, and flow cytometric analysis. This article will review the principles of these methodologies, discuss some of their advantages and disadvantages, and describe some of the hematology analyzers that are available for the in-house veterinary laboratory.

  6. Analyzing radioligand binding data.

    Science.gov (United States)

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.
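
    As a minimal example of the nonlinear-regression curve fitting this unit describes, the sketch below fits a one-site saturation binding model, B = Bmax*L/(Kd + L), to invented data using scipy; the concentrations, counts and starting values are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative only: nonlinear-regression fit of a one-site saturation
# binding model, B = Bmax * L / (Kd + L). Data and starting values invented.
def one_site(conc, bmax, kd):
    return bmax * conc / (kd + conc)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])            # radioligand, nM
bound = np.array([55.0, 160.0, 390.0, 640.0, 820.0, 900.0])  # specific, cpm

(bmax, kd), _ = curve_fit(one_site, conc, bound, p0=(1000.0, 3.0))
print(f"Bmax ~ {bmax:.0f} cpm, Kd ~ {kd:.2f} nM")
```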

  7. Analyzing Workforce Education. Monograph.

    Science.gov (United States)

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  8. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  9. Targeted Alpha Therapy: The US DOE Tri-Lab (ORNL, BNL, LANL) Research Effort to Provide Accelerator-Produced 225Ac for Radiotherapy

    Science.gov (United States)

    John, Kevin

    2017-01-01

    Targeted radiotherapy is an emerging discipline of cancer therapy that exploits the biochemical differences between normal cells and cancer cells to selectively deliver a lethal dose of radiation to cancer cells, while leaving healthy cells relatively unperturbed. A broad overview of targeted alpha therapy, including isotope production methods and associated isotope production facility needs, will be provided. A more general overview of the US Department of Energy Isotope Program's Tri-Lab (ORNL, BNL, LANL) Research Effort to Provide Accelerator-Produced 225Ac for Radiotherapy will also be presented, focusing on the accelerator production of 225Ac and final-product isolation methodologies for medical applications.

  10. Magnetoresistive emulsion analyzer.

    Science.gov (United States)

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening.

  11. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and applies hereafter advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility...... and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ....

  12. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the emergence of the next generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, the new 128-bit IP includes other new features such as address autoconfiguration, quality of service, simpler routing, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module that decodes an IPv6 packet and provides a detailed breakdown of its construction. It has to understand the detailed structure of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. It thus increases network administrators' understanding of the network protocol and helps them solve protocol-related problems in an IPv6 network environment.
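
    The core task of such an analyzer, decoding the fixed 40-byte IPv6 header into its fields, can be sketched briefly. The example below uses only the Python standard library and is an illustrative decoder, not the module described in the paper; extension headers and payload dissection are omitted.

```python
import socket
import struct

# Illustrative only: decode the 40-byte fixed IPv6 header. Extension-header
# chains and payload dissection are omitted.
def decode_ipv6_header(packet: bytes) -> dict:
    ver_tc_flow, payload_len, next_header, hop_limit = struct.unpack(
        "!IHBB", packet[:8])
    return {
        "version": ver_tc_flow >> 28,
        "traffic_class": (ver_tc_flow >> 20) & 0xFF,
        "flow_label": ver_tc_flow & 0xFFFFF,
        "payload_length": payload_len,
        "next_header": next_header,       # e.g. 6 = TCP, 17 = UDP
        "hop_limit": hop_limit,
        "source": socket.inet_ntop(socket.AF_INET6, packet[8:24]),
        "destination": socket.inet_ntop(socket.AF_INET6, packet[24:40]),
    }

# Build a minimal header by hand (version 6, UDP payload, hop limit 64).
header = (struct.pack("!IHBB", (6 << 28) | 0x12345, 0, 17, 64)
          + socket.inet_pton(socket.AF_INET6, "2001:db8::1")
          + socket.inet_pton(socket.AF_INET6, "2001:db8::2"))
print(decode_ipv6_header(header))
```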

  13. Proceedings of RIKEN BNL Research Center Workshop: P- and CP-odd Effects in Hot and Dense Matter

    Energy Technology Data Exchange (ETDEWEB)

    Deshpande, A.; Fukushima, K.; Kharzeev, D.; Warringa, H.; Voloshin, S.

    2010-04-26

    This volume contains the proceedings of the RBRC/CATHIE workshop on 'P- and CP-odd Effects in Hot and Dense Matter' held at the RIKEN-BNL Research Center on April 26-30, 2010. The workshop was triggered by the experimental observation of charge correlations in heavy ion collisions at RHIC, which were predicted to occur due to local parity violation (P- and CP-odd fluctuations) in hot and dense QCD matter. This experimental result excited significant interest in the broad physics community, inspired a few alternative interpretations, and emphasized the need for a deeper understanding of the role of topology in QCD vacuum and in hot and dense quark-gluon matter. Topological effects in QCD are also closely related to a number of intriguing problems in condensed matter physics, cosmology and astrophysics. We therefore felt that a broad cross-disciplinary discussion of topological P- and CP-odd effects in various kinds of matter was urgently needed. Such a discussion became the subject of the workshop. Specific topics discussed at the workshop include the following: (1) The current experimental results on charge asymmetries at RHIC and the physical interpretations of the data; (2) Quantitative characterization of topological effects in QCD matter including both analytical (perturbative and non-perturbative using gauge/gravity duality) and numerical (lattice-QCD) calculations; (3) Topological effects in cosmology of the Early Universe (including baryogenesis and dark energy); (4) Topological effects in condensed matter physics (including graphene and superfluids); and (5) Directions for the future experimental studies of P- and CP-odd effects at RHIC and elsewhere. We feel that the talks and intense discussions during the workshop were extremely useful, and resulted in new ideas in both theory and experiment. We hope that the workshop has contributed to the progress in understanding the role of topology in QCD and related fields. We thank all the speakers and

  14. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP, using fixed assets as an example.

  15. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    financial statement. Plumlee (2003) finds for instance that such information imposes significant costs on even expert users such as analysts and fund managers and reduces their use of it. Analysts’ ability to incorporate complex information in their analyses is a decreasing function of its complexity......, because the costs of processing and analyzing it exceed the benefits indicating bounded rationality. Hutton (2002) concludes that the analyst community’s inability to raise important questions on quality of management and the viability of its business model inevitably led to the Enron debacle. There seems...

  16. Mineral/Water Analyzer

    Science.gov (United States)

    1983-01-01

    An x-ray fluorescence spectrometer developed for the Viking Landers by Martin Marietta was modified for geological exploration, water quality monitoring, and aircraft engine maintenance. The aerospace system was highly miniaturized and used very little power. It irradiates the sample causing it to emit x-rays at various energies, then measures the energy levels for sample composition analysis. It was used in oceanographic applications and modified to identify element concentrations in ore samples, on site. The instrument can also analyze the chemical content of water, and detect the sudden development of excessive engine wear.

  17. Analyzing Aeroelasticity in Turbomachines

    Science.gov (United States)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

    ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2- LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  18. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step to make an autonomous deployable instrument. We perform sample clean up and concentration in a flow through packed bed. For small initial samples, whole genome amplification is performed in the packed bed resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we determined to learn if the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof of principle assay.

  19. Analyzing the platelet proteome.

    Science.gov (United States)

    García, Angel; Zitzmann, Nicole; Watson, Steve P

    2004-08-01

    During the last 10 years, mass spectrometry (MS) has become a key tool for protein analysis and has underpinned the emerging field of proteomics. Using high-throughput tandem MS/MS following protein separation, it is potentially possible to analyze hundreds to thousands of proteins in a sample at a time. This technology can be used to analyze the protein content (i.e., the proteome) of any cell or tissue and complements the powerful field of genomics. The technology is particularly suitable for platelets because of the absence of a nucleus. Cellular proteins can be separated by either gel-based methods such as two-dimensional gel electrophoresis or one-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis followed by liquid chromatography (LC) -MS/MS or by multidimensional LC-MS/MS. Prefractionation techniques, such as subcellular fractionations or immunoprecipitations, can be used to improve the analysis. Each method has particular advantages and disadvantages. Proteomics can be used to compare the proteome of basal and diseased platelets, helping to reveal information on the molecular basis of the disease.

  20. BNL future plans

    Energy Technology Data Exchange (ETDEWEB)

    Littenberg, L.

    1998-01-01

    In 1999, after almost 40 years of independent existence, the Brookhaven Alternating Gradient Synchrotron (AGS) is scheduled to be pressed into service as an injector to the Relativistic Heavy Ion Collider (RHIC). Although at first sight this seems like the end of an era, in actuality it represents a very attractive new opportunity. For the AGS is actually needed by RHIC for only a few hours per day. The balance of the time, it is available for extracted proton beam work at a very small incremental cost. This represents the reverse of the current situation, in which the nuclear physics program gets access to the AGS (for fixed target heavy ion experiments) at incremental cost, while the base cost of maintaining the accelerator is borne by the high energy physics program. Retaining the AGS for particle physics work would broaden the US HEP program considerably, allowing continued exploitation of the world's most intense source of medium energy protons. High energy possibilities include incisive probes of Standard Model and non-SM CP-violation, and of low energy manifestations of supersymmetry.

  1. Analyzing architecture articles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In the present study, we describe the qualities, functions, and characteristics of architecture to help people comprehensively understand what architecture is. We also examine the problems and conflicts found in population, land, water resources, pollution, energy, and the organizational systems of construction. China's economy is transforming. We should focus on cities, the architectural environment, energy conservation, emission reduction, and low-carbon output that will lead to successful green development. We should analyze this development both macroscopically and microscopically, from the natural environment to the artificial environment, and from the relationship between human beings and nature to the combination of social ecology in cities and farmlands. We must learn to develop and control them harmoniously and scientifically to provide a foundation for the methods used in architecture research.

  2. Analyzing geographic clustered response

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and to refine the algorithm. 21 refs., 15 figs., 2 tabs.

  3. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software PDA (Pooled DNA Analyzer) to analyze pooled DNA data. Results We developed the software, PDA, for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
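
    PDA itself is a MATLAB-based package and its routines are not reproduced in this record. Purely as an illustration of the two ingredients the abstract names, a commonly used correction for preferential amplification and a Fisher-style sliding-window p-value combination, a minimal sketch is given below; the function names, the heterozygote-calibration scheme and all numbers are assumptions for the example, not PDA's actual implementation.

```python
"""Illustrative pooled-DNA calculations (not PDA's own code).

Assumes peak heights h_A, h_B for the two alleles, measured both in
individual heterozygotes (to calibrate preferential amplification)
and in the DNA pool.
"""
import numpy as np
from scipy import stats

def preferential_amplification(het_hA, het_hB):
    """Estimate k, the mean A/B signal ratio observed in known heterozygotes."""
    return float(np.mean(np.asarray(het_hA) / np.asarray(het_hB)))

def pooled_allele_freq(pool_hA, pool_hB, k):
    """Corrected frequency of allele A in the pool: hA / (hA + k*hB)."""
    return pool_hA / (pool_hA + k * pool_hB)

def sliding_window_fisher(pvalues, window=5):
    """Combine single-marker p-values in sliding windows (Fisher's method)."""
    p = np.asarray(pvalues, dtype=float)
    combined = []
    for start in range(len(p) - window + 1):
        chi2 = -2.0 * np.sum(np.log(p[start:start + window]))
        combined.append(stats.chi2.sf(chi2, df=2 * window))
    return np.array(combined)

# Example: heterozygote calibration, then a pooled-frequency estimate and
# a window-combined scan over single-marker p-values (all numbers invented).
k = preferential_amplification([1.20, 1.15, 1.25], [1.00, 1.00, 1.00])
print(pooled_allele_freq(55.0, 45.0, k))          # corrected frequency of allele A
print(sliding_window_fisher([0.04, 0.20, 0.01, 0.33, 0.05, 0.60], window=3))
```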

  4. Assessing the levels of food shortage using the traffic light metaphor by analyzing the gathering and consumption of wild food plants, crop parts and crop residues in Konso, Ethiopia

    Directory of Open Access Journals (Sweden)

    Ocho Dechassa

    2012-08-01

    Background Humanitarian relief agencies use scales to assess levels of critical food shortage to efficiently target and allocate food to the neediest. These scales are often labor-intensive. A lesser used approach is assessing the gathering and consumption of wild food plants. This gathering per se is not a reliable signal of emerging food stress. However, the gathering and consumption of some specific plant species could be considered markers of food shortage, as it indicates that people are compelled to eat very poor or even health-threatening food. Methods We used the traffic light metaphor to indicate normal (green), alarmingly low (amber) and fully depleted (red) food supplies and identified these conditions for Konso (Ethiopia) on the basis of wild food plants (WFPs), crop parts not used for human consumption under normal conditions (CPs), and crop residues (CRs) being gathered and consumed. Plant specimens were collected for expert identification and deposition in the National Herbarium. Two hundred twenty individual households free-listed WFPs, CPs, and CRs gathered and consumed during times of food stress. Through focus group discussions, the species list from the free-listing, further enriched through key informant interviews and our own field observations, was categorized into species used for green, amber and red conditions. Results The study identified 113 WFPs (120 products/food items) whose gathering and consumption reflect the three traffic light metaphors: red, amber and green. We identified 25 food items for the red, 30 food items for the amber and 65 food items for the green metaphor. We also obtained reliable information on 21 different products/food items (from 17 crops) normally not consumed as food, reflecting the red or amber metaphor, and 10 crop residues (from various crops), plus one recycled item, which are used as emergency foods in the study area, clearly indicating the severity of food stress (red metaphor

  5. TEAMS Model Analyzer

    Science.gov (United States)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time each TEAMS modeler must spend on the manual preparation of reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  6. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components (light sources, sensors, detection electronics and software) will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a standard error of less than 5 percent. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  7. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can readily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
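
    MMTAT's internal models and API are not described in this record, so the sketch below is not its actual interface; it only illustrates, under stated assumptions, the kind of link-budget arithmetic such a tool automates (free-space path loss plus an Eb/N0 estimate), with every parameter value invented for the example.

```python
import math

def free_space_path_loss_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB for a distance in km and a frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

def eb_n0_db(eirp_dbw: float, fspl_db: float, rx_gain_db: float,
             sys_temp_k: float, bit_rate_bps: float, misc_loss_db: float = 0.0) -> float:
    """Received Eb/N0 in dB: Pr + 228.6 - 10*log10(Tsys) - 10*log10(R)."""
    pr_dbw = eirp_dbw - fspl_db + rx_gain_db - misc_loss_db
    return pr_dbw + 228.6 - 10 * math.log10(sys_temp_k) - 10 * math.log10(bit_rate_bps)

# Example with made-up numbers for a long-range link.
fspl = free_space_path_loss_db(distance_km=2.0e8, freq_ghz=8.4)
ebn0 = eb_n0_db(eirp_dbw=60.0, fspl_db=fspl, rx_gain_db=74.0,
                sys_temp_k=25.0, bit_rate_bps=2000.0, misc_loss_db=2.0)
print(f"FSPL = {fspl:.1f} dB, Eb/N0 = {ebn0:.1f} dB")  # compare against required Eb/N0 plus margin
```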

  8. Bios data analyzer.

    Science.gov (United States)

    Sabelli, H; Sugerman, A; Kovacevic, L; Kauffman, L; Carlson-Sabelli, L; Patel, M; Konecki, J

    2005-10-01

    The Bios Data Analyzer (BDA) is a set of computer programs (CD-ROM, in Sabelli et al., Bios. A Study of Creation, 2005) for new time series analyses that detect and measure creative phenomena, namely diversification, novelty, complexes, and nonrandom complexity. We define a process as creative when its time series displays these properties. They are found in heartbeat interval series, the exemplar of bios, just as turbulence is the exemplar of chaos; in many other empirical series (galactic distributions, meteorological, economic and physiological series); in biotic series generated mathematically by the bipolar feedback; and in stochastic noise, but not in chaotic attractors. Differencing, consecutive recurrence and partial autocorrelation indicate nonrandom causation, thereby distinguishing chaos and bios from random and random walk. Embedding plots distinguish causal creative processes (e.g. bios) that include both simple and complex components of variation from stochastic processes (e.g. Brownian noise) that include only complex components, and from chaotic processes that decay from order to randomness as the number of dimensions is increased. Varying bin size and dimensionality shows that entropy measures symmetry and variety, and that complexity is associated with asymmetry. Trigonometric transformations measure coexisting opposites in time series and demonstrate bipolar, partial, and uncorrelated opposites in empirical processes and bios, supporting the hypothesis that bios is generated by bipolar feedback, a concept which is at variance with standard concepts of polar and complementary opposites.
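
    The BDA programs ship on the book's CD-ROM and are not reproduced here. As a generic illustration only of two ingredients the abstract names, differencing and partial autocorrelation, a minimal sketch applied to an arbitrary toy series is shown below; the statsmodels call and the random-walk stand-in data are assumptions for the example, not the BDA's code.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

# Toy series standing in for, e.g., a heartbeat-interval record.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=2000))       # a random walk, for illustration

dx = np.diff(x)                            # first differencing
pacf_raw = pacf(x, nlags=20)               # partial autocorrelation of the raw series
pacf_diff = pacf(dx, nlags=20)             # ... and of the differenced series

# Nonrandom causation would show up as PACF values well outside the
# roughly +/- 2/sqrt(N) band expected for an uncorrelated series.
band = 2.0 / np.sqrt(len(dx))
print(np.where(np.abs(pacf_diff[1:]) > band)[0] + 1)   # lags with notable PACF
```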

  9. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip, and a system including the chip, that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  10. An intergenic region shared by At4g35985 and At4g35987 in Arabidopsis thaliana is a tissue specific and stress inducible bidirectional promoter analyzed in transgenic arabidopsis and tobacco plants.

    Directory of Open Access Journals (Sweden)

    Joydeep Banerjee

    On chromosome 4 in the Arabidopsis genome, two neighboring genes (calmodulin methyl transferase At4g35987 and senescence associated gene At4g35985) are located in a head-to-head divergent orientation sharing a putative bidirectional promoter. This 1258 bp intergenic region contains a number of environmental stress responsive and tissue specific cis-regulatory elements. Transcript analysis of the At4g35985 and At4g35987 genes by quantitative real-time PCR showed tissue specific and stress inducible expression profiles. We tested the bidirectional promoter function of the intergenic region shared by the divergent genes At4g35985 and At4g35987 using two reporter genes (GFP and GUS) in both orientations in transient tobacco protoplast and Agro-infiltration assays, as well as in stably transformed transgenic Arabidopsis and tobacco plants. In transient assays with GFP and GUS reporter genes, the At4g35985 promoter (P85) showed stronger expression (about 3.5-fold) compared to the At4g35987 promoter (P87). The tissue specific as well as stress responsive functional nature of the bidirectional promoter was evaluated in independent transgenic Arabidopsis and tobacco lines. Expression of P85 activity was detected in the midrib of leaves, leaf trichomes, apical meristematic regions, throughout the root, lateral roots and flowers. The expression of P87 was observed in the leaf tip, hydathodes, apical meristem, root tips, emerging lateral root tips, the root stele region and in floral tissues. The bidirectional promoter in both orientations shows differential up-regulation (2.5 to 3 fold) under salt stress. The use of such regulatory elements of bidirectional promoters showing spatial and stress inducible promoter functions in heterologous systems might be an important tool for plant biotechnology and gene stacking applications.

  11. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the content of the received data stream in relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
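
    The SDA's correlators are implemented in hardware and firmware and are not described at code level in this record. The sketch below is only a software illustration of the general idea of sliding a reference pattern over hard-limited soft decisions to locate a bit slip within a small window; the function, window size and data are invented and do not reproduce the SDA's power or modified-Massey correlators.

```python
import numpy as np

def detect_slip(received_soft, reference_bits, max_slip=4):
    """Return the offset in [-max_slip, +max_slip] that best aligns the received
    soft decisions with the reference bit stream, using a simple sign correlation."""
    rx = np.sign(np.asarray(received_soft, float))   # hard-limit the soft decisions
    ref = 2 * np.asarray(reference_bits) - 1         # map {0,1} -> {-1,+1}
    n = min(len(rx), len(ref)) - 2 * max_slip        # usable correlation depth
    scores = {}
    for off in range(-max_slip, max_slip + 1):
        a = rx[max_slip + off: max_slip + off + n]
        scores[off] = float(np.dot(a, ref[max_slip: max_slip + n])) / n
    return max(scores, key=scores.get), scores

# Example: the received stream is the reference delayed by 2 bits plus noise.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 512)
soft = np.roll(2 * bits - 1, 2).astype(float) + 0.5 * rng.normal(size=512)
best, _ = detect_slip(soft, bits)
print("detected slip:", best)                        # expect 2
```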

  12. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  13. Microwave measurements and beam dynamics simulations of the BNL/SLAC/UCLA emittance-compensated 1.6-cell photocathode rf gun

    Science.gov (United States)

    Palmer, Dennis T.; Miller, Roger H.; Winick, Herman; Wang, Xi J.; Batchelor, Kenneth; Woodle, Martin H.; Ben-Zvi, Ilan

    1995-09-01

    A dedicated low energy (2 to 10 MeV) experimental beam line is now under construction at Brookhaven National Laboratory/Accelerator Test Facility (BNL/ATF) for photocathode RF gun testing and photoemission experiments. Microwave measurements of the 1.6-cell photocathode RF gun have been conducted along with beam dynamics simulations of the emittance-compensated low energy beam. These simulations indicate that the 1.6-cell photocathode RF gun in combination with solenoidal emittance compensation will be capable of producing a high brightness beam with a normalized rms emittance of εn,rms ≈ 1 π mm mrad. The longitudinal accelerating field Ez has been measured as a function of azimuthal angle in the full cell of the cold test model for the 1.6-cell BNL/SLAC/UCLA #3 S-band RF gun using a needle rotation/frequency perturbation technique. These measurements were conducted before and after symmetrizing the full cell with a vacuum pump-out port and an adjustable short. Two different waveguide-to-full-cell coupling schemes were studied. Experimental and theoretical studies of the field balance versus mode separation were conducted. The dipole mode of the full cell using the θ-coupling scheme is an order of magnitude less severe before symmetrization than with the Z-coupling scheme. The multipole contributions to the longitudinal field asymmetry are calculated using standard Fourier series techniques for both coupling schemes. The Panofsky-Wenzel theorem is used in estimating the transverse emittance due to the multipole components of Ez. Detailed beam dynamics simulations were performed for the 1.6-cell photocathode RF gun injector using a solenoidal emittance compensation technique. The design of the experimental line along with a proposed experimental program using the 1.6-cell photocathode RF gun developed by the BNL/SLAC/UCLA RF gun collaboration is presented. This experimental program includes measurements of beam loading caused
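
    The needle-perturbation data reduction itself is not given in this record. Purely as an illustration of the "standard Fourier series techniques" mentioned above, the sketch below extracts multipole (Fourier) components from longitudinal-field samples Ez(θ) taken around the azimuth; the sample data and the least-squares formulation are invented for the example.

```python
import numpy as np

def multipole_components(theta, ez, max_order=4):
    """Least-squares fit Ez(theta) = a0 + sum_m [a_m*cos(m*theta) + b_m*sin(m*theta)].
    Returns the coefficient array; the magnitude of (a_1, b_1) is the dipole term,
    (a_2, b_2) the quadrupole term, and so on."""
    cols = [np.ones_like(theta)]
    for m in range(1, max_order + 1):
        cols += [np.cos(m * theta), np.sin(m * theta)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, ez, rcond=None)
    return coeffs

# Invented example: a mostly uniform field with a small dipole perturbation.
theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
ez = 1.0 + 0.02 * np.cos(theta - 0.3)
c = multipole_components(theta, ez)
dipole_amplitude = np.hypot(c[1], c[2])
print(f"monopole ~ {c[0]:.3f}, dipole ~ {dipole_amplitude:.3f}")   # about 1.000 and 0.020
```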

  14. Analyzing Agricultural Agglomeration in China

    Directory of Open Access Journals (Sweden)

    Erling Li

    2017-02-01

    There has been little scholarly research on Chinese agriculture's geographic pattern of agglomeration and its evolutionary mechanisms, which are essential to sustainable development in China. By calculating barycenter coordinates, the Gini coefficient, spatial autocorrelation and specialization indices for 11 crops during 1981–2012, we analyze the evolutionary pattern and mechanisms of agricultural agglomeration. We argue that the degree of spatial concentration of Chinese planting has been gradually increasing and that regional specialization and diversification have progressively been strengthened. Furthermore, Chinese crop production is moving from the eastern provinces to the central and western provinces. This is in contrast to Chinese manufacturing growth, which has continued to be concentrated in the coastal and southeastern regions. In Northeast China, the Sanjiang and Songnen plains have become agricultural clustering regions, and the earlier domination of aquaculture and rice production in Southeast China has gradually decreased. In summary, this paper provides a political economy framework for understanding the regionalization of Chinese agriculture, focusing on the interaction among objectives, decision-making behavior, path dependencies and spatial effects.
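
    The paper's indices are not reproduced in this record; the snippet below is only a generic illustration of two of the measures named in the abstract, an output-weighted production barycenter and a Gini-style concentration coefficient over provincial output shares, with the coordinates and outputs invented.

```python
import numpy as np

def barycenter(lon, lat, output):
    """Output-weighted mean location of production."""
    w = np.asarray(output, dtype=float)
    w = w / w.sum()
    return float(np.dot(w, lon)), float(np.dot(w, lat))

def gini(output):
    """Gini coefficient of provincial output shares (0 = even, 1 = fully concentrated)."""
    x = np.sort(np.asarray(output, dtype=float))
    n = len(x)
    cum = np.cumsum(x)
    return float((n + 1 - 2 * np.sum(cum) / cum[-1]) / n)

# Invented provincial data: (longitude, latitude, crop output).
lon = np.array([126.6, 125.3, 113.6, 103.8])
lat = np.array([45.8, 43.9, 34.8, 36.1])
out = np.array([30.0, 25.0, 20.0, 5.0])
print(barycenter(lon, lat, out))   # production barycenter (lon, lat)
print(gini(out))                   # degree of spatial concentration
```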

  15. (BNL/DoE-hyped) ``Self-Organized-Criticality'' (SOC) is Merely Newton's(1687) Third Law of Motion F = ma REdiscovery: LONG PRE-``Bak''!!!

    Science.gov (United States)

    Bak, P. R. E.; Newton, I.; Siegel, Edward Carl-Ludwig

    2011-03-01

    "Bak"/BNL/DoE "self-organized-criticality"(SOC) usual BNL/DoE media-hype P.R spin-doctoring "show-biz" "Bush-waaa-...-aaah!!!" is manifestly-demonstrated in two distinct ways to be nothing but Newton's Third Law of Motion F = ma REdiscovery!!! PHYSICS: (1687) cross-multiplied F = ma rewritten as 1/m = a/F = OUTPUT/INPUT = EFFECT/ CAUSE = inverse-mass mechanical-susceptibility = X ("w "); X ("w ") (F.-D. theorem-equivalence /proportionality) P("w ") "noise" power-spectrum; E w ; and E (any/all media upper-limiting-speeds) m. Thus: w E m; inversion yields: 1/w 1 /E 1 /m a/F = X ("w ") P("w "); hence: F = ma dual/inverse-integral-transform is "'SOC"'s" P(w) 1 / w (1) !!! ; "PURE"-MATHS: F = ma double-integral time-series s(t) = [vot + (1/2) at (2) ] inverse/dual-integral-transform formally defines power-spectrum: P (w) = S { s (t) e [ - (iORnoi) wt ] } dt = S { [ vot + (1 / 2) at 2) ] e [ - (iORnoi) wt ] } dt = voS { te [ - (iORnoi) wt ] } dt + (1 / 2) S { [ a = / = a (t) ] e [ - (iORnoi) wt) } dt = vo (d / dw) Delta (w) + (1 / 2) [ a = / = a (t) ] (d / dw) (2) Delta (w) = vo / w (0) + (1 / 2) [ a = / = a (t) ] / w 1 : ifa = 0 , then P(w) 1 / w 0 , VS . ifa = / = a (t) = / = 0 , then P(w) 1 /w; = by physics: ``SOC'' RE-expresses F = ma!!!: ``just `a tad' late/tardy'' REdiscovery of F=ma: LONG PRE-"Bak"!!!

  16. Tropical Ocean Climate Study (TOCS) and Japan-United States Tropical Ocean Study (JUSTOS) on the R/V KAIYO, 25 Jan to 2 March 1997, to the Tropical Western Pacific Ocean BNL component

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, R.M.; Smith, S.

    1997-04-11

    The Japanese-U.S. Tropical Ocean Study (JUSTOS) cruise on the R/V KAIYO in the Tropical Western Pacific Ocean was a collaborative effort with participants from the Japan Marine Science and Technology Center (JAMSTEC), the National Center for Atmospheric Research (NCAR), and Brookhaven National Laboratory (BNL). This report is a summary of the instruments, measurements, and initial analysis of the BNL portion of the cruise only. It includes a brief description of the instrument system, calibration procedures, problems and resolutions, data collection, processing, and data file descriptions. This is a working document, which is meant to provide both a good description of the work and as much information as possible in one place for future analysis.

  17. Soft Decision Analyzer and Method

    Science.gov (United States)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2016-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  18. Beam-energy dependence of charge balance functions from Au + Au collisions at energies available at the BNL Relativistic Heavy Ion Collider

    Science.gov (United States)

    Adamczyk, L.; Adkins, J. K.; Agakishiev, G.; Aggarwal, M. M.; Ahammed, Z.; Alekseev, I.; Alford, J.; Aparin, A.; Arkhipkin, D.; Aschenauer, E. C.; Averichev, G. S.; Banerjee, A.; Bellwied, R.; Bhasin, A.; Bhati, A. K.; Bhattarai, P.; Bielcik, J.; Bielcikova, J.; Bland, L. C.; Bordyuzhin, I. G.; Bouchet, J.; Brandin, A. V.; Bunzarov, I.; Burton, T. P.; Butterworth, J.; Caines, H.; Calderón de la Barca Sánchez, M.; Campbell, J. M.; Cebra, D.; Cervantes, M. C.; Chakaberia, I.; Chaloupka, P.; Chang, Z.; Chattopadhyay, S.; Chen, J. H.; Chen, H. F.; Cheng, J.; Cherney, M.; Christie, W.; Codrington, M. J. M.; Contin, G.; Crawford, H. J.; Cui, X.; Das, S.; De Silva, L. C.; Debbe, R. R.; Dedovich, T. G.; Deng, J.; Derevschikov, A. A.; Derradi de Souza, R.; di Ruzza, B.; Didenko, L.; Dilks, C.; Dong, X.; Drachenberg, J. L.; Draper, J. E.; Du, C. M.; Dunkelberger, L. E.; Dunlop, J. C.; Efimov, L. G.; Engelage, J.; Eppley, G.; Esha, R.; Evdokimov, O.; Eyser, O.; Fatemi, R.; Fazio, S.; Federic, P.; Fedorisin, J.; Feng, Filip, P.; Fisyak, Y.; Flores, C. E.; Gagliardi, C. A.; Garand, D.; Geurts, F.; Gibson, A.; Girard, M.; Greiner, L.; Grosnick, D.; Gunarathne, D. S.; Guo, Y.; Gupta, A.; Gupta, S.; Guryn, W.; Hamad, A.; Hamed, A.; Han, L.-X.; Haque, R.; Harris, J. W.; Heppelmann, S.; Hirsch, A.; Hoffmann, G. W.; Hofman, D. J.; Horvat, S.; Huang, B.; Huang, X.; Huang, H. Z.; Huck, P.; Humanic, T. J.; Igo, G.; Jacobs, W. W.; Jang, H.; Judd, E. G.; Kabana, S.; Kalinkin, D.; Kang, K.; Kauder, K.; Ke, H. W.; Keane, D.; Kechechyan, A.; Khan, Z. H.; Kikola, D. P.; Kisel, I.; Kisiel, A.; Klein, S. R.; Koetke, D. D.; Kollegger, T.; Kosarzewski, L. K.; Kotchenda, L.; Kraishan, A. F.; Kravtsov, P.; Krueger, K.; Kulakov, I.; Kumar, L.; Kycia, R. A.; Lamont, M. A. C.; Landgraf, J. M.; Landry, K. D.; Lauret, J.; Lebedev, A.; Lednicky, R.; Lee, J. H.; Li, Z. M.; Li, X.; Li, W.; Li, Y.; Li, X.; Li, C.; Lisa, M. A.; Liu, F.; Ljubicic, T.; Llope, W. J.; Lomnitz, M.; Longacre, R. S.; Luo, X.; Ma, G. L.; Ma, R. M.; Ma, Y. G.; Magdy, N.; Mahapatra, D. P.; Majka, R.; Manion, A.; Margetis, S.; Markert, C.; Masui, H.; Matis, H. S.; McDonald, D.; Minaev, N. G.; Mioduszewski, S.; Mohanty, B.; Mondal, M. M.; Morozov, D. A.; Mustafa, M. K.; Nandi, B. K.; Nasim, Md.; Nayak, T. K.; Nigmatkulov, G.; Nogach, L. V.; Noh, S. Y.; Novak, J.; Nurushev, S. B.; Odyniec, G.; Ogawa, A.; Oh, K.; Okorokov, V.; Olvitt, D. L.; Page, B. S.; Pan, Y. X.; Pandit, Y.; Panebratsev, Y.; Pawlak, T.; Pawlik, B.; Pei, H.; Perkins, C.; Pile, P.; Planinic, M.; Pluta, J.; Poljak, N.; Poniatowska, K.; Porter, J.; Poskanzer, A. M.; Pruthi, N. K.; Przybycien, M.; Putschke, J.; Qiu, H.; Quintero, A.; Ramachandran, S.; Raniwala, R.; Raniwala, S.; Ray, R. L.; Ritter, H. G.; Roberts, J. B.; Rogachevskiy, O. V.; Romero, J. L.; Roy, A.; Ruan, L.; Rusnak, J.; Rusnakova, O.; Sahoo, N. R.; Sahu, P. K.; Sakrejda, I.; Salur, S.; Sandacz, A.; Sandweiss, J.; Sarkar, A.; Schambach, J.; Scharenberg, R. P.; Schmah, A. M.; Schmidke, W. B.; Schmitz, N.; Seger, J.; Seyboth, P.; Shah, N.; Shahaliev, E.; Shanmuganathan, P. V.; Shao, M.; Sharma, B.; Shen, W. Q.; Shi, S. S.; Shou, Q. Y.; Sichtermann, E. P.; Simko, M.; Skoby, M. J.; Smirnov, N.; Smirnov, D.; Solanki, D.; Song, L.; Sorensen, P.; Spinka, H. M.; Srivastava, B.; Stanislaus, T. D. S.; Stock, R.; Strikhanov, M.; Stringfellow, B.; Sumbera, M.; Summa, B. J.; Sun, X. M.; Sun, Z.; Sun, Y.; Sun, X.; Surrow, B.; Svirida, D. N.; Szelezniak, M. A.; Takahashi, J.; Tang, Z.; Tang, A. H.; Tarnowsky, T.; Tawfik, A. N.; Thomas, J. 
H.; Timmins, A. R.; Tlusty, D.; Tokarev, M.; Trentalange, S.; Tribble, R. E.; Tribedy, P.; Tripathy, S. K.; Trzeciak, B. A.; Tsai, O. D.; Turnau, J.; Ullrich, T.; Underwood, D. G.; Upsal, I.; Van Buren, G.; van Nieuwenhuizen, G.; Vandenbroucke, M.; Varma, R.; Vasconcelos, G. M. S.; Vasiliev, A. N.; Vertesi, R.; Videbæk, F.; Viyogi, Y. P.; Vokal, S.; Voloshin, S. A.; Vossen, A.; Wang, J. S.; Wang, X. L.; Wang, Y.; Wang, H.; Wang, F.; Wang, G.; Webb, G.; Webb, J. C.; Wen, L.; Westfall, G. D.; Wieman, H.; Wissink, S. W.; Witt, R.; Wu, Y. F.; Xiao, Z.; Xie, W.; Xin, K.; Xu, N.; Xu, Z.; Xu, H.; Xu, Y.; Xu, Q. H.; Yan, W.; Yang, Y.; Yang, C.; Yang, Y.; Ye, Z.; Yepes, P.; Yi, L.; Yip, K.; Yoo, I.-K.; Yu, N.; Zbroszczyk, H.; Zha, W.; Zhang, X. P.; Zhang, Z. P.; Zhang, J. B.; Zhang, J. L.; Zhang, Y.; Zhang, S.; Zhao, F.; Zhao, J.; Zhong, C.; Zhu, Y. H.; Zhu, X.; Zoulkarneeva, Y.; Zyzak, M.; STAR Collaboration

    2016-08-01

    Balance functions have been measured in terms of relative pseudorapidity (Δη) for charged particle pairs at the BNL Relativistic Heavy Ion Collider from Au + Au collisions at √(s_NN) = 7.7 GeV to 200 GeV using the STAR detector. These results are compared with balance functions measured at the CERN Large Hadron Collider from Pb + Pb collisions at √(s_NN) = 2.76 TeV by the ALICE Collaboration. The width of the balance function decreases as the collisions become more central and as the beam energy is increased. In contrast, the widths of the balance functions calculated using shuffled events show little dependence on centrality or beam energy and are larger than the observed widths. Balance function widths calculated using events generated by UrQMD are wider than the measured widths in central collisions and show little centrality dependence. The measured widths of the balance functions in central collisions are consistent with the delayed hadronization of a deconfined quark gluon plasma (QGP). The narrowing of the balance function in central collisions at √(s_NN) = 7.7 GeV implies that a QGP is still being created at this relatively low energy.
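
    The STAR analysis details are in the paper itself. As a rough, generic illustration of the observable, the sketch below evaluates a charge balance function in relative pseudorapidity for one toy event in the commonly used form B(Δη) = ½[(N+- − N++)/N+ + (N-+ − N--)/N-]; the counting conventions and the invented event are assumptions for the example, not the STAR procedure.

```python
import numpy as np

def balance_function(eta, charge, bins=np.linspace(0.0, 2.0, 21)):
    """Charge balance function B(d_eta) for one event, in the form
    B = 0.5 * [ (N+- - N++)/N+  +  (N-+ - N--)/N- ], binned in |eta_i - eta_j|."""
    eta = np.asarray(eta, float)
    q = np.asarray(charge)
    pos, neg = eta[q > 0], eta[q < 0]

    def pair_hist(a, b, exclude_self=False):
        d = np.abs(a[:, None] - b[None, :])
        if exclude_self:
            d = d[~np.eye(len(a), dtype=bool)]   # drop i == j self-pairs
        return np.histogram(d.ravel(), bins=bins)[0].astype(float)

    n_pm = pair_hist(pos, neg)                   # for each +, count -'s at each d_eta
    n_mp = n_pm                                  # |eta_i - eta_j| is symmetric
    n_pp = pair_hist(pos, pos, exclude_self=True)
    n_mm = pair_hist(neg, neg, exclude_self=True)
    return 0.5 * ((n_pm - n_pp) / len(pos) + (n_mp - n_mm) / len(neg))

# Toy event: correlated +/- pairs close in eta plus some uncorrelated tracks.
rng = np.random.default_rng(2)
base = rng.uniform(-1, 1, 200)
eta = np.concatenate([base, base + rng.normal(0, 0.2, 200), rng.uniform(-1, 1, 100)])
charge = np.concatenate([np.ones(200), -np.ones(200), rng.choice([-1, 1], 100)])
print(balance_function(eta, charge))
```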

  19. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, RHIC SPIN COLLABORATION MEETINGS VIII, IX, X, XI, APRIL 12, MAY, 22, JUNE 17, JULY 29, 2002.

    Energy Technology Data Exchange (ETDEWEB)

    FOX,B.

    2003-03-06

    Since its inception, the RHIC Spin Collaboration (RSC) has held semi-regular meetings each year to discuss the physics possibilities and the operational details of the program. Having collected our first data sample of polarized proton-proton collisions in Run02 of RHIC, we are now in the process of examining the performance of both the accelerator and the experiments. From this evaluation, we aim not only to formulate a consensus plan for polarized proton-proton collisions during Run03 of RHIC but also to look further into the future to ensure the success of the spin program. In the second meeting of this series (which took place at BNL on April 12, 2002), we focused on Run02 polarization issues. This meeting opened with a presentation by Thomas Roser on his reflections on the outcome of the RHIC retreat at which the Run02 performance was evaluated. Of particular importance, Thomas pointed out that, with the expected beam time and his estimates for machine-tuning requirements, the experiments should limit their beam requests to two or three programs.

  20. Evolution-dominated Hydrodynamic Model and the Pseudorapidity Distributions of the Charged Particles Produced in Cu-Cu Collisions at BNL-RHIC Energies

    Institute of Scientific and Technical Information of China (English)

    姜志进; 王杰; 张海丽; 马可

    2014-01-01

    The charged particles produced in high energy heavy ion collisions consist of two parts: one part comes from the hot and dense matter produced in the collisions, and the other is the leading particles. We suppose that the hot and dense matter expands and freezes out into charged particles according to evolution-dominated hydrodynamics, and that the leading particles come from participants with approximately the same energy. On the basis of this assumption, we obtain the pseudorapidity distributions of the charged particles produced in high energy heavy ion collisions and compare them with the experimental data presented by the PHOBOS Collaboration at BNL-RHIC for Cu-Cu collisions at √(s_NN) = 62.4 and 200 GeV. The theoretical predictions are in good accordance with the experimental measurements.

  1. Studies of Strangeness Production in proton-Nucleus Collision: preliminary results from E910 at BNL-AGS

    Science.gov (United States)

    Yang, Xihong

    1996-10-01

    Strange particle production has been viewed as an interesting probe of heavy-ion physics because it is a proposed signature of QGP formation. Using the EOS TPC and downstream drift chambers for tracking, and TOF and Cerenkov counters for particle identification, experiment E910 provides a facility with large acceptance and high resolution for exclusive measurements of proton-nucleus collisions at AGS energies. Production of Λ in both 12.5 GeV/c and 18 GeV/c p+A (A = Au, Cu) collisions from the '96 run data has been analyzed. The initial reconstruction of the Λ invariant mass distribution shows a mass resolution of 2.5 MeV/c². The Λ yield for different beam energies and target masses has been analyzed and compared with p+p data and E859 data. The transverse mass and rapidity distributions are also discussed.
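
    The E910 reconstruction chain is not described in this record. Purely as an illustration of the underlying kinematics, the sketch below computes the invariant mass of a Λ → p + π⁻ candidate from two daughter track momenta; the momenta are invented, and the real analysis of course also involves vertexing, particle identification and combinatorial-background handling.

```python
import numpy as np

M_PROTON = 0.938272   # GeV/c^2
M_PION = 0.139570     # GeV/c^2

def invariant_mass(p1, m1, p2, m2):
    """Invariant mass of a two-track candidate from 3-momenta (GeV/c) and masses."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    e1 = np.sqrt(np.dot(p1, p1) + m1 ** 2)
    e2 = np.sqrt(np.dot(p2, p2) + m2 ** 2)
    p_tot = p1 + p2
    return np.sqrt((e1 + e2) ** 2 - np.dot(p_tot, p_tot))

# Invented daughter momenta for one Lambda decay candidate.
p_proton = [0.45, 0.10, 2.90]
p_pion = [0.05, -0.02, 0.55]
print(f"m(p pi-) = {invariant_mass(p_proton, M_PROTON, p_pion, M_PION):.4f} GeV/c^2")
# Many candidates are histogrammed; a peak near 1.1157 GeV/c^2 with the quoted
# ~2.5 MeV/c^2 width corresponds to the Lambda signal discussed in the abstract.
```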

  2. Analyzing Valuation Practices through Contracts

    DEFF Research Database (Denmark)

    Tesnière, Germain; Labatut, Julie; Boxenbaum, Eva

    This paper seeks to analyze the most recent changes in how societies value animals. We analyze this topic through the prism of contracts between breeding companies and farmers. Focusing on new valuation practices and qualification of breeding animals, we question the evaluation of difficult...

  3. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  4. Fundaments of plant cybernetics.

    Science.gov (United States)

    Zucconi, F

    2001-01-01

    A systemic approach is proposed for analyzing plants' physiological organization and cybernesis. To this end, the plant is inspected as a system, starting from the integration of crown and root systems, and its impact on a number of basic epigenetic events. The approach proves to be axiomatic and facilitates the definition of the principles behind the plant's autonomous control of growth and reproduction.

  5. Identification and Assessment of Material Models for Age-Related Degradation of Structures and Passive Components in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Nie,J.; Braverman, J.; Hofmayer, C.; Kim, M. K.; Choi, I-K.

    2009-04-27

    When performing seismic safety assessments of nuclear power plants (NPPs), the potential effects of age-related degradation on structures, systems, and components (SSCs) should be considered. To address the issue of aging degradation, the Korea Atomic Energy Research Institute (KAERI) has embarked on a five-year research project to develop a realistic seismic risk evaluation system which will include the consideration of aging of structures and components in NPPs. Three specific areas that are included in the KAERI research project, related to seismic probabilistic risk assessment (PRA), are probabilistic seismic hazard analysis, seismic fragility analysis including the effects of aging, and a plant seismic risk analysis. To support the development of seismic capability evaluation technology for degraded structures and components, KAERI entered into a collaboration agreement with Brookhaven National Laboratory (BNL) in 2007. The collaborative research effort is intended to continue over a five year period with the goal of developing seismic fragility analysis methods that consider the potential effects of age-related degradation of SSCs, and using these results as input to seismic PRAs. In the Year 1 scope of work BNL collected and reviewed degradation occurrences in US NPPs and identified important aging characteristics needed for the seismic capability evaluations that will be performed in the subsequent evaluations in the years that follow. This information is presented in the Annual Report for the Year 1 Task, identified as BNL Report-81741-2008 and also designated as KAERI/RR-2931/2008. The report presents results of the statistical and trending analysis of this data and compares the results to prior aging studies. In addition, the report provides a description of U.S. current regulatory requirements, regulatory guidance documents, generic communications, industry standards and guidance, and past research related to aging degradation of SSCs. This report

  6. Development of pulse neutron coal analyzer

    Science.gov (United States)

    Jing, Shi-wei; Gu, De-shan; Qiao, Shuang; Liu, Yu-ren; Liu, Lin-mao

    2005-04-01

    This article introduces the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel multichannel analyzer were applied in this system. A multiple linear regression method was employed to process the data, resolving the interference problem among multiple elements. The prototype (model MZ-MKFY) has been applied in the Changshan and Jilin power plants for about a year. Results of measuring the main coal parameters, such as lower calorific value, total moisture, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented.
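
    The calibration procedure itself is not given in the abstract. The sketch below only illustrates the kind of multiple linear regression step described, fitting one coal parameter (say, ash content) to counts in several characteristic gamma-ray windows; all numbers are invented and numpy's least-squares routine stands in for whatever fitting code the instrument actually uses.

```python
import numpy as np

# Invented calibration set: rows = samples, columns = net counts in characteristic
# gamma windows (e.g. Si, Al, Fe, Ca lines); y = lab-assayed ash content (%).
X = np.array([
    [1200.0,  800.0, 400.0, 300.0],
    [1500.0,  950.0, 420.0, 310.0],
    [ 900.0,  700.0, 380.0, 290.0],
    [1800.0, 1100.0, 450.0, 330.0],
    [1100.0,  760.0, 390.0, 295.0],
    [1600.0, 1000.0, 430.0, 320.0],
])
y = np.array([18.5, 22.1, 15.2, 26.0, 17.4, 23.8])

# Fit y = b0 + b1*x1 + ... + b4*x4 by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_ash(counts):
    """Predict ash content (%) for a new spectrum's window counts."""
    return float(coef[0] + np.dot(coef[1:], counts))

print(predict_ash([1300.0, 850.0, 410.0, 305.0]))
```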

  7. ANALYZE Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, S.

    1982-10-01

    This report is a reproduction of the visuals that were used in the ANALYZE Users' Guide lectures of the videotaped LLNL Continuing Education Course CE2018-H, State Space Lectures. The course was given in Spring 1982 through the EE Department Education Office. Since ANALYZE is menu-driven, interactive, and has self-explanatory questions (sort of), these visuals and the two 50-minute videotapes are the only documentation which comes with the code. More information about the algorithms contained in ANALYZE can be obtained from the IEEE book on Programs for Digital Signal Processing.

  8. C2Analyzer:Co-target-Co-function Analyzer

    Institute of Scientific and Technical Information of China (English)

    Md Aftabuddin; Chittabrata Mal; Arindam Deb; Sudip Kundu

    2014-01-01

    MicroRNAs (miRNAs) interact with their target mRNAs and regulate biological processes at the post-transcriptional level. While one miRNA can target many mRNAs, a single mRNA can also be targeted by a set of miRNAs. The targeted mRNAs may be involved in different biological processes that are described by gene ontology (GO) terms. The major challenges involved in analyzing these multitude regulations include identification of the combinatorial regulation of miRNAs as well as determination of the co-functionally-enriched miRNA pairs. C2Analyzer (Co-target-Co-function Analyzer) is a Perl-based, versatile and user-friendly web tool with online instructions. Based on hypergeometric analysis, this novel tool can determine whether given pairs of miRNAs are co-functionally enriched. For a given set of GO term(s), it can also identify the set of miRNAs whose targets are enriched in the given GO term(s). Moreover, C2Analyzer can also identify the co-targeting miRNA pairs, their targets and the GO processes in which they are involved. The miRNA-miRNA co-functional relationship can also be saved as a .txt file, which can be used to further visualize the co-functional network using other software such as Cytoscape. C2Analyzer is freely available at www.bioinformatics.org/c2analyzer.
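
    C2Analyzer is a Perl web tool and its code is not shown in this record. The sketch below only illustrates the hypergeometric test the abstract describes, asking whether the overlap between two gene sets (two miRNAs' target sets, or a target set and a GO term's gene set) is larger than expected by chance; the gene sets, the universe size and the helper function are invented for the example.

```python
from scipy.stats import hypergeom

def overlap_enrichment_p(set_a, set_b, universe_size):
    """P(overlap >= observed) under a hypergeometric null: the chance of drawing
    at least |A & B| members of A when sampling |B| genes from the universe."""
    a, b = set(set_a), set(set_b)
    k = len(a & b)
    return hypergeom.sf(k - 1, universe_size, len(a), len(b))

# Invented target sets for two miRNAs over a universe of 20,000 genes.
targets_mir1 = {f"g{i}" for i in range(0, 300)}
targets_mir2 = {f"g{i}" for i in range(250, 600)}
p = overlap_enrichment_p(targets_mir1, targets_mir2, universe_size=20000)
print(f"co-targeting overlap = {len(targets_mir1 & targets_mir2)}, p = {p:.3g}")
```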

  9. An assessment of mercury emissions and health risks from a coal-fired power plant

    Energy Technology Data Exchange (ETDEWEB)

    Fthenakis, V.M.; Lipfert, F.W.; Moskowitz, P.D.; Saroff, L. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    Title III of the 1990 Clean Air Act Amendments (CAAA) directed the US Environmental Protection Agency (EPA) to evaluate the rate and effect of mercury emissions in the atmosphere and technologies to control the emissions. The US DOE sponsored a risk assessment project at Brookhaven National Laboratory (BNL) to evaluate health risks of mercury emissions from coal combustion. Methylmercury (MeHg) is the compound predominantly responsible for human exposure to atmospheric mercury in the United States, through fish ingestion. In the BNL study, health risks to adults resulting from Hg emissions from a hypothetical coal-fired power plant were estimated using probabilistic risk assessment techniques. This study showed that the emissions of a single large power plant may double the background exposures to MeHg resulting from consuming fish obtained from a localized area near the power plant. Even at these more elevated exposure levels, the attributable incidence of mild neurological symptoms (paresthesia) was estimated to be quite small, especially when compared with the estimated background incidence in the population. 29 refs., 5 figs., 2 tabs.

  10. An assessment of mercury emissions and health risks from a coal-fired power plant

    Energy Technology Data Exchange (ETDEWEB)

    Fthenakis, V.M.; Lipfert, F.; Moskowitz, P. [Brookhaven National Lab., Upton, NY (United States). Analytical Sciences Div.

    1994-12-01

    Title 3 of the 1990 Clean Air Act Amendments (CAAA) mandated that the US Environmental Protection Agency (EPA) evaluate the need to regulate mercury emissions from electric utilities. In support of this forthcoming regulatory analysis, the US DOE sponsored a risk assessment project at Brookhaven National Laboratory (BNL) to independently evaluate methylmercury (MeHg) hazards. In the US, MeHg is the predominant route of human exposure to mercury originating in the atmosphere. In the BNL study, health risks to adults resulting from Hg emissions from a hypothetical 1,000 MW coal-fired power plant were estimated using probabilistic risk assessment techniques. This study showed that the emissions of a single power plant may double the background exposures to MeHg resulting from consuming fish obtained from a localized area near the power plant. Even at these more elevated exposure levels, the attributable incidence of mild neurological symptoms was estimated to be quite small, especially when compared with the estimated background incidence in the population. The current paper summarizes the basic conclusions of this assessment and highlights issues dealing with emissions control and environmental transport.

  11. Methodology based on the Supply Chain Operations Reference Model to Analyze the Production Process of Biodiesel from Castor Oil Plant

    Directory of Open Access Journals (Sweden)

    Fernando Salazar

    2012-01-01

    The production process of biodiesel from castor oil plants is analyzed using the Supply Chain Operations Reference (SCOR) model. The different levels of the process, the key performance indicators, the attributes and the logistic operations carried out throughout the chain were analyzed. The usefulness of applying the SCOR model is determined by its identification of the components of the chain, highlighting the strengths and weaknesses present in the external and internal relations of the logistics. From this perspective, it was observed that the application of the SCOR model makes the logistics operations along the whole chain more efficient.

  12. Determination of nitrogen in Kjeldahl digests of plant samples by continuous flow analyzer in comparison with an automated distillation-titration instrument

    Institute of Scientific and Technical Information of China (English)

    温云杰; 李桂花; 黄金莉; 刘云霞; 高翔; 汪洪

    2015-01-01

    The Kjeldahl method is the classical method for determining total N content in plants, but it is time consuming and laborious. Twenty-four wheat straw samples were digested with concentrated H2SO4-H2O2, and the N content of the digests was determined with a continuous flow analyzer (CFA) and with an automated Kjeldahl distillation-titration (AKDT) instrument, in order to compare the two methods and to explore the feasibility of using a CFA to determine total N in plant samples. The results showed no significant difference between the total N contents of wheat straw measured by the two instruments, which were significantly linearly correlated: CFA-N = 0.892 AKDT-N + 0.753, with a correlation coefficient r = 0.9421 (n = 24, P < 0.01). The recovery rate of the CFA determinations was 96.6%-102.3%, and the relative standard deviation of five replicate determinations of the N concentration in each of five digests was below 5%. The CFA is fast and consumes little reagent, and can be used for total N analysis of large batches of H2SO4-H2O2-digested plant samples. These results provide a technical basis for determining total plant N content with a continuous flow analyzer.
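
    The statistics quoted above are straightforward to reproduce for any paired data set. The sketch below shows, for invented paired measurements rather than the paper's 24 wheat samples, how the regression line, Pearson coefficient and relative standard deviation of that kind would be computed.

```python
import numpy as np
from scipy import stats

# Invented paired N contents (g/kg): AKDT reference vs. CFA measurement.
akdt = np.array([4.8, 5.6, 6.1, 7.0, 7.4, 8.2, 9.0, 9.6])
cfa = np.array([5.0, 5.7, 6.2, 6.9, 7.3, 8.0, 8.9, 9.4])

slope, intercept, r, p, stderr = stats.linregress(akdt, cfa)
print(f"CFA-N = {slope:.3f} * AKDT-N + {intercept:.3f}, r = {r:.4f}, P = {p:.3g}")

# Relative standard deviation of replicate CFA readings on one digest.
replicates = np.array([6.15, 6.22, 6.08, 6.18, 6.11])
rsd = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"RSD = {rsd:.1f} %")
```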

  13. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  14. Analyzing the Grammar of English

    CERN Document Server

    Teschner, Richard V

    2007-01-01

    Analyzing the Grammar of English offers a descriptive analysis of the indispensable elements of English grammar. Designed to be covered in one semester, this textbook starts from scratch and takes nothing for granted beyond a reading and speaking knowledge of English. Extensively revised to function better in skills-building classes, it includes more interspersed exercises that promptly test what is taught, simplified and clarified explanations, greatly expanded and more diverse activities, and a new glossary of over 200 technical terms.Analyzing the Grammar of English is the only English gram

  15. An update on chemistry analyzers.

    Science.gov (United States)

    Vap, L M; Mitzner, B

    1996-09-01

    This update of six chemistry analyzers available to the clinician discusses several points that should be considered prior to the purchase of equipment. General topics include how to best match an instrument to clinic needs and the indirect costs associated with instrument operation. Quality assurance recommendations are discussed and common terms are defined. Specific instrument features, principles of operation, performance, and costs are presented. The information provided offers potential purchasers an objective approach to the evaluation of a chemistry analyzer for the veterinary clinic.

  16. Strategies for Analyzing Tone Languages

    Science.gov (United States)

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  17. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004.Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  18. FORTRAN Static Source Code Analyzer

    Science.gov (United States)

    Merwarth, P.

    1984-01-01

    The FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within a FORTRAN program and provides reports of those statistics. Provisions are made for weighting each statistic and providing an overall figure of complexity.

  19. Analyzing Classroom Instruction in Reading.

    Science.gov (United States)

    Rutherford, William L.

    A method for analyzing instructional techniques employed during reading group instruction is reported, and the characteristics of the effective reading teacher are discussed. Teaching effectiveness is divided into two categories: (1) how the teacher acts and interacts with children on a personal level and (2) how the teacher performs his…

  20. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  1. Fragility Analysis Methodology for Degraded Structures and Passive Components in Nuclear Power Plants - Illustrated using a Condensate Storage Tank

    Energy Technology Data Exchange (ETDEWEB)

    Nie, J.; Braverman, J.; Hofmayer, C.; Choun, Y.; Kim, M.; Choi, I.

    2010-06-30

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five year period. The goal of this collaboration endeavor is to assist KAERI to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. In the Year 1 scope of work, BNL collected and reviewed degradation occurrences in US NPPs and identified important aging characteristics needed for the seismic capability evaluations. This information is presented in the Annual Report for the Year 1 Task, identified as BNL Report-81741-2008 and also designated as KAERI/RR-2931/2008. The report presents results of the statistical and trending analysis of this data and compares the results to prior aging studies. In addition, the report provides a description of U.S. current regulatory requirements, regulatory guidance documents, generic communications, industry standards and guidance, and past research related to aging degradation of SSCs. In the Year 2 scope of work, BNL carried out a research effort to identify and assess degradation models for the long-term behavior of dominant materials that are

  2. Introduction: why analyze single cells?

    Science.gov (United States)

    Di Carlo, Dino; Tse, Henry Tat Kwong; Gossett, Daniel R

    2012-01-01

    Powerful methods in molecular biology are abundant; however, in many fields including hematology, stem cell biology, tissue engineering, and cancer biology, data from tools and assays that analyze the average signals from many cells may not yield the desired result because the cells of interest may be in the minority (their behavior masked by the majority) or because the dynamics of the populations of interest are offset in time. Accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. In this chapter, we discuss the rationale for performing analyses on individual cells in more depth, cover the fields of study in which single-cell behavior is yielding new insights into biological and clinical questions, and speculate on how single-cell analysis will be critical in the future.

  3. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa; [Ukendt], editors

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  4. Analyzing viewpoint diversity in twitter

    OpenAIRE

    2013-01-01

    Information diversity has a long tradition in human history. Recently there have been claims that diversity is diminishing in information available in social networks. On the other hand, some studies suggest that diversity is actually quite high in social networks such as Twitter. However, these studies focus only on the concept of source diversity and only on American users. In this paper we analyze different dimensions of diversity. We also provide an experimental design in which ...

  5. Analyzing ion distributions around DNA.

    Science.gov (United States)

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring reference to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation.
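
    The record above describes binning ion positions in curvilinear helicoidal coordinates to obtain localized densities in molarity units. As a rough illustration only, the sketch below bins ion coordinates from hypothetical simulation snapshots into cylindrical (radial, angular, longitudinal) cells around a fixed straight axis and converts counts to molarity; the published method instead uses true helicoidal coordinates that follow the instantaneous nucleic acid axis, which is not reproduced here.

    ```python
    import numpy as np

    # Minimal sketch: bin ion positions into (radial, angular, longitudinal) cells
    # around an idealized straight axis (the z-axis), then convert counts to molar
    # densities. The real method uses curvilinear helicoidal coordinates that
    # follow the instantaneous DNA axis; this fixed-axis version only illustrates
    # the bookkeeping.

    AVOGADRO = 6.022e23           # mol^-1
    A3_TO_L = 1e-27               # cubic angstroms to liters

    def ion_density_map(ion_xyz, n_r=30, n_phi=36, n_z=34, r_max=15.0, z_len=34.0):
        """ion_xyz: (n_snapshots, n_ions, 3) Cartesian coordinates in angstroms."""
        r = np.hypot(ion_xyz[..., 0], ion_xyz[..., 1])
        phi = np.mod(np.arctan2(ion_xyz[..., 1], ion_xyz[..., 0]), 2 * np.pi)
        z = ion_xyz[..., 2]

        sample = np.stack([r.ravel(), phi.ravel(), z.ravel()], axis=1)
        edges = [np.linspace(0, r_max, n_r + 1),
                 np.linspace(0, 2 * np.pi, n_phi + 1),
                 np.linspace(0, z_len, n_z + 1)]
        counts, _ = np.histogramdd(sample, bins=edges)

        # Cell volumes on a cylindrical grid: 0.5*(r2^2 - r1^2) * dphi * dz
        r_edges = edges[0]
        cell_area = 0.5 * (r_edges[1:] ** 2 - r_edges[:-1] ** 2) * (2 * np.pi / n_phi)
        volumes = cell_area[:, None, None] * (z_len / n_z)      # angstrom^3

        n_snapshots = ion_xyz.shape[0]
        molarity = counts / n_snapshots / (volumes * A3_TO_L * AVOGADRO)
        return molarity    # shape (n_r, n_phi, n_z), units of mol/L

    # Example with synthetic data (10 snapshots, 50 ions):
    rng = np.random.default_rng(0)
    fake_ions = rng.uniform([-15, -15, 0], [15, 15, 34], size=(10, 50, 3))
    density = ion_density_map(fake_ions)
    print(density.shape, density.max())
    ```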

  6. Remote Laser Diffraction PSD Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analysis of radioactive slurry samples was obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable, making this technology far superior to the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the tremendously useful fundamental engineering data gained. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  7. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Science.gov (United States)

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
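
    Since records 7-9 describe the same split/analyze/meta-analyze procedure (which the authors demonstrate in R), a minimal sketch of the general idea is given here in Python: split a large sample into chunks, estimate the same effect size in each chunk, and pool the chunk estimates with a random-effects meta-analysis. The DerSimonian-Laird pooling and the synthetic data are illustrative assumptions, not the authors' exact code.

    ```python
    import numpy as np

    # Split a large sample into manageable chunks, estimate the same quantity
    # (here a Pearson correlation) in each chunk, then pool the chunk estimates
    # with a random-effects meta-analysis (DerSimonian-Laird tau^2).

    def fisher_z(r):
        return np.arctanh(r)

    def meta_analyze(effects, variances):
        """Random-effects pooling of per-split effect sizes."""
        w = 1.0 / variances
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)
        df = len(effects) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)
        w_star = 1.0 / (variances + tau2)
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, se, tau2

    rng = np.random.default_rng(1)
    n_total, n_splits = 100_000, 20
    x = rng.normal(size=n_total)
    y = 0.3 * x + rng.normal(size=n_total)          # true correlation about 0.29

    effects, variances = [], []
    for chunk_x, chunk_y in zip(np.array_split(x, n_splits), np.array_split(y, n_splits)):
        r = np.corrcoef(chunk_x, chunk_y)[0, 1]
        effects.append(fisher_z(r))                 # analyze each split
        variances.append(1.0 / (len(chunk_x) - 3))  # sampling variance of Fisher z

    pooled_z, se, tau2 = meta_analyze(np.array(effects), np.array(variances))
    print("pooled r:", np.tanh(pooled_z), "+/-", se, "tau^2:", tau2)
    ```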

  8. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  9. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  10. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$ 3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$ 35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  11. Method for analyzing microbial communities

    Science.gov (United States)

    Zhou, Jizhong [Oak Ridge, TN]; Wu, Liyou [Oak Ridge, TN]

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.

  12. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The current technology 'COMBUSTIMETRO' aims to examine the fuel through the performance of the engine, as the role of the fuel is to produce energy for the combustion engine in a form that is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same intake of air and fuel and a fixed ignition point. Its operation is monitored by sensors (Sonda Lambda, RPM and Gases Analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  13. PROCEEDINGS OF THE RIKEN BNL RESEARCH CENTER WORKSHOP ON LARGE SCALE COMPUTATIONS IN NUCLEAR PHYSICS USING THE QCDOC, SEPTEMBER 26 - 28, 2002.

    Energy Technology Data Exchange (ETDEWEB)

    AOKI,Y.; BALTZ,A.; CREUTZ,M.; GYULASSY,M.; OHTA,S.

    2002-09-26

    The massively parallel computer QCDOC (QCD On a Chip) of the RIKEN BNL Research Center (RBRC) will provide ten-teraflop peak performance for lattice gauge calculations. Lattice groups from both Columbia University and RBRC, along with assistance from IBM, jointly handled the design of the QCDOC. RIKEN has provided $5 million in funding to complete the machine in 2003. Some fraction of this computer (perhaps as much as 10%) might be made available for large-scale computations in areas of theoretical nuclear physics other than lattice gauge theory. The purpose of this workshop was to investigate the feasibility and possibility of using a supercomputer such as the QCDOC for lattice, general nuclear theory, and other calculations. The lattice applications to nuclear physics that can be investigated with the QCDOC are varied: for example, the light hadron spectrum, finite temperature QCD, kaon ({Delta}I = 1/2 and CP violation), and nucleon (the structure of the proton) matrix elements, to name a few. There are also other topics in theoretical nuclear physics that are currently limited by computer resources. Among these are ab initio calculations of nuclear structure for light nuclei (e.g. up to {approx}A = 8 nuclei), nuclear shell model calculations, nuclear hydrodynamics, heavy ion cascade and other transport calculations for RHIC, and nuclear astrophysics topics such as exploding supernovae. The physics topics were quite varied, ranging from simulations of stellar collapse by Douglas Swesty to detailed shell model calculations by David Dean, Takaharu Otsuka, and Noritaka Shimizu. Going outside traditional nuclear physics, James Davenport discussed molecular dynamics simulations and Shailesh Chandrasekharan presented a class of algorithms for simulating a wide variety of fermionic problems. Four speakers addressed various aspects of theory and computational modeling for relativistic heavy ion reactions at RHIC. Scott Pratt and Steffen Bass gave general overviews of

  14. Thermal and evolved gas analyzer

    Science.gov (United States)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

    The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona, and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January 1999. The TEGA project started in February 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and flight model, and tested. The instrument performs laboratory-quality differential-scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and the carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor, along with their isotopic variations, are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.

  15. VOSA: A VO SED Analyzer

    Science.gov (United States)

    Rodrigo, C.; Bayo, A.; Solano, E.

    2017-03-01

    VOSA (VO Sed Analyzer, http://svo2.cab.inta-csic.es/theory/vosa) is a public web-tool developed by the Spanish Virtual Observatory (http://svo.cab.inta-csic.es/) and designed to help users to (1) build Spectral Energy Distributions (SEDs) combining private photometric measurements with data available in VO services, (2) obtain relevant properties of these objects (distance, extinction, etc) from VO catalogs, (3) analyze them comparing observed photometry with synthetic photometry from different collections of theoretical models or observational templates, using different techniques (chi-square minimization, Bayesian analysis) to estimate physical parameters of the observed objects (teff, logg, metallicity, stellar radius/distance ratio, infrared excess, etc), and use these results to (4) estimate masses and ages via interpolation of collections of isochrones and evolutionary tracks from the VO. In particular, VOSA offers the advantage of deriving physical parameters using all the available photometric information instead of a restricted subset of colors. The results can be downloaded in different formats or sent to other VO tools using SAMP. We have upgraded VOSA to provide access to Gaia photometry and give a homogeneous estimation of the physical parameters of thousands of objects at a time. This upgrade has required the implementation of a new computation paradigm, including a distributed environment, the capability of submitting and processing jobs in an asynchronous way, the use of parallelized computing to speed up processes (˜ ten times faster) and a new design of the web interface.
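
    The abstract mentions chi-square minimization of observed photometry against grids of synthetic photometry. A minimal sketch of that step, with an invented two-model grid and a single analytic scale factor per model, is shown below; it is not VOSA's actual implementation or API.

    ```python
    import numpy as np

    # Compare observed photometric fluxes against a grid of synthetic photometry,
    # fitting one multiplicative dilution factor Md ~ (R/d)^2 per model
    # analytically, and keep the model with the lowest chi-square.

    def best_fit(obs_flux, obs_err, model_grid):
        """model_grid: dict mapping parameter tuples (e.g. (teff, logg)) to
        synthetic fluxes in the same bands as obs_flux."""
        results = {}
        for params, model_flux in model_grid.items():
            w = 1.0 / obs_err ** 2
            # Optimal scale factor minimizing chi2 for F_obs ~ Md * F_model
            md = np.sum(w * obs_flux * model_flux) / np.sum(w * model_flux ** 2)
            chi2 = np.sum(w * (obs_flux - md * model_flux) ** 2)
            results[params] = (chi2, md)
        best = min(results, key=lambda p: results[p][0])
        return best, results[best]

    # Hypothetical observed SED (5 bands) and a tiny two-model grid
    obs = np.array([3.1e-13, 5.3e-13, 4.9e-13, 3.2e-13, 1.8e-13])
    err = 0.05 * obs
    grid = {
        (5500, 4.5): np.array([2.9, 5.6, 5.0, 3.1, 1.7]) * 1e-4,
        (6000, 4.0): np.array([4.0, 6.0, 4.6, 2.6, 1.3]) * 1e-4,
    }
    best_params, (chi2, scale) = best_fit(obs, err, grid)
    print(best_params, chi2, scale)
    ```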

  16. Coaxial charged particle energy analyzer

    Science.gov (United States)

    Kelly, Michael A. (Inventor); Bryson, III, Charles E. (Inventor); Wu, Warren (Inventor)

    2011-01-01

    A non-dispersive electrostatic energy analyzer for electrons and other charged particles having a generally coaxial structure of a sequentially arranged sections of an electrostatic lens to focus the beam through an iris and preferably including an ellipsoidally shaped input grid for collimating a wide acceptance beam from a charged-particle source, an electrostatic high-pass filter including a planar exit grid, and an electrostatic low-pass filter. The low-pass filter is configured to reflect low-energy particles back towards a charged particle detector located within the low-pass filter. Each section comprises multiple tubular or conical electrodes arranged about the central axis. The voltages on the lens are scanned to place a selected energy band of the accepted beam at a selected energy at the iris. Voltages on the high-pass and low-pass filters remain substantially fixed during the scan.

  17. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer, (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets to enable remote analysis and determination of chemical composition and abundances of critical molecular constituents in space. The instrument is based on a Bessel beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state-of-the-art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  18. Highlights from BNL-RHIC

    CERN Document Server

    Tannenbaum, M J

    2012-01-01

    Recent highlights from Brookhaven National Laboratory and the Relativistic Heavy Ion Collider (RHIC) are reviewed and discussed. Topics include: Discovery of the strongly interacting Quark Gluon Plasma (sQGP) in 2005; RHIC machine operation in 2011 as well as latest achievements from the superconducting Magnet Division and the National Synchrotron Light Source II project. Highlights from QGP physics at RHIC include: comparison of new measurements of charged multiplicity in A+A collisions by ALICE at the LHC to previous RHIC measurements; Observation of the anti-alpha particle by the STAR experiment; Collective Flow, including the Triangular Flow discovery and the latest results on v3; the RHIC beam energy scan in search of the QCD critical point. The pioneering use at RHIC of hard-scattering as a probe of the sQGP will also be reviewed and the latest results presented including: jet-quenching via suppression of high pT particles and two particle correlations; new results on fragmentation functions using gamma...

  19. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC, formerly recognized as the Idaho Chemical Processing Plant), which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  20. Analyzing and modeling heterogeneous behavior

    Science.gov (United States)

    Lin, Zhiting; Wu, Xiaoqing; He, Dongyue; Zhu, Qiang; Ni, Jixiang

    2016-05-01

    Recently, it was pointed out that the non-Poisson statistics with heavy tail existed in many scenarios of human behaviors. But most of these studies claimed that power-law characterized diverse aspects of human mobility patterns. In this paper, we suggest that human behavior may not be driven by identical mechanisms and can be modeled as a Semi-Markov Modulated Process. To verify our suggestion and model, we analyzed a total of 1,619,934 records of library visitations (including undergraduate and graduate students). It is found that the distribution of visitation intervals is well fitted with three sections of lines instead of the traditional power law distribution in log-log scale. The results confirm that some human behaviors cannot be simply expressed as power law or any other simple functions. At the same time, we divided the data into groups and extracted period bursty events. Through careful analysis in different groups, we drew a conclusion that aggregate behavior might be composed of heterogeneous behaviors, and even the behaviors of the same type tended to be different in different period. The aggregate behavior is supposed to be formed by "heterogeneous groups". We performed a series of experiments. Simulation results showed that we just needed to set up two states Semi-Markov Modulated Process to construct proper representation of heterogeneous behavior.
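
    The abstract models inter-event times as a Semi-Markov Modulated Process composed of heterogeneous states. A minimal two-state simulation sketch is given below; the distributions and parameters are invented for illustration and are not the fitted values from the paper.

    ```python
    import numpy as np

    # Two-state semi-Markov modulated process for inter-event times: the hidden
    # state switches after a random sojourn, and each state emits waiting times
    # from its own distribution (one "active" state with short intervals, one
    # "quiet" state with long ones).

    def simulate_smmp(n_events, rng=None):
        rng = rng or np.random.default_rng()
        # State 0: bursty (short exponential waits); state 1: quiet (heavier waits)
        wait = [lambda: rng.exponential(0.5), lambda: rng.lognormal(2.0, 1.0)]
        sojourn = [lambda: rng.gamma(2.0, 5.0), lambda: rng.gamma(2.0, 20.0)]
        switch_to = {0: 1, 1: 0}

        state, state_time_left = 0, sojourn[0]()
        intervals, states = [], []
        for _ in range(n_events):
            dt = wait[state]()
            intervals.append(dt)
            states.append(state)
            state_time_left -= dt
            if state_time_left <= 0:        # sojourn over: switch to the other state
                state = switch_to[state]
                state_time_left = sojourn[state]()
        return np.array(intervals), np.array(states)

    intervals, states = simulate_smmp(10_000, np.random.default_rng(7))
    print("mean interval per state:",
          intervals[states == 0].mean(), intervals[states == 1].mean())
    ```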

  1. Thomson parabola ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cobble, James A [Los Alamos National Laboratory; Flippo, Kirk A [Los Alamos National Laboratory; Letzring, Samuel A [Los Alamos National Laboratory; Lopez, Frank E [Los Alamos National Laboratory; Offermann, Dustin T [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Mastrosimone, Dino [UNIV OF ROCHESTER

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a (5- or 8-kG) magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C{sup 6+} and C{sup 5+} may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004-inch or 0.010-inch. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.

  2. Objects in Films: analyzing signs

    Directory of Open Access Journals (Sweden)

    GAMBARATO, Renira Rampazzo

    2009-12-01

    Full Text Available The focus of this essay is the analysis of daily objects as signs in films. Objects from everyday life acquire several functions in films: they can be solely used as scene objects or to support a particular film style. Other objects are specially chosen to translate a character’s interior state of mind or the filmmaker’s aesthetical or ethical commitment to narrative concepts. In order to understand such functions and commitments, we developed a methodology for film analysis which focuses on the objects. Object interpretation, as the starting point of film analysis, is not a new approach. For instance, French film critic André Bazin proposed the use of object interpretation in the 1950s. Similarly, German film theorist Siegfried Kracauer stated it in the 1960s. However, there is currently no existing analytical model to use when engaging in object interpretation in film. This methodology searches for the most representative objects in films and involves both quantitative and qualitative analysis; we consider the number of times each object appears in a film (quantitative analysis) as well as the context of their appearance, i.e. the type of shot used and how that creates either a larger or smaller relevance and/or expressiveness (qualitative analysis). In addition to the criteria of relevance and expressiveness, we also analyze the functionality of an object by exploring details and specifying the role various objects play in films. This research was developed at Concordia University, Montreal, Canada and was supported by Foreign Affairs and International Trade Canada (DFAIT).

  3. Multinationals and plant survival

    DEFF Research Database (Denmark)

    Bandick, Roger

    2010-01-01

    The aim of this paper is twofold: first, to investigate how different ownership structures affect plant survival, and second, to analyze how the presence of foreign multinational enterprises (MNEs) affects domestic plants’ survival. Using a unique and detailed data set on the Swedish manufacturing sector, I am able to separate plants into those owned by foreign MNEs, domestic MNEs, exporting non-MNEs, and purely domestic firms. In line with previous findings, the result, when conditioned on other factors affecting survival, shows that foreign MNE plants have lower survival rates than non-MNE plants. However, separating the non-MNEs into exporters and non-exporters, the result shows that foreign MNE plants have higher survival rates than non-exporting non-MNEs, while the survival rates of foreign MNE plants and exporting non-MNE plants do not seem to differ. Moreover, the simple non

  4. The testis-specific VAD1.3/AEP1 interacts with {beta}-actin and syntaxin 1 and directs peri-nuclear/Golgi expression with bipartite nucleus localization (BNL) sequence

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Yan; Gao, Jing [Department of Obstetrics and Gynaecology, The University of Hong Kong, Pokfulam (Hong Kong); Yeung, William S.B. [Department of Obstetrics and Gynaecology, The University of Hong Kong, Pokfulam (Hong Kong); Centre for Reproduction, Development and Growth, Hong Kong Jockey Club Clinical Research Centre, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam (Hong Kong); Lee, Kai-Fai, E-mail: ckflee@hkucc.hku.hk [Department of Obstetrics and Gynaecology, The University of Hong Kong, Pokfulam (Hong Kong); Centre for Reproduction, Development and Growth, Hong Kong Jockey Club Clinical Research Centre, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam (Hong Kong)

    2010-10-15

    Research highlights: {yields} VAD1.3 interacts with {beta}-actin and syntaxin 1. {yields} VAD1.3 colocalizes with {beta}-actin in spermatids. {yields} The bipartite nucleus localization (BNL) signal is important for peri-nuclear/Golgi expression in transfected cells. {yields} The C-terminal region of VAD1.3 directs nuclear localization. -- Abstract: VAD1.3 (AEP1), a novel testis-specific gene, was first isolated from the testis of a retinol-treated vitamin-A-deficient (VAD) rat model. It is expressed at the acrosomal region of spermatids from postnatal day 25. VAD1.3 immunoreactivity is present in rat, human, monkey and porcine spermatids and spermatozoa, suggesting that VAD1.3 may play a role in acrosome formation. However, direct evidence on the detailed sub-cellular localization of the VAD1.3 protein in the acrosome and how VAD1.3 is involved in acrosome formation remains largely unknown. Here, we isolated and identified VAD1.3 interacting proteins by immunoprecipitation followed by mass spectrometry, and determined the functional motifs of VAD1.3 that were important for its specific sub-cellular location in vitro. We found that VAD1.3 bound to syntaxin 1 and {beta}-actin proteins in vitro. An immunogold electron microscopic study localized VAD1.3 immunoreactivity to the acrosome membranes and matrix, and colocalized it with the {beta}-actin protein. The full-length GFP-VAD (1-3601) and GFP-VAD (1-730) fusion proteins that contain the bipartite nucleus localization (BNL) signal were located in the peri-nucleus/Golgi of the transfected cells. In addition, the GFP signal colocalized with the endoplasmic reticulum marker and the syntaxin 1 protein in the transfected HeLa and GC-2spd cells. The C-terminal GFP-VAD (1770-3601) was expressed in the nucleus. Taken together, VAD1.3 interacts with {beta}-actin and syntaxin 1 in vitro. The BNL signal may mediate the peri-nuclear localization of the protein that may interact with syntaxin 1 and {beta}-actin for acrosome formation in

  5. Pathogen Phytosensing: Plants to Report Plant Pathogens

    Directory of Open Access Journals (Sweden)

    C. Neal Stewart

    2008-04-01

    Full Text Available Real-time systems that provide evidence of pathogen contamination in crops can be an important new line of early defense in agricultural centers. Plants possess defense mechanisms to protect against pathogen attack. Inducible plant defense is controlled by signal transduction pathways, inducible promoters and cis-regulatory elements corresponding to key genes involved in defense, and pathogen-specific responses. Identified inducible promoters and cis-acting elements could be utilized in plant sentinels, or ‘phytosensors’, by fusing these to reporter genes to produce plants with altered phenotypes in response to the presence of pathogens. Here, we have employed cis-acting elements from promoter regions of pathogen inducible genes as well as those responsive to the plant defense signal molecules salicylic acid, jasmonic acid, and ethylene. Synthetic promoters were constructed by combining various regulatory elements supplemented with the enhancer elements from the Cauliflower mosaic virus (CaMV 35S promoter to increase basal level of the GUS expression. The inducibility of each synthetic promoter was first assessed in transient expression assays using Arabidopsis thaliana protoplasts and then examined for efficacy in stably transgenic Arabidopsis and tobacco plants. Histochemical and fluorometric GUS expression analyses showed that both transgenic Arabidopsis and tobacco plants responded to elicitor and phytohormone treatments with increased GUS expression when compared to untreated plants. Pathogen-inducible phytosensor studies were initiated by analyzing the sensitivity of the synthetic promoters against virus infection. Transgenic tobacco plants infected with Alfalfa mosaic virus showed an increase in GUS expression when compared to mock-inoculated control plants, whereas Tobacco mosaic virus infection caused no changes in GUS expression. Further research, using these transgenic plants against a range of different

  6. Adaptation of thermal power plants

    NARCIS (Netherlands)

    Bogmans, Christian W.J.; Dijkema, Gerard P.J.; Vliet, van Michelle T.H.

    2017-01-01

    When does climate change information lead to adaptation? We analyze thermal power plant adaptation by means of investing in water-saving (cooling) technology to prevent a decrease in plant efficiency and load reduction. A comprehensive power plant investment model, forced with downscaled climate

  7. A Procedure for Determination of Degradation Acceptance Criteria for Structures and Passive Components in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Nie, J.; Braverman, J.; Hofmayer, C.; Choun, Y-S.; Hahm, D.; Choi, I-K.

    2012-01-30

    The Korea Atomic Energy Research Institute (KAERI) has been collaborating with Brookhaven National Laboratory since 2007 to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). This collaboration program aims at providing technical support to a five-year KAERI research project, which includes three specific areas that are essential to seismic probabilistic risk assessment: (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. The understanding and assessment of age-related degradations of structures, systems, and components and their impact on plant safety is the major goal of this KAERI-BNL collaboration. Four annual reports have been published before this report as a result of the collaboration research.

  8. Prospects for measuring K{sup +} {r_arrow} {pi}{sup +} {nu}{bar {nu}} and K{sub L}{sup 0} {r_arrow} {pi}{sup 0} {nu}{bar {nu}} at BNL

    Energy Technology Data Exchange (ETDEWEB)

    Bryman, D.A.; Littenberg, L.

    2000-09-18

    Rare kaon decay experiments underway or planned for the BNL AGS will yield new and independent determinations of V*{sub ts}V{sub td}. A measurement of B(K{sub L}{sup 0} {r_arrow} {pi}{sup 0} {nu}{bar {nu}}) allows a determination of the imaginary part of this quantity, which is the fundamental CP-violating parameter of the Standard Model, in a uniquely clean manner. Since the measurement of B(K{sup +} {r_arrow} {pi}{sup +} {nu}{bar {nu}}) determines {vert_bar}V*{sub ts}V{sub td}{vert_bar}, a complete derivation of the unitarity triangle is facilitated. These results can be compared to high precision data expected to come from the B sector in a number of ways, allowing for unique tests of new physics.

  9. [Quality control of plant extract].

    Science.gov (United States)

    Shao, Yun-dong; Gao, Wen-yuan; Liu, Dan; Jia, Wei; Duan, Hong-Quan; Zhang, Tie-jun

    2003-10-01

    The current situation of plant extracts in the domestic and international markets was analyzed in this paper. The quality control of 20 plant extracts with reasonably good sales in the US market was compared and analyzed. The analysis of the quality control of six plant extracts indicated that there were two main reasons for the varied quality specifications among different suppliers. One reason was that the plant species utilized by different companies were different. The other reason was that the extraction processes differed among production plants. Compared with the major international suppliers of plant extracts, the product quality of Chinese companies was not satisfactory. It was suggested that chromatography and chromatographic fingerprint techniques should be applied to improve the quality control standards for plant extracts in China.

  10. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  11. Technical basis for environmental qualification of microprocessor-based safety-related equipment in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Korsah, K.; Wood, R.T. [Oak Ridge National Lab., TN (United States); Hassan, M. [Brookhaven National Lab., Upton, NY (United States); Tanaka, T.J. [Sandia National Labs., Albuquerque, NM (United States)

    1998-01-01

    This document presents the results of studies sponsored by the Nuclear Regulatory Commission (NRC) to provide the technical basis for environmental qualification of computer-based safety equipment in nuclear power plants. The studies were conducted by Oak Ridge National Laboratory (ORNL), Sandia National Laboratories (SNL), and Brookhaven National Laboratory (BNL). The studies address the following: (1) adequacy of the present test methods for qualification of digital I and C systems; (2) preferred (i.e., Regulatory Guide-endorsed) standards; (3) recommended stressors to be included in the qualification process during type testing; (4) resolution of need for accelerated aging for equipment to be located in a benign environment; and (5) determination of an appropriate approach for addressing the impact of smoke in digital equipment qualification programs. Significant findings from the studies form the technical basis for a recommended approach to the environmental qualification of microprocessor-based safety-related equipment in nuclear power plants.

  12. Fuzzy Based Auto-coagulation Control Through Photometric Dispersion Analyzer

    Institute of Scientific and Technical Information of China (English)

    白桦; 李圭白

    2004-01-01

    The main role of water treatment plants is to supply high-quality, safe drinking water. Coagulation is one of the most important stages of surface water treatment. The photometric dispersion analyzer (PDA) is a new optical method for flocculation monitoring and is feasible for realizing coagulation feedback control. The on-line modification of the coagulation control system's set point (or optimum coagulant dose) has long hindered the application of this technology in water treatment plants. A fuzzy control system incorporating the photometric dispersion analyzer was utilized in this coagulation control system. A fuzzy logic inference control system using Takagi and Sugeno's fuzzy if-then rules is proposed for the on-line self-correction of the set point. The dosing rate fuzzy control system was programmed in a SIEMENS small-scale programmable logic controller. A 400 L/min middle-scale water treatment plant was utilized to simulate the reaction. With changes in raw water quality, the set point, as well as the coagulant dosing rate, was modified correctly and in time, and the residual turbidity before filtration was acceptable and stable. Results show that this fuzzy inference and control system performs well in coagulation control through the PDA.
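
    A minimal sketch of a zero-order Takagi-Sugeno inference of the kind described above follows: the deviation of the measured PDA flocculation index from the current set point, and its trend, are fuzzified, each if-then rule outputs a constant set-point correction, and the crisp correction is the firing-strength-weighted average. Membership breakpoints and rule consequents are invented for illustration, not the plant's tuned values.

    ```python
    # Zero-order Takagi-Sugeno inference for on-line set-point correction.

    def tri(x, a, b, c):
        """Triangular membership function with peak at b."""
        return max(0.0, min((x - a) / (b - a) if x <= b else (c - x) / (c - b), 1.0))

    def ts_setpoint_correction(error, d_error):
        # Fuzzify the error (negative / zero / positive) and its trend
        e = {"N": tri(error, -2.0, -1.0, 0.0),
             "Z": tri(error, -1.0, 0.0, 1.0),
             "P": tri(error, 0.0, 1.0, 2.0)}
        de = {"N": tri(d_error, -1.0, -0.5, 0.0),
              "Z": tri(d_error, -0.5, 0.0, 0.5),
              "P": tri(d_error, 0.0, 0.5, 1.0)}
        # Rule base: (error label, trend label) -> constant set-point correction
        rules = {("N", "N"): -0.4, ("N", "Z"): -0.2, ("N", "P"): -0.1,
                 ("Z", "N"): -0.1, ("Z", "Z"):  0.0, ("Z", "P"): +0.1,
                 ("P", "N"): +0.1, ("P", "Z"): +0.2, ("P", "P"): +0.4}
        strengths = {k: min(e[k[0]], de[k[1]]) for k in rules}
        total = sum(strengths.values())
        if total == 0.0:
            return 0.0
        return sum(strengths[k] * rules[k] for k in rules) / total

    print(ts_setpoint_correction(error=0.8, d_error=0.3))   # nudges the set point upward
    ```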

  13. Manufacturing Plants

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    China starts to produce vegetables and fruits in a factory. Sunshine, air and soil are indispensable for green plants. This might be axiomatic, but not in a plant factory. By creating a plant factory, scientists are trying to grow plants where natural elements are deficient or absent, such as deserts, islands, water surfaces, the South and North Poles and space, as well as in human habitats such as skyscrapers in modern cities.

  14. Manufacturing Plants

    Institute of Scientific and Technical Information of China (English)

    TANG YUANKAI

    2010-01-01

    Sunshine, air and soil are indispensable for green plants. This might be axiomatic but not in a plant factory. By creating a plant factory, scientists are trying to grow plants where natural elements are deficient or absent, such as deserts, islands, water surfaces, South and North poles and space, as well as in human habitats such as skyscrapers in modern cities.

  15. Aquatic plants

    DEFF Research Database (Denmark)

    Madsen, T. V.; Sand-Jensen, K.

    2006-01-01

    Aquatic flowering plants form a relatively young plant group on an evolutionary timescale. The group has developed over the past 80 million years from terrestrial flowering plants that re-colonised the aquatic environment after 60-100 million years on land. The exchange of species between terrestrial and aquatic environments continues today and is very intensive along stream banks. In this chapter we describe the physical and chemical barriers to the exchange of plants between land and water.

  16. [Plant hydroponics and its application prospect in medicinal plants study].

    Science.gov (United States)

    Zeng, Yan; Guo, Lan-Ping; Huang, Lu-Qi; Sun, Yu-Zhang

    2007-03-01

    This article introduced the theory and methods of hydroponics. Some examples of studies in agriculture and forestry were presented, and the effects of elements, environmental stress and hormones on the physiology of medicinal plants, studied using hydroponics, were analyzed. It also introduced the feasibility and advantages of hydroponics for the intermediate propagation and allelopathy of medicinal plants. Finally, it concluded that hydroponics will be widely used in medicinal plant research.

  17. Medicinal Plants.

    Science.gov (United States)

    Phillipson, J. David

    1997-01-01

    Highlights the demand for medicinal plants as pharmaceuticals and the demand for health care treatments worldwide and the issues that arise from this. Discusses new drugs from plants, anticancer drugs, antiviral drugs, antimalarial drugs, herbal remedies, quality, safety, efficacy, and conservation of plants. Contains 30 references. (JRH)

  18. DEVELOPMENT OF AN ON-LINE COAL WASHABILITY ANALYZER

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Miller

    1999-09-30

    Washability analysis is the basis for nearly all coal preparation plant separations. Unfortunately, there are no on-line techniques for determining this most fundamental of all coal cleaning information. In light of recent successes at the University of Utah, it now appears possible to determine coal washability on-line through the use of x-ray computed tomography (CT) analysis. The successful development of such a device is critical to the establishment of process control and automated coal blending systems. In this regard, Virginia Tech, Terra Tek Inc., and several eastern coal companies have joined with the University of Utah and agreed to undertake the development of a x-ray CT-based on-line coal washability analyzer with financial assistance from DOE. The three-year project will cost $594,571, of which 33% ($194,575) will be cost-shared by the participants. The project involves development of appropriate software and extensive testing/evaluation of well-characterized coal samples from operating coal preparation plants. Each project participant brings special expertise to the project which is expected to create a new dimension in coal cleaning technology. Finally, it should be noted that the analyzer may prove to be a universal analyzer capable of providing not only washability analysis, but also particle size distribution analysis, ash analysis and perhaps pyritic sulfur analysis.

  19. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key features: numerous practical examples, including actual spectrum analyzer circuits; instruction on how to us

  20. Designing of Acousto-optic Spectrum Analyzer

    Institute of Scientific and Technical Information of China (English)

    WANG Dan-zhi; SHAO Ding-rong; LI Shu-jian

    2004-01-01

    The structure of the acousto-optic spectrum analyzer was investigated, including the RF amplifying circuit, the optical structures and the post-processing circuit, and a modular design approach was applied to the spectrum analyzer. The modular spectrum analyzer offers stable performance and higher reliability, and different modules can be used according to different demands. The spectrum analyzer achieved a detection frequency error of 0.58 MHz, a detection responsivity of 90 dBm and a bandwidth of 50 MHz.

  1. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  2. Autoluminescent plants.

    Directory of Open Access Journals (Sweden)

    Alexander Krichevsky

    Full Text Available Prospects of obtaining plants glowing in the dark have captivated the imagination of scientists and laymen alike. While light emission has been developed into a useful marker of gene expression, bioluminescence in plants remained dependent on an externally supplied substrate. Evolutionary conservation of the prokaryotic gene expression machinery enabled expression of the six genes of the lux operon in chloroplasts, yielding plants that are capable of autonomous light emission. This work demonstrates that complex metabolic pathways of prokaryotes can be reconstructed and function in plant chloroplasts and that transplastomic plants can emit light that is visible to the naked eye.

  3. Plant volatiles.

    Science.gov (United States)

    Baldwin, Ian T

    2010-05-11

    Plant volatiles are the metabolites that plants release into the air. The quantities released are not trivial. Almost one-fifth of the atmospheric CO2 fixed by land plants is released back into the air each day as volatiles. Plants are champion synthetic chemists; they take advantage of their anabolic prowess to produce volatiles, which they use to protect themselves against biotic and abiotic stresses and to provide information - and potentially disinformation - to mutualists and competitors alike. As transferors of information, volatiles have provided plants with solutions to the challenges associated with being rooted in the ground and immobile.

  4. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social ... has much to offer in analyzing the policy process.

  5. Performance evaluation of PL-11 platelet analyzer

    Institute of Scientific and Technical Information of China (English)

    张有涛

    2013-01-01

    Objective: To evaluate and report the performance of the PL-11 platelet analyzer. Methods: Intravenous blood samples anticoagulated with EDTA-K2 and sodium citrate were tested by the PL-11 platelet analyzer to evaluate the intra-assay and inter-assay coefficients of variation (CV),

  6. Analyzing metabolomics-based challenge tests

    NARCIS (Netherlands)

    Vis, D.J.; Westerhuis, J.A.; Jacobs, D.M.; Duynhoven, van J.P.M.; Wopereis, S.; Ommen, van B.; Hendriks, M.M.W.B.; Smilde, A.K.

    2015-01-01

    Challenge tests are used to assess the resilience of human beings to perturbations by analyzing responses to detect functional abnormalities. Well known examples are allergy tests and glucose tolerance tests. Increasingly, metabolomics analysis of blood or serum samples is used to analyze the biolog

  7. Towards Multi Fuel SOFC Plant

    DEFF Research Database (Denmark)

    Rokni, Masoud; Clausen, Lasse Røngaard; Bang-Møller, Christian

    2011-01-01

    Complete Solid Oxide Fuel Cell (SOFC) plants fed by several different fuels are suggested and analyzed. The plant sizes are about 10 kW, which is suitable for a single-family house with needs for both electricity and heat. Alternative fuels such as methanol, DME (Di-Methyl Ether) and ethanol are also considered, and the results will be compared with the base plant fed by Natural Gas (NG). A single plant design will be suggested that can be fed with methanol, DME and ethanol whenever these fuels are available. It will be shown that the plant fed by ethanol will have a slightly higher electrical efficiency compared with the other fuels. A methanator will be suggested for inclusion in the plant design in order to produce methane from the fuel before it enters the anode side of the SOFC stacks. Increasing the methane content will decrease the needed compressor power and thereby increase the plant power.

  8. [Plant hormones, plant growth regulators].

    Science.gov (United States)

    Végvári, György; Vidéki, Edina

    2014-06-29

    Plants seem to be rather defenceless: unlike animals, they are unable to move and have no nervous or immune system. Nevertheless, plants do have hormones, though these substances are not produced in glands. In view of their complexity they lag behind animals; however, plant organisms show large-scale integration in their structure and function. In higher plants, as in animals, intercellular communication is carried out through chemical messengers. These specific compounds in plants are called phytohormones or, in a wider sense, bioregulators. Even a small quantity of these endogenous organic compounds is able to regulate the operation, growth and development of higher plants, and to maintain the connection between cells and tissues and the synergy between organs. Since plants have no nervous or immune systems, phytohormones play an essential role in their life.

  9. Analyzing machine noise for real time maintenance

    Science.gov (United States)

    Yamato, Yoji; Fukumoto, Yoshifumi; Kumazaki, Hiroki

    2017-02-01

    Recently, IoT technologies have progressed, and applications in the maintenance area are expected. However, IoT maintenance applications have not yet spread in Japan because sensing and analysis are one-off solutions for each case, collecting sensing data is costly, and maintenance automation is insufficient. To resolve these problems, this paper proposes a maintenance platform that analyzes sound data at the edge, analyzes only anomaly data in the cloud, and orders maintenance automatically. We also implement a sample application and compare it with related work.
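
    A minimal sketch of the edge-side filtering idea described in the abstract follows: summarize each short window of machine sound by its band energies, score it against a baseline built from healthy windows, and forward only anomalous windows for deeper (cloud-side) analysis. The band layout, scoring rule, threshold, and forward() stub are assumptions for illustration, not the paper's implementation.

    ```python
    import numpy as np

    # Edge-side anomaly filter: per-window band energies scored against a
    # baseline of healthy windows; only anomalous windows are forwarded.

    FRAME = 1024
    N_BANDS = 16

    def band_energies(frame):
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
        bands = np.array_split(spectrum, N_BANDS)
        return np.log1p(np.array([b.sum() for b in bands]))

    def fit_baseline(healthy_frames):
        feats = np.array([band_energies(f) for f in healthy_frames])
        return feats.mean(axis=0), feats.std(axis=0) + 1e-9

    def anomaly_score(frame, mean, std):
        z = (band_energies(frame) - mean) / std
        return float(np.sqrt(np.mean(z ** 2)))

    def forward(frame, score):
        # Stand-in for uploading the anomalous window to the cloud analyzer
        print(f"anomaly forwarded, score={score:.2f}")

    rng = np.random.default_rng(3)
    healthy = [rng.normal(0, 1, FRAME) for _ in range(200)]
    mean, std = fit_baseline(healthy)

    stream = healthy[:5] + [rng.normal(0, 1, FRAME) + np.sin(np.arange(FRAME) * 0.3) * 4]
    for frame in stream:
        score = anomaly_score(frame, mean, std)
        if score > 3.0:                      # invented threshold
            forward(frame, score)
    ```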

  10. ANALYZING OF MULTICOMPONENT UNDERSAMPLED SIGNALS BY HAF

    Institute of Scientific and Technical Information of China (English)

    Tao Ran; Shan Tao; Zhou Siyong; Wang Yue

    2001-01-01

    The phenomenon of frequency ambiguity may appear in radar or communication systems. S. Barbarossa (1991) unwrapped the frequency ambiguity of single-component undersampled signals using the Wigner-Ville distribution (WVD), but until now there has been no effective algorithm for analyzing multicomponent undersampled signals. A new algorithm for analyzing multicomponent undersampled signals using the high-order ambiguity function (HAF) is proposed here. HAF analyzes polynomial phase signals by the method of phase rank reduction; its advantages are that it has no boundary effect and is not sensitive to the cross-terms of multicomponent signals. The simulation results prove the effectiveness of the HAF algorithm.
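
    A minimal sketch of one phase-rank-reduction step of the HAF is shown below for a two-component linear-FM signal: the second-order instantaneous moment s(t+tau) * conj(s(t-tau)) converts each quadratic-phase component into a sinusoid whose frequency is proportional to its chirp rate, so an FFT of the moment exhibits peaks at those rates. The signal parameters are invented, and the undersampling/ambiguity aspects of the paper are not reproduced.

    ```python
    import numpy as np

    # One phase-rank-reduction step: the second-order instantaneous moment of a
    # quadratic-phase (linear-FM) component is a sinusoid at 2 * tau * chirp_rate.

    n = 4096
    t = np.arange(n)
    a1, f1, c1 = 1.0, 0.05, 2.0e-5     # amplitude, start frequency, chirp rate
    a2, f2, c2 = 0.7, 0.20, -1.2e-5

    sig = (a1 * np.exp(2j * np.pi * (f1 * t + 0.5 * c1 * t**2)) +
           a2 * np.exp(2j * np.pi * (f2 * t + 0.5 * c2 * t**2)))

    tau = 256
    moment = sig[2 * tau:] * np.conj(sig[:-2 * tau])      # s(t+tau) * conj(s(t-tau))
    spec = np.abs(np.fft.fft(moment, 1 << 16))
    freqs = np.fft.fftfreq(1 << 16)

    # The strongest peak sits at 2 * tau * chirp_rate of the dominant component
    peak = freqs[np.argmax(spec)]
    print("estimated chirp rate:", peak / (2 * tau), "true rates:", c1, c2)
    ```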

  11. A method for analyzing strategic product launch

    OpenAIRE

    XIAO Junji

    2007-01-01

    This paper proposes a method to analyze how the manufacturers make product launch decisions in a multi-product oligopoly market, and how the heterogeneity in their products affects the manufacturers' decisions on model launch and withdrawal.

  12. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  13. On-Demand Urine Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  14. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  15. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  16. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The organizational risk analyzer (ORA) tool system is selected to study the network of East Turkistan terrorists. The model of the relationships among its personnel, knowledge, resources and task entities is represented by the meta-matrix in ORA, which is used to analyze the risks and vulnerabilities of the organizational structure quantitatively and to obtain the final vulnerabilities and risks of the organization. A case study in this system shows that it offers a shortcut to effectively disrupting the network...

  17. Analyzing storage media of digital camera

    OpenAIRE

    Chow, KP; Tse, KWH; Law, FYW; Ieong, RSC; Kwan, MYK; Tse, H.; Lai, PKY

    2009-01-01

    Digital photography has become popular in recent years. Photographs have become common tools for people to record every tiny part of their daily life. By analyzing the storage media of a digital camera, crime investigators may extract a lot of useful information to reconstruct events. In this work, we discuss a few approaches to analyzing these kinds of storage media of digital cameras. A hypothetical crime case is used as a case study to demonstrate the concepts. © 2009 IEEE.

  18. The Information Flow Analyzing Based on CPC

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhang; LI Hui

    2005-01-01

    The information flow chart within the product life cycle is presented based on collaborative production commerce (CPC) concepts. In this chart, the separate information systems are integrated by means of enterprise knowledge assets that CPC promotes from production knowledge. The information flow in the R&D process is analyzed in the environment of a virtual R&D group and distributed PDM. In addition, the information flow throughout the manufacturing and marketing process is analyzed in the CPC environment.

  19. QUBIT DATA STRUCTURES FOR ANALYZING COMPUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Vladimir Hahanov

    2014-11-01

    Qubit models and methods for improving the performance of software and hardware for analyzing digital devices by increasing the dimension of the data structures and memory are proposed. The basic concepts, terminology and definitions necessary for implementing quantum computing when analyzing virtual computers are introduced. Investigation results concerning the design and modeling of computer systems in cyberspace based on the use of a two-component structure are presented.

  20. Analyte comparisons between 2 clinical chemistry analyzers.

    OpenAIRE

    Sutton, A; Dawson, H; Hoff, B; Grift, E; Shoukri, M

    1999-01-01

    The purpose of this study was to assess agreement between a wet reagent and a dry reagent analyzer. Thirteen analytes (albumin, globulin, alkaline phosphatase, alanine aminotransferase, amylase, urea nitrogen, calcium, cholesterol, creatinine, glucose, potassium, total bilirubin, and total protein) for both canine and feline serum were evaluated. Concordance correlations, linear regression, and plots of difference against mean were used to analyze the data. Concordance correlations were excel...
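
    One of the agreement statistics named above, the concordance correlation coefficient, can be computed directly from paired results. The sketch below is a minimal implementation applied to invented paired values, not the study's data.

    ```python
    # Lin's concordance correlation coefficient for paired results from two analyzers.
    import numpy as np

    def concordance_ccc(x, y):
        x, y = np.asarray(x, float), np.asarray(y, float)
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()           # population variances
        sxy = ((x - mx) * (y - my)).mean()  # population covariance
        return 2 * sxy / (vx + vy + (mx - my) ** 2)

    # Illustrative paired glucose results (mmol/L) from a wet and a dry reagent analyzer.
    wet = [5.1, 6.3, 4.8, 7.9, 5.5]
    dry = [5.0, 6.5, 4.9, 7.6, 5.6]
    print(round(concordance_ccc(wet, dry), 3))
    ```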

  1. DEVELOPMENT OF AN ON-LINE COAL WASHABILITY ANALYZER

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Miller; C.L. Lin; G.H. Luttrell; G.T. Adel; Barbara Marin

    2001-06-26

    Washability analysis is the basis for nearly all coal preparation plant separations. Unfortunately, there are no on-line techniques for determining this most fundamental of all coal cleaning information. In light of recent successes at the University of Utah, it now appears possible to determine coal washability on-line through the use of x-ray computed tomography (CT) analysis. The successful development of such a device is critical to the establishment of process control and automated coal blending systems. In this regard, Virginia Tech, Terra Tek Inc., and U.S. coal producers have joined with the University of Utah to undertake the development of an X-ray CT-based on-line coal washability analyzer with financial assistance from DOE. Each project participant brought special expertise to the project in order to create a new dimension in coal cleaning technology. The project involves development of appropriate software and extensive testing/evaluation of well-characterized coal samples from operating coal preparation plants. Data collected to date suggest that this new technology is capable of serving as a universal analyzer that can not only provide washability analysis, but also particle size distribution analysis, ash analysis, and perhaps pyritic sulfur analysis.
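
    For readers unfamiliar with washability analysis, the quantity the analyzer must deliver on-line is essentially the cumulative float-yield versus cumulative-ash relationship derived from density-fraction data. The sketch below performs that bookkeeping on invented fractions.

    ```python
    # Cumulative float yield and ash from float-sink density fractions (invented data).
    yield_pct = [30.0, 25.0, 20.0, 15.0, 10.0]   # mass percent in each density fraction
    ash_pct   = [ 4.0,  8.0, 15.0, 30.0, 60.0]   # ash percent of each fraction

    cum_yield, cum_ash_units = 0.0, 0.0
    for y, a in zip(yield_pct, ash_pct):
        cum_yield += y
        cum_ash_units += y * a
        print(f"cum. float yield {cum_yield:5.1f}%  cum. ash {cum_ash_units / cum_yield:5.1f}%")
    ```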

  2. Human Factors Considerations in New Nuclear Power Plants: Detailed Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.; Higgins, J.; Brown, W.; Fink, R.

    2008-02-14

    This Nuclear Regulatory Commission (NRC) sponsored study has identified human-performance issues in new and advanced nuclear power plants. To identify the issues, current industry developments and trends were evaluated in the areas of reactor technology, instrumentation and control technology, human-system integration technology, and human factors engineering (HFE) methods and tools. The issues were organized into seven high-level HFE topic areas: Role of Personnel and Automation, Staffing and Training, Normal Operations Management, Disturbance and Emergency Management, Maintenance and Change Management, Plant Design and Construction, and HFE Methods and Tools. The issues were then prioritized into four categories using a 'Phenomena Identification and Ranking Table' methodology based on evaluations provided by 14 independent subject matter experts. The subject matter experts were knowledgeable in a variety of disciplines. Vendors, utilities, research organizations and regulators all participated. Twenty issues were categorized into the top priority category. This Brookhaven National Laboratory (BNL) technical report provides the detailed methodology, issue analysis, and results. A summary of the results of this study can be found in NUREG/CR-6947. The research performed for this project has identified a large number of human-performance issues for new control stations and new nuclear power plant designs. The information gathered in this project can serve as input to the development of a long-term strategy and plan for addressing human performance in these areas through regulatory research. Addressing human-performance issues will provide the technical basis from which regulatory review guidance can be developed to meet these challenges. The availability of this review guidance will help set clear expectations for how the NRC staff will evaluate new designs, reduce regulatory uncertainty, and provide a well-defined path to new nuclear power plant

  3. Analyzing visual signals as visual scenes.

    Science.gov (United States)

    Allen, William L; Higham, James P

    2013-07-01

    The study of visual signal design is gaining momentum as techniques for studying signals become more sophisticated and more freely available. In this paper we discuss methods for analyzing the color and form of visual signals, for integrating signal components into visual scenes, and for producing visual signal stimuli for use in psychophysical experiments. Our recommended methods aim to be rigorous, detailed, quantitative, objective, and where possible based on the perceptual representation of the intended signal receiver(s). As methods for analyzing signal color and luminance have been outlined in previous publications we focus on analyzing form information by discussing how statistical shape analysis (SSA) methods can be used to analyze signal shape, and spatial filtering to analyze repetitive patterns. We also suggest the use of vector-based approaches for integrating multiple signal components. In our opinion elliptical Fourier analysis (EFA) is the most promising technique for shape quantification but we await the results of empirical comparison of techniques and the development of new shape analysis methods based on the cognitive and perceptual representations of receivers. Our manuscript should serve as an introductory guide to those interested in measuring visual signals, and while our examples focus on primate signals, the methods are applicable to quantifying visual signals in most taxa.

  4. Plant Behavior

    Science.gov (United States)

    Liu, Dennis W. C.

    2014-01-01

    Plants are a huge and diverse group of organisms, ranging from microscopic marine phytoplankton to enormous terrestrial trees epitomized by the giant sequoia: 300 feet tall, living 3000 years, and weighing as much as 3000 tons. For this plant issue of "CBE-Life Sciences Education," the author focuses on a botanical topic that most…

  5. Plant minichromosomes.

    Science.gov (United States)

    Birchler, James A; Graham, Nathaniel D; Swyers, Nathan C; Cody, Jon P; McCaw, Morgan E

    2016-02-01

    Plant minichromosomes have the potential for stacking multiple traits on a separate entity from the remainder of the genome. Transgenes carried on an independent chromosome would facilitate conferring many new properties to plants and using minichromosomes as genetic tools. The favored method for producing plant minichromosomes is telomere-mediated chromosomal truncation because the epigenetic nature of centromere function prevents using centromere sequences to confer the ability to organize a kinetochore when reintroduced into plant cells. Because haploid induction procedures are not always complete in eliminating one parental genome, chromosomes from the inducer lines are often present in plants that are otherwise haploid. This fact suggests that minichromosomes could be combined with doubled haploid breeding to transfer stacked traits more easily to multiple lines and to use minichromosomes for massive scale genome editing.

  6. A resource-efficient adaptive Fourier analyzer

    Science.gov (United States)

    Hajdu, C. F.; Zamantzas, C.; Dabóczi, T.

    2016-10-01

    We present a resource-efficient frequency adaptation method to complement the Fourier analyzer proposed by Péceli. The novel frequency adaptation scheme is based on the adaptive Fourier analyzer suggested by Nagy. The frequency adaptation method was elaborated with a view to realizing a detector connectivity check on an FPGA in a new beam loss monitoring (BLM) system, currently being developed for beam setup and machine protection of the particle accelerators at the European Organisation for Nuclear Research (CERN). The paper summarizes the Fourier analyzer to the extent relevant to this work and the basic principle of the related frequency adaptation methods. It then outlines the suggested new scheme, presents practical considerations for implementing it and underpins it with an example and the corresponding operational experience.
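
    The recursive idea behind such analyzers can be illustrated with a simplified LMS-style coefficient update. The sketch below is not Péceli's resonator-based structure or Nagy's adaptation scheme; the sample rate, fundamental frequency, harmonic count and gain are arbitrary assumptions.

    ```python
    # LMS-style recursive estimation of Fourier coefficients (simplified sketch,
    # not the resonator-based analyzer described in the record).
    import numpy as np

    fs, f0, n_harm, mu = 1000.0, 50.0, 3, 0.05   # sample rate, fundamental, harmonics, gain
    t = np.arange(2000) / fs
    x = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 3 * f0 * t)

    c = np.zeros(n_harm, complex)                # one complex coefficient per harmonic
    for n, xn in enumerate(x):
        basis = np.exp(1j * 2 * np.pi * f0 * np.arange(1, n_harm + 1) * n / fs)
        err = xn - np.real(c @ basis)            # reconstruction error
        c += mu * err * np.conj(basis)           # gradient-style coefficient update

    print(np.round(np.abs(c), 3))                # magnitudes approach roughly [1.0, 0.0, 0.3]
    ```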

  7. Detecting influenza outbreaks by analyzing Twitter messages

    CERN Document Server

    Culotta, Aron

    2010-01-01

    We analyze over 500 million Twitter messages from an eight month period and find that tracking a small number of flu-related keywords allows us to forecast future influenza rates with high accuracy, obtaining a 95% correlation with national health statistics. We then analyze the robustness of this approach to spurious keyword matches, and we propose a document classification component to filter these misleading messages. We find that this document classifier can reduce error rates by over half in simulated false alarm experiments, though more research is needed to develop methods that are robust in cases of extremely high noise.
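
    At its core the approach correlates a weekly keyword-match rate with official influenza statistics. The sketch below shows that calculation on invented weekly counts; the keyword set and numbers are assumptions, not the paper's data.

    ```python
    # Correlate a weekly flu-keyword match rate with reported influenza rates (toy numbers).
    import numpy as np

    keyword_msgs = np.array([120, 180, 260, 400, 390, 310, 220, 150])  # messages matching flu-related keywords
    total_msgs   = np.array([9.0e5, 9.2e5, 8.8e5, 9.1e5, 9.0e5, 8.9e5, 9.3e5, 9.1e5])
    ili_rate     = np.array([1.1, 1.6, 2.4, 3.8, 3.6, 2.9, 2.0, 1.4])  # reported %ILI per week

    match_rate = keyword_msgs / total_msgs
    r = np.corrcoef(match_rate, ili_rate)[0, 1]
    print(f"Pearson correlation: {r:.3f}")
    ```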

  8. Review of Recent Aging-Related Degradation Occurrences of Structures and Passive Components in U.S. Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Nie,J.; Braverman, J.; Hofmayer, C.; Choun, Y.-S.; Kim, M.K.; Choi, I.-K.

    2009-04-02

    The Korea Atomic Energy Research Institute (KAERI) and Brookhaven National Laboratory (BNL) are collaborating to develop seismic capability evaluation technology for degraded structures and passive components (SPCs) under a multi-year research agreement. To better understand the status and characteristics of degradation of SPCs in nuclear power plants (NPPs), the first step in this multi-year research effort was to identify and evaluate degradation occurrences of SPCs in U.S. NPPs. This was performed by reviewing recent publicly available information sources to identify and evaluate the characteristics of degradation occurrences and then comparing the information to past observations. Ten categories of SPCs that are applicable to Korean NPPs were identified, comprising anchorage, concrete, containment, exchanger, filter, piping system, reactor pressure vessel, structural steel, tank, and vessel. Software tools were developed to expedite the review process. Results from this review effort were compared to previous data in the literature to characterize the overall degradation trends.

  9. Color screening scenario for quarkonia suppression in a quasiparticle model compared with data obtained from experiments at the CERN SPS, BNL RHIC, and CERN LHC

    Science.gov (United States)

    Srivastava, P. K.; Mishra, M.; Singh, C. P.

    2013-03-01

    We present a modified color screening model for J/ψ suppression in the quark-gluon plasma (QGP) using the quasiparticle model (QPM) as the equation of state (EOS). Other theoretical ingredients incorporated in the model are feed-down from higher resonances, namely, χc, and ψ', dilated formation time for quarkonia, and viscous effects of the QGP medium. By assuming further that the QGP is expanding with Bjorken's hydrodynamical expansion, the present model is used to analyze the centrality dependence of the J/ψ suppression in the mid-rapidity region and compare it with the data obtained from Super Proton Synchrotron, Relativistic Heavy Ion Collider, and Large Hadron Collider experiments. We find that the centrality dependence of the data for the survival probability at all energies is well reproduced by our model. We further compare our model predictions with the results obtained from the bag model EOS for QGP which has usually been used earlier in all such calculations.

  10. Analyzing volatile compounds in dairy products

    Science.gov (United States)

    Volatile compounds give the first indication of the flavor in a dairy product. Volatiles are isolated from the sample matrix and then analyzed by chromatography, sensory methods, or an electronic nose. Isolation may be performed by solvent extraction or headspace analysis, and gas chromatography i...

  11. GSM Trace Quality Analyzer (TQA) software

    OpenAIRE

    Blanchart Forne, Marc

    2016-01-01

    Connectivity is now the must-have service for enhancing passenger experience. To prove, and also to show customers, the quality of the connectivity system, a user-friendly mock-up has to be designed. A packet analyzer software tool was designed to validate an existing SATCOM simulator and to improve future airline network architectures.

  12. Imaging thermal plasma mass and velocity analyzer

    Science.gov (United States)

    Yau, Andrew W.; Howarth, Andrew

    2016-07-01

    We present the design and principle of operation of the imaging ion mass and velocity analyzer on the Enhanced Polar Outflow Probe (e-POP), which measures low-energy (1-90 eV/e) ion mass composition (1-40 AMU/e) and velocity distributions using a hemispherical electrostatic analyzer (HEA), a time-of-flight (TOF) gate, and a pair of toroidal electrostatic deflectors (TED). The HEA and TOF gate measure the energy-per-charge and azimuth of each detected ion and the ion transit time inside the analyzer, respectively, providing the 2-D velocity distribution of each major ionospheric ion species and resolving the minor ion species under favorable conditions. The TED are in front of the TOF gate and optionally sample ions at different elevation angles up to ±60°, for measurement of 3-D velocity distribution. We present examples of observation data to illustrate the measurement capability of the analyzer, and show the occurrence of enhanced densities of heavy "minor" O++, N+, and molecular ions and intermittent, high-velocity (a few km/s) upward and downward flowing H+ ions in localized regions of the quiet time topside high-latitude ionosphere.
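
    The underlying relation is elementary: with energy-per-charge E/q from the electrostatic analyzer and transit time t over an effective flight path L, the kinetic energy gives m/q = 2(E/q)(t/L)². The helper below is a back-of-the-envelope sketch with an assumed path length, not the e-POP instrument's calibration.

    ```python
    # Back-of-the-envelope time-of-flight mass-per-charge estimate (assumed geometry).
    E_CHARGE = 1.602176634e-19    # C
    AMU      = 1.66053906660e-27  # kg

    def mass_per_charge_amu(energy_per_charge_eV, transit_time_s, path_length_m):
        # E = 1/2 m v^2 with v = L/t  ->  m/q = 2 (E/q) (t/L)^2
        m_per_q_kg_per_C = 2.0 * energy_per_charge_eV * (transit_time_s / path_length_m) ** 2
        return m_per_q_kg_per_C * E_CHARGE / AMU

    # Example: a 10 eV/e ion crossing an assumed 0.1 m flight path in ~9.1 microseconds
    # comes out near 16 AMU/e (atomic oxygen).
    print(round(mass_per_charge_amu(10.0, 9.1e-6, 0.1), 1))
    ```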

  13. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak…

  14. Thermal and Evolved-Gas Analyzer Illustration

    Science.gov (United States)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  15. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. A new way to analyze company structure by utilizing the natural social network that exists within the company, together with an example of its usage on the Enron company, is presented in this paper.

  16. Analyzing computer system performance with Perl

    CERN Document Server

    Gunther, Neil J

    2011-01-01

    This expanded second edition of Analyzing Computer System Performance with Perl::PDQ builds on the success of the first edition. It contains new chapters on queues, tools and virtualization, and a new Perl listing format to aid the readability of PDQ models.

  17. Graphic method for analyzing common path interferometers

    DEFF Research Database (Denmark)

    Glückstad, J.

    1998-01-01

    Common path interferometers are widely used for visualizing phase disturbances and fluid flows. They are attractive because of the inherent simplicity and robustness in the setup. A graphic method will be presented for analyzing and optimizing filter parameters in common path interferometers....

  18. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    ...to PEPA programs, the approximating result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure of more than one thousand states.

  19. Analyzing the Information Economy: Tools and Techniques.

    Science.gov (United States)

    Robinson, Sherman

    1986-01-01

    Examines methodologies underlying studies which measure the information economy and considers their applicability and limitations for analyzing policy issues concerning libraries and library networks. Two studies provide major focus for discussion: Porat's "The Information Economy: Definition and Measurement" and Machlup's "Production and…

  20. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us enormous static sequence information, understanding of the sophisticated biology continues to require integrating the computational modeling, system analysis, technology development for experiments, and quantitative experiments all together to analyze the biology architecture on various levels, which is just the origin of systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology,...

  1. Studying Reliability Using Identical Handheld Lactate Analyzers

    Science.gov (United States)

    Stewart, Mark T.; Stavrianeas, Stasinos

    2008-01-01

    Accusport analyzers were used to generate lactate performance curves in an investigative laboratory activity emphasizing the importance of reliable instrumentation. Both the calibration and testing phases of the exercise provided students with a hands-on opportunity to use laboratory-grade instrumentation while allowing for meaningful connections…

  2. Analyzing Languages for Specific Purposes Discourse

    Science.gov (United States)

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  3. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime sa

  4. 40 CFR 90.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature... drying. Chemical dryers are not an acceptable method of removing water from the sample. Water removal...

  5. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C. Stuart; Hawk, James A.

    1995-01-01

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the overall bed pressure drop, or over some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence.
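
    One generic way to summarize such a fluctuating pressure-drop signal is by its relative fluctuation intensity and dominant frequency. The sketch below does this for a synthetic signal; it is only an illustration of the signal processing idea, not the patented analyzer circuit.

    ```python
    # Fluctuation intensity and dominant frequency of a (synthetic) bed pressure-drop signal.
    import numpy as np

    fs = 200.0                        # sampling rate, Hz
    t = np.arange(0, 30, 1 / fs)
    rng = np.random.default_rng(0)
    dp = 5.0 + 0.4 * np.sin(2 * np.pi * 3.0 * t) + 0.1 * rng.standard_normal(t.size)  # kPa

    fluctuation = dp.std() / dp.mean()               # relative fluctuation intensity
    spec = np.abs(np.fft.rfft(dp - dp.mean())) ** 2
    freqs = np.fft.rfftfreq(dp.size, 1 / fs)
    dominant = freqs[spec.argmax()]                  # ~3 Hz for this synthetic signal

    print(f"relative fluctuation {fluctuation:.3f}, dominant frequency {dominant:.2f} Hz")
    ```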

  6. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  7. Seismic fragility of nuclear power plant components (Phase II)

    Energy Technology Data Exchange (ETDEWEB)

    Bandyopadhyay, K.K.; Hofmayer, C.H.; Kassir, M.K.; Pepper, S.E. (Brookhaven National Lab., Upton, NY (USA))

    1990-02-01

    As part of the Component Fragility Program which was initiated in FY 1985, three additional equipment classes have been evaluated. This report contains the fragility results and discussions on these equipment classes, which are switchgear, I and C panels and relays. Both low and medium voltage switchgear assemblies have been considered and a separate fragility estimate for each type is provided. Test data on cabinets from the nuclear instrumentation/neutron monitoring system, plant/process protection system, solid state protective system and engineered safeguards test system comprise the BNL data base for I and C panels (NSSS). Fragility levels have been determined for various failure modes of switchgear and I and C panels, and the deterministic results are presented in terms of test response spectra. In addition, the test data have been evaluated for estimating the respective probabilistic fragility levels, which are expressed in terms of a median value, an uncertainty coefficient, a randomness coefficient and an HCLPF value. Due to the wide variation in relay designs and fragility levels, a generic fragility level cannot be established for relays. 7 refs., 13 figs., 12 tabs.

  8. Seed planting

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes prairie seed plantings on Neal Smith National Wildlife Refuge (formerly Walnut Creek National Wildlife Refuge) between 1992 and 2009.

  9. T Plant

    Data.gov (United States)

    Federal Laboratory Consortium — Arguably the second most historic building at Hanford is the T Plant. This facility is historic in that it's the oldest remaining nuclear facility in the country that...

  10. Plant Macrofossils

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Records of past vegetation and environmental change derived from plant remains large enough to be seen without a microscope (macrofossils), such as leaves, needles,...

  11. Simulation of a Hyperbolic Field Energy Analyzer

    CERN Document Server

    Gonzalez-Lizardo, Angel

    2016-01-01

    Energy analyzers are important plasma diagnostic tools with applications in a broad range of disciplines including molecular spectroscopy, electron microscopy, basic plasma physics, plasma etching, plasma processing, and ion sputtering technology. The Hyperbolic Field Energy Analyzer (HFEA) is a novel device able to determine ion and electron energy spectra and temperatures. The HFEA is well suited for ion temperature and density diagnostics in situations where ions are scarce. A simulation of the capacity of the HFEA to discriminate particles of a particular energy level, as well as to determine temperature and density, is performed in this work. The electric field due to the combination of the conical elements, collimator lens, and Faraday cup applied voltages was computed on a well-suited three-dimensional grid. The field is later used to compute the trajectory of a set of particles with a predetermined energy distribution. The results include the observation of the particle trajectories inside the sens...
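
    Once the field is known on a grid, trajectory computation amounts to integrating the equation of motion for each charged particle. The sketch below uses a simple kick-drift (semi-implicit Euler) integrator and an analytic stand-in field, since the actual HFEA geometry and voltages are not given here.

    ```python
    # Kick-drift integration of an ion in a simple analytic electrostatic field
    # (a stand-in for the gridded HFEA field).
    import numpy as np

    Q, M = 1.602e-19, 6.64e-26       # charge (C) and mass (kg) of a singly charged argon ion

    def efield(pos):
        # Illustrative quadrupole-like field E = k * (x, y, -2z); not the real HFEA field.
        k = 50.0                     # field gradient, V/m per m
        x, y, z = pos
        return k * np.array([x, y, -2.0 * z])

    dt = 1e-8
    pos = np.array([0.01, 0.0, 0.02])        # m
    vel = np.array([0.0, 0.0, -2.0e3])       # m/s

    for _ in range(2000):
        vel += (Q / M) * efield(pos) * dt    # kick
        pos += vel * dt                      # drift

    print(pos, vel)
    ```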

  12. Operating System Performance Analyzer for Embedded Systems

    Directory of Open Access Journals (Sweden)

    Shahzada Khayyam Nisar

    2011-11-01

    An RTOS provides a number of services to an embedded system design, such as task management, memory management, and resource management, to build a program. Choosing the best OS for an embedded system is based on the OSes available to system designers and on their previous knowledge and experience. This can cause a mismatch between the OS and the embedded system. RTOS performance analysis is critical in the design and integration of embedded software to ensure that the application meets its limits at runtime. To select an appropriate operating system for an embedded system for a particular application, the OS services need to be analyzed. These OS services are characterized by parameters used to establish performance metrics. The performance metrics selected include context switch time, preemption time and interrupt latency. The performance metrics are analyzed to choose the right OS for an embedded system for a particular application.
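
    On real hardware these metrics come from instrumented RTOS hooks or external instrumentation. Purely as an illustration of the measurement idea, the sketch below times thread-to-thread hand-offs in ordinary Python as a rough proxy for context-switch overhead; it is not an RTOS benchmark.

    ```python
    # Rough proxy for context-switch cost: time N ping-pong hand-offs between two threads.
    import threading, time

    N = 10000
    ping, pong = threading.Event(), threading.Event()

    def responder():
        for _ in range(N):
            ping.wait(); ping.clear()
            pong.set()

    t = threading.Thread(target=responder)
    t.start()

    start = time.perf_counter()
    for _ in range(N):
        ping.set()
        pong.wait(); pong.clear()
    elapsed = time.perf_counter() - start
    t.join()

    print(f"mean hand-off time: {elapsed / (2 * N) * 1e6:.1f} us")
    ```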

  13. CRIE: An automated analyzer for Chinese texts.

    Science.gov (United States)

    Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En

    2016-12-01

    Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.

  14. Raman Gas Analyzer (RGA): Natural Gas Measurements.

    Science.gov (United States)

    Petrov, Dmitry V; Matrosov, Ivan I

    2016-06-08

    In the present work, an improved model of our Raman gas analyzer (RGA) for natural gas (NG) is described together with its operating principle. The sensitivity has been improved and the number of measurable gases has been expanded. Results of its testing on a real NG sample are presented for different measurement times. A comparison of the data obtained with the results of chromatographic analysis demonstrates their good agreement. The time stability of the results obtained using this model is analyzed. It is experimentally established that this RGA can reliably determine the content of all molecular NG components whose content exceeds 0.005% with a 100 s measurement time; moreover, in this case the limiting sensitivity for some NG components is 0.002%.

  15. Methods of analyzing composition of aerosol particles

    Science.gov (United States)

    Reilly, Peter T.A.

    2013-02-12

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas filled-conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  16. An improved prism energy analyzer for neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, J., E-mail: jennifer.schulz@helmholtz-berlin.de [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Ott, F. [Laboratoire Leon Brillouin, Bât 563 CEA Saclay, 91191 Gif sur Yvette Cedex (France); Krist, Th. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany)

    2014-04-21

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First, we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second, the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending the energy analyzing device to a radius of 7.9 m allows the transmitted wavelength band to be shifted from 0-9 Å to 2-16 Å.

  17. The EPOS Automated Selective Chemistry Analyzer evaluated.

    Science.gov (United States)

    Moses, G C; Lightle, G O; Tuckerman, J F; Henderson, A R

    1986-01-01

    We evaluated the analytical performance of the EPOS (Eppendorf Patient Oriented System) Automated Selective Chemistry Analyzer, using the following tests for serum analytes: alanine and aspartate aminotransferases, lactate dehydrogenase, creatine kinase, gamma-glutamyltransferase, alkaline phosphatase, and glucose. Results from the EPOS correlated well with those from comparison instruments (r greater than or equal to 0.990). Precision and linearity limits were excellent for all tests; linearity of the optical and pipetting systems was satisfactory. Reagent carryover was negligible. Sample-to-sample carryover was less than 1% for all tests, but only lactate dehydrogenase was less than the manufacturer's specified 0.5%. Volumes aspirated and dispensed by the sample and reagent II pipetting systems differed significantly from preset values, especially at lower settings; the reagent I system was satisfactory at all volumes tested. Minimal daily maintenance and an external data-reduction system make the EPOS a practical alternative to other bench-top chemistry analyzers.

  18. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. Having entered China in 1989, it has become China's leading brand of chocolate in

  19. LEGAL-EASE:Analyzing Chinese Financial Statements

    Institute of Scientific and Technical Information of China (English)

    EDWARD; MA

    2008-01-01

    In this article, we will focus on understanding and analyzing the typical accounts of Chinese financial statements, including the balance sheet and income statement. Accounts are often incorrectly prepared. This can be due to several factors, including incompetence as well as more serious cases of deliberate attempts to deceive. Regardless, accounts can be understood and errors or specific acts of misrepresentation uncovered. We will conduct some simple analysis to demonstrate how these can be spotted.

  20. MORPHOLOGICAL ANALYZER MYSTEM 3.0

    Directory of Open Access Journals (Sweden)

    A. I. Zobnin

    2015-01-01

    A large part of the Russian National Corpus has automatic morphological markup. It is based on the morphological analyzer Mystem, developed at Yandex, with some postprocessing of the results (for example, all indeclinable nouns acquire the tag '0', verbs are divided into separate paradigms by aspect, etc.). Recently a new (third) version of Mystem has been released (see https://tech.yandex.ru/mystem/). In this article we give an overview of its capabilities.

  1. Coordinating, Scheduling, Processing and Analyzing IYA09

    Science.gov (United States)

    Gipson, John; Behrend, Dirk; Gordon, David; Himwich, Ed; MacMillan, Dan; Titus, Mike; Corey, Brian

    2010-01-01

    The IVS scheduled a special astrometric VLBI session for the International Year of Astronomy 2009 (IYA09) commemorating 400 years of optical astronomy and 40 years of VLBI. The IYA09 session is the most ambitious geodetic session to date in terms of network size, number of sources, and number of observations. We describe the process of designing, coordinating, scheduling, pre-session station checkout, correlating, and analyzing this session.

  2. Organization theory. Analyzing health care organizations.

    Science.gov (United States)

    Cors, W K

    1997-02-01

    Organization theory (OT) is a tool that can be applied to analyze and understand health care organizations. Transaction cost theory is used to explain, in a unifying fashion, the myriad changes being undertaken by different groups of constituencies in health care. Agency theory is applied to aligning economic incentives needed to ensure Integrated Delivery System (IDS) success. By using tools such as OT, a clearer understanding of organizational changes is possible.

  3. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various network analysis techniques used for examining ties, such as status, centrality and power measures. Because the characteristics of flow differ between positive- and negative-tie networks, some of these measures are not applicable to negative ties. This paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. The analysis shows that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further, new measures should be developed based upon the negative clique concept.
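
    As a concrete example of a measure designed for signed networks, one published formulation of PN centrality (Everett and Borgatti) is PN = (I − A/(2n−2))⁻¹·1 with A = P − 2N, where P and N are the positive- and negative-tie adjacency matrices. The sketch below applies that formulation to a toy four-actor network; the data are invented.

    ```python
    # PN centrality for a small signed network (toy adjacency data).
    import numpy as np

    # Positive-tie and negative-tie adjacency matrices for 4 actors.
    P = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0]], float)
    N = np.array([[0, 0, 0, 1],
                  [0, 0, 0, 0],
                  [0, 0, 0, 1],
                  [1, 0, 1, 0]], float)

    n = P.shape[0]
    A = P - 2.0 * N
    pn = np.linalg.solve(np.eye(n) - A / (2 * n - 2), np.ones(n))
    print(np.round(pn, 3))   # actors with mostly negative ties score below 1
    ```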

  4. Miles Technicon H.2 automated hematology analyzer.

    Science.gov (United States)

    1992-11-01

    Automated hematology analyzers are used in all large hospitals and most commercial laboratories, as well as in most smaller hospitals and laboratories, to perform complete blood counts (including white blood cell, red blood cell, and platelet counts; hemoglobin concentration; and RBC indices) and white blood cell differential counts. Our objectives in this study are to provide user guidance for selecting, purchasing, and using an automated hematology analyzer, as well as to present an overview of the technology used in an automated five-part differential unit. Specifications for additional automated units are available in ECRI's Clinical Laboratory Product Comparison System. We evaluated the Miles Technicon H.2 unit and rated it Acceptable. The information in this Single Product Evaluation is also useful for purchasing other models; our criteria will guide users in assessing components, and our findings and discussions on some aspects of automated hematology testing are common to many available systems. We caution readers not to base purchasing decisions on our rating of the Miles unit alone, but on a thorough understanding of the issues surrounding automated hematology analyzers, which can be gained only by reading this report in its entirety. The willingness of manufacturers to cooperate in our studies and the knowledge they gain through participating lead to the development of better products. Readers should refer to the Guidance Section, "Selecting and Purchasing an Automated Hematology Analyzer," where we discuss factors such as standardization, training, human factors, manufacturer support, patient population, and special features that the laboratory must consider before obtaining any automated unit; we also provide an in-depth review of cost issues, including life-cycle cost analyses, acquisition methods and costs of hardware and supplies, and we describe the Hemacost and Hemexmpt cost worksheets for use with our PresValu and PSV Manager CAHDModel software

  5. Analyzing Maize Anther Development Using Transposons

    Science.gov (United States)

    Han, S.

    2011-12-01

    Over the summer, we tackled two projects studying transposons (mobile, or "jumping", genes) such as the Mutator genes in corn, and how the plants switch from mitosis to meiosis without a germ line. We use a transgenic corn line containing RescueMu (an artificial Mutator containing a plasmid), so we can keep track of insertion events. This is a long-term project, so we have not yet reached any final conclusions about what happens in Mutator transposition during different stages of corn development, but our process appears to work, so we are continuing with it.

  6. Analyzing Malware Based on Volatile Memory

    Directory of Open Access Journals (Sweden)

    Liang Hu

    2013-11-01

    To explain the necessity of a comprehensive and automated analysis process for volatile memory, this paper summarizes ordinary analysis methods and their common points, especially with regard to the data sources concerned. Then, the memory analysis framework Volatility 2.2 is recommended, along with statistics on output file sizes. In addition, to address the limitation of plug-in classification in the analysis procedure, a classification from the user's perspective is necessary and is proposed. Furthermore, a relational method based on differences among target data sources and result data set volumes is introduced as a guideline for a comprehensive analysis procedure. Finally, a test demo including analysis of DLL loading-order lists is presented, in which the DLL load list of a process is regarded as a typical characteristic data source and converted into a process behavior fingerprint. Clustering of these fingerprints employs a string-similarity algorithm in the demo, which has a wide range of applications in traditional malware behavior analysis, and it is proposed that these methods can also be applied to volatile memory.
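
    The fingerprint idea can be mimicked with ordinary sequence similarity over DLL load-order lists. The sketch below uses Python's difflib ratio on invented process fingerprints; it mirrors the clustering input rather than reproducing the paper's exact string-similarity model.

    ```python
    # Compare processes by the similarity of their DLL load-order fingerprints (invented lists).
    from difflib import SequenceMatcher

    fingerprints = {
        "svchost_1": ["ntdll.dll", "kernel32.dll", "rpcrt4.dll", "sechost.dll"],
        "svchost_2": ["ntdll.dll", "kernel32.dll", "rpcrt4.dll", "ws2_32.dll"],
        "dropper_x": ["ntdll.dll", "wininet.dll", "crypt32.dll", "evil_payload.dll"],
    }

    def similarity(a, b):
        # Treat each load order as a sequence of DLL names.
        return SequenceMatcher(None, a, b).ratio()

    names = list(fingerprints)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            s = similarity(fingerprints[names[i]], fingerprints[names[j]])
            print(f"{names[i]} vs {names[j]}: {s:.2f}")
    ```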

  7. Modular Construction of Shape-Numeric Analyzers

    Directory of Open Access Journals (Sweden)

    Bor-Yuh Evan Chang

    2013-09-01

    The aim of static analysis is to infer invariants about programs that are precise enough to establish semantic properties, such as the absence of run-time errors. Broadly speaking, there are two major branches of static analysis for imperative programs. Pointer and shape analyses focus on inferring properties of pointers, dynamically-allocated memory, and recursive data structures, while numeric analyses seek to derive invariants on numeric values. Although simultaneous inference of shape-numeric invariants is often needed, this case is especially challenging and is not particularly well explored. Notably, simultaneous shape-numeric inference raises complex issues in the design of the static analyzer itself. In this paper, we study the construction of such shape-numeric, static analyzers. We set up an abstract interpretation framework that allows us to reason about simultaneous shape-numeric properties by combining shape and numeric abstractions into a modular, expressive abstract domain. Such a modular structure is highly desirable to make its formalization and implementation easier to do and get correct. To achieve this, we choose a concrete semantics that can be abstracted step-by-step, while preserving a high level of expressiveness. The structure of abstract operations (i.e., transfer, join, and comparison) follows the structure of this semantics. The advantage of this construction is to divide the analyzer in modules and functors that implement abstractions of distinct features.

  8. Plant morphology and allometric relationships in competing and non-competing plants of Tagetes patula L.

    Directory of Open Access Journals (Sweden)

    Ingeborga Jarzyna

    2014-01-01

    Allometric relationships (defined as correlation coefficients between plant mass and stem diameter, plant mass and stem height, and stem diameter and stem height) in plants of Tagetes patula L. (Asteraceae) var. "Tangerine" were analyzed. Competing and non-competing plants were compared in a glasshouse experiment. Competing plants were grown in a broad range of densities, from 200 to 6000 individuals • m-2. For non-competing plants no allometric relationships were observed, while for competing plants they were strong, irrespective of the density treatment used. Gradual changes of plant morphology (plant mass, stem diameter, stem height and height/mass ratio) with the increase of competition intensity were also analyzed. The present study clearly showed that intraspecific competition influenced the allometric relationships between height, mass and stem diameter of Tagetes patula.
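
    Since the allometric relationships here are defined as correlation coefficients between trait pairs, the underlying calculation is straightforward. The sketch below computes pairwise correlations on log-transformed, invented trait values purely to illustrate it.

    ```python
    # Pairwise correlations between plant mass, stem diameter, and height (invented data).
    import numpy as np

    mass     = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4])   # g
    diameter = np.array([1.8, 2.4, 2.7, 3.3, 3.6, 4.0])   # mm
    height   = np.array([12., 15., 17., 21., 23., 26.])   # cm

    traits = {"mass": np.log(mass), "diameter": np.log(diameter), "height": np.log(height)}
    names = list(traits)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = np.corrcoef(traits[names[i]], traits[names[j]])[0, 1]
            print(f"log {names[i]} vs log {names[j]}: r = {r:.3f}")
    ```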

  9. The DOE Automated Radioxenon Sampler-Analyzer (ARSA) Beta-Gamma Coincidence Spectrometer Data Analyzer

    Science.gov (United States)

    2000-09-01

    ...detected using the counting system given the daily fluctuations in radon gas interference, the background counts, the memory effect of previous... (T.R. Heimbigner, T.W. Bowyer, J.I. ...) The Automated Radioxenon Sampler/Analyzer (ARSA) developed at the Pacific Northwest National Laboratory for the Comprehensive...

  10. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-04-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP capable of analyzing particles with diameters below 30 nm. A bulk of size separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of known masses of camphene (C10H16 to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  11. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-09-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of defined masses of camphene (C10H16 to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  12. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  13. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the time of events, and has prompted efforts to identify the events and to solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment should be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  14. Analyzing the Biology on the System Level

    Institute of Scientific and Technical Information of China (English)

    Wei Tong

    2004-01-01

    Although various genome projects have provided us enormous static sequence information, understanding of the sophisticated biology continues to require integrating the computational modeling, system analysis, technology development for experiments, and quantitative experiments all together to analyze the biology architecture on various levels, which is just the origin of systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology, and summarizes the analysis methods, experimental technologies, research developments, and so on in the four key fields of systems biology-systemic structures, dynamics, control methods, and design principles.

  15. Spectrum Analyzers Incorporating Tunable WGM Resonators

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry; Maleki, Lute

    2009-01-01

    A photonic instrument is proposed to boost the resolution for ultraviolet/ optical/infrared spectral analysis and spectral imaging allowing the detection of narrow (0.00007-to-0.07-picometer wavelength resolution range) optical spectral signatures of chemical elements in space and planetary atmospheres. The idea underlying the proposal is to exploit the advantageous spectral characteristics of whispering-gallery-mode (WGM) resonators to obtain spectral resolutions at least three orders of magnitude greater than those of optical spectrum analyzers now in use. Such high resolutions would enable measurement of spectral features that could not be resolved by prior instruments.

  16. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    ...using redox activity measurements. With a new setup adapted to miniaturization, stable pH was achieved, platinum was found to be more suitable than gold for open circuit potential-time measurements, and miniaturized platinum working electrodes and quasi silver/silver chloride reference electrodes were... ...of design rules for the responsivity of the string-based photothermal spectrometer. Responsivity is maximized for a thin, narrow and long string irradiated by high-power radiation. Various types of nanoparticles and binary mixtures of them were successfully detected and analyzed. Detection of copper...

  17. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  18. Analyzing PICL trace data with MEDEA

    Energy Technology Data Exchange (ETDEWEB)

    Merlo, A.P. [Pavia Univ. (Italy). Dipt. di Informatica e Sistemistica]; Worley, P.H. [Oak Ridge National Lab., TN (United States)]

    1993-11-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic performance characteristics of performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  19. Using SCR methods to analyze requirements documentation

    Science.gov (United States)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  20. Marketing time predicts naturalization of horticultural plants.

    Science.gov (United States)

    Pemberton, Robert W; Liu, Hong

    2009-01-01

    Horticulture is an important source of naturalized plants, but our knowledge about naturalization frequencies and potential patterns of naturalization in horticultural plants is limited. We analyzed a unique set of data derived from the detailed sales catalogs (1887-1930) of the most important early Florida, USA, plant nursery (Royal Palm Nursery) to detect naturalization patterns of these horticultural plants in the state. Of the 1903 nonnative species sold by the nursery, 15% naturalized. The probability of plants becoming naturalized increases significantly with the number of years the plants were marketed. Plants that became invasive and naturalized were sold for an average of 19.6 and 14.8 years, respectively, compared to 6.8 years for non-naturalized plants, and the naturalization of plants sold for 30 years or more is 70%. Unexpectedly, plants that were sold earlier were less likely to naturalize than those sold later. The nursery's inexperience, which caused them to grow and market many plants unsuited to Florida during their early period, may account for this pattern. Plants with pantropical distributions and those native to both Africa and Asia were more likely to naturalize (42%), than were plants native to other smaller regions, suggesting that plants with large native ranges were more likely to naturalize. Naturalization percentages also differed according to plant life form, with the most naturalization occurring in aquatic herbs (36.8%) and vines (30.8%). Plants belonging to the families Araceae, Apocynaceae, Convolvulaceae, Moraceae, Oleaceae, and Verbenaceae had higher than expected naturalization. Information theoretic model selection indicated that the number of years a plant was sold, alone or together with the first year a plant was sold, was the strongest predictor of naturalization. Because continued importation and marketing of nonnative horticultural plants will lead to additional plant naturalization and invasion, a comprehensive approach

  1. Development of a seismic damage assessment program for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Hyun Moo; Cho, Ho Hyun; Cho, Yang Hui [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    2000-12-15

    Some of the nuclear power plants currently operating in Korea have been in service for about 20 years since construction; in the case of KORI I the service period already exceeds 20 years, so their present capacities differ from their initial ones. In addition, earthquake occurrences are increasing, and Korea is not an area safe from earthquakes. There is therefore a need to guarantee the safety of these power plant structures against seismic events, to decide whether to keep them operational, and to obtain data relevant to maintenance and repair. Such objectives can be reached by damage assessment using inelastic seismic analysis that accounts for aging degradation. This is particularly important for the structure enclosing the nuclear reactor, which must absolutely protect against any radioactive leakage. The current tendency of the technical community, led by the OECD/NEA, BNL in the United States, CEA in France and the IAEA, is to develop research programs to assess seismic safety considering the aging degradation of operating nuclear power plants. With regard to this international technical trend, a technology for establishing inelastic seismic analysis considering aging degradation, so as to assess damage level and seismic safety margin, appears to be necessary. A damage assessment and prediction system that grasps in real time the actual seismic resistance capacity and damage level through 3-dimensional graphic representations is also required.

  2. Development of a seismic damage assessment program for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Hyun Moo; Cho, Yang Heui; Shin, Hyun Mok [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    2001-12-15

    Most of the nuclear power plants currently operating in Korea are more than 20 years old, and we obviously cannot assume that their original performance is still maintained. In addition, earthquake occurrences show an increasing trend all over the world, and Korea can no longer be considered a zone safe from earthquakes. There is therefore a need to guarantee the safety of these power plant structures against seismic events, to decide whether to keep them operational, and to obtain data relevant to maintenance and repair. Such objectives can be reached by damage assessment using inelastic seismic analysis that accounts for aging degradation. This is particularly important for the structure enclosing the nuclear reactor, which must absolutely protect against any radioactive leakage. The current tendency of the technical community, led by the OECD/NEA, BNL in the United States, CEA in France and the IAEA, is to develop research programs to assess seismic safety considering the aging degradation of operating nuclear power plants. With regard to this international technical trend, a technology for establishing inelastic seismic analysis considering aging degradation, so as to assess damage level and seismic safety margin, appears to be necessary. A damage assessment and prediction system that grasps in real time the actual seismic resistance capacity and damage level through 3-dimensional graphic representations is also required.

  3. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    Energy Technology Data Exchange (ETDEWEB)

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the Americium and Cesium counts to the ash content. A total of 104 samples were collected from the mine, with 47 being from screened coal and the rest from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick stop training procedure; therefore, the samples were split into training, calibration and prediction subsets. Special techniques, using genetic algorithms, were developed to representatively split the samples into the three subsets. Two separate approaches were tried. In one approach, the screened and unscreened coal was modeled separately. In the other, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e. though each individual prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minute intervals (averages of 6-9 samples), but not at 20 seconds (individual predictions).
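    A hedged sketch of the calibration idea, using synthetic count/ash data and scikit-learn's built-in early stopping as a stand-in for the quick-stop training described (the genetic-algorithm subset splitting is not reproduced):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for Americium/Cesium counts (X) and laboratory ash content (y).
X = rng.uniform(0.0, 1.0, size=(104, 2))
y = 5.0 + 12.0 * X[:, 0] - 4.0 * X[:, 1] + rng.normal(0.0, 1.0, size=104)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# early_stopping holds out a validation split, analogous to a "quick stop" subset.
net = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                   validation_fraction=0.2, max_iter=5000, random_state=0)
net.fit(X_train, y_train)

pred = net.predict(X_test)
print("per-sample RMSE        :", np.sqrt(np.mean((pred - y_test) ** 2)))
print("error of 6-sample mean :", abs(pred[:6].mean() - y_test[:6].mean()))
```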

  4. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution.

  5. Analyzing Mode Confusion via Model Checking

    Science.gov (United States)

    Luettgen, Gerald; Carreno, Victor

    1999-01-01

    Mode confusion is one of the most serious problems in aviation safety. Today's complex digital flight decks make it difficult for pilots to maintain awareness of the actual states, or modes, of the flight deck automation. NASA Langley leads an initiative to explore how formal techniques can be used to discover possible sources of mode confusion. As part of this initiative, a flight guidance system was previously specified as a finite Mealy automaton, and the theorem prover PVS was used to reason about it. The objective of the present paper is to investigate whether state-exploration techniques, especially model checking, are better able to achieve this task than theorem proving and also to compare several verification tools for the specific application. The flight guidance system is modeled and analyzed in Murphi, SMV, and Spin. The tools are compared regarding their system description language, their practicality for analyzing mode confusion, and their capabilities for error tracing and for animating diagnostic information. It turns out that their strengths are complementary.

  6. Analyzing Network Coding Gossip Made Easy

    CERN Document Server

    Haeupler, Bernhard

    2010-01-01

    We give a new technique to analyze the stopping time of gossip protocols that are based on random linear network coding (RLNC). Our analysis drastically simplifies, extends and strengthens previous results. We analyze RLNC gossip in a general framework for network and communication models that encompasses and unifies the models used previously in this context. We show, in most settings for the first time, that it converges with high probability in the information-theoretically optimal time. Most stopping times are of the form O(k + T) where k is the number of messages to be distributed and T is the time it takes to disseminate one message. This means RLNC gossip achieves "perfect pipelining". Our analysis directly extends to highly dynamic networks in which the topology can change completely at any time. This remains true even if the network dynamics are controlled by a fully adaptive adversary that knows the complete network state. Virtually nothing besides simple O(kT) sequential flooding protocols was prev...
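    A toy simulation conveys the mechanism being analyzed: nodes repeatedly push random GF(2) linear combinations of the coded packets they hold, and a node can decode once its coefficient vectors reach full rank. The network model below (a complete graph, one random push per node per round) and all parameters are illustrative assumptions, not the paper's model.

```python
import random

def gf2_rank(rows):
    """Rank over GF(2) of a list of coefficient vectors stored as bitmasks."""
    basis, rank = [], 0
    for row in rows:
        for b in basis:
            row = min(row, row ^ b)   # reduce by the basis vector's leading bit
        if row:
            basis.append(row)
            rank += 1
    return rank

def rlnc_gossip(n=16, k=16, seed=1):
    """Rounds until every node can decode all k messages (complete graph)."""
    rng = random.Random(seed)
    # Node i initially holds only message i (the i-th unit coefficient vector).
    knowledge = [[1 << i] if i < k else [] for i in range(n)]
    rounds = 0
    while any(gf2_rank(vectors) < k for vectors in knowledge):
        rounds += 1
        packets = []
        for i in range(n):
            if not knowledge[i]:
                continue
            combo = 0
            for vec in knowledge[i]:          # random GF(2) combination
                if rng.random() < 0.5:
                    combo ^= vec
            target = rng.randrange(n - 1)     # uniform choice among other nodes
            target += (target >= i)
            packets.append((target, combo))
        for target, combo in packets:         # deliver after the round
            if combo:
                knowledge[target].append(combo)
    return rounds

if __name__ == "__main__":
    print("rounds until all nodes decode:", rlnc_gossip())
```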

  7. Analyzing Interoperability of Protocols Using Model Checking

    Institute of Scientific and Technical Information of China (English)

    WUPeng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious, error-prone and often of limited effectiveness, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. Previous work has not provided a coherent way to analyze why interoperability was broken among protocol implementations under test. In this paper, an alternative approach to analyzing these problems from the viewpoint of implementation structures is presented. Sequential and concurrent structures are both representative implementation structures, especially in the event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures carry weight on interoperability, which may not have gained much attention before. To some extent, they are decisive in the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Herein, model checking is introduced into interoperability analysis for the first time. As the paper shows, it is an effective way to validate developers' choices of implementation structures or strategies.

  8. Sentiment Analyzer for Arabic Comments System

    Directory of Open Access Journals (Sweden)

    Alaa El-Dine Ali Hamouda

    2013-04-01

    Full Text Available Today, the number of users of social networks is increasing. Millions of users share opinions on different aspects of life every day. Therefore social networks are rich sources of data for opinion mining and sentiment analysis. Users have also become more interested in following news pages on Facebook. Several posts, political ones for example, have thousands of user comments that agree or disagree with the post content. Such comments can be a good indicator of the community opinion about the post content. For politicians, marketers and decision makers, sentiment analysis is needed to determine the percentages of users who agree, disagree or are neutral with respect to a post. This raised the need to analyze the users' comments on Facebook. We focused on Arabic Facebook news pages for the task of sentiment analysis. We developed a corpus for sentiment analysis and opinion mining purposes. Then, we used different machine learning algorithms - decision tree, support vector machines, and naive Bayes - to develop the sentiment analyzer. The performance of the system using each technique was evaluated and compared with the others.
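    A minimal sketch of such a classifier using one of the named algorithms (naive Bayes) in scikit-learn; the tiny English corpus below is only a placeholder for the labelled Arabic Facebook corpus developed in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; the real system used a labelled Arabic Facebook corpus.
comments = ["great decision, fully agree", "terrible policy, strongly disagree",
            "no opinion either way", "I support this post", "this is a bad idea"]
labels = ["agree", "disagree", "neutral", "agree", "disagree"]

classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
classifier.fit(comments, labels)

print(classifier.predict(["I completely agree with this post"]))
```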

  9. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after taking an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer simultaneously measures the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of this isotope in the breath of about 500 ppm.
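    Such breath tests are usually reported as the 'delta over baseline' of the 13C/12C ratio. A short sketch of that arithmetic with illustrative ratios follows; the reference ratio quoted for the PDB standard and the clinical cut-off both vary with the protocol in use.

```python
# Conventional delta notation for 13C/12C ratios, relative to the PDB standard.
R_PDB = 0.0112372            # commonly quoted 13C/12C ratio of the PDB standard

def delta13c(r_sample):
    """delta-13C in per mil relative to the PDB reference ratio."""
    return (r_sample / R_PDB - 1.0) * 1000.0

# Illustrative ratios measured before and after the labelled substrate.
r_before, r_after = 0.011260, 0.011306
dob = delta13c(r_after) - delta13c(r_before)   # "delta over baseline"
print(f"DOB = {dob:.1f} per mil")              # compared against a protocol-
                                               # dependent clinical cut-off
```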

  10. Atmospheric Aerosol Chemistry Analyzer: Demonstration of feasibility

    Energy Technology Data Exchange (ETDEWEB)

    Mroz, E.J.; Olivares, J.; Kok, G.

    1996-04-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project objective was to demonstrate the technical feasibility of an Atmospheric Aerosol Chemistry Analyzer (AACA) that will provide a continuous, real-time analysis of the elemental (major, minor and trace) composition of atmospheric aerosols. The AACA concept is based on sampling the atmospheric aerosol through a wet cyclone scrubber that produces an aqueous suspension of the particles. This suspension can then be analyzed for elemental composition by ICP/MS or collected for subsequent analysis by other methods. The key technical challenge was to develop a wet cyclone aerosol sampler suitable for respirable particles found in ambient aerosols. We adapted an ultrasonic nebulizer to a conventional, commercially available, cyclone aerosol sampler and completed collection efficiency tests for the unit, which was shown to efficiently collect particles as small as 0.2 microns. We have completed the necessary basic research and have demonstrated the feasibility of the AACA concept.

  11. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including the lack of access to correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. This could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge about these diseases, and developing tools for extracting and sharing that knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed have been acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at the second level, the presence of rare disease terms in target sources included in the UMLS was analyzed, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.

  12. Fully Analyzing an Algebraic Polya Urn Model

    CERN Document Server

    Morcrette, Basile

    2012-01-01

    This paper introduces and analyzes a particular class of Polya urns: balls are of two colors, can only be added (the urns are said to be additive), and at every step the same constant number of balls is added, so that only the color composition varies (the urns are said to be balanced). These properties make this class of urns ideally suited for analysis from an "analytic combinatorics" point of view, following in the footsteps of Flajolet-Dumas-Puyhaubert, 2006. Through an algebraic generating function to which we apply a multiple coalescing saddle-point method, we are able to give precise asymptotic results for the probability distribution of the composition of the urn, as well as a local limit law and large deviation bounds.
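    For readers who want to see the object being analyzed, the following sketch simulates a balanced, additive two-color urn under an arbitrary illustrative replacement rule and tallies the empirical composition, which can then be set against the asymptotic distribution obtained analytically.

```python
import random
from collections import Counter

def simulate_urn(steps, rule=((2, 1), (1, 2)), start=(1, 1), seed=0):
    """Simulate a two-color, additive, balanced Polya urn.

    rule[c] gives the (black, white) balls added when a ball of color c is
    drawn; both rows sum to the same constant, so the urn is balanced.
    """
    rng = random.Random(seed)
    black, white = start
    for _ in range(steps):
        draw_black = rng.random() < black / (black + white)
        added = rule[0] if draw_black else rule[1]
        black += added[0]
        white += added[1]
    return black, white

# Empirical distribution of the number of black balls after 50 draws.
counts = Counter(simulate_urn(50, seed=s)[0] for s in range(5000))
print(sorted(counts.items())[:5])
```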

  13. Modeling and analyzing architectural change with alloy

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ingstrup, Mads

    2010-01-01

    Although adaptivity based on reconfiguration has the potential to improve the dependability of systems, the cost of a failed attempt at reconfiguration is prohibitive in precisely the applications where high dependability is required. Existing work on formal modeling and verification of architectural reconfigurations partly achieves the goal of ensuring correctness; however, the formalisms used often lack tool support, and the ensuing models have an uncertain relation to a concrete implementation. Thus a practical way to ensure with formal certainty that specific architectural changes are correct remains a barrier to the uptake of reconfiguration techniques in industry. Using the Alloy language and associated tool, we propose a practical way to formally model and analyze runtime architectural change expressed as architectural scripts. Our evaluation shows the performance to be acceptable; our experience...

  14. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the Eurobarometer sample survey, which was commissioned by the European Commission. The social climate index is used to measure the level of perception of the population, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, over the short term of one year and the medium term of five years. Modelling the social climate index and its influencing factors offers useful information about the efficiency of social protection and inclusion policies.

  15. Drug stability analyzer for long duration spaceflights

    Science.gov (United States)

    Shende, Chetan; Smith, Wayne; Brouillette, Carl; Farquharson, Stuart

    2014-06-01

    Crewmembers of current and future long duration spaceflights require drugs to overcome the deleterious effects of weightlessness, sickness and injuries. Unfortunately, recent studies have shown that some of the drugs currently used may degrade more rapidly in space, losing their potency well before their expiration dates. To complicate matters, the degradation products of some drugs can be toxic. Consequently there is a need for an analyzer that can determine if a drug is safe at the time of use, as well as to monitor and understand space-induced degradation, so that drug types, formulations, and packaging can be improved. Towards this goal we have been investigating the ability of Raman spectroscopy to monitor and quantify drug degradation. Here we present preliminary data by measuring acetaminophen, and its degradation product, p-aminophenol, as pure samples, and during forced degradation reactions.

  16. Basis-neutral Hilbert-space analyzers

    CERN Document Server

    Martin, Lane; Kondakci, H Esat; Larson, Walker D; Shabahang, Soroush; Jahromi, Ali K; Malhotra, Tanya; Vamivakas, A Nick; Atia, George K; Abouraddy, Ayman F

    2016-01-01

    Interferometry is one of the central organizing principles of optics. Key to interferometry is the concept of optical delay, which facilitates spectral analysis in terms of time-harmonics. In contrast, when analyzing a beam in a Hilbert space spanned by spatial modes -- a critical task for spatial-mode multiplexing and quantum communication -- basis-specific principles are invoked that are altogether distinct from that of `delay.' Here, we extend the traditional concept of temporal delay to the spatial domain, thereby enabling the analysis of a beam in an arbitrary spatial-mode basis -- exemplified using Hermite-Gaussian and radial Laguerre-Gaussian modes. Such generalized delays correspond to optical implementations of fractional transforms; for example, the fractional Hankel transform is the generalized delay associated with the space of Laguerre-Gaussian modes, and an interferometer incorporating such a `delay' obtains modal weights in the associated Hilbert space. By implementing an inherently stable, rec...

  17. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept by analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.

  18. Computer model for analyzing sodium cold traps

    Energy Technology Data Exchange (ETDEWEB)

    McPheeters, C C; Raue, D J

    1983-05-01

    A computer model was developed to simulate the processes that occur in sodium cold traps. The Model for Analyzing Sodium Cold Traps (MASCOT) simulates any desired configuration of mesh arrangements and dimensions and calculates pressure drops and flow distributions, temperature profiles, impurity concentration profiles, and impurity mass distributions. The calculated pressure drop as a function of impurity mass content determines the capacity of the cold trap. The accuracy of the model was checked by comparing calculated mass distributions with experimentally determined mass distributions from literature publications and with results from our own cold trap experiments. The comparisons were excellent in all cases. A parametric study was performed to determine which design variables are most important in maximizing cold trap capacity.

  19. Analyzing Hydrological Sustainability Through Water Balance

    Science.gov (United States)

    Menció, Anna; Folch, Albert; Mas-Pla, Josep

    2010-05-01

    The objective of the Water Framework Directive (2000/60/EC) is to assist in the development of management plans that will lead to the sustainable use of water resources in all EU member states. However, defining the degree of sustainability aimed at is not a straightforward task. It requires detailed knowledge of the hydrogeological characteristics of the basin in question, its environmental needs, the amount of human water demand, and the opportunity to construct a proper water balance that describes the behavior of the hydrological system and estimates available water resources. An analysis of the water balance in the Selva basin (Girona, NE Spain) points to the importance of regional groundwater fluxes in satisfying current exploitation rates, and shows that regional scale approaches are often necessary to evaluate water availability. In addition, we discuss the pressures on water resources, and analyze potential actions, based on the water balance results, directed towards achieving sustainable water management in the basin.
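    The underlying bookkeeping is a basin-scale water balance. A minimal sketch with made-up annual terms (not the Selva basin figures) shows the form of the calculation.

```python
# Annual basin water balance (all terms in hm3/yr); the figures are illustrative
# and do not correspond to the Selva basin study.
precipitation = 450.0
evapotranspiration = 300.0
surface_outflow = 90.0
regional_groundwater_inflow = 25.0
pumping = 70.0

storage_change = (precipitation + regional_groundwater_inflow
                  - evapotranspiration - surface_outflow - pumping)
print(f"Storage change: {storage_change:+.1f} hm3/yr")  # negative => overdraft
```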

  20. Analyzing petabytes of data with Hadoop

    CERN Document Server

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...
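    As a reminder of the programming model the talk covers, here is a toy, single-process rendition of the MapReduce word-count pattern; it only illustrates the map/shuffle/reduce idea and does not use the Hadoop API itself.

```python
from itertools import groupby

def mapper(lines):
    # Map phase: emit (word, 1) for every word.
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    # Hadoop delivers pairs grouped by key after the shuffle/sort phase;
    # sorting locally plays that role here.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    text = ["hadoop stores and analyzes data",
            "hadoop analyzes petabytes of data"]
    for word, total in reducer(mapper(text)):
        print(word, total)
```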

  1. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn (Oak Ridge, TN); Chen, Da-Ren (Creve Coeur, MO)

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and a distal one of the plurality of sampling outlets, forms a classifying region, the first and second electrodes being charged to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into an upstream end of the classifying region, wherein each sampling outlet functions as an independent DMA stage and classifies different size ranges of charged particles based on electrical mobility simultaneously.
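    For orientation, classification in a conventional cylindrical DMA is commonly summarized by the Knutson-Whitby expression for the centroid electrical mobility selected at a given voltage; whether and how it carries over to the stacked, multi-outlet geometry described here is not claimed, and the operating point below is illustrative only.

```python
import math

def centroid_mobility(q_sheath_lpm, r_outer_m, r_inner_m, length_m, voltage_v):
    """Centroid electrical mobility (m^2 V^-1 s^-1) selected by a cylindrical
    DMA in the classical Knutson-Whitby form; illustrative use only."""
    q = q_sheath_lpm / 1000.0 / 60.0          # L/min -> m^3/s
    return q * math.log(r_outer_m / r_inner_m) / (2.0 * math.pi * voltage_v * length_m)

# Illustrative operating point (not taken from the patent).
print(f"Z* = {centroid_mobility(3.0, 0.02, 0.01, 0.4, 2000.0):.3e} m^2/Vs")
```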

  2. Analyzing BSE transmission to quantify regional risk.

    Science.gov (United States)

    de Koeijer, Aline A

    2007-10-01

    As a result of consumer fears and political concerns related to BSE as a risk to human health, a need has arisen recently for more sensitive methods to detect BSE and more accurate methods to determine BSE incidence. As a part of the development of such methods, it is important to be able to identify groups of animals with above-average BSE risk. One of the well-known risk factors for BSE is age, as very young animals do not develop the disease, and very old animals are less likely to develop the disease. Here, we analyze which factors have a strong influence on the age distribution of BSE in a population. Building on that, we develop a simple set of calculation rules for classifying the BSE risk in a given cattle population. Required inputs are data on imports and on the BSE control measures in place over the last 10 or 20 years.

  3. Analyzing and mining automated imaging experiments.

    Science.gov (United States)

    Berlage, Thomas

    2007-04-01

    Image mining is the application of computer-based techniques that extract and exploit information from large image sets to support human users in generating knowledge from these sources. This review focuses on biomedical applications of this technique, in particular automated imaging at the cellular level. Due to increasing automation and the availability of integrated instruments, biomedical users are becoming increasingly confronted with the problem of analyzing such data. Image database applications need to combine data management, image analysis and visual data mining. The main point of such a system is a software layer that represents objects within an image and the ability to use a large spectrum of quantitative and symbolic object features. Image analysis needs to be adapted to each particular experiment; therefore, 'end user programming' will be desired to make the technology more widely applicable.

  4. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans-JP2, and Aggregatibacter actinomycetemcomitans which was clinically isolated from the human blood-CI. Significant spectra differences were found among each organism allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, the 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to the traditional methods, of fastidious buccal microorganisms associated with the etiology of the manifestation of periodontitis.

  5. Complex networks theory for analyzing metabolic networks

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; YU Hong; LUO Jianhua; CAO Z.W.; LI Yixue

    2006-01-01

    One of the main tasks of post-genomic informatics is to systematically investigate all molecules and their interactions within a living cell so as to understand how these molecules and the interactions between them relate to the function of the organism, while networks are an appropriate abstract description of all kinds of interactions. In the past few years, great progress has been made in developing the theory of complex networks for revealing the organizing principles that govern the formation and evolution of various complex biological, technological and social networks. This paper reviews the accomplishments in constructing genome-based metabolic networks and describes how the theory of complex networks is applied to analyze metabolic networks.

  6. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    ... as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  7. Three Practical Methods for Analyzing Slope Stability

    Institute of Scientific and Technical Information of China (English)

    XU Shiguang; ZHANG Shitao; ZHU Chuanbing; YIN Ying

    2008-01-01

    Since the environmental capacity and the arable as well as the inhabitable lands have actually reached a full balance, slopes are becoming more and more important options for various engineering constructions. Because of the geological complexity of slopes, the design and decision-making of slope-based engineering still cannot rely solely on theoretical analysis and numerical calculation, but mainly on the experience of experts. Therefore, it has important practical significance to turn some successful experience into mathematical equations. Based upon the abundant typical slope engineering construction cases in Yunnan, Southwestern China, 3 methods for analyzing slope stability have been developed in this paper. First of all, a corresponding analogous mathematical equation for analyzing slope stability has been established through case studies. Then, an artificial neural network and a multivariate regression analysis have also been set up, with 7 main influencing factors adopted.

  8. Analyzing Cell Wall Elasticity After Hormone Treatment: An Example Using Tobacco BY-2 Cells and Auxin.

    Science.gov (United States)

    Braybrook, Siobhan A

    2017-01-01

    Atomic force microscopy, and related nano-indentation techniques, is a valuable tool for analyzing the elastic properties of plant cell walls as they relate to changes in cell wall chemistry, changes in development, and response to hormones. Within this chapter I will describe a method for analyzing the effect of the phytohormone auxin on the cell wall elasticity of tobacco BY-2 cells. This general method may be easily altered for different experimental systems and hormones of interest.

  9. Multi-Pass Quadrupole Mass Analyzer

    Science.gov (United States)

    Prestage, John D.

    2013-01-01

    Analysis of the composition of planetary atmospheres is one of the most important and fundamental measurements in planetary robotic exploration. Quadrupole mass analyzers (QMAs) are the primary tool used to execute these investigations, but reductions in the size of these instruments have sacrificed mass resolving power, so that the best present-day QMA devices are still large, expensive, and do not deliver the performance of laboratory instruments. An ultra-high-resolution QMA was developed to resolve N2+/CO+ by trapping ions in a linear trap quadrupole filter. Because N2 and CO are resolved, gas chromatography columns used to separate species before analysis are eliminated, greatly simplifying gas analysis instrumentation. For highest performance, the ion trap mode is used. High-resolution (or narrow-band) mass selection is carried out in the central region, but near the DC electrodes at each end, RF/DC field settings are adjusted to allow broadband ion passage. This is to prevent ion loss during ion reflection at each end. Ions are created inside the trap so that low-energy particles are selected by low-voltage settings on the end electrodes. This is beneficial to good mass resolution since low-energy particles traverse many cycles of the RF filtering fields. Through Monte Carlo simulations, it is shown that ions are reflected at each end many tens of times, each time being sent back through the central section of the quadrupole where ultrahigh mass filtering is carried out. An analyzer was produced with an electrical length orders of magnitude longer than its physical length. Since the selector fields are sized as in conventional devices, the loss of sensitivity inherent in miniaturizing quadrupole instruments is avoided. The no-loss, multi-pass QMA architecture will improve the mass resolution of planetary QMA instruments while reducing demands on the RF electronics for high-voltage/high-frequency production since ion transit time is no longer limited to a single pass. The

  10. Suspension-cultured plant cells as a tool to analyze the extracellular proteome.

    Science.gov (United States)

    Sabater-Jara, Ana B; Almagro, Lorena; Belchí-Navarro, Sarai; Martínez-Esteso, María J; Youssef, Sabry M; Casado-Vela, Juan; Vera-Urbina, Juan C; Sellés-Marchart, Susana; Bru-Martínez, Roque; Pedreño, María A

    2014-01-01

    Suspension-cultured cells (SCC) are generally considered the most suitable cell systems for scientific studies, including studies of the extracellular proteome (secretome). SCC are initiated by transferring friable callus fragments into flasks containing liquid culture medium for cell biomass growth, and they are maintained in an orbital shaker to supply sufficient oxygen for cell growth. SCC increase rapidly during the exponential phase; after 10-20 days (depending on the nature of the cell culture), the growth rate starts to decrease due to nutrient limitation, and to maintain these kinds of cell cultures for decades a portion of the SCC must be transferred periodically into fresh culture medium. Despite the central role played by extracellular proteins in most processes that control growth and development, the secretome has been less well characterized than other subcellular compartments, meaning that our understanding of cell wall physiology is still very limited. Useful proteomic tools have emerged in recent years to unravel the metabolic networks that occur in cell walls. With the recent progress made in mass spectrometry technology, it has become feasible to identify proteins from a given organ, tissue, cell type, or even a subcellular compartment. Compared with other methods used to isolate cell wall proteins, the spent medium of SCC provides a convenient, continuous, reliable and unique source of extracellular proteins. Therefore, this biological system can be used as a large-scale cell culture from which these proteins are secreted and easily separated from cells without cell disruption, and thus recovered from the extracellular medium without any cytosolic contamination. This nondestructive cell wall proteome approach discloses a set of proteins that are specifically expressed in the remodelling of the cell wall architecture and in stress defense.

  11. Comparison of two dry chemistry analyzers and a wet chemistry analyzer using canine serum.

    Science.gov (United States)

    Lanevschi, Anne; Kramer, John W.

    1996-01-01

    Canine serum was used to compare seven chemistry analytes on two tabletop clinical dry chemistry analyzers, Boehringer's Reflotron and Kodak's Ektachem. Results were compared to those obtained on a wet chemistry reference analyzer, Roche Diagnostic's Cobas Mira. Analytes measured were urea nitrogen (BUN), creatinine, glucose, aspartate aminotransferase (AST), alanine aminotransferase (ALT), cholesterol and bilirubin. Nine to 12 canine sera with values in the low, normal, and high range were evaluated. The correlations were acceptable for all comparisons with correlation coefficients greater than 0.98 for all analytes. Regression analysis resulted in significant differences for both tabletop analyzers when compared to the reference analyzer for cholesterol and bilirubin, and for glucose and AST on the Kodak Ektachem. Differences appeared to result from proportional systematic error occurring at high analyte concentrations.
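    A sketch of the kind of comparison performed, using made-up paired results and ordinary least-squares regression from SciPy; method-comparison studies often prefer Deming or Passing-Bablok regression, which are not shown here.

```python
import numpy as np
from scipy import stats

# Hypothetical paired cholesterol results (mg/dL): reference vs. tabletop analyzer.
reference = np.array([110, 150, 180, 210, 250, 300, 350, 420, 480])
tabletop = np.array([108, 152, 185, 220, 262, 318, 372, 450, 515])

slope, intercept, r, p, stderr = stats.linregress(reference, tabletop)
print(f"r = {r:.3f}, slope = {slope:.2f}, intercept = {intercept:.1f}")
# A slope clearly different from 1 with a small intercept suggests proportional
# bias, the pattern reported here at high analyte concentrations.
```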

  12. The Chemnitz LogAnalyzer: a tool for analyzing data from hypertext navigation research.

    Science.gov (United States)

    Brunstein, Angela; Naumann, Anja; Krems, Josef F

    2005-05-01

    Computer-based studies usually produce log files as raw data. These data cannot be analyzed adequately with conventional statistical software. The Chemnitz LogAnalyzer provides tools for quick and comfortable visualization and analyses of hypertext navigation behavior by individual users and for aggregated data. In addition, it supports analogous analyses of questionnaire data and reanalysis with respect to several predefined orders of nodes of the same hypertext. As an illustration of how to use the Chemnitz LogAnalyzer, we give an account of one study on learning with hypertext. Participants either searched for specific details or read a hypertext document to familiarize themselves with its content. The tool helped identify navigation strategies affected by these two processing goals and provided comparisons, for example, of processing times and visited sites. Altogether, the Chemnitz LogAnalyzer fills the gap between log files as raw data of Web-based studies and conventional statistical software.

  13. Highlights from BNL-RHIC-2012

    CERN Document Server

    Tannenbaum, M J

    2013-01-01

    Recent highlights from Brookhaven National Laboratory and the Relativistic Heavy Ion Collider (RHIC) are reviewed and discussed in the context of the discovery of the strongly interacting Quark Gluon Plasma (sQGP) at RHIC in 2005 as confirmed by results from the CERN-LHC Pb+Pb program. Outstanding RHIC machine operation in 2012 with 3-dimensional stochastic cooling and a new EBIS ion source enabled measurements with Cu+Au, U+U, for which multiplicity distributions are shown, as well as with polarized p-p collisions. Differences of the physics and goals of p-p versus A+A are discussed leading to a review of RHIC results on pi0 suppression in Au+Au collisions and comparison to LHC Pb+Pb results in the same range (5-30 GeV). Improved measurements of direct photon production and correlation with charged particles at RHIC are shown, including the absence of a low pT (thermal) photon enhancement in d+Au collisions. Attempts to understand the apparent equality of the energy loss of light and heavy quarks in the QGP by...

  14. BNL Citric Acid Technology: Pilot Scale Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    FRANCIS, A J; DODGE,; J, C; GILLOW, J B; FORRESTER, K E

    1999-09-24

    The objective of this project is to remove toxic metals such as lead and cadmium from incinerator ash using the Citric Acid Process developed at Brookhaven National Laboratory. In this process toxic metals in bottom ash from the incineration of municipal solid waste were first extracted with citric acid followed by biodegradation of the citric acid-metal extract by the bacterium Pseudomonas fluorescens for metals recovery. The ash contained the following metals: Al, As, Ba, Ca, Cd, Cr, Cu, Fe, Mg, Mn, Ni, Pb, Se, Sr, Ti, and Zn. Optimization of the Citric Acid Process parameters which included citric acid molarity, contact time, the impact of mixing aggressiveness during extraction and pretreatment showed lead and cadmium removal from incinerator ash of >90%. Seeding the treated ash with P. fluorescens resulted in the removal of residual citric acid and biostabilization of any leachable lead, thus allowing it to pass EPA's Toxicity Characteristic Leaching Procedure. Biodegradation of the citric acid extract removed >99% of the lead from the extract as well as other metals such as Al, Ca, Cu, Fe, Mg, Mn, Ti, and Zn. Speciation of the bioprecipitated lead by Extended X-ray Absorption Fine Structure at the National Synchrotron Light Source showed that the lead is predominantly associated with the phosphate and carboxyl functional groups in a stable form. Citric acid was completely recovered (>99%) from the extract by sulfide precipitation technique and the extraction efficiency of recovered citric acid is similar to that of the fresh citric acid. Recycling of the citric acid should result in considerable savings in the overall treatment cost. We have shown the potential application of this technology to remove and recover the metal contaminants from incinerator ash as well as from other heavy metal bearing wastes (i.e., electric arc furnace dust from steel industry) or soils. Information developed from this project is being applied to demonstrate the remediation of lead paint contaminated soils on Long Island.

  15. BNL ACCELERATOR-BASED RADIOBIOLOGY FACILITIES

    Energy Technology Data Exchange (ETDEWEB)

    LOWENSTEIN,D.I.

    2000-05-28

    For the past several years, the Alternating Gradient Synchrotron (AGS) at Brookhaven National Laboratory (USA) has provided ions of iron, silicon and gold, at energies from 600 MeV/nucleon to 10 GeV/nucleon, for the US National Aeronautics and Space Administration (NASA) radiobiology research program. NASA has recently funded the construction of a new dedicated ion facility, the Booster Applications Facility (BAF). The Booster synchrotron will supply ion beams ranging from protons to gold, in an energy range from 40 to 3,000 MeV/nucleon with maximum beam intensities of 10^10 to 10^11 ions per pulse. The BAF Project is described and the future AGS and BAF operation plans are presented.

  16. Ionic behavior of treated water at a water purification plant

    OpenAIRE

    Yanagida, Kazumi; Kawahigashi, Tatsuo

    2012-01-01

    [Abstract] Water at each processing stage in a water purification plant was extracted and analyzed to investigate changes of water quality. Investigations of water at each processing stage at the water purification plant are discussed herein.

  17. Signal processing and analyzing works of art

    Science.gov (United States)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
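    The thread-density measurement rests on spectral analysis of small image patches. The sketch below applies a 2-D FFT to a synthetic canvas-like patch and reads off the dominant vertical and horizontal spatial frequencies as threads per centimetre; it is a simplified stand-in for the short-space Fourier analysis described, not the authors' algorithm.

```python
import numpy as np

def thread_densities(patch, pixels_per_cm):
    """Estimate (threads/cm along y, threads/cm along x) from the dominant
    peaks of the patch's 2-D Fourier spectrum."""
    spectrum = np.abs(np.fft.fft2(patch - patch.mean()))
    fy = np.fft.fftfreq(patch.shape[0], d=1.0 / pixels_per_cm)
    fx = np.fft.fftfreq(patch.shape[1], d=1.0 / pixels_per_cm)
    col = spectrum[:, 0]                      # variation down the columns (y)
    row = spectrum[0, :]                      # variation along the rows (x)
    half_y, half_x = len(fy) // 2, len(fx) // 2
    density_y = abs(fy[1:half_y][np.argmax(col[1:half_y])])
    density_x = abs(fx[1:half_x][np.argmax(row[1:half_x])])
    return density_y, density_x

# Synthetic 256x256 patch: 12 threads/cm in the y direction, 15 threads/cm in x,
# "imaged" at 64 pixels/cm.
ppc = 64
y, x = np.mgrid[0:256, 0:256] / ppc           # coordinates in cm
patch = np.sin(2 * np.pi * 12 * y) + np.sin(2 * np.pi * 15 * x)
print(thread_densities(patch, ppc))           # expect approximately (12.0, 15.0)
```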

  18. Analyzing modified unimodular gravity via Lagrange multipliers

    Science.gov (United States)

    Sáez-Gómez, Diego

    2016-06-01

    The so-called unimodular version of general relativity is revisited. Unimodular gravity is constructed by fixing the determinant of the metric, which leads to the trace-free part of the equations instead of the usual Einstein field equations. A cosmological constant then arises naturally as an integration constant. While unimodular gravity turns out to be equivalent to general relativity (GR) at the classical level, it provides important differences at the quantum level. Here we extend the unimodular constraint to some extensions of general relativity that have drawn a lot of attention over the last years: f(R) gravity (or its scalar-tensor picture) and Gauss-Bonnet gravity. The corresponding unimodular version of such theories is constructed, as well as the conformal transformation that relates the Einstein and Jordan frames for these nonminimally coupled theories. From the classical point of view, the unimodular versions of such extensions are completely equivalent to their originals, but an effective cosmological constant arises naturally, which may provide a richer description of the evolution of the Universe. Here we analyze the case of Starobinsky inflation and compare it with the original one.
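    For reference, the trace-free field equations referred to above, and the way the cosmological constant re-enters as an integration constant, can be written compactly as follows (standard material, stated here only to make the abstract self-contained).

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Trace-free (unimodular) field equations; Lambda appears as an integration constant.
\begin{align}
  R_{\mu\nu} - \tfrac{1}{4} g_{\mu\nu} R
    &= \kappa \Bigl( T_{\mu\nu} - \tfrac{1}{4} g_{\mu\nu} T \Bigr),
    \qquad \kappa = 8\pi G . \\
\intertext{Taking the covariant divergence, using the contracted Bianchi identity
and $\nabla^{\mu}T_{\mu\nu}=0$, gives $\partial_{\nu}\!\left(R+\kappa T\right)=0$,
so $R+\kappa T \equiv 4\Lambda$ for some constant $\Lambda$, and the system is
equivalent to}
  R_{\mu\nu} - \tfrac{1}{2} g_{\mu\nu} R + \Lambda\, g_{\mu\nu}
    &= \kappa\, T_{\mu\nu}.
\end{align}
\end{document}
```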

  19. A Method for Analyzing Volunteered Geographic Information ...

    Science.gov (United States)

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions. These dimensions include the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and finally, the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially-explicit data to be used in multiple settings, including the City of Duluth’s Comprehensive Planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA’s Office of Research and Development.

  20. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command-line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of location and type for protein-protein interaction sites. XML formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.

  1. Alzheimer's disease: analyzing the missing heritability.

    Directory of Open Access Journals (Sweden)

    Perry G Ridge

    Full Text Available Alzheimer's disease (AD is a complex disorder influenced by environmental and genetic factors. Recent work has identified 11 AD markers in 10 loci. We used Genome-wide Complex Trait Analysis to analyze >2 million SNPs for 10,922 individuals from the Alzheimer's Disease Genetics Consortium to assess the phenotypic variance explained first by known late-onset AD loci, and then by all SNPs in the Alzheimer's Disease Genetics Consortium dataset. In all, 33% of total phenotypic variance is explained by all common SNPs. APOE alone explained 6% and other known markers 2%, meaning more than 25% of phenotypic variance remains unexplained by known markers, but is tagged by common SNPs included on genotyping arrays or imputed with HapMap genotypes. Novel AD markers that explain large amounts of phenotypic variance are likely to be rare and unidentifiable using genome-wide association studies. Based on our findings and the current direction of human genetics research, we suggest specific study designs for future studies to identify the remaining heritability of Alzheimer's disease.

  2. Methodological considerations in analyzing Twitter data.

    Science.gov (United States)

    Kim, Annice E; Hansen, Heather M; Murphy, Joe; Richards, Ashley K; Duke, Jennifer; Allen, Jane A

    2013-12-01

    Twitter is an online microblogging tool that disseminates more than 400 million messages per day, including vast amounts of health information. Twitter represents an important data source for the cancer prevention and control community. This paper introduces investigators in cancer research to the logistics of Twitter analysis. It explores methodological challenges in extracting and analyzing Twitter data, including characteristics and representativeness of data; data sources, access, and cost; sampling approaches; data management and cleaning; standardizing metrics; and analysis. We briefly describe the key issues and provide examples from the literature and our studies using Twitter data to understand public health issues. For investigators considering Twitter-based cancer research, we recommend assessing whether research questions can be answered appropriately using Twitter, choosing search terms carefully to optimize precision and recall, using respected vendors that can provide access to the full Twitter data stream if possible, standardizing metrics to account for growth in the Twitter population over time, considering crowdsourcing for analysis of Twitter content, and documenting and publishing all methodological decisions to further the evidence base.

  3. Eastern Mediterranean Natural Gas: Analyzing Turkey's Stance

    Directory of Open Access Journals (Sweden)

    Abdullah Tanriverdi

    2016-02-01

    Full Text Available Recent large-scale natural gas discoveries in East Mediterranean have drawn attention to the region. The discoveries caused both hope and tension in the region. As stated, the new resources may serve as a new hope for all relevant parties as well as the region if managed in a collaborative and conciliatory way. Energy may be a remedy to Cyprus' financial predicament, initiate a process for resolving differences between Turkey and Cyprus, normalize Israel-Turkey relations and so on. On the contrary, adopting unilateral and uncooperative approach may aggravate the tension and undermine regional stability and security. In this sense, the role of energy in generating hope or tension is dependent on the approaches of related parties. The article will analyze Turkey's attitude in East Mediterranean case in terms of possible negative and positive implications for Turkey in the energy field. The article examines Turkey's position and the reasons behind its stance in the East Mediterranean case. Considering Turkey's energy profile and energy policy goals, the article argues that the newly found hydrocarbons may bring in more stakes for Turkey if Turkey adopts a cooperative approach in this case.

  4. Analyzing Consumer Behavior Towards Contemporary Food Retailers

    Directory of Open Access Journals (Sweden)

    E.Dursun

    2008-01-01

    Full Text Available The objective of this research is to analyze consumer behavior toward contemporary food retailers. Food retailing has been changing in Turkey during recent years. Foreign investors have been captivated by the market potential of Turkish food retailing. Retailer formats have changed, and large-scale retailers featuring extended product variety and full service are spreading rapidly nationwide. Consumers tend to shop for their household needs at contemporary retailers, due mainly to urbanization, the increasing female workforce and income growth. In this research, original data were collected through face-to-face interviews with 385 respondents located in Istanbul. The ratios of the different socio-economic status (SES) groups in Istanbul determined the sampling distribution. Consumers prefer the closest food retailers mainly for purchasing food products. Consumers purchase more than they plan for their needs; the C SES group ranks first in average spending on unplanned shopping. Chain stores and hypermarkets are the most preferred retailers for food purchasing. Moreover, consumer responses to judgments related to retailing are investigated with factor analysis.

  5. Numerical methods for analyzing electromagnetic scattering

    Science.gov (United States)

    Lee, S. W.; Lo, Y. T.; Chuang, S. L.; Lee, C. S.

    1985-01-01

    Attenuation properties of the normal modes in an overmoded waveguide coated with a lossy material were analyzed. It is found that the low-order modes can be significantly attenuated even with a thin layer of coating if the coating material is not too lossy. A thinner layer of coating is required for large attenuation of the low-order modes if the coating material is magnetic rather than dielectric. The Radar Cross Section (RCS) from an uncoated circular guide terminated by a perfect electric conductor was calculated and compared with available experimental data. It is confirmed that the interior irradiation contributes to the RCS. The equivalent-current method based on the geometrical theory of diffraction (GTD) was chosen for the calculation of the contribution from the rim diffraction. Calculation of the RCS reduction from a coated circular guide terminated by a PEC is planned, and schemes for the experiments are included. The waveguide coated with a lossy magnetic material is suggested as a substitute for the corrugated waveguide.

  6. Qualitative Methodology in Analyzing Educational Phenomena

    Directory of Open Access Journals (Sweden)

    Antonio SANDU

    2010-12-01

    Full Text Available Semiological analysis of educational phenomena allows researchers access to a multidimensional universe of meanings represented by the school, seen not so much as an institution but as a vector of social action through educational strategies. We consider education a multidimensional phenomenon since its analysis allows the researcher to explore a variety of research hypotheses from different paradigmatic perspectives that converge on an educational finality. According to the author Simona Branc, one of the most appropriate methods used in qualitative data analysis is Grounded Theory; it assumes a systematic process of generating concepts and theories based on the data collected. Specialised literature defines Grounded Theory as an inductive approach that starts with general observations and, during the analytical process, creates conceptual categories that explain the theme explored. Researchers insist on the role of sociological theory in managing the research data and in providing ways of conceptualizing the descriptions and explanations. Qualitative content analysis is based on the constructivist paradigm (constructionist in the restricted sense that we used previously). It aims to create an “understanding of the latent meanings of the analyzed messages”. Quantitative content analysis involves a process of encoding and statistical analysis of data extracted from the content of the paper in the form of extractions like frequencies, contingency analysis, etc.

  7. Analyzing planar cell polarity during zebrafish gastrulation.

    Science.gov (United States)

    Jessen, Jason R

    2012-01-01

    Planar cell polarity was first described in invertebrates over 20 years ago and is defined as the polarity of cells (and cell structures) within the plane of a tissue, such as an epithelium. Studies in the last 10 years have identified critical roles for vertebrate homologs of these planar cell polarity proteins during gastrulation cell movements. In zebrafish, the terms convergence and extension are used to describe the collection of morphogenetic movements and cell behaviors that contribute to narrowing and elongation of the embryonic body plan. Disruption of planar cell polarity gene function causes profound defects in convergence and extension creating an embryo that has a shortened anterior-posterior axis and is broadened mediolaterally. The zebrafish gastrula-stage embryo is transparent and amenable to live imaging using both Nomarski/differential interference contrast and fluorescence microscopy. This chapter describes methods to analyze convergence and extension movements at the cellular level and thereby connect embryonic phenotypes with underlying planar cell polarity defects in migrating cells.

  8. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine if changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
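
    As an illustration of why internal gains matter for sizing, the sketch below computes a generic steady-state design heating load (UA times the design temperature difference, minus credited internal gains). This is only a simplified illustration under assumed numbers; it is not the Manual J8 or Passive House Planning Package procedure used in the study, and the UA value, temperatures, and gains are hypothetical.

```python
# Illustrative sketch only: a generic steady-state design heating load,
# showing how crediting internal gains (as the study suggests) shrinks the
# calculated load. This is NOT the Manual J8 or PHPP procedure.

def design_heating_load_w(ua_w_per_k: float,
                          indoor_c: float,
                          outdoor_design_c: float,
                          internal_gains_w: float = 0.0) -> float:
    """Steady-state envelope loss minus credited internal gains (floored at 0)."""
    envelope_loss = ua_w_per_k * (indoor_c - outdoor_design_c)
    return max(envelope_loss - internal_gains_w, 0.0)

# Hypothetical superinsulated home: UA = 85 W/K, 21 C indoors, -20 C design temperature.
without_gains = design_heating_load_w(85, 21, -20)                        # 3485 W
with_gains = design_heating_load_w(85, 21, -20, internal_gains_w=600)     # 2885 W
print(round(without_gains), round(with_gains))
```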

  9. USING NLP APPROACH FOR ANALYZING CUSTOMER REVIEWS

    Directory of Open Access Journals (Sweden)

    Saleem Abuleil

    2017-01-01

    Full Text Available The Web is considered one of the main sources of customer opinions and reviews, which are represented in two formats: structured data (numeric ratings) and unstructured data (textual comments). Millions of textual comments about goods and services are posted on the web by customers, and thousands more are added every day, making it a big challenge to read and understand them in order to turn them into useful structured data for customers and decision makers. Sentiment analysis, or opinion mining, is a popular technique for summarizing and analyzing those opinions and reviews. In this paper, we use natural language processing techniques to generate rules that help us understand customer opinions and reviews (textual comments) written in the Arabic language, with the purpose of understanding each one of them and then converting it to structured data. We use adjectives as a key point to highlight important information in the text, and we then work around them to tag the attributes that describe the subject of the review and associate them with their values (the adjectives).
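
    The adjective-anchored idea can be sketched as follows. The example below is a naive heuristic in Python using NLTK part-of-speech tagging on English text purely for illustration; the paper's actual rules target Arabic and are not reproduced here, and the pairing heuristic (nearest following noun) is an assumption of this sketch.

```python
# Minimal sketch of adjective-anchored attribute/value extraction.
# Requires the NLTK models 'punkt' and 'averaged_perceptron_tagger'.
import nltk

def extract_attribute_value_pairs(review: str):
    """Pair each adjective (JJ*) with the nearest following noun (NN*) as (attribute, value)."""
    tokens = nltk.pos_tag(nltk.word_tokenize(review))
    pairs = []
    for i, (word, tag) in enumerate(tokens):
        if tag.startswith("JJ"):                      # adjective = opinion value
            for next_word, next_tag in tokens[i + 1:]:
                if next_tag.startswith("NN"):         # noun = attribute being described
                    pairs.append((next_word, word))
                    break
    return pairs

print(extract_attribute_value_pairs("The fast delivery and the friendly staff made everything easy."))
# e.g. [('delivery', 'fast'), ('staff', 'friendly')], depending on the tagger's output
```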

  10. Using Live-Cell Markers in Maize to Analyze Cell Division Orientation and Timing.

    Science.gov (United States)

    Rasmussen, Carolyn G

    2016-01-01

    Recently developed live-cell markers provide an opportunity to explore the dynamics and localization of proteins in maize, an important crop and model for monocot development. A step-by-step method is outlined for observing and analyzing the process of division in maize cells. The steps include plant growth conditions, sample preparation, time-lapse setup, and calculation of division rates.

  11. ANALYZING THE PROCESS OF PRODUCTION IN LOGISTICS SUGARCANE MILL: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Alexandre Tognoli

    2011-06-01

    Full Text Available The objective was to present and analyze the physical arrangement and the logistics of the production process in a sugarcane mill, in order to expose the processes involved, analyze them more deeply, and thus contribute to more efficient production. The relevance of this presentation is linked to the benefits that the plant and its professionals can obtain through this work, enabling the development of new methods and production alternatives. The research method used was a case study based on interviews, on-site observation and document analysis, which was very appropriate as it allowed examination and cross-checking of the data. This work allows a better understanding of the logistics of the production process in a sugarcane mill and offers suggestions and methods for more efficient production.

  12. Antifertility activity of medicinal plants.

    Science.gov (United States)

    Daniyal, Muhammad; Akram, Muhammad

    2015-07-01

    The aim of this review was to provide a comprehensive summary of medicinal plants used as antifertility agents in females throughout the world by various tribes and ethnic groups. We undertook an extensive bibliographic review by analyzing classical textbooks and peer-reviewed papers, and further consulting well-accepted worldwide scientific databases. We performed CENTRAL, Embase, and PubMed searches using terms such as "antifertility", "anti-implantation", "antiovulation", and "antispermatogenic" activity of plants. Plants, including their parts and extracts, that have traditionally been used to facilitate antifertility have been considered as antifertility agents. In this paper, various thoroughly studied medicinal plants are reviewed, such as Polygonum hydropiper Linn, Citrus limonum, Piper nigrum Linn, Juniperis communis, Achyanthes aspera, Azadirachta indica, Tinospora cordifolia, and Barleria prionitis. Many of these medicinal plants appear to act through an antizygotic mechanism. This review clearly demonstrates that it is time to expand upon experimental studies to source new potential chemical constituents from medicinal plants; plant extracts and their active constituents should be further investigated for their mechanisms. This review creates a solid foundation for further study of the efficacy of plants that are currently used by women as traditional antifertility medicines and that could prove efficacious as antifertility agents with additional research and study.

  13. Analyzing cancer samples with SNP arrays.

    Science.gov (United States)

    Van Loo, Peter; Nilsen, Gro; Nordgard, Silje H; Vollan, Hans Kristian Moen; Børresen-Dale, Anne-Lise; Kristensen, Vessela N; Lingjærde, Ole Christian

    2012-01-01

    Single nucleotide polymorphism (SNP) arrays are powerful tools to delineate genomic aberrations in cancer genomes. However, the analysis of these SNP array data of cancer samples is complicated by three phenomena: (a) aneuploidy: due to massive aberrations, the total DNA content of a cancer cell can differ significantly from its normal two copies; (b) nonaberrant cell admixture: samples from solid tumors do not exclusively contain aberrant tumor cells, but always contain some portion of nonaberrant cells; (c) intratumor heterogeneity: different cells in the tumor sample may have different aberrations. We describe here how these phenomena impact the SNP array profile, and how these can be accounted for in the analysis. In an extended practical example, we apply our recently developed and further improved ASCAT (allele-specific copy number analysis of tumors) suite of tools to analyze SNP array data using data from a series of breast carcinomas as an example. We first describe the structure of the data, how it can be plotted and interpreted, and how it can be segmented. The core ASCAT algorithm next determines the fraction of nonaberrant cells and the tumor ploidy (the average number of DNA copies), and calculates an ASCAT profile. We describe how these ASCAT profiles visualize both copy number aberrations as well as copy-number-neutral events. Finally, we touch upon regions showing intratumor heterogeneity, and how they can be detected in ASCAT profiles. All source code and data described here can be found at our ASCAT Web site ( http://www.ifi.uio.no/forskning/grupper/bioinf/Projects/ASCAT/).
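
    A small sketch of the admixture arithmetic that motivates such corrections may help. The relation below (bulk copy number as a mixture of tumor and normal diploid cells) is standard background rather than the ASCAT algorithm itself, which additionally estimates the aberrant cell fraction and the tumor ploidy from the data; the numbers are made up.

```python
# Hedged sketch: the basic admixture relation that allele-specific copy number
# tools build on. A bulk measurement mixes tumor cells (fraction rho) with
# normal diploid cells (fraction 1 - rho). This is not the ASCAT algorithm.

def tumor_copy_number(observed_cn: float, rho: float, normal_cn: float = 2.0) -> float:
    """Invert observed = rho * tumor + (1 - rho) * normal to get the tumor copy number."""
    if not 0.0 < rho <= 1.0:
        raise ValueError("rho (aberrant cell fraction) must be in (0, 1]")
    return (observed_cn - (1.0 - rho) * normal_cn) / rho

# A bulk measurement of 2.6 copies in a sample that is 60% tumor implies
# 3 copies in the tumor cells: (2.6 - 0.4*2) / 0.6 = 3.0
print(tumor_copy_number(2.6, rho=0.6))
```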

  14. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
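
    As an example of one listed diagnostic, the sketch below computes a simple time-lagged correlation between two series; the function and the synthetic data are illustrative assumptions, not CMDA's implementation.

```python
# Sketch of a time-lagged correlation diagnostic: correlate x(t) with y(t + lag)
# over a range of lags and report the lag of maximum correlation.
import numpy as np

def lagged_correlation(x: np.ndarray, y: np.ndarray, max_lag: int):
    """Return {lag: Pearson r between x(t) and y(t + lag)} for lag = -max_lag..max_lag."""
    result = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            a, b = x[:-lag], y[lag:]
        elif lag < 0:
            a, b = x[-lag:], y[:lag]
        else:
            a, b = x, y
        result[lag] = float(np.corrcoef(a, b)[0, 1])
    return result

t = np.arange(200)
x = np.sin(0.1 * t)
y = np.sin(0.1 * (t - 5)) + 0.1 * np.random.default_rng(0).normal(size=t.size)  # y lags x by 5 steps
corr = lagged_correlation(x, y, max_lag=10)
print(max(corr, key=corr.get))   # expected to be near +5 for this toy data
```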

  15. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.

  16. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
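
    A minimal sketch of the likelihood-ratio decision rule, with a crude two-stage flavour, is shown below. The Gaussian genuine/impostor score models and the thresholds are assumptions for illustration; the paper fits a joint probabilistic model over all 12 scores to real program data.

```python
# Hedged sketch of a likelihood-ratio verification decision with an
# "inconclusive -> acquire more biometrics" branch. Parameters are made up.
from scipy.stats import norm

GENUINE = norm(loc=0.80, scale=0.10)    # assumed score model for genuine residents
IMPOSTER = norm(loc=0.30, scale=0.15)   # assumed score model for imposters

def likelihood_ratio(scores):
    """Product over acquired scores of P(score | genuine) / P(score | imposter)."""
    lr = 1.0
    for s in scores:
        lr *= GENUINE.pdf(s) / IMPOSTER.pdf(s)
    return lr

def decide(scores, accept_threshold=1e3, reject_threshold=1e-1):
    """Accept or reject if the likelihood ratio is conclusive, otherwise ask for more images."""
    lr = likelihood_ratio(scores)
    if lr >= accept_threshold:
        return "genuine"
    if lr <= reject_threshold:
        return "imposter"
    return "inconclusive: acquire additional biometrics"

print(decide([0.82, 0.78]))   # strong genuine evidence under these toy models
print(decide([0.55]))         # likely inconclusive with these toy parameters
```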

  17. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods does not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
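
    To make the idea of an entropic measure on labeled versus unlabeled graphs concrete, the sketch below computes the Shannon entropy of the vertex partition induced first by degree (purely topological) and then by vertex labels such as atom types. This is a generic illustration, not the authors' specific descriptors, and the toy graph is hypothetical.

```python
# Illustrative sketch of one simple entropic measure on a vertex-labeled graph.
import math
from collections import Counter
import networkx as nx

def partition_entropy(items) -> float:
    """Shannon entropy (bits) of the equivalence classes induced by `items`."""
    counts = Counter(items)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy molecular skeleton C-C-C-O (hydrogens omitted), with atom-type labels.
g = nx.Graph([("C1", "C2"), ("C2", "C3"), ("C3", "O1")])
labels = {"C1": "C", "C2": "C", "C3": "C", "O1": "O"}

h_degree = partition_entropy(dict(g.degree()).values())        # unlabeled: degree classes
h_labels = partition_entropy(labels[v] for v in g.nodes())     # labeled: atom-type classes
print(h_degree, h_labels)   # 1.0 vs ~0.811 for this toy graph
```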

  18. Audubon Plant Study Program.

    Science.gov (United States)

    National Audubon Society, New York, NY.

    Included are an illustrated student reader, "The Story of Plants and Flowers," an adult leaders' guide, and a large wall chart picturing 37 wildflowers and describing 23 major plant families. The student reader presents these main topics: The Plant Kingdom, The Wonderful World of Plants, Plants Without Flowers, Flowering Plants, Plants Make Food…

  19. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
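
    The wrapping pattern described, exposing an existing analysis routine as a web service through a Python framework such as Flask, can be sketched as below. The endpoint name and the toy anomaly computation are assumptions for illustration and do not reflect CMDA's actual API.

```python
# Minimal sketch of wrapping a science routine as a web service with Flask.
from flask import Flask, jsonify, request
import numpy as np

app = Flask(__name__)

def seasonal_mean_anomaly(values):
    """Stand-in for an existing science routine: anomaly of each value from the mean."""
    arr = np.asarray(values, dtype=float)
    return (arr - arr.mean()).tolist()

@app.route("/analysis/anomaly", methods=["POST"])
def anomaly():
    payload = request.get_json(force=True)          # e.g. {"values": [280.1, 281.4, ...]}
    result = seasonal_mean_anomaly(payload["values"])
    return jsonify({"anomaly": result})

if __name__ == "__main__":
    # In production this would sit behind Gunicorn/Tornado, as the abstract notes.
    app.run(port=8080)
```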

  20. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    summarize the information, there is a data-grouping mechanism which provides, for different groups of records, the number of entries and the maximum, minimum and average values. Results. This technology has been tested in monitoring the demand for supplementary professional education services and in identifying the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours, and the survey was conducted over the course of a month. Conclusion. The proposed technology allows information to be collected in relational form in a short time and then analyzed, without the need for programming, with flexible assignment of the operating logic for the form.
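
    The grouping mechanism described above (entry counts plus maximum, minimum and average values per group of records) corresponds to a standard group-by aggregation; a sketch with pandas follows, with hypothetical column names and data.

```python
# Sketch of per-group count/max/min/mean aggregation over relational records.
import pandas as pd

responses = pd.DataFrame({
    "municipality": ["A", "A", "B", "B", "B"],
    "courses_requested": [2, 5, 1, 4, 3],
})

summary = responses.groupby("municipality")["courses_requested"].agg(
    ["count", "max", "min", "mean"]
)
print(summary)
#               count  max  min  mean
# municipality
# A                 2    5    2   3.50
# B                 3    4    1   2.67 (approx.)
```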

  1. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  2. Evaluation and performance characteristics of the Q Hemostasis Analyzer, an automated coagulation analyzer.

    Science.gov (United States)

    Toulon, Pierre; Fischer, Florence; Appert-Flory, Anny; Jambou, Didier

    2014-05-01

    The Q Hemostasis Analyzer (Grifols, Barcelona, Spain) is a fully automated random-access multiparameter analyzer designed to perform coagulation, chromogenic and immunologic assays. It is equipped with a cap-piercing system. The instrument was evaluated in the hemostasis laboratory of a University Hospital with respect to its technical features in the determination of coagulation assays, i.e. prothrombin time (PT), activated partial thromboplastin time (aPTT), thrombin time, fibrinogen and single coagulation factors V (FV) and VIII (FVIII); chromogenic assays [antithrombin (AT) and protein C activity]; and immunologic assays [von Willebrand factor antigen (vWF:Ag) concentration], using reagents from the analyzer manufacturer. Total precision (evaluated as the coefficient of variation) was below 6% for most parameters in both the normal and pathological ranges, except for FV, FVIII, AT and vWF:Ag in both the normal and pathological samples. No carryover was detected in alternating aPTT measurements in a pool of normal plasma samples and in the same pool spiked with unfractionated heparin (>1.5 IU/mL). The effective throughput was 154 PT, 66 PT/aPTT, 42 PT/aPTT/fibrinogen, and 38 PT/aPTT/AT panels per hour, corresponding to 154 to 114 tests performed per hour, depending on the tested panel. Test results obtained on the Q Hemostasis Analyzer were well correlated with those obtained on the ACL TOP analyzer (Instrumentation Laboratory), with r between 0.862 and 0.989. In conclusion, routine coagulation testing can be performed on the Q Hemostasis Analyzer with satisfactory precision, and the same applies to more specialized and specific tests.
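
    For reference, the precision metric quoted above, the coefficient of variation, is simply the standard deviation divided by the mean, expressed as a percentage; a small sketch with made-up replicate values follows.

```python
# Small sketch of the precision metric: coefficient of variation (CV, %).
import statistics

def coefficient_of_variation(values) -> float:
    """CV (%) = sample standard deviation / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

pt_replicates_sec = [12.1, 12.3, 11.9, 12.2, 12.0]   # hypothetical PT results in seconds
print(round(coefficient_of_variation(pt_replicates_sec), 2), "%")   # ~1.31 %
```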

  3. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. Basically all organisms on our planet generate energy through the Krebs Cycle, explains Mike Flynn, research scientist at NASA's Ames Research Center. This metabolic process breaks down sugars for energy

  4. Using Simulation to Analyze Acoustic Environments

    Science.gov (United States)

    Wood, Eric J.

    2016-01-01

    One of the main projects that was worked on this semester was creating an acoustic model for the Advanced Space Suit in Comsol Multiphysics. The geometry tools built into the software were used to create an accurate model of the helmet and upper torso of the suit. After running the simulation, plots of the sound pressure level within the suit were produced, as seen below in Figure 1. These plots show significant nulls which should be avoided when placing microphones inside the suit. In the future, this model can be easily adapted to changes in the suit design to determine optimal microphone placements and other acoustic properties. Another major project was creating an acoustic diverter that will potentially be used to route audio into the Space Station's Node 1. The concept of the project was to create geometry to divert sound from a neighboring module, the US Lab, into Node 1. By doing this, no new audio equipment would need to be installed in Node 1. After creating an initial design for the diverter, analysis was performed in Comsol in order to determine how changes in geometry would affect acoustic performance, as shown in Figure 2. These results were used to produce a physical prototype diverter on a 3D printer. With the physical prototype, testing was conducted in an anechoic chamber to determine the true effectiveness of the design, as seen in Figure 3. The results from this testing have been compared to the Comsol simulation results to analyze how closely the Comsol results match real-world performance. While the Comsol results do not seem to closely resemble the real-world performance, this testing has provided valuable insight into how much trust can be placed in the results of Comsol simulations. A final project that was worked on during this tour was the Audio Interface Unit (AIU) design for the Orion program. The AIU is a small device that will be used as an audio communication device both during launch and on-orbit. The unit will have functions

  5. Plant traits related to nitrogen uptake influence plant-microbe competition.

    Science.gov (United States)

    Moreau, Delphine; Pivato, Barbara; Bru, David; Busset, Hugues; Deau, Florence; Faivre, Céline; Matejicek, Annick; Strbik, Florence; Philippot, Laurent; Mougel, Christophe

    2015-08-01

    Plant species are important drivers of soil microbial communities. However, how plant functional traits are shaping these communities has received less attention though linking plant and microbial traits is crucial for better understanding plant-microbe interactions. Our objective was to determine how plant-microbe interactions were affected by plant traits. Specifically we analyzed how interactions between plant species and microbes involved in nitrogen cycling were affected by plant traits related to nitrogen nutrition in interaction with soil nitrogen availability. Eleven plant species, selected along an oligotrophic-nitrophilic gradient, were grown individually in a nitrogen-poor soil with two levels of nitrate availability. Plant traits for both carbon and nitrogen nutrition were measured and the genetic structure and abundance of rhizosphere microbial communities, in particular the ammonia oxidizer and nitrate reducer guilds, were analyzed. The structure of the bacterial community in the rhizosphere differed significantly between plant species and these differences depended on nitrogen availability. The results suggest that the rate of nitrogen uptake per unit of root biomass and per day is a key plant trait, explaining why the effect of nitrogen availability on the structure of the bacterial community depends on the plant species. We also showed that the abundance of nitrate reducing bacteria always decreased with increasing nitrogen uptake per unit of root biomass per day, indicating that there was competition for nitrate between plants and nitrate reducing bacteria. This study demonstrates that nitrate-reducing microorganisms may be adversely affected by plants with a high nitrogen uptake rate. Our work puts forward the role of traits related to nitrogen in plant-microbe interactions, whereas carbon is commonly considered as the main driver. It also suggests that plant traits related to ecophysiological processes, such as nitrogen uptake rates, are more

  6. Plant health sensing system for determining nitrogen status in plants

    Science.gov (United States)

    Thomasson, J. A.; Sui, Ruixiu; Read, John J.; Reddy, K. R.

    2004-03-01

    A plant health sensing system was developed for determining nitrogen status in plants. The system consists of a multi-spectral optical sensor and a data-acquisition and processing unit. The optical sensor's light source provides modulated panchromatic illumination of a plant canopy with light-emitting diodes, and the sensor measures spectral reflectance through optical filters that partition the energy into blue, green, red, and near-infrared wavebands. Spectral reflectance of plants is detected in situ, at the four wavebands, in real time. The data-acquisition and processing unit is based on a single board computer that collects data from the multi-spectral sensor and spatial information from a global positioning system receiver. Spectral reflectance at the selected wavebands is analyzed, with algorithms developed during preliminary work, to determine nitrogen status in plants. The plant health sensing system has been tested primarily in the laboratory and field so far, and promising results have been obtained. This article describes the development, theory of operation, and test results of the plant health sensing system.
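
    The abstract does not give the nitrogen-status algorithm itself, but a familiar example of turning such waveband reflectances into a canopy index is the NDVI computed from the red and near-infrared bands, sketched below with hypothetical reflectance values.

```python
# Hedged sketch: NDVI as one common index derived from red and NIR reflectance.
# The system's actual nitrogen-status algorithm is not detailed in the abstract.

def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from NIR and red reflectance (0-1 scale)."""
    return (nir - red) / (nir + red)

# Hypothetical canopy reflectances for the four measured wavebands.
bands = {"blue": 0.04, "green": 0.09, "red": 0.06, "nir": 0.45}
print(round(ndvi(bands["nir"], bands["red"]), 3))   # ~0.765
```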

  7. Stress tolerant plants

    OpenAIRE

    2014-01-01

    [EN] The invention relates to transgenic plants and methods for modulating abscisic acid (ABA) perception and signal transduction in plants. The plants find use in increasing yield in plants, particularly under abiotic stress.

  8. Plant fertilizer poisoning

    Science.gov (United States)

    Plant fertilizers and household plant foods are used to improve plant growth. Poisoning can occur if someone swallows these products. Plant fertilizers are mildly poisonous if small amounts are swallowed. ...

  9. Research Progress in Glycine Betaine Improving Plant Salty Stressful Tolerance

    Institute of Scientific and Technical Information of China (English)

    ZHU Hong; WANG Wenjie; YAN Yongqing; ZU Yuangang

    2008-01-01

    Many plants accumulate compatible solutes in response to the imposition of environmental stresses. Glycine betaine, one of the compatible solutes in plant cells, has been shown to help plants survive salt stress. The effect of glycine betaine on improving the salt resistance of plants under salt stress is discussed. The accumulation of glycine betaine protects plants against the damaging effects of stress. The strategies by which glycine betaine counteracts the damaging effects of stress are analyzed to clarify its roles in the salt stress tolerance of plants.

  10. [Issues of large scale tissue culture of medicinal plant].

    Science.gov (United States)

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

    In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzed the status, problems and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, it still faces problems such as the stability of the material, the safety of transgenic medicinal plants and the optimization of culture conditions. Establishing a sound evaluation system according to the characteristics of each medicinal plant is the key measure to ensure the sustainable development of large-scale tissue culture of medicinal plants.

  11. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    Directory of Open Access Journals (Sweden)

    Sayalee Narkhede

    2013-07-01

    Full Text Available In today’s Internet world, log file analysis is becoming a necessary task for analyzing customer behavior in order to improve advertising and sales; for datasets from domains such as environment, medicine and banking it is likewise important to analyze the log data to extract the required knowledge. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at a rate of 1-10 Mb/s per machine; a single data center can generate tens of terabytes of log data in a day. These datasets are huge. In order to analyze such large datasets we need a parallel processing system and a reliable data storage mechanism. A virtual database system is an effective solution for integrating the data, but it becomes inefficient for large datasets. The Hadoop framework provides reliable data storage through the Hadoop Distributed File System and the MapReduce programming model, a parallel processing system for large datasets. The Hadoop Distributed File System breaks up input data and sends fractions of the original data to several machines in the Hadoop cluster to hold blocks of data. This mechanism helps to process log data in parallel using all the machines in the Hadoop cluster and computes the result efficiently. The dominant approach provided by Hadoop, “store first, query later”, loads the data into the Hadoop Distributed File System and then executes queries written in Pig Latin. This approach reduces the response time as well as the load on the end system. This paper proposes a log analysis system using Hadoop MapReduce which provides accurate results in minimum response time.
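
    The MapReduce pattern applied to web logs can be sketched in plain Python as below; the real system runs the map and reduce phases on a Hadoop cluster over HDFS, and the log format and the hits-per-URL job here are illustrative choices.

```python
# Conceptual MapReduce sketch (map -> shuffle by key -> reduce), simulated in
# plain Python; Hadoop distributes these phases across a cluster.
from collections import defaultdict

def mapper(line):
    """Emit (url, 1) for each request line of a hypothetical 'ip url status' log."""
    parts = line.split()
    if len(parts) >= 2:
        yield parts[1], 1

def reducer(url, counts):
    """Sum the counts shuffled to this key."""
    return url, sum(counts)

def run_job(lines):
    shuffled = defaultdict(list)
    for line in lines:                      # map phase
        for key, value in mapper(line):
            shuffled[key].append(value)     # shuffle/sort phase (group by key)
    return dict(reducer(k, v) for k, v in shuffled.items())   # reduce phase

log = ["1.2.3.4 /index.html 200", "5.6.7.8 /index.html 200", "1.2.3.4 /about.html 404"]
print(run_job(log))    # {'/index.html': 2, '/about.html': 1}
```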

  12. Foreign acquisition, plant survival, and employment growth

    DEFF Research Database (Denmark)

    Bandick, Roger; Görg, Holger

    2010-01-01

    This paper analyzes the effect of foreign acquisition on the survival and employment growth of targets using data on Swedish manufacturing plants. We separate targeted plants into those within Swedish MNEs, Swedish exporting non-MNEs, and purely domestic firms. The results, controlling for possible endogeneity of acquisition using IV and propensity score matching approaches, suggest that acquisition by foreign owners increases the lifetime of the acquired plants only if the plant was an exporter. The effect is robust to controlling for domestic acquisitions and differs between horizontal and vertical acquisitions. We find robust positive employment growth effects only for exporters and only if the takeover is vertical.

  13. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  14. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  15. [Characteristics of working conditions at metallurgy-related plants].

    Science.gov (United States)

    Egorova, A M

    2008-01-01

    Working conditions at steel plants with more versus less advanced technology in the Volgograd Region are analyzed. The working conditions at the less technologically advanced plants correspond to a very high occupational risk. It is necessary to develop measures to reduce the adverse impact of microclimate, dust and noise, to improve illumination, and to regulate labor at steel plants.

  16. Investigating Effects of Invasive Species on Plant Community Structure

    Science.gov (United States)

    Franklin, Wilfred

    2008-01-01

    In this article, the author presents a field study project that explores factors influencing forest community structure and lifts the veil off of "plant blindness." This ecological study consists of three laboratories: (1) preliminary field trip to the study site; (2) plant survey; and (3) analyzing plant community structure with descriptive…

  17. 40 CFR 92.119 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 92... Hydrocarbon analyzer calibration. The HFID hydrocarbon analyzer shall receive the following initial and... into service and at least annually thereafter, the HFID hydrocarbon analyzer shall be adjusted...

  18. 40 CFR 86.1321-94 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Procedures § 86.1321-94 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the... into service and at least annually thereafter, the FID hydrocarbon analyzer shall be adjusted...

  19. 40 CFR 91.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 91....316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon analyzer as described... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified...

  20. 40 CFR 89.319 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 89... Equipment Provisions § 89.319 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall... and at least annually thereafter, adjust the FID hydrocarbon analyzer for optimum hydrocarbon...

  1. RootAnalyzer: A Cross-Section Image Analysis Tool for Automated Characterization of Root Cells and Tissues.

    Directory of Open Access Journals (Sweden)

    Joshua Chopin

    Full Text Available The morphology of plant root anatomical features is a key factor in effective water and nutrient uptake. Existing techniques for phenotyping root anatomical traits are often based on manual or semi-automatic segmentation and annotation of microscopic images of root cross sections. In this article, we propose a fully automated tool, hereinafter referred to as RootAnalyzer, for efficiently extracting and analyzing anatomical traits from root-cross section images. Using a range of image processing techniques such as local thresholding and nearest neighbor identification, RootAnalyzer segments the plant root from the image's background, classifies and characterizes the cortex, stele, endodermis and epidermis, and subsequently produces statistics about the morphological properties of the root cells and tissues. We use RootAnalyzer to analyze 15 images of wheat plants and one maize plant image and evaluate its performance against manually-obtained ground truth data. The comparison shows that RootAnalyzer can fully characterize most root tissue regions with over 90% accuracy.

  2. RootAnalyzer: A Cross-Section Image Analysis Tool for Automated Characterization of Root Cells and Tissues.

    Science.gov (United States)

    Chopin, Joshua; Laga, Hamid; Huang, Chun Yuan; Heuer, Sigrid; Miklavcic, Stanley J

    2015-01-01

    The morphology of plant root anatomical features is a key factor in effective water and nutrient uptake. Existing techniques for phenotyping root anatomical traits are often based on manual or semi-automatic segmentation and annotation of microscopic images of root cross sections. In this article, we propose a fully automated tool, hereinafter referred to as RootAnalyzer, for efficiently extracting and analyzing anatomical traits from root-cross section images. Using a range of image processing techniques such as local thresholding and nearest neighbor identification, RootAnalyzer segments the plant root from the image's background, classifies and characterizes the cortex, stele, endodermis and epidermis, and subsequently produces statistics about the morphological properties of the root cells and tissues. We use RootAnalyzer to analyze 15 images of wheat plants and one maize plant image and evaluate its performance against manually-obtained ground truth data. The comparison shows that RootAnalyzer can fully characterize most root tissue regions with over 90% accuracy.
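
    One of the listed image-processing steps, local thresholding, can be sketched with scikit-image as below. The synthetic image, block size and offset are assumptions for illustration; RootAnalyzer's full pipeline additionally classifies the cortex, stele, endodermis and epidermis and computes tissue statistics.

```python
# Sketch of local (adaptive) thresholding on a synthetic root-section-like image.
import numpy as np
from skimage.filters import threshold_local

rng = np.random.default_rng(0)
image = rng.normal(loc=0.4, scale=0.05, size=(200, 200))
image[60:140, 60:140] += 0.3            # bright "root section" on a darker background

local_thresh = threshold_local(image, block_size=51, offset=-0.02)
mask = image > local_thresh             # True where tissue is brighter than its neighborhood

print(mask.sum(), "foreground pixels of", mask.size)
```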

  3. Direct flash steam geothermal power plant assessment

    Science.gov (United States)

    Alt, T. E.

    1982-01-01

    The objective was to analyze the capacity and availability factors of an operating direct flash geothermal power plant. System and component specifications, operating procedures, maintenance history, malfunctions, and outage rate are discussed. The plant studied was the 75 MW(e) geothermal power plant at Cerro Prieto, Mexico, for the years 1973 to 1979. To describe and assess the plant, the project staff reviewed documents, visited the plant, and met with staff of the operating utility. The high reliability and availability of the plant was documented and actions responsible for the good performance were identified and reported. The results are useful as guidance to US utilities considering use of hot water geothermal resources for power generation through a direct flash conversion cycle.
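
    For reference, the two performance metrics analyzed in the study follow the standard definitions sketched below; the hours and energy figures are hypothetical, not Cerro Prieto data.

```python
# Standard definitions of availability factor and capacity factor.

def availability_factor(hours_available: float, hours_in_period: float) -> float:
    return hours_available / hours_in_period

def capacity_factor(energy_generated_mwh: float, rated_mw: float, hours_in_period: float) -> float:
    return energy_generated_mwh / (rated_mw * hours_in_period)

hours_year = 8760
print(availability_factor(8300, hours_year))          # ~0.95
print(capacity_factor(560_000, 75, hours_year))       # ~0.85 for a 75 MW(e) plant
```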

  4. Analyzing the complex machinery of cell wall biosynthesis

    NARCIS (Netherlands)

    Timmers, J.F.P.

    2009-01-01

    The plant cell wall polymers make up most of the plant biomass and provide the raw material for many economically important products including food, feed, bio-materials, chemicals, textiles, and biofuel. This broad range of functions and applications make the biosynthesis of these polysaccharides a

  5. Methane production from plant biomass

    Energy Technology Data Exchange (ETDEWEB)

    Zauner, E.

    1985-01-01

    Methane fermentations of plant biomass were performed to increase the basic knowledge necessary for the development of suitable conversion technologies. The effects of bacterial inoculants, substrate compounds and varied process conditions were analyzed in batch and continuous fermentation experiments. The use of enriched bacterial populations precultured on and adapted to plant materials proved advantageous for inoculation. Methane yields and productivities as well as the chemical and bacterial composition of digester fluids were determined at various loading rates and retention times during fermentation of different grass and maize silages. Recycling favorable amounts of decomposed effluent to neutralize the acidic raw materials supplied was important for achieving high methane yields. The quantity and composition of acido-, aceto- and methanogenic bacteria were not essentially influenced by changed fermentation conditions. The results of these laboratory examinations have to be complemented by long-run and scale-up experiments to develop control parameters for plant biogas digesters.

  6. Teaching Plant Reproduction.

    Science.gov (United States)

    Tolman, Marvin N., Ed.; Hardy, Garry R., Ed.

    2000-01-01

    Recommends using Amaryllis hippeastrum to teach young children about plant reproduction. Provides tips for growing these plants, discusses the fast growing rate of the plant, and explains the anatomy. (YDS)

  7. Poinsettia plant exposure

    Science.gov (United States)

    Christmas flower poisoning; Lobster plant poisoning; Painted leaf poisoning ... Leaves, stem, sap of the poinsettia plant ... Poinsettia plant exposure can affect many parts of the body. EYES (IF DIRECT CONTACT OCCURS) Burning Redness STOMACH AND ...

  8. Kansas Power Plants

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas Power Plants database depicts, as point features, the locations of the various types of power plant locations in Kansas. The locations of the power plants...

  9. Herbivory and dominance shifts among exotic and congeneric native plant species during plant community establishment

    DEFF Research Database (Denmark)

    Engelkes, Tim; Meisner, Annelein; Morriën, Elly;

    2016-01-01

    Invasive exotic plant species often have fewer natural enemies and suffer less damage from herbivores in their new range than genetically or functionally related species that are native to that area. Although we might expect that having fewer enemies would promote the invasiveness of the introduced exotic plant species due to reduced enemy exposure, few studies have actually analyzed the ecological consequences of this situation in the field. Here, we examined how exposure to aboveground herbivores influences shifts in dominance among exotic and phylogenetically related native plant species in a riparian ecosystem during early establishment of invaded communities. We planted ten plant communities each consisting of three individuals of each of six exotic plant species as well as six phylogenetically related natives. Exotic plant species were selected based on a rapid recent increase in regional

  10. Analyzing salvia divinorum and its active ingredient salvinorin a utilizing thin layer chromatography and gas chromatography/mass spectrometry.

    Science.gov (United States)

    Jermain, John D; Evans, Hiram K

    2009-05-01

    In recent years, Salvia divinorum has become a major focus by state legislatures throughout the United States looking to prohibit the sale of the psychoactive plant. After researching testing procedures presented in the literature and those employed by crime laboratories throughout the country, it was decided that thin layer chromatography (TLC) and gas chromatography/mass spectrometry (GC/MS) were the methods to use to analyze plant material for salvinorin A. With TLC, salvinorin A was detected from extracted plant material and was easily distinguishable from 13 other Salvia species as well as Cannabis sativa L. (marijuana). When using GC/MS, salvinorin A was best extracted from plant material with chloroform at ambient temperature when using a nonpolar solvent and acetone at ambient temperature when using a polar solvent. By utilizing these techniques, criminalists are now able to confirm the presence of salvinorin A in a submitted plant material suspected to be Salvia divinorum.

  11. In-Born Radio Frequency Identification Devices for Safeguards Use at Gas-Centrifuge Enrichment Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ward,R.; Rosenthal,M.

    2009-07-12

    Global expansion of nuclear power has made the need for improved safeguards measures at Gas Centrifuge Enrichment Plants (GCEPs) imperative. One technology under consideration for safeguards applications is Radio Frequency Identification Devices (RFIDs). RFIDs have the potential to increase IAEA inspectors' efficiency and effectiveness either by reducing the number of inspection visits necessary or by reducing inspection effort at those visits. This study assesses the use of RFIDs as an integral component of the "Option 4" safeguards approach developed by Bruce Moran, U.S. Nuclear Regulatory Commission (NRC), for a model GCEP [1]. A previous analysis of RFIDs was conducted by Jae Jo, Brookhaven National Laboratory (BNL), which evaluated the effectiveness of an RFID tag applied by the facility operator [2]. This paper presents a similar evaluation carried out in the framework of Jo's paper, but it is predicated on the assumption that the RFID tag is applied by the manufacturer at the birth of the cylinder, rather than by the operator. Relevant diversion scenarios are examined to determine if RFIDs increase the effectiveness and/or efficiency of safeguards in these scenarios. Conclusions on the benefits offered to inspectors by using in-born RFID tagging are presented.

  12. Lab-on-a-chip Astrobiology Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer will...

  13. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  14. 40 CFR 90.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Emission Test Equipment Provisions § 90.320 Carbon dioxide analyzer calibration. (a) Prior to its initial... carbon dioxide analyzer as follows: (1) Follow good engineering practices for instrument start-up...

  15. 40 CFR 89.322 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Test Equipment Provisions § 89.322 Carbon dioxide analyzer calibration. (a) Prior to its introduction... carbon dioxide analyzer shall be calibrated on all normally used instrument ranges. New...

  16. 21 CFR 868.1400 - Carbon dioxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Carbon dioxide gas analyzer. 868.1400 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1400 Carbon dioxide gas analyzer. (a) Identification. A carbon dioxide gas analyzer is a device intended to measure the concentration of carbon...

  17. 21 CFR 882.1420 - Electroencephalogram (EEG) signal spectrum analyzer.

    Science.gov (United States)

    2010-04-01

    ....1420 Electroencephalogram (EEG) signal spectrum analyzer. (a) Identification. An electroencephalogram (EEG) signal spectrum analyzer is a device used to display the frequency content or power spectral... analyzer. 882.1420 Section 882.1420 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH...

  18. 21 CFR 1230.32 - Analyzing of samples.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Analyzing of samples. 1230.32 Section 1230.32 Food... FEDERAL CAUSTIC POISON ACT Administrative Procedures § 1230.32 Analyzing of samples. Samples collected by an authorized agent shall be analyzed at the laboratory designated by the Food and...

  19. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food... DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1670 Neon gas analyzer. (a) Identification. A neon gas analyzer is a device intended to measure the concentration of neon in a gas mixture exhaled by...

  20. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864....5680 Automated heparin analyzer. (a) Identification. An automated heparin analyzer is a device used to determine the heparin level in a blood sample by mixing the sample with protamine (a...

  1. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86...-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1221-90 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the following initial and periodic calibrations. (a) Initial and...

  2. 40 CFR 90.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 90... Equipment Provisions § 90.316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified...

  3. 40 CFR 86.331-79 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86....331-79 Hydrocarbon analyzer calibration. The following steps are followed in sequence to calibrate the hydrocarbon analyzer. It is suggested, but not required, that efforts be made to minimize relative...

  4. 40 CFR 86.121-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Complete Heavy-Duty Vehicles; Test Procedures § 86.121-90 Hydrocarbon analyzer calibration. The hydrocarbon... FID and HFID hydrocarbon analyzers shall be adjusted for optimum hydrocarbon response....

  5. 40 CFR 86.521-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Regulations for 1978 and Later New Motorcycles; Test Procedures § 86.521-90 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall receive the following initial and periodic calibration....

  6. 21 CFR 868.2380 - Nitric oxide analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitric oxide analyzer. 868.2380 Section 868.2380...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Monitoring Devices § 868.2380 Nitric oxide analyzer. (a) Identification. The nitric oxide analyzer is a device intended to measure the concentration of nitric oxide...

  7. Ethylene insensitive plants

    Science.gov (United States)

    Ecker, Joseph R.; Nehring, Ramlah; McGrath, Robert B.

    2007-05-22

    Nucleic acid and polypeptide sequences are described which relate to an EIN6 gene, a gene involved in the plant ethylene response. Plant transformation vectors and transgenic plants are described which display an altered ethylene-dependent phenotype due to altered expression of EIN6 in transformed plants.

  8. Plant Growth Regulators.

    Science.gov (United States)

    Nickell, Louis G.

    1978-01-01

    Describes the effect of "plant growth regulators" on plants, such as controlling the flowering, fruit development, plant size, and increasing crop yields. Provides a list of plant growth regulators which includes their chemical, common, and trade names, as well as their different use(s). (GA)

  9. Plant Biology Science Projects.

    Science.gov (United States)

    Hershey, David R.

    This book contains science projects about seed plants that deal with plant physiology, plant ecology, and plant agriculture. Each of the projects includes a step-by-step experiment followed by suggestions for further investigations. Chapters include: (1) "Bean Seed Imbibition"; (2) "Germination Percentages of Different Types of Seeds"; (3)…

  10. THE EUROPEAN POSITION OF DUTCH PLANT COMMUNITIES

    Directory of Open Access Journals (Sweden)

    J.A.M. JANSSEN

    2007-04-01

    Full Text Available In this paper it is analyzed for which plant communities (alliances) the Netherlands has an international responsibility. Data have been brought together on the range and distribution of alliances in Europe, the area of plant communities in the Netherlands and surrounding countries, and the occurrence of endemic associations in the Netherlands. The analysis resulted in a list of 34 out of 93 alliances in the Netherlands that are important from an international point of view.

  11. Analyzing Impact of Intermodal Facilities on Design and Management of Biofuel Supply Chain

    Energy Technology Data Exchange (ETDEWEB)

    Eksioglu, Sandra D [ORNL]; Li, Song [ORNL]; Zhang, Shu [Mississippi State University (MSU)]; Petrolia, Daniel [Mississippi State University (MSU)]; Sokhansanj, Shahabaddine [ORNL]

    2010-09-01

    The impact of an intermodal facility on location and transportation decisions for biofuel production plants is analyzed. Location decisions affect the management of the inbound and outbound logistics of a plant. This supply chain design and management problem is modeled as a mixed integer program. Input data for this model are the location of intermodal facilities and available transportation modes, the cost and cargo capacity of each transportation mode, the geographical distribution of biomass feedstock and production yields, and biomass processing and inventory costs. Outputs from this model are the number, location, and capacity of biofuel production plants. For each plant, the transportation mode used, timing of shipments, shipment size, inventory size, and production schedule that minimize the delivery cost of biofuel are determined. The model proposed in this research can be used as a decision-making tool for investors in the biofuels industry since it estimates the real cost of the business. The state of Mississippi is used as the testing ground for the model.
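
    The abstract does not give the algebraic formulation, so the following is only a minimal sketch of a facility-location mixed integer program in the same spirit, written in Python with the PuLP library. All site names, supply figures, and cost parameters are hypothetical placeholders, not data from the study.

    from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

    counties = ["C1", "C2", "C3"]        # biomass supply regions (hypothetical)
    sites = ["rail_hub", "barge_port"]   # candidate plant sites near intermodal facilities
    supply = {"C1": 400, "C2": 250, "C3": 300}              # available biomass, kt/yr
    fixed_cost = {"rail_hub": 9.0e6, "barge_port": 7.5e6}   # annualized plant cost, $/yr
    ship_cost = {("C1", "rail_hub"): 18, ("C1", "barge_port"): 30,
                 ("C2", "rail_hub"): 25, ("C2", "barge_port"): 15,
                 ("C3", "rail_hub"): 22, ("C3", "barge_port"): 20}  # delivery cost, $/t
    capacity = 500   # maximum biomass a plant can process, kt/yr
    demand = 600     # total biomass that must be processed, kt/yr

    prob = LpProblem("biofuel_supply_chain", LpMinimize)
    y = {j: LpVariable(f"open_{j}", cat=LpBinary) for j in sites}            # build plant j?
    x = {(i, j): LpVariable(f"ship_{i}_{j}", lowBound=0) for i in counties for j in sites}

    # Objective: annualized plant costs plus biomass transportation costs
    prob += lpSum(fixed_cost[j] * y[j] for j in sites) + \
            lpSum(ship_cost[i, j] * 1000 * x[i, j] for i in counties for j in sites)

    for i in counties:   # cannot ship more biomass than each county supplies
        prob += lpSum(x[i, j] for j in sites) <= supply[i]
    for j in sites:      # plant capacity is available only if the plant is built
        prob += lpSum(x[i, j] for i in counties) <= capacity * y[j]
    prob += lpSum(x[i, j] for i in counties for j in sites) >= demand   # meet total demand

    prob.solve()
    for j in sites:
        print(j, "open" if y[j].value() == 1 else "closed")

    The model in the paper additionally chooses among transportation modes and schedules shipments and inventory over time; the sketch captures only the location and flow decisions.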

  12. JSTOR Plant Science

    OpenAIRE

    2010-01-01

    JSTOR Plant Science is an online environment that brings together content, tools, and people interested in plant science. It provides access to foundational content vital to plant science – plant type specimens, taxonomic structures, scientific literature, and related materials, making them widely accessible to the plant science community as well as to researchers in other fields and to the public. It also provides an easy to use interface with powerful functionality that su...

  13. Plant Research '75

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-01

    Research is reported on stomatal regulation of gas exchange between plant and environment; inhibitory effects in flower formation; plant growth and development through hormones; hormone action; development and nitrogen fixation in algae; the primary cell wall glycoprotein extensin; enzymic mechanisms and control of polysaccharide and glycoprotein synthesis; molecular studies of membranes; sensory transduction in plants; regulation of the formation of protein complexes and enzymes in higher plant cells; and the mechanism of sulfur dioxide toxicity in plants. (PCS)

  14. PLANT BIOPRINTING: NOVEL PERSPECTIVE FOR PLANT BIOTECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Adhityo WICAKSONO

    2015-12-01

    Full Text Available Bioprinting is a technical innovation that has revolutionized tissue engineering. Using conventional printer cartridges filled with cells as well as a suitable scaffold, major advances have been made in the biomedical field, and it is now possible to print skin, bones, blood vessels, and even organs. Unlike in animal systems, the application of bioprinting to simple plant tissue cells is still in a nascent phase and has yet to be studied. One major advantage of plants is that all living parts are reprogrammable in the form of totipotent cells. Plant bioprinting may improve scientists' understanding of plant shape and morphogenesis, and could serve for the mass production of desired tissues or plants, or even the production of plant-based biomaterial for industrial uses. This perspectives paper explores these possibilities using knowledge of bioprinting in other biosystems.

  15. Risk and insurance management for biogas plants

    Energy Technology Data Exchange (ETDEWEB)

    Haerig, M. [Marsh GmbH, Duesseldorf (Germany)

    2007-07-01

    Insurers' continuing negative experience with biogas plants makes it difficult for operators to obtain appropriate insurance cover. There are several reasons for the large number of claims. Especially in the early years, components were adapted for biogas service without sufficient operating experience. Other losses have resulted from improper or careless plant management. As a consequence, insurers' requirements for biogas plants and their management have been raised. This report discusses insurance experience with biogas plants and the resulting consequences. To this end, Marsh carried out a research project and analyzed all claims reported under the All Risk Insurance. The minimum technical requirements for installation and operation are based on this loss experience, but they also take the interests of the insured into account. (orig.)

  16. Gramene database: Navigating plant comparative genomics resources

    Directory of Open Access Journals (Sweden)

    Parul Gupta

    2016-11-01

    Full Text Available Gramene (http://www.gramene.org) is an online, open source, curated resource for plant comparative genomics and pathway analysis designed to support researchers working in plant genomics, breeding, evolutionary biology, systems biology, and metabolic engineering. It exploits phylogenetic relationships to enrich the annotation of genomic data and provides tools to perform powerful comparative analyses across a wide spectrum of plant species. It consists of an integrated portal for querying, visualizing and analyzing data for 44 plant reference genomes, genetic variation data sets for 12 species, expression data for 16 species, curated rice pathways, and orthology-based pathway projections for 66 plant species including various crops. Here we briefly describe the functions and uses of the Gramene database.

  17. The controversial telomeres of lily plants.

    Science.gov (United States)

    de la Herrán, R; Cuñado, N; Navajas-Pérez, R; Santos, J L; Ruiz Rejón, C; Garrido-Ramos, M A; Ruiz Rejón, M

    2005-01-01

    The molecular structure of the exceptional telomeres of six plant species belonging to the order Asparagales and two species of the order Liliales was analyzed using Southern blot and fluorescence in situ hybridization. Three different situations were found, namely: i) In the two Liliales species, Tulipa australis (Liliaceae) and Merendera montana (Colchicaceae), the chromosome ends display hybridization signals with oligonucleotides resembling telomere repeats of both plants (TTTAGGG)n and vertebrates (TTAGGG)n. ii) Asparagales species such as Phormium tenax (Hemerocallidaceae), Muscari comosum (Hyacinthaceae), Narcissus jonquilla (Amaryllidaceae) and Allium sativum (Alliaceae) lack both the plant telomere repeats and the vertebrate telomere repeats. iii) Two other Asparagales species, Aloe vera (Asphodelaceae) and an Iris hybrid (Iridaceae), display positive hybridization with the vertebrate telomere repeats but not with the plant telomere repeats. Southern blot hybridization revealed concurring results. On this basis, the composition of the telomere structure in this plant group is discussed.

  18. Measurement of the analyzing power of proton-carbon elastic scattering in the CNI region at RHIC

    CERN Document Server

    Jinnouchi, O; Bravar, A; Bunce, G; Dhawan, S; Huang, H; Igo, G; Kanavets, V P; Kurita, K; Okada, H; Saitô, N; Spinka, H; Svirida, D N; Wood, J

    2005-01-01

    The single transverse spin asymmetry, A_N, of the p-carbon elastic scattering process in the Coulomb Nuclear Interference (CNI) region was measured using an ultra-thin carbon target and the polarized proton beam of the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL). In 2004, data were collected to calibrate the p-carbon process at two RHIC energies (24 GeV and 100 GeV). A_N was obtained as a function of the momentum transfer -t. The results were fit with theoretical models, which allows the contribution from a hadronic spin-flip amplitude to be assessed.
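
    The abstract does not state the estimator used; for orientation only, the textbook way such a single-spin asymmetry is extracted is to divide the raw counting asymmetry by the beam polarization P,

    \[ A_N(-t) \;=\; \frac{1}{P}\,\frac{N_{\uparrow}(-t) - N_{\downarrow}(-t)}{N_{\uparrow}(-t) + N_{\downarrow}(-t)}, \]

    where N_{\uparrow} and N_{\downarrow} are the recoil-carbon yields recorded in a given -t bin for the two transverse beam-spin states.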

  19. Reshaping Plant Biology: Qualitative and Quantitative Descriptors for Plant Morphology

    Science.gov (United States)

    Balduzzi, Mathilde; Binder, Brad M.; Bucksch, Alexander; Chang, Cynthia; Hong, Lilan; Iyer-Pascuzzi, Anjali S.; Pradal, Christophe; Sparks, Erin E.

    2017-01-01

    An emerging challenge in plant biology is to develop qualitative and quantitative measures to describe the appearance of plants through the integration of mathematics and biology. A major hurdle in developing these metrics is finding common terminology across fields. In this review, we define approaches for analyzing plant geometry, topology, and shape, and provide examples for how these terms have been and can be applied to plants. In leaf morphological quantifications both geometry and shape have been used to gain insight into leaf function and evolution. For the analysis of cell growth and expansion, we highlight the utility of geometric descriptors for understanding sepal and hypocotyl development. For branched structures, we describe how topology has been applied to quantify root system architecture to lend insight into root function. Lastly, we discuss the importance of using morphological descriptors in ecology to assess how communities interact, function, and respond within different environments. This review aims to provide a basic description of the mathematical principles underlying morphological quantifications. PMID:28217137
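
    None of the review's metrics are reproduced in the abstract, so the snippet below is only a minimal illustration of one common geometric shape descriptor of the kind discussed: circularity (4*pi*Area / Perimeter^2) of a leaf outline, which equals 1.0 for a circle and is smaller for lobed or elongated shapes. The outline coordinates are hypothetical.

    import math

    def polygon_area_perimeter(points):
        """Shoelace area and summed edge lengths of a closed polygon."""
        area, perimeter = 0.0, 0.0
        n = len(points)
        for k in range(n):
            x1, y1 = points[k]
            x2, y2 = points[(k + 1) % n]
            area += x1 * y2 - x2 * y1
            perimeter += math.hypot(x2 - x1, y2 - y1)
        return abs(area) / 2.0, perimeter

    def circularity(points):
        area, perimeter = polygon_area_perimeter(points)
        return 4.0 * math.pi * area / perimeter ** 2

    leaf_outline = [(0, 0), (2, 0.5), (3, 2), (2, 3.5), (0, 4), (-1, 2)]  # hypothetical outline
    print(f"circularity = {circularity(leaf_outline):.3f}")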

  20. Evaluation of the Olympus AU 400 clinical chemistry analyzer.

    Science.gov (United States)

    Bilić, A; Alpeza, I; Rukavina, A S

    2000-01-01

    The performance of the Olympus AU 400 clinical chemistry analyzer was evaluated according to the guidelines of the European Committee for Clinical Laboratory Standards. The following analytes were tested: glucose, urea, creatinine, calcium, AST, ALT, CK, LDH, ALP and amylase. The Olympus AU 400 was compared with the Olympus AU 800. Coefficients of correlation showed high correlation between the compared analyzers. Other performance characteristics of the analyzer (intra- and inter-assay variation, carry-over and interferences) were satisfactory.

  1. Oxygen analyzers: failure rates and life spans of galvanic cells.

    Science.gov (United States)

    Meyer, R M

    1990-07-01

    Competing technologies exist for measuring oxygen concentrations in breathing circuits. Over a 4-year period, two types of oxygen analyzers were studied prospectively in routine clinical use to determine the incidence and nature of malfunctions. Newer AC-powered galvanic analyzers (North American Dräger O2med) were compared with older, battery-powered polarographic analyzers (Ohmeda 201) by recording all failures and necessary repairs. The AC-powered galvanic analyzer had a significantly lower incidence of failures (0.12 +/- 0.04 failures per machine-month) than the battery-powered polarographic analyzer (4.0 +/- 0.3 failures per machine-month). Disposable capsules containing the active galvanic cells lasted 12 +/- 7 months. Although the galvanic analyzers tended to remain out of service longer, awaiting the arrival of costly parts, the polarographic analyzers were more expensive to keep operating when calculations included the cost of time spent on repairs. Stocking galvanic capsules would have decreased the amount of time the galvanic analyzers were out of service, while increasing costs. In conclusion, galvanic oxygen analyzers appear capable of delivering more reliable service at a lower overall cost. By keeping the galvanic capsules exposed to room air during periods of storage, it should be possible to prolong their life span, further decreasing the cost of using them. In addition, recognizing the aberrations in their performance that warn of the exhaustion of the galvanic cells should permit timely recording and minimize downtime.

  2. A Novel Analyzer Control System for Diffraction Enhanced Imaging

    Science.gov (United States)

    Rhoades, Glendon; Belev, George; Rosenberg, Alan; Chapman, Dean

    2013-03-01

    Diffraction Enhanced Imaging is an imaging modality that derives contrast from x-ray refraction, an extreme form of scatter rejection (extinction), and absorption, which is common to conventional radiography. A critical part of the imaging system is the "analyzer crystal", which is used to re-diffract the beam after it passes through the object being imaged. The analyzer and monochromator crystals form a matched parallel crystal set. This analyzer needs to be accurately aligned, and that alignment must be maintained over the course of an imaging session. Typically, the analyzer needs to remain at a specific angle to within a few tens of nanoradians to prevent problems with image interpretation. Ideally, the analyzer would be set to a specific angle and would remain at that angle over the course of an imaging session, which might last from a fraction of a second to several minutes or longer. In many instances, this requirement is well beyond what is possible by relying on mechanical stability alone, and some form of feedback to control the analyzer setting is required. We describe a novel analyzer control system that allows the analyzer to be set at any location on the analyzer rocking curve, including the peak. The method can be extended to control the analyzer several Darwin widths away from the peaked position. Such a system is necessary for accurate implementation of the imaging method and is intended to make its use simpler by avoiding repeated alignment during the imaging session.
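
    The paper's actual control algorithm is not given in the abstract; the sketch below only illustrates the general idea of holding an analyzer at a chosen point on its rocking curve by feeding back on the measured intensity. The hardware calls are replaced by a simulated Gaussian rocking curve with slow thermal drift, and all numerical values are illustrative, not taken from the paper.

    import math
    import random

    DARWIN_WIDTH = 5000.0    # rocking-curve width, nanoradians (illustrative value)
    PEAK_COUNTS = 1.0e5      # counts/s at the rocking-curve peak (assumed known from a scan)
    SETPOINT = 0.5           # hold at the half-intensity point on the curve's low-angle side
    GAIN = 2000.0            # proportional gain, nanoradians per unit fractional-intensity error

    analyzer_angle = -6000.0 # current analyzer offset from the nominal peak, nanoradians
    drift = 0.0              # accumulated thermal drift of the peak position

    def read_intensity():
        """Simulated detector reading: Gaussian rocking curve centred on the drifting peak."""
        return PEAK_COUNTS * math.exp(-((analyzer_angle - drift) / DARWIN_WIDTH) ** 2)

    def move_analyzer(delta_nrad):
        """Simulated stage move: rotate the analyzer by delta_nrad nanoradians."""
        global analyzer_angle
        analyzer_angle += delta_nrad

    for step in range(200):
        drift += random.gauss(0.0, 5.0)               # slow random walk of the peak position
        error = SETPOINT - read_intensity() / PEAK_COUNTS
        # On the low-angle side intensity rises with angle, so too much intensity
        # (negative error) moves the crystal back down the slope, and vice versa.
        move_analyzer(GAIN * error)

    print(f"final fractional intensity: {read_intensity() / PEAK_COUNTS:.3f}")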

  3. [Feasibility of applying ornamental plants in contaminated soil remediation].

    Science.gov (United States)

    Liu, Jia-Nü; Zhou, Qi-Xing; Sun, Ting; Wang, Xiao-Fei

    2007-07-01

    Phytoremediation is an effective way to address contaminated soils, but only a limited number of hyperaccumulator plant species have been reported and documented. This shortage could be offset if remediation plants could be screened from ornamental species, which would also beautify the environment and bring some economic benefits. Starting from the importance of phytoremediation, this paper summarizes the characteristics and standards of remediation plants. By describing ornamental plant resources and their roles in environmental protection, detailing their advantages over other plants, and analyzing their tolerance, accumulation traits and remediation types, the feasibility of applying ornamental plants to contaminated soil remediation is discussed. Screening hyperaccumulators from ornamental plants would open an entirely new research area in the remediation of contaminated soils.

  4. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    Agent-Based Modeling Methodology for Analyzing Weapons Systems. Thesis by Casey D. Connors, Major, USA, presented to the Faculty, Department of Operational Sciences. The indexed excerpt references Figure 14, "Simulation Study Methodology for the Weapon System Analysis," covering metrics definition and data collection, and notes that the analysis plan calls for...

  5. 40 CFR 86.1524 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration. 86.1524 Section 86.1524 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Test Procedures § 86.1524 Carbon dioxide analyzer calibration. (a) The calibration requirements for...

  6. 40 CFR 91.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Provisions § 91.320 Carbon dioxide analyzer calibration. (a) Prior to its introduction into service, and monthly thereafter, or within one month prior to the certification test, calibrate the NDIR carbon...

  7. 40 CFR 86.317-79 - Hydrocarbon analyzer specifications.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer specifications....317-79 Hydrocarbon analyzer specifications. (a) Hydrocarbon measurements are to be made with a heated... measures hydrocarbon emissions on a dry basis is permitted for gasoline-fueled testing; Provided,...

  8. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... at any point, use the best-fit non-linear equation which represents the data to within two percent of... been set to the most common operating range. (4) Introduce into the NOX generator analyzer-system an NO... off the NOX generator but maintain gas flow through the system. The oxides of nitrogen analyzer...

  9. DUAL-CHANNEL PARTICLE SIZE AND SHAPE ANALYZER

    Institute of Scientific and Technical Information of China (English)

    Arjen van der Schoot

    2004-01-01

    Fig. 1 shows a newly developed analyzer (Ankersmid CIS-100) that brings together two different measurement channels for accurate size and shape measurement of spherical and non-spherical particles. The size of spherical particles is measured by a HeNe laser beam; the size of non-spherical particles is analyzed by Dynamic Video Analysis of the particles' shape.

  10. Variability of Pesticide Dissipation Half-Lives in Plants

    DEFF Research Database (Denmark)

    Fantke, Peter; Juraske, Ronnie

    2013-01-01

    on the variability across substances, plant species and harvested plant components and finally discuss different substance, plant and environmental aspects influencing pesticide dissipation. Measured half-lives in harvested plant materials range from around 1 hour for pyrethrins in leaves of tomato and pepper fruit...... to 918 days for pyriproxyfen in pepper fruits under cold storage conditions. Ninety-five percent of all half-lives fall within the range between 0.6 and 29 days. Our results emphasize that future experiments are required to analyze pesticide–plant species combinations that have so far not been covered...
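
    Dissipation half-lives of this kind are usually reported under the assumption of first-order (exponential) decay of the residue, in which case the half-life and the rate constant are related by

    \[ C(t) = C_0\, e^{-k t}, \qquad t_{1/2} = \frac{\ln 2}{k}, \]

    so, for example, the 29-day value at the upper end of the 95% range corresponds to a dissipation rate constant of roughly k = ln 2 / 29 ≈ 0.024 per day. (The first-order assumption is the conventional one; the paper itself may treat deviations from it.)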

  11. Plant Phenotype Characterization System

    Energy Technology Data Exchange (ETDEWEB)

    Daniel W McDonald; Ronald B Michaels

    2005-09-09

    This report is the final scientific report for the DOE Inventions and Innovations Project: Plant Phenotype Characterization System, DE-FG36-04GO14334. The period of performance was September 30, 2004 through July 15, 2005. The project objective is to demonstrate the viability of a new scientific instrument concept for the study of plant root systems. The root systems of plants are thought to be important in plant yield and thus important to DOE goals in renewable energy sources. The scientific study and understanding of plant root systems is hampered by the difficulty in observing root activity and the inadequacy of existing root study instrumentation options. We have demonstrated a high throughput, non-invasive, high resolution technique for visualizing plant root systems in-situ. Our approach is based upon low-energy x-ray radiography and the use of containers and substrates (artificial soil) which are virtually transparent to x-rays. The system allows us to germinate and grow plant specimens in our containers and substrates and to generate x-ray images of the developing root system over time. The same plant can be imaged at different times in its development. The system can be used for root studies in plant physiology, plant morphology, plant breeding, plant functional genomics and plant genotype screening.

  12. Analyzing the genomes of wild and cultivated beets

    Science.gov (United States)

    Sugar beet is an important crop plant that accounts for roughly 25% of the world's sugar production per year. We have previously shown that sugar beet has a quite narrow genetic base, presumably due to a domestication bottleneck. To increase the crop's stress tolerance, the introduction of desirabl...

  13. Competition-density effect in plant populations

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The competition-density effect in plant populations is of significance in the theory and practice of forest management and has been studied for a long time. The differences between the two reciprocal equations of the competition-density effect, for non-self-thinning populations and for self-thinning populations, were analyzed theoretically. This supplies a theoretical basis for analyzing the dynamics of forest populations and evaluating the effect of forest management.
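
    The abstract does not reproduce the equations themselves. For orientation only, the classical reciprocal equation of the competition-density (C-D) effect for non-self-thinning stands, and the 3/2-power self-thinning law usually contrasted with it, are commonly written as

    \[ \frac{1}{w} = A\rho + B \qquad \text{and} \qquad w = K\rho^{-3/2}, \]

    where w is mean plant mass, \rho is stand density, and A, B, K are empirical constants; the exact pair of reciprocal equations analyzed in the paper may differ from these textbook forms.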

  14. Plant tissue culture techniques

    OpenAIRE

    Rolf Dieter Illg

    1991-01-01

    Plant cell and tissue culture refers, simply put, to techniques which utilize either single plant cells, groups of unorganized cells (callus), or organized tissues or organs put in culture under controlled sterile conditions.

  15. Plant growth and cultivation.

    Science.gov (United States)

    Podar, Dorina

    2013-01-01

    There is a variety of methods used for growing plants indoors for laboratory research. In most cases plant research requires germination and growth of plants. Often, people have adapted plant cultivation protocols to the conditions and materials at hand in their own laboratory and growth facilities. Here I provide a guide for growing some of the plant species most frequently used in research, i.e., Arabidopsis thaliana, barley (Hordeum vulgare) and rice (Oryza sativa). However, the methods presented can be used for other plant species as well, especially if they are related to the above-mentioned species. The presented methods include growing plants in soil, hydroponics, and in vitro on plates. This guide is intended as a starting point for those who are just beginning to work on any of the above-mentioned plant species. The methods presented are to be taken as suggestions, and modifications can be made according to the conditions in the host laboratory.

  16. Plant tissue culture techniques

    Directory of Open Access Journals (Sweden)

    Rolf Dieter Illg

    1991-01-01

    Full Text Available Plant cell and tissue culture refers, simply put, to techniques which utilize either single plant cells, groups of unorganized cells (callus), or organized tissues or organs put in culture under controlled sterile conditions.

  17. Classification of cultivated plants.

    NARCIS (Netherlands)

    Brandenburg, W.A.

    1986-01-01

    Agricultural practice demands principles for classification, starting from the basal entity in cultivated plants: the cultivar. In establishing biosystematic relationships between wild, weedy and cultivated plants, the species concept needs re-examination. Combining of botanic classification, based

  18. Plant proton pumps

    DEFF Research Database (Denmark)

    Gaxiola, Roberto A.; Palmgren, Michael Gjedde; Schumacher, Karin

    2007-01-01

    Chemiosmotic circuits of plant cells are driven by proton (H+) gradients that mediate secondary active transport of compounds across plasma and endosomal membranes. Furthermore, regulation of endosomal acidification is critical for endocytic and secretory pathways. For plants to react to their co...

  19. International Space Station Major Constituent Analyzer On-orbit Performance

    Science.gov (United States)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  20. Note: Portable rare-earth element analyzer using pyroelectric crystal

    Energy Technology Data Exchange (ETDEWEB)

    Imashuku, Susumu, E-mail: imashuku.susumu.2m@kyoto-u.ac.jp; Fuyuno, Naoto; Hanasaki, Kohei; Kawai, Jun [Department of Materials Science and Engineering, Kyoto University, Sakyo, Kyoto 606-8501 (Japan)

    2013-12-15

    We report a portable rare-earth element analyzer with a palm-top-size chamber containing a pyroelectric-crystal electron source and a sample stage, utilizing the cathodoluminescence (CL) phenomenon. It is the smallest CL-based portable rare-earth element analyzer reported so far. The analyzer detected the rare-earth elements Dy, Tb, Er, and Sm at ppm levels in zircon, which were not detected by scanning electron microscopy-energy dispersive X-ray spectroscopy analysis. We also performed elemental mapping of rare-earth elements by capturing a CL image with a CCD camera.

  1. COSTEP: A comprehensive suprathermal and energetic particle analyzer for SOHO

    Science.gov (United States)

    Kunow, Horst; Fischer, Harald; Green, Guenter; Mueller-Mellin, Reinhold; Wibberenz, Gerd; Holweger, Hartmut; Evenson, Paul; Meyer, Jean-Paul; Hasebe, Nabuyuki; Vonrosenvinge, Tycho

    1988-01-01

    The group of instruments involved in the COSTEP (comprehensive suprathermal and energetic particle analyzer) project is described. Three sensors, the LION (low energy ion and electron) instrument, the MEICA (medium energy ion composition analyzer) and the EPHIN (electron proton helium instrument), are described. They are designed to analyze particle emissions from the sun over a wide range of species (electrons through iron) and energies (60 keV/particle to 500 MeV/nucleon). The data collected are used in studying solar and space plasma physics.

  2. Validation of ESR analyzer using Westergren ESR method.

    Science.gov (United States)

    Sikka, Meera; Tandon, Rajesh; Rusia, Usha; Madan, Nishi

    2007-07-01

    Erythrocyte sedimentation rate (ESR) is one of the most frequently ordered laboratory tests. ESR analyzers were developed to provide a quick and efficient measure of ESR. We compared the results of ESR obtained by an ESR analyzer with those obtained by the Westergren method in a group of 75 patients. Linear regression analysis showed a good correlation between the two results (r = 0.818, p < 0.01). The intraclass correlation was 0.82. The analyzer method had the advantages of safety, decreased technician time and improved patient care by providing quick results.
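
    As a minimal sketch of the kind of method comparison reported, the snippet below computes a Pearson correlation, a least-squares regression line, and a mean bias between paired readings. The two arrays are hypothetical paired results in mm/h, not the study's data.

    import numpy as np

    westergren = np.array([5, 12, 20, 35, 48, 60, 75, 90])   # reference method, mm/h
    analyzer = np.array([7, 10, 23, 33, 50, 57, 80, 86])     # ESR analyzer, mm/h

    r = np.corrcoef(westergren, analyzer)[0, 1]              # Pearson correlation
    slope, intercept = np.polyfit(westergren, analyzer, 1)   # least-squares regression line
    bias = np.mean(analyzer - westergren)                    # mean difference (Bland-Altman bias)

    print(f"r = {r:.3f}, fit: analyzer = {slope:.2f} * westergren + {intercept:.2f}, "
          f"bias = {bias:.1f} mm/h")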

  3. Plant Systems Biology (editorial)

    Science.gov (United States)

    In June 2003, Plant Physiology published an Arabidopsis special issue devoted to plant systems biology. The intention of Natasha Raikhel and Gloria Coruzzi, the two editors of this first-of-its-kind issue, was "to help nucleate this new effort within the plant community" as they considered that "...

  4. Power Plant Cycling Costs

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, N.; Besuner, P.; Lefton, S.; Agan, D.; Hilleman, D.

    2012-07-01

    This report provides a detailed review of the most up-to-date data available on power plant cycling costs. The primary objective of this report is to increase awareness of power plant cycling cost and the use of these costs in renewable integration studies, and to stimulate debate between policymakers, system dispatchers, plant personnel and power utilities.

  5. Designing with plants

    NARCIS (Netherlands)

    Smits, R.

    2012-01-01

    This "designers' manual" is made during the TIDO-course AR0531 Smart & Bioclimatic Design. Rainforests are the lungs of the earth and plants can be the lungs of a buildings. Every plant uses CO2, water and light to produce sugars and oxygen; furthermore plants provide shade, take pollutants from th

  6. Plants of the Bayshore.

    Science.gov (United States)

    Bachle, Leo; And Others

    This field guide gives pictures and descriptions of plants that can be found along the San Francisco Bayshore, especially along the Hayward shoreline. The plants are divided into three categories, those of the mud-flat zone, the drier zone, and the levee zone. Eighteen plants are represented in all. The guide is designed to be used alone, with an…

  7. Plant Diseases & Chemicals

    OpenAIRE

    Thompson, Sherm

    2008-01-01

    This course discusses the use of chemicals for plant disease control, specifically pesticides that can be used in both commercial and home/yard situations. This course also teaches how to diagnose the plant diseases that may have caused a plant to die.

  8. Iron stress in plants.

    Science.gov (United States)

    Connolly, Erin L; Guerinot, Mary

    2002-07-30

    Although iron is an essential nutrient for plants, its accumulation within cells can be toxic. Plants, therefore, respond to both iron deficiency and iron excess by inducing expression of different gene sets. Here, we review recent advances in the understanding of iron homeostasis in plants gained through functional genomic approaches.

  9. Iron stress in plants

    OpenAIRE

    Connolly, Erin L.; Guerinot, Mary Lou

    2002-01-01

    Although iron is an essential nutrient for plants, its accumulation within cells can be toxic. Plants, therefore, respond to both iron deficiency and iron excess by inducing expression of different gene sets. Here, we review recent advances in the understanding of iron homeostasis in plants gained through functional genomic approaches.

  10. Recognizing plant defense priming

    NARCIS (Netherlands)

    Martinez-Medina, A.; Flors, V.; Heil, M.; Mauch-Mani, B.; Pieterse, C.M.J.; Pozo, M.J.; Ton, J.; Van Dam, N.M.; Conrath, U.

    2016-01-01

    Defense priming conditions diverse plant species for the superinduction of defense, often resulting in enhanced pest and disease resistance and abiotic stress tolerance. Here, we propose a guideline that might assist the plant research community in a consistent assessment of defense priming in plant

  11. The Photo-Pneumatic CO2 Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  12. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  13. Testing Evaluation of the Electrochemical Organic Content Analyzer

    Science.gov (United States)

    Davenport, R. J.

    1979-01-01

    The breadboard electrochemical organic content analyzer was evaluated for aerospace applications. Awareness of the disadvantages of expendables in some systems prompted an effort to investigate ways of reducing the consumption of the analyzer's electrolyte from the rate of 5.17 kg/30 days. It was found that the electrochemical organic content analyzer can serve as the organic monitor in the water quality monitor, with a range of 0.1 to 100 mg/L total organic carbon for a large number of common organic solutes. In a flight version, it is anticipated that the analyzer would occupy 0.0002 cu m, weigh 1.4 kg, and require 10 W or less of power. With the optimum method of injecting electrolyte into the sample (saturation of the sample with a salt), it would expend only 0.04 kg of electrolyte during 30 days of continuous operation.

  14. Mini Total Organic Carbon Analyzer (miniTOCA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Total Organic Carbon (TOC) analyzers function by converting (oxidizing) all organic compounds (contaminants) in the water sample to carbon dioxide gas (CO2), then...

  15. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  16. Mars & Multi-Planetary Electrical Environment Spectrum Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Our objective is to develop MENSA as a highly integrated planetary radio and digital spectrum analyzer cubesat payload that can be deployed as a satellite instrument...

  17. Airspace Analyzer for Assessing Airspace Directional Permeability Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level permeability...

  18. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to develop highly-accurate, lightweight, low-power gas analyzers for measurements of carbon dioxide (CO2) and water vapor (H2O)...

  19. 40 CFR 1065.250 - Nondispersive infra-red analyzer.

    Science.gov (United States)

    2010-07-01

    ... analyzer that has compensation algorithms that are functions of other gaseous measurements and the engine's known or assumed fuel properties. The target value for any compensation algorithm is 0.0% (that is,...

  20. 40 CFR 1065.272 - Nondispersive ultraviolet analyzer.

    Science.gov (United States)

    2010-07-01

    ... in § 1065.307. You may use a NDUV analyzer that has compensation algorithms that are functions of... compensation algorithm is 0.0% (that is, no bias high and no bias low), regardless of the uncompensated...