WorldWideScience

Sample records for analyzing pmoa mmox

  1. Linking carbon and nitrogen cycling: Environmental transcription of mmoX, pmoA, and nifH by methane oxidizing Proteobacteria in a Sub-Arctic palsa peatland

    Science.gov (United States)

    Liebner, Susanne; Svenning, Mette M.

    2013-04-01

    Sub-Arctic terrestrial ecosystems are currently affected by climate change, which causes degradation of stored organic carbon and emissions of greenhouse gases from microbial processes. Methane oxidizing bacteria (MOB) mitigate methane emissions and perform an important function in the soil-atmosphere interaction. In this study we investigated the presence and environmental transcription of functional genes of MOB along a permafrost degradation gradient in a Sub-Arctic palsa peatland using molecular approaches. The acidic and oligotrophic peatland hosts a small number of active MOB among a seemingly specialized community. The methanotrophic community displayed a broad functional potential by transcribing genes for key enzymes involved in both carbon and nitrogen metabolism, including particulate and soluble methane monooxygenase (pMMO and sMMO) as well as nitrogenase. Transcription of mmoX, which encodes a subunit of the sMMO, suggests an ecological importance of sMMO, with its broad substrate range, in this peatland. In situ transcripts of mmoX were traced mainly to Methylocella-related Beijerinckiaceae and to relatives of Methylomonas, while Methylocystis constituted the dominant group utilizing pMMO. These results raise interesting questions concerning in situ substrate preferences of MOB and the general importance of species that lack a pMMO for mitigating methane emissions. The importance of MOB for the nitrogen budget in this low-pH, nitrogen-limited habitat was identified by nifH transcripts of native methanotrophs. Hence, methane oxidizing Proteobacteria show an extended functional repertoire and importance for biogeochemical cycling in this dynamic ecosystem of degrading permafrost.

  2. Methane oxidation in industrial biogas plants-Insights in a novel methanotrophic environment evidenced by pmoA gene analyses and stable isotope labelling studies.

    Science.gov (United States)

    May, Tobias; Polag, Daniela; Keppler, Frank; Greule, Markus; Müller, Liane; König, Helmut

    2018-03-20

    A broad methanotrophic community consisting of 16 different operational taxonomic units (OTUs) was detected by particulate methane monooxygenase A (pmoA) gene analyses of reactor sludge samples obtained from an industrial biogas plant. Using a cloning-sequencing approach, 75% of the OTUs were affiliated with the group of type I methanotrophs (γ-Proteobacteria) and 25% with type II methanotrophs (α-Proteobacteria), with a distinct predominance of the genus Methylobacter. By database matching, half of the total OTUs may constitute entirely novel species. To evaluate process conditions that support growth of methanotrophic bacteria, qPCR analyses of pmoA gene copy numbers were performed during a sampling period of 70 days under varying reactor feeding scenarios. During the investigation period, methanotrophic cell counts estimated by qPCR fluctuated between 3.4 × 10⁴ and 2 × 10⁵ cells/mL with no distinct correlation to the organic loading rate or the amounts of CH₄, O₂ and NH₄-N. Methanotrophic activity was demonstrated even at low O₂ levels (1%) by stable carbon isotope labelling of CH₄ in batch experiments inoculated with reactor sludge. Supplementation of ¹³C-labelled CH₄ in the headspace of the reaction vials unambiguously confirmed the formation of ¹³C-labelled CO₂. Thus, industrial biogas reactors can be considered a further methanotrophic habitat that exhibits a unique methanotrophic community specifically adapted to high CH₄ and low O₂ concentrations. To the best of our knowledge, our study is the first accurate detection and quantification of methanotrophic bacteria in industrial biogas reactors.
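    The qPCR-based cell estimate above follows the usual standard-curve arithmetic. The sketch below is illustrative only: the slope, intercept, volumes, and the assumption of roughly two pmoA copies per methanotroph genome are hypothetical values, not numbers from the study.

    ```python
    def copies_from_cq(cq, slope=-3.32, intercept=38.0):
        """Gene copies per reaction from a qPCR Cq value, via the
        standard-curve relation Cq = slope * log10(copies) + intercept.
        A slope of -3.32 corresponds to 100 % amplification efficiency."""
        return 10 ** ((cq - intercept) / slope)

    def cells_per_ml(copies_per_rxn, template_ul, extract_ul, sample_ml,
                     pmoa_per_cell=2.0):
        """Scale copies per reaction up to cells per mL of sludge, assuming
        a (hypothetical) average of two pmoA copies per genome."""
        copies_per_ml = copies_per_rxn * (extract_ul / template_ul) / sample_ml
        return copies_per_ml / pmoa_per_cell

    # e.g. a Cq of 28 with 2 uL template from a 100 uL extract of 1 mL sludge
    est = cells_per_ml(copies_from_cq(28.0), 2.0, 100.0, 1.0)
    ```

    Any real calibration would replace the slope and intercept with values fitted to a dilution series of a pmoA standard.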

  3. Identity of active methanotrophs in landfill cover soil as revealed by DNA-stable isotope probing.

    Science.gov (United States)

    Cébron, Aurélie; Bodrossy, Levente; Chen, Yin; Singer, Andrew C; Thompson, Ian P; Prosser, James I; Murrell, J Colin

    2007-10-01

    A considerable amount of the methane produced during decomposition of landfill waste can be oxidized in landfill cover soil by methane-oxidizing bacteria (methanotrophs), thus reducing greenhouse gas emissions to the atmosphere. The identity of active methanotrophs in Roscommon landfill cover soil, a slightly acidic peat soil, was assessed by DNA-stable isotope probing (SIP). Landfill cover soil slurries were incubated with ¹³C-labelled methane in either nutrient-rich nitrate mineral salts medium or water. The identity of active methanotrophs was revealed by analysis of ¹³C-labelled DNA fractions. The diversity of functional genes (pmoA and mmoX) and 16S rRNA genes was analyzed using clone libraries, microarrays and denaturing gradient gel electrophoresis. 16S rRNA gene analysis revealed that the cover soil was dominated mainly by Type II methanotrophs closely related to the genera Methylocella and Methylocapsa and to Methylocystis species. These results were supported by analysis of mmoX genes in ¹³C-DNA. Analysis of pmoA gene diversity indicated that a significant proportion of active bacteria were also closely related to the Type I methanotrophs Methylobacter and Methylomonas. Environmental conditions in the slightly acidic peat soil of the Roscommon landfill cover allow establishment of both Type I and Type II methanotrophs.

  4. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high-performance, extremely versatile transient analyzer are described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high-voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample-rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of its hardware, firmware, control language and operation, and general remarks on future trends in this type of instrumentation, both at PPPL and in general.

  5. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  6. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, and one must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  7. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...

  8. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  9. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  10. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze in a discrete-time context and with a finite outcome space American options starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  11. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguishable with conventional techniques because their absorption coefficients are too close. This problem becomes more pronounced when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images.
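    For reference, the classic two-image DEI reconstruction (Chapman et al., 1997) recovers the 'apparent absorption' and 'apparent refraction' images by solving two linear equations per pixel, using the analyzer rocking curve's value and slope at the two working points. The sketch below shows that standard algorithm, not the statistical-moments approach proposed in the abstract.

    ```python
    def dei_separate(i_lo, i_hi, r_lo, r_hi, dr_lo, dr_hi):
        """Split one pixel's low- and high-angle intensities (i_lo, i_hi)
        into apparent absorption and refraction angle.

        r_lo, r_hi   -- rocking-curve reflectivity at the two analyzer settings
        dr_lo, dr_hi -- rocking-curve slopes (dR/dtheta) at those settings
        """
        denom = i_lo * dr_hi - i_hi * dr_lo
        absorption = denom / (r_lo * dr_hi - r_hi * dr_lo)
        refraction = (i_hi * r_lo - i_lo * r_hi) / denom
        return absorption, refraction
    ```

    In practice this is applied pixel-by-pixel to the two registered images taken on opposite flanks of the rocking curve.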

  12. Charged particle analyzer PLAZMAG

    International Nuclear Information System (INIS)

    Apathy, Istvan; Endroeczy, Gabor; Szemerey, Istvan; Szendroe, Sandor

    1985-01-01

    The scientific task of the charged particle analyzer PLAZMAG, a part of the VEGA space probe, and the physical background of the measurements are described. The sensors of the device face the Sun and comet Halley, measuring the energy and mass spectra of ion and electron components at energies below 25 keV. The tasks of the individual electronic parts, the design aspects and the modes of operation in different phases of the flight are dealt with. (author)

  13. Fractional channel multichannel analyzer

    Science.gov (United States)

    Brackenbush, L.W.; Anderson, G.A.

    1994-08-23

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board. 9 figs.

  14. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  15. Ring Image Analyzer

    Science.gov (United States)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
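    The ellipse parameters the program reports (centroid, axis ratio, tilt) can be estimated from the second moments of the detected fringe points. The moment-based sketch below illustrates the idea; it is not the Ring Image Analyzer's actual (unpublished here) recognition algorithm.

    ```python
    import math

    def ellipse_params(points):
        """Estimate centroid, short-to-long-axis ratio, and tilt angle of an
        elliptical point cloud from its first and second moments."""
        n = len(points)
        cx = sum(p[0] for p in points) / n
        cy = sum(p[1] for p in points) / n
        sxx = sum((p[0] - cx) ** 2 for p in points) / n
        syy = sum((p[1] - cy) ** 2 for p in points) / n
        sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
        # principal-axis direction of the 2x2 covariance matrix
        tilt = 0.5 * math.atan2(2 * sxy, sxx - syy)
        # eigenvalues of the covariance matrix (closed form for 2x2)
        tr = sxx + syy
        disc = math.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2)
        lam_long, lam_short = (tr + disc) / 2, (tr - disc) / 2
        ratio = math.sqrt(lam_short / lam_long)
        return (cx, cy), ratio, tilt
    ```

    For points sampled uniformly in the ellipse's parametric angle, the eigenvalues of the covariance matrix are a²/2 and b²/2, so the square root of their ratio recovers b/a exactly; real fringe points are noisier, which is where a robust fit would take over.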

  16. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-O.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  17. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath™ device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath™ is a trademark of World Precision Instruments, Inc.

  18. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms (SNPs) is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost-prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software had been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results: We developed the software PDA for the analysis of pooled-DNA data. PDA was originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA offers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated with four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
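    PDA's two core estimates follow the standard pooled-DNA recipe: calibrate a coefficient of preferential amplification k on known heterozygotes, where the true allele ratio is 1:1, then use k to correct the pool's raw signal ratio. The sketch below shows that generic estimator; PDA's exact formulas may differ.

    ```python
    def preferential_amplification(het_pairs):
        """Coefficient of preferential amplification k, estimated as the mean
        allele-A/allele-B signal ratio over known heterozygous individuals
        (whose true allele ratio is 1:1)."""
        return sum(ha / hb for ha, hb in het_pairs) / len(het_pairs)

    def pooled_allele_freq(h_a, h_b, k):
        """Corrected frequency of allele A in a DNA pool: allele A's signal
        is deflated by k before taking the proportion."""
        return h_a / (h_a + k * h_b)
    ```

    Without the correction (k = 1), an allele that amplifies 25% more efficiently would look 25% more common in the pool than it really is.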

  19. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  20. PULSE HEIGHT ANALYZER

    Science.gov (United States)

    Johnstone, C.W.

    1958-01-21

    An anticoincidence device is described for a pair of adjacent channels of a multi-channel pulse height analyzer for preventing the lower channel from generating a count pulse in response to an input pulse when the input pulse has sufficient magnitude to reach the upper level channel. The anticoincidence circuit comprises a window amplifier, upper and lower level discriminators, and a biased-off amplifier. The output of the window amplifier is coupled to the inputs of the discriminators, the output of the upper level discriminator is connected to the resistance end of a series R-C network, the output of the lower level discriminator is coupled to the capacitance end of the R-C network, and the grid of the biased-off amplifier is coupled to the junction of the R-C network. In operation each discriminator produces a negative pulse output when the input pulse traverses its voltage setting. As a result of the connections to the R-C network, a trigger pulse will be sent to the biased-off amplifier when the incoming pulse level is sufficient to trigger only the lower level discriminator.
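    In software terms, the anticoincidence logic reduces to counting pulses that trigger the lower-level discriminator but not the upper one; a minimal model of one channel:

    ```python
    def window_counts(pulse_heights, lower, upper):
        """Count pulses whose height reaches the lower discriminator setting
        but stays below the upper one -- i.e., pulses assigned to this
        channel; taller pulses are vetoed by the anticoincidence."""
        return sum(1 for h in pulse_heights if lower <= h < upper)

    # pulses at 1.1 and 1.8 fall inside a 1.0-2.0 channel; 2.7 and 3.2
    # trigger the upper discriminator and are vetoed
    n = window_counts([0.4, 1.1, 2.7, 1.8, 3.2], 1.0, 2.0)
    ```

    The hardware achieves the same veto with the R-C network and biased-off amplifier described above, without ever digitizing the pulse height.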

  1. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
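    At its core, a telecom link analysis of the kind MMTAT performs is a link budget. The sketch below is a generic formulation, not MMTAT's actual model or API: it computes free-space path loss and the resulting Eb/N0 margin from EIRP, receiver G/T, and data rate.

    ```python
    import math

    def free_space_loss_db(dist_m, freq_hz):
        """Free-space path loss: 20*log10(4*pi*d*f/c)."""
        c = 299792458.0
        return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

    def link_margin_db(eirp_dbw, path_loss_db, gt_dbk, rate_bps,
                       required_ebn0_db, misc_losses_db=0.0):
        """Eb/N0 margin of a link. Boltzmann's constant enters as
        -228.6 dBW/(K*Hz), hence the +228.6 on the gain side."""
        ebn0 = (eirp_dbw - path_loss_db - misc_losses_db + gt_dbk
                + 228.6 - 10 * math.log10(rate_bps))
        return ebn0 - required_ebn0_db
    ```

    Sweeping any one input (e.g., distance or data rate) while holding the others fixed reproduces the kind of parameter-variation plots the tool generates.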

  2. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to analyze situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
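    The content-correlation idea can be modeled simply: compare the received bit stream against the transmitted reference at several trial offsets and pick the offset with the highest agreement; a nonzero winner indicates a slip. The toy sketch below works on hard decisions within the abstract's ±4-bit window; the SDA's power and modified-Massey correlators operate on multi-bit soft decisions and are considerably more sophisticated.

    ```python
    def best_alignment(tx_bits, rx_bits, max_slip=4):
        """Return (slip, agreement) for the bit offset, within +/-max_slip,
        at which the received stream best matches the transmitted reference.
        A nonzero slip means the receiver dropped or inserted bits."""
        best = (None, -1.0)
        n = len(rx_bits) - max_slip
        for slip in range(-max_slip, max_slip + 1):
            agree = sum(1 for i in range(max_slip, n)
                        if rx_bits[i] == tx_bits[i + slip])
            score = agree / (n - max_slip)
            if score > best[1]:
                best = (slip, score)
        return best
    ```

    Run continuously over a sliding window, the winning offset tracks slips over time, which is roughly what the instrument's closed-loop monitoring does in hardware.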

  3. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, several CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer which contains the 32 K analyzer memory and eight AD converters.

  4. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  5. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The performance of the developed system had a severe problem: the resulting spectrum lacked smoothness and was very noisy, full of spikes and surges, making it impossible to use for analyzing the cement substance. This paper describes the work carried out to improve the system performance.

  6. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  7. Comparison of fiber length analyzers

    Science.gov (United States)

    Don Guay; Nancy Ross Sutherland; Walter Rantanen; Nicole Malandri; Aimee Stephens; Kathleen Mattingly; Matt Schneider

    2005-01-01

    In recent years, several new fiber length analyzers have been developed and brought to market. The new instruments provide faster measurements and the capability of both laboratory and on-line analysis. Do the various fiber analyzers provide the same length, coarseness, width, and fines measurements for a given fiber sample? This paper provides a comparison of...

  8. Nuclear fuel microsphere gamma analyzer

    International Nuclear Information System (INIS)

    Valentine, K.H.; Long, E.L. Jr.; Willey, M.G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample in one station; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties. 4 claims, 3 figures

  9. Market study: Whole blood analyzer

    Science.gov (United States)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  10. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  11. Compact analyzer: an interactive simulator

    International Nuclear Information System (INIS)

    Ipakchi, A.; Khadem, M.; Colley, R.W.

    1985-01-01

    Compact Analyzer is a computer system that combines dynamic simulation models with interactive and color graphics user interface functions to provide a cost-effective simulator for dynamic analysis and evaluation of power plant operation, with engineering and training applications. Most dynamic simulation packages such as RETRAN and TRAC are designed for a batch-mode operation. With advancements in computer technology and man/machine interface capabilities, it is possible to integrate such codes with interactive and graphic functions into advanced simulators. The US Nuclear Regulatory Commission has sponsored development of plant analyzers with such characteristics. The Compact Analyzer is an Electric Power Research Institute (EPRI)-sponsored project, which currently utilizes the EPRI modular modeling system (MMS) for process simulation, and uses an adaptable color graphic package for dynamic display of the simulation results

  12. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  13. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many applications and is thus a very significant and useful tool, yet it can be dangerous to living beings exposed to uncontrolled doses. Because it cannot be perceived by any human sense, radiation detectors and additional devices are required to detect, quantify and classify it. A multichannel analyzer separates the different pulse heights generated in the detectors into a certain number of channels, determined by the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA; for internal control of the multichannel analyzer, an application was written in C for the ARM processor. In the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained were displayed graphically in a histogram showing the measured spectrum. The multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable with those from commercial multichannel analyzers. (Author)
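
    The channel-sorting step described above is easy to sketch in software. The Python fragment below is illustrative only: the ADC bit depth, full-scale voltage, and pulse values are invented, not taken from the design in this record.

```python
# Minimal sketch of the pulse-height binning a multichannel analyzer performs.
# ADC_BITS and the full-scale voltage are assumed values for illustration.
ADC_BITS = 10                    # resolution of the analog-to-digital converter
N_CHANNELS = 2 ** ADC_BITS       # channel count follows the ADC bit depth

def build_spectrum(pulse_heights, full_scale=5.0):
    """Sort digitized pulse heights (volts) into channels; return a histogram."""
    spectrum = [0] * N_CHANNELS
    for v in pulse_heights:
        # Map 0..full_scale volts onto channels 0..N_CHANNELS-1, clamping overflow
        ch = min(int(v / full_scale * N_CHANNELS), N_CHANNELS - 1)
        spectrum[ch] += 1
    return spectrum

spectrum = build_spectrum([0.0, 2.5, 2.5, 4.99, 5.2])
```

    With a 10-bit ADC the spectrum has 1024 channels; a real MCA performs this binning in hardware on every digitized pulse.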

  14. Analyzing Generation Y Workforce Motivation

    Science.gov (United States)

    2011-03-01

    Analyzing Generation Y Workforce Motivation, by Ian N. Barford and Patrick T. Hester. Defense AT&L: Special Edition, March-April 2011. The report examines the workforce motivation of Generation Y (born between 1980 and 2000), as distinguished from earlier cohorts such as Generation X (born between 1965 and 1979).

  15. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., "sensors" and "cameras") upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a "search" mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen "intruder detection" probability and elapsed-time criteria (i.e., the program finds the "weakest paths" from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. The use of Prolog and entity-relationship modeling also allows the capabilities of the Security Analyzer program, both for knowledge-base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult to duplicate in a numeric, more algorithmically deterministic language such as Fortran. 4 refs
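
    The "search" function in item (3) can be illustrated independently of Prolog. The Python sketch below uses an invented facility graph, detection probabilities, and traversal times; it enumerates routes and keeps those satisfying user-chosen detection-probability and elapsed-time limits, mirroring the weakest-path idea.

```python
# Illustrative sketch of the "weakest path" search (not the Prolog program).
# Graph, detection probabilities, and traversal times below are invented.
FACILITY = {
    "gate":  [("yard", 0.3, 60)],                    # (next, P(detect), seconds)
    "yard":  [("door", 0.5, 30), ("fence", 0.2, 90)],
    "door":  [("vault", 0.9, 20)],
    "fence": [("vault", 0.6, 120)],
    "vault": [],
}

def weak_paths(start, goal, max_p_detect, max_time):
    """Return (path, P(detect), time) tuples that meet both user criteria."""
    results = []
    def walk(node, path, p_miss, t):
        if node == goal:
            results.append((path, 1.0 - p_miss, t))
            return
        for nxt, p_det, dt in FACILITY[node]:
            if nxt not in path and t + dt <= max_time:
                # Cumulative non-detection probability multiplies along the route
                walk(nxt, path + [nxt], p_miss * (1.0 - p_det), t + dt)
    walk(start, [start], 1.0, 0)
    return [r for r in results if r[1] <= max_p_detect]

paths = weak_paths("gate", "vault", max_p_detect=0.8, max_time=300)
```

    Here the fence route is the "weakest path": it reaches the vault with a cumulative detection probability under the chosen threshold.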

  16. Historical Thinking: Analyzing Student and Teacher Ability to Analyze Sources

    OpenAIRE

    Cowgill II, Daniel Armond; Waring, Scott M.

    2017-01-01

    The purpose of this study was to partially replicate the Historical Problem Solving: A Study of the Cognitive Process Using Historical Evidence study conducted by Sam Wineburg in 1991. The Historical Problem Solving study conducted by Wineburg (1991) sought to compare the ability of historians and top level students, as they analyzed pictures and written documents centered on the Battle of Lexington Green. In this version of the study, rather than compare historians and students, we sought ...

  17. Pollution Analyzing and Monitoring Instruments.

    Science.gov (United States)

    1972

    Compiled in this book is basic, technical information useful in a systems approach to pollution control. Descriptions and specifications are given of what is available in ready made, on-the-line commercial equipment for sampling, monitoring, measuring and continuously analyzing the multitudinous types of pollutants found in the air, water, soil,…

  18. Methods of analyzing crude oil

    Energy Technology Data Exchange (ETDEWEB)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  19. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  20. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004.Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  1. Identification of active methanotrophs in a landfill cover soil through detection of expression of 16S rRNA and functional genes.

    Science.gov (United States)

    Chen, Yin; Dumont, Marc G; Cébron, Aurélie; Murrell, J Colin

    2007-11-01

    Active methanotrophs in a landfill soil were revealed by detecting the 16S rRNA of methanotrophs and the mRNA transcripts of key genes involved in methane oxidation. New 16S rRNA primers targeting type I and type II methanotrophs were designed and optimized for analysis by denaturing gradient gel electrophoresis. Direct extraction of RNA from soil enabled the analysis of the expression of the functional genes mmoX, pmoA and mxaF, which encode subunits of soluble methane monooxygenase, particulate methane monooxygenase and methanol dehydrogenase, respectively. The 16S rRNA polymerase chain reaction (PCR) primers for type I methanotrophs detected Methylomonas, Methylosarcina and Methylobacter sequences from both soil DNA and cDNA generated from RNA extracted directly from the landfill cover soil. The 16S rRNA primers for type II methanotrophs detected primarily Methylocella and some Methylocystis 16S rRNA genes. Phylogenetic analysis of mRNA recovered from the soil indicated that Methylobacter, Methylosarcina, Methylomonas, Methylocystis and Methylocella were actively expressing genes involved in methane and methanol oxidation. Transcripts of pmoA, but not mmoX, were readily detected by reverse transcription polymerase chain reaction (RT-PCR), indicating that particulate methane monooxygenase may be largely responsible for methane oxidation in situ.

  2. Diversity of methanotroph communities in a basalt aquifer.

    Science.gov (United States)

    Newby, D T; Reed, D W; Petzke, L M; Igoe, A L; Delwiche, M E; Roberto, F F; McKinley, J P; Whiticar, M J; Colwell, F S

    2004-06-01

    Methanotrophic bacteria play an important role in global cycling of carbon and co-metabolism of contaminants. Methanotrophs from pristine regions of the Snake River Plain Aquifer (SRPA; Idaho, USA) were studied in order to gain insight into the native groundwater communities' genetic potential to carry out TCE co-metabolism. Wells were selected that were proximal to a TCE plume believed to be undergoing natural attenuation. Methane concentrations ranged from 1 to >1000 nM. Carbon isotope ratios and diversity data together suggest that the SRPA contains active communities of methanotrophs that oxidize microbially produced methane. Microorganisms removed from groundwater by filtration were used as inocula for enrichments or frozen immediately and DNA was subsequently extracted for molecular characterization. Primers that specifically target methanotroph 16S rRNA genes or genes that code for subunits of soluble or particulate methane monooxygenase, mmoX and pmoA, respectively, were used to characterize the indigenous methanotrophs via PCR, cloning, RFLP analysis, and sequencing. Type I methanotroph clones aligned with Methylomonas, Methylocaldum, and Methylobacter sequences and a distinct 16S rRNA phylogenetic lineage grouped near Methylobacter. The majority of clone sequences in type II methanotroph 16S rRNA, pmoA, and mmoX gene libraries grouped closely with sequences in the Methylocystis genus. A subset of the type II methanotroph clones from the aquifer had sequences that aligned most closely to Methylosinus trichosporium OB3b and Methylocystis spp., known TCE-co-metabolizing methanotrophs.

  3. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for the diagnostics of gamma cameras. It is composed of an electronic system, including hardware and software capabilities, and operates from the acquisition of the 4 head position signals of a gamma camera detector. The result is the spectrum of the energy delivered by the nuclear radiation coming from the camera detector head. The system includes analog processing of the position signals from the camera, digitization, subsequent processing of the energy signal in a multichannel analyzer, transmission of the data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits are composed of an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)

  4. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operation Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  5. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  6. A novel methanotroph in the genus Methylomonas that contains a distinct clade of soluble methane monooxygenase.

    Science.gov (United States)

    Nguyen, Ngoc-Loi; Yu, Woon-Jong; Yang, Hye-Young; Kim, Jong-Geol; Jung, Man-Young; Park, Soo-Je; Roh, Seong-Woon; Rhee, Sung-Keun

    2017-10-01

    Aerobic methane oxidation is a key process in the global carbon cycle that acts as a major sink of methane. In this study, we describe a novel methanotroph designated EMGL16-1 that was isolated from a freshwater lake using the floating filter culture technique. Based on a phylogenetic analysis of 16S rRNA gene sequences, the isolate was found to be closely related to the genus Methylomonas in the family Methylococcaceae of the class Gammaproteobacteria, with 94.2-97.4% 16S rRNA gene similarity to Methylomonas type strains. Comparison of chemotaxonomic and physiological properties further suggested that strain EMGL16-1 was taxonomically distinct from other species in the genus Methylomonas. The isolate was versatile in utilizing nitrogen sources such as molecular nitrogen, nitrate, nitrite, urea, and ammonium. Genes coding for subunits of the particulate methane monooxygenase (pmoA), soluble methane monooxygenase (mmoX), and methanol dehydrogenase (mxaF) were detected in strain EMGL16-1. Phylogenetic analysis of mmoX indicated that the mmoX of strain EMGL16-1 is distinct from those of other strains in the genus Methylomonas. This isolate probably represents a novel species in the genus. Our study provides new insights into the diversity of species in the genus Methylomonas and their environmental adaptations.

  7. Portable Tandem Mass Spectrometer Analyzer

    Science.gov (United States)

    1991-07-01

    Instrument tune-table listings in the original are garbled and omitted here. Samples analyzed included 500 ng of caffeine in 1 µL of chloroform by GC/MS/MS using negative ions, as well as barbiturates, extracted from urine, in the 3-5 µg range.

  8. Remote Laser Diffraction PSD Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  9. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  10. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength: SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.
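
    Because the emitted light is proportional to the SO2 concentration, converting a raw fluorescence intensity to a concentration reduces to a linear calibration. A minimal sketch follows, with invented calibration points (not values from the Model 43i-TLE):

```python
# Least-squares linear calibration of fluorescence intensity vs. SO2 amount.
# The (intensity, ppb) pairs and the raw reading are assumed for illustration.
def calibrate(points):
    """Return (slope, intercept) fitting ppb = slope * intensity + intercept."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

slope, intercept = calibrate([(0.0, 0.0), (10.0, 5.0), (20.0, 10.0)])
so2_ppb = slope * 14.0 + intercept   # concentration for a raw intensity of 14.0
```

    In practice the instrument firmware performs this conversion internally; the sketch only shows why the proportionality makes the measurement possible.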

  12. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists - and probably the most crucial one - is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
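
    The authors demonstrate the procedure in R; the Python sketch below mirrors the idea on a toy dataset: split the data, analyze each split, then pool the per-split estimates with inverse-variance (fixed-effect) weights.

```python
import random
import statistics

# Toy "big" dataset: 100,000 draws from a normal distribution (assumed values).
random.seed(0)
data = [random.gauss(100.0, 15.0) for _ in range(100_000)]

def split_analyze_meta(data, n_splits):
    """Split, estimate a mean per split, pool with inverse-variance weights."""
    size = len(data) // n_splits
    estimates = []
    for i in range(n_splits):
        chunk = data[i * size:(i + 1) * size]
        m = statistics.fmean(chunk)
        var = statistics.variance(chunk) / len(chunk)  # variance of the mean
        estimates.append((m, var))
    w = [1.0 / v for _, v in estimates]                # inverse-variance weights
    return sum(wi * m for wi, (m, _) in zip(w, estimates)) / sum(w)

pooled_mean = split_analyze_meta(data, n_splits=10)
```

    Each split is small enough to analyze with familiar tools, and the fixed-effect pooling recovers essentially the same estimate as analyzing the full dataset at once.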

  14. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  15. Charged particle mobility refrigerant analyzer

    Science.gov (United States)

    Allman, S.L.; Chunghsuan Chen; Chen, F.C.

    1993-02-02

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.
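
    The mobility measured in the final step follows from the drift geometry: for a drift length L traversed in time t under a uniform field E, the mobility is mu = L / (t * E). A minimal sketch with invented numbers (not taken from the patent):

```python
# Mobility of a charged-particle swarm from drift length, arrival time, field.
# All numeric values below are assumed for illustration.
def mobility(drift_length_cm, drift_time_s, field_v_per_cm):
    """Ion mobility in cm^2 V^-1 s^-1: mu = L / (t * E)."""
    return drift_length_cm / (drift_time_s * field_v_per_cm)

# e.g. a 10 cm drift region, 5 ms arrival time, 200 V/cm field:
mu = mobility(10.0, 5.0e-3, 200.0)
```

    Different electronegative species attach electrons and form ions of different mobility, which is what lets the method distinguish refrigerant gases.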

  16. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The 'COMBUSTIMETRO' technology examines fuel through the performance of an engine: the role of the fuel is to produce energy for the combustion engine, in an amount directly proportional to the quality and type of the fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same intake of air and fuel and a fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  17. Historical Thinking: Analyzing Student and Teacher Ability to Analyze Sources

    Directory of Open Access Journals (Sweden)

    Daniel Armond Cowgill II

    2017-05-01

    The purpose of this study was to partially replicate the Historical Problem Solving: A Study of the Cognitive Process Using Historical Evidence study conducted by Sam Wineburg in 1991. The Historical Problem Solving study (Wineburg, 1991) sought to compare the ability of historians and top-level students as they analyzed pictures and written documents centered on the Battle of Lexington Green. In this version of the study, rather than compare historians and students, we set out to compare the analytical skills of teachers and students. The main finding is that the participants lacked the ability to engage in the very complex activities associated with historical inquiry and the utilization of primary sources in learning about the past. This gap should inform teacher professional development programs, helping teachers develop the skills needed not only to engage in historical evaluation themselves but also to instruct students to do the same.

  18. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution without a need for cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  19. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of each battery with respect to an adjacent battery is unrestricted, allowing a reduction in component parts of the assembly and reducing the overall stack length. Additionally, a test jig or chamber for allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used for rapidly converting these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation, and it is not necessary to make rapid measurements as is now done

  20. Thomson parabola ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cobble, James A [Los Alamos National Laboratory; Flippo, Kirk A [Los Alamos National Laboratory; Letzring, Samuel A [Los Alamos National Laboratory; Lopez, Frank E [Los Alamos National Laboratory; Offermann, Dustin T [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Mastrosimone, Dino [UNIV OF ROCHESTER

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a 5- or 8-kG magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C⁶⁺ and C⁵⁺ may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on the pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004 inch or 0.010 inch. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.
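The dispersion principle behind a Thomson parabola can be sketched with a non-relativistic, small-angle toy model (the instrument's actual design code is fully relativistic, and the geometry values below are invented, not the OMEGA-EP dimensions): the magnetic deflection scales as 1/v and the electric deflection as 1/v², so ions of one charge-to-mass ratio fall on a single parabola regardless of energy.

```python
# Toy, non-relativistic Thomson-parabola model.  Geometry (B, E, field
# lengths, drift) are illustrative assumptions, not the real instrument.
import math

QE = 1.602176634e-19     # elementary charge [C]
AMU = 1.66053906660e-27  # atomic mass unit [kg]

def deflections(energy_MeV, mass_amu, charge_state,
                B=0.5, lB=0.1, E=3.0e6, lE=0.1, drift=0.5):
    """Small-angle magnetic (y) and electric (x) deflections [m]."""
    m = mass_amu * AMU
    q = charge_state * QE
    v = math.sqrt(2.0 * energy_MeV * 1e6 * QE / m)  # non-relativistic
    y = q * B * lB * drift / (m * v)       # magnetic deflection ~ 1/v
    x = q * E * lE * drift / (m * v ** 2)  # electric deflection ~ 1/v^2
    return x, y

# Same species at two energies: faster ions land closer to the origin,
# but x / y**2 is energy-independent, so both lie on one parabola.
x1, y1 = deflections(10.0, 12.0, 6)   # 10-MeV C6+
x2, y2 = deflections(40.0, 12.0, 6)   # 40-MeV C6+
k1, k2 = x1 / y1**2, x2 / y2**2
```

The parabola constant `x / y**2` scales with m/q, which is what lets the analyzer separate C⁶⁺ from C⁵⁺ traces on the detector.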

  1. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. It is a microcomputer-based reactor transient simulation, visualization, and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics, allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented in the UNIX/X-windows operating environment, is modular and is designed to interface with the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by each code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  2. [Effects of copper on biodegradation mechanism of trichloroethylene by mixed microorganisms].

    Science.gov (United States)

    Gao, Yanhui; Zhao, Tiantao; Xing, Zhilin; He, Zhi; Zhang, Lijie; Peng, Xuya

    2016-05-25

    We isolated and enriched a mixed microbial culture, SWA1, from landfill cover soils supplemented with trichloroethylene (TCE). The microbial mixture could degrade TCE effectively under aerobic conditions. We then investigated the effect of copper ion (0 to 15 μmol/L) on TCE biodegradation. Results show that the maximum TCE degradation rate was 29.60 nmol/min, with 95.75% degradation, at a copper ion concentration of 0.03 μmol/L. In addition, genes encoding key enzymes of biodegradation were analyzed by real-time quantitative reverse transcription PCR (RT-qPCR). The relative expression abundance of the pmoA gene (4.22E-03) and the mmoX gene (9.30E-06) was highest at a copper ion concentration of 0.03 μmol/L. Finally, we used MiSeq pyrosequencing to investigate the diversity of the microbial community. Methylocystaceae, which can co-metabolically degrade TCE, were the dominant microorganisms; other microorganisms capable of direct oxidation of TCE were also present in SWA1, and the microbial diversity decreased significantly with increasing copper ion concentration. Based on these results, variation of the copper ion concentration affected both the composition of SWA1 and the degradation mechanism of TCE. At 0.03 μmol/L copper, the mechanism comprised co-metabolic degradation by methanotrophs together with direct oxidative metabolism. At 5 μmol/L copper (84.75% biodegradation), the mechanism comprised direct degradation and co-metabolic degradation by methanotrophs and by microorganisms containing phenol hydroxylase. Biodegradation of TCE by microorganisms is therefore a complicated process whose mechanism includes both co-metabolic degradation by methanotrophs and bio-oxidation by non-methanotrophs.
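The "relative expression abundance" reported for pmoA and mmoX is the kind of quantity commonly derived from RT-qPCR cycle-threshold (Ct) values with the 2^(−ΔΔCt) method. A hedged sketch of that standard calculation follows; the Ct numbers are invented placeholders, not data from this study, and the study may have used a different normalization.

```python
# Standard 2^(-ΔΔCt) relative-expression calculation for RT-qPCR data.
# All Ct values below are hypothetical illustration values.

def relative_expression(ct_target, ct_reference,
                        ct_target_cal, ct_reference_cal):
    """Fold change of a target gene vs. a calibrator condition,
    normalized to a reference gene in both conditions."""
    delta_ct = ct_target - ct_reference          # treatment condition
    delta_ct_cal = ct_target_cal - ct_reference_cal  # calibrator
    ddct = delta_ct - delta_ct_cal
    return 2.0 ** (-ddct)

# Hypothetical example: pmoA at 0.03 umol/L Cu vs. a no-copper control.
fold = relative_expression(ct_target=22.1, ct_reference=15.0,
                           ct_target_cal=25.1, ct_reference_cal=15.0)
```

A 3-cycle earlier crossing of the target relative to the calibrator corresponds to a 2³ = 8-fold higher expression.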

  3. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  4. Using expert systems to analyze ATE data

    Science.gov (United States)

    Harrington, Jim

    1994-01-01

    The proliferation of automatic test equipment (ATE) is resulting in the generation of large amounts of component data. Some of this component data is not accurate due to the presence of noise. Analyzing this data requires the use of new techniques. This paper describes the process of developing an expert system to analyze ATE data and provides an example rule in the CLIPS language for analyzing trip thresholds for high gain/high speed comparators.

  5. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key Features: * Presents numerous practical examples, including actual spectrum analyzer circuits * Instruction on how to use

  6. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA) has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology). ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  7. Analyzing metabolomics-based challenge test

    NARCIS (Netherlands)

    Vis, D.J.; Westerhuis, J.A.; Jacobs, D.M.; van Duynhoven, J.P.M.; Wopereis, S.; van Ommen, B.; Hendriks, M.M.W.B.; Smilde, A.K.

    2015-01-01

    Challenge tests are used to assess the resilience of human beings to perturbations by analyzing responses to detect functional abnormalities. Well known examples are allergy tests and glucose tolerance tests. Increasingly, metabolomics analysis of blood or serum samples is used to analyze the

  8. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal in nearly 22-ns time intervals, records it in a memory unit, and then slowly reads the information out to a computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. It can separate comparatively short, small-amplitude, rare signals against a background of quasistationary noise processes. 4 refs.; 3 figs

  9. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  10. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  11. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  12. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  13. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches, and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  14. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  15. Guide to analyzing investment options using TWIGS.

    Science.gov (United States)

    Charles R Blinn; Dietmar W. Rose; Monique L. Belli

    1988-01-01

    Describes methods for analyzing economic return of simulated stand management alternatives in TWIGS. Defines and discusses net present value, equivalent annual income, soil expectation value, and real vs. nominal analyses. Discusses risk and sensitivity analysis when comparing alternatives.
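The measures this guide defines are standard forestry-economics formulas; a minimal sketch follows, assuming the textbook definitions of net present value, equivalent annual income (the annuity with the same NPV), and soil expectation value (the Faustmann value of repeating an identical rotation in perpetuity). This is an illustration, not TWIGS's actual code, and the cash flows are invented.

```python
# Standard investment measures for a simulated stand, as named above.

def npv(cash_flows, rate):
    """Net present value; cash_flows[t] is received at end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def equivalent_annual_income(net_present_value, rate, years):
    """Annuity over `years` years with the same present value."""
    annuity_factor = (1.0 - (1.0 + rate) ** -years) / rate
    return net_present_value / annuity_factor

def soil_expectation_value(rotation_npv, rate, years):
    """Value of repeating an identical rotation forever (Faustmann)."""
    growth = (1.0 + rate) ** years
    return rotation_npv * growth / (growth - 1.0)

flows = [-1000.0, 0.0, 0.0, 0.0, 2000.0]  # plant now, harvest in year 4
v = npv(flows, 0.05)
eai = equivalent_annual_income(v, 0.05, 4)
sev = soil_expectation_value(v, 0.05, 4)
```

A real-dollar analysis would use a real (inflation-adjusted) discount rate with real cash flows; a nominal analysis pairs nominal rates with nominal flows, which is the real-vs-nominal distinction the guide discusses.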

  16. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  17. Analyzing Protein Dynamics Using Dimensionality Reduction

    OpenAIRE

    Eryol, Atahan

    2015-01-01

    This thesis investigates dimensionality reduction for analyzing the dynamics of protein simulations, particularly disordered proteins, which do not fold into a fixed shape but are thought to perform their functions through their movements. Rather than analyze the movement of the proteins in 3D space, we use dimensionality reduction to project the molecular structure of the proteins into a target space in which each structure is represented as a point. All that is needed to do this are the pairwise dis...

  18. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectra analyzer is intended for dynamic spectral analysis of signals from physical installations and for noise filtering. The digital dynamic analyzer uses a recurrent Fourier transformation algorithm, realized on the basis of a fast-logic FPGA matrix and a specialized ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The processing speed is 20 ns [ru]
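The record does not detail its recurrence, but one common recurrent Fourier formulation suited to such hardware is the sliding DFT, which updates every bin in O(1) per sample instead of recomputing an N-point transform. A hedged sketch, as an illustration of the idea rather than the FPGA/ADSP implementation:

```python
# Sliding DFT: on each new sample, every bin k is updated by the
# recurrence X_k <- (X_k + x_new - x_old) * e^{j 2 pi k / N}, keeping
# the spectrum of the most recent N samples current at all times.
import cmath

class SlidingDFT:
    def __init__(self, n_bins):
        self.n = n_bins
        self.window = [0.0] * n_bins   # last n samples (index 0 = oldest)
        self.bins = [0j] * n_bins      # current DFT values
        self.twiddle = [cmath.exp(2j * cmath.pi * k / n_bins)
                        for k in range(n_bins)]

    def push(self, sample):
        oldest = self.window.pop(0)
        self.window.append(sample)
        delta = sample - oldest
        self.bins = [(b + delta) * w
                     for b, w in zip(self.bins, self.twiddle)]
        return self.bins
```

After pushing a unit impulse through a 4-bin analyzer, every bin holds the flat spectrum of an impulse; once the impulse leaves the window, the bins decay back to zero, which is the "dynamic" behavior the record describes.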

  19. ISOLATION AND CHARACTERIZATION OF METHANOTROPHIC BACTERIA FROM RICE FIELDS

    Directory of Open Access Journals (Sweden)

    IMAN RUSMANA

    2009-01-01

    Full Text Available Methane is a greenhouse gas capable of depleting the ozone layer. Rice fields are significant sources of atmospheric methane, and the application of chemical fertilizer in rice fields increases methane emission. Methanotrophic bacteria have a unique ability: they can utilize methane as a source of carbon and energy. This research successfully isolated and characterized methanotrophic bacteria from rice fields in Bogor and Sukabumi, West Java, Indonesia. Methane oxidation was determined by gas chromatography, which showed that all isolates performed methane oxidation activity. The highest methane oxidation activity was shown by isolate BGM 9. DNA amplification of the BGM 9 genome yielded a single mmoX band of 500 bp and three pmoA bands of 1000, 750, and 500 bp, respectively

  20. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications, and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps necessary for creating an ‘analyzing instrument’ based on an open-source software package called the Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.

  1. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of each pulse, to reject pile-up, and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that builds the histogram and runs the Human Machine Interface (HMI)
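The peak-estimation and histogramming stages can be sketched in software. The record does not disclose the FPGA algorithm, so the parabolic interpolation around the maximum ADC sample used below is a common stand-in, and the pulses and channel scaling are invented illustration values.

```python
# Minimal MCA back-end sketch: estimate each shaped pulse's peak from
# its ADC samples, then bin the peaks into a spectrum histogram.

def peak_height(samples):
    """Parabolic interpolation around the maximum interior sample."""
    i = max(range(1, len(samples) - 1), key=lambda j: samples[j])
    y0, y1, y2 = samples[i - 1], samples[i], samples[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:
        return float(y1)
    return y1 - 0.125 * (y2 - y0) ** 2 / denom

def histogram(pulses, n_channels, max_height):
    """Bin estimated peak heights into fixed-width channels."""
    counts = [0] * n_channels
    for p in pulses:
        ch = int(peak_height(p) / max_height * n_channels)
        if 0 <= ch < n_channels:
            counts[ch] += 1
    return counts

spectrum = histogram([[0, 3, 9, 3, 0], [0, 2, 6, 2, 0]],
                     n_channels=16, max_height=16.0)
```

In the hardware described above, the peak estimate runs in the FPGA at the ADC sample rate, and only the per-pulse result crosses to the microcontroller, which is how processing dead time is eliminated.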

  2. Advances on CT analyzing urolithiasis constituents

    International Nuclear Information System (INIS)

    Feng Qiang; Ma Zhijun

    2009-01-01

    Urolithiasis is a common, frequently occurring urological disease. The treatment of lithiasis depends not only on the size, location, brittleness, and infection status of calculi, but also on their chemical constituents. Knowing the urolithiasis constituents in advance undoubtedly helps guide treatment. So far, however, no reliable inspection method has been found to accurately analyze urolithiasis constituents in vivo. CT can precisely judge the size and location of calculi and roughly analyze their constituents in vivo; the advent of dual-source CT in particular provides a new method for studying urolithiasis constituents. It may help uncover the cause, prevention, and therapy of calculi. (authors)

  3. Empirical mode decomposition for analyzing acoustical signals

    Science.gov (United States)

    Huang, Norden E. (Inventor)

    2005-01-01

    The present invention discloses a computer-implemented signal analysis method, the Hilbert-Huang Transformation (HHT), for analyzing acoustical signals, which are assumed to be nonlinear and nonstationary. The Empirical Mode Decomposition (EMD) method and the Hilbert Spectral Analysis (HSA) are used to obtain the HHT. Essentially, the acoustical signal is decomposed into Intrinsic Mode Function components (IMFs). Once the invention decomposes the acoustic signal into its constituting components, all operations such as analyzing, identifying, and removing unwanted signals can be performed on these components. Upon transforming the IMFs into a Hilbert spectrum, the acoustical signal may be compared with other acoustical signals.
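The Hilbert Spectral Analysis step can be illustrated: given an intrinsic mode function, its analytic signal (negative frequencies zeroed, positive ones doubled) yields an amplitude envelope and instantaneous phase. The sketch below uses a naive O(N²) DFT for self-containment and is an illustration of the textbook construction, not the patented implementation.

```python
# Analytic signal via the frequency-domain Hilbert construction:
# take the DFT, zero negative-frequency bins, double positive ones,
# and inverse-transform.  |z[m]| is the amplitude envelope.
import cmath, math

def analytic_signal(x):
    n = len(x)
    X = [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n)
             for m in range(n)) for k in range(n)]
    h = [0.0] * n               # spectral weights
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        for k in range(1, n // 2):
            h[k] = 2.0
    else:
        for k in range(1, (n + 1) // 2):
            h[k] = 2.0
    return [sum(h[k] * X[k] * cmath.exp(2j * cmath.pi * k * m / n)
                for k in range(n)) / n for m in range(n)]

n = 64
tone = [math.cos(2 * math.pi * 5 * m / n) for m in range(n)]
z = analytic_signal(tone)
envelope = [abs(v) for v in z]   # ~1.0 for a unit-amplitude tone
```

For a genuine IMF, the derivative of the phase of `z` gives the instantaneous frequency plotted in the Hilbert spectrum.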

  4. Electrical aerosol analyzer: calibration and performance

    Energy Technology Data Exchange (ETDEWEB)

    Pui, D.Y.H.; Liu, B.Y.H.

    1976-01-01

    The Electrical Aerosol Analyzer (EAA) was calibrated by means of monodisperse aerosols generated by two independent techniques. In the 0.02 to 1 μm diameter range, the aerosol was generated by electrostatic classification. In the range between 0.007 and 0.03 μm, the aerosols were generated by the photo-oxidation of SO₂ in a smog chamber. Calibration data are presented showing the performance of the EAA as an aerosol detector and as a size distribution analyzer.

  5. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  6. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine-language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer; in others, new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation
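The kinetic read-out modes named above follow standard definitions: in the initial-rate technique the analyte concentration is proportional to the initial slope of absorbance vs. time, while in the fixed-time technique it is proportional to the absorbance change over a set interval. A hedged sketch with invented readings (not data from the report):

```python
# Kinetic analysis read-outs for a centrifugal fast analyzer cuvette.

def initial_rate(times, absorbances):
    """Least-squares slope of absorbance vs. time (initial-rate mode)."""
    n = len(times)
    mt = sum(times) / n
    ma = sum(absorbances) / n
    num = sum((t - mt) * (a - ma) for t, a in zip(times, absorbances))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

def fixed_time_delta(absorbances, i0, i1):
    """Absorbance change between two fixed readings (fixed-time mode)."""
    return absorbances[i1] - absorbances[i0]

t = [0.0, 1.0, 2.0, 3.0, 4.0]           # seconds after mixing
a = [0.10, 0.22, 0.34, 0.46, 0.58]      # linear early-time response
rate = initial_rate(t, a)
delta = fixed_time_delta(a, 0, 4)
```

The variable-time technique inverts the fixed-time idea: it measures the time required to reach a preset absorbance change, with concentration proportional to 1/t.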

  7. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time-varying pressure drop measurements. A fast-response pressure transducer measures the pressure drop across the overall bed, or across some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.

  8. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. This paper presents a new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its application to the Enron company.

  9. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  10. Images & Issues: How to Analyze Election Rhetoric.

    Science.gov (United States)

    Rank, Hugh

    Although it is impossible to know in advance the credibility of political messages, such persuasive discourse can be analyzed in a non-partisan, common sense way using predictable patterns in content and form. The content of a candidate's message can be summarized as "I am competent and trustworthy; from me, you'll get 'more good' and 'less…

  11. Thermal and Evolved-Gas Analyzer Illustration

    Science.gov (United States)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  12. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs...... has much to offer in analyzing the policy process....

  13. Consideration Regarding Diagnosis Analyze of Corporate Management

    Directory of Open Access Journals (Sweden)

    Mihaela Ciopi OPREA

    2009-01-01

    Full Text Available Diagnosis management aims to identify critical situations and positive aspects of corporate management. An effective diagnosis, made by a team with a status of independence from the organization’s management, provides managers a useful feedback necessary to improve performance. The work presented focuses on the methodology for achieving an effective diagnosis, considering the multitude of criteria and variables to be analyzed.

  14. Analyzing and Interpreting Research in Health Education ...

    African Journals Online (AJOL)

    While qualitative research is used when little or nothing is known about the subject, quantitative research is required when there are quantifiable variables to be measured. By implication, health education research is based on phenomenological, ethnographical and/or grounded theoretical approaches that are analyzable ...

  15. Analyzing Languages for Specific Purposes Discourse

    Science.gov (United States)

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  16. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers

  17. Performance optimization of spectroscopic process analyzers

    NARCIS (Netherlands)

    Boelens, Hans F. M.; Kok, Wim Th; de Noord, Onno E.; Smilde, Age K.

    2004-01-01

    To increase the power and the robustness of spectroscopic process analyzers, methods are needed that suppress the spectral variation that is not related to the property of interest in the process stream. An approach for the selection of a suitable method is presented. The approach uses the net

  18. ITK and ANALYZE: a synergistic integration

    Science.gov (United States)

    Augustine, Kurt E.; Holmes, David R., III; Robb, Richard A.

    2004-05-01

    The Insight Toolkit (ITK) is a C++ open-source software toolkit developed under sponsorship of the National Library of Medicine. It provides advanced algorithms for performing image registration and segmentation, but does not provide support for visualization and analysis, nor does it offer any graphical user interface (GUI). The purpose of this integration project is to make ITK readily accessible to end-users with little or no programming skills, and provide interactive processing, visualization and measurement capabilities. This is achieved through the integration of ITK with ANALYZE, a multi-dimension image visualization/analysis application installed in over 300 institutions around the world, with a user-base in excess of 4000. This integration is carried out at both the software foundation and GUI levels. The foundation technology upon which ANALYZE is built is a comprehensive C-function library called AVW. A new set of AVW-ITK functions have been developed and integrated into the AVW library, and four new ITK modules have been added to the ANALYZE interface. Since ITK is a software developer's toolkit, the only way to access its intrinsic power is to write programs that incorporate it. Integrating ITK with ANALYZE opens the ITK algorithms to end-users who otherwise might never be able to take advantage of the toolkit's advanced functionality. In addition, this integration provides end-to-end interactive problem solving capabilities which allow all users, including programmers, an integrated system to readily display and quantitatively evaluate the results from the segmentation and registration routines in ITK, regardless of the type or format of input images, which are comprehensively supported in ANALYZE.

  19. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating risk at nuclear facilities was developed. • Five measures reflecting all factors relevant to assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included in the analysis. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risks. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether a state has properly established a legal and regulatory framework based on international standards; these questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities; it comprises radiological, economic, and social damage. Social and
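The relation implied above between P(E), P(I), and P(N) can be made explicit: in standard physical-protection vulnerability assessment, system effectiveness is the probability that the adversary is both interrupted and then neutralized. A minimal sketch, with invented illustration values:

```python
# Physical protection system effectiveness: P(E) = P(I) * P(N),
# assuming interruption and neutralization are treated as a chain.

def system_effectiveness(p_interrupt, p_neutralize):
    """P(E) = P(I) * P(N)."""
    return p_interrupt * p_neutralize

pe = system_effectiveness(0.9, 0.8)   # illustrative values only
```

Codes such as the TESS tool mentioned above supply the P(I) and P(N) inputs from scenario analysis; the multiplication itself is the simple part.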

  20. Modeling extreme ultraviolet suppression of electrostatic analyzers

    International Nuclear Information System (INIS)

    Gershman, Daniel J.; Zurbuchen, Thomas H.

    2010-01-01

    In addition to analyzing energy-per-charge ratios of incident ions, electrostatic analyzers (ESAs) for spaceborne time-of-flight mass spectrometers must also protect detectors from extreme ultraviolet (EUV) photons from the Sun. The required suppression rate often exceeds 1:10⁷ and is generally established in tests upon instrument design and integration. This paper describes a novel technique to model the EUV suppression of ESAs using photon ray tracing integrated into SIMION, the most commonly used ion-optics design software for such instruments. The paper compares simulation results with measurements taken from the ESA of the Mass instrument flying onboard the Wind spacecraft. This novel technique enables active inclusion of EUV suppression requirements in the ESA design process. Furthermore, the simulation results also motivate design rules for such instruments.
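The ray-tracing idea can be caricatured with a toy Monte Carlo: trace many photons through the analyzer and count the rare ones reaching the detector. In the sketch below a photon must survive a fixed number of scattering events, each with a given reflection probability; this is a drastic simplification (the paper's method traces real geometry inside SIMION), and all numbers are invented.

```python
# Toy Monte Carlo estimate of EUV suppression: each photon must
# survive `n_bounces` wall reflections, each succeeding with
# probability `reflectivity`.  Geometry-free illustration only.
import random

def euv_suppression(n_photons, n_bounces, reflectivity, seed=1):
    rng = random.Random(seed)
    survivors = 0
    for _ in range(n_photons):
        if all(rng.random() < reflectivity for _ in range(n_bounces)):
            survivors += 1
    return survivors / n_photons

# With 3 bounces at 2% reflectivity, expected transmission is
# 0.02**3 = 8e-6, i.e. suppression around 1:10^5.
trans = euv_suppression(n_photons=1_000_000, n_bounces=3,
                        reflectivity=0.02)
```

The practical difficulty the paper addresses is exactly this rarity: verifying suppression beyond 1:10⁷ by brute-force photon counting requires enormous samples, which motivates integrating the optics and photon models in one ray-tracing framework.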

  1. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the conduit. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  2. Analyzing Technique of Power Systems Under Deregulation

    Science.gov (United States)

    Miyauchi, Hajime; Kita, Hiroyuki; Ishigame, Atsushi

    Deregulation of the electric utilities has been progressing. Even under deregulation, reliability remains the most important problem of power systems. However, with deregulation, the operation and scheduling of power systems are changing, and new techniques for analyzing power systems are being introduced. To evaluate the reliability of power systems, adequacy and security have recently been widely employed. This paper presents new analysis techniques, viewed in terms of adequacy and security, that are expected to be realized in the near future. First, a simulation tool to evaluate adequacy is described; as an example of this tool, MARS and other methods are mentioned. Next, to evaluate security, security constrained unit commitment (SCUC) and security constrained optimal power flow (SCOPF) are discussed. Finally, some topics concerning ancillary service are described.

  3. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit, and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, command-control logics and reactor protection systems. The run-time unit carries out the dialog between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility

  4. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    The radionuclide analysis in nuclear power plants, practiced for the purpose of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system, and monitoring the radioactive waste effluent, is an important job. Important as it is, it requires considerable expert labor, because the samples to be analyzed are multifarious and very large in number, and in addition the job depends heavily on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has largely accomplished these objectives and that the system is indeed useful. The developmental work was carried out in cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, from 1974 to this year. (auth.)

  5. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: Studying and analyzing undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  6. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various types of network analysis techniques used for examining ties, such as status, centrality and power measures. Because the characteristics of flow differ between positive-tie and negative-tie networks, some of these measures are not applicable to negative ties. The paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. It has been found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further new measures should be developed based upon the negative clique concept.
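
    The outsider-detection claim above can be made concrete. Below is a minimal Python sketch of negative degree and PN centrality, assuming the commonly cited formulation PN = (I - A/(2n-2))^-1 · 1 with A = P - 2N; the four-node network is invented for illustration.

```python
import numpy as np

def pn_centrality(P, N):
    """PN centrality for a signed network with positive (P) and negative (N)
    adjacency matrices, using the commonly cited formulation
    PN = (I - A/(2n-2))^-1 * 1, where A = P - 2N."""
    n = P.shape[0]
    A = P - 2 * N
    return np.linalg.solve(np.eye(n) - A / (2 * n - 2), np.ones(n))

# Toy 4-node network: node 3 receives only negative ties (an "outsider").
P = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 0, 0, 0]], float)
N = np.array([[0, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 1],
              [1, 1, 1, 0]], float)

neg_degree = N.sum(axis=1)   # simple negative-degree measure
pn = pn_centrality(P, N)
```

    In this toy network the node that receives only negative ties gets the lowest PN score, matching its role as an outsider.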

  7. Analyzing Architecture of Mithraism Rock Temples

    OpenAIRE

    Zohre AliJabbari

    2017-01-01

    This paper analyzes the architecture of the rock temples of western and northwestern Iran, as well as the factors influencing their formation. The creation of rock architecture in this area of Iran was influenced by the religious, geographical and political atmosphere of its time. Most of these structures were formed by the dominant empires of the first millennium BC, and in some works we observe their continuity into later periods along with changes in their functions. One of the reasons that have attracted man to ...

  8. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are inflated GAMLSS model and generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  9. Development of a Portable Water Quality Analyzer

    OpenAIRE

    Germán COMINA; Martin NISSFOLK; José Luís SOLÍS

    2010-01-01

    A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable for differentiating laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators for water qualit...

  10. Moving Block Bootstrap for Analyzing Longitudinal Data.

    Science.gov (United States)

    Ju, Hyunsu

    In a longitudinal study subjects are followed over time. I focus on a case where the number of replications over time is large relative to the number of subjects in the study. I investigate the use of moving block bootstrap methods for analyzing such data. Asymptotic properties of the bootstrap methods in this setting are derived. The effectiveness of these resampling methods is also demonstrated through a simulation study.
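
    The resampling scheme referred to above can be sketched as follows; the series and block length are invented, and this is a minimal moving block bootstrap rather than the author's exact procedure.

```python
import random

def moving_block_bootstrap(series, block_len, rng=random.Random(0)):
    """One bootstrap replicate: concatenate overlapping blocks drawn
    uniformly from the series until the original length is reached.
    Blocks preserve the local dependence structure of the data."""
    n = len(series)
    blocks_needed = -(-n // block_len)            # ceiling division
    starts = [rng.randrange(n - block_len + 1) for _ in range(blocks_needed)]
    resample = []
    for s in starts:
        resample.extend(series[s:s + block_len])
    return resample[:n]                           # trim to original length

y = [0.3, 1.1, 0.8, 1.4, 0.9, 1.2, 0.7, 1.0]     # one subject's repeated measures
replicate = moving_block_bootstrap(y, block_len=3)
```

    Repeating this resampling many times and recomputing the statistic of interest on each replicate yields the bootstrap distribution used for inference.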

  11. A chemical analyzer for charged ultrafine particles

    OpenAIRE

    S. G. Gonser; A. Held

    2013-01-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable...

  12. A chemical analyzer for charged ultrafine particles

    OpenAIRE

    S. G. Gonser; A. Held

    2013-01-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of ana...

  13. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single or double sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical and pharmaceutical industries

  14. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed
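
    The idea of scoring a latent period can be illustrated with a small mutual-information calculation between symbols and their positions modulo a trial period; this is an illustrative measure in the spirit of the ID method, not the authors' exact algorithm.

```python
from collections import Counter
import math

def period_information(seq, p):
    """Mutual information (bits) between symbols and positions modulo p.
    A high value signals a latent periodicity of period p."""
    n = len(seq)
    joint = Counter((i % p, s) for i, s in enumerate(seq))   # (residue, symbol)
    pos = Counter(i % p for i in range(n))                   # residue marginal
    sym = Counter(seq)                                       # symbol marginal
    mi = 0.0
    for (j, s), c in joint.items():
        pj = c / n
        mi += pj * math.log2(pj / ((pos[j] / n) * (sym[s] / n)))
    return mi

score = period_information("ABC" * 30, 3)   # perfectly period-3 sequence
```

    For a perfectly period-3 sequence over three symbols the score equals the symbol entropy, log2(3) bits, while an aperiodic trial period scores near zero.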

  15. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  16. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector

  17. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US Military is employing over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that allow assessing fuel usage. The analyzer uses Raman spectroscopy to measure fuel samples without preparation in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot simply be correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach is laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm^-1 wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
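
    The chemometric step, correlating whole spectra with a bulk property, can be sketched with ordinary least squares on synthetic data; the dimensions, noise level, and property are invented stand-ins, and the analyzer's actual models are proprietary chemometric calibrations.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: 300 fuels x 40 spectral channels (both invented).
spectra = rng.normal(size=(300, 40))
true_loadings = rng.normal(size=40)
# Invented bulk property that is a noisy linear function of the spectrum.
prop = spectra @ true_loadings + 0.05 * rng.normal(size=300)

# Fit a linear model mapping spectra to the property (a simple stand-in
# for partial least squares or similar chemometric regression).
coefs, *_ = np.linalg.lstsq(spectra, prop, rcond=None)
pred = spectra @ coefs
```

    In practice the calibration would be validated on held-out fuels, since a model fit to hundreds of correlated spectral channels can easily overfit.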

  18. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on the absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  19. Visual analyzer as anticipatory system (functional organization)

    Science.gov (United States)

    Kirvelis, Dobilas

    2000-05-01

    A hypothetical functional organization of the visual analyzer is presented. Visual perception, the anatomic and morphological structure of the visual systems of animals, and neurophysiological, psychological and psychophysiological data are interpreted in the light of a number of theoretical solutions for image recognition and the simulation of visual processes, which point to active information processing. The activities in special areas of cortex are as follows: focused attention, prediction with analysis and synthesis of visual scenes, and predictive mental images. In the projection zone of the visual cortex, Area Striata or V1, a "sensory" screen (SS) and a "reconstruction" screen (RS) are supposed to exist. The functional structure of the visual analyzer consists of: analysis of visual scenes projected onto the SS; "tracing" of images; preliminary recognition; reversive image reconstruction onto the RS; comparison of images projected onto the SS with images reconstructed onto the RS; and "correction" of the preliminary recognition. Special attention is paid to the quasiholographic principles of the neuronal organization within the brain, of the image "tracing," and of the reverse image reconstruction. Tachistoscopic experiments revealed that the duration of one such hypothesis-testing cycle of the human visual analyzer is about 8-10 milliseconds.

  20. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and the future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  1. Air sampling unit for breath analyzers

    Science.gov (United States)

    Szabra, Dariusz; Prokopiuk, Artur; Mikołajczyk, Janusz; Ligor, Tomasz; Buszewski, Bogusław; Bielecki, Zbigniew

    2017-11-01

    The paper presents a portable breath sampling unit (BSU) for human breath analyzers. The developed unit can be used to probe air from the upper airway and alveolar regions for clinical and scientific studies. The BSU is able to operate as a patient interface device for most types of breath analyzers. Its main task is to separate and to collect the selected phases of the exhaled air. To monitor the so-called I, II, or III phase and to identify the airflow from the upper and lower parts of the human respiratory system, the unit performs measurements of the exhaled CO2 (ECO2) in the concentration range of 0%-20% (0-150 mm Hg). It can work in both on-line and off-line modes according to American Thoracic Society/European Respiratory Society standards. A Tedlar bag with a volume of 5 dm3 is mounted as the BSU sample container. This volume allows us to collect ca. 1-25 selected breath phases. At the user panel, each step of the unit operation is visualized by LED indicators, which helps to regulate the natural breathing cycle of the patient. There is also an operator's panel to ensure monitoring and configuration setup of the unit parameters. The operation of the breath sampling unit was preliminarily verified using a gas chromatography/mass spectrometry (GC/MS) laboratory setup, in which volatile organic compounds were extracted by solid-phase microextraction. The tests were performed by comparing GC/MS signals from exhaled nitric oxide and isoprene analyses for three breath phases. The functionality of the unit was proven by an observed increase in the signal level in the case of the III phase (approximately 40%). The described work made it possible to construct a prototype of a very efficient breath sampling unit dedicated to breath sample analyzers.
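
    The phase identification from ECO2 described above can be sketched with a simple threshold rule; the thresholds below are hypothetical illustrations, not the instrument's actual settings.

```python
def classify_phase(eco2_pct):
    """Classify an exhalation sample by its CO2 fraction, using
    hypothetical thresholds: phase I is dead-space air with near-zero
    CO2, phase II the mixing transition, phase III the alveolar plateau."""
    if eco2_pct < 0.5:
        return "I"
    if eco2_pct < 4.0:
        return "II"
    return "III"

# CO2 readings (percent) sampled at three points of one exhalation.
phases = [classify_phase(c) for c in (0.1, 2.0, 5.2)]
```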

  2. A chemical analyzer for charged ultrafine particles

    Science.gov (United States)

    Gonser, S. G.; Held, A.

    2013-09-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of defined masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  3. A computer program for analyzing channel geometry

    Science.gov (United States)

    Regan, R.S.; Schaffranek, R.W.

    1985-01-01

    The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems, respectively. (Lantz-PTT)
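
    The program's primary computation, the wetted properties of a cross section at a given stage, can be sketched in a few lines of Python; the station/elevation data are invented, and this is an independent illustration rather than CGAP source code.

```python
import math

def section_properties(x, z, stage):
    """Area, top width, wetted perimeter, and hydraulic radius of a
    cross section (station x, bed elevation z) at a water surface
    elevation `stage`, clipping each segment at the waterline."""
    area = width = perim = 0.0
    for (x1, z1), (x2, z2) in zip(zip(x, z), zip(x[1:], z[1:])):
        if z1 > z2:                       # order endpoints low-to-high
            (x1, z1), (x2, z2) = (x2, z2), (x1, z1)
        if z1 >= stage:                   # segment entirely above water
            continue
        if z2 > stage:                    # partially submerged: clip
            t = (stage - z1) / (z2 - z1)
            x2 = x1 + t * (x2 - x1)
            z2 = stage
        d1, d2 = stage - z1, stage - z2   # depths at segment ends
        w = abs(x2 - x1)
        area += 0.5 * (d1 + d2) * w       # trapezoidal area slice
        width += w
        perim += math.hypot(x2 - x1, z2 - z1)
    return area, width, perim, (area / perim if perim else 0.0)

# Symmetric triangular channel, apex at elevation 0, banks at 2.
props = section_properties([0, 5, 10], [2, 0, 2], stage=1.0)
```

    For this triangular section at a stage of 1, the area is 2.5, the top width 5, and the wetted perimeter 2·hypot(2.5, 1), which a hand calculation confirms.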

  4. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant), which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as of suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  5. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    Full Text Available The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing the psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  6. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  7. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    Full Text Available This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the time of events, and has prompted efforts to identify events and to solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment should be accurate, and as a consequence a series of continuous and updated data must be at hand. In this case, Grid is the best answer for exploiting the data and resources of organizations through shared processing.

  8. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    Full Text Available A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable for differentiating laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators for water quality, but also a major concern for public health, especially affecting people living in high-burden, resource-limited settings.

  9. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a lightweight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable, and built for field use. Both visual and auditory cues are provided to the operator.

  10. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  11. Analyzing delay causes in Egyptian construction projects.

    Science.gov (United States)

    Marzouk, Mohamed M; El-Rasas, Tarek I

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. It is therefore essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and subsequently a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes. Finally, a roadmap for prioritizing delay cause groups is presented.
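
    The three indices can be sketched with one common formulation, the mean rating normalized by the scale maximum, with the Importance Index taken as the product of the Frequency and Severity Indices; the ratings and 4-point scale below are invented, and the paper's exact definitions may differ.

```python
def relative_index(ratings, scale_max=4):
    """Mean survey rating normalized by the scale maximum (0..1)."""
    return sum(ratings) / (scale_max * len(ratings))

# Hypothetical responses for one delay cause on a 1-4 scale.
frequency = relative_index([4, 3, 4, 2])   # how often the cause occurs
severity = relative_index([3, 4, 4, 3])    # how damaging it is
importance = frequency * severity          # composite ranking score
```

    Ranking causes by the importance score then yields the "top ten" list the abstract refers to.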

  12. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated); samples are instead handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  13. Mango: combining and analyzing heterogeneous biological networks.

    Science.gov (United States)

    Chang, Jennifer; Cho, Hyejin; Chou, Hui-Hsien

    2016-01-01

    Heterogeneous biological data such as sequence matches, gene expression correlations, protein-protein interactions, and biochemical pathways can be merged and analyzed via graphs, or networks. Existing software for network analysis has limited scalability to large data sets or is only accessible to software developers as libraries. In addition, the polymorphic nature of the data sets requires a more standardized method for integration and exploration. Mango facilitates large network analyses with its Graph Exploration Language, automatic graph attribute handling, and real-time 3-dimensional visualization. On a personal computer Mango can load, merge, and analyze networks with millions of links and can connect to online databases to fetch and merge biological pathways. Mango is written in C++ and runs on Mac OS, Windows, and Linux. The stand-alone distributions, including the Graph Exploration Language integrated development environment, are freely available for download from http://www.complex.iastate.edu/download/Mango. The Mango User Guide listing all features can be found at http://www.gitbook.com/book/j23414/mango-user-guide.
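
    The core idea — merging heterogeneous link types into one attributed graph — can be sketched without Mango's actual Graph Exploration Language (which is not reproduced here). The network names, edge attributes, and dict-of-dicts representation below are illustrative assumptions:

```python
# Merging heterogeneous biological networks into one attributed graph.
# Each input network maps an edge (node_a, node_b) to an attribute dict;
# edges present in several networks get their attributes combined.

def merge_networks(*networks):
    merged = {}
    for net in networks:
        for edge, attrs in net.items():
            key = tuple(sorted(edge))            # treat edges as undirected
            merged.setdefault(key, {}).update(attrs)
    return merged

# Hypothetical inputs: a sequence-match network and an expression-correlation network.
seq_matches = {("geneA", "geneB"): {"blast_score": 250.0}}
expr_corr = {("geneB", "geneA"): {"corr": 0.92},
             ("geneB", "geneC"): {"corr": 0.75}}

g = merge_networks(seq_matches, expr_corr)
# g[("geneA", "geneB")] now carries both the sequence match and the correlation
```

    A real tool additionally handles typed nodes, directed edges, and millions of links, but the attribute-merging step is the same in spirit.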

  14. Analyzing block placement errors in SADP patterning

    Science.gov (United States)

    Kobayashi, Shinji; Okada, Soichiro; Shimura, Satoru; Nafus, Kathleen; Fonseca, Carlos; Demand, Marc; Biesemans, Serge; Versluijs, Janko; Ercken, Monique; Foubert, Philippe; Miyazaki, Shinobu

    2016-03-01

    We discuss edge placement errors (EPE) for multi-patterning of Mx critical layers using ArF lithography. Specific focus is placed on the block formation part of the process. While plenty of literature characterization data exist on spacer formation, only limited published data is available on block processes. We analyze the accuracy of placing blocks relative to narrow spacers. Many publications calculate EPE assuming Gaussian distributions for key process variations contributing to EPE. For practical reasons, each contributor is measured on dedicated test structures. In this work, we complement such analysis and directly measure the EPE in product. We perform high density sampling of blocks using CDSEM images and analyze all feature edges of interest. We find that block placement errors can be very different depending on their local design context. Specifically we report on 2 block populations (further called block A and B) which have a 4x different standard deviation. We attribute this to differences in local topography (spacer shape) and interaction with the plasma-etch process design. Block A (on top of the `core space' S1) has excellent EPE uniformity of ~1 nm while block B (on top of `gap space' S2) has degraded EPE control of ~4 nm. Finally, we suggest that the SOC etch process is key to positioning blocks accurately on slim spacers, improving the manufacturability of spacer-based patterning techniques and supporting their extension toward the 5 nm node.

  15. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge about them, and developing tools for extracting and sharing knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders, and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT, and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at the second level, the presence of rare disease terms in target sources included in the UMLS was examined, working at both the term and concept level. We found that MeSH has the best representation of rare disease terms.

  17. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    Energy Technology Data Exchange (ETDEWEB)

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the Americium and Cesium counts to the ash content. A total of 104 samples were collected from the mine, 47 from screened coal and the rest from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick-stop training procedure; therefore, the samples were split into training, calibration, and prediction subsets. Special techniques, using genetic algorithms, were developed to representatively split the samples into the three subsets. Two separate approaches were tried: in one, the screened and unscreened coal were modeled separately; in the other, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e., though each prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minute intervals (averages of 6-9 samples), but not at 20 seconds (each prediction).
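
    The averaging effect described above can be illustrated with a toy simulation (all numbers are hypothetical, not the mine's data): if each 20-second prediction equals the true ash content plus independent noise, averaging nine of them cuts the typical error by roughly a factor of three (sqrt(9)):

```python
# Why averages of 6-9 predictions are reliable when single predictions are not:
# each prediction = true value + independent noise, so averaging N predictions
# shrinks the error by about sqrt(N). Values below are illustrative.
import random

random.seed(42)
true_ash = 8.0          # percent ash, hypothetical
noise_sd = 1.0          # per-prediction error, hypothetical

def prediction():
    return true_ash + random.gauss(0.0, noise_sd)

single_errors = [abs(prediction() - true_ash) for _ in range(1000)]
window_errors = [abs(sum(prediction() for _ in range(9)) / 9 - true_ash)
                 for _ in range(1000)]

mean_single = sum(single_errors) / len(single_errors)
mean_window = sum(window_errors) / len(window_errors)
# mean_window is roughly a third of mean_single
```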

  18. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recordings or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.

  19. Calibration of the portable wear metal analyzer

    Science.gov (United States)

    Quinn, Michael J.

    1987-12-01

    The Portable Wear Metal Analyzer (PWMA), a graphite furnace atomic absorption (AA) spectrometer, developed under a contract for this laboratory, was evaluated using powdered metal particles suspended in oil. The PWMA is a microprocessor controlled automatic sequential multielement AA spectrometer designed to support the deployed aircraft requirement for spectrometric oil analysis. The PWMA will analyze for nine elements (Ni, Fe, Cu, Cr, Ag, Mg, Si, Ti, Al) at a rate of 4 min per sample. The graphite tube and modified sample introduction system increase the detection of particles in oil when compared to the currently used techniques of flame AA or spark atomic emission (AE) spectroscopy. The PWMA shows good-to-excellent response for particles in sizes of 0 to 5 and 5 to 10 micrometers and fair response to particles of 10 to 20 and 20 to 30 micrometers. All trends in statistical variations are easily explained by system considerations. Correction factors to the calibration curves are necessary to correlate the analytical capability of the PWMA to the performance of existing spectrometric oil analysis (SOA) instruments.

  20. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date, with an initial perihelion of 35 solar radii (RS) that will shrink to under 10 RS over the course of the mission. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. The main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a time-of-flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1-60 amu/q, and a field of view of 240° x 120°. Here we will show flight calibration results and performance.
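
    The ESA + time-of-flight combination determines m/q from basic kinematics: the ESA selects ions of a known energy per charge E/q, the TOF section measures speed v = d/t, and m/q = 2(E/q)/v². The sketch below is a back-of-the-envelope illustration (the path length and numbers are invented, not SPAN-Ai's calibration):

```python
# Illustrative m/q reconstruction for an ESA + time-of-flight analyzer.
# The ESA passes ions of energy per charge E/q; the TOF section measures
# flight time t over path d, giving m/q = 2 * (E/q) * (t/d)^2.
E_CHARGE = 1.602176634e-19      # elementary charge [C]
AMU = 1.66053906660e-27         # atomic mass unit [kg]

def mass_per_charge_amu(e_over_q_ev, flight_time_s, path_m):
    e_over_q_j = e_over_q_ev * E_CHARGE      # energy per unit charge [J]
    v = path_m / flight_time_s               # ion speed [m/s]
    return 2.0 * e_over_q_j / v**2 / AMU     # m/q in amu per charge

# A 1 keV proton over a hypothetical 2 cm path takes about 45.5 ns:
mq = mass_per_charge_amu(1000.0, 4.553e-8, 0.02)
# mq comes out close to 1 amu/q, as expected for a proton
```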

  1. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after ingesting an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of this isotope in the breath of about 500 ppm.
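
    The comparison the analyzer performs can be sketched numerically (the concentrations below are illustrative, anchored only to the ~500 ppm 13CO2 level and the 1% detection requirement quoted in the abstract):

```python
# Sketch of the pre/post comparison in a 13C breath test: the 13CO2/12CO2
# ratio in the post-substrate sample is compared against the baseline sample.
# All concentrations are illustrative (ppm).

def ratio_13c(c13_ppm, c12_ppm):
    return c13_ppm / c12_ppm

baseline = ratio_13c(500.0, 45000.0)   # ~500 ppm 13CO2, as in the abstract
post = ratio_13c(505.0, 45000.0)       # a 1% rise in 13CO2 after the substrate

change_percent = 100.0 * (post - baseline) / baseline
# change_percent == 1.0, the detection level the instrument must resolve
```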

  2. Analyzing surface coatings in situ: High-temperature surface film analyzer developed

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    Scientists at Argonne National Laboratory (ANL) have devised a new instrument that can analyze surface coatings under operating conditions. The High-Temperature Surface Film Analyzer is the first such instrument to analyze the molecular composition and structure of surface coatings on metals and solids under conditions of high temperature and pressure in liquid environments. Corrosion layers, oxide coatings, polymers or paint films, or adsorbed molecules are examples of coatings that can be analyzed using this instrument. Film thicknesses may vary from a few molecular layers to several microns or thicker. The instrument was originally developed to study metal corrosion in aqueous solutions similar to the cooling water systems of light-water nuclear reactors. The instrument may have use for the nuclear power industry, where coolant pipes degrade due to stress corrosion cracking, which often leads to plant shutdown. Key determinants in the occurrence of stress corrosion cracking are the properties and composition of the corrosion scales that form inside pipes. The High-Temperature Surface Film Analyzer can analyze these coatings under laboratory conditions that simulate the same hostile environment of high temperature, pressure, and solution that exists during plant operations. The ability to analyze these scales in hostile liquid environments is unique to the instrument. Other applications include analyzing paint composition, corrosion of materials in geothermal power systems, integrity of canisters for radioactive waste storage, corrosion inhibitor films on piping and drilling systems, and surface scales on condenser tubes in industrial hot water heat exchangers. The device is not patented.

  3. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and require mouse clicks on each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
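
    The "optimal path" step — elongating a root along the cheapest route through the graph extracted from the skeletonized image — is essentially a shortest-path search. A minimal pure-Python sketch (node names, weights, and the cost criterion are illustrative; the actual tool is in Matlab and uses a root architecture model, not plain edge length):

```python
# Dijkstra shortest path over a skeleton graph: graph maps
# node -> {neighbor: edge_length}. Used here to pick the optimal
# elongation path from a root tip to a target node.
import heapq

def shortest_path(graph, start, goal):
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(queue, (dist + w, nbr, path + [nbr]))
    return float("inf"), []

# Toy skeleton graph with two candidate routes from the tip:
skeleton = {
    "tip": {"n1": 1.0, "n2": 2.5},
    "n1": {"branch": 1.2},
    "n2": {"branch": 0.2},
    "branch": {},
}
dist, path = shortest_path(skeleton, "tip", "branch")
# path == ["tip", "n1", "branch"], total length 2.2
```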

  4. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the sample survey Eurobarometer, which has been requested by the European Commission. The social climate index is used to measure the level of perceptions of the population, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, on a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  5. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Rojas S, A.S.; Carrillo M, R.A.; Balderas, E.G.

    1999-01-01

    Based on the study of real neutron-flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where the core oscillation phenomena of the reactor lie in the 0 to 2.5 Hz range, the possibility has been identified of developing surveillance and diagnostic equipment capable of analyzing core behavior in this frequency range in real time. An important method for monitoring the stability of the reactor core is the power spectral density, which allows the frequencies and amplitudes contained in the signals to be determined. The instrument was implemented in LabVIEW graphical programming with a 16-channel data acquisition card and runs under Windows 95/98. (Author)
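
    The monitoring idea can be sketched in a few lines: estimate the power spectral density of a flux signal and look for peaks in the 0-2.5 Hz band. The synthetic signal below (a 2 Hz oscillation buried in noise, sampled at 25 Hz) is illustrative, not Laguna Verde data, and a raw periodogram stands in for the instrument's PSD estimator:

```python
# Locate an oscillation in the 0-2.5 Hz band via a periodogram.
import numpy as np

fs = 25.0                                   # sampling rate [Hz], hypothetical
t = np.arange(0, 40.0, 1.0 / fs)            # 40 s of signal
rng = np.random.default_rng(0)
flux = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(flux)) ** 2 / flux.size   # raw periodogram
freqs = np.fft.rfftfreq(flux.size, d=1.0 / fs)

band = freqs <= 2.5                          # the surveillance band
peak_freq = float(freqs[band][np.argmax(spectrum[band])])
# peak_freq recovers the 2.0 Hz core oscillation hidden in the noise
```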

  6. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    The Performance Evaluation Process Algebra, PEPA, is introduced by Jane Hillston as a stochastic process algebra for modelling distributed systems and especially suitable for performance evaluation. We present a static analysis that very precisely approximates the control structure of processes expressed in PEPA. The analysis technique we adopted is Data Flow Analysis. We begin the analysis by defining an appropriate transfer function; then, with the classical worklist algorithm, we construct a finite automaton that captures all possible interactions among processes. By annotating labels and layers to PEPA programs, the approximating result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure of more than one thousand states.
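
    The worklist construction is a standard fixpoint scheme, sketched generically below: starting from an initial state, apply a transfer function until no new states appear, yielding a finite automaton of possible interactions. The two-process toy system is illustrative and is not PEPA syntax:

```python
# Generic worklist algorithm: explore states until a fixpoint is reached,
# recording (state, action, successor) transitions as automaton edges.

def build_automaton(initial, transfer):
    """transfer: state -> iterable of (action, successor_state)."""
    states, edges = {initial}, set()
    worklist = [initial]
    while worklist:
        state = worklist.pop()
        for action, succ in transfer(state):
            edges.add((state, action, succ))
            if succ not in states:
                states.add(succ)
                worklist.append(succ)
    return states, edges

# Toy system: two processes that synchronize on "work" and reset on "done".
def transfer(state):
    p, q = state
    if p == "idle" and q == "idle":
        yield "work", ("busy", "busy")
    elif p == "busy":
        yield "done", ("idle", "idle")

states, edges = build_automaton(("idle", "idle"), transfer)
# Two reachable states and two transitions capture all possible interactions
```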

  7. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and a clinical isolate of Aggregatibacter actinomycetemcomitans from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time of fastidious buccal microorganisms associated with the etiology of periodontitis, compared to traditional methods.

  8. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays.

  9. Using wavelet features for analyzing gamma lines

    International Nuclear Information System (INIS)

    Medhat, M.E.; Abdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Uzhinskii, V.V.

    2004-01-01

    Data processing methods for analyzing gamma-ray spectra with symmetric bell-shaped peaks are considered. In many cases the peak form is a symmetric bell shape; in particular, the Gaussian case is the one most often used, for many physical reasons. The problem is how to evaluate the parameters of such peaks, i.e. their positions, amplitudes, and half-widths, both for single peaks and for overlapped peaks. Through wavelet features, using the Marr wavelet (Mexican hat) in a correlation method, it is possible to estimate the optimal wavelet parameters and to locate peaks in the spectrum. Comparison of the proposed method with others shows the better quality of the wavelet transform method.
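
    A minimal sketch of the correlation method: correlating the spectrum with a Marr (Mexican hat) wavelet produces a response whose maximum sits at the peak position. The peak position, width, and noise level below are invented for illustration:

```python
# Locate a Gaussian peak in a noisy spectrum by correlating it with a
# Marr / Mexican-hat wavelet of matching width.
import numpy as np

def ricker(points, a):
    """Mexican-hat wavelet with width parameter a, sampled on `points` channels."""
    x = np.arange(points) - (points - 1) / 2.0
    return (1 - (x / a) ** 2) * np.exp(-x ** 2 / (2 * a ** 2))

channels = np.arange(512)
# Synthetic spectrum: one Gaussian peak at channel 200 (sigma = 5) plus noise.
spectrum = 100.0 * np.exp(-(channels - 200) ** 2 / (2 * 5.0 ** 2))
spectrum += np.random.default_rng(1).normal(0.0, 2.0, channels.size)

wavelet = ricker(101, 5.0)                    # width matched to the peak
response = np.convolve(spectrum, wavelet, mode="same")
peak_channel = int(np.argmax(response))
# peak_channel lands at (or within a channel or two of) 200
```

    Repeating this for a range of width parameters `a` gives the scale information used to estimate half-widths and to separate overlapped peaks.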

  10. Coke from small-diameter tubes analyzed

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    The mechanism for coke deposit formation and the nature of the coke itself can vary with the design of the ethylene furnace tube bank. In this article, coke deposits from furnaces with small-diameter pyrolysis tubes are examined. The samples were taken from four furnaces of identical design (Plant B). As in both the first and second installments of the series, the coke deposits were examined using a scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX). The deposits from the small-diameter tubes are compared with the coke deposits from the furnace discussed in earlier articles. Analysis of the coke in both sets of samples is then used to offer recommendations for improved decoking procedures, operating procedures, better feed selection, and better selection of the metallurgy used in furnace tubes, to extend the operating time of the furnace tubes by reducing the amount and type of coke buildup.

  11. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept; it analyzes basic infrastructure requirements, identifies related infrastructure issues, concerns, and vulnerabilities, and offers recommended solutions.

  12. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  13. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    This thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) robust business ... Furthermore, the thesis serves as an empirical example of designing a software ecosystem.

  14. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  15. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  16. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    A software ecosystem can be characterized as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues, and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) robust business ...

  17. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn [Oak Ridge, TN; Chen, Da-Ren [Creve Coeur, MO

    2007-05-08

A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and a distal one of the plurality of sampling outlets, forms a classifying region, the first and second electrodes being chargeable to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into an upstream end of the classifying region, wherein each sampling outlet functions as an independent DMA stage and classifies different size ranges of charged particles based on electrical mobility simultaneously.
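The classification physics behind each DMA stage can be sketched as follows: the centroid voltage that selects particles of a given mobility diameter in a cylindrical geometry. The geometry and flow numbers are illustrative (roughly a long-column DMA), not taken from the patent.

```python
import math

# Sketch of cylindrical-DMA classification: particles of electrical mobility Z
# are selected at voltage V = Q_sh * ln(r2/r1) / (2*pi*L*Z).
# Geometry/flow values are assumptions for illustration.

E_CHARGE = 1.602e-19   # elementary charge, C
MU_AIR = 1.81e-5       # dynamic viscosity of air, Pa*s
MFP_AIR = 68e-9        # mean free path of air, m

def slip_correction(d):
    """Cunningham slip correction for particle diameter d (m)."""
    kn = 2 * MFP_AIR / d
    return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def electrical_mobility(d, n_charges=1):
    """Electrical mobility (m^2 V^-1 s^-1) of a particle of diameter d (m)."""
    return n_charges * E_CHARGE * slip_correction(d) / (3 * math.pi * MU_AIR * d)

def centroid_voltage(d, q_sheath, r_inner, r_outer, length):
    """Voltage selecting diameter d at sheath flow q_sheath (m^3/s)."""
    z = electrical_mobility(d)
    return q_sheath * math.log(r_outer / r_inner) / (2 * math.pi * length * z)

# 100 nm singly charged particles, 3 L/min sheath flow, long-column geometry.
v = centroid_voltage(100e-9, 3 / 60000, 9.37e-3, 19.61e-3, 0.44369)
```

In the multi-stage design above, each sampling outlet corresponds to a different effective length L, so a single applied voltage classifies several size ranges at once.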

  18. Analyzing use cases for knowledge acquisition

    Science.gov (United States)

    Kelsey, Robert L.; Webster, Robert B.

    2000-03-01

The analysis of use cases describing construction of simulation configuration files in a data/information management system can lead to the acquisition of new information and knowledge. In this application, a user creates a use case with an eXtensible Markup Language (XML) description representing a configuration file for simulation of a physical system. Intelligent agents analyze separate versions of a user's XML descriptions and, additionally, compare the descriptions with examples from a library of use cases. The agents can then make recommendations to the user on how to proceed or on whether tutoring is necessary. In a proof-of-concept test, new information is acquired and a user learns from the agent-facilitated tutoring.

  19. Spectroscopic methods to analyze drug metabolites.

    Science.gov (United States)

    Yi, Jong-Jae; Park, Kyeongsoon; Kim, Won-Je; Rhee, Jin-Kyu; Son, Woo Sung

    2018-03-09

Drug metabolites have been monitored with various types of newly developed techniques and/or combinations of common analytical methods, which can provide a great deal of information for metabolite profiling. Because it is not easy to analyze whole drug metabolites qualitatively and quantitatively, individual analytical techniques are combined in a multilateral manner to cover the widest possible range of drug metabolites. Mass-based spectroscopic analysis of drug metabolites has been expanded with the help of other parameter-based methods. The current development of metabolism studies in contemporary pharmaceutical research is reviewed, with an overview of conventionally used spectroscopic methods. Several technical approaches for conducting drug metabolic profiling through spectroscopic methods are discussed in depth.

  20. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  1. Orthopedic surgical analyzer for percutaneous vertebroplasty

    Science.gov (United States)

    Tack, Gye Rae; Choi, Hyung Guen; Lim, Do H.; Lee, Sung J.

    2001-05-01

Since the spine is one of the most complex joint structures in the human body, its surgical treatment requires careful planning and a high degree of precision to avoid any unwanted neurological compromise. In addition, comprehensive biomechanical analysis can be very helpful because the spine is subject to a variety of loads. The osteoporotic spine, in which structural integrity has been compromised, poses a double challenge for the surgeon, both clinical and biomechanical. Thus, we have been developing an integrated medical image system capable of addressing both. This system, called the orthopedic surgical analyzer, combines the clinical results from image-guided examination with the biomechanical data from finite element analysis. In order to demonstrate its feasibility, the system was applied to percutaneous vertebroplasty. Percutaneous vertebroplasty is a surgical procedure recently introduced for the treatment of compression fractures of osteoporotic vertebrae. It involves puncturing the vertebra and filling it with polymethylmethacrylate (PMMA). Recent studies have shown that the procedure can provide structural reinforcement for osteoporotic vertebrae while being minimally invasive and safe, with immediate pain relief. However, treatment failures due to excessive PMMA injection volume have been reported as one of its complications. It is believed that control of PMMA volume is one of the most critical factors in reducing the incidence of complications. Since the degree of osteoporosis can influence the porosity of the cancellous bone in the vertebral body, the appropriate injection volume can differ from patient to patient. In this study, the optimal volume of PMMA injection for vertebroplasty was predicted based on image analysis of a given patient. In addition, biomechanical effects due to changes in PMMA volume and bone mineral density (BMD) level were investigated by constructing clinically

  2. A framework to analyze emissions implications of ...

    Science.gov (United States)

Future year emissions depend highly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes, while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA’s energy systems model with an economic Input-Output (I/O) Life Cycle Assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies representing the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the U.S. economy is constructed to link up with MARKAL. The I/O model enables users to change input requirements (e.g., energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end users a mechanism for modeling change in the two dimensions that define the future scenarios: technological progress and consumer preferences. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
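The core calculation in an economic input-output model of this kind is the Leontief relation x = (I − A)⁻¹ d: total sectoral output x given a technical-coefficient matrix A and final demand d. The two-sector numbers below are hypothetical, chosen only to show how changing a coefficient (e.g., energy intensity) propagates through the economy; they are not from the EPAUS9r MARKAL database.

```python
import numpy as np

# Sketch of the Leontief input-output calculation behind an economic I/O model.
# Solves x = A x + d, i.e. x = (I - A)^-1 d, for total sectoral output.

def total_output(A, demand):
    """Total output vector x satisfying x = A x + d."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) - A, demand)

# Hypothetical two-sector economy: sector 0 = energy, sector 1 = manufacturing.
A = [[0.1, 0.3],   # energy input needed per unit output of each sector
     [0.2, 0.2]]   # manufacturing input needed per unit output of each sector
x = total_output(A, [100.0, 200.0])

# Lowering the energy intensity of manufacturing (A[0][1]: 0.3 -> 0.2)
# reduces total energy output for the same final demand; this is how the
# framework models technological change in input requirements.
x_efficient = total_output([[0.1, 0.2], [0.2, 0.2]], [100.0, 200.0])
```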

  3. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our results, after content analysis, were poor owing to low spontaneous recall. Therefore, we repeated the study in a more focus-oriented way. The information gathered this time allowed us not only to better understand how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  4. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues, and its ability to handle large numbers of protein structures and complexes without any further automation steps. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.

  5. Analyzing Architecture of Mithraism Rock Temples

    Directory of Open Access Journals (Sweden)

    Zohre AliJabbari

    2017-06-01

Full Text Available This article analyzes the architecture of the rock temples of western and northwestern Iran, as well as the factors influencing their formation. The creation of rock architecture in this area of Iran was influenced by the religious, geographical, and political atmosphere of its time. Most of these structures were formed by the dominant empires of the first millennium BC, and in some works we observe their continuity into later periods, along with changes in their functions. One of the reasons that attracted man to mountains and rock in different schools was the religious structure of the community. Owing to the sanctity of mountains and rocks in the ancient religions, especially in Mithraism, valuable temples and places of worship emerged in the mountains. Their distinguishing characteristics are circular dome-shaped spaces; the simplicity, arrangement of spaces, and way of creating light correspond with the tradition of Mithraism in Iran. Mehr Temple in Maragheh, Dashkasan in Zanjan, and Qadamgah Temple in Azarshahr are rock temples of northwestern Iran whose signs and symbols indicate the performance of Mithraic duties there. In western Iran, Cogan cave in Lorestan, given the characteristics of Mithraic temples, functioned for a period as a temple for the worship of Mithra. This research investigates the architectural features of these temples.

  6. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

Full Text Available Purpose: Studying and analyzing undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance how we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula for the purpose of better engineering preparation to meet industry demands. Methodology and Findings: 441 undergraduate engineering students were surveyed in two undergraduate engineering programs to assess their leadership skills. The results in both programs reveal that undergraduate engineering students are lagging behind in visionary leadership skills compared to the directing, including, and cultivating leadership styles. Recommendation: A practical framework is proposed to enhance the lacking leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and design a virtual simulation environment targeting the lacking skills, in this case visionary leadership. The virtual simulation will then be used to provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  7. Analyzing human errors in flight mission operations

    Science.gov (United States)

    Bruno, Kristin J.; Welz, Linda L.; Barnes, G. Michael; Sherif, Josef

    1993-01-01

A long-term program is in progress at JPL to reduce the cost and risk of flight mission operations through a defect prevention/error management program. The main thrust of this program is to create an environment in which the performance of the total system, both the human operator and the computer system, is optimized. To this end, 1580 Incident Surprise Anomaly reports (ISAs) from 1977-1991 from the Voyager and Magellan projects were analyzed. A Pareto analysis revealed that 38 percent of the errors were classified as human errors. A preliminary cluster analysis based on the Magellan human errors (204 ISAs) is presented here. The resulting clusters describe the underlying relationships among the ISAs. Initial models of human error in flight mission operations are presented. Next, the Voyager ISAs will be scored and included in the analysis. Eventually, these relationships will be used to derive a theoretically motivated and empirically validated model of human error in flight mission operations. Ultimately, this analysis will be used to make continuous process improvements to end-user applications and training requirements. This Total Quality Management approach will enable the management and prevention of errors in the future.

  8. Complete denture analyzed by optical coherence tomography

    Science.gov (United States)

    Negrutiu, Meda L.; Sinescu, Cosmin; Todea, Carmen; Podoleanu, Adrian G.

    2008-02-01

Complete dentures are currently made using different technologies. In order to avoid deficiencies of prostheses made with the classical technique, several alternative systems and procedures have been devised, directly related to the material used and to the manufacturing technology. Thus, at the present time, there are several injection systems and technologies on the market that use chemoplastic materials, which are heat cured (90-100°C) in a dry or wet environment, or cold cured (below 60°C). There are also technologies that plasticize a hard cured material by thermoplastic processing (without any chemical changes) and then inject it into a mold. The purpose of this study was to analyze possible defects in several dental prostheses using a non-invasive method, before their insertion in the mouth. Different dental prostheses, fabricated from various materials, were investigated using en-face optical coherence tomography. In order to discover the defects, the scanning was made in three planes, obtaining images at different depths, from 0.01 μm to 2 mm. In several of the investigated prostheses we found defects which may cause their fracture. These defects are totally included in the prosthesis material and cannot be visualized with other imaging methods. In conclusion, en-face OCT is an important investigative tool for dental practice.

  9. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, and the loss of performance due to many more simultaneous users, are shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed of implementing the super-minicomputer version of the NPA with the RELAP5 simulation code on the Black Fox full-scope nuclear power plant simulator.

  10. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. Motivated by this rising challenge, we built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies through demonstration and evaluation on publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
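The anomaly-grid idea described above can be sketched simply: bin records into spatial cells per time step, then flag any cell whose count deviates strongly from that cell's own history. The cell size, z-score threshold, and data layout are assumptions for illustration; the tool's actual mining algorithms are not specified in the abstract.

```python
import statistics

# Sketch of grid-based spatiotemporal anomaly detection: flag (cell, time)
# pairs whose event count exceeds a z-score threshold relative to the cell's
# own history. Parameters are illustrative assumptions.

def grid_anomalies(records, cell_size=1.0, threshold=3.0):
    """records: iterable of (t, x, y) tuples with integer time steps.
    Returns the set of (cell, t) pairs whose count z-score exceeds threshold."""
    counts = {}
    for t, x, y in records:
        cell = (int(x // cell_size), int(y // cell_size))
        counts.setdefault(cell, {})
        counts[cell][t] = counts[cell].get(t, 0) + 1
    anomalies = set()
    for cell, series in counts.items():
        values = list(series.values())
        if len(values) < 3:
            continue  # too little history to judge
        mean = statistics.mean(values)
        sd = statistics.pstdev(values)
        if sd == 0:
            continue  # perfectly steady cell, nothing anomalous
        for t, n in series.items():
            if (n - mean) / sd > threshold:
                anomalies.add((cell, t))
    return anomalies

# Sixteen quiet time steps, then a burst of 20 events in one cell at t = 16.
records = [(t, 0.5, 0.5) for t in range(16)] + [(16, 0.5, 0.5)] * 20
flagged = grid_anomalies(records)
```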

  11. NRC plant-analyzer development at BNL

    International Nuclear Information System (INIS)

    Wulff, W.

    1983-01-01

    The objective of this program is to develop an LWR engineering plant analyzer capable of performing realistic and accurate simulations of plant transients and Small-Break Loss of Coolant Accidents at real-time and faster than real-time computing speeds and at low costs for preparing, executing and evaluating such simulations. The program is directed toward facilitating reactor safety analyses, on-line plant monitoring, on-line accident diagnosis and mitigation and toward improving reactor operator training. The AD10 of Applied Dynamics International, Ann Arbor, MI, a special-purpose peripheral processor for high-speed systems simulation, is programmed through a PDP-11/34 minicomputer and carries out digital simulations with analog hardware in the input/output loop (up to 256 channels). Analog signals from a control panel are being used now to activate or to disable valves and to trip pump drive motors or regulators without interrupting the simulation. An IBM personal computer with multicolor graphics capabilities and a CRT monitor are used to produce on-line labelled diagrams of selected plant parameters as functions of time

  12. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units—15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.

  13. Analyzing dialect variation in historical speech corpora.

    Science.gov (United States)

    Renwick, Margaret E L; Olsen, Rachel M

    2017-07-01

    The Linguistic Atlas of the Gulf States is an extensive audio corpus of sociolinguistic interviews with 1121 speakers from eight southeastern U.S. states. Complete interviews have never been fully transcribed, leaving a wealth of phonetic information unexplored. This paper details methods for large-scale acoustic analysis of this historical speech corpus, providing a fuller picture of Southern speech than offered by previous impressionistic analyses. Interviews from 10 speakers (∼36 h) in southeast Georgia were transcribed and analyzed for dialectal features associated with the Southern Vowel Shift and African American Vowel Shift, also considering the effects of age, gender, and race. Multiple tokens of common words were annotated (N = 6085), and formant values of their stressed vowels were extracted. The effects of shifting on relative vowel placement were evaluated via Pillai scores, and vowel dynamics were estimated via functional data analysis and modeled with linear mixed-effects regression. Results indicate that European American speakers show features of the Southern Vowel Shift, though certain speakers shift in more ways than others, and African American speakers' productions are consistent with the African American Vowel Shift. Wide variation is apparent, even within this small geographic region, contributing evidence of the complexity of Southern speech.
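The Pillai score used above to evaluate relative vowel placement is the Pillai trace from a one-way MANOVA on (F1, F2) with vowel class as the factor: trace(H(H + E)⁻¹), where H and E are the between- and within-class SSCP matrices. A minimal sketch follows; the formant values are synthetic, not from the Atlas corpus.

```python
import numpy as np

# Sketch of the Pillai-score computation for vowel-class overlap:
# one-way MANOVA on (F1, F2); Pillai trace = trace(H @ inv(H + E)).
# Near 0 = merged vowel classes, near 1 = well-separated classes.

def pillai_score(groups):
    """groups: list of (n_i, 2) arrays of formant measurements per vowel class."""
    all_obs = np.vstack(groups)
    grand_mean = all_obs.mean(axis=0)
    H = np.zeros((2, 2))  # between-class SSCP
    E = np.zeros((2, 2))  # within-class SSCP
    for g in groups:
        g = np.asarray(g, dtype=float)
        diff = g.mean(axis=0) - grand_mean
        H += len(g) * np.outer(diff, diff)
        centered = g - g.mean(axis=0)
        E += centered.T @ centered
    return float(np.trace(H @ np.linalg.inv(H + E)))

rng = np.random.default_rng(0)
# Two synthetic, well-separated vowel classes (illustrative F1/F2 means in Hz).
pin = rng.normal([500, 2000], 30, size=(50, 2))
pen = rng.normal([650, 1800], 30, size=(50, 2))
score = pillai_score([pin, pen])  # close to 1: the classes are distinct
```

A shifted speaker whose classes approach each other would show the score falling toward 0, which is how the analysis quantifies merger.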

  14. Numerical methods for analyzing electromagnetic scattering

    Science.gov (United States)

    Lee, S. W.; Lo, Y. T.; Chuang, S. L.; Lee, C. S.

    1985-01-01

Attenuation properties of the normal modes in an overmoded waveguide coated with a lossy material were analyzed. It is found that the low-order modes can be significantly attenuated even with a thin layer of coating if the coating material is not too lossy. A thinner layer of coating is required for large attenuation of the low-order modes if the coating material is magnetic rather than dielectric. The Radar Cross Section (RCS) of an uncoated circular guide terminated by a perfect electric conductor (PEC) was calculated and compared with available experimental data. It is confirmed that interior irradiation contributes to the RCS. The equivalent-current method based on the geometrical theory of diffraction (GTD) was chosen for calculating the contribution from rim diffraction. Schemes for experiments on RCS reduction from a coated circular guide terminated by a PEC are planned and included. A waveguide coated with a lossy magnetic material is suggested as a substitute for the corrugated waveguide.

  15. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

Full Text Available The article under the title "Analysis of the Pension System of the USSR" deals with numerous aspects of the development of the pension system of the former USSR. Since the improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is essential in order to create a sound and efficient pension system in Russia. The study presented in the article aims to execute an in-depth analysis of legislation on the Soviet pension system with a view to recreating the architecture of the pension system of the USSR. In addition, the study draws on official statistics for the period in question to reach a qualified and well-founded conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, demonstrates the efficiency of the Soviet pension system. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the current pension system of the Russian Federation.

  16. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes, such as RELAP5 and TRAC-BWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experimental data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  17. Practical Limitations of Aerosol Separation by a Tandem Differential Mobility Analyzer-Aerosol Particle Mass Analyzer

    OpenAIRE

    Radney, James G.; Zangmeister, Christopher D.

    2016-01-01

A cavity ring-down spectrometer and condensation particle counter were used to investigate the limitations in the separation of singly and multiply charged aerosol particles by a tandem differential mobility analyzer (DMA) and aerosol particle mass analyzer (APM). The impact of particle polydispersity and morphology was investigated using three materials: nearly monodisperse polystyrene latex nanospheres (PSL); polydisperse, nearly spherical ammonium sulfate (AS); and polydisperse lacey fracta...

  18. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implementing the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. 
We have implemented the
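Of the diagnostic methods listed above, the time-lagged correlation map is the easiest to sketch: correlate one series against another shifted by each candidate lag and see where the association peaks. The pure-Python sketch below uses synthetic data; it is an illustration of the general technique, not CMDA code.

```python
import math

# Sketch of time-lagged correlation analysis: Pearson correlation of y[t]
# with x[t - lag] for a range of lags, to find the lag of strongest response.

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    va = math.sqrt(sum((p - ma) ** 2 for p in a))
    vb = math.sqrt(sum((q - mb) ** 2 for q in b))
    return cov / (va * vb)

def lagged_correlations(x, y, max_lag):
    """Correlation of y[t] with x[t - lag] for lag = 0..max_lag."""
    return {lag: pearson(x[:len(x) - lag] if lag else x, y[lag:] if lag else y)
            for lag in range(max_lag + 1)}

# Synthetic pair: y responds to x three time steps later.
x = [math.sin(0.3 * t) for t in range(100)]
y = [math.sin(0.3 * (t - 3)) for t in range(100)]
corrs = lagged_correlations(x, y, 6)
best = max(corrs, key=corrs.get)  # the correlation peaks at lag 3
```

In a climate-model evaluation, the same map computed between an observed and a simulated variable reveals whether the model reproduces the observed lead-lag relationship.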

  19. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods does not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
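The simplest member of the family of entropic measures discussed above is the Shannon entropy of a graph's vertex-label distribution, which already illustrates why labeled measures carry more information than unlabeled ones (an unlabeled graph has a single label and zero entropy). This sketch is illustrative; the paper's own descriptors are more elaborate.

```python
import math
from collections import Counter

# Sketch: Shannon entropy (bits) of the vertex-label distribution of a graph,
# a basic entropic measure for vertex-labeled graphs.

def label_entropy(labels):
    """Shannon entropy of the label multiset of a graph's vertices."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A molecular graph of ethanol (C2H5OH) viewed as a vertex-label multiset:
ethanol = ["C", "C", "O", "H", "H", "H", "H", "H", "H"]
h = label_entropy(ethanol)

# The unlabeled version of the same graph carries no label information:
unlabeled = label_entropy(["v"] * 9)  # 0.0
```

Extending such measures to edge labels (bond types) and to structural partitions is what raises the uniqueness of the resulting indices, as the paper reports.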

  20. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user defined, digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed.

  1. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, with the loss of performance due to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator.

  2. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
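    The likelihood-ratio decision at the core of this approach can be sketched as follows. This toy version treats each acquired score as an independent Gaussian (the paper fits a joint distribution over all 12 scores, so the models and parameters here are illustrative assumptions only):

```python
import math

def gaussian_pdf(mu, sigma):
    """Return a Gaussian density function with the given parameters."""
    def pdf(x):
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))
    return pdf

def log_likelihood_ratio(scores, genuine_pdfs, impostor_pdfs):
    """Sum of per-score log likelihood ratios (independence assumed here
    for simplicity)."""
    return sum(math.log(g(s) / i(s))
               for s, g, i in zip(scores, genuine_pdfs, impostor_pdfs))

def verify(scores, genuine_pdfs, impostor_pdfs, threshold=0.0):
    """Accept as genuine when the log likelihood ratio clears the threshold."""
    llr = log_likelihood_ratio(scores, genuine_pdfs, impostor_pdfs)
    return "genuine" if llr >= threshold else "impostor"

# Hypothetical score models for two acquired biometrics.
genuine = [gaussian_pdf(0.8, 0.1), gaussian_pdf(0.75, 0.1)]
impostor = [gaussian_pdf(0.3, 0.1), gaussian_pdf(0.35, 0.1)]

verify([0.82, 0.78], genuine, impostor)  # "genuine"
verify([0.28, 0.30], genuine, impostor)  # "impostor"
```

    The threshold is what trades FRR against FAR: raising it rejects more genuine residents but admits fewer impostors, which is why the paper optimizes it under an FAR constraint.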

  3. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.

  4. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely across academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc.) and non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also has a different academic stance in his/her preferences about academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers, improving the standard of research and education, and boosting academia-industry collaboration. In particular, as open innovation is increasingly in need of the involvement of university researchers, to establish a successful approach to entice researchers into enterprises' research, companies must comprehend the behavior of university researchers who have multiple complex motivations. The paper explores academic researchers' behaviors by optimizing their utility functions, i.e. the satisfaction obtained from their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most previous research utilized empirical methods to study researchers' motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researchers' behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researchers' behavior. Keywords: Academia-Industry, researcher behavior, Ulrich model's 3C.

  5. Detection, Isolation, and Characterization of Acidophilic Methanotrophs from Sphagnum Mosses ▿ †

    Science.gov (United States)

    Kip, Nardy; Ouyang, Wenjing; van Winden, Julia; Raghoebarsing, Ashna; van Niftrik, Laura; Pol, Arjan; Pan, Yao; Bodrossy, Levente; van Donselaar, Elly G.; Reichart, Gert-Jan; Jetten, Mike S. M.; Sinninghe Damsté, Jaap S.; Op den Camp, Huub J. M.

    2011-01-01

    Sphagnum peatlands are important ecosystems in the methane cycle. Methane-oxidizing bacteria in these ecosystems serve as a methane filter and limit methane emissions. Yet little is known about the diversity and identity of the methanotrophs present in and on Sphagnum mosses of peatlands, and only a few isolates are known. The methanotrophic community in Sphagnum mosses, originating from a Dutch peat bog, was investigated using a pmoA microarray. A high biodiversity of both gamma- and alphaproteobacterial methanotrophs was found. With Sphagnum mosses as the inoculum, alpha- and gammaproteobacterial acidophilic methanotrophs were isolated using established and newly designed media. The 16S rRNA, pmoA, pxmA, and mmoX gene sequences showed that the alphaproteobacterial isolates belonged to the Methylocystis and Methylosinus genera. The Methylosinus species isolated are the first acid-tolerant members of this genus. Of the acidophilic gammaproteobacterial strains isolated, strain M5 was affiliated with the Methylomonas genus, and the other strain, M200, may represent a novel genus, most closely related to the genera Methylosoma and Methylovulum. So far, no acidophilic or acid-tolerant methanotrophs in the Gammaproteobacteria class are known. All strains showed the typical features of either type I or II methanotrophs and are, to the best of our knowledge, the first isolated (acidophilic or acid-tolerant) methanotrophs from Sphagnum mosses. PMID:21724892

  6. Detection, isolation, and characterization of acidophilic methanotrophs from Sphagnum mosses.

    Science.gov (United States)

    Kip, Nardy; Ouyang, Wenjing; van Winden, Julia; Raghoebarsing, Ashna; van Niftrik, Laura; Pol, Arjan; Pan, Yao; Bodrossy, Levente; van Donselaar, Elly G; Reichart, Gert-Jan; Jetten, Mike S M; Damsté, Jaap S Sinninghe; Op den Camp, Huub J M

    2011-08-15

    Sphagnum peatlands are important ecosystems in the methane cycle. Methane-oxidizing bacteria in these ecosystems serve as a methane filter and limit methane emissions. Yet little is known about the diversity and identity of the methanotrophs present in and on Sphagnum mosses of peatlands, and only a few isolates are known. The methanotrophic community in Sphagnum mosses, originating from a Dutch peat bog, was investigated using a pmoA microarray. A high biodiversity of both gamma- and alphaproteobacterial methanotrophs was found. With Sphagnum mosses as the inoculum, alpha- and gammaproteobacterial acidophilic methanotrophs were isolated using established and newly designed media. The 16S rRNA, pmoA, pxmA, and mmoX gene sequences showed that the alphaproteobacterial isolates belonged to the Methylocystis and Methylosinus genera. The Methylosinus species isolated are the first acid-tolerant members of this genus. Of the acidophilic gammaproteobacterial strains isolated, strain M5 was affiliated with the Methylomonas genus, and the other strain, M200, may represent a novel genus, most closely related to the genera Methylosoma and Methylovulum. So far, no acidophilic or acid-tolerant methanotrophs in the Gammaproteobacteria class are known. All strains showed the typical features of either type I or II methanotrophs and are, to the best of our knowledge, the first isolated (acidophilic or acid-tolerant) methanotrophs from Sphagnum mosses.

  7. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers' instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI [17-34 kg/m2 (mean 26.6 ± 5)]. A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16) acceptable correlations were obtained in TBW (r 0.99; p<0.01), ICW (0.92; p<0.01), BCM (0.68; p<0.01), and ECW (0.96; p<0.05), but those with BMI ≥25 (n=23) showed a discrepancy (lower correlations) in TBW (r 0.82; p<0.05), ICW (0.78; p<0.05), BCM (0.52; p<0.05), and ECW (0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.
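    The device comparison above rests on Pearson correlations between paired readings from the two analyzers. A self-contained sketch of that calculation (the paired TBW values below are invented for illustration, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two measurement series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired total-body-water readings (litres) from the
# multi-frequency and single-frequency BIA devices.
mf_device = [38.2, 41.5, 35.9, 44.0, 39.7]
sf_device = [38.5, 41.1, 36.4, 43.6, 40.0]

r = pearson_r(mf_device, sf_device)  # close to 1 when devices agree
```

    A correlation near 1, as in the BMI <25 group, supports interchangeability; the weaker correlations in the BMI ≥25 group are what argue against it there.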

  8. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run in a scheduled task or on demand by way of a config file. It will compute metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation.
As Seedscan is scheduled to run every night, data quality analysts are able to then use the
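    The aggregation of per-metric scores into a single measurable grade could, under the simple assumption of a weighted average, be sketched like this (the metric names and weights are hypothetical, not the DQA's actual configuration):

```python
def station_grade(metric_scores, weights):
    """Aggregate per-metric scores (0-100) into one weighted station grade.

    The DQA website lets users adjust the weighting; these weights are
    illustrative placeholders.
    """
    total = sum(weights[m] for m in metric_scores)
    return sum(metric_scores[m] * weights[m] for m in metric_scores) / total

scores = {"availability": 99.2, "timing_quality": 95.0,
          "gap_count": 88.5, "noise_baseline": 91.0}
weights = {"availability": 2.0, "timing_quality": 1.0,
           "gap_count": 1.0, "noise_baseline": 1.5}

grade = station_grade(scores, weights)  # a single 0-100 station grade
```

    Keeping the weights separate from the scores mirrors the DQA design, where the website can reweight a grade without Seedscan recomputing any metric values.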

  9. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    summarize the information, there is a mechanism of data grouping, which provides summary data: the number of entries and the maximum, minimum, and average values for different groups of records. Results. This technology has been tested in monitoring the requirements for additional professional education services and in defining the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours. The survey was conducted over the course of one month. Conclusion. The proposed technology allows information to be collected in relational form in a short time and then analyzed without the need for programming, with flexible assignment of the operating logic for the form.

  10. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  11. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same.
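    Two of the listed analysis capabilities, seasonal means and the difference between two variables, can be sketched in a few lines (the month-to-season grouping is the standard climatological one; the temperature series below are invented, not CMDA data):

```python
# Month indices (0 = Jan) grouped into the standard climate seasons.
SEASONS = {"DJF": (11, 0, 1), "MAM": (2, 3, 4),
           "JJA": (5, 6, 7), "SON": (8, 9, 10)}

def seasonal_means(monthly):
    """Seasonal means from 12 monthly values (Jan..Dec)."""
    return {s: sum(monthly[m] for m in months) / 3
            for s, months in SEASONS.items()}

def difference(series_a, series_b):
    """Pointwise difference between two variables, e.g. model minus obs."""
    return [a - b for a, b in zip(series_a, series_b)]

# Hypothetical monthly surface temperatures (deg C) for one grid region.
model = [2, 3, 7, 12, 17, 21, 24, 23, 19, 13, 7, 3]
obs   = [1, 2, 6, 11, 16, 22, 25, 24, 18, 12, 6, 2]

means = seasonal_means(model)   # e.g. means["JJA"] is the summer mean
bias  = difference(model, obs)  # per-month model-observation difference
```

    Wrapping small routines like these behind a web interface, rather than asking users to run them locally, is exactly the adoption argument the abstract makes.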

  12. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. "Basically all organisms on our planet generate energy through the Krebs Cycle," explains Mike Flynn, research scientist at NASA's Ames Research Center.
This metabolic process breaks down sugars for energy

  13. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
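    The CRLB is the reciprocal of the Fisher information for the parameter being estimated. As a hedged illustration of the scaling argument (a textbook Gaussian-mean case, not the ABI-specific bound derived in the paper):

```python
def fisher_information_gaussian_mean(sigma, n):
    """Fisher information for the mean of n i.i.d. N(mu, sigma^2) samples."""
    return n / sigma ** 2

def crlb(sigma, n):
    """Cramer-Rao lower bound: no unbiased estimator of the mean can have
    variance below 1 / (Fisher information)."""
    return 1.0 / fisher_information_gaussian_mean(sigma, n)

# Doubling the number of measurements halves the variance bound, which is
# why only a limited set of well-placed analyzer-crystal angular positions
# can already reach the best achievable noise at a fixed total dose.
crlb(2.0, 4)   # 1.0
crlb(2.0, 8)   # 0.5
```

    In the ABI setting the role of `n` is played by the allocation of dose across analyzer-crystal angular positions, and the bound holds regardless of which estimation method produces the parametric images.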

  14. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using visual basic. NET and the ADO. NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.  

  15. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using visual basic. NET and the ADO. NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  16. Practical Limitations of Aerosol Separation by a Tandem Differential Mobility Analyzer-Aerosol Particle Mass Analyzer.

    Science.gov (United States)

    Radney, James G; Zangmeister, Christopher D

    2016-01-01

    A cavity ring-down spectrometer and condensation particle counter were used to investigate the limitations in the separation of singly and multiply charged aerosol particles by a tandem differential mobility analyzer (DMA) and aerosol particle mass analyzer (APM). The impact of particle polydispersity and morphology was investigated using three materials: nearly-monodisperse polystyrene latex nanospheres (PSL); polydisperse, nearly-spherical ammonium sulfate (AS); and polydisperse lacey fractal soot agglomerates. PSL and AS particles were easily resolved as a function of charge. For fresh soot, the presence of multiply charged particles severely affects the isolation of the singly charged particles. In cases where the DMA-APM was unable to fully resolve the singly charged particles of interest, the peak mass deviated by up to 13 %, leading to errors in the mass specific extinction cross section of over 100 %. For measurements of non-spherical particles, non-symmetrical distributions of concentration as a function of mass were a sign of the presence of multiply charged particles. Under these conditions, the effects of multiply charged particles can be reduced by using a second charge neutralizer after the DMA and prior to the APM. Dilution of the aerosol stream serves to decrease the total number concentration of particles and does not remove the contributions of multiply charged particles.
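    The root of the multiple-charge problem is that a DMA classifies by electrical mobility, which scales as n·Cc(d)/d for charge number n, diameter d, and slip correction Cc; a larger particle carrying n charges can therefore pass the same DMA setpoint as a singly charged one. A rough sketch of finding that interloper diameter (the Cunningham slip-correction constants and mean free path are approximate literature values, and all pre-factors common to both particles are dropped):

```python
import math

MEAN_FREE_PATH = 67.3e-9  # of air near room conditions, metres (approximate)

def slip_correction(d):
    """Cunningham slip correction factor with common empirical constants."""
    kn = 2 * MEAN_FREE_PATH / d
    return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def relative_mobility(d, n):
    """Electrical mobility up to constant factors: Z ~ n * Cc(d) / d."""
    return n * slip_correction(d) / d

def equal_mobility_diameter(d1, n):
    """Diameter of an n-charged particle transmitted at the same DMA
    setpoint as a singly charged particle of diameter d1 (bisection)."""
    target = relative_mobility(d1, 1)
    lo, hi = d1, 10 * n * d1
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if relative_mobility(mid, n) > target:
            lo = mid  # still too mobile: the particle must be larger
        else:
            hi = mid
    return 0.5 * (lo + hi)

d1 = 100e-9                          # DMA set to pass 100 nm, +1 particles
d2 = equal_mobility_diameter(d1, 2)  # diameter of the +2 interloper
```

    Because the doubly charged particle is larger, it also carries more mass, which is why unresolved multiple charging skews the APM peak mass and the derived extinction cross sections.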

  17. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  18. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results.

  19. ARC Code TI: Inference Kernel for Open Static Analyzers (IKOS)

    Data.gov (United States)

    National Aeronautics and Space Administration — IKOS is a C++ library designed to facilitate the development of sound static analyzers based on Abstract Interpretation. Specialization of a static analyzer for an...

  20. Lab-on-a-chip Astrobiology Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer will...

  1. Portable Programmable Multifunction Body Fluids Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both simple...

  2. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  3. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of...

  4. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of... introduction into service, and monthly thereafter, check the chemiluminescent oxides of nitrogen analyzer for...

  5. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent oxides of nitrogen analyzer shall receive the initial and periodic calibration described in this section...

  6. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of nitrogen analyzer as described in this section. (b) Initial and Periodic Interference...

  7. [Communication interface and software design for laboratory analyzer].

    Science.gov (United States)

    Wang, Dong; Wang, Yun-Guang; Wang, Chun-Hua; Song, Cheng-Jun; Hu, Wen-Zhong; Chen, Hang-Hua

    2009-07-01

In this paper, the hardware and software interfaces for a laboratory analyzer are analyzed. Using VC++ and multi-threading techniques, real-time, error-free communication between the LIS and the laboratory analyzer is realized. Practice proves that a system based on this technique runs stably and that all data are received accurately and in a timely manner.

  8. 40 CFR 89.322 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Test Equipment Provisions § 89.322 Carbon dioxide analyzer calibration. (a) Prior to its introduction... carbon dioxide analyzer shall be calibrated on all normally used instrument ranges. New calibration...

  9. 40 CFR 86.1324-84 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Exhaust Test Procedures § 86.1324-84 Carbon dioxide analyzer calibration. Prior to its introduction into service and monthly thereafter, the NDIR carbon dioxide analyzer shall be calibrated as follows: (a...

  10. 40 CFR 86.124-78 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Complete Heavy-Duty Vehicles; Test Procedures § 86.124-78 Carbon dioxide analyzer calibration. Prior to its introduction into service and monthly thereafter the NDIR carbon dioxide analyzer shall be calibrated: (a...

  11. 40 CFR 90.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Emission Test Equipment Provisions § 90.320 Carbon dioxide analyzer calibration. (a) Prior to its initial... carbon dioxide analyzer as follows: (1) Follow good engineering practices for instrument start-up and...

  12. 40 CFR 1065.250 - Nondispersive infra-red analyzer.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Nondispersive infra-red analyzer. 1065... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Measurement Instruments Co and Co2 Measurements § 1065.250 Nondispersive infra-red analyzer. (a) Application. Use a nondispersive infra-red (NDIR) analyzer to measure CO...

  13. 21 CFR 868.1975 - Water vapor analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Water vapor analyzer. 868.1975 Section 868.1975...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1975 Water vapor analyzer. (a) Identification. A water vapor analyzer is a device intended to measure the concentration of water vapor in a...

  14. The design of virtual signal analyzer with high cost performance

    Science.gov (United States)

    Wang, Ya-nan; Pei, Gui-ling; Xu, Lei

    2013-03-01

Based on a 16-bit stereo audio CODEC and C#, this paper introduces a virtual signal analyzer. It mainly describes the system's overall structure, hardware design, PC software framework, etc. While reducing costs dramatically, the system also serves as a signal generator, oscilloscope, recorder, spectrum analyzer, time-frequency analyzer, and so on.

  15. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indirect pacemaker generator function analyzer... Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or...

  16. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  17. Direct bonded HOPG - Analyzer support without background source

    Science.gov (United States)

    Groitl, Felix; Kitaura, Hidetoshi; Nishiki, Naomi; Rønnow, Henrik M.

    2018-04-01

A new production process allows direct bonding of HOPG crystals onto Si wafers. This method facilitates the production of analyzer crystals with a support structure without additional, background-inducing fixation material, e.g. glue, wax, or screws, and is especially interesting for the upcoming generation of CAMEA-type multiplexing spectrometers. These instruments allow for a drastic performance increase due to increased angular coverage and multiple energy analysis. Exploiting the transparency of HOPG for cold neutrons, a consecutive arrangement of HOPG analyzer crystals per Q-channel can be achieved. This implies that neutrons travel through up to 10 arrays of analyzer crystals before reaching the analyzer corresponding to their energy. Hence, a careful choice of the fixation method for the analyzer crystals with regard to transparency and background is necessary. Here, we present first results on the diffraction and mechanical performance of direct-bonded analyzer crystals.
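The need for low-loss, low-background fixation follows from simple compounding: with up to 10 analyzer arrays in series per Q-channel, per-array transmission losses multiply. A one-line estimate (the 98 % per-array transmission is purely illustrative, not a measured value from the paper):

```python
def transmission_to_array(t_per_array, k):
    """Fraction of the beam reaching the k-th analyzer array (1-indexed),
    assuming each of the k-1 upstream arrays transmits t_per_array."""
    return t_per_array ** (k - 1)

# with an assumed 98 % transmission per HOPG array, the flux reaching
# the 10th (last) analyzer of a Q-channel:
print(round(transmission_to_array(0.98, 10), 3))
```

Even a few percent of absorption, scattering, or glue background per array would compound across the stack, which is why glue-free direct bonding matters here.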

  18. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.

  19. The development of a new uranium analyzer and its application

    International Nuclear Information System (INIS)

    Hou Shengli; Zeng Weihua; Deng Yonghui; Li Fubin

    2012-01-01

A new uranium analyzer and its application are introduced in this paper. The analyzer consists of a β-ray detector, a γ-ray detector, signal-processing circuits, and software for data collection and processing. The uranium content is measured by simultaneously counting the β-rays and γ-rays emitted by the sample. The analyzer is easy to operate and yields results quickly. Experiments show that the deviation between this analyzer and chemical analysis meets the demands of uranium mines for the detection of uranium content. (authors)

  20. LG-2 Scintrex manual. Fluorescence analyzer

    International Nuclear Information System (INIS)

    Pirelli, H.

    1987-01-01

The Scintrex Fluorescence Analyzer LG-2 selectively detects the presence of certain fluorescent minerals through induced UV photoluminescence and provides quantitative information on their distribution.

  1. Split shielding plates in electrostatic sector analyzers and Wien filters

    Science.gov (United States)

    Yavor, Mikhail I.

    1995-09-01

An analytical method is developed for calculating the influence of split shielding plates in inhomogeneous electrostatic sector analyzers and Wien filters on their electron-optical properties. The method considerably simplifies the choice of the mode of operation of the shielding plates needed to achieve a required electrostatic field distribution inside the analyzer.

  2. 21 CFR 868.1120 - Indwelling blood oxyhemoglobin concentration analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indwelling blood oxyhemoglobin concentration... Indwelling blood oxyhemoglobin concentration analyzer. (a) Identification. An indwelling blood oxyhemoglobin concentration analyzer is a photoelectric device used to measure, in vivo, the oxygen-carrying capacity of...

  3. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... instructions or good engineering practice for instrument startup and basic operating adjustment using the... thereafter the FID hydrocarbon analyzer shall be calibrated on all normally used instrument ranges, and, if... measured by a gas flow meter with an accuracy of ±2 percent. ER06OC93.184 (2) The bag sample is analyzed...

  4. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... any flow rate into the reaction chamber. This includes, but is not limited to, sample capillary, ozone... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new...

  5. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization, and evaluation of KMtool. (author)

  6. 40 CFR 86.524-78 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Regulations for 1978 and Later New Motorcycles; Test Procedures § 86.524-78 Carbon dioxide analyzer calibration. (a) Prior to its introduction into service and monthly thereafter the NDIR carbon dioxide...

  7. 40 CFR 91.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Provisions § 91.320 Carbon dioxide analyzer calibration. (a) Prior to its introduction into service, and monthly thereafter, or within one month prior to the certification test, calibrate the NDIR carbon dioxide...

  8. 40 CFR 86.1524 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration. 86.1524 Section 86.1524 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Test Procedures § 86.1524 Carbon dioxide analyzer calibration. (a) The calibration requirements for the...

  9. Analyzing FCS Professionals in Higher Education: A Case Study

    Science.gov (United States)

    Hall, Scott S.; Harden, Amy; Pucciarelli, Deanna L.

    2016-01-01

    A national study of family and consumer sciences (FCS) professionals in higher education was analyzed as a case study to illustrate procedures useful for investigating issues related to FCS. The authors analyzed response rates of more than 1,900 FCS faculty and administrators by comparing those invited to participate and the 345 individuals who…

  10. Analyzing atomic noise with a consumer sound card

    Science.gov (United States)

    Schulte, Carsten H. H.; Müller, Georg M.; Horn, Hauke; Hübner, Jens; Oestreich, Michael

    2012-03-01

    We discuss an advanced undergraduate project on spin noise spectroscopy of atomic rubidium vapor. The spin noise is digitized using a consumer sound card and analyzed by a fast Fourier transform. Students gain competence in digital data acquisition and processing, and the idea of analyzing a noise signal is emphasized.
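The analysis chain the students implement (digitize a noise record, segment it, FFT each segment, and average the power spectra) can be sketched as follows; the sample rate and the synthetic 5 kHz "resonance" are assumptions for illustration, not values from the paper:

```python
import numpy as np

def averaged_power_spectrum(samples, fs, n_fft=4096):
    """Welch-style averaging: split the record into segments, FFT each,
    and average the power spectra so a noise resonance rises out of the floor."""
    n_seg = len(samples) // n_fft
    segs = samples[: n_seg * n_fft].reshape(n_seg, n_fft)
    psd = (np.abs(np.fft.rfft(segs, axis=1)) ** 2).mean(axis=0)
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)
    return freqs, psd

# synthetic stand-in for the digitized signal: a weak 5 kHz "resonance"
# buried in white noise, sampled at a typical sound-card rate
rng = np.random.default_rng(0)
fs = 44100.0
t = np.arange(int(2 * fs)) / fs                    # 2 s record
record = rng.normal(0.0, 1.0, t.size) + 0.2 * np.sin(2 * np.pi * 5000.0 * t)
freqs, psd = averaged_power_spectrum(record, fs)
peak = freqs[np.argmax(psd)]
print(round(peak))                                  # close to 5 kHz
```

Averaging many segment spectra is what makes a signal far below the single-shot noise floor visible, which is the central idea of the project.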

  11. The Pep Talk: How to Analyze Political Language.

    Science.gov (United States)

    Rank, Hugh

    Recognizing the proliferation of persuasive language in advertising and in politics, this guide explains how to analyze political language for its persuasive techniques and why it is important to do so. The first chapter of the book provides a rationale for analyzing persuasion, and an overview of the book. The remaining chapters explore various…

  12. Validation of ESR analyzer using Westergren ESR method.

    Science.gov (United States)

    Sikka, Meera; Tandon, Rajesh; Rusia, Usha; Madan, Nishi

    2007-07-01

Erythrocyte sedimentation rate (ESR) is one of the most frequently ordered laboratory tests. ESR analyzers were developed to provide a quick and efficient measure of ESR. We compared the results of ESR obtained by an ESR analyzer with those by the Westergren method in a group of 75 patients. Linear regression analysis showed a good correlation between the two results (r = 0.818, p < 0.01). The intraclass correlation was 0.82. The analyzer method had the advantages of safety, decreased technician time, and improved patient care by providing quick results.
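The two agreement statistics quoted (Pearson correlation and the intraclass correlation) can be reproduced on any set of paired readings; the values below are hypothetical illustrative data, not the study's 75 patients:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between paired results of two methods."""
    return float(np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1])

def icc_absolute(x, y):
    """One-way intraclass correlation for two measurements per subject
    (ICC(1,1)): between-subject variance relative to total variance."""
    pairs = np.stack([np.asarray(x, float), np.asarray(y, float)], axis=1)
    n = len(pairs)
    subj_means = pairs.mean(axis=1)
    msb = 2.0 * np.sum((subj_means - pairs.mean()) ** 2) / (n - 1)  # k = 2 reps
    msw = np.sum((pairs - subj_means[:, None]) ** 2) / n
    return float((msb - msw) / (msb + msw))

# hypothetical paired ESR readings in mm/h (NOT the study's data)
westergren = [5, 12, 20, 35, 48, 60, 75, 90]
analyzer = [7, 10, 24, 33, 50, 55, 80, 86]
r = pearson_r(westergren, analyzer)
icc = icc_absolute(westergren, analyzer)
print(round(r, 3), round(icc, 3))
```

Note that Pearson r measures linear association while the ICC also penalizes systematic offsets between the methods, which is why validation studies report both.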

  13. Emergency response training with the BNL plant analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, H.S.; Guppy, J.G.; Mallen, A.N.; Wulff, W.

    1987-01-01

    Presented is the experience in the use of the BNL Plant Analyzer for NRC emergency response training to simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for the emergency response training are summarized. A closed-loop simulation of all the key systems of a power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster than real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of the emergency response training.

  14. Note: Portable rare-earth element analyzer using pyroelectric crystal.

    Science.gov (United States)

    Imashuku, Susumu; Fuyuno, Naoto; Hanasaki, Kohei; Kawai, Jun

    2013-12-01

We report a portable rare-earth element analyzer utilizing the cathodoluminescence (CL) phenomenon, with a palm-top-sized chamber housing a pyroelectric-crystal electron source and the sample stage. It is the smallest CL-based rare-earth element analyzer reported so far. The analyzer detected the rare-earth elements Dy, Tb, Er, and Sm at ppm levels in zircon, which were not detected by scanning electron microscopy-energy dispersive X-ray spectroscopy analysis. We also performed elemental mapping of rare-earth elements by capturing a CL image with a CCD camera.

  15. The Rhetoric of Satire: Analyzing in Freshman English.

    Science.gov (United States)

    Proctor, Betty Jane

    1982-01-01

    Presents a series of exercises designed to provide freshman composition students with a base for analyzing works rhetorically, to point out how language can be used persuasively, and to illustrate how satire functions. (FL)

  16. The Photo-Pneumatic CO2 Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  17. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  18. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

This paper analyzes the root causes of safety-related software faults. Faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  19. Triple Isotope Water Analyzer for Extraplanetary Studies, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  20. Lab-on-a-chip astrobiology analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an astrobiology analyzer to measure chemical signatures of life in extraterrestrial settings. The...

  1. Analyzing Spread of Influence in Social Networks for Transportation Application.

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  2. Analyzing Spread of Influence in Social Networks for Transportation Applications

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  3. Analyzed method for calculating the distribution of electrostatic field

    International Nuclear Information System (INIS)

    Lai, W.

    1981-01-01

An analytical method for calculating the distribution of the electrostatic field under any given axial gradient in tandem accelerators is described. This method achieves satisfactory accuracy compared with the results of numerical calculation.

  4. A multichannel analyzer software system realized by C Language

    International Nuclear Information System (INIS)

    Zheng Lifang; Xue Liudong

    1995-01-01

The features of a multichannel analyzer software system implemented in the C language are introduced. Owing to its good performance, the software has promising prospects for application. The functions of the software are also described.

  5. Automated Real-Time Clearance Analyzer (ARCA), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The Automated Real-Time Clearance Analyzer (ARCA) addresses the future safety need for Real-Time System-Wide Safety Assurance (RSSA) in aviation and progressively...

  6. Mini Total Organic Carbon Analyzer (miniTOCA)

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this development is to create a prototype hand-held, 1 to 2 liter size battery-powered Total Organic Carbon Analyzer (TOCA). The majority of...

  7. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    International Nuclear Information System (INIS)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V.

    1994-01-01

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented

  8. Energy range extension for an electrostatic-analyzer homogenizer system

    International Nuclear Information System (INIS)

    Flynn, D.S.; Bilpuch, E.G.; Purser, F.O.; Newson, H.W.; Seagondollar, L.W.; Mitchell, G.E.

    1976-01-01

The energy range of an electrostatic-analyzer homogenizer system has been extended by the introduction of a molecular dissociator which operates on the diatomic hydrogen ions from a Van de Graaff accelerator. Energy resolutions of 300-400 eV with solid targets have been obtained routinely up to proton energies of 4.5 MeV with an electrostatic analyzer originally limited to measuring proton energies below 2.7 MeV. Improvements to the Van de Graaff accelerator are expected to permit this system to operate as high as 5-6 MeV. Since feedback for this homogenizer is obtained by modulating the voltage on the dissociator, a magnetic analyzer could be substituted for the electrostatic analyzer. (Auth.)

  9. Mini Total Organic Carbon Analyzer (miniTOCA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Total Organic Carbon (TOC) analyzers function by converting (oxidizing) all organic compounds (contaminants) in the water sample to carbon dioxide gas (CO2), then...

  10. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  11. Analyzing Properties of Stochastic Business Processes By Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    This chapter presents an approach to precise formal analysis of business processes with stochastic properties. The method presented here allows for both qualitative and quantitative properties to be individually analyzed at design time without requiring a full specification. This provides...

  12. Airspace Analyzer for Assessing Airspace Directional Permeability, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level permeability...

  13. GIOTTO JOHNSTONE PARTICLE ANALYZER MERGED DATA V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset contains results from the Implanted Ion Sensor (IIS) 4DH mode and the Fast Ion Sensor SW and HAR modes of the Three- Dimensional Particle Analyzer (JPA)...

  14. Mars & Multi-Planetary Electrical Environment Spectrum Analyzer

    Data.gov (United States)

    National Aeronautics and Space Administration — In FY14, we will develop and deliver a Multi-Planetary Electrical Environment Plasma Spectrum Analyzer. This instrument is a DC and AC electric field sensor suite...

  15. A traffic analyzer for multiple SpaceWire links

    Science.gov (United States)

    Liu, Scige J.; Giusi, Giovanni; Di Giorgio, Anna M.; Vertolli, Nello; Galli, Emanuele; Biondi, David; Farina, Maria; Pezzuto, Stefano; Spinoglio, Luigi

    2014-07-01

Modern space missions are becoming increasingly complex: the interconnection of the units in a satellite is now a network of terminals linked together through routers, where devices with different levels of automation and intelligence share the same data network. The traceability of network transactions is performed mostly at terminal level through log analysis, and hence it is difficult to verify in real time the reliability of the interconnections and the interchange protocols. To improve and ease traffic analysis in a SpaceWire network we implemented a low-level link analyzer, with the specific goal of simplifying the integration and test phases in the development of space instrumentation. The traffic analyzer collects signals coming from pod probes connected in series on the links of interest between two SpaceWire terminals. With respect to standard traffic analyzers, the design of this new tool includes the possibility to internally reshape the LVDS signal. This improvement increases the robustness of the analyzer against environmental noise effects and guarantees a deterministic delay on all analyzed signals. The analyzer core is implemented on a Xilinx FPGA, programmed to decode the bidirectional LVDS signals at link and network level. The core then packetizes protocol characters into homogeneous sets of time-ordered events. The analyzer provides time-tagging functionality for each character set, with a precision down to the FPGA clock, i.e. about 20 ns in the adopted HW environment. The use of a common time reference for each character stream allows synchronous performance measurements. The collected information is then routed to an external computer for quick analysis via a high-speed USB 2.0 connection. With this analyzer it is possible to verify the link performance in terms of induced delays in the transmitted signals. A case study focused on the analysis of the Time-Code synchronization in the presence of a SpaceWire Router is
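Once characters captured by the two probes are matched, converting FPGA time-tags into link-induced delays is a subtraction and a scale factor. A toy sketch with an assumed 50 MHz FPGA clock (which would give the ~20 ns tick quoted above; the character names and tick values are invented for illustration):

```python
# toy time-tag records: (character, FPGA tick) captured by probes on the
# ingress and egress links of a router; the 50 MHz clock is an assumption
F_CLK_HZ = 50e6
TICK_NS = 1e9 / F_CLK_HZ                 # 20 ns per tick

ingress = [("TC_0x05", 1000), ("EOP", 1400)]
egress = [("TC_0x05", 1023), ("EOP", 1425)]

delays_ns = [(t_out - t_in) * TICK_NS
             for (c_in, t_in), (c_out, t_out) in zip(ingress, egress)]
print(delays_ns)                         # router-induced latency per character
```

The common time reference for all streams is what makes this subtraction meaningful; with independent clocks per probe the delays would drift.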

  16. Methods of analyzing microsize particulate aerosols and contaminants.

    Science.gov (United States)

    Blanchard, M. B.; Farlow, N. H.; Ferry, G. V.

    1971-01-01

    Methods for analyzing atmospheric aerosols and particulate contaminants from natural and industrial sources are briefly described. Their development was spurred by interest in cosmic dust investigations. These methods make it possible to analyze individual particles ranging in size from 0.01 to 1000 microns for elemental and mineralogical content. The described methods include transmission electron microscopy, scanning electron microscopy, electron probe microanalysis, X-ray diffraction, optical mineralogy, and density measurement.

  17. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

The purpose of our work is to analyze travel websites, more precisely, whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism websites for the Romanian market have the features we found on similar websites from France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  18. Principle and application of portable NIR tea drinks analyzer

    Science.gov (United States)

    Jiang, Liyi; Chen, Huacai; Liu, Fuli

    2009-11-01

Tea polyphenols (Tp) and free amino acids (Aa) are the most important quality constituents of tea drinks. Due to the high number of samples to be analyzed, new analytical techniques providing fast and reliable quality data are essential. Therefore, a portable near-infrared spectroscopy (NIR) analyzer was developed for real-time, continuous, and quantitative determination of Tp and Aa in tea drinks. The portable NIR tea drinks analyzer is composed of a lamphouse, a temperature-controlled sample chamber, an optical fiber, and an InGaAs-array mini grating spectrometer. The analyzer is compact, lightweight, and robust, with no movable elements. Software with functions for spectrum acquisition, model establishment, method selection, and real-time analysis was also developed for the analyzer. Using partial least squares (PLS) regression, calibration models for the quantification of Tp and Aa were established with reference to the GB methods (the national standard methods). The root mean square errors of cross validation (RMSECV) of the models for Tp and Aa calibration were 0.059 mg/mL and 0.005 mg/mL, and the correlation coefficients (R²) were 0.99 and 0.98, respectively. The relative standard deviations (RSD) of ten repeated tests were 3.17% and 4.15%. This suggests that the portable NIR tea drinks analyzer could be a fast and reliable alternative for tea drink quality testing.
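The calibration procedure described (PLS regression scored by RMSECV) can be sketched with a minimal NIPALS PLS1 on synthetic spectra; everything below, from the Gaussian band shape to the noise level, is an assumption for illustration, not the analyzer's actual model:

```python
import numpy as np

def pls1_train(X, y, n_comp):
    """Minimal PLS1 (NIPALS). Returns centering terms and the regression
    vector B so that y_hat = y_mean + (X_new - x_mean) @ B."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)
        t = Xr @ w
        tt = t @ t
        p = Xr.T @ t / tt
        q = yr @ t / tt
        Xr = Xr - np.outer(t, p)      # deflate X
        yr = yr - q * t               # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)
    return x_mean, y_mean, B

def rmsecv(X, y, n_comp, k=10):
    """Root mean square error of k-fold cross validation."""
    idx = np.arange(len(y))
    err = np.empty(len(y))
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        xm, ym, B = pls1_train(X[train], y[train], n_comp)
        err[fold] = ym + (X[fold] - xm) @ B - y[fold]
    return float(np.sqrt(np.mean(err ** 2)))

# synthetic "spectra": one Gaussian analyte band plus measurement noise
rng = np.random.default_rng(7)
conc = rng.uniform(0.1, 1.0, 60)                     # hypothetical Tp, mg/mL
band = np.exp(-0.5 * ((np.arange(120) - 60) / 8.0) ** 2)
X = conc[:, None] * band + rng.normal(0.0, 0.01, (60, 120))
cv_err = rmsecv(X, conc, n_comp=2)
print(cv_err < 0.05)
```

RMSECV is the natural figure of merit here because it estimates prediction error on samples the model has not seen, exactly what matters for routine quality testing.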

  19. Whole community genome amplification (WCGA) leads to compositional bias in methane oxidizing communities as assessed by pmoA based microarray analyses and QPCR

    NARCIS (Netherlands)

    Bodelier, P.L.E.; Kamst, M.; Meima-Franke, M.; Stralis-Pavese, N.; Bodrossy, L.

    2009-01-01

    Whole-genome amplification (WGA) using multiple displacement amplification (MDA) has recently been introduced to the field of environmental microbiology. The amplification of single-cell genomes or whole-community metagenomes decreases the minimum amount of DNA needed for subsequent molecular

  20. 40 CFR 1065.309 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers...

    Science.gov (United States)

    2010-07-01

    ...). Measurement of Engine Parameters and Ambient Conditions ... for system response and updating-recording frequency for continuous gas analyzers that output a single... change the system response. (b) Measurement principles. This procedure verifies that the updating and...

  1. Test of a two-dimensional neutron spin analyzer

    International Nuclear Information System (INIS)

    Falus, Peter; Vorobiev, Alexei; Krist, Thomas

    2006-01-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at the Institute Laue Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline, followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and the small-angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.

  2. Test of a two-dimensional neutron spin analyzer

    Science.gov (United States)

    Falus, Péter; Vorobiev, Alexei; Krist, Thomas

    2006-11-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at the Institute Laue Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline, followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and the small-angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.
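The transmission and flipping-ratio figures of merit above are simple ratios of the intensities recorded in the two spin states. A small illustrative sketch (hypothetical helper names, not the beamline's analysis code):

```python
def flipping_ratio(i_up, i_down):
    """Intensity with the analyzer passing the wanted spin state,
    divided by the intensity with the unwanted state."""
    return i_up / i_down

def polarization(i_up, i_down):
    """Beam polarization from the same two intensities;
    related to the flipping ratio R by P = (R - 1) / (R + 1)."""
    return (i_up - i_down) / (i_up + i_down)

def area_fraction(pixel_map, threshold):
    """Fraction of detector pixels whose figure of merit exceeds a
    threshold, e.g. transmission > 0.72 or flipping ratio > 47."""
    flat = [v for row in pixel_map for v in row]
    return sum(v > threshold for v in flat) / len(flat)
```

Statements such as "flipping ratio more than 47 for 60% of the detector area" correspond to evaluating `area_fraction` over the per-pixel flipping-ratio map.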

  3. A Vector Network Analyzer Based on Pulse Generators

    Directory of Open Access Journals (Sweden)

    B. Schulte

    2005-01-01

    Full Text Available A fast four-channel network analyzer is introduced to measure S-parameters in a frequency range from 10 MHz to 3 GHz. The signal generation for this kind of analyzer is based on pulse generators, which are realized with bipolar transistors. The output signal of the transistor is differentiated, and two short pulses, a slow and a fast one, with opposite polarities are generated. The slow pulse is suppressed with a clipping network. Thus the generation of very short electrical pulses with a duration of about 100 ps is possible. The structure of the network analyzer is similar to that of a conventional four-channel network analyzer. All four pulses, which contain the high-frequency information of the device under test, are evaluated after digitization of the intermediate frequencies. These intermediate frequencies are generated with sampling mixers. The recorded data are evaluated with a special analysis technique based on a Fourier transformation. The calibration techniques used are the same as for conventional four-channel network analyzers, so no new calibration techniques need to be developed.
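The Fourier-based evaluation of the digitized intermediate frequencies can be illustrated with a single-bin DFT: the complex ratio of two channels at the IF bin gives a magnitude and phase, which is the raw ingredient of an S-parameter. This is an illustrative sketch under that assumption, not the authors' algorithm:

```python
import cmath
import math

def dft_bin(samples, k):
    """Single-bin DFT: complex amplitude of the k-th frequency bin."""
    n = len(samples)
    return sum(x * cmath.exp(-2j * math.pi * k * i / n)
               for i, x in enumerate(samples))

def channel_ratio(ref, dut, k):
    """Complex ratio of the device-under-test channel to the reference
    channel at the intermediate frequency occupying bin k; its magnitude
    and phase correspond to an S-parameter reading."""
    return dft_bin(dut, k) / dft_bin(ref, k)
```

With the IF exactly on a bin, the ratio recovers the relative amplitude and phase shift between the two channels directly.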

  4. Health Services Cost Analyzing in Tabriz Health Centers 2008

    Directory of Open Access Journals (Sweden)

    Massumeh gholizadeh

    2015-08-01

    Background and objectives: Health services cost analysis is an important management tool for evidence-based decision making in the health system. This study was conducted with the purpose of analyzing costs and identifying the proportion of different factors in the total cost of health services provided in urban health centers in Tabriz. Material and Methods: This was a descriptive and analytic study. The Activity-Based Costing (ABC) method was used for cost analysis. This cross-sectional survey analyzed and identified the proportion of different factors in the total cost of health services provided in Tabriz urban health centers. The statistical population of this study comprised the urban community health centers in Tabriz. A multi-stage sampling method was used to collect data, and Excel software was used for data analysis. The results were described with tables and graphs. Results: The study results showed the portion of different factors in various health services. Human resources accounted for 58% of the costs of health services in Tabriz urban health centers, physical space for 8%, and medical equipment for 1.3%. Conclusion: Since human resources accounted for the highest portion of health service costs in Tabriz urban health centers, balancing workload with staff numbers, institutionalizing performance-based management and using multidisciplinary staff may reduce the costs of services.

  5. Applications of Electrostatic Lenses to Electron Gun and Energy Analyzers

    International Nuclear Information System (INIS)

    Sise, O.

    2004-01-01

    Focal properties and geometries are given for several types of electrostatic lens systems commonly needed in electron impact studies. One type is an electron gun, which focuses electrons over a wide range of energy onto a fixed point, such as a target; the other type is an analyzer system, which focuses scattered electrons of variable energy onto a fixed position, such as the entrance plane of an analyzer. There are many different types and geometries of these lenses for controlling and focusing electron beams. In this presentation we discuss the criteria used for the design of the electrostatic lenses associated with the electron gun and energy analyzers, and determine the fundamental relationships between the operation and behaviour of multi-element electrostatic lenses containing five, six and seven elements. The focusing of the electron beam was achieved by applying suitable voltages to the series of lens elements. The design of the lens system for the electron gun was based on the requirement that the beam at the target have a small spot size and zero beam angle, that is, afocal mode. For the energy analyzer system we considered the entrance of the hemispherical analyzer, which determines the energy of the electron beam, and discussed the focusing condition of this lens system.

  6. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    This research study focused on the development of software that helps students design, write, validate and run their pseudocode in a semi-Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer (lexer), a syntax analyzer (parser) using a recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters forming lexemes. The lexemes are analyzed by the lexer, which matches patterns for valid tokens and passes them to the syntax analyzer. The syntax analyzer takes those valid tokens and builds meaningful commands, using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on the grammar rules created by the researcher, expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and evaluates it to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO) 9126 software quality standards. The overall results of the evaluation, both white-box and black-box, were described as "Excellent" in terms of functionality, reliability, usability, efficiency, maintainability and portability.
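The lexer → parser → interpreter pipeline described here can be sketched in miniature. The toy grammar below (integers, +, *, parentheses) is an assumption for illustration, not the study's pseudocode grammar:

```python
import re

def lex(src):
    """Lexical analyzer: integers and the symbols + * ( ) become tokens."""
    return re.findall(r"\d+|[+*()]", src)

class Parser:
    """Recursive descent parser for the grammar
       expr   -> term ('+' term)*
       term   -> factor ('*' factor)*
       factor -> INT | '(' expr ')'
    building a nested-tuple abstract syntax tree."""

    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, tok):
        if self.peek() != tok:
            raise SyntaxError(f"expected {tok!r}, got {self.peek()!r}")
        self.pos += 1

    def expr(self):
        node = self.term()
        while self.peek() == "+":
            self.eat("+")
            node = ("+", node, self.term())
        return node

    def term(self):
        node = self.factor()
        while self.peek() == "*":
            self.eat("*")
            node = ("*", node, self.factor())
        return node

    def factor(self):
        if self.peek() == "(":
            self.eat("(")
            node = self.expr()
            self.eat(")")
            return node
        tok = self.peek()
        self.pos += 1
        return int(tok)

def interpret(node):
    """Interpreter: evaluate the abstract syntax tree recursively."""
    if isinstance(node, int):
        return node
    op, left, right = node
    return (interpret(left) + interpret(right) if op == "+"
            else interpret(left) * interpret(right))
```

Each grammar rule maps to one method, which is what makes recursive descent attractive for a teaching tool: `Parser(lex("2+3*4")).expr()` yields the tree `("+", 2, ("*", 3, 4))`, and `interpret` walks it.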

  7. A nuclear facility Security Analyzer written in PROLOG

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1987-08-01

    The Security Analyzer project was undertaken to use the Prolog "artificial intelligence" programming language and Entity-Relationship database construction techniques to produce an intelligent database computer program capable of analyzing the effectiveness of a nuclear facility's security systems. The Security Analyzer program can search through a facility to find all possible surreptitious entry paths that meet various user-selected time and detection probability criteria. The program can also respond to user-formulated queries concerning the database information. The intelligent database approach allows the program to perform a more comprehensive path search than other programs that only find a single "optimal" path. The program is also more flexible in that the database, once constructed, can be interrogated and used for purposes independent of the searching function.

  8. A nuclear facility Security Analyzer written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1987-01-01

    The Security Analyzer project was undertaken to use the Prolog artificial intelligence programming language and Entity-Relationship database construction techniques to produce an intelligent database computer program capable of analyzing the effectiveness of a nuclear facility's security systems. The Security Analyzer program can search through a facility to find all possible surreptitious entry paths that meet various user-selected time and detection probability criteria. The program can also respond to user-formulated queries concerning the database information. The intelligent database approach allows the program to perform a more comprehensive path search than other programs that only find a single optimal path. The program is also more flexible in that the database, once constructed, can be interrogated and used for purposes independent of the searching function.

  9. Development of a low energy neutral analyzer (LENA). Final report

    International Nuclear Information System (INIS)

    Curtis, C.C.; Fan, C.Y.; Hsieh, K.C.; McCullen, J.D.

    1986-05-01

    A low energy neutral particle analyzer (LENA) has been developed at the University of Arizona to detect particles originating in the edge plasma of fusion reactors. LENA was designed to perform energy analysis and measure flux levels of neutrals having energies between 5 and 50 eV (with possible extension to 500 eV neutrals), with 1 to 10 ms time resolution. The instrument uses hot filaments to produce a 10 mA diffusion electron beam which ionizes incoming neutrals in a nearly field-free region, so that their velocity distribution is nearly undisturbed. The resultant ions are energy-analyzed in a hyperbolic electrostatic analyzer and detected by an MCP detector. LENA has been installed and operated on the ALCATOR C tokamak at the MIT Plasma Fusion Center. Results to date are discussed. At present, LENA exhibits excessive sensitivity to the extremely high ultraviolet photon flux emanating from the plasma. Measures to correct this are suggested.

  10. Coherent error study in a retarding field energy analyzer

    International Nuclear Information System (INIS)

    Cui, Y.; Zou, Y.; Reiser, M.; Kishek, R.A.; Haber, I.; Bernal, S.; O'Shea, P.G.

    2005-01-01

    A novel cylindrical retarding electrostatic field energy analyzer for low-energy beams has been designed, simulated, and tested with electron beams of several keV, in which space-charge effects play an important role. A cylindrical focusing electrode is used to overcome the beam expansion inside the device due to space-charge forces, beam emittance, etc. In this paper, we present the coherent error analysis for this energy analyzer using the beam envelope equation, including space-charge and emittance effects. The study shows that this energy analyzer can achieve very high resolution (with a relative error of around 10^-5) if the coherent errors are removed by applying proper focusing voltages. The theoretical analysis is compared with experimental results.

  11. Correction of surface aberration in strain scanning method with analyzer

    International Nuclear Information System (INIS)

    Shobu, Takahisa; Mizuki, Junichiro; Suzuki, Kenji; Akiniwa, Yoshiaki; Tanaka, Keisuke

    2006-01-01

    When a gauge volume sinks below the specimen surface, the diffraction angle shifts, so the surface aberration must be corrected. For an annealed specimen of S45C, the shift in the diffraction angle was investigated using a strain scanning method with a Ge (111) analyzer. The shift was caused by the difference in centroid between the geometric and the instrumental gauge volumes. This difference is explained by the following factors: 1) the change in the gauge volume due to the divergence of the analyzer; 2) the X-ray penetration depth; 3) the gap of the centre line between the double receiving slits due to mis-setting of the analyzer. A correction method that accounts for these factors was proposed. For shot-peened specimens of S45C, the diffraction angles were measured and corrected by our method. The distribution of the residual stress agreed with that obtained by the removal method. (author)

  12. The Common Technique for Analyzing the Financial Results Report

    Directory of Open Access Journals (Sweden)

    Pasternak Maria M.

    2017-04-01

    The article is aimed at generalizing the theoretical approaches to the structure and elements of a technique for analyzing the Financial results report (Cumulative income report) and providing suggestions for its improvement. The current methods have been analyzed, and the relevance of applying a common technique for such analysis has been substantiated. A common technique for analyzing the Financial results report has been proposed, which includes definition of the objectives and tasks of the analysis, its subjects and objects, and the sources of its information. The stages of such an analysis were identified and described. The findings of the article can be used to theoretically substantiate and practically develop a technique for analyzing the Financial results report in the branches of the Ukrainian economy.

  13. Breath acetone analyzer: diagnostic tool to monitor dietary fat loss.

    Science.gov (United States)

    Kundu, S K; Bruzek, J A; Nair, R; Judilla, A M

    1993-01-01

    Acetone, a metabolite of fat catabolism, is produced in excessive amounts in subjects on restricted-calorie weight-loss programs. Breath acetone measurements are useful as a motivational tool during dieting and for monitoring the effectiveness of weight-loss programs. We have developed a simple, easy-to-read method that quantifies the amount of acetone in a defined volume of exhaled breath after trapping the sample in a gas-analyzer column. The concentration of acetone, as measured by the length of a blue color zone in the analyzer column, correlates with results obtained by gas chromatography. Using the breath acetone analyzer to quantify breath acetone concentrations of dieting subjects, we established a correlation between breath acetone concentration and rate of fat loss (slope 52.2 nmol/L per gram per day, intercept 15.3 nmol/L, n = 78, r = 0.81). We also discussed the possibility of using breath acetone in diabetes management.
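The reported regression can be inverted to turn a breath acetone reading into a rough fat-loss estimate. A minimal sketch using the slope and intercept quoted above (with r = 0.81, this is an estimate, not a measurement):

```python
SLOPE = 52.2      # nmol/L of breath acetone per g/day of fat loss
INTERCEPT = 15.3  # nmol/L of breath acetone at zero fat loss

def fat_loss_g_per_day(acetone_nmol_per_l):
    """Invert the paper's regression
       acetone = SLOPE * fat_loss + INTERCEPT
    to estimate the fat-loss rate from a breath acetone reading."""
    return (acetone_nmol_per_l - INTERCEPT) / SLOPE
```

For example, a reading of about 537 nmol/L would correspond to roughly 10 g/day of fat loss under this fit.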

  14. A cascading failure model for analyzing railway accident causation

    Science.gov (United States)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing the railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model can describe the cascading process of the railway accident more accurately than the previous models, and can quantitatively analyze the sensitivities and the influence of the causes. In conclusion, this model can assist us to reveal the latent rules of accident causation to reduce the occurrence of railway accidents.
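The load-redistribution rule described above — a failed node's load is passed to its neighbors in proportion to the strength of the causal links, and a neighbor fails when its load exceeds a threshold — can be sketched as follows. The graph, weights and thresholds are hypothetical; the paper's exact redistribution formula and critical threshold parameter are not reproduced here:

```python
def cascade(adj, loads, capacity, start):
    """Propagate a failure from `start`: the failed node's load is
    redistributed to its not-yet-failed neighbors in proportion to the
    causal-link strengths in `adj` ({node: {neighbor: strength}});
    a neighbor whose load then exceeds its capacity fails in turn.
    Returns the set of failed nodes and the final loads."""
    loads = dict(loads)
    failed = {start}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        nbrs = {m: w for m, w in adj.get(node, {}).items() if m not in failed}
        total = sum(nbrs.values())
        if total == 0:
            continue  # nowhere to shed load
        for m, w in nbrs.items():
            loads[m] += loads[node] * w / total
            if loads[m] > capacity[m]:
                failed.add(m)
                frontier.append(m)
    return failed, loads
```

Sweeping the capacities (the load parameter) in such a model is one way a critical threshold like the paper's can be located: below it, a single failure cascades; above it, the cascade dies out.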

  15. Evaluation of performance of veterinary in-clinic hematology analyzers.

    Science.gov (United States)

    Rishniw, Mark; Pion, Paul D

    2016-12-01

    A previous study provided information regarding the quality of in-clinic veterinary biochemistry testing. However, no similar studies for in-clinic veterinary hematology testing have been conducted. The objective of this study was to assess the quality of hematology testing in veterinary in-clinic laboratories using results obtained from testing 3 levels of canine EDTA blood samples. Clinicians prepared blood samples to achieve measurand concentrations within, below, and above their RIs and evaluated the samples in triplicate using their in-clinic analyzers. Quality was assessed by comparison of calculated total error with quality requirements, determination of sigma metrics, use of a quality goal index, and agreement between in-clinic and reference laboratory instruments. Suitability for statistical quality control was determined using adaptations from the computerized program, EZRules3. Evaluation of 10 veterinary in-clinic hematology analyzers showed that these instruments often fail to meet quality requirements. At least 60% of analyzers reasonably determined RBC, WBC, HCT, and HGB, when assessed by most quality goal criteria; platelets were less reliably measured, with 80% deemed suitable for low platelet counts, but only 30% for high platelet counts, and automated differential leukocyte counts were generally considered unsuitable for clinical use with fewer than 40% of analyzers meeting the least stringent quality goal requirements. Fewer than 50% of analyzers were able to meet requirements for statistical quality control for any measurand. These findings reflect the current status of in-clinic hematology analyzer performance and provide a basis for future evaluations of the quality of veterinary laboratory testing. © 2016 American Society for Veterinary Clinical Pathology.
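Two of the quality statistics mentioned — total error and the sigma metric — have standard textbook forms, sketched below. These are the conventional definitions, not necessarily the exact formulas used in the study:

```python
def total_error(bias_pct, cv_pct, z=2.0):
    """Calculated total error: bias plus z times the imprecision (CV)."""
    return abs(bias_pct) + z * cv_pct

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric: how many analytical CVs fit between the bias and
    the allowable total error (TEa). Six is considered world-class;
    below about three, statistical QC is generally impractical."""
    return (tea_pct - abs(bias_pct)) / cv_pct
```

For instance, an analyzer with 2% bias and 2% CV against a 10% TEa scores a sigma of 4, comfortably controllable; the same imprecision against a 5% TEa would not be.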

  16. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    Directory of Open Access Journals (Sweden)

    Jaehyo Jung

    2017-10-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal, which is then digitized by the processor. The power and ground configuration of the printed circuit board (PCB) is divided into digital and analog parts to minimize the noise interference of each part. The proposed analyzer occupies an area of 5.9 × 3.25 cm2 with a current resolution of 0.4 nA. A potential of 0~2.1 V can be applied between the working and the counter electrodes. The results of this study demonstrated the accuracy of the proposed analyzer by measuring the ruthenium(III) chloride concentration in 10 mM phosphate-buffered saline (PBS) solution with a pH of 7.4. The measured data can be transmitted to a PC or a mobile device such as a smartphone or tablet PC using the included Bluetooth module. The proposed analyzer uses a 3.7 V, 120 mAh lithium polymer battery and can be operated for 60 min when fully charged, including data processing and wireless communication.

  17. Nonlinear analysis for the electrostatic analyzers with Lie algebraic methods

    International Nuclear Information System (INIS)

    Li Jinhai; Lv Jianqin

    2005-01-01

    With the Lie algebraic methods, the charged particle trajectories in electrostatic analyzers are analyzed and the third-order solutions are obtained. The authors briefly describe the Lie algebraic methods and the procedure for calculating the nonlinear orbits. The procedure is: first, set up the Hamiltonian; then expand the Hamiltonian into a sum of homogeneous polynomials of different degrees; next, calculate the Lie map associated with the Hamiltonian; finally, apply the Lie map to the particle's initial coordinates in phase space, and obtain the particle's nonlinear trajectories in the first-order, second-order, and third-order approximations, respectively. Higher-order solutions could be obtained if needed. (author)
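The factored-map procedure just outlined has a standard Dragt-style form, sketched below in conventional notation (an illustration of the general formalism, not the authors' exact expressions):

```latex
% The Hamiltonian is expanded in homogeneous polynomials,
\[
  H \;=\; H_2 + H_3 + H_4 + \cdots ,
\]
% and the transfer map factorizes into Lie transformations,
\[
  \mathcal{M} \;=\; e^{:f_2:}\, e^{:f_3:}\, e^{:f_4:} \cdots ,
  \qquad
  :f:\,g \;=\; [f, g] \ \ \text{(Poisson bracket)} ,
\]
% so the final coordinates follow from the initial ones as
\[
  z(s) \;=\; \mathcal{M}\, z_0
  \;=\; z_0 + [f, z_0] + \tfrac{1}{2!}\,[f, [f, z_0]] + \cdots ,
\]
% truncated here at third order, matching the solutions reported above.
```

Each exponential factor contributes aberrations of a given order, which is why truncating the product yields the first-, second-, and third-order approximations in turn.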

  18. A novel design for a small retractable cylindrical mirror analyzer

    International Nuclear Information System (INIS)

    McIlroy, D.N.; Dowben, P.A.; Knop, A.; Ruehl, E.

    1995-01-01

    In this paper we will review the performance of a "miniature" single-pass cylindrical mirror analyzer (CMA) which we have used successfully in a variety of experiments. The underlying premise behind this CMA design was to minimize spatial requirements while maintaining an acceptable level of instrumental resolution. While we are presenting the results of a single-pass cylindrical mirror analyzer, improvements on the present design, such as going to a double-pass design, will undoubtedly improve the instrumental resolution. copyright 1995 American Vacuum Society

  19. A Timing Single Channel Analyzer with pileup rejection

    International Nuclear Information System (INIS)

    Lauch, J.; Nachbar, H.U.

    1981-07-01

    A Timing Single Channel Analyzer is described, as normally used in nuclear physics applications for measuring certain ranges of energy spectra. The unit accepts unipolar or bipolar gaussian-shaped or rectangular pulses and includes a special pileup rejection circuit. Because of its good timing performance, high-resolution timing and coincidence measurements are possible. The differential analyzer, trigger and timing modes and the function of external strobe and gate signals are explained. Parts of the circuit are illustrated with the help of block diagrams and pulse schematics. An essential part of the unit is the pileup rejection circuit. Following theoretical reflections, the circuit is described and some measurement results are reported. (orig.) [de]

  20. Analyzing solid waste management practices for the hotel industry

    OpenAIRE

    S.T. Pham Phu; M.G. Hoang; T. Fujiwara

    2018-01-01

    The current study aims to analyze the waste characteristics and management practices of the hotel industry in Hoi An, a tourism city in the center of Vietnam. Solid wastes from 120 hotels were sampled, face-to-face interviews were conducted, and statistical methods were applied to analyze the data. The results showed that the mean waste generation rate of the hotels was 2.28 kg/guest/day and was strongly correlated with internal influencing factors such as the capacity, the price of the room...

  1. [Collective health in Brazil: analyzing the institutionalization process].

    Science.gov (United States)

    Nunes, Everardo Duarte

    2016-01-01

    This work first analyzes the construction of a typology of the studies on collective health and its institutionalization process in Brazil, in which three stages are proposed: the preventive project, social medicine and collective health. Secondly, the work examines the phases of institutionalization, disciplinarization and professionalization of collective health in Brazil. Within the institutionalization phase, the study analyzes connectivity and communication, regularization of discourses and practices, the construction of a separate identity and political actions, and the incorporation and legitimation of these elements. It is concluded that the trajectory of the construction of collective health is marked by three dimensions: the theoretical-critical, the political-sanitary and the pedagogical-professional.

  2. Wind energy system time-domain (WEST) analyzers

    Science.gov (United States)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.

  3. Using the Analytic Hierarchy Process to Analyze Multiattribute Decisions.

    Science.gov (United States)

    Spires, Eric E.

    1991-01-01

    The use of the Analytic Hierarchy Process (AHP) in assisting researchers to analyze decisions is discussed. The AHP is compared with other decision-analysis techniques, including multiattribute utility measurement, conjoint analysis, and general linear models. Insights that AHP can provide are illustrated with data gathered in an auditing context.…
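At the heart of the AHP is the extraction of priority weights from a pairwise-comparison matrix, typically as its principal eigenvector. A minimal sketch with power iteration (the matrix below is hypothetical, and consistency checking is omitted):

```python
def ahp_priorities(matrix, iters=100):
    """Priority weights from an AHP pairwise-comparison matrix, where
    matrix[i][j] states how strongly criterion i is preferred over j.
    Power iteration converges to the (normalized) principal
    eigenvector, the standard AHP priority vector."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [v / s for v in w]
    return w
```

For a perfectly consistent matrix (every entry equal to a ratio of underlying weights), the recovered priorities reproduce those weights exactly.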

  4. Watch out for superman: first visualize, then analyze.

    Science.gov (United States)

    Kozak, Marcin

    2012-01-01

    A visit from Superman shows why data visualization should come before data analysis. The Web extra is a dataset that comprises 100 observations of the quantitative variables y and x plus the qualitative variable group. When analyzed correctly, this dataset exhibits an interesting pattern.

  5. Theories Analyzing Communicative Approach in China's EFL Classes

    Science.gov (United States)

    Xu, Yang

    2010-01-01

    The Communicative Approach is widely applied in primary and middle schools, as various modern teaching methods pour into China's EFL classes. However, during years of practice this effective language teaching method has not been able to show its advantages in China. This paper analyzes the constraining factors, such as teaching habits, examination…

  6. Quality Performance of Drugs Analyzed in the Drug Analysis and ...

    African Journals Online (AJOL)

    ICT TEAM

    During the period 2006-2010, the Drug Analysis and Research Unit analyzed 583 samples. The samples comprised 50.6% local and 49.4% imported products. Samples were subjected to compendial or in-house specifications. The failure rate was 12.2% for local products and 14.2% for imports. Antibacterial products ...

  7. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze p...

  8. Analyzing Financial Flows from Emerging Economies to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Analyzing Financial Flows from Emerging Economies to the Developing World. The context of development assistance is changing, with private financial flows now well exceeding official development assistance levels. In addition, growth in emerging economies in the coming 10-to-15 years will likely result in sizable ...

  9. So you want to buy an EDXRF analyzer

    International Nuclear Information System (INIS)

    Fleming, R.

    2002-01-01

    Full text: In an effort to save costs, many companies are considering energy dispersive X-ray fluorescence (EDXRF) analyzers as an alternative to wavelength dispersive X-ray fluorescence (WDXRF) analyzers. In many cases, particularly in geological applications, this begins with a desire either to replace an in-house WDX system or to reduce the number of samples sent to an outside laboratory. The differences between the two go far beyond a simple reversal of the order of spectral peaks. In order to decide if an EDX analyzer is a suitable substitute for WDX, or any other analytical method, a number of questions must be answered. These include: 1. What is the normal error associated with comparing methods? 2. What additional sources of error are important in EDXRF? 3. How do changes in sample preparation methods affect the results? 4. How does one design an application study that determines if EDX is a suitable method? Each of these questions will be answered in enough detail to provide someone with the tools they need to determine if an EDXRF analyzer meets their requirements. Copyright (2002) Australian X-ray Analytical Association Inc.

  10. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  11. Processing of Mining Induced Seismic Events by Spectra Analyzer Software

    Czech Academy of Sciences Publication Activity Database

    Kaláb, Zdeněk; Lednická, Markéta; Lyubushin, A. A.

    2011-01-01

    Roč. 6, č. 1 (2011), s. 75-83 ISSN 1896-3145. [Ochrona środowiska w górnictwie podziemnym, odkrywkowym i otworowym. Walbrzych, 18.05.2011-20.05.2011] Institutional research plan: CEZ:AV0Z30860518 Keywords : mining seismicity * Spectra Analyzer Software * wavelet decomposition * time-frequency map Subject RIV: DC - Seismology, Volcanology, Earth Structure

  12. Analyzing the Hidden Curriculum of Screen Media Advertising

    Science.gov (United States)

    Mason, Lance E.

    2015-01-01

    This media literacy article introduces a questioning framework for analyzing screen media with students and provides an example analysis of two contemporary commercials. Investigating screen conventions can help students understand the persuasive functions of commercials, as well as how the unique sensory experience of screen viewing affects how…

  13. Analyzing Traffic Problem Model With Graph Theory Algorithms

    OpenAIRE

    Tan, Yong

    2014-01-01

    This paper contributes to a practical problem, urban traffic. We investigate its features, try to simplify the complexity, and formalize this dynamic system. The contents mainly cover how to analyze a decision problem with combinatorial methods and graph theory algorithms, and how to optimize our strategy to obtain a feasible solution by employing other principles of computer science.

  14. A high-speed interface for multi-channel analyzer

    International Nuclear Information System (INIS)

    Shen Ji; Zheng Zhong; Qiao Chong; Chen Ziyu; Ye Yunxiu; Ye Zhenyu

    2003-01-01

    This paper presents a high-speed computer interface for multi-channel analyzers based on the DMA technique. Its essential principle and operating procedure are introduced. By measuring the γ spectrum of ¹³⁷Cs with the interface, it is shown that the interface meets the requirements of high-speed data acquisition.

  15. Purdue Plane Structures Analyzer II : a computerized wood engineering system

    Science.gov (United States)

    S. K. Suddarth; R. W. Wolfe

    1984-01-01

    The Purdue Plane Structures Analyzer (PPSA) is a computer program developed specifically for the analysis of wood structures. It uses recognized analysis procedures, in conjunction with recommendations of the 1982 National Design Specification for Wood Construction, to determine stresses and deflections of wood trusses and frames. The program offers several options for...

  16. Make to Relate: Analyzing Narratives of Community Practice

    Science.gov (United States)

    Dixon, Colin; Martin, Lee

    2017-01-01

    As young people design, build, and problem solve within maker spaces and clubs, they talk about making. We analyze short interviews with young people involved in maker clubs, conducted during public presentations, and hypothesize a progression through frames of participation. Moving from "exploration," to "exchange," and on to…

  17. 40 CFR 86.315-79 - General analyzer specifications.

    Science.gov (United States)

    2010-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for New Gasoline-Fueled and Diesel-Fueled Heavy-Duty Engines; Gaseous Exhaust Test Procedures § 86... calibration or span gas. (c) Noise. The analyzer peak-to-peak response to zero and calibration or span gases...

  18. Analyzing discussions on twitter: Case study on HPV vaccinations

    NARCIS (Netherlands)

    Kaptein, R.; Boertjes, E.; Langley, D.

    2014-01-01

    In this work we analyze the discussions on Twitter around the Human papillomavirus (HPV) vaccinations. We collect a dataset consisting of tweets related to the HPV vaccinations by searching for relevant keywords, by retrieving the conversations on Twitter, and by retrieving tweets from our user

  19. Besieged by burqas: Analyzing representations of the burqa

    NARCIS (Netherlands)

    Mazurski, L.E.

    2015-01-01

    In this thesis, I analyze the ways in which various discourses produce knowledge about the burqa. Particularly, since the attacks on the twin towers and the London bombings, Orientalist and neo-Orientalist tropes have been revitalized and propagated by ideologies of Islamophobia at work to

  20. Analyzing Perceptions of Prospective Teachers about Their Media Literacy Competencies

    Science.gov (United States)

    Recepoglu, Ergun; Ergun, Muammer

    2013-01-01

    The purpose of this study is to analyze perceptions of prospective teachers about their media literacy competencies in terms of different variables. This is a descriptive research in the survey model which tries to detect the current situation. Study group includes 580 prospective teachers from Turkish, Primary School, Social Studies, Science,…

  1. 21 CFR 868.1720 - Oxygen gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... gases by techniques such as mass spectrometry, polarography, thermal conductivity, or gas chromatography... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Oxygen gas analyzer. 868.1720 Section 868.1720 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...

  2. 21 CFR 868.1075 - Argon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... thermal conductivity. (b) Classification. Class II (performance standards). ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Argon gas analyzer. 868.1075 Section 868.1075 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL...

  3. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... patient. The device may use techniques such as mass spectrometry or thermal conductivity. (b... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL...

  4. 21 CFR 868.1640 - Helium gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... mixture during pulmonary function testing. The device may use techniques such as thermal conductivity, gas... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Helium gas analyzer. 868.1640 Section 868.1640 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...

  5. Application platform 'ICX' designed for computer assisted functional image analyzer

    International Nuclear Information System (INIS)

    Kinosada, Yasutomi; Hattori, Takao; Yonezawa, Kazuo; Tojo, Shigenori.

    1994-01-01

    Recent clinical imaging modalities such as X-CT, MRI, SPECT and so on make it easy to obtain various functional images of the human body because of rapid technical progress. But advances such as fast imaging and 3D volume scanning techniques have brought new problems for both medical doctors and technical staff: an increase in both the number of images and the demand for image processing for 3D presentation. Furthermore, analyzing these functional images has remained difficult and troublesome. In this study, we developed the application platform ICX (Independent Console based on X-window system), designed as a computer-assisted functional image analyzer under a concept different from that of conventional medical image processing workstations. ICX can manage clinical images from various imaging modalities via an Ethernet LAN and helps users analyze or process these images easily with ICX's application programs or some commercial applications. ICX works as a diagnostic console, a personal PACS, and a functional image analyzer, yet operates independently of the imaging modalities. Many object-oriented image analysis and processing tools are available, and they can be invoked in any situation by users. ICX is a new type of workstation and appears useful in current medical practice. (author)

  6. Differentiation and Integration: Guiding Principles for Analyzing Cognitive Change

    Science.gov (United States)

    Siegler, Robert S.; Chen, Zhe

    2008-01-01

    Differentiation and integration played large roles within classic developmental theories but have been relegated to obscurity within contemporary theories. However, they may have a useful role to play in modern theories as well, if conceptualized as guiding principles for analyzing change rather than as real-time mechanisms. In the present study,…

  7. 40 CFR 86.224-94 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration. 86.224-94 Section 86.224-94 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures § 86.224-94 Carbon dioxide...

  8. A Linguistic Technique for Marking and Analyzing Syntactic Parallelism.

    Science.gov (United States)

    Sackler, Jessie Brome

    Sentences in rhetoric texts were used in this study to determine a way in which rhetorical syntactic parallelism can be analyzed. A tagmemic analysis determined tagmas which were parallel or identical or similar to one another. These were distinguished from tagmas which were identical because of the syntactic constraints of the language…

  9. Analyzing Properties of Stochastic Business Processes By Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    This chapter presents an approach to precise formal analysis of business processes with stochastic properties. The method presented here allows for both qualitative and quantitative properties to be individually analyzed at design time without requiring a full specification. This provides an effective means to explore possible designs for a business process and to debug any flaws.

  10. Mining Web Query Logs to Analyze Political Issues

    NARCIS (Netherlands)

    Weber, I.; Garimella, V.R.K.; Borra, E.; Contractor, N.; Uzzi, B.

    2012-01-01

    We present a novel approach to using anonymized web search query logs to analyze and visualize political issues. Our starting point is a list of politically annotated blogs (left vs. right). We use this list to assign a numerical political leaning to queries leading to clicks on these blogs.

  11. A computerised EEG-analyzing system for small laboratory animals

    NARCIS (Netherlands)

    Kropveld, D.; Chamuleau, R. A.; Popken, R. J.; Smith, J.

    1983-01-01

    The experimental setup, including instrumentation and software packaging, is described for the use of a minicomputer as an on-line analyzing system of the EEG in rats. Complete fast Fourier transformation of the EEG sampled in 15 episodes of 10 s each is plotted out within 7 min after the start of
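    The episodic FFT analysis described in this record can be sketched in a few lines. This is a minimal illustration of splitting a trace into fixed-length episodes and computing each episode's power spectrum; the function name and parameters are assumptions for illustration, not the original minicomputer code.

    ```python
    import numpy as np

    def episode_power_spectra(eeg, fs, episode_len_s=10.0, n_episodes=15):
        """Split an EEG trace into fixed-length episodes and return the
        frequency axis plus the power spectrum of each episode."""
        n = int(episode_len_s * fs)  # samples per episode
        spectra = []
        for i in range(n_episodes):
            seg = eeg[i * n:(i + 1) * n]
            if len(seg) < n:
                break  # not enough data for a full episode
            spectra.append(np.abs(np.fft.rfft(seg)) ** 2)  # power spectrum
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        return freqs, np.array(spectra)
    ```

    For a 10 s episode sampled at fs Hz, the spectrum has a frequency resolution of 0.1 Hz, which is why fixed-length episodes are convenient for comparing spectra across time.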

  12. A Study Analyzing the Career Path of Librarians

    Science.gov (United States)

    Noh, Younghee

    2010-01-01

    This study aims to identify the career movement patterns of librarians, analyze factors influencing their career movements, and compare differences in such factors between librarians and chief librarians. Findings showed that the jobs with the highest retention rate were those in public libraries, that library automation system developers showed…

  13. Analyzing the Structure of the International Business Curriculum in India

    Science.gov (United States)

    Srivastava, Deepak K.

    2012-01-01

    This article analyzes the structure of the international business curriculum through a questionnaire-based survey among current students and young managers who are studying or have studied international business courses in one of the top B-Schools of India. Respondents have the opinion that international business is more than internationalization…

  14. Analyzing Oscillations of a Rolling Cart Using Smartphones and Tablets

    Science.gov (United States)

    Egri, Sandor; Szabo, Lorant

    2015-01-01

    It is well known that "interactive engagement" helps students to understand basic concepts in physics. Performing experiments and analyzing measured data are effective ways to realize interactive engagement, in our view. Some experiments need special equipment, measuring instruments, or laboratories, but in this activity we advocate…

  15. Modeling of 1-D Nanowires and analyzing their Hydrogen and ...

    Indian Academy of Sciences (India)

    SUDIP PAN

    Modeling of 1-D Nanowires and analyzing their Hydrogen and Noble Gas Binding Ability. SUDIP PAN, RANAJIT SAHA, ASHUTOSH GUPTA and PRATIM K CHATTARAJ, Department of Chemistry and Center for Theoretical Studies, Indian Institute of Technology Kharagpur, Kharagpur, West Bengal 721 302, ...

  16. 40 CFR 86.1522 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... Liquefied Petroleum Gas-Fueled Diesel-Cycle Heavy-Duty Engines, New Otto-Cycle Light-Duty Trucks, and New Methanol-Fueled Natural Gas-Fueled, and Liquefied Petroleum Gas-Fueled Diesel-Cycle Light-Duty Trucks; Idle... engineering practice for instrument start-up and operation. Adjust the analyzer to optimize performance on the...

  17. Analyzing Counsel/Witness Discourse in Nnewi, Anambra State ...

    African Journals Online (AJOL)

    This paper analyzed counsel/witness discourse using the High Court in. Nnewi Municipal Council. Specifically, it described the structure and organization of counsel/witness discourse in the courtroom context highlighting some discourse features inherent in them, and observed the communication strategies and motivation ...

  18. Using Facebook Data to Analyze Learner Interaction during Study Abroad

    Science.gov (United States)

    Back, Michele

    2013-01-01

    Although study abroad is viewed as an ideal environment for interaction in the target language, research in this area has relied mostly upon self-reported data, which pose challenges regarding recall bias and participant commitment. This article shows how Facebook data can be used to analyze naturally occurring learner interactions during study…

  19. Development and performance of on-line uranium analyzers

    International Nuclear Information System (INIS)

    Ofalt, A.E.; O'Rourke, P.E.

    1985-10-01

    A diode-array spectrophotometer and an x-ray fluorescence analyzer were installed online in a full-scale prototype facility to monitor uranium loading and breakthrough of ion exchange columns. Uranium concentrations of 10 ppm in uranyl nitrate solutions can be detected online to improve process control and material accountability. 9 figs

  20. Modeling Adaptive Dynamical Systems to Analyze Eating Regulation Disorders

    NARCIS (Netherlands)

    Bosse, T.; Delfos, M.F.; Jonker, C.M.; Treur, J.

    2006-01-01

    To analyze the disorders of their patients, psychotherapists often have to get insight in adaptive dynamical systems. Analysis of dynamical systems usually is performed using mathematical techniques. Such an analysis is not precisely the type of reasoning performed in psychotherapy practice. In this

  1. Analyzing Counsel/Witness Discourse in Nnewi, Anambra State ...

    African Journals Online (AJOL)

    Abstract. This paper analyzed counsel/witness discourse using the High Court in. Nnewi Municipal Council. Specifically, it described the structure and organization of counsel/witness discourse in the courtroom context highlighting some discourse features inherent in them, and observed the communication strategies and ...

  2. Probability model for analyzing fire management alternatives: theory and structure

    Science.gov (United States)

    Frederick W. Bratten

    1982-01-01

    A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...

  3. Moira Chimombo It is never sufficient simply to analyze without ...

    African Journals Online (AJOL)

    SECOND LANGUAGE ACQUISITION*. Moira Chimombo. It is never sufficient simply to analyze without synthesis, or for that matter to synthesize without analysis. Ironically, that is as true of language acquisition itself as it is of the research that describes it. Yet researchers cannot synthesize until they have analysed sufficient ...

  4. Analyzing Exertion of Hardy's Tragic Effect in "Tess"

    Science.gov (United States)

    Yang, Wei

    2009-01-01

    This paper begins with a brief introduction to [Thomas] Hardy's life and works, especially the novel "Tess [of the D'Urbervilles]," and points out the importance of the tragic effect and Hardy's tragic vision. Linked to this tragic effect, the paper analyzes its skillful application in "Tess." At last, we can understand more…

  5. Analyzing the security posture of South African websites

    CSIR Research Space (South Africa)

    Mtsweni, Jabu, S

    2015-08-12

    Full Text Available observed. Research studies also suggest that over 80% of the active websites are vulnerable to a myriad of attacks. This paper reports on a study conducted to passively analyze and determine the security posture of over 70 South African websites from...

  6. Comparison of chemistry analytes between 2 portable, commercially available analyzers and a conventional laboratory analyzer in reptiles.

    Science.gov (United States)

    McCain, Stephanie L; Flatland, Bente; Schumacher, Juergen P; Clarke III, Elsburgh O; Fry, Michael M

    2010-12-01

    Advantages of handheld and small bench-top biochemical analyzers include requirements for smaller sample volume and practicality for use in the field or in practices, but little has been published on the performance of these instruments compared with standard reference methods in analysis of reptilian blood. The aim of this study was to compare reptilian blood biochemical values obtained using the Abaxis VetScan Classic bench-top analyzer and a Heska i-STAT handheld analyzer with values obtained using a Roche Hitachi 911 chemical analyzer. Reptiles, including 14 bearded dragons (Pogona vitticeps), 4 blue-tongued skinks (Tiliqua gigas), 8 Burmese star tortoises (Geochelone platynota), 10 Indian star tortoises (Geochelone elegans), 5 red-tailed boas (Boa constrictor), and 5 Northern pine snakes (Pituophis melanoleucus melanoleucus), were manually restrained, and a single blood sample was obtained and divided for analysis. Results for concentrations of albumin, bile acids, calcium, glucose, phosphates, potassium, sodium, total protein, and uric acid and activities of aspartate aminotransferase and creatine kinase obtained from the VetScan Classic and Hitachi 911 were compared. Results for concentrations of chloride, glucose, potassium, and sodium obtained from the i-STAT and Hitachi 911 were compared. Compared with results from the Hitachi 911, those from the VetScan Classic and i-STAT had variable correlations, and constant or proportional bias was found for many analytes. Bile acid data could not be evaluated because results for 44 of 45 samples fell below the lower linearity limit of the VetScan Classic. Although the 2 portable instruments might provide measurements with clinical utility, there were significant differences compared with the reference analyzer, and development of analyzer-specific reference intervals is recommended. ©2010 American Society for Veterinary Clinical Pathology.
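    The constant and proportional bias reported in this record can be screened for with an ordinary least-squares fit of the portable analyzer's results against the reference analyzer. This is an illustrative sketch only (function name assumed here); formal method-comparison studies typically use Deming or Passing-Bablok regression, which also account for measurement error in the reference.

    ```python
    import numpy as np

    def bias_check(reference, portable):
        """Fit portable = slope * reference + intercept by least squares.
        A slope differing from 1 suggests proportional bias; an intercept
        differing from 0 suggests constant bias."""
        slope, intercept = np.polyfit(reference, portable, 1)
        return slope, intercept
    ```

    For example, paired glucose results where the portable unit reads 10% high plus a fixed 2-unit offset would yield a slope near 1.1 and an intercept near 2.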

  7. Using Outcomes to Analyze Patients Rather than Patients to Analyze Outcomes: A Step toward Pragmatism in Benefit:risk Evaluation

    Science.gov (United States)

    Evans, Scott R.; Follmann, Dean

    2016-01-01

    In the future, clinical trials will have an increased emphasis on pragmatism, providing a practical description of the effects of new treatments in realistic clinical settings. Accomplishing pragmatism requires better summaries of the totality of the evidence in ways that clinical trial consumers (patients, physicians, insurers) find transparent and that allow for informed benefit:risk decision-making. The current approach to the analysis of clinical trials is to analyze efficacy and safety separately and then combine these analyses into a benefit:risk assessment. Many assume that this will effectively describe the impact on patients. But this approach is suboptimal for evaluating the totality of effects on patients. We discuss methods for benefit:risk assessment that have greater pragmatism than methods that separately analyze efficacy and safety. These include the concepts of within-patient analyses and composite benefit:risk endpoints, with the goal of understanding how to analyze one patient before trying to figure out how to analyze many. We discuss the desirability of outcome ranking (DOOR) and introduce the partial credit strategy using an example from a clinical trial evaluating the effects of a new antibiotic. As part of the example we introduce a strategy to engage patients as a resource to inform benefit:risk analyses, consistent with the goal of measuring and weighing the outcomes that are most important from the patient's perspective. We describe a broad vision for the future of clinical trials consistent with increased pragmatism. Greater focus on using endpoints to analyze patients, rather than patients to analyze endpoints, particularly in late-phase/stage clinical trials, is an important part of this vision. PMID:28435515

  8. Effect of nutrient and selective inhibitor amendments on methane oxidation, nitrous oxide production, and key gene presence and expression in landfill cover soils: characterization of the role of methanotrophs, nitrifiers, and denitrifiers.

    Science.gov (United States)

    Lee, Sung-Woo; Im, Jeongdae; Dispirito, Alan A; Bodrossy, Levente; Barcelona, Michael J; Semrau, Jeremy D

    2009-11-01

    Methane and nitrous oxide are both potent greenhouse gases, with global warming potentials approximately 25 and 298 times that of carbon dioxide. A matrix of soil microcosms was constructed with landfill cover soils collected from the King Highway Landfill in Kalamazoo, Michigan and exposed to geochemical parameters known to affect methane consumption by methanotrophs, while also examining their impact on biogenic nitrous oxide production. It was found that relatively dry soils (5% moisture content) along with 15 mg NH₄⁺ (kg soil)⁻¹ and 0.1 mg phenylacetylene (kg soil)⁻¹ provided the greatest stimulation of methane oxidation while minimizing nitrous oxide production. Microarray analyses of pmoA showed that the methanotrophic community structure was dominated by Type II organisms, but Type I genera were more evident with the addition of ammonia. When phenylacetylene was added in conjunction with ammonia, the methanotrophic community structure was more similar to that observed in the presence of no amendments. PCR analyses showed the presence of amoA from both ammonia-oxidizing bacteria and archaea, and that the presence of key genes associated with these cells was reduced with the addition of phenylacetylene. Messenger RNA analyses found transcripts of pmoA, but not of mmoX, nirK, norB, or amoA from either ammonia-oxidizing bacteria or archaea. Pure culture analyses showed that methanotrophs could produce significant amounts of nitrous oxide, particularly when expressing the particulate methane monooxygenase (pMMO). Collectively, these data suggest that methanotrophs expressing pMMO played a role in nitrous oxide production in these microcosms.

  9. Collab-Analyzer: An Environment for Conducting Web-Based Collaborative Learning Activities and Analyzing Students' Information-Searching Behaviors

    Science.gov (United States)

    Wu, Chih-Hsiang; Hwang, Gwo-Jen; Kuo, Fan-Ray

    2014-01-01

    Researchers have found that students might get lost or feel frustrated while searching for information on the Internet to deal with complex problems without real-time guidance or supports. To address this issue, a web-based collaborative learning system, Collab-Analyzer, is proposed in this paper. It is not only equipped with a collaborative…

  10. Planning, performing and analyzing X-ray Raman scattering experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sahle, Ch. J., E-mail: christoph.sahle@esrf.fr [Department of Physics, PO Box 64, FI-00014 University of Helsinki, Helsinki (Finland); European Synchrotron Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Mirone, A. [European Synchrotron Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Niskanen, J.; Inkinen, J. [Department of Physics, PO Box 64, FI-00014 University of Helsinki, Helsinki (Finland); Krisch, M. [European Synchrotron Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Huotari, S. [Department of Physics, PO Box 64, FI-00014 University of Helsinki, Helsinki (Finland)

    2015-02-03

    A summarising review of data treatment for non-resonant inelastic X-ray scattering data from modern synchrotron-based multi-analyzer spectrometers. A compilation of procedures for planning and performing X-ray Raman scattering (XRS) experiments and analyzing data obtained from them is presented. In particular, it is demonstrated how to predict the overall shape of the spectra, estimate detection limits for dilute samples, and how to normalize the recorded spectra to absolute units. In addition, methods for processing data from multiple-crystal XRS spectrometers with imaging capability are presented, including a super-resolution method that can be used for direct tomography using XRS spectra as the contrast. An open-source software package with these procedures implemented is also made available.

  11. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
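    The dissimilarity quantifier named in this record, the square root of the Jensen-Shannon divergence between two probability distributions, can be sketched directly from its definition. This is a minimal illustration (log base 2, so the result lies in [0, 1]; the function name is an assumption), not the authors' implementation.

    ```python
    import numpy as np

    def sqrt_jensen_shannon(p, q):
        """Square root of the Jensen-Shannon divergence between two
        discrete probability distributions, a true metric in [0, 1]."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        m = 0.5 * (p + q)  # mixture distribution

        def kl(a, b):
            mask = a > 0  # terms with a_i = 0 contribute nothing
            return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

        return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))
    ```

    Identical distributions give 0, fully disjoint ones give 1, which makes the quantifier convenient for tracking how far a network's degree (or other) distribution has drifted between two states of its evolution.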

  12. Counter tube window and X-ray fluorescence analyzer study

    Science.gov (United States)

    Hertel, R.; Holm, M.

    1973-01-01

    A study was performed to determine the best design of tube window and X-ray fluorescence analyzer for quantitative analysis of Venusian dust and condensates. The principal objective of the project was to develop the best counter tube window geometry for the sensing element of the instrument. This included formulation of a mathematical model of the window and optimization of its parameters. The proposed detector and instrument have several important features. The instrument will perform a near real-time analysis of dust in the Venusian atmosphere and is capable of measuring dust layers less than 1 micron thick. In addition, a wide dynamic measurement range will be provided to compensate for extreme variations in count rates. An integral pulse-height analyzer and memory accumulate data and read out spectra for detailed computer analysis on the ground.

  13. [Integrated Development of Full-automatic Fluorescence Analyzer].

    Science.gov (United States)

    Zhang, Mei; Lin, Zhibo; Yuan, Peng; Yao, Zhifeng; Hu, Yueming

    2015-10-01

    In view of the fact that medical inspection equipment sold in the domestic market is mainly imported from abroad and very expensive, we developed a full-automatic fluorescence analyzer in our center, presented in this paper. The paper introduces in detail the hardware architecture of an FPGA/DSP motion-control card + PC + STM32 embedded microprocessing unit, the software system based on multi-threaded C#, and the design and implementation of dual-unit communication. By simplifying the hardware structure, selecting hardware appropriately, and adopting object-oriented technology in the control software, we improved the precision and speed of the control system significantly. Finally, performance testing showed that the control system meets the needs of an automated fluorescence analyzer in functionality, performance, and cost.

  14. [Dynamic Pulse Signal Processing and Analyzing in Mobile System].

    Science.gov (United States)

    Chou, Yongxin; Zhang, Aihua; Ou, Jiqing; Qi, Yusheng

    2015-09-01

    In order to derive a dynamic pulse rate variability (DPRV) signal from a dynamic pulse signal in real time, a method for extracting the DPRV signal was proposed and a portable mobile monitoring system was designed. The system consists of a front end for collecting and wirelessly transmitting the pulse signal, and a mobile terminal. The proposed method is employed to extract the DPRV from the dynamic pulse signal in the mobile terminal, and the DPRV signal is analyzed in the time domain, in the frequency domain, and with non-linear methods in real time. The results show that the proposed method can accurately derive the DPRV signal in real time, and that the system can be used for processing and analyzing DPRV signals in real time.
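    One simple way to obtain the raw material of a pulse rate variability signal, the series of inter-beat intervals, is threshold-crossing peak detection on the pulse waveform. The sketch below is an illustration under that assumption (function name and threshold are hypothetical), not the paper's extraction algorithm.

    ```python
    import numpy as np

    def pulse_intervals(pulse, fs, threshold=0.5):
        """Return inter-beat intervals (seconds) from a pulse waveform
        by detecting upward crossings of a fixed amplitude threshold."""
        above = pulse > threshold
        # indices where the signal rises through the threshold
        rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1
        beat_times = rising / fs  # crossing times in seconds
        return np.diff(beat_times)  # intervals between successive beats
    ```

    The resulting interval series can then be resampled and analyzed in the time domain, in the frequency domain, or with non-linear measures, as the record describes.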

  15. ANALYZER OF QUANTITY AND QUALITY OF THE ELECTRIC POWER

    Directory of Open Access Journals (Sweden)

    A. I. Semilyak

    2013-01-01

    Full Text Available One of the activities of the research center for “Energy Saving Technologies and Smart Metering in Electrical Power Engineering” is research on the use of electronic devices and systems for intelligent power distribution, produced by Analog Devices and equipped with accurate energy consumption measurement. The article focuses on the development of an analyzer of the quantity and quality of electric energy. The main part of the analyzer is a metering IC by Analog Devices, the ADE7878, designed for use in commercial and industrial smart electricity meters. Such meters measure the amount of consumed or produced electric energy with high accuracy and provide the means for remote meter reading.

  16. Efficiency of biparental crossing in sugarcane analyzed by SSR markers

    Directory of Open Access Journals (Sweden)

    João Messias dos Santos

    2014-07-01

    Full Text Available Sugarcane has hermaphrodite flowers; however, both selfing and cross-pollination may occur, resulting in selfed or hybrid progeny. The aim of this study was to analyze the paternity of progenies from biparental crosses in order to identify true hybrids or progenies originating from pollen of unknown origin. Seventy-six progenies from four crosses were analyzed using three highly polymorphic microsatellite (SSR) markers. Progenies showed moderate genetic similarity and were grouped into four distinct groups according to the crosses. Transmission of alleles from parents to offspring was clearly observed: no selfed individuals were found, only true hybrids or progeny resulting from fertilization with pollen foreign to both parents. The results showed that contamination with pollen from unknown parents occurred in these sugarcane crosses, suggesting that pedigree errors may arise and that adjusting the crossing procedure would decrease progenies from pollen of unknown origin.

  17. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
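    The abstract's primary quantifier, the square root of the Jensen-Shannon divergence, can be sketched in a few lines. This is a generic implementation of that standard measure applied to two discrete distributions (the MPR Statistical Complexity is not shown), not the authors' code:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (base 2) of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jsd_sqrt(p, q):
    """Square root of the Jensen-Shannon divergence between p and q,
    a true metric used to quantify dissimilarity between network states."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    jsd = shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2
    return math.sqrt(max(jsd, 0.0))  # clamp tiny negative rounding noise

print(jsd_sqrt([0.5, 0.5], [0.5, 0.5]))  # identical distributions → 0.0
print(jsd_sqrt([1.0, 0.0], [0.0, 1.0]))  # disjoint distributions → 1.0
```

    In a network-evolution setting such a metric would be applied, for example, to degree distributions taken at successive steps of the evolving network.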

  18. Radiometric analyzer with plural radiation sources and detectors

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring characteristics of a material by radiation comprises a plurality of systems, each consisting of a radiation source and a radiation detector, equal in number to the number of elements in the molecule of the material, and a linear calibration circuit having the inverse response characteristics (calibration curves) of the respective detector systems, whereby the measurement is carried out by the four fundamental arithmetic operations on the mutual outputs of said detector systems obtained through said linear calibration circuit. One typical embodiment is a radiometric analyzer for hydrocarbons which measures the density of heavy oil, the sulfur content and the calorific value with three detector systems comprising a γ-ray source (E/sub γ/ greater than 50 keV), a soft x-ray source (Ex approximately 20 keV), and a neutron source. 2 claims, 6 figures

  19. An empirical study to analyze CRM strategy using BSC

    Directory of Open Access Journals (Sweden)

    Zahra Safari Kahreh

    2012-08-01

    Full Text Available In the new marketing paradigm, which is based on relationship marketing, corporations and organizations seek to retain and enhance long-run relationships with their customers. Customer relationship management (CRM), at the heart of this paradigm, includes a number of mechanisms that endeavor to manage sustainable and profitable long-term relations with valuable customers. Every year, many programs and resources are dedicated to marketing strategy and planning, so evaluating these endeavors, especially the CRM strategy, is very important. Hence, the primary purpose of this research is to analyze CRM using the balanced scorecard (BSC) as a valuable strategic tool. The required data were gathered from one of the biggest commercial banks of Iran and analyzed using the BSC and statistical software packages. Results indicate that there is a meaningful relationship between the 3 main aspects of the CRM strategy and the 4 main aspects of the BSC.

  20. Analyzing Influenza Virus Sequences using Binary Encoding Approach

    Directory of Open Access Journals (Sweden)

    Ham Ching Lam

    2012-01-01

    Full Text Available Capturing the mutation pattern of each individual influenza virus sequence is often challenging; in this paper, we demonstrate that using a binary encoding scheme coupled with a dimension reduction technique, we were able to capture the intrinsic mutation pattern of the virus. Our approach looks at the variance between sequences instead of the commonly used p-distance or Hamming distance. We first convert the influenza genetic sequences to binary strings, form a binary sequence alignment matrix, and then apply Principal Component Analysis (PCA) to this matrix. PCA also provides the power to identify reassortant viruses by using a data projection technique. Due to the sparsity of the binary strings, we were able to analyze large volumes of influenza sequence data in a very short time. For protein sequences, our scheme also allows the incorporation of the biophysical properties of each amino acid. Here, we present various encouraging results from analyzing influenza nucleotide, protein and genome sequences using the proposed approach.
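    A minimal sketch of the described pipeline, using hypothetical toy sequences rather than the paper's influenza data: one-hot binary encoding of aligned sequences, followed by PCA via eigendecomposition of the covariance matrix:

```python
import numpy as np

# Hypothetical toy alignment; the paper's real data are influenza sequences.
seqs = ["ACGT", "ACGA", "TCGA", "TCGT"]
code = {"A": [1, 0, 0, 0], "C": [0, 1, 0, 0], "G": [0, 0, 1, 0], "T": [0, 0, 0, 1]}

# One-hot ("binary string") encoding: each row is a flattened sequence.
X = np.array([[b for ch in s for b in code[ch]] for s in seqs], dtype=float)

# PCA on the variance between sequences: center, then eigendecompose covariance.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(seqs) - 1)
vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
order = np.argsort(vals)[::-1]         # largest variance first
proj = Xc @ vecs[:, order[:2]]         # project onto the top two components
print(proj.shape)  # → (4, 2)
```

    Plotting the two projection columns would place similar sequences near each other, which is how a reassortant virus could stand out as an outlier.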

  1. Analyzing the financial crisis using the entropy density function

    Science.gov (United States)

    Oh, Gabjin; Kim, Ho-yong; Ahn, Seok-Won; Kwak, Wooseop

    2015-02-01

    The risk created by nonlinear interactions among subjects in economic systems is assumed to increase during an abnormal state of a financial market. Nevertheless, systemic risk in financial markets following the global financial crisis has not been investigated sufficiently. In this paper, we analyze the entropy density function of the return time series for several financial markets, namely the S&P500, KOSPI, and DAX indices, from October 2002 to December 2011, and analyze the variability of the entropy value over time. We find that the entropy density function of the S&P500 index during the subprime crisis exhibits a significant decrease compared to that in other periods, whereas the other markets, such as those in Germany and Korea, exhibit no significant decrease during the market crisis. These findings demonstrate that the S&P500 index generated a regular pattern in the return time series during the financial crisis.
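    As a rough illustration of the idea, assuming the entropy is estimated from a binned histogram of log-returns (the paper's exact estimator may differ), a regular return pattern yields lower entropy than a diverse one:

```python
import numpy as np

def entropy_of_returns(prices, bins=20):
    """Shannon entropy (nats) of the empirical log-return distribution.
    A drop in entropy signals a more regular, less diverse return pattern."""
    returns = np.diff(np.log(prices))
    hist, _ = np.histogram(returns, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
calm = np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))       # diverse returns
crisis = np.exp(np.cumsum(np.tile([0.02, -0.02], 1000)))  # regular pattern
print(entropy_of_returns(calm) > entropy_of_returns(crisis))  # → True
```

    The synthetic "crisis" series takes only two return values, so almost all of the histogram mass sits in two bins and the entropy collapses, mirroring the regular pattern the study reports for the S&P500.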

  2. Analyzing Resiliency of the Smart Grid Communication Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Anas AlMajali, Anas; Viswanathan, Arun; Neuman, Clifford

    2016-08-01

    Smart grids are susceptible to cyber-attack as a result of the new communication, control and computation techniques employed in the grid. In this paper, we characterize and analyze the resiliency of a smart grid communication architecture, specifically an RF mesh based architecture, under cyber attacks. We analyze the resiliency of the communication architecture by studying the performance of high-level smart grid functions, such as metering and demand response, which depend on communication. Disrupting the operation of these functions impacts the operational resiliency of the smart grid. Our analysis shows that an attacker needs to compromise only a small fraction of the meters to break the communication resiliency of the smart grid. We discuss the implications of our result for critical smart grid functions and for the overall security of the smart grid.

  3. Analyzing Bullwhip Effect in Supply Networks under Exogenous Uncertainty

    Directory of Open Access Journals (Sweden)

    Mitra Darvish

    2014-05-01

    Full Text Available This paper presents a model for analyzing and measuring the propagation of order amplification (i.e., the bullwhip effect) for a single-product supply network topology, considering exogenous uncertainty and linear, time-invariant inventory management policies for network entities. The stream of orders placed by each entity of the network is characterized under the assumption that customer demand is ergodic. In fact, we propose an exact formula for measuring the bullwhip effect in the addressed supply network topology, treating the system in a Markov chain framework and presenting a matrix of network member relationships and the relevant order sequences. The formula is derived using frequency-domain analysis. The major contributions of this paper are analyzing the bullwhip effect under exogenous uncertainty in supply networks and using the Fourier transform to simplify the relevant calculations. We present a number of numerical examples to assess the accuracy of the analytical results in quantifying the bullwhip effect.
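    For orientation, the bullwhip effect is conventionally quantified as the ratio of order variance to demand variance. The sketch below uses this textbook definition with a made-up amplifying order policy, not the exact frequency-domain formula the paper derives:

```python
import numpy as np

def bullwhip_ratio(orders, demand):
    """Classic bullwhip measure: variance of orders placed upstream
    divided by variance of customer demand (>1 means amplification)."""
    return float(np.var(orders) / np.var(demand))

rng = np.random.default_rng(1)
demand = rng.normal(100, 5, 5000)
# A naive policy that over-reacts to each demand change amplifies it.
orders = demand + 2.0 * np.diff(demand, prepend=demand[0])
print(bullwhip_ratio(orders, demand) > 1)  # → True
```

    The frequency-domain approach mentioned in the abstract generalizes this: the Fourier transform of the ordering policy gives the amplification per frequency, and the variance ratio aggregates it over the demand spectrum.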

  4. Bleach gel: a simple agarose gel for analyzing RNA quality.

    Science.gov (United States)

    Aranda, Patrick S; LaJoie, Dollie M; Jorcyk, Cheryl L

    2012-01-01

    RNA-based applications requiring high-quality, non-degraded RNA are a foundational element of many research studies. As such, it is paramount that the integrity of experimental RNA is validated prior to cDNA synthesis or other downstream applications. In the absence of expensive equipment such as microfluidic electrophoretic devices, and as an alternative to the costly and time-consuming standard formaldehyde gel, RNA quality can be quickly analyzed by adding small amounts of commercial bleach to TAE buffer-based agarose gels prior to electrophoresis. In the presence of low concentrations of bleach, the secondary structure of RNA is denatured and potential contaminating RNases are destroyed. Because of this, the 'bleach gel' is a functional approach that addresses the need for an inexpensive and safe way to evaluate RNA integrity and will improve the ability of researchers to rapidly analyze RNA quality. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Dehydration process of fish analyzed by neutron beam imaging

    International Nuclear Information System (INIS)

    Tanoi, K.; Hamada, Y.; Seyama, S.; Saito, T.; Iikura, H.; Nakanishi, T.M.

    2009-01-01

    Since the regulation of water content in dried fish is an important factor for the quality of the fish, the water-losing process during the drying of squid and Japanese horse mackerel was analyzed through neutron beam imaging. The neutron images showed that around the shoulder of the mackerel there was a region where the water content tended to remain high during drying. To analyze the water-losing process in more detail, spatial images were produced. From the images, it was clearly indicated that the decrease of water content was slowest around the shoulder part. It was suggested that preventing deterioration around the shoulder part of the dried fish is an important factor in keeping the quality of the dried fish during storage.

  6. BWR plant analyzer development at BNL (Brookhaven National Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1986-01-01

    An engineering plant analyzer has been developed at BNL for realistically and accurately simulating transients and severe abnormal events in BWR power plants. Simulations are being carried out routinely with high fidelity, high simulation speed, at low cost and with unsurpassed user convenience. The BNL Plant Analyzer is the only operating facility which (a) simulates more than two orders-of-magnitude faster than the CDC-7600 mainframe computer, (b) is accessible and fully operational in on-line interactive mode, remotely from anywhere in the US, from Europe or the Far East (Korea), via widely available IBM-PC compatible personal computers, standard modems and telephone lines, (c) simulates both slow and rapid transients seven times faster than real-time in direct access, and four times faster in remote access modes, (d) achieves high simulation speed without compromising fidelity, and (e) is available to remote access users at the low cost of $160 per hour.

  7. Photoelectric equipment type MFS-7 for analyzing oils

    International Nuclear Information System (INIS)

    Orlova, S.A.; Fridman, M.G.; Kholosha, T.V.; Ezhoda, G.D.; Nechitailov, V.V.

    1987-01-01

    The authors describe the equipment type MFS-7, which is intended for analyzing used oils for the wear products of motors. The difference between type MFS-7 and its predecessors lies in the application of computer techniques to control the equipment and process the output data, and in the design of the sample container, which allows for two methods of introducing the sample into the discharge. The photoelectric equipment consists of an excitation spectrum source IVS-28 having an AC arc mode and a low-voltage spark, a polychromator, a special sample holder for analyzing liquid samples, an electronic recording apparatus with digital voltmeter type ERU-18, and a control computer system Spectr 2.2 based on a minicomputer with its own printer. The type MFS-7 equipment has been tested and put into mass production

  8. Monitoring machining conditions by analyzing cutting force vibration

    International Nuclear Information System (INIS)

    Piao, Chun Guang; Kim, Ju Wan; Kim, Jin Oh; Shin, Yoan

    2015-01-01

    This paper deals with an experimental technique for monitoring machining conditions by analyzing cutting-force vibration measured at a milling machine. This technique is based on the relationship of the cutting-force vibrations with the feed rate and cutting depth as reported earlier. The measurement system consists of dynamic force transducers and a signal amplifier. The analysis system includes an oscilloscope and a computer with a LabVIEW program. Experiments were carried out at various feed rates and cutting depths, while the rotating speed was kept constant. The magnitude of the cutting force vibration component corresponding to the number of cutting edges multiplied by the frequency of rotation was linearly correlated with the machining conditions. When one condition of machining is known, another condition can be identified by analyzing the cutting-force vibration
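    The monitored component, the cutting-force vibration at the number of cutting edges times the rotation frequency (the tooth-passing frequency), can be extracted with an FFT. The synthetic signal and all parameter values below are hypothetical, not measurements from the paper:

```python
import numpy as np

# Hypothetical milling setup: spindle at 20 rev/s with 4 cutting edges,
# so the tooth-passing component sits at 20 * 4 = 80 Hz.
fs, rev_hz, n_edges = 1000, 20, 4
t = np.arange(0, 1, 1 / fs)
force = 0.3 * np.sin(2 * np.pi * rev_hz * t) \
      + 1.5 * np.sin(2 * np.pi * rev_hz * n_edges * t) \
      + 0.1 * np.random.default_rng(2).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(force)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(peak)  # → 80.0
```

    Tracking the magnitude of this 80 Hz component over time is the kind of quantity the study correlates with feed rate and cutting depth.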

  9. Analyzing Big Data with the Hybrid Interval Regression Methods

    Directory of Open Access Journals (Sweden)

    Chia-Hui Huang

    2014-01-01

    Full Text Available Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to adjust the excursion of the separation margin, which is effective in the gray zone where the distribution of the data becomes hard to describe and the separation margin between classes is unclear.

  10. Analyzing and Driving Cluster Formation in Atomistic Simulations.

    Science.gov (United States)

    Tribello, Gareth A; Giberti, Federico; Sosso, Gabriele C; Salvalaglio, Matteo; Parrinello, Michele

    2017-03-14

    In this paper a set of computational tools for identifying the phases contained in a system composed of atoms or molecules is introduced. The method is rooted in graph theory and combines atom-centered symmetry functions, adjacency matrices, and clustering algorithms to identify regions of space where the properties of the system constituents can be considered uniform. We show how this method can be used to define collective variables and how these collective variables can be used to enhance the sampling of nucleation events. We then show how the method can be used to analyze simulations of crystal nucleation and growth, applying it to simulations of the nucleation of the molecular crystal urea and of nucleation in a semiconducting alloy. The semiconducting alloy example is particularly challenging, as multiple nucleation centers are formed. We show, however, that our algorithm is able to detect the grain boundaries in the resulting polycrystal.
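    The clustering step over an adjacency matrix can be illustrated with a simple connected-components search. This is a minimal stand-in for the paper's actual clustering algorithms, and the adjacency matrix here is a toy example:

```python
import numpy as np

def clusters_from_adjacency(adj):
    """Label connected regions of an adjacency matrix via depth-first
    search -- each label corresponds to one cluster of bonded atoms."""
    n = len(adj)
    labels, current = [-1] * n, 0
    for start in range(n):
        if labels[start] != -1:
            continue
        stack = [start]
        while stack:
            i = stack.pop()
            if labels[i] != -1:
                continue
            labels[i] = current
            stack.extend(j for j in range(n) if adj[i][j] and labels[j] == -1)
        current += 1
    return labels

# Two nucleation centers: atoms {0,1,2} bonded together, atoms {3,4} bonded.
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 0, 0],
                [0, 0, 0, 0, 1],
                [0, 0, 0, 1, 0]])
print(clusters_from_adjacency(adj))  # → [0, 0, 0, 1, 1]
```

    In the paper's setting the adjacency matrix would be built from atom-centered symmetry functions, so that atoms are connected only when they are both nearby and locally similar.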

  11. Presenting and analyzing movie stimuli for psychocinematic research

    Directory of Open Access Journals (Sweden)

    Arthur P. Shimamura

    2013-02-01

    Full Text Available Movies have an extraordinary way of capturing our perceptual, conceptual and emotional processes. As such, they offer a useful means of exploring psychological phenomena in the laboratory. Until recently, it has been rather difficult to present animated stimuli and collect behavioral responses online. However, with advances in digital technology and commercially available software to construct, present, and analyze movies for behavioral investigations, there is a growing interest in psychocinematic research. A rather simple, yet useful procedure is described that presents movie clips and collects multiple behavioral responses during their presentation. It uses E-Prime 2.0 Professional software to run the experiment and Microsoft Excel to sort and analyze the data.

  12. Ecoupling server: A tool to compute and analyze electronic couplings.

    Science.gov (United States)

    Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor

    2016-07-05

    Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional, user-friendly tools to compute and analyze electronic coupling from external wave functions are of high value. The first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH) is presented in this communication. The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.

  13. Organization of a multichannel analyzer for gamma ray spectrometry

    International Nuclear Information System (INIS)

    Robinet, Genevieve

    1988-06-01

    This report describes the software organization of a medium-scale multichannel analyzer for qualitative and quantitative measurements of the gamma rays emitted by radioactive samples. The first part reviews the basics of radioactivity, the principles of gamma-ray detection, and the data processing used for the interpretation of a nuclear spectrum. The second part first describes the general organization of the software and then gives some details on interactivity, multidetector capabilities, and the integration of complex algorithms for peak search and nuclide identification; problems encountered during the design phase are mentioned and solutions are given. Basic ideas are presented for further developments, such as an expert system which should improve the interpretation of results. The present software has been integrated in a manufactured multichannel analyzer named 'POLYGAM NU416'. [fr

  14. Development of the combustion engineering reactivity analyzer system

    International Nuclear Information System (INIS)

    Hoffspiegel, P.; Kaplan, G.; Long, S.; Reum, J.; Rohr, P.

    1988-01-01

    The C-E Reactivity Analyzer (CERA trademark) is an interactive, personal computer based, low power physics data collection and reduction system developed to support pressurized water reactor test programs. The CERA system consists of hardware and software to acquire, digitize, and analyze data for the isothermal temperature coefficient (ITC) and control rod bank worth measurements. The CERA system uses expert-level data analysis algorithms to assist the plant reactor engineer in determining measured values of the ITC and rod bank worths. Statistical, qualitative, and quantitative evaluations of test data fidelity are also provided. The CERA data analysis methodology is a significant advance in speed, accuracy, and test anomaly identification over the standard method of manual data reduction using strip charts and hand-collected data

  15. An electron density measurement using an analyzer based imaging system

    International Nuclear Information System (INIS)

    Bewer, Brian

    2011-01-01

    Using a monochromatic X-ray beam from a synchrotron source, the electron density of a homogeneous target was determined by measuring the refraction that occurs at the air-target interface for a known angle of incidence. The angle of deviation that these X-rays undergo at the transition between materials is micro-radian to submicro-radian in scale. Existing analyzer-based imaging systems are designed to measure submicro-radian angle changes and commonly use monochromatic hard X-ray beams generated from synchrotron sources. A preliminary experiment using the analyzer-based imaging apparatus at the Canadian Light Source Biomedical Imaging and Therapy beamline and a half-cylinder-shaped plastic target is presented. By measuring the angle of deviation of the photon beam at several discrete angular positions of the target, the electron density of the target material was determined.
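    A back-of-the-envelope version of the underlying relation, assuming the small-angle X-ray refraction result Δθ ≈ δ·tan(θ) with refractive index decrement δ = r_e·λ²·ρ_e/(2π); all numeric values below are hypothetical, not the experiment's data:

```python
import math

r_e = 2.8179403262e-15  # classical electron radius, m

def electron_density(dev_angle, incidence, wavelength):
    """Electron density (m^-3) from a measured X-ray deviation angle (rad),
    assuming n = 1 - delta with delta = r_e * lambda^2 * rho_e / (2*pi)
    and the small-angle refraction result dev ≈ delta * tan(incidence)."""
    delta = dev_angle / math.tan(incidence)
    return 2 * math.pi * delta / (r_e * wavelength**2)

# Hypothetical numbers: ~33 keV beam (lambda ≈ 0.376 Å), 45° incidence,
# and a 0.3 micro-radian measured deviation.
rho = electron_density(3e-7, math.radians(45), 0.376e-10)
print(f"{rho:.2e}")  # on the order of 1e29 electrons per m^3, plastic-like
```

    Inverting the same relation shows why submicro-radian angular sensitivity is needed: plastic-scale electron densities produce deviations of only a few tenths of a micro-radian.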

  16. Methods for Analyzing Multivariate Phenotypes in Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Qiong Yang

    2012-01-01

    Full Text Available Multivariate phenotypes are frequently encountered in genetic association studies. The purpose of analyzing multivariate phenotypes usually includes the discovery of novel genetic variants with pleiotropic effects, that is, affecting multiple phenotypes, with the ultimate goal of uncovering the underlying genetic mechanism. In recent years, there have been new method developments and applications of existing statistical methods to such phenotypes. In this paper, we provide a review of the available methods for analyzing the association between a single marker and a multivariate phenotype consisting of the same type of components (e.g., all continuous or all categorical) or different types of components (e.g., some continuous and others categorical). We also review causal inference methods designed to test whether a detected association with the multivariate phenotype is true pleiotropy or the genetic marker exerts its effects on some phenotypes through affecting the others.

  17. Textbooks in the EFL classroom: Defining, assessing and analyzing

    Directory of Open Access Journals (Sweden)

    Radić-Bojanić Biljana B.

    2016-01-01

    Full Text Available The aim of this paper is to define textbooks, analyze their advantages and disadvantages, and explicate the process of textbook selection and the reasons for analyzing textbooks. The paper describes two reasons for performing a textbook analysis: evaluating for potential and evaluating for suitability. It further describes various processes of textbook content analysis, including the analysis of the stated aims and objectives, learner needs, abilities and preferences, as well as the establishment of criteria in relation to previously set objectives. The paper concludes by stating that the task teachers face when selecting and evaluating textbooks is not an easy one, but it is crucial. With the assistance of clear guidelines and detailed criteria, they should be able to make an informed decision and choose the textbook that is most suitable for the requirements of their specific classroom context.

  18. Analyzing and synthesizing phylogenies using tree alignment graphs.

    Directory of Open Access Journals (Sweden)

    Stephen A Smith

    Full Text Available Phylogenetic trees are used to analyze and visualize evolution. However, trees can be imperfect datatypes when summarizing multiple trees. This is especially problematic when accommodating for biological phenomena such as horizontal gene transfer, incomplete lineage sorting, and hybridization, as well as topological conflict between datasets. Additionally, researchers may want to combine information from sets of trees that have partially overlapping taxon sets. To address the problem of analyzing sets of trees with conflicting relationships and partially overlapping taxon sets, we introduce methods for aligning, synthesizing and analyzing rooted phylogenetic trees within a graph, called a tree alignment graph (TAG). The TAG can be queried and analyzed to explore uncertainty and conflict. It can also be synthesized to construct trees, presenting an alternative to supertree approaches. We demonstrate these methods with two empirical datasets. In order to explore uncertainty, we constructed a TAG of the bootstrap trees from the Angiosperm Tree of Life project. Analysis of the resulting graph demonstrates that areas of the dataset that are unresolved in majority-rule consensus tree analyses can be understood in more detail within the context of a graph structure, using measures incorporating node degree and adjacency support. As an exercise in synthesis (i.e., summarization of a TAG constructed from the alignment trees), we also construct a TAG consisting of the taxonomy and source trees from a recent comprehensive bird study. We synthesized this graph into a tree that can be reconstructed in a repeatable fashion and where the underlying source information can be updated. The methods presented here are tractable for large-scale analyses and serve as a basis for an alternative to consensus tree and supertree methods. Furthermore, the exploration of these graphs can expose structures and patterns within the dataset that are otherwise difficult to

  19. Multi-channel amplitude analyzer CMA-1 and CMA-2

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1977-01-01

    Analyzer CMA is implemented in the CAMAC system. A single crate contains the required modules and is controlled by a PDP-11/10 minicomputer with 8k of 16-bit word memory. Spectra can be accumulated in full 4k, 2k, 1k or 0.5k channels. The system provides display of the stored data for the full memory, very accurate representation of any part (44 channels) on an alphanumeric display, and readout of the data by paper tape punch or printer. (author)

  20. Development of imaging energy analyzer using multipole Wien filter

    Science.gov (United States)

    Niimi, H.; Kato, M.; Tsutsumi, T.; Kawasaki, T.; Matsudaira, H.; Suzuki, S.; Chun, W.-J.; Kitajima, Y.; Kudo, M.; Asakura, K.

    2005-02-01

    We discussed a new design of a Wien filter energy analyzer for an energy-filtered X-ray photoemission electron microscopy system. We have demonstrated that the second-order aberration and the third-order aperture aberration can be corrected by the multipole Wien filter by adjusting multipole components of electric and magnetic fields up to octupole components. The three-dimensional charge simulation method indicated that 12 electrodes and magnetic poles can effectively reproduce these ideal electric and magnetic fields.

  1. Development of imaging energy analyzer using multipole Wien filter

    Energy Technology Data Exchange (ETDEWEB)

    Niimi, H. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan)]. E-mail: hironobu@cat.hokudai.ac.jp; Kato, M. [JEOL Ltd., 3-1-2 Musashino, Akishima, Tokyo 196-8558 (Japan); Tsutsumi, T. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan); Kawasaki, T. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan); Matsudaira, H. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan); Suzuki, S. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan); Chun, W.-J. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan); Core Research for Evolutional Science and Technology, Japan Science and Technology Corporation (Japan); Kitajima, Y. [Photon Factory, Institute of Materials Structure Science, Tsukuba 305-0801 (Japan); Kudo, M. [JEOL Ltd., 3-1-2 Musashino, Akishima, Tokyo 196-8558 (Japan); Asakura, K. [Catalysis Research Center, Hokkaido University, 21-10 Kita, Kita-ku, Sapporo 001-0021 (Japan)

    2005-02-28

    We discussed a new design of a Wien filter energy analyzer for an energy-filtered X-ray photoemission electron microscopy system. We have demonstrated that the second-order aberration and the third-order aperture aberration can be corrected by the multipole Wien filter by adjusting multipole components of electric and magnetic fields up to octupole components. The three-dimensional charge simulation method indicated that 12 electrodes and magnetic poles can effectively reproduce these ideal electric and magnetic fields.

  2. Wavelength encoding technique for particle analyses in hematology analyzer

    Science.gov (United States)

    Rongeat, Nelly; Brunel, Patrick; Gineys, Jean-Philippe; Cremien, Didier; Couderc, Vincent; Nérin, Philippe

    2011-07-01

    The aim of this study is to combine multiple excitation wavelengths in order to improve the accuracy of the fluorescence characterization of labeled cells. The experimental demonstration is realized with a hematology analyzer based on flow cytometry and a CW laser source emitting two visible wavelengths. A given optical encoding associated with each wavelength allows the identification of fluorescence coming from specific fluorochromes while avoiding the use of noisy compensation methods.

  3. Analyzing China Smart Water Meter Industry Cluster Competitiveness

    OpenAIRE

    Chan, Parker

    2013-01-01

    Sustainable development is a top issue nowadays, and smart water management is one of the methods for achieving it. This paper focuses on analyzing the competitiveness of industrial clusters (Guangzhou, Ningbo and Shanghai) in China, specifically in the smart water meter industry. It is part of the CEMIS sourcing work package under the KVTELIOS project with Mr. Al Natsheh Anas, and is supervised by Ms. Komulainen Ruey. Porter Diamond Theory is used ...

  4. A fully integrated standalone portable cavity ringdown breath acetone analyzer.

    Science.gov (United States)

    Sun, Meixiu; Jiang, Chenyu; Gong, Zhiyong; Zhao, Xiaomeng; Chen, Zhuying; Wang, Zhennan; Kang, Meiling; Li, Yingxin; Wang, Chuji

    2015-09-01

    Breath analysis is a promising new technique for nonintrusive disease diagnosis and metabolic status monitoring. One challenging issue in using a breath biomarker for the screening of a particular disease is finding a quantitative relationship between the concentration of the breath biomarker and the clinical diagnostic parameters of the specific disease. In order to address this issue, we need a new instrument that is capable of conducting real-time, online breath analysis with high data throughput, so that a large-scale clinical test (more subjects) can be achieved in a short period of time. In this work, we report a fully integrated, standalone, portable analyzer based on the cavity ringdown spectroscopy technique for near-real-time, online breath acetone measurements. The performance of the portable analyzer in measurements of breath acetone was interrogated and validated using certified gas chromatography-mass spectrometry. The results show that this new analyzer is useful for reliable online breath acetone analysis (online introduction of a breath sample without pre-treatment) with high sensitivity (57 ppb) and high data throughput (one data point per second). Subsequently, the validated breath analyzer was employed for acetone measurements in 119 human subjects under various situations. The instrument design, packaging, specifications, and future improvements are also described. From an optical ringdown cavity operated by the lab-built electronics reported previously to this fully integrated standalone new instrument, we have enabled a new scientific tool suited for large-scale breath acetone analysis and created an instrument platform that can be adapted for the study of other breath biomarkers by using different lasers and ringdown mirrors covering the corresponding spectral fingerprints.
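    The core cavity-ringdown relation behind such an analyzer converts the ringdown times measured with and without the absorber into an absorption coefficient. The numbers below are hypothetical illustrations, not values from the instrument:

```python
c_light = 2.9979e8  # speed of light, m/s

def absorption_coeff(tau, tau0):
    """Absorber extinction from cavity ringdown times (s):
    alpha = (1/c) * (1/tau - 1/tau0), in m^-1."""
    return (1.0 / c_light) * (1.0 / tau - 1.0 / tau0)

# Hypothetical ringdown times: 9.5 us with a breath sample in the cavity
# versus 10 us for the empty cavity.
alpha = absorption_coeff(9.5e-6, 10e-6)
# Dividing alpha by the acetone absorption cross-section (not given here)
# would yield the number density of acetone molecules.
print(f"{alpha:.2e}")  # on the order of 1e-5 m^-1
```

    Because each ringdown decay takes only microseconds, a fresh concentration estimate per second, as the abstract reports, is comfortably within reach.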

  5. Analyzing and Building Nucleic Acid Structures with 3DNA

    OpenAIRE

    Colasanti, Andrew V.; Lu, Xiang-Jun; Olson, Wilma K.

    2013-01-01

    The 3DNA software package is a popular and versatile bioinformatics tool with capabilities to analyze, construct, and visualize three-dimensional nucleic acid structures. This article presents detailed protocols for a subset of new and popular features available in 3DNA, applicable to both individual structures and ensembles of related structures. Protocol 1 lists the set of instructions needed to download and install the software. This is followed, in Protocol 2, by the analysis of a nucleic...

  6. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model for the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, compared to an ideal continuous simulation, a discrete simulation can incur significant performance and/or workload penalties.

  7. RELIABILITY ASSESSMENTS OF INFANT INCUBATOR AND THE ANALYZER

    OpenAIRE

    Özdemirci, Emre; Özarslan Yatak, Meral; Duran, Fecir; Canal, Mehmet Rahmi

    2014-01-01

    Approximately 80% of newborns in Turkey are placed in neonatal incubators because of health problems. An incubator used for treatment may seriously harm a baby's health if it adjusts or measures its parameters incorrectly. In this study, complications arising from inaccurate adjustment and measurement of incubator parameters were investigated. Current infant incubator analyzers were surveyed and their deficiencies were evaluated considering the standards and clin...

  8. A fully integrated standalone portable cavity ringdown breath acetone analyzer

    Science.gov (United States)

    Sun, Meixiu; Jiang, Chenyu; Gong, Zhiyong; Zhao, Xiaomeng; Chen, Zhuying; Wang, Zhennan; Kang, Meiling; Li, Yingxin; Wang, Chuji

    2015-09-01

    Breath analysis is a promising technique for nonintrusive disease diagnosis and metabolic status monitoring. One challenging issue in using a breath biomarker for disease screening is finding a quantitative relationship between the concentration of the biomarker and the clinical diagnostic parameters of the specific disease. Addressing this issue requires an instrument capable of real-time, online breath analysis with high data throughput, so that a large-scale clinical test (more subjects) can be completed in a short period of time. In this work, we report a fully integrated, standalone, portable analyzer based on the cavity ringdown spectroscopy technique for near-real-time, online breath acetone measurements. The performance of the portable analyzer in measurements of breath acetone was interrogated and validated against certified gas chromatography-mass spectrometry. The results show that the new analyzer is suitable for reliable online breath acetone analysis (online introduction of a breath sample without pre-treatment) with high sensitivity (57 ppb) and high data throughput (one data point per second). Subsequently, the validated breath analyzer was employed for acetone measurements in 119 human subjects under various conditions. The instrument design, packaging, specifications, and future improvements are also described. From an optical ringdown cavity operated by the previously reported lab-built electronics to this fully integrated standalone instrument, we have enabled a new scientific tool suited for large-scale breath acetone analysis and created an instrument platform that can also be adopted for the study of other breath biomarkers by using different lasers and ringdown mirrors covering the corresponding spectral fingerprints.

  9. Emission pathway modeling to analyze national ambition levels of decarbonization

    International Nuclear Information System (INIS)

    Kainuma, Mikiko; Waisman, Henri

    2015-01-01

    The Deep Decarbonization Pathways Project (DDPP) is a knowledge network comprising 15 Country Research Teams and several Partner Organizations that develop and share methods, assumptions, and findings related to deep decarbonization. It analyzes the technical decarbonization potential, exploring options for deep decarbonization while also taking existing infrastructure stocks into account. It shows the possibility of reducing total energy-related CO2 emissions by 45% by 2050, based on bottom-up analyses by the 15 Country Research Teams.

  10. Development of Process for Analyzing Anthocyanin Contents in Bilberries

    OpenAIRE

    Lieskoski, Sami

    2017-01-01

    This thesis was conducted as part of the Industry Nordic project, which aims at increasing the use of non-wood forest products (NWFP) and especially developing the Nordic berry business. The company Marja Bothnia Berries Oy Ltd. obtained a new UV-Vis spectrophotometer for analysis of the anthocyanin contents of the bilberries it sells and wished to have a method developed for this. A method for analyzing anthocyanin contents was developed based on previous literature and on methods used by the...

  11. Resolution of VISION, a crystal-analyzer spectrometer

    International Nuclear Information System (INIS)

    Seeger, Philip A.; Daemen, Luke L.; Larese, John Z.

    2009-01-01

    We present both analytic and Monte Carlo calculations of the resolution of VISION, which is a crystal-analyzer spectrometer based on the TOSCA design. The analyzer crystal in VISION is configured to focus in time, radial, and transverse directions ('triple focused'). Previously published analytical results have two serious flaws in the handling of the statistics, which gave misleading results. First, Gaussian distributions were assumed for all resolution components, so that full-width-half-maximum could be used. Not only is this a very poor approximation for most terms, it is also completely unnecessary because standard deviations can be combined in quadrature for any shape distribution (except Lorentzian). The second flaw was the choice of variables that are not independent, so that significant correlations were ignored. An example of the effect of including correlations is that the mosaic spread of the analyzer crystals does not contribute to the resolution in first order. Monte Carlo simulation is not limited to first order, and we find a mild optimum value for mosaic spread. A complete set of six independent variables is: neutron emission time, incident flight-path variation (due to moderator tilt), sample thickness, mean path in the analyzer (due to multiple reflections), sample-to-detector radial distance, and detector thickness. We treat separately the resolution contributions from histogramming and rebinning during data acquisition and reduction, and describe a scheme for VISION that minimizes the effect on resolution. We compare the contributions of the six variables to the total resolution, both analytically and by Monte Carlo simulations of a complete VISION model using the Neutron Instrument Simulation Package (NISP).
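    The quadrature rule the abstract appeals to (standard deviations of independent components combine in quadrature regardless of the component distributions' shapes, Lorentzian excepted) can be sketched as follows; the six component widths below are hypothetical illustration values, not VISION's actual numbers:

```python
import math

def combine_in_quadrature(sigmas):
    """Total resolution width from independent components: standard
    deviations add in quadrature for any shape of distribution
    (except Lorentzian), so no Gaussian/FWHM assumption is needed."""
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical widths (arbitrary time units) for the six independent
# terms listed in the abstract: emission time, incident flight path,
# sample thickness, analyzer path, radial distance, detector thickness.
components = [1.2, 0.8, 0.5, 0.4, 0.6, 0.3]
total = combine_in_quadrature(components)
```

The combined width is dominated by the largest terms, which is why correctly identifying a complete set of *independent* variables matters: correlated terms cannot simply be added this way.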

  12. A Fuzzy Logic System to Analyze a Student's Lifestyle

    OpenAIRE

    Ghosh, Sourish; Boob, Aaditya Sanjay; Nikhil, Nishant; Vysyaraju, Nayan Raju; Kumar, Ankit

    2016-01-01

    A college student's life can be broadly categorized into domains such as education, health, social life, and other activities, which may include daily chores and travelling time. Time management is crucial for every student. An awareness of one's daily time expenditure in the various domains is therefore essential to maximize one's effective output. This paper presents how a mobile application using Fuzzy Logic and the Global Positioning System (GPS) analyzes a student's lifestyle and provides recom...
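    Fuzzy-logic scoring of the kind described above starts from membership functions; a minimal sketch with hypothetical fuzzy sets for daily study hours (not the paper's actual rule base):

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 outside (a, c), rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets for hours spent studying per day.
def study_low(h):  return triangular(h, -1, 0, 3)
def study_ok(h):   return triangular(h, 2, 5, 8)
def study_high(h): return triangular(h, 7, 10, 15)
```

A full system would combine such memberships through fuzzy if-then rules across the domains (education, health, social) and defuzzify the aggregate into a recommendation.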

  13. Neutral Particle Analyzer Measurements of Ion Behavior in NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; R.E. Bell; D.S. Darrow; A.L. Roquemore

    2002-02-06

    Initial results obtained with the Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) are presented. Magnetohydrodynamic activity and reconnection events cause depletion of the deuterium energetic ion distribution created by neutral-beam injection. Adding High Harmonic Fast Wave Heating to neutral-beam-heated discharges results in the generation of an energetic ion tail above the beam injection energy. NPA measurements of the residual hydrogen ion temperature are in good agreement with those from recombination spectroscopy.

  14. Analyzing the Impact of Globalization on Economic Growth

    OpenAIRE

    Farid Ullah; Abdur Rauf; Nasir Rasool

    2014-01-01

    Globalization is a buzzword that has gained significant importance in the reform agenda since 1980. The current study attempts to analyze the impact of globalization on the economic growth of Pakistan. The sample period for this study ranges from 1980 to 2009. For the empirical analysis, an Autoregressive Distributed Lag model is employed, while the Augmented Dickey-Fuller test is applied for data analysis. It is found that all the variables are stationary at first difference. The empirical finding...

  15. Analyzing IT Service Delivery in an ISP from Nicaragua

    Science.gov (United States)

    Flores, Johnny; Rusu, Lazar; Johanneson, Paul

    This paper presents a method for analyzing IT service delivery and its application in an Internet Service Provider (ISP). The proposed method is based on ITIL processes and the case study technique; it includes questionnaires, semi-structured interviews, focus groups, and documents as sources of factual information. Applying the method allows the ISP to determine its practices and the limitations of its IT service delivery.

  16. Presenting and analyzing movie stimuli for psychocinematic research

    OpenAIRE

    Arthur P. Shimamura

    2013-01-01

    Movies have an extraordinary way of capturing our perceptual, conceptual and emotional processes. As such, they offer a useful means of exploring psychological phenomena in the laboratory. Until recently, it has been rather difficult to present animated stimuli and collect behavioral responses online. However, with advances in digital technology and commercially available software to construct, present, and analyze movies for behavioral investigations, there is a growing interest in psychoci...

  17. Theoretical reflectivities of bent crystal analyzers for fusion plasma diagnostics

    International Nuclear Information System (INIS)

    Caciuffo, R.; Ferrero, C.; Francescangeli, O.; Melone, S.

    1990-01-01

    The performance of curved crystal analyzers used in the plasma diagnostic spectrometers at several fusion laboratories is evaluated by means of a physical model based on the dynamical theory of x-ray diffraction. We present reflectivity curves and diffraction parameters calculated as a function of the crystal curvature for different wavelengths corresponding to the most relevant spectra of metal impurity ions present in high-energy laboratory plasma sources.

  18. Analyzing the Number of Varieties in Frequently Found Flows

    Science.gov (United States)

    Shomura, Yusuke; Watanabe, Yoshinori; Yoshida, Kenichi

    Abnormal traffic that causes various problems on the Internet, such as P2P flows, DDoS attacks, and Internet worms, is increasing; therefore, the importance of methods that identify and control abnormal traffic is also increasing. Though the application of frequent-itemset-mining techniques is a promising way to analyze Internet traffic, the huge amount of data on the Internet prevents such techniques from being effective. To overcome this problem, we have developed a simple frequent-itemset-mining method that uses only a small amount of memory but is effective even with the large volumes of data associated with broadband Internet traffic. Using our method also involves analyzing the number of distinct elements in the itemsets found, which helps identify abnormal traffic. We used a cache-based implementation of our method to analyze actual data on the Internet and demonstrated that such an implementation can be used to provide on-line analysis of data while using only a small amount of memory.
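    The flavor of memory-bounded frequency counting over a traffic stream can be illustrated with the Space-Saving algorithm, a standard fixed-cache frequency sketch; this is an illustrative stand-in, not the authors' cache-based implementation:

```python
class SpaceSavingCounter:
    """Approximate heavy-hitter counting with a fixed-size cache, in the
    spirit of the memory-bounded frequent-itemset method described above."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.counts = {}  # item -> (over)estimated count

    def add(self, item):
        if item in self.counts:
            self.counts[item] += 1
        elif len(self.counts) < self.capacity:
            self.counts[item] = 1
        else:
            # Cache full: evict the current minimum and let the newcomer
            # inherit its count, so counts are overestimates by at most
            # the evicted minimum.
            victim = min(self.counts, key=self.counts.get)
            self.counts[item] = self.counts.pop(victim) + 1

    def top(self, k):
        return sorted(self.counts.items(), key=lambda kv: -kv[1])[:k]
```

Feeding flow identifiers (e.g. source/destination pairs) through `add` keeps memory bounded by `capacity` while still surfacing the dominant flows, which is the property that makes on-line analysis of broadband traffic feasible.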

  19. FIONA: A new mass analyzer for superheavy elements

    Science.gov (United States)

    Esker, Nicholas; Gates, Jacklyn; Pang, Gregory; Gregorich, Kenneth

    2015-10-01

    Six new superheavy elements (Z = 113-118) and over fifty new transactinide isotopes (Z > 104) have been synthesized in compound nuclear fusion reactions using 48Ca beams on actinide targets in the last 15 years. These superheavy elements (SHE) are short-lived, and their decay chains end before reaching nuclides with unambiguously determined Z or A. At the LBNL 88-Inch Cyclotron, we use the Berkeley Gas-Filled Separator (BGS) to study the production and decay of SHE produced at rates of a few atoms per week. The BGS's high beam suppression comes with poor mass resolution, and detection is hindered by high background rates from the proximity to the target and beamstop. Ongoing upgrades to the BGS, including product thermalization and transport, will allow us to couple a mass analyzer to the BGS. Known as FIONA (Fast Identification Of Nuclide A), the analyzer is a mass separator designed for 100% transmission with an expected mass resolving power (A/ΔA) of 2000. These upgrades will greatly increase sensitivity by delivering mass-separated superheavy element nuclei to a low-background detector system on a 10-ms timescale. The current progress in commissioning the FIONA mass analyzer and the future directions of the project will be presented.

  20. Performance evaluation of Samsung LABGEO(HC10) Hematology Analyzer.

    Science.gov (United States)

    Park, Il Joong; Ahn, Sunhyun; Kim, Young In; Kang, Seon Joo; Cho, Sung Ran

    2014-08-01

    The Samsung LABGEO(HC10) Hematology Analyzer (LABGEO(HC10)) is a recently developed automated hematology analyzer that uses impedance technologies. The analyzer provides 18 parameters, including a 3-part differential, at a maximum rate of 80 samples per hour. To evaluate the performance of the LABGEO(HC10), we assessed precision, linearity, carryover, and the relationship for complete blood cell count parameters between the LABGEO(HC10) and the LH780 (Beckman Coulter Inc) in a university hospital in Korea according to the Clinical and Laboratory Standards Institute guidelines. Sample stability and differences due to the anticoagulant used (K₂EDTA versus K₃EDTA) were also evaluated. The LABGEO(HC10) showed linearity over a wide range and minimal carryover; correlation with the LH780 was high (r > 0.92) for all parameters except mean corpuscular hemoglobin concentration. The estimated bias was acceptable for all parameters investigated except monocyte count. Most parameters were stable for 24 hours both at room temperature and at 4°C. The difference by anticoagulant type was statistically insignificant for all parameters except a few red cell parameters. The accuracy achievable and the simplicity of operation make the unit recommendable for small to medium-sized laboratories.
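    Carryover in such CLSI-style evaluations is conventionally estimated from three consecutive high samples followed by three low samples; a sketch with illustrative white-cell counts, not data from the cited study:

```python
def carryover_percent(high_runs, low_runs):
    """CLSI-style carryover estimate: the fraction of the first low
    result attributable to the preceding high material, computed as
    100 * (L1 - L3) / (H3 - L3)."""
    h3 = high_runs[2]
    l1, _, l3 = low_runs
    return 100.0 * (l1 - l3) / (h3 - l3)

# Illustrative WBC counts (x10^9/L): three high runs, then three low runs.
example = carryover_percent([98.5, 99.1, 98.8], [2.10, 2.02, 2.00])
```

A value well under 1% is typically taken as "minimal carryover" in analyzer evaluations of this kind.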

  1. ANALYZING CONSUMER BEHAVIOR IN BANKING SECTOR OF KOSOVO

    Directory of Open Access Journals (Sweden)

    Vjosa Fejza

    2017-12-01

    Considering the importance of understanding, analyzing and studying consumer behavior and behavior models, it was deemed necessary to conduct research on this issue. As part of this research, consumer behavior models in the banking system of Kosovo were studied and analyzed. The first part of the study reviews various literature, publications and scientific journals related to understanding the role and importance of consumer behavior in enterprises. The second part of the study includes a survey questionnaire with a sample of 500 individual clients, randomly selected from commercial banks in Kosovo. The survey collected data to determine the behavior models of existing consumers in the banking sector and to analyze the various internal and external factors which influence such behaviors. Finally, the data obtained from the questionnaire surveys were used to draw conclusions on the issues central to this research and to issue recommendations which may be useful to commercial banks currently operating in Kosovo, as well as to other financial institutions interested in this field.

  2. Design and Construction of an Autonomous Low-Cost Pulse Height Analyzer and a Single Channel Analyzer for Moessbauer Spectroscopy

    International Nuclear Information System (INIS)

    Velasquez, A.A.; Trujillo, J.M.; Morales, A.L.; Tobon, J.E.; Gancedo, J.R.; Reyes, L.

    2005-01-01

    A multichannel analyzer (MCA) and a single-channel analyzer (SCA) for Moessbauer spectrometry applications have been designed and built. Both systems include low-cost digital and analog components. A microcontroller manages, in either PHA or MCS mode, the data acquisition, data storage, and setting of the pulse discriminator limits. The user can monitor the system from an external PC through the serial port with the RS232 communication protocol. A graphical interface made with the LabVIEW software allows the user to digitally adjust the lower and upper limits of the pulse discriminator, and to visualize as well as save the PHA spectra in a file. The system has been tested using a 57Co radioactive source and several iron compounds, yielding satisfactory results. The low cost of its design, construction and maintenance makes this equipment an attractive choice when assembling a Moessbauer spectrometer

  3. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of cell mechanisms using different technologies, in order to explain the relationships among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat these different microarray formats, coupled with clinical data, in a combined way. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not supported in μ-CS. Micro-Analyzer is provided as a standalone Java tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  4. Aqueduct Global Flood Analyzer - bringing risk information to practice

    Science.gov (United States)

    Ward, Philip

    2017-04-01

    The economic losses associated with flooding are huge and rising. As a result, there is increasing attention for strategic flood risk assessments at the global scale. In response, the last few years have seen a large growth in the number of global flood models. At the same time, users and practitioners require flood risk information in a format that is easy to use, understandable, transparent, and actionable. In response, we have developed the Aqueduct Global Flood Analyzer (wri.org/floods). The Analyzer is a free, online, easy to use, tool for assessing global river flood risk at the scale of countries, states, and river basins, using data generated by the state of the art GLOFRIS global flood risk model. The Analyzer allows users to assess flood risk on-the-fly in terms of expected annual urban damage, and expected annual population and GDP affected by floods. Analyses can be carried out for current conditions and under future scenarios of climate change and socioeconomic development. We will demonstrate the tool, and discuss several of its applications in practice. In the past 15 months, the tool has been visited and used by more than 12,000 unique users from almost every country, including many users from the World Bank, Pacific Disaster Center, Red Cross Climate Centre, as well as many journalists from major international news outlets. Use cases will be presented from these user communities. We will also present ongoing research to improve the user functionality of the tool in the coming year. This includes the inclusion of coastal flood risk, assessing the costs and benefits of adaptation, and assessing the impacts of land subsidence and urban extension on risk.
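    The "expected annual damage" risk metric the Analyzer reports is, in general form, the area under the damage versus exceedance-probability curve; a minimal sketch with hypothetical return periods and damages, not GLOFRIS data:

```python
def expected_annual_damage(return_periods, damages):
    """Expected annual damage (EAD) as the trapezoidal-rule area under
    the damage vs. exceedance-probability curve; the standard measure
    behind 'expected annual urban damage' figures."""
    probs = [1.0 / t for t in return_periods]   # annual exceedance probabilities
    pairs = sorted(zip(probs, damages))         # ascending probability
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical damages (million USD) for 2-, 10-, and 100-year floods.
ead = expected_annual_damage([2, 10, 100], [5.0, 40.0, 250.0])
```

Evaluating this curve under current and future scenarios is what allows risk under climate change and socioeconomic development to be compared on a single monetary scale.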

  5. A coastal surface seawater analyzer for nitrogenous nutrient mapping

    Science.gov (United States)

    Masserini, Robert T.; Fanning, Kent A.; Hendrix, Steven A.; Kleiman, Brittany M.

    2017-11-01

    Satellite-data-based modeling of chlorophyll indicates that ocean waters in the mesotrophic category are responsible for the majority of oceanic net primary productivity. Coastal waters frequently have surface chlorophyll values in the mesotrophic range, strong horizontal chlorophyll gradients, and large temporal variations. Programs of detailed coastal nutrient surveys are therefore essential to the study of the dynamics of oceanic net primary productivity, along with land-use impacts on estuarine and coastal ecosystems. The degree of variability in these regions necessitates flexible instrumentation capable of near-real-time analysis to detect and monitor analytes of interest. This work describes the development of a portable coastal surface seawater analyzer for nutrient mapping that can simultaneously elucidate, with high resolution, the distributions of nitrate, nitrite, and ammonium - the three principal nitrogenous inorganic nutrients in coastal systems. The approach focuses on the use of pulsed xenon flash lamps to construct an analyzer that can be adapted to any automated chemistry with fluorescence detection. The system has two heaters, on-the-fly standardization, on-board data logging, an independent 24-volt direct-current power supply, an internal local operating network, a 12-channel peristaltic pump, four rotary injection/selection valves, and an intuitive graphical user interface. Using the methodology of Masserini and Fanning (2000), the detection limits for ammonium, nitrite, and nitrate plus nitrite were 11, 10, and 22 nM, respectively. A field test of the analyzer in Gulf of Mexico coastal waters demonstrated its ability to monitor and delineate the complexity of inorganic nitrogen nutrient enrichments within a coastal system.

  6. Nonlinear programming technique for analyzing flocculent settling data.

    Science.gov (United States)

    Rashid, Md Mamunur; Hayes, Donald F

    2014-04-01

    The traditional graphical approach for drawing iso-concentration curves to analyze flocculent settling data and design sedimentation basins poses difficulties for computer-based design methods. Thus, researchers have developed empirical approaches to analyze settling data. In this study, the ability of five empirical approaches to fit flocculent settling test data is compared. Particular emphasis is given to compare rule-based SETTLE and rule-based nonlinear programming (NLP) techniques as a viable alternative to the modeling methods of Berthouex and Stevens (1982), San (1989), and Ozer (1994). Published flocculent settling data are used to test the suitability of these empirical approaches. The primary objective, however, is to determine if the results of a NLP optimization technique are more reliable than those of other approaches. For this, mathematical curve fitting is conducted and the modeled concentration data are graphically compared to the observed data. The design results in terms of average solid removal efficiency as a function of detention times are also compared. Finally, the sum of squared errors values from these approaches are compared. The results indicate a strong correlation between observed and NLP modeled concentration data. The SETTLE and NLP approaches tend to be more conservative at lower retention times and less conservative at longer retention times. The SETTLE approach appears to be the most conservative. In terms of sum of squared errors values, NLP appears to be rank number one (i.e., best model) for eight data sets and number two for six data sets among 15 data sets. Therefore, NLP is recommended for analyzing flocculent settling data as a logical extension of other approaches. The NLP approach is further recommended as it is an optimization technique and uses conventional mathematical algorithms that can be solved using widely available software such as EXCEL and LINGO.
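    The curve-fitting step can be sketched with an assumed first-order removal model and a crude grid search standing in for the nonlinear-programming solver; the model form, grids, and data below are hypothetical (the cited methods of Berthouex and Stevens, San, and Ozer each use their own model forms):

```python
import math

def settle_model(t, c0, k):
    """Assumed first-order removal: concentration decays exponentially
    with settling time t from an initial concentration c0."""
    return c0 * math.exp(-k * t)

def sse(c0, k, times, observed):
    """Sum of squared errors between model and observed concentrations,
    the objective the NLP approach minimizes."""
    return sum((settle_model(t, c0, k) - c) ** 2 for t, c in zip(times, observed))

def fit_grid(times, observed, c0_grid, k_grid):
    """Crude grid search as a stand-in for an NLP solver: choose the
    (C0, k) pair minimizing the sum of squared errors."""
    return min(((c0, k) for c0 in c0_grid for k in k_grid),
               key=lambda p: sse(p[0], p[1], times, observed))
```

In practice an optimizer (e.g. the solvers in EXCEL or LINGO mentioned in the abstract) replaces the grid search, but the objective, minimizing SSE against the observed iso-concentration data, is the same.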

  7. Automated Plasmodium detection by the Sysmex XN hematology analyzer.

    Science.gov (United States)

    Dumas, Cécile; Bienvenu, Anne-Lise; Girard, Sandrine; Picot, Stéphane; Debize, Gisèle; Durand, Brigitte

    2018-01-03

    Malaria is a potentially severe disease affecting nearly 200 million people per year. Early detection of the parasite, even in unsuspected patients, remains a key challenge for effective patient care. Automated complete blood counts, which are usually performed for any febrile patient, might represent a tool to ascertain malaria infection. To evaluate the ability of the new generation of the Sysmex hematology analyzer (XN-series) to detect malaria, we retrospectively studied 100 blood samples run on the recent Sysmex XN analyzer that were positive for Plasmodium and explored its ability to detect the parasite. 100 samples from patients uninfected by malaria were used as a control group. Specific abnormalities, such as additional events in the mature neutrophil/eosinophil area of the white blood cell differential (WDF) scattergram, were noted for 1.1% of Plasmodium falciparum samples and 56.2% of other Plasmodium species samples. Mature parasite stages (schizonts or gametocytes) were observed on blood smears among those samples. WDF scattergrams were able to detect 80.0% (12/15) of Plasmodium mature stages. Furthermore, the differential in white blood counts between the WDF and white cell nucleated (WNR) channels was a predictive signal of Plasmodium mature stages in 73.3% (11/15) of samples and may be explained by differential destruction of particles by the analyzer reagent. Associated with thrombocytopaenia, a Sysmex XN Plasmodium pattern may represent a useful warning for Plasmodium detection in unsuspected patients, particularly when mature parasite stages are present.

  8. A Portable, Field-Deployable Analyzer for Isotopic Water Measurements

    Science.gov (United States)

    Berman, E. S.; Gupta, M.; Huang, Y. W.; Lacelle, D.; McKay, C. P.; Fortson, S.

    2015-12-01

    Water stable isotopes have for many years been used to study the hydrological cycle, catchment hydrology, and polar climate, among other applications. Typically, discrete water samples are collected and transported to a laboratory for isotope analysis. Due to the expense and labor associated with such sampling, isotope studies have generally been limited in scope and time resolution. Field sampling of water isotopes has been shown in recent years to provide dense data sets, with the increased time resolution illuminating substantially greater short-term variability than is generally observed during discrete sampling. A truly portable instrument also opens the possibility of using the instrument as a tool for identifying which water samples would be particularly interesting for further laboratory investigation. To make such field measurements of liquid water isotopes possible, Los Gatos Research has developed a miniaturized, field-deployable liquid water isotope analyzer. The prototype miniature liquid water isotope analyzer (mini-LWIA) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology in a rugged Pelican case housing for easy transport and field operations. The analyzer simultaneously measures both δ2H and δ18O from liquid water, with both manual and automatic water introduction options. The laboratory precision for δ2H is 0.6 ‰, and for δ18O is 0.3 ‰. The mini-LWIA was deployed in the high Arctic during the summer of 2015 at Inuvik in the Canadian Northwest Territories. Samples were collected from Sachs Harbor, on the southwest coast of Banks Island, including buried basal ice from the Laurentide Ice Sheet, ice wedges, and other types of ground ice. Methodology and water analysis results from this extreme field deployment will be presented.

  9. Development of turbine cycle performance analyzer using intelligent data mining

    International Nuclear Information System (INIS)

    Heo, Gyun Young

    2004-02-01

    In recent years, the performance enhancement of the turbine cycle in nuclear power plants has been highlighted because of the worldwide deregulation environment. In particular, the primary target of operating plants has become the reduction of operating costs to compete with other power plants. The overhaul interval is known to be closely related to operating cost. Through field investigation, the author identified that rapid and reliable performance tests, analysis, and diagnosis play an important role in controlling the overhaul interval. First, a technical road map was proposed to clearly set up the objectives. The key issues were grouped into data gathering, analysis tools, and diagnosis methods. The author proposes an integrated solution on the basis of intelligent data mining techniques. For reliable data gathering, a state analyzer composed of statistical regression, wavelet analysis, and a neural network was developed. The role of the state analyzer is to estimate unmeasured data and to increase the reliability of the collected data. For advanced performance analysis, a performance analysis toolbox was developed. This tool makes the analysis process easier and more accurate by providing three novel heat balance diagrams, and it includes the state analyzer and a turbine cycle simulation code. In the diagnosis module, a probabilistic technique based on a Bayesian network model and a deterministic technique based on an algebraic model are provided together, balancing the uncertainty in the diagnosis process against pin-point capability. All the modules were validated with simulated data as well as actual test data, and some modules are used in industrial applications. Much remains to be improved in the turbine cycle in order to increase plant availability. This study was accomplished to draw attention to the importance of the turbine cycle and to propose solutions on the basis of academic as well as industrial needs

  10. Analyzing Oscillations of a Rolling Cart Using Smartphones and Tablets

    Science.gov (United States)

    Egri, Sándor; Szabó, Lóránt

    2015-03-01

    It is well known that "interactive engagement" helps students to understand basic concepts in physics.1 Performing experiments and analyzing measured data are effective ways to realize interactive engagement, in our view. Some experiments need special equipment, measuring instruments, or laboratories, but in this activity we advocate student use of mobile phones or tablets to take experimental data. Applying their own devices and measuring simple phenomena from everyday life can improve student interest, while still allowing precise analysis of data, which can give deeper insight into scientific thinking and provide a good opportunity for inquiry-based learning.2

  11. SIMULAND - A CODE TO ANALYZE DYNAMIC EFFECTS DURING LANDING

    Directory of Open Access Journals (Sweden)

    Marcel STERE

    2010-03-01

The landing gear of an aircraft is part of the aircraft structure. It supports the aircraft during landing, the most critical part of the flight mission, and is also the component most likely to cause problems in aircraft design. Landing gear design combines the best in mechanical, structural, and hydraulic design. The designed landing gear should be able to meet the specifications and requirements imposed by the CS-23 regulations. SIMULAND-01 is a program intended to analyze a reduced model (4-30 DoF) of the aircraft under transient dynamic loads during the landing phase (touchdown).

  12. Using linear programming to analyze and optimize stochastic flow lines

    DEFF Research Database (Denmark)

    Helber, Stefan; Schimmelpfeng, Katja; Stolletz, Raik

    2011-01-01

This paper presents a linear programming approach to analyze and optimize flow lines with limited buffer capacities and stochastic processing times. The basic idea is to solve a huge but simple linear program that models an entire simulation run of a multi-stage production process in discrete time, in order to determine a production rate estimate. As the methodology is purely numerical, it offers the full modeling flexibility of stochastic simulation with respect to the probability distribution of processing times. However, unlike discrete-event simulation models, it also offers the optimization power of linear programming.
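The record above describes modeling an entire sampled run of a flow line as one linear program. A minimal sketch of that idea, for a hypothetical two-stage line with sampled per-period capacities and a finite buffer (not the authors' exact formulation), using `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
T, B = 20, 2.0                       # periods and buffer capacity (hypothetical)
c1 = rng.uniform(0.5, 1.5, T)        # sampled per-period capacity, stage 1
c2 = rng.uniform(0.5, 1.5, T)        # sampled per-period capacity, stage 2

n = 2 * T                            # x = [y1_0..y1_{T-1}, y2_0..y2_{T-1}], cumulative outputs
A, b = [], []
for t in range(T):
    r = np.zeros(n); r[t] = 1.0
    if t: r[t - 1] = -1.0
    A.append(r); b.append(c1[t])     # stage-1 capacity: y1_t - y1_{t-1} <= c1_t
    r = np.zeros(n); r[T + t] = 1.0
    if t: r[T + t - 1] = -1.0
    A.append(r); b.append(c2[t])     # stage-2 capacity
    r = np.zeros(n); r[T + t] = 1.0; r[t] = -1.0
    A.append(r); b.append(0.0)       # stage 2 cannot exceed upstream output
    r = np.zeros(n); r[t] = 1.0; r[T + t] = -1.0
    A.append(r); b.append(B)         # finite buffer between the stages

obj = np.zeros(n); obj[-1] = -1.0    # maximize final cumulative output y2_{T-1}
res = linprog(obj, A_ub=np.array(A), b_ub=np.array(b), bounds=(0, None))
rate_estimate = res.x[-1] / T        # production rate for this sampled scenario
print(res.success, round(float(rate_estimate), 3))
```

Dividing the optimal final cumulative output by the number of periods gives a rate estimate for one sampled scenario; averaging over many sampled capacity sequences would approximate the stochastic line's throughput.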

  13. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

Clinical Bioinformatics is a growing field based on the integration of clinical and omics data, aiming at the development of personalized medicine. The introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may therefore help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patient genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automating the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing databases of SNPs (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies.

  14. Demonstration of capabilities of high temperature composites analyzer code HITCAN

    Science.gov (United States)

    Singhal, Surendra N.; Lackney, Joseph J.; Chamis, Christos C.; Murthy, Pappu L. N.

    1990-01-01

The capabilities of HITCAN, a high temperature composites analyzer code which predicts the global structural and local stress-strain response of multilayered metal matrix composite structures, are demonstrated. The response can be determined both at the constituent (fiber, matrix, and interphase) level and at the structure level, and includes fabrication process effects. The thermo-mechanical properties of the constituents are considered to be nonlinearly dependent on several parameters, including temperature, stress, and stress rate. The computational procedure employs an incremental iterative nonlinear approach utilizing a multifactor-interactive constituent material behavior model. Various features of the code are demonstrated through example problems for typical structures.

  15. Uncertainty in Analyzed Water and Energy Budgets at Continental Scales

    Science.gov (United States)

    Bosilovich, Michael G.; Robertson, F. R.; Mocko, D.; Chen, J.

    2011-01-01

Operational analyses and retrospective analyses provide all the physical terms of the water and energy budgets, guided by the assimilation of atmospheric observations. However, there is significant reliance on the numerical models, and so uncertainty in the budget terms is always present. Here, we use a recently developed data set consisting of a mix of 10 analyses (both operational and retrospective) to quantify the uncertainty of analyzed water and energy budget terms for GEWEX continental-scale regions, following the evaluation approach of Dr. John Roads using individual reanalysis data sets.

  16. Multi-faceted data gathering and analyzing system

    International Nuclear Information System (INIS)

    Gustavson, D.B.; Rich, K.

    1977-10-01

    A low-cost general purpose data gathering and analyzing system based on a microprocessor, an interface to CAMAC, and a phone link to a time-sharing system was implemented. The parts cost for the microprocessor system was about $6000. The microprocessor buffers the data such that the variable response of the time-sharing system is acceptable for performing real-time data acquisition. The full power and flexibility of the time-sharing system excels at the task of on-line data analysis once this buffering problem is solved. 4 figures

  17. Analyzing the use of pins in safety bearings

    DEFF Research Database (Denmark)

    da Fonseca, Cesar A. L. L.; Weber, Hans I.; Fleischer, Philip F.

    2015-01-01

A new concept for safety bearings is analyzed: useful in emergency situations, it shall protect the bearing from destruction through the use of pins which impact with a disc, both capable of good energy dissipation. Results of work in progress are presented by validating partial stages; in particular, the variation of the energy content of the disc after each contact is considered. Currently, the validation of the clamped-free rotor is being carried out. The main goal is to design an automatic system capable of changing the gap when necessary, in order to avoid the rotor colliding with the inner part of the bearing.

  18. Designing experiments and analyzing data a model comparison perspective

    CERN Document Server

    Maxwell, Scott E

    2013-01-01

Through this book's unique model comparison approach, students and researchers are introduced to a set of fundamental principles for analyzing data. After seeing how these principles can be applied in simple designs, students are shown how the same principles also apply in more complicated designs. Drs. Maxwell and Delaney believe that the model comparison approach better prepares students to understand the logic behind a general strategy of data analysis appropriate for various designs, and builds a stronger foundation that allows for the introduction of more complex topics.

  19. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

Clinical Bioinformatics is a growing field based on the integration of clinical and omics data, aiming at the development of personalized medicine. The introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may therefore help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patient genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automating the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing databases of SNPs (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies.
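The association step described in the abstract, testing whether SNP presence is linked to drug response, amounts to a 2x2 contingency test. A generic illustration with `scipy.stats.fisher_exact` on made-up counts (this is not DMET-Analyzer's code, and the numbers are hypothetical):

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table: rows = SNP present / SNP absent,
# columns = responders / non-responders (made-up counts for illustration)
table = [[12, 3],
         [5, 10]]
odds_ratio, p_value = fisher_exact(table)
print(round(odds_ratio, 1), p_value < 0.05)
```

A small p-value suggests the SNP's presence and the drug response are not independent in the sample; a real pharmacogenomic analysis would also correct for multiple testing across many SNPs.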

  20. Liver plasma membranes: an effective method to analyze membrane proteome.

    Science.gov (United States)

    Cao, Rui; Liang, Songping

    2012-01-01

Plasma membrane proteins are critical for the maintenance of biological systems and represent important targets for the treatment of disease. The hydrophobicity and low abundance of plasma membrane proteins make them difficult to analyze. The protocols given here are efficient isolation/digestion procedures for liver plasma membrane proteomic analysis. Both a protocol for the isolation of plasma membranes and a protocol for the in-gel digestion of gel-embedded plasma membrane proteins are presented. The latter method allows the use of a high detergent concentration to achieve efficient solubilization of hydrophobic plasma membrane proteins while avoiding interference with the subsequent LC-MS/MS analysis.

  1. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

Today, GUI plug-in development is typically done in a very ad-hoc way, where developers dive directly into implementation. Without any prior analysis and design, plug-ins are often flaky, unreliable, difficult to maintain and extend with new functionality, and have inconsistent user interfaces. This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, as well as formal architecture design.

  2. Single particle aerodynamic relaxation time analyzer. [for aerosol pollutants

    Science.gov (United States)

    Mazumder, M. K.; Kirsch, K. J.

    1977-01-01

    An instrument employing a laser Doppler velocimeter and a microphone to measure the phase lag of the motion of aerosol particulates relative to the motion of the fluid medium within an acoustic field is described. The relaxation times and aerodynamic diameters of the particles or droplets are determined in real time from the measured values of phase lag; thus, the size analysis is independent of the electrostatic charges and refractive indices of the particulates. The instrument is suitable for analyzing the aerodynamic size spectrum of atmospheric particulate pollutants with aerodynamic diameters ranging from 0.1 to 10.0 microns.

  3. Analyzing the drivers of green manufacturing with fuzzy approach

    DEFF Research Database (Denmark)

    Govindan, Kannan; Diabat, Ali; Madan Shankar, K.

    2015-01-01

Green issues have gained importance with contemporary globalization. In recent years, manufacturing processes have had to engage with green issues due to the social and environmental concerns involved. The drivers of green manufacturing, however, have not been thoroughly investigated. Responses were therefore gathered from India and, aided by these replies, a pair-wise comparison was made among the drivers. The pair-wise comparison was used as input data, on whose basis the drivers were analyzed using a fuzzy Multi Criteria Decision Making (MCDM) approach.

  4. Resolving 3D magnetism in nanoparticles using polarization analyzed SANS

    Science.gov (United States)

    Krycka, K. L.; Booth, R.; Borchers, J. A.; Chen, W. C.; Conlon, C.; Gentile, T. R.; Hogg, C.; Ijiri, Y.; Laver, M.; Maranville, B. B.; Majetich, S. A.; Rhyne, J. J.; Watson, S. M.

    2009-09-01

Utilizing a polarized 3He cell as an analyzer, we were able to perform a full polarization analysis on small-angle neutron scattering (SANS) data from an ensemble of 7 nm magnetite nanoparticles. The results led to a clear separation of magnetic and nuclear scattering plus a 3D vectorial decomposition of the observed magnetism. At remanence, the variation in long-range magnetic correlation length was found to be highly dependent on temperature from 50 to 300 K. Additionally, we were able to compare the magnetic scattering from moments along and perpendicular to an applied field at saturation and in remanence.

  5. Systems and methods for modeling and analyzing networks

    Science.gov (United States)

    Hill, Colin C; Church, Bruce W; McDonagh, Paul D; Khalil, Iya G; Neyarapally, Thomas A; Pitluk, Zachary W

    2013-10-29

The systems and methods described herein utilize a probabilistic modeling framework for reverse engineering an ensemble of causal models from data, and then forward simulating the ensemble of models to analyze and predict the behavior of the network. In certain embodiments, the systems and methods described herein include data-driven techniques for developing causal models for biological networks. Causal network models include computational representations of the causal relationships between independent variables, such as a compound of interest, and dependent variables, such as measured DNA alterations and changes in mRNA, protein, and metabolites, through to phenotypic readouts of efficacy and toxicity.

  6. Analyzing Motives, Preferences, and Experiences in Video Game Play

    Directory of Open Access Journals (Sweden)

    Donald Loffredo

    2017-04-01

This paper presents the results of analyzing motives, preferences, and experiences in video game play. A sample of 112 students (64 male and 48 female) completed the Gaming Attitudes, Motives, and Experiences Scales (GAMES) online. Separate one-way independent-measures multivariate analyses of variance (MANOVAs) were used to determine whether there were statistically significant differences by gender, age category, hours of video game play, and ethnicity on the nine factor subscales of the GAMES. The results supported two of the proposed hypotheses: there were statistically significant differences by gender and by hours of video game play on some of the factor subscales of the GAMES.

  7. Analyzing the aesthetics of participation of media architecture

    DEFF Research Database (Denmark)

    Fritsch, Jonas; Grönvall, Erik; Breinbjerg, Morten

    2016-01-01

This paper presents a theoretical framework for analyzing the aesthetics of participation of media architecture. The framework is based on a close reading of French philosopher Jacques Rancière and provides four points of emphasis: modes of sense perception, forms of engagement, community, and emancipation. The framework is put to use in the analysis of three experimental media architectural projects: Ekkomaten/Echoes from Møllevangen, the coMotion Bench, and FeltRadio. We discuss the findings from this analysis and outline future perspectives on how to develop and use the framework prospectively in the design of media architectural projects and other interactive environments.

  8. Analyzing high school students' reasoning about electromagnetic induction

    Science.gov (United States)

    Jelicic, Katarina; Planinic, Maja; Planinsic, Gorazd

    2017-06-01

Electromagnetic induction is an important, yet complex, physics topic that is part of the Croatian high school curriculum. Nine Croatian high school students of different abilities in physics were interviewed using six demonstration experiments from electromagnetism (three of them concerned the topic of electromagnetic induction). Students were asked to observe, describe, and explain the experiments. The analysis of students' explanations indicated the existence of many conceptual and reasoning difficulties with the basic concepts of electromagnetism, and especially with recognizing and explaining the phenomenon of electromagnetic induction. Three student mental models of electromagnetic induction, formed during the interviews and recurring among students, are described and analyzed within the knowledge-in-pieces framework.

  9. Development of an Adolescent Depression Ontology for Analyzing Social Data.

    Science.gov (United States)

    Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min; Jeon, Eunjoo; Kim, Ae Ran; Lee, Joo Yun

    2015-01-01

Depression in adolescence is associated with significant suicidality; it is therefore important to detect the risk for depression and provide timely care to adolescents. This study aims to develop an ontology for collecting and analyzing social media data about adolescent depression. The ontology was developed using the 'Ontology Development 101' methodology. Important terms were extracted from several clinical practice guidelines and postings on social networking services. We extracted 777 terms, which were categorized into 'risk factors', 'signs and symptoms', 'screening', 'diagnosis', 'treatment', and 'prevention'. The ontology developed in this study can be used as a framework for understanding adolescent depression using unstructured data from social media.

  10. Mass analyzed threshold ionization spectroscopy of indazole cation

    Science.gov (United States)

    Su, Huawei; Pradhan, Manik; Tzeng, Wen Bih

    2005-08-01

We have recorded the two-color resonant two-photon mass analyzed threshold ionization (MATI) spectra of indazole via four intermediate states. The adiabatic ionization energy of this molecule is determined to be 67 534 ± 5 cm⁻¹. The observed MATI bands include in-plane ring bending as well as out-of-plane ring twisting and bending vibrations of the indazole cation. Comparing the present data with those of indole and 7-azaindole leads to a better understanding of the influence of the nitrogen atom in the aza-aromatic bicyclic system.

  11. Radioactive beam experiments using the Fragment Mass Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Davids, C.N.

    1994-04-01

    The Fragment Mass Analyzer (FMA) is a recoil mass spectrometer that has many potential applications in experiments with radioactive beams. The FMA can be used for spectroscopic studies of nuclei produced in reactions with radioactive beams. The FMA is also an ideal tool for studying radiative capture reactions of astrophysical interest, using inverse kinematics. The FMA has both mass and energy dispersion, which can be used to efficiently separate the reaction recoils from the primary beam. When used with radioactive beams, the FMA allows the recoils from radiative capture reactions to be detected in a low-background environment.

  12. Methods for Analyzing Electric Load Shape and its Variability

    Energy Technology Data Exchange (ETDEWEB)

    Price, Philip

    2010-05-12

Current methods of summarizing and analyzing electric load shape are discussed briefly and compared. Simple rules of thumb for graphical display of load shapes are suggested. We propose a set of parameters that quantitatively describe the load shape in many buildings. Using the example of a linear regression model to predict load shape from time and temperature, we show how quantities such as the load's sensitivity to outdoor temperature, and the effectiveness of demand response (DR), can be quantified. Examples are presented using real building data.
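The regression idea in the abstract can be sketched generically: regress hourly load on hour-of-day indicators plus outdoor temperature, and read the temperature coefficient as the load's temperature sensitivity. The data below are synthetic and the model form is an assumption, not the report's exact specification:

```python
import numpy as np

rng = np.random.default_rng(1)
hours = np.arange(24 * 14) % 24                    # two weeks of hourly timestamps
temp = 15 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
# Synthetic load: base + temperature response + a working-hours bump + noise
load = 50 + 2.5 * temp + 8 * ((hours >= 9) & (hours < 17)) + rng.normal(0, 2, hours.size)

# Design matrix: one indicator per hour of day, plus outdoor temperature
X = np.column_stack([np.eye(24)[hours], temp])
beta, *_ = np.linalg.lstsq(X, load, rcond=None)
temp_sensitivity = beta[-1]                        # load change per degree
print(round(float(temp_sensitivity), 2))
```

The hour-of-day indicators absorb the schedule-driven part of the load, so the temperature coefficient is identified from temperature variation not explained by time of day; the same fit could quantify a demand-response effect by adding an event indicator column.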

  13. The wireshark field guide analyzing and troubleshooting network traffic

    CERN Document Server

    Shimonski, Robert

    2013-01-01

The Wireshark Field Guide provides hackers, pen testers, and network administrators with practical guidance on capturing and interactively browsing computer network traffic. Wireshark is the world's foremost network protocol analyzer, with a rich feature set that includes deep inspection of hundreds of protocols, live capture, offline analysis, and many other features. The Wireshark Field Guide covers the installation, configuration, and use of this powerful multi-platform tool, giving readers the hands-on skills to be more productive with Wireshark.

  14. 3002 Humidified Tandem Differential Mobility Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Uin, Janek [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Brechtel Manufacturing Inc. (BMI) Humidified Tandem Differential Mobility Analyzer (HT-DMA Model 3002) (Brechtel and Kreidenweis 2000a,b, Henning et al. 2005, Xerxes et al. 2014) measures how aerosol particles of different initial dry sizes grow or shrink when exposed to changing relative humidity (RH) conditions. It uses two different mobility analyzers (DMA) and a humidification system to make the measurements. One DMA selects a narrow size range of dry aerosol particles, which are exposed to varying RH conditions in the humidification system. The second (humidified) DMA scans the particle size distribution output from the humidification system. Scanning a wide range of particle sizes enables the second DMA to measure changes in size or growth factor (growth factor = humidified size/dry size), due to water uptake by the particles. A Condensation Particle Counter (CPC) downstream of the second DMA counts particles as a function of selected size in order to obtain the number size distribution of particles exposed to different RH conditions.

  15. Modeling and Analyzing the Slipping of the Ball Screw

    Directory of Open Access Journals (Sweden)

    Nannan Xu

This paper aims to set up a systematic model of ball slipping and to analyze the slipping characteristics caused by different factors for a ball screw operating at high speeds. To investigate the ball screw slipping mechanism, a transformed coordinate system is first established. This is then used to set up mathematical models for ball slipping caused by the three main causes, from which the slipping speeds can be calculated. Next, the influence of the contact angle, helix angle, and screw diameter on ball screw slipping is analyzed according to the slipping model and the slipping speed equations. Finally, the slipping analysis curve is compared with the curve of ball screw mechanical efficiency analyzed by Lin, which indirectly verifies the correctness of the slipping model. The slipping model and the slipping analysis curve established in this paper provide a theoretical basis for reducing slipping and improving the mechanical efficiency of a ball screw operating at high speeds.

  16. Algorithm for Analyzing Thermal Images of Laser Irradiated Human Skin.

    Science.gov (United States)

    Toumi, Johnny; Saiof, Fawaz; Bachir, Wesam

    2016-01-01

Introduction: Tracking temporal changes of temperature during laser skin treatment plays an important role in improving the treatment process itself. There are a number of methods to analyze the temporal dependency of temperature during laser skin treatment; some of them depend on imaging the skin with thermal cameras. However, the use of thermal cameras poses specific problems, including the difficulty of tracking the laser-skin interaction spot. This paper is dedicated to solving that problem using a digital image processing program coded in Matlab. Methods: Measurements were taken for 15 native Syrian subjects of different sex, age, and skin tone; the treated ailment was port wine stain. The clinical work (laser exposure) was performed at the hospital of dermatology of Damascus University. The treatment was observed with a thermal camera and analyzed using the proposed Matlab-coded tracking system. Results: For all subjects, the treatment laser spot was tracked, the curves of skin temperature change over time were calculated with the proposed algorithm, and the active time was calculated for each subject. The algorithm proved practical and robust. Conclusion: The proposed algorithm proved to be efficient and can be used to support future researchers with the capability to measure temperature at a high frame rate.
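The core of such spot tracking, locating the hottest pixel in each thermal frame and following its temperature over time, can be illustrated in a few lines. The original work is in Matlab; this is a hypothetical Python/NumPy analogue on synthetic frames, not the authors' algorithm:

```python
import numpy as np

def track_spot(frames):
    """Return the (row, col) of the hottest pixel and its value for each frame."""
    out = []
    for f in frames:
        idx = np.unravel_index(np.argmax(f), f.shape)   # location of the peak
        out.append((idx, float(f[idx])))                # (position, temperature)
    return out

# Synthetic sequence: a warm Gaussian spot on a 30-degree background drifting right
y, x = np.mgrid[0:64, 0:64]
frames = [30 + 15 * np.exp(-((x - (20 + 5 * k)) ** 2 + (y - 32) ** 2) / 50.0)
          for k in range(3)]
for (r, c), temp in track_spot(frames):
    print(r, c, round(temp, 1))
```

On real data one would smooth each frame first and restrict the search to a window around the previous position so that hot reflections elsewhere in the scene do not capture the tracker.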

  17. The PLT and PDX charge-exchange analyzers

    International Nuclear Information System (INIS)

    Mueller, D.; Hammett, G.; McCune, D.C.

    1986-01-01

The perpendicularly-viewing mass-resolving charge-exchange analyzers for PLT and PDX were built to measure the plasma ion temperature, central neutral density, and hydrogen-to-deuterium ratio. This paper discusses the measurements as they are affected by instrumental effects. In PDX, with perpendicular neutral deuterium beam injection into hydrogen plasmas, a small (∼ 1%) hydrogen impurity in the beam gives rise to an energetic tail on the observed hydrogen neutral spectrum. A simple model indicates that this contamination of the thermal spectrum causes the apparent ion temperature to be 5-20% too high; this effect is included in the analysis. The neutral density measurement relies on knowledge of the absolute detection efficiency. While this can in principle be measured in situ with a diagnostic neutral beam, a large uncertainty remains. Measurement of the H/D ratio in the plasma is limited by the mass rejection (1:1000) of the analyzer; it is primarily the deuterium flux at 1/2 the hydrogen energy that limits measurable H/D ratios to values above ∼ 0.5%.

  18. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  19. Analyzing students' attitudes towards science during inquiry-based lessons

    Science.gov (United States)

    Kostenbader, Tracy C.

Due to the logistics of guided-inquiry lessons, students learn to problem solve and develop critical thinking skills. This mixed-methods study analyzed students' attitudes towards science during inquiry lessons. My quantitative results from a repeated-measures survey showed no significant difference between student attitudes when taught with structured-inquiry versus guided-inquiry lessons. The qualitative results, analyzed through a constant-comparative method, did show that students generate positive interest, critical thinking, and low-level stress during guided-inquiry lessons. The qualitative research also gave insight into a teacher's transition to guided inquiry. This study showed that with my students, attitudes did not change during this transition according to the quantitative data; however, the qualitative data did show high levels of excitement. The results imply that students like guided-inquiry laboratories, even though they require more work, just as much as they like traditional laboratories with less work and less opportunity for creativity.

  20. Historical civilian nuclear accident based Nuclear Reactor Condition Analyzer

    Science.gov (United States)

    McCoy, Kaylyn Marie

There are significant challenges to successfully monitoring multiple processes within a nuclear reactor facility, as evidenced by the historical civilian nuclear incidents that have occurred with similar initiating conditions and sequences of events. Because the nuclear industry currently lacks monitoring of internal sensors across multiple processes for patterns of failure, this study developed a program directed at accomplishing that goal through an innovation that monitors these systems simultaneously. The inclusion of digital sensor technology within the nuclear industry has appreciably increased computer systems' capabilities to manipulate sensor signals, making it possible to meet these monitoring challenges. One such manipulation of signal data is explored in this study. The Nuclear Reactor Condition Analyzer (NRCA) program developed for this research, with the assistance of the Nuclear Regulatory Commission's Graduate Fellowship, utilizes one-norm distance and kernel weighting equations to normalize all nuclear reactor parameters under the program's analysis. This normalization allows the program to set more consistent parameter value thresholds for a simpler approach to analyzing the condition of the nuclear reactor under its scrutiny. The product of this research provides a means for the nuclear industry to implement a safety and monitoring program that can oversee the system parameters of a nuclear power reactor facility, such as a nuclear power plant.
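One way the described normalization could look, a one-norm distance to reference states turned into kernel weights, is sketched below. This is purely illustrative: the function name, the Gaussian kernel form, and the bandwidth `h` are assumptions, not the NRCA implementation:

```python
import numpy as np

def kernel_weighted_score(x, refs, h=1.0):
    """Compare a sensor vector x to reference states via one-norm distances
    turned into Gaussian kernel weights (illustrative sketch only)."""
    d = np.abs(refs - x).sum(axis=1)   # one-norm distance to each reference state
    w = np.exp(-(d / h) ** 2)          # kernel weighting: near states dominate
    return w / w.sum()                 # normalized similarity profile

refs = np.array([[1.0, 0.0],           # hypothetical reference parameter states
                 [0.0, 1.0],
                 [5.0, 5.0]])
print(kernel_weighted_score(np.array([0.9, 0.1]), refs).round(3))
```

Because the scores are normalized to sum to one, a single threshold can be applied consistently across parameters with very different raw scales, which is the benefit the abstract attributes to the normalization step.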

  1. Hardware Design Improvements to the Major Constituent Analyzer

    Science.gov (United States)

    Combs, Scott; Schwietert, Daniel; Anaya, Marcial; DeWolf, Shannon; Merrill, Dave; Gardner, Ben D.; Thoresen, Souzan; Granahan, John; Belcher, Paul; Matty, Chris

    2011-01-01

    The Major Constituent Analyzer (MCA) onboard the International Space Station (ISS) is designed to monitor the major constituents of the ISS's internal atmosphere. This mass spectrometer based system is an integral part of the Environmental Control and Life Support System (ECLSS) and is a primary tool for the management of ISS atmosphere composition. As a part of NASA Change Request CR10773A, several alterations to the hardware have been made to accommodate improved MCA logistics. First, the ORU 08 verification gas assembly has been modified to allow the verification gas cylinder to be installed on orbit. The verification gas is an essential MCA consumable that requires periodic replenishment. Designing the cylinder for subassembly transport reduces the size and weight of the maintained item for launch. The redesign of the ORU 08 assembly includes a redesigned housing, cylinder mounting apparatus, and pneumatic connection. The second hardware change is a redesigned wiring harness for the ORU 02 analyzer. The ORU 02 electrical connector interface was damaged in a previous on-orbit installation, and this necessitated the development of a temporary fix while a more permanent solution was developed. The new wiring harness design includes flexible cable as well as indexing fasteners and guide-pins, and provides better accessibility during the on-orbit maintenance operation. This presentation will describe the hardware improvements being implemented for MCA as well as the expected improvement to logistics and maintenance.

  2. Bicycle: a bioinformatics pipeline to analyze bisulfite sequencing data.

    Science.gov (United States)

    Graña, Osvaldo; López-Fernández, Hugo; Fdez-Riverola, Florentino; González Pisano, David; Glez-Peña, Daniel

    2018-04-15

    High-throughput sequencing of bisulfite-converted DNA is a technique used to measure DNA methylation levels. Although a considerable number of computational pipelines have been developed to analyze such data, none of them tackles all the peculiarities of the analysis together, revealing limitations that can force the user to manually perform additional steps needed for a complete processing of the data. This article presents bicycle, an integrated, flexible analysis pipeline for bisulfite sequencing data. Bicycle analyzes whole genome bisulfite sequencing data, targeted bisulfite sequencing data and hydroxymethylation data. To show how bicycle improves on other available pipelines, we compared them on a defined set of features, summarized in a table. We also tested bicycle with both simulated and real datasets to show its level of performance, and compared it to different state-of-the-art methylation analysis pipelines. Bicycle is publicly available under the GNU LGPL v3.0 license at http://www.sing-group.org/bicycle. Users can also download a customized Ubuntu LiveCD including bicycle and the other bisulfite sequencing data pipelines compared here. In addition, a docker image with bicycle and its dependencies, which allows a straightforward use of bicycle on any platform (e.g. Linux, OS X or Windows), is also available. ograna@cnio.es or dgpena@uvigo.es. Supplementary data are available at Bioinformatics online.

  3. BWR stability analysis with the BNL Engineering Plant Analyzer

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Rohatgi, U.S.

    1992-10-01

    The March 9, 1989 instability at the LaSalle-2 Power Plant and more than ninety related BWR transients have been simulated on the BNL Engineering Plant Analyzer (EPA). Power peaks were found to be potentially seventeen times greater than the rated power, flow reversal occurs momentarily during large power oscillations, the fuel centerline temperature oscillates between 1,030 and 2,090 K, while the cladding temperature oscillates between 560 and 570 K. The Suppression Pool reaches its specified temperature limit either never or in as little as 4.3 minutes, depending on operator actions and transient scenario. Thermohydraulic oscillations occur at low core coolant flow (both Recirculation Pumps tripped), with sharp axial or radial fission power peaking and with partial loss of feedwater preheating while the feedwater flow is kept high to maintain coolant inventory in the vessel. Effects from BOP systems were shown to influence reactor stability strongly through closed-loop resonance feedback. High feedwater flow and low temperature destabilize the reactor. Low feedwater flow restabilizes the reactor, because of steam condensation and feedwater preheating in the downcomer, which effectively reduces the destabilizing core inlet subcooling. The EPA has been found to be capable of analyzing BWR stability and has been shown to be effective for scoping calculations and for supporting accident management

  4. QA practice for online analyzers in water steam cycles

    International Nuclear Information System (INIS)

    Staub, L.

    2010-01-01

    The liberalization of power markets throughout the world has resulted in more and more power stations being operated in cycling mode, with frequent load changes and multiple daily start-up and shut-down cycles. This more flexible operation also calls for better automation and poses new challenges to water chemistry in water steam cycles, to avoid subsequent damage to vital plant components such as turbines, boilers or condensers. But automation for the most important chemistry control tool, the sampling and online analyzer system, is only possible if chemists can rely on their online analysis equipment. Proof of plausibility as well as reliability and availability of online analysis results becomes a major focus. While SOP and standard QA procedures for laboratory equipment are well established and daily practice, such measures are widely neglected for online process analyzers. This paper is aiming to establish a roadmap for the implementation of SOP and QA/QC procedures for online instruments in water steam cycles, leading to reliable chemical information that is trustworthy for process automation and chemistry control in water steam cycles. (author)

  6. Analyzing a stochastic process driven by Ornstein-Uhlenbeck noise

    Science.gov (United States)

    Lehle, B.; Peinke, J.

    2018-01-01

    A scalar Langevin-type process X (t ) that is driven by Ornstein-Uhlenbeck noise η (t ) is non-Markovian. However, the joint dynamics of X and η is described by a Markov process in two dimensions. But even though there exists a variety of techniques for the analysis of Markov processes, it is still a challenge to estimate the process parameters solely based on a given time series of X . Such a partially observed 2D process could, e.g., be analyzed in a Bayesian framework using Markov chain Monte Carlo methods. Alternatively, an embedding strategy can be applied, where first the joint dynamics of X and its temporal derivative X ˙ is analyzed. Subsequently, the results can be used to determine the process parameters of X and η . In this paper, we propose a more direct approach that is purely based on the moments of the increments of X , which can be estimated for different time-increments τ from a given time series. From a stochastic Taylor expansion of X , analytic expressions for these moments can be derived, which can be used to estimate the process parameters by a regression strategy.
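
As a sketch of the estimation step described above (with illustrative parameters, not the paper's), the moments of the increments can be computed for several lags tau from a simulated series:

```python
import numpy as np

def increment_moments(x, taus, dt, orders=(1, 2, 3, 4)):
    """Estimate the moments <(X(t+tau) - X(t))^n> for a set of time lags
    tau from a uniformly sampled series x with sampling step dt."""
    x = np.asarray(x)
    out = {}
    for tau in taus:
        lag = int(round(tau / dt))
        inc = x[lag:] - x[:-lag]
        out[tau] = {n: float(np.mean(inc ** n)) for n in orders}
    return out

# Toy data: Euler-Maruyama integration of X driven by OU noise eta
# (relaxation rates and noise amplitude are hypothetical).
rng = np.random.default_rng(0)
dt, n = 1e-3, 50_000
eta = np.zeros(n)
x = np.zeros(n)
for i in range(n - 1):
    eta[i + 1] = eta[i] - 2.0 * eta[i] * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()
    x[i + 1] = x[i] + (-1.0 * x[i] + eta[i]) * dt

m = increment_moments(x, taus=[0.01, 0.05, 0.1], dt=dt)
```

Fitting these moments against their analytic stochastic-Taylor expressions in tau is what allows the regression step of the proposed method to recover the process parameters.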

  7. [Study on factors influencing DNA sequencing by automatic genetic analyzer].

    Science.gov (United States)

    Yan, Shaofei; Wang, Wei; Xu, Jin; Bai, Li; Gan, Xin; Li, Fengqin

    2015-05-01

    To acquire accurate and successful DNA sequencing in a cost-effective way by ABI3500xl automatic genetic analyzer. BigDye was diluted to 8, 16 and 32 times in PCR product sequencing. Three different methods including CENTRI-SEP kit, BigDye cleaning beads and ethanol-NaAc-EDTA were used to purify the sequencing PCR products. The results of DNA sequencing were correct when BigDye was diluted up to 16 times. The misreading of nucleic acid bases was found as BigDye was diluted to 32 times. All three purification methods provided acceptable DNA sequencing results. In terms of method for purification of PCR products, the CENTRI-SEP Kit was the most expensive but time-saving (0.5 h), while ethanol-NaAc-EDTA method was the most economical but time-consuming (2 h). The BigDye cleaning beads method was of a suitable purification time (1 h) but not fit for high-throughput DNA sequencing. BigDye should be diluted up to 16 times in DNA sequencing by ABI3500xl DNA analyzer. Although all three purification methods may promise DNA sequencing results with good quality, it is necessary to choose an appropriate one to keep the balance between time and cost on the basis of the lab condition.

  8. Analyzing B-vitamins in Human Milk: Methodological Approaches.

    Science.gov (United States)

    Hampel, Daniela; Allen, Lindsay H

    2016-01-01

    According to the World Health Organization (WHO), infants should be exclusively breastfed for the first six months of life. However, there is insufficient information about the concentration of nutrients in human milk. For some nutrients, including B-vitamins, maternal intake affects their concentration in human milk, but the extent to which inadequate maternal diets affect milk B-vitamin content is poorly documented. Little is known about infant requirements for B-vitamins; recommendations are generally set as Adequate Intakes (AI), calculated on the basis of the mean volume of milk (0.78 L/day) consumed by infants exclusively fed with human milk from well-nourished mothers during the first six months, and the concentration of each vitamin in milk based on reported values. Methods used for analyzing B-vitamins (commonly microbiological, radioisotope dilution or, more recently, chromatographic, coupled with UV, fluorometric and MS detection) have rarely been validated for the complex human milk matrix. Thus the validity, accuracy, and sensitivity of analytical methods are important for understanding infant requirements for these nutrients and the maternal intakes needed to support adequate concentrations in breast milk. This review summarizes current knowledge on methods used for analyzing the B-vitamins thiamin, riboflavin, niacin, vitamin B-6 and pantothenic acid, vitamin B-12, folate, biotin, and choline in human milk, their chemical and physical properties, the different forms and changes in concentration during lactation, and the effects of deficiency on the infant.

  9. Particulate size distribution cascade analyzer for spacecraft contamination monitoring

    Science.gov (United States)

    Wallace, D. A.; Chuan, R. L.

    1975-01-01

    A cascade particulate analyzer was developed for near-real-time measurement of the contaminating particulate size distribution in the spacecraft interior ambient environment, and as a real-time total impacting particulate mass monitor under vacuum conditions. The analyzer has four stages; the first stage is a basic 10 MHz quartz crystal microbalance used widely on spacecraft (such as Skylab) for contamination monitoring purposes. In this application the front sensing crystal is coated with a low-vapor-pressure adhesive grease which captures impacting particles. This first stage has a wide viewing angle and measures total particulate mass impacting the crystal while the unit is exposed to the vacuum environment. The remaining three stages form an aerodynamic impaction cascade with individual quartz crystal microbalances at each stage acting as accumulated-mass sensing elements. These three stages thus give the relative mass distribution of particulates in three ranges: particles having effective diameter greater than 5 microns, particles between 1 and 5 microns in diameter, and particles 0.3 to 1 micron in diameter.

  10. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e. navigational structure as utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on shopping behaviour (i.e. users' internal states -affective, cognitive, and satisfaction- and behavioral responses -approach responses, and real shopping outcomes- within the retail online store created by computer, taking into account some mediator variables (i.e. involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) x 2 (on versus off music) x 2 (moving versus static images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As the main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animated GIFs and background music. The effect caused by the mediator variables relatively modifies the final shopping behavior.

  11. Analyzing Multimode Wireless Sensor Networks Using the Network Calculus

    Directory of Open Access Journals (Sweden)

    Xi Jin

    2015-01-01

    Full Text Available The network calculus is a powerful tool to analyze the performance of wireless sensor networks. But the original network calculus can only model single-mode wireless sensor networks. In this paper, we combine the original network calculus with the multimode model to analyze the maximum delay bound of the flow of interest in a multimode wireless sensor network. There are two combined methods, A-MM and N-MM. The method A-MM models the whole network as a multimode component, and the method N-MM models each node as a multimode component. We prove that the maximum delay bound computed by the method A-MM is tighter than or equal to that computed by the method N-MM. Experiments show that our proposed methods can significantly decrease the analytical delay bound compared with the separate flow analysis method. For a large-scale wireless sensor network with 32 thousand sensor nodes, our proposed methods can decrease the analytical delay bound by about 70%.
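
The classic single-node, single-mode bound that such analyses build on (this is the textbook network-calculus result, not the paper's A-MM/N-MM methods) can be stated in a few lines:

```python
def delay_bound(sigma, rho, service_rate, latency):
    """Classic single-node network-calculus delay bound: a flow with a
    token-bucket arrival curve alpha(t) = sigma + rho*t crossing a
    rate-latency service curve beta(t) = service_rate * max(t - latency, 0)
    experiences delay at most latency + sigma/service_rate,
    valid when the sustained rate rho does not exceed service_rate."""
    if rho > service_rate:
        raise ValueError("unstable: sustained arrival rate exceeds service rate")
    return latency + sigma / service_rate
```

The multimode extension in the paper effectively replaces the single service curve with mode-dependent service, which is why modeling the whole network as one multimode component (A-MM) can tighten the bound relative to composing per-node components (N-MM).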

  12. Multimode laser beam analyzer instrument using electrically programmable optics.

    Science.gov (United States)

    Marraccini, Philip J; Riza, Nabeel A

    2011-12-01

    Presented is a novel design of a multimode laser beam analyzer using a digital micromirror device (DMD) and an electronically controlled variable focus lens (ECVFL) that serve as the digital and analog agile optics, respectively. The proposed analyzer is a broadband laser characterization instrument that uses the agile optics to smartly direct light to the required point photodetectors to enable beam measurements of minimum beam waist size, minimum waist location, divergence, and the beam propagation parameter M(2). Experimental results successfully demonstrate these measurements for a 500 mW multimode test laser beam with a wavelength of 532 nm. The minimum beam waist, divergence, and M(2) experimental results for the test laser are found to be 257.61 μm, 2.103 mrad, 1.600 and 326.67 μm, 2.682 mrad, 2.587 for the vertical and horizontal directions, respectively. These measurements are compared to a traditional scan method and the results of the beam waist are found to be within error tolerance of the demonstrated instrument.
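
Assuming the reported divergence is the full angle, the M(2) figures above are consistent with the standard relation M(2) = pi * w0 * (theta/2) / lambda, which can be checked directly:

```python
import math

def m_squared(w0, theta_full, wavelength):
    """Beam propagation parameter from the minimum waist radius w0 and the
    full-angle far-field divergence theta_full:
    M^2 = pi * w0 * (theta_full / 2) / wavelength."""
    return math.pi * w0 * (theta_full / 2.0) / wavelength

# Reported values for the 532 nm test beam:
m2_vertical = m_squared(257.61e-6, 2.103e-3, 532e-9)    # ~1.600
m2_horizontal = m_squared(326.67e-6, 2.682e-3, 532e-9)  # ~2.587
```

Reproducing the published M(2) values from the published waist and divergence is a useful internal-consistency check on any beam analyzer's output.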

  13. Blood culture cross contamination associated with a radiometric analyzer

    International Nuclear Information System (INIS)

    Griffin, M.R.; Miller, A.D.; Davis, A.C.

    1982-01-01

    During a 9-day period in August 1980 in a New Jersey hospital, three pairs of consecutively numbered blood cultures from different patients were identified as positive for the same organism. For each pair, both cultures were positive in the same atmosphere, both organisms had the same sensitivities, and the second of each pair grew at least 2 days after the first and was the only positive blood culture obtained from the patient. When the hospital laboratory discontinued use of its radiometric culture analyzer for 15 days, no more consecutive pairs of positive cultures occurred. Subsequent use of the machine for 9 days with a new power unit but the original circuit boards resulted in one more similar consecutive pair (Staphylococcus epidermidis). After replacement of the entire power unit, there were no further such pairs. Examination of the machine by the manufacturer revealed a defective circuit board which resulted in inadequate needle sterilization. Laboratories which utilize radiometric analyzers should be aware of the potential for cross contamination. Recognition of such events requires alert microbiologists and infection control practitioners, and a record system in the bacteriology laboratory designed to identify such clusters.

  14. Analyzing the impact of human capital factors on competitiveness

    Directory of Open Access Journals (Sweden)

    Óhegyi Katalin

    2014-01-01

    Full Text Available There are a number of approaches to measuring national competitiveness. However, in these reports human capital typically appears only indirectly. The author's purpose is to uncover how human capital contributes to the competitiveness of economies and to propose an approach to identify the most effective improvement opportunities for countries, illustrated on the example of Hungary. The analysis is based on the data of the Global Talent Index Report (2011) and the Global Competitiveness Report 2012-2013. The components of the Global Talent Index (GTI) and their relation to the Global Competitiveness Index (GCI) were analyzed with a linear-programming-based similarity analysis method, component-based object comparison for objectivity (COCO). Based on the output of the analysis it was identified how sensitive the Global Competitiveness Index is to the components of the GTI. Hungary's position was analyzed further to quantify improvement opportunities and threats based on the step function resulting from the COCO analysis. The author concludes that the human resources of a country are a pivotal element of national competitiveness. By developing the human capital of the country, the overall competitive position may be improved. Areas of priority may be identified and the level of intervention may be quantified specific to a country. This could help policy makers decide on the allocation of resources to maximize effectiveness, improving (or protecting) a country's overall competitive position in the global arena.

  15. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on shopping human behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As the main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animated GIFs and background music. The effect caused by the mediator variables relatively modifies the final shopping behavior.

  16. Energy and mass-analyzer based on two plane capacitors

    International Nuclear Information System (INIS)

    Zashkvara, V.V.; Shestakov, V.P.; AN Kazakhskoj SSR, Alma-Ata. Inst. Yadernoj Fiziki)

    1983-01-01

    An energy and mass analyzer of charged particle beams consisting of two plane capacitors has been calculated. Energy analysis is performed in the static regime; mass analysis in the time-of-flight regime. Time-of-flight focusing is achieved with respect to both the beam divergence angle and the energy. Different modes of operation of the energy and mass analyzer were considered. In the first mode, energy analysis of a charged particle beam escaping a linearly extended source is accomplished and the energy distribution function N(W) is measured. In the second mode, energy analysis and time-of-flight mass analysis of the charged particle beam are performed, which permits measurement of the spectral distribution function N(W, m) in both mass and energy. In the third mode, time-of-flight mass analysis of the beam is accomplished, which enables measurement of the energy-integrated mass distribution function N(m). Thus, the simple device based on two plane capacitors can considerably improve the possibilities of energy and mass analysis of charged particle beams emitted from linearly extended sources
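
The time-of-flight regime rests on t = L / v with v = sqrt(2 q W / m), so ions of equal energy W separate in arrival time as sqrt(m); a small numerical illustration (the path length and energy are hypothetical, not from the paper):

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
AMU = 1.66053906660e-27     # atomic mass unit, kg

def flight_time(mass_amu, energy_ev, path_m, charge_state=1):
    """Time of flight over a field-free drift path for an ion of given
    mass and kinetic energy: t = L / v, with v = sqrt(2*q*W/m)."""
    m = mass_amu * AMU
    w = energy_ev * charge_state * E_CHARGE
    return path_m / math.sqrt(2.0 * w / m)

# Two singly charged ions with the same 100 eV energy over a 0.5 m path:
t_40 = flight_time(40.0, 100.0, 0.5)  # e.g. Ar+
t_4 = flight_time(4.0, 100.0, 0.5)    # e.g. He+
```

Since t scales as sqrt(m) at fixed energy, the ratio of arrival times for a 10:1 mass ratio is sqrt(10), which is what the time-of-flight mass analysis in the second and third operating modes exploits.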

  17. An image analyzer system for the analysis of nuclear traces

    International Nuclear Information System (INIS)

    Cuapio O, A.

    1990-10-01

    Within the project on nuclear traces and their application techniques, methods are being developed for the detection of nuclear reactions of low cross section (not detectable by conventional methods), for the study of accidental and personal neutron dosimeters, and for other purposes. All these studies are based on the fact that charged particles leave latent tracks in dielectrics which, if etched with appropriate chemical solutions, are revealed until they become visible under an optical microscope. From the analysis of the different track shapes, it is possible to obtain information on the characteristic parameters of the incident particles (charge, mass and energy). From the track density it is possible to obtain information on the flux of the incident radiation and consequently on the received dose. To carry out this analysis, different systems have been designed and coupled, which has allowed the solution of diverse problems. Nevertheless, it has become clear that to make this activity more versatile it is necessary to have an image analyzer system that allows images to be digitized, processed and displayed more rapidly. The present document presents a proposal to acquire the components needed to assemble an image analyzer system in support of the aforementioned project. (Author)

  18. Using GIS to analyze animal movements in the marine environment

    Science.gov (United States)

    Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.

    2001-01-01

    Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.
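
As an illustration of one of the simplest estimators in such toolkits, the 100% minimum convex polygon home range is just the area of the convex hull of the relocation points. A pure-Python sketch (this illustrates the method, not the extension's own code):

```python
def mcp_home_range(points):
    """100% minimum convex polygon home-range estimate: the area of the
    convex hull of the relocation points (Andrew's monotone chain for the
    hull, shoelace formula for the area). Area is in squared input units."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return 0.0

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    hull = lower[:-1] + upper[:-1]

    area = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

The parametric and nonparametric estimators in AMAE (kernel density, harmonic mean, etc.) refine this basic idea by weighting space use rather than taking the outer boundary alone.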

  19. Analyzing a 35-Year Hourly Data Record: Why So Difficult?

    Science.gov (United States)

    Lynnes, Chris

    2014-01-01

    At the Goddard Distributed Active Archive Center, we have recently added a 35-Year record of output data from the North American Land Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers a variety of data summarization and visualization to users that operate at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data has proven surprisingly resistant to application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations both at the algorithm and system level for 35 years of hourly data. Failures arose, sometimes unexpectedly, from command line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These serve as an early warning sign for the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid the issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving the user communities that try to scale up their current techniques to analyze Big Data.

  20. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed the educational and scientific applications of CMDA. Educationally, CMDA supported the summer school of the JPL Center for Climate Sciences for three years since 2014. In the summer school, the students work on group research projects where CMDA provides datasets and analysis tools. Each student is assigned to a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA is developed to keep track of students' usages of CMDA, and to recommend datasets and analysis tools for their research topic. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case developed is described and listed in terms of a scientific goal, datasets used, the analysis tools used, scientific results discovered from the use case, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is Difference Plot Service of

  1. Framework for Analyzing Android I/O Stack Behavior: From Generating the Workload to Analyzing the Trace

    Directory of Open Access Journals (Sweden)

    Sooman Jeong

    2013-12-01

    Full Text Available The existing I/O workload generators and trace capturing tools are not suited to generating or capturing the I/O requests of Android apps. The Android platform needs proper tools to capture and replay real-world workloads in order to verify the results of benchmark tools. This paper introduces the Android Storage Performance Analysis Tool, AndroStep, which is specifically designed for characterizing and analyzing the behavior of the I/O subsystem in Android-based devices. AndroStep consists of Mobibench (workload generator), MOST (Mobile Storage Analyzer), and Mobigen (workload replayer). Mobibench is an Android app that generates filesystem as well as SQLite database operations. Mobibench can also vary the number of concurrent threads to examine the filesystem's scalability to support concurrency, e.g., metadata updates, journal file creation/deletion. MOST captures the trace and extracts key filesystem access characteristics such as access pattern with respect to file types, ratio between random vs. sequential access, ratio between buffered vs. synchronous I/O, fraction of metadata accesses, etc. MOST implements a reverse mapping feature (finding an inode for a given block) and retrospective reverse mapping (finding an inode for a deleted file). Mobigen is a trace capturing and replaying tool that is specifically designed to perform the user experiment without actual human intervention. Mobigen records the system calls generated from the user behavior and sanitizes the trace into a replayable form. Mobigen can replay this trace on different Android platforms or with different I/O stack configurations. As an example of using AndroStep, we analyzed the performance of twelve Android smartphones and SQLite performance on five different filesystems. AndroStep makes otherwise time-consuming I/O stack analysis extremely versatile. AndroStep makes a significant contribution in terms of shedding light on internal behavior of

  2. PLC backplane analyzer for field forensics and intrusion detection

    Energy Technology Data Exchange (ETDEWEB)

    Mulder, John; Schwartz, Moses Daniel; Berg, Michael; Van Houten, Jonathan Roger; Urrea, Jorge Mario; King, Michael Aaron; Clements, Abraham Anthony; Trent, Jason; Depoy, Jennifer M; Jacob, Joshua

    2015-05-12

    The various technologies presented herein relate to the detection of unexpected and/or malicious activity occurring between components communicatively coupled across a backplane. Control data can be intercepted at a backplane, where the backplane facilitates communication between a controller and at least one device in an automation process. During interception, a copy of the control data is made, i.e., the original control data is replicated. The original control data continues on to its destination, while the copy is forwarded to an analyzer system to determine whether the control data contains a data anomaly. The content of the copy is compared with previously captured baseline data content, where the baseline data is captured for the same operational state as the subsequently captured control data.
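
    The baseline-comparison idea can be sketched as follows; the whole-frame fingerprinting below is a simplification of ours (a real analyzer would parse protocol fields rather than hash raw frames).

```python
import hashlib

def baseline_fingerprint(frames):
    """Summarize a capture of backplane frames as a set of message
    fingerprints (here simply the SHA-256 of each raw frame)."""
    return {hashlib.sha256(f).hexdigest() for f in frames}

def find_anomalies(baseline, capture):
    """Return frames in `capture` whose fingerprint never occurred in the
    baseline recorded for the same operational state."""
    known = baseline_fingerprint(baseline)
    return [f for f in capture
            if hashlib.sha256(f).hexdigest() not in known]

# Hypothetical frames: the second captured frame was never seen at baseline.
baseline = [b"\x01WRITE coil=3 val=1", b"\x01READ reg=7"]
capture = [b"\x01READ reg=7", b"\x01WRITE coil=9 val=1"]
print(find_anomalies(baseline, capture))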

  3. Program for Analyzing Flows in a Complex Network

    Science.gov (United States)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
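
    The coupling of conservation equations with Newton-Raphson iteration can be sketched on a toy network; the two-branch system, quadratic resistance law, and coefficient values below are illustrative assumptions, not GFSSP's actual formulation.

```python
import math

# Hypothetical two-branch network: fixed boundary pressures P1 and P3 and
# one internal node P2. Each branch obeys dP = k * Q**2, so Q = sqrt(dP/k).
# Mass conservation at node 2 requires Q12 - Q23 = 0.
P1, P3 = 200.0, 100.0   # boundary pressures
k12, k23 = 2.0, 1.0     # branch resistance coefficients

def residual(P2):
    q12 = math.sqrt((P1 - P2) / k12)
    q23 = math.sqrt((P2 - P3) / k23)
    return q12 - q23

def newton(f, x, h=1e-6, tol=1e-10, max_iter=50):
    """Scalar Newton-Raphson with a central finite-difference derivative."""
    for _ in range(max_iter):
        d = (f(x + h) - f(x - h)) / (2 * h)
        step = f(x) / d
        x -= step
        if abs(step) < tol:
            break
    return x

P2 = newton(residual, 150.0)   # initial guess between the boundaries
print(P2)  # analytic solution: (P1 + 2*P3*k12/k23) balance gives 400/3
```

    GFSSP solves the same kind of nonlinear system, but with many nodes at once (a Jacobian matrix rather than a scalar derivative) and with successive substitution for the weakly coupled equations.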

  4. Final Report for "Analyzing and visualizing next generation climate data"

    Energy Technology Data Exchange (ETDEWEB)

    Pletzer, Alexander

    2012-11-13

    The project "Analyzing and visualizing next generation climate data" adds block-structured (mosaic) grid support, parallel processing, and 2D/3D curvilinear interpolation to the open-source UV-CDAT climate data analysis tool. Block-structured grid support complies with the Gridspec extension submitted to the Climate and Forecast metadata conventions. It contains two parts: aggregation of data spread over multiple mosaic tiles (M-SPEC) and aggregation of temporal data stored in different files (F-SPEC). Together, M-SPEC and F-SPEC allow users to interact with data stored in multiple files as if the data were in a single file. For computationally expensive tasks, a flexible, multi-dimensional, multi-type distributed array class allows users to process data in parallel using remote memory access. Both nodal and cell-based interpolation are supported; users can choose between different interpolation libraries, including ESMF and LibCF, depending on their particular needs.

  5. mixtools: An R Package for Analyzing Mixture Models

    Directory of Open Access Journals (Sweden)

    Tatiana Benaglia

    2009-10-01

    Full Text Available The mixtools package for R provides a set of functions for analyzing a variety of finite mixture models. These functions include both traditional methods, such as EM algorithms for univariate and multivariate normal mixtures, and newer methods that reflect some recent research in finite mixture models. In the latter category, mixtools provides algorithms for estimating parameters in a wide range of different mixture-of-regression contexts, in multinomial mixtures such as those arising from discretizing continuous multivariate data, in nonparametric situations where the multivariate component densities are completely unspecified, and in semiparametric situations such as a univariate location mixture of symmetric but otherwise unspecified densities. Many of the algorithms of the mixtools package are EM algorithms or are based on EM-like ideas, so this article includes an overview of EM algorithms for finite mixture models.
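
    The EM iteration for a univariate two-component normal mixture, the simplest case the package handles, can be sketched as follows; this is a textbook illustration of the idea behind mixtools' normalmixEM function, not the package's actual implementation.

```python
import math, random

def em_two_normals(data, n_iter=200):
    """EM for a two-component univariate normal mixture."""
    mu = [min(data), max(data)]          # crude initialisation
    sigma = [1.0, 1.0]
    w = [0.5, 0.5]                       # mixing weights
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            dens = [w[k] / (sigma[k] * math.sqrt(2 * math.pi))
                    * math.exp(-((x - mu[k]) ** 2) / (2 * sigma[k] ** 2))
                    for k in (0, 1)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate weights, means and standard deviations
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2
                                     for r, x in zip(resp, data)) / nk)
    return w, mu, sigma

# Synthetic data: two well-separated components with equal weight
random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(300)]
        + [random.gauss(5.0, 1.0) for _ in range(300)])
w, mu, sigma = em_two_normals(data)
print(sorted(mu))  # close to the true means 0 and 5
```

    mixtools extends this same E-step/M-step pattern to mixtures of regressions, multinomial mixtures, and the semiparametric cases described above.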

  6. Precise Measurement of Deuteron Tensor Analyzing Powers with BLAST

    International Nuclear Information System (INIS)

    Zhang, C.; Akdogan, T.; Bertozzi, W.; Botto, T.; Clasie, B.; DeGrush, A.; Dow, K.; Farkhondeh, M.; Franklin, W.; Gilad, S.; Hasell, D.; Kolster, H.; Maschinot, A.; Matthews, J.; Meitanis, N.; Milner, R.; Redwine, R.; Seely, J.; Shinozaki, A.; Tschalaer, C.

    2011-01-01

    We report a precision measurement of the deuteron tensor analyzing powers T_20 and T_21 at the MIT-Bates Linear Accelerator Center. Data were collected simultaneously over a momentum transfer range Q = 2.15-4.50 fm^-1 with the Bates Large Acceptance Spectrometer Toroid using a highly polarized deuterium internal gas target. The data are in excellent agreement with calculations in a framework of effective field theory. The deuteron charge monopole and quadrupole form factors G_C and G_Q were separated with improved precision, and the location of the first node of G_C was confirmed at Q = 4.19 ± 0.05 fm^-1. The new data provide a strong constraint on theoretical models in a momentum transfer range covering the minimum of T_20 and the first node of G_C.

  7. Analyzing Enterprise Networks Needs: Action Research from the Mechatronics Sector

    Science.gov (United States)

    Cagnazzo, Luca; Taticchi, Paolo; Bidini, Gianni; Baglieri, Enzo

    New business models and theories are nowadays developing in the direction of collaborative environments, and many new tools for sustaining the companies involved in such organizations are emerging. A plethora of methodologies for analyzing needs has already been developed for single companies, but few academic works are available on need analysis for Enterprise Networks (ENs). This paper presents the learning from an action research (AR) project in the mechatronics sector: AR was used to experience the issue of evaluating network needs and thereby to define, develop, and test a complete framework for network evaluation. Reflection on the story in the light of the experience and the theory is presented, as well as extrapolation to a broader context and articulation of usable knowledge.

  8. Analyzing the Bitcoin Network: The First Four Years

    Directory of Open Access Journals (Sweden)

    Matthias Lischke

    2016-03-01

    Full Text Available In this explorative study, we examine the economy and transaction network of the decentralized digital currency Bitcoin during the first four years of its existence. The objective is to develop insights into the evolution of the Bitcoin economy during this period. For this, we establish and analyze a novel integrated dataset that enriches data from the Bitcoin blockchain with off-network data such as business categories and geo-locations. Our analyses reveal the major Bitcoin businesses and markets. Our results also give insights into the business distribution by country and into how businesses evolve over time. We also show that there is a gambling network that features many very small transactions. Furthermore, regional differences in adoption and business distribution were found. In the network analysis, the small-world phenomenon is investigated and confirmed for several subgraphs of the Bitcoin network.
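
    The small-world property combines short path lengths with high clustering; the local clustering coefficient at the heart of such an analysis can be sketched in a few lines (toy graph below, not Bitcoin data).

```python
def clustering_coefficient(adj, v):
    """Fraction of pairs of v's neighbours that are themselves connected.
    `adj` maps node -> set of neighbours (undirected graph)."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

# Toy graph: a triangle (0, 1, 2) with a pendant node 3 attached to 0.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
print(clustering_coefficient(adj, 0))  # 1 of 3 neighbour pairs linked
```

    For a small-world verdict one would average this coefficient over all nodes and compare it, together with the mean shortest-path length, against a random graph of the same size and density.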

  9. Micro-motion analyzer used for dynamic MEMS characterization

    Science.gov (United States)

    Guo, Tong; Chang, Hong; Chen, Jinping; Fu, Xing; Hu, Xiaotang

    2009-03-01

    A computer-controlled micro-motion analyzer (MMA) for studying the dynamic behavior of the movable structures of MEMS is described in this paper. It employs two optical nondestructive methods: computer microvision for in-plane motion measurement and phase-shifting interferometry for out-of-plane motion measurement. This fully integrated system includes a high-performance imaging system, drive electronics, data acquisition, and data analysis software. The system can freeze the fast motions of MEMS devices using strobed illumination and measure motions in three dimensions with nanometer accuracy. The static measurement accuracy and repeatability of the system are calibrated against a step-height standard certified by the National Institute of Standards and Technology (NIST). The capabilities of the system are illustrated with a study of the dynamic behavior of a surface-micromachined polysilicon micro-resonator.
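
    Phase-shifting interferometry of this kind commonly uses the standard four-step algorithm, which recovers the phase from four interferograms shifted by a quarter fringe; a single-pixel sketch with synthetic intensities (not MMA data):

```python
import math

def four_step_phase(i0, i1, i2, i3):
    """Standard four-step phase-shifting formula: intensities recorded at
    phase shifts of 0, pi/2, pi and 3*pi/2 recover the wrapped phase."""
    return math.atan2(i3 - i1, i0 - i2)

# Synthetic fringe: background A = 2, modulation B = 1, true phase 0.7 rad
A, B, phi = 2.0, 1.0, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
print(four_step_phase(*frames))  # recovers the true phase, 0.7
```

    The recovered phase maps to surface height via h = phi * lambda / (4 * pi) for a reflective setup, which is how interferometric out-of-plane measurements reach nanometer accuracy.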

  10. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    International Nuclear Information System (INIS)

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-01-01

    During the course of the CesrTA program at Cornell, more than 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires the use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and best-fit values for important simulation parameters can be obtained with a chi-square minimization method.
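
    The chi-square minimization step can be sketched as a grid search over a simulation parameter; the "simulation" below is a made-up linear model standing in for an actual cloud simulation code.

```python
def chi_square(data, model, sigma):
    """Chi-square between measured and simulated values with errors sigma."""
    return sum((d - m) ** 2 / s ** 2 for d, m, s in zip(data, model, sigma))

def best_fit(data, sigma, simulate, param_grid):
    """Run the simulation for each candidate parameter, keep the best fit."""
    return min(param_grid,
               key=lambda p: chi_square(data, simulate(p), sigma))

# Hypothetical toy model: RFA signal proportional to a yield parameter y
measured = [2.1, 4.0, 5.9]
errors = [0.1, 0.1, 0.1]
simulate = lambda y: [y * x for x in (1, 2, 3)]
y_best = best_fit(measured, errors, simulate, [i / 100 for i in range(100, 300)])
print(y_best)  # near the least-squares optimum of about 1.99
```

    In the real analysis the "simulate" step is a full electron-cloud code whose output is postprocessed through the RFA detector model, so each chi-square evaluation is far more expensive than this sketch.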

  11. Research and analyze of physical health using multiple regression analysis

    Directory of Open Access Journals (Sweden)

    T. S. Kyi

    2014-01-01

    Full Text Available This paper presents research aimed at creating a mathematical model of the "healthy person" using regression analysis. The factors are physical parameters of the person (such as heart rate, lung capacity, blood pressure, breath holding, weight-height coefficient, flexibility of the spine, muscles of the shoulder belt, abdominal muscles, squatting, etc.), and the response variable is an indicator of physical working capacity. Multiple regression analysis yielded useful models that can predict the physical performance of boys aged fourteen to seventeen. The paper presents the development of the regression model for sixteen-year-old boys and analyzes the results.
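
    The multiple-regression fit underlying such a model can be sketched with ordinary least squares via the normal equations; the two-predictor data below are made up for illustration, not the study's measurements.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved with Gaussian elimination. Rows of X are observations; a
    column of ones is prepended for the intercept."""
    X = [[1.0] + list(row) for row in X]
    n = len(X[0])
    # Build the normal-equation system A b = c
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(n)]
         for i in range(n)]
    c = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for j in range(i, n):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    # Back substitution
    b = [0.0] * n
    for i in reversed(range(n)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, n))) / A[i][i]
    return b  # [intercept, coef1, coef2, ...]

# Two predictors with the exact relationship y = 1 + 2*x1 + 3*x2
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1)]
y = [1, 3, 4, 6, 8]
print(fit_linear(X, y))  # recovers [1.0, 2.0, 3.0] up to rounding
```

    With real, noisy physiological data the same fit produces the coefficient estimates whose significance the paper then tests.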

  12. Urban forests and pollution mitigation: Analyzing ecosystem services and disservices

    International Nuclear Information System (INIS)

    Escobedo, Francisco J.; Kroeger, Timm; Wagner, John E.

    2011-01-01

    The purpose of this paper is to integrate the concepts of ecosystem services and disservices when assessing the efficacy of using urban forests for mitigating pollution. A brief review of the literature identifies some pollution mitigation ecosystem services provided by urban forests. Existing ecosystem services definitions and typologies from the economics and ecological literature are adapted and applied to urban forest management and the concepts of ecosystem disservices from natural and semi-natural systems are discussed. Examples of the urban forest ecosystem services of air quality and carbon dioxide sequestration are used to illustrate issues associated with assessing their efficacy in mitigating urban pollution. Development of urban forest management alternatives that mitigate pollution should consider scale, contexts, heterogeneity, management intensities and other social and economic co-benefits, tradeoffs, and costs affecting stakeholders and urban sustainability goals. - Environmental managers should analyze ecosystem services and disservices when developing urban forest management alternatives for mitigating urban pollution.

  13. PMD: A Resource for Archiving and Analyzing Protein Microarray data.

    Science.gov (United States)

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-Hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-Ce

    2016-01-27

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarrays, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed the Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experiment name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization and candidate identification, and an in-depth bioinformatics analysis of the candidates, which includes functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn.

  14. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation… understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited… for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug…

  15. Outline of fast analyzer for MHD equilibrium 'FAME'

    International Nuclear Information System (INIS)

    Sakata, Shinya; Haginoya, Hirofumi; Tsuruoka, Takuya; Aoyagi, Tetsuo; Saito, Naoyuki; Harada, Hiroo; Tani, Keiji; Watanabe, Hideto.

    1994-03-01

    The FAME (Fast Analyzer for Magnetohydrodynamic (MHD) Equilibrium) system has been developed in order to provide more than 100 MHD equilibria in time series, which are enough for the non-stationary analysis of the experimental data of JT-60, within the approximately 20-minute shot interval. FAME is an MIMD-type small-scale parallel computer with 20 microprocessors connected by a multi-stage switching system. The maximum theoretical speed is 250 MFLOPS. For the software system of FAME, the MHD equilibrium analysis code SELENE and its input data production code FBI were tuned up taking parallel processing into consideration. Consequently, the computational performance of the FAME system is more than 7 times faster than that of the existing general-purpose computer FACOM M780-10s. This report summarizes the outline of the FAME system, including hardware, software and peripheral equipment. (author)

  16. On social inequality: Analyzing the rich-poor disparity

    Science.gov (United States)

    Eliazar, Iddo; Cohen, Morrel H.

    2014-05-01

    From the Old Testament to the Communist Manifesto, and from the French Revolution to the Occupy Wall Street protests, social inequality has always been at the focal point of public debate, as well as a major driver of political change. Although of prime interest since Biblical times, the scientific investigation of the distributions of wealth and income in human societies began only at the close of the nineteenth century, and was pioneered by Pareto, Lorenz, Gini, and Pietra. The methodologies introduced by these trailblazing scholars form the bedrock of the contemporary science of social inequality. Based on this bedrock we present a new quantitative approach to the analysis of wealth and income distributions, which sets its spotlight on the most heated facet of the current global debate on social inequality: the rich-poor disparity. Our approach offers researchers highly applicable quantitative tools to empirically track and statistically analyze the growing gap between the rich and the poor.
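
    The Gini index, one of the classical inequality measures named above, can be computed directly from the discrete Lorenz curve; a minimal sketch:

```python
def gini(values):
    """Gini index as 1 minus twice the area under the discrete Lorenz
    curve, computed with the trapezoid rule over sorted values."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    cum, area = 0.0, 0.0
    for x in xs:
        prev = cum
        cum += x / total           # cumulative share of total wealth
        area += (prev + cum) / (2 * n)   # trapezoid slice of width 1/n
    return 1 - 2 * area

print(gini([1, 1, 1, 1]))      # perfect equality gives 0.0
print(gini([10, 1, 1, 1, 1]))  # concentrated wealth gives a larger index
```

    Zero corresponds to perfect equality and values approaching one to extreme concentration; the rich-poor disparity measures proposed in the paper are refinements beyond this single summary number.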

  17. Analyzing Tibetan Monastics Conception of Universe Through Their Drawings

    Science.gov (United States)

    Sonam, Tenzin; Chris Impey

    2016-06-01

    Every culture and tradition has their own representation of the universe that continues to evolve through new technologies and discoveries, and as a result of cultural exchange. With the recent introduction of Western science into the Tibetan Buddhist monasteries in India, this study explores the monastics’ conception of the universe prior to their formal instruction in science. Their drawings were analyzed using Tversky’s three criteria for drawing analysis namely—segmentation, order, and hierarchical structure of knowledge. Among the sixty Buddhist monastics included in this study, we find that most of them draw a geocentric model of the universe with the Solar System as the dominant physical system, reflecting little influence of modern astronomical knowledge. A few monastics draw the traditional Buddhist model of the world. The implications of the monastics' representation of the universe for their assimilation of modern science is discussed.

  18. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non-safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  19. Analyzing the Composition of HDI in European Countries

    Directory of Open Access Journals (Sweden)

    Ciupac-Ulici Maria-Lenuţa

    2015-12-01

    Full Text Available The Human Development Index (HDI) measures development in a country by combining indicators of life expectancy, education level, and income. In 2013, 187 countries were included in the index, whose coverage is intended to expand as additional statistics become available. HDI, which is published by the UNDP, may be the most comprehensive indicator, but it is not fully adequate for measuring the level of human development from a global perspective. The Human Development Index explicitly frames the development of a country as more than economic growth or material wealth. In this way, the index is distinguished from many other performance indicators. This article aims to analyze the proportions of the three component indicators across 37 European countries.
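
    Since 2010 the UNDP has computed the HDI as the geometric mean of the three dimension indices (the earlier formula used an arithmetic mean); the index values below are illustrative, not official UNDP figures.

```python
def hdi(health, education, income):
    """Post-2010 HDI formula: geometric mean of the three dimension
    indices, each already normalized to the interval [0, 1]."""
    return (health * education * income) ** (1 / 3)

# Illustrative dimension-index values for a hypothetical country:
print(round(hdi(0.90, 0.80, 0.85), 3))
```

    The geometric mean penalizes uneven development: a shortfall in one dimension is no longer fully compensated by strength in another, which is part of what distinguishes HDI from single-factor indicators.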

  20. Guidelines and precautions in collecting and analyzing for mixed wastes

    International Nuclear Information System (INIS)

    Hall, J.R.; Stagg, D.D.; Clark, S.L.

    1987-01-01

    Regulatory requirements mandated by the US Environmental Protection Agency (EPA) for the Resource Conservation and Recovery Act, the Superfund, and the EPA/US Nuclear Regulatory Commission guidance document have generated an increasing demand for hazardous-materials analysis of radioactively contaminated samples (mixed waste). The analysis of these samples, which contain both radioactive and hazardous materials, using EPA methods requires that both the sample collection and the laboratory analysis be performed using guidelines and precautions different from those normally used for radioanalytical work. The paper discusses the unique procedures, guidelines, and precautions one must use in collecting and analyzing mixed waste samples in order to achieve accurate and reliable results, and reports the recent experience of International Technology (IT) Corporation in constructing and operating a mixed waste laboratory. It also describes the experience of IT personnel in collecting samples in the field.

  1. Monte Carlo simulation to analyze the performance of CPV modules

    Science.gov (United States)

    Herrero, Rebeca; Antón, Ignacio; Sala, Gabriel; De Nardis, Davide; Araki, Kenji; Yamaguchi, Masafumi

    2017-09-01

    A model that evaluates the performance of high-concentrator photovoltaic (HCPV) modules by generating current-voltage curves has been applied together with a Monte Carlo approach to obtain a distribution of modules with a given set of characteristics (e.g., the receivers' electrical properties and misalignments among the elementary units within modules) representative of a manufacturing scenario. In this paper, the performance of CPV systems (tracker and inverter) containing the set of simulated modules is evaluated as a function of different system characteristics: inverter configuration, sorting of modules, and bending of the tracker frame. Thus, the study of HCPV technology with regard to its angular constraints is fully covered by analyzing all the possible elements affecting the generated electrical power.
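
    A Monte Carlo treatment of a manufacturing scenario can be sketched by sampling module misalignments and mapping them to power through an acceptance model; the quadratic loss model and parameter values below are illustrative assumptions of ours, not the authors' actual module model.

```python
import random, statistics

def module_power(misalign_deg, p_nominal=100.0, acceptance_deg=1.0):
    """Hypothetical acceptance model: power falls off quadratically with
    pointing error and reaches zero at the acceptance angle."""
    loss = (misalign_deg / acceptance_deg) ** 2
    return p_nominal * max(0.0, 1.0 - loss)

# Manufacturing scenario: misalignments drawn from a normal distribution
random.seed(42)
powers = [module_power(abs(random.gauss(0.0, 0.3))) for _ in range(10_000)]
print(round(statistics.mean(powers), 1), round(statistics.pstdev(powers), 1))
```

    Repeating such sampling for every stochastic module characteristic, and then combining the simulated modules into strings and inverters, yields the system-level power distributions the paper analyzes.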

  2. Solid State Neutral Particle Analyzer Array on NSTX

    International Nuclear Information System (INIS)

    Shinohara, K.; Darrow, D.S.; Roquemore, A.L.; Medley, S.S.; Cecil, F.E.

    2004-01-01

    A Solid State Neutral Particle Analyzer (SSNPA) array has been installed on the National Spherical Torus Experiment (NSTX). The array consists of four chords viewing through a common vacuum flange. The tangency radii of the viewing chords are 60, 90, 100, and 120 cm. They view across the three co-injection neutral beam lines (deuterium, typically 80 keV, with tangency radii of 48.7, 59.2, and 69.4 cm) on NSTX and detect co-going energetic ions. The silicon photodiodes used were calibrated with a mono-energetic deuteron beam source. Deuterons with energy above 40 keV can be detected with the present setup. The degradation of detector performance was also investigated. Lead shot and epoxy are used for neutron shielding to reduce the handling of hazardous heavy metal; this method also allows an arbitrary shape to be formed to fit the complex flight tube.

  3. A 16-detector alpha spectrometer using 1 multichannel analyzer

    International Nuclear Information System (INIS)

    Phillips, W.G.

    1978-01-01

    An alpha spectrometer containing 16 independent detectors and utilizing one 4096-channel multichannel analyzer (MCA) was constructed from commercially available modules. The spectrometer was designed specifically for the counting of low levels of radioactivity in environmental samples. Gated analog routing allows spectral data acquisition into 256-channel regions of the MCA memory as if each region were an independent 256-channel MCA. External live-time clocks and 50-MHz analog-to-digital converters control timing and acquisition on each unit of eight detectors. Spectral data are output to magnetic tape in units of 256 channels, each with a unique tagword. These tapes are then read and processed, and final reports generated, by a large Control Data 6000 series computer.
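
    The gated routing scheme can be expressed in software terms: each detector's 256-channel spectrum is steered into its own region of the shared 4096-channel memory. A minimal sketch (the function name and simulated events are ours):

```python
def mca_address(detector, adc_channel, region_size=256):
    """Gated analog routing: detector k's 256-channel spectrum occupies
    channels [k*256, (k+1)*256) of the shared 4096-channel MCA memory."""
    assert 0 <= detector < 16 and 0 <= adc_channel < region_size
    return detector * region_size + adc_channel

memory = [0] * 4096
for det, ch in [(0, 10), (0, 10), (15, 255)]:   # simulated alpha events
    memory[mca_address(det, ch)] += 1
print(memory[10], memory[4095])  # two counts in detector 0, one in detector 15
```

    The unique tagword written with each 256-channel unit on tape plays the same role on output: it identifies which detector region a spectrum belongs to.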

  4. Nuclear plant simulation using the Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Beelman, R.J.; Laats, E.T.; Wagner, R.J.

    1984-01-01

    The Nuclear Plant Analyzer (NPA), a state-of-the-art computerized safety analysis and engineering tool, was employed to simulate nuclear plant response to an abnormal transient during a training exercise at the US Nuclear Regulatory Commission (USNRC) in Washington, DC. Information relative to plant status was taken from a computer-animated color graphics display depicting the course of the transient and was transmitted to the NRC Operations Center in a manner identical to that employed during an actual event. Recommendations from the Operations Center were implemented during on-line, interactive execution of the RELAP5 reactor systems code through the NPA, allowing a degree of flexibility in training exercises not realized previously. When the debriefing was conducted, the RELAP5 calculations were replayed by way of the color graphics display, adding a new dimension to the debriefing and greatly enhancing the critique of the exercise.

  5. Keys to analyzing the spatial categories in employment records studies

    Directory of Open Access Journals (Sweden)

    Eugenia Roberti

    2012-01-01

    Full Text Available The aim of this study is to carry out a bibliographical review of the ways in which spatial categories have been approached within studies of poor people's employment records. The significance of this research lies in the fact that, even though the perspectives from which the concept of "record" emerges give precedence to the analysis of the time and space dimensions, the latter has scarcely been studied. Within such a framework, our paper aims to analyze how the space variable is constructed in various empirical investigations and which contributions the record perspective makes to studies of the space variable. To approach the issue, we will focus on those spatial categories of analysis which allow us to discern the new characteristics and consequences underlying the phenomenon of spatial segregation. Special emphasis will be given to the neighborhood space.

  6. Observation Planning Made Simple with Science Opportunity Analyzer (SOA)

    Science.gov (United States)

    Streiffert, Barbara A.; Polanskey, Carol A.

    2004-01-01

    As NASA undertakes the exploration of the Moon and Mars as well as the rest of the Solar System, while continuing to investigate Earth's oceans, winds, atmosphere, weather, etc., the ever-present need to let operations users easily define their observations increases. Operations teams need to be able to determine the best time to perform an observation, as well as its duration and other parameters such as the observation target. In addition, operations teams need to be able to check the observation for validity against objectives and intent, as well as against spacecraft constraints such as turn rates and accelerations or pointing exclusion zones. Science Opportunity Analyzer (SOA), in development for the last six years, is a multi-mission toolset built to meet those needs. An operations user can follow six simple steps to define an observation without having to know the complexities of orbital mechanics, coordinate transformations, or the spacecraft itself.

  7. USING THE FACTORIAL CORRESPONDENCES FOR ANALYZING TOURIST FLOWS

    Directory of Open Access Journals (Sweden)

    Kamer Ainur M. AIVAZ

    2016-06-01

    Full Text Available This study aims to analyze the distribution of each flow of non-resident tourists, coming from 33 countries, over the six main categories of tourist destinations in Romania in 2015, and assumes that there are differences or similarities between the tourists' country of origin and the destinations they choose. The performance recorded by Romania in attracting foreign tourists has been relatively modest during the past three decades, for various reasons, ranging from poor access infrastructure to deficient and sometimes inadequate tourism promotion. The statistical method used is factorial correspondence analysis. The data processing, the significance testing of the indicators, and the graphical representations were performed using the SPSS statistical software. We consider that the use of this method allows indirect insight into tourist preferences, and the results may be useful in developing a tourism promotion strategy customized for each country that sends tourists.

  8. Analyzing high school students’ reasoning about electromagnetic induction

    Directory of Open Access Journals (Sweden)

    Katarina Jelicic

    2017-02-01

    Full Text Available Electromagnetic induction is an important, yet complex, physics topic that is part of the Croatian high school curriculum. Nine Croatian high school students of different abilities in physics were interviewed using six demonstration experiments from electromagnetism (three of them concerned the topic of electromagnetic induction). Students were asked to observe, describe, and explain the experiments. The analysis of students' explanations indicated the existence of many conceptual and reasoning difficulties with the basic concepts of electromagnetism, especially with recognizing and explaining the phenomenon of electromagnetic induction. Three student mental models of electromagnetic induction that formed during the interviews and recurred among students are described and analyzed within the knowledge-in-pieces framework.

  9. Analyzing the history of Cognition using Topic Models.

    Science.gov (United States)

    Cohen Priva, Uriel; Austerweil, Joseph L

    2015-02-01

    Very few articles have analyzed how cognitive science as a field has changed over the last six decades. We explore how Cognition changed over the last four decades using Topic Models. Topic Models assume that every word in every document is generated by one of a limited number of topics. Words that are likely to co-occur are likely to be generated by a single topic. We find a number of significant historical trends: the rise of moral cognition, eyetracking methods, and action, the fall of sentence processing, and the stability of development. We introduce the notion of framing topics, which frame content, rather than present the content itself. These framing topics suggest that over time Cognition turned from abstract theorizing to more experimental approaches. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. ANALYZING THE RELATIONSHIP BETWEEN MOVIES AND TV COMMERCIALS TYPES

    Directory of Open Access Journals (Sweden)

    Frank Van Der Valk

    2017-09-01

    Full Text Available The aim of this paper is to shed some light on whether there is any connection between the types of movies broadcast on TV and the types of commercials. A total of 20 different movies from Polish channels were recorded and visually analyzed, including the content of the commercials broadcast before, during, and after them. The types of movies examined include comedies of manners, romantic comedies, thrillers, action movies, dramas, crime films, science fiction, and adventure films. The research results show that there is a relationship between the types of movies broadcast and the commercials before, during, and after them. This connection is based on the needs and consumption behavior of each movie's viewers.

  11. Using Model Checking for Analyzing Distributed Power Control Problems

    DEFF Research Database (Denmark)

    Brihaye, Thomas; Jungers, Marc; Lasaulce, Samson

    2010-01-01

    Model checking (MC) is a formal verification technique which is well established and still enjoys resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors wanted to know whether this approach can be applied to distributed PC. It turns out that it can be applied successfully and allows one to analyze realistic scenarios, including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state some objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to know whether the desired properties are verified and to determine a winning strategy.

  12. Innovation Networks New Approaches in Modelling and Analyzing

    CERN Document Server

    Pyka, Andreas

    2009-01-01

    The science of graphs and networks has become by now a well-established tool for modelling and analyzing a variety of systems with a large number of interacting components. Starting from the physical sciences, applications have spread rapidly to the natural and social sciences, as well as to economics, and are now further extended, in this volume, to the concept of innovations, viewed broadly. In an abstract, systems-theoretical approach, innovation can be understood as a critical event which destabilizes the current state of the system, and results in a new process of self-organization leading to a new stable state. The contributions to this anthology address different aspects of the relationship between innovation and networks. The various chapters incorporate approaches in evolutionary economics, agent-based modeling, social network analysis and econophysics and explore the epistemic tension between insights into economics and society-related processes, and the insights into new forms of complex dynamics.

  13. Using Model Checking for Analyzing Distributed Power Control Problems

    Directory of Open Access Journals (Sweden)

    Thomas Brihaye

    2010-01-01

Full Text Available Model checking (MC) is a formal verification technique which has enjoyed, and continues to enjoy, resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors wanted to know whether this approach can be applied to distributed PC. It turns out that it can be applied successfully and allows one to analyze realistic scenarios, including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state some objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to determine whether the desired properties are verified and to find a winning strategy.
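The game-based workflow described above can be illustrated, in a much simplified untimed form, by the classic attractor computation for two-player reachability games. The states, moves, and objective below are hypothetical stand-ins for the transmitter/environment game, not the paper's timed automata.

```python
# Sketch: attractor computation for a two-player reachability game, an
# untimed, finite-state analogue of the timed-game model checking the
# paper applies to power control. All states and edges are invented.

def attractor(states, edges, controlled, goal):
    """States from which the controller can force reaching `goal`.

    `edges[s]` lists successors of s; `controlled` holds states where
    the controller picks the move (elsewhere the environment picks).
    """
    win = set(goal)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s in win:
                continue
            succs = edges.get(s, [])
            if not succs:
                continue
            if s in controlled:
                ok = any(t in win for t in succs)   # controller needs one good move
            else:
                ok = all(t in win for t in succs)   # environment must be cornered
            if ok:
                win.add(s)
                changed = True
    return win

# Toy power-control game: transmitter (controller) vs. environment.
states = ["low", "mid", "high", "target"]
edges = {"low": ["mid"], "mid": ["target", "high"], "high": ["mid"]}
controlled = {"low", "mid"}  # transmitter chooses the move in these states
print(sorted(attractor(states, edges, controlled, {"target"})))
```

A real timed-game solver additionally tracks clock zones, but the fixed-point structure is the same.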

  14. Stochastic Particle Real Time Analyzer (SPARTA) Validation and Verification Suite

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michael A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Fluid Science and Engineering Dept.; Koehler, Timothy P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Fluid Science and Engineering Dept.; Plimpton, Steven J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Multi Scale Science Dept.

    2014-10-01

This report presents the test cases used to verify, validate, and demonstrate the features and capabilities of the first release of the 3D Direct Simulation Monte Carlo (DSMC) code SPARTA (Stochastic Particle Real Time Analyzer). The test cases included in this report exercise the most critical capabilities of the code, such as the accurate representation of physical phenomena (molecular advection and collisions, energy conservation, etc.) and the implementation of numerical methods (grid adaptation, load balancing, etc.). Several test cases of simple flow examples are shown to demonstrate that the code can reproduce phenomena predicted by analytical solutions and theory. A number of additional test cases are presented to illustrate the ability of SPARTA to model flow around complicated shapes. In these cases, the results are compared to other well-established codes or theoretical predictions. This compilation of test cases is not exhaustive, and it is anticipated that more cases will be added in the future.

  15. A versatile retarding potential analyzer for nano-satellite platforms.

    Science.gov (United States)

    Fanelli, L; Noel, S; Earle, G D; Fish, C; Davidson, R L; Robertson, R V; Marquis, P; Garg, V; Somasundaram, N; Kordella, L; Kennedy, P

    2015-12-01

    The design of the first retarding potential analyzer (RPA) built specifically for use on resource-limited cubesat platforms is described. The size, mass, and power consumption are consistent with the limitations of a nano-satellite, but the performance specifications are commensurate with those of RPAs flown on much larger platforms. The instrument is capable of measuring the ion density, temperature, and the ram component of the ion velocity in the spacecraft reference frame, while also providing estimates of the ion composition. The mechanical and electrical designs are described, as are the operating modes, command and data structure, and timing scheme. Test data obtained using an ion source inside a laboratory vacuum chamber are presented to validate the performance of the new design.

  16. Nuclear Plant Analyzer development at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Laats, E.T.; Beelman, R.J.; Charlton, T.R.; Hampton, N.L.; Burtt, J.D.

    1985-01-01

The Nuclear Plant Analyzer (NPA) is a state-of-the-art safety analysis and engineering tool being used to address key nuclear power plant safety issues. The NPA has been developed to integrate the NRC's computerized reactor behavior simulation codes, such as RELAP5, TRAC-BWR, and TRAC-PWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. An important feature of the NPA is the capability to allow an analyst to redirect a RELAP5 or TRAC calculation as it progresses through its simulated scenario. The analyst can have the same plant control capabilities as the operator of an actual plant. The NPA resides on the dual CDC Cyber-176 mainframe computers at the INEL and is being converted to operate on a Cray-1S computer at the LANL. The subject of this paper is the program conducted at the INEL

  17. Analyzing sickness absence with statistical models for survival data

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars

    2007-01-01

OBJECTIVES: Sickness absence is the outcome in many epidemiologic studies and is often based on summary measures such as the number of sickness absences per year. In this study the use of modern statistical methods was examined by making better use of the available information. Since sickness absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study involving the following: (i) Poisson regression using a single outcome variable (number of sickness absences), (ii) analysis of time to first event using the Cox proportional hazards model, and (iii) frailty models, which are random effects proportional hazards models. Data from a study of the relation...
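The motivation for frailty models in method (iii) can be sketched with a small simulation: a gamma-distributed random effect multiplies each person's absence rate, which makes the counts overdispersed relative to a plain Poisson model. The rates, frailty variance, and sample size below are invented for illustration.

```python
# Sketch: simulating sickness-absence counts with a gamma frailty,
# illustrating why plain Poisson counts understate between-person
# variance. All parameter values are made up for illustration.
import random

random.seed(1)

def sample_counts(n, base_rate, frailty_var):
    """One absence count per person; the frailty multiplies the rate."""
    counts = []
    shape = 1.0 / frailty_var   # gamma with mean 1 and variance frailty_var
    for _ in range(n):
        z = random.gammavariate(shape, frailty_var)
        lam = base_rate * z
        # draw a Poisson(lam) count by summing exponential waiting times
        t, k = 0.0, 0
        while True:
            t += random.expovariate(lam)
            if t > 1.0:
                break
            k += 1
        counts.append(k)
    return counts

counts = sample_counts(5000, base_rate=3.0, frailty_var=0.5)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(round(mean, 2), round(var, 2))  # variance exceeds the mean: overdispersion
```

Under a pure Poisson model mean and variance would be equal; the frailty inflates the variance, which is exactly the feature frailty models capture.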

  18. Analyzing Tibetan Monastic Conceptions of the Universe Through Individual Drawings

    Science.gov (United States)

    Sonam, Tenzin; Impey, Chris David

    2017-01-01

    Every culture and tradition has its own representation of the universe that continues to evolve due to the influence of new technologies, discoveries, and cultural exchanges. With the recent introduction of Western science into the Tibetan Buddhist monasteries in India, this study explores monastic conceptions of the universe prior to formal instruction in astronomy. The drawings of 59 Buddhist monks and nuns were analyzed using Tversky’s three criteria for drawing analysis—segmentation, order, and hierarchical structure of knowledge. We found that 22 out of 59 monastics drew a geocentric model of the universe with the Solar System as the dominant physical system, reflecting little influence of modern astronomical knowledge. Only six monastics drew the traditional Buddhist model of the world, generally known as the Mount Meru Cosmology. The implication of the monastics' representation of the universe for their assimilation into modern science is discussed.

  19. A parametric model for analyzing anticipation in genetically predisposed families

    DEFF Research Database (Denmark)

    Larsen, Klaus; Petersen, Janne; Bernstein, Inge

    2009-01-01

Anticipation, i.e. a decreasing age-at-onset in subsequent generations, has been observed in a number of genetically triggered diseases. The impact of anticipation is generally studied in affected parent-child pairs, but these analyses are restricted to pairs in which both individuals have been affected. The suggested model corrects for incomplete observations and considers families rather than affected pairs; it thereby allows studies of large sample sets, facilitates subgroup analyses, and provides generation-effect estimates. The model was applied to the hereditary nonpolyposis colorectal cancer (HNPCC)/Lynch syndrome family cohort from the national Danish HNPCC register. Age-at-onset was analyzed in 824 individuals from 2-4 generations in 125 families with proven disease-predisposing mutations. A significant effect from anticipation was identified, with a mean of 3 years earlier age-at-onset per generation...
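The kind of generation-effect estimate the model provides can be sketched, in a much simpler form, as a least-squares fit of age-at-onset on generation number. The six observations below are invented, not data from the Danish cohort.

```python
# Sketch: per-generation shift in age-at-onset via ordinary least squares.
# Toy data (generation number, age-at-onset in years), invented for
# illustration; a negative slope indicates anticipation.
generations = [1, 1, 2, 2, 3, 3]
onset_ages = [50, 48, 47, 45, 44, 42]

g_mean = sum(generations) / len(generations)
a_mean = sum(onset_ages) / len(onset_ages)
slope = (sum((g - g_mean) * (a - a_mean)
             for g, a in zip(generations, onset_ages))
         / sum((g - g_mean) ** 2 for g in generations))
intercept = a_mean - slope * g_mean
print(slope)  # earlier onset per generation
```

The parametric model in the paper additionally corrects for censoring (family members observed before any possible onset), which a plain regression on affected individuals cannot do.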

  20. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  1. Mass analyzed threshold ionization spectroscopy of 7-azaindole cation

    Science.gov (United States)

    Lee Lin, Jung; Tzeng, Wen Bih

    2003-10-01

The vibrationally resolved mass analyzed threshold ionization (MATI) spectra of jet-cooled 7-azaindole have been recorded by ionizing via four different intermediate levels. The adiabatic ionization energy of this molecule is determined to be 65 462 ± 5 cm⁻¹, which is greater than that of indole by 2871 cm⁻¹. The vibrational spectra of 7-azaindole in the S₁ and D₀ states have been successfully assigned by comparing the measured frequencies with those of indole as well as with the values predicted from ab initio calculations. Detailed analysis of the MATI spectra shows that the structure of the cation is somewhat different from that of this species in the neutral S₁ state.

  2. Mass analyzed threshold ionization spectroscopy of 1-methylindole cation

    Science.gov (United States)

    Lin, Jung Lee; Tzeng, Wen Bih

    2003-08-01

The vibrationally resolved mass analyzed threshold ionization spectra of jet-cooled 1-methylindole (1MI) have been recorded by ionizing via four vibronic levels. The adiabatic ionization energy of this molecule is determined to be 60 749 ± 5 cm⁻¹, which is less than that of indole by 1842 cm⁻¹. This indicates that N-methyl substitution lowers the zero energy level of the cationic ground state to a greater extent than that of the neutral. A few characteristic vibrations of the 1MI cation are observed; the frequencies of the out-of-plane bending, in-plane bending, and stretching vibrations of the N-CH₃ part are found to be 124, 251, and 1492 cm⁻¹, respectively.
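The two MATI records above are mutually consistent: each quotes the adiabatic ionization energy relative to indole, so both imply the same indole value, as a quick arithmetic check shows (all numbers in cm⁻¹, taken from the abstracts).

```python
# Consistency check on the two MATI abstracts: both report the adiabatic
# ionization energy relative to indole, so both should imply the same
# indole value (all energies in cm^-1, as quoted above).
ie_7azaindole = 65462      # greater than indole by 2871 cm^-1
ie_1methylindole = 60749   # less than indole by 1842 cm^-1

indole_from_7ai = ie_7azaindole - 2871
indole_from_1mi = ie_1methylindole + 1842
print(indole_from_7ai, indole_from_1mi)  # both give 62591
```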

  3. Analyzing the multilevel structure of the European airport network

    Directory of Open Access Journals (Sweden)

    Oriol Lordan

    2017-04-01

Full Text Available The multilayered structure of the European airport network (EAN), composed of connections and flights between European cities, is analyzed through the k-core decomposition of the connections network. This decomposition identifies the core, bridge, and periphery layers of the EAN. The core layer includes the best-connected cities, among them important business air-traffic destinations. The periphery layer includes cities with fewer connections, which serve low-populated areas where air travel is an economic alternative. The remaining cities form the bridge of the EAN, including important leisure travel origins and destinations. The multilayered structure of the EAN affects network robustness, as the EAN is more robust to the isolation of core nodes than to the isolation of a combination of core and bridge nodes.
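The k-core decomposition behind this core/bridge/periphery layering can be sketched with the standard peeling algorithm; the four-city route network below is hypothetical, not the actual EAN.

```python
# Sketch of the k-core decomposition used to split the EAN into layers:
# repeatedly peel off a minimum-degree node, tracking the largest
# degree seen at removal time. The toy graph is invented.

def core_numbers(adj):
    """Iteratively peel minimum-degree nodes; returns node -> core number."""
    deg = {v: len(ns) for v, ns in adj.items()}
    core = {}
    remaining = set(adj)
    k = 0
    while remaining:
        v = min(remaining, key=lambda u: deg[u])
        k = max(k, deg[v])
        core[v] = k
        remaining.remove(v)
        for w in adj[v]:
            if w in remaining:
                deg[w] -= 1
    return core

# Toy route network: a triangle (core) plus one pendant city (periphery).
adj = {
    "MAD": {"BCN", "CDG"},
    "BCN": {"MAD", "CDG"},
    "CDG": {"MAD", "BCN", "NCE"},
    "NCE": {"CDG"},
}
print(core_numbers(adj))
```

In the paper's terms, the maximal-core cities form the core layer, the minimal-core cities the periphery, and the intermediate cores the bridge.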

  4. A versatile retarding potential analyzer for nano-satellite platforms

    Science.gov (United States)

    Fanelli, L.; Noel, S.; Earle, G. D.; Fish, C.; Davidson, R. L.; Robertson, R. V.; Marquis, P.; Garg, V.; Somasundaram, N.; Kordella, L.; Kennedy, P.

    2015-12-01

    The design of the first retarding potential analyzer (RPA) built specifically for use on resource-limited cubesat platforms is described. The size, mass, and power consumption are consistent with the limitations of a nano-satellite, but the performance specifications are commensurate with those of RPAs flown on much larger platforms. The instrument is capable of measuring the ion density, temperature, and the ram component of the ion velocity in the spacecraft reference frame, while also providing estimates of the ion composition. The mechanical and electrical designs are described, as are the operating modes, command and data structure, and timing scheme. Test data obtained using an ion source inside a laboratory vacuum chamber are presented to validate the performance of the new design.

  5. Development of a PDA Based Portable Pulse Height Analyzer System

    International Nuclear Information System (INIS)

    Mankheed, Panuphong; Ngernvijit, Narippawaj; Thong-Aram, Decho

    2007-08-01

Full text: In this research a portable pulse height analyzer system was developed by combining a Palm Tungsten T personal digital assistant (PDA) with the Single Chip SCA developed by the Department of Nuclear Technology, Chulalongkorn University, for use in education and research work. The developed system can measure both the energy and the average count rate of gamma rays. The results of this research showed that, in gamma energy spectrum analysis with a 2″ × 2″ NaI(Tl) detector, the system displayed the photopeak of Cs-137 at channel 57 and the photopeaks of Co-60 at channels 103 and 117. The energy resolution was found to be 7.14% at the 661.66 keV energy of Cs-137
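Given the quoted channels and the known gamma energies (661.66 keV for Cs-137; 1173.2 and 1332.5 keV for Co-60), a two-point linear energy calibration can be sketched as follows. Using the third peak as a linearity check is an illustration, not the authors' stated procedure.

```python
# Sketch: two-point energy calibration from the channel numbers quoted in
# the abstract (Cs-137 at channel 57, Co-60 at channels 103 and 117),
# with the remaining Co-60 peak used as a rough linearity check.
cs137, co60_a, co60_b = 661.66, 1173.2, 1332.5   # known energies, keV
ch_cs, ch_a, ch_b = 57, 103, 117                 # quoted channels

gain = (co60_a - cs137) / (ch_a - ch_cs)         # keV per channel
offset = cs137 - gain * ch_cs

def channel_to_kev(ch):
    return offset + gain * ch

predicted_b = channel_to_kev(ch_b)
print(round(gain, 2), round(predicted_b, 1))     # compare against 1332.5 keV
```

The third peak lands within a few keV of its nominal energy, suggesting the analyzer is close to linear over this range.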

  6. ACTIVITY-BASED COST ALLOCATION AND FUNCTION ANALYZES IN TRADE

    Directory of Open Access Journals (Sweden)

    TÜNDE VERES

    2011-01-01

Full Text Available In this paper the author considers the efficiency analysis of trading. The most important evaluation factors in trade are sales value, volume, and margin. The easiest and fastest way is to follow the market situation through turnover, but for long-term thinking sales companies also need to concentrate on efficiency. Trading activity has functions which can deeply affect the final result, which is why calculating their clear and reliable costs is an important precondition for decision making. The author reviews the cost categories and the basic functions in trading activity to find possible ways of obtaining reliable information.

  7. A new grading system for analyzing pediatric cholesteatoma

    International Nuclear Information System (INIS)

    Kodama, Akira; Ashimori, Naoki; Tsurita, Minako; Ban, Akihiro

    2007-01-01

We developed a new grading system to understand the complicated pathological changes of cholesteatoma in comparison to those of chronic otitis media. This grading system, based on the extent of the cholesteatoma and the surrounding pathological changes, can simply express the entire pathological condition of the ear with cholesteatoma. Using this grading system, we analyzed the ears of 48 children with cholesteatoma who underwent tympanoplasty over the past ten years. Their ages ranged from 2 to 15 years, with an average age of 8.5 years. The attic and mesotympanic cholesteatomas were associated with greater pathological changes than those observed in postero-superior quadrant cholesteatomas. The degree of pathological change in the area surrounding the cholesteatoma appears to correlate with the degree of extension of the cholesteatoma. This system is thus considered useful for evaluating the improvement in pathological conditions before and after surgery in patients with cholesteatoma. (author)

  8. Using the Platelet Function Analyzer-100 for monitoring aspirin therapy

    DEFF Research Database (Denmark)

    Poulsen, Tina Svenstrup; Mickley, Hans; Korsholm, Lars

    2007-01-01

INTRODUCTION: The aim of the study was to evaluate the test characteristics of the Platelet Function Analyzer-100 (PFA-100) in patients treated with aspirin. METHODS AND RESULTS: The study consisted of two sub-studies. In study 1, 10 patients with ischemic heart disease (IHD) and 10 controls had platelet function assessed by optical platelet aggregation and the PFA-100 method in two 5-week periods. Patients with IHD were treated with aspirin 150 mg/day (first 5-week period) and 300 mg/day (second 5-week period), whereas the controls only received aspirin (150 mg/day) during the second 5-week period. From the results of study 1, we found that a cut-off value for the PFA-100 collagen/epinephrine cartridge aspirin (sensitivity 0.91, specificity 1.00). A good agreement between the PFA-100 method and optical platelet aggregation was found. Within...
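How a cut-off on PFA-100 closure times yields a sensitivity/specificity pair can be sketched as follows. The closure times and the threshold below are invented for illustration and are not the study's values.

```python
# Sketch: sensitivity/specificity of a closure-time cut-off for detecting
# aspirin treatment. All times (seconds) and the cut-off are hypothetical.
cutoff = 193  # hypothetical closure-time threshold (s)
aspirin = [300, 280, 250, 210, 240, 260, 300, 220, 230, 180]   # treated
controls = [100, 120, 90, 110, 130, 105, 95, 115, 125, 140]    # untreated

tp = sum(t >= cutoff for t in aspirin)    # treated correctly flagged
tn = sum(t < cutoff for t in controls)    # untreated correctly flagged
sensitivity = tp / len(aspirin)
specificity = tn / len(controls)
print(sensitivity, specificity)
```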

  9. Monitoring aspirin therapy with the Platelet Function Analyzer-100

    DEFF Research Database (Denmark)

    Mortensen, Jette; Poulsen, Tina Svenstrup; Grove, Erik Lerkevang

    2008-01-01

OBJECTIVE: Low platelet response to aspirin has been reported to be associated with a high incidence of vascular events. The reported prevalence of aspirin low-responsiveness varies, which may be explained by poor reproducibility of the methods used to evaluate aspirin response and low compliance. The Platelet Function Analyzer-100 (PFA-100) is a commonly used platelet function test. We aimed to assess the reproducibility of the PFA-100 and the agreement with optical platelet aggregometry (OPA) in healthy volunteers and in patients with coronary artery disease (CAD) treated with low-dose aspirin. MATERIAL AND METHODS: Twenty-one healthy volunteers and 43 patients with CAD took part in the study. During treatment with aspirin 75 mg daily, all participants had platelet function assessed in duplicate with the PFA-100 and OPA on 4 consecutive days. Additionally, platelet function was assessed before...

  10. Using SFL as a Tool for Analyzing Students’ Narratives

    Directory of Open Access Journals (Sweden)

    Doris Correa

    2015-06-01

    Full Text Available Traditionally, at universities, English as a foreign language instructors have used a series of approaches to teach students how to write academic texts in English from both teacher preparation and regular programs. In spite of this, students continue to have problems writing the academic texts required of them in the different courses. Concerned with this issue, a group of English as a Foreign Language writing instructors from a Teacher Education Program in Medellín engaged in the study of Systemic Functional Linguistics. The purpose of this article is to report the insights that one of these instructors gained once he began using these theories to analyze a narrative text produced by one of the students in his class.

  11. Using game theory to analyze green stormwater infrastructure implementation policies

    Science.gov (United States)

    William, R. K.; Garg, J.; Stillwell, A. S.

    2017-12-01

    While green stormwater infrastructure is a useful approach in addressing multiple challenges facing the urban environment, little consensus exists on how to best incentivize its adoption by private landowners. Game theory, a field of study designed to model conflict and cooperation between two or more agents, is well-suited to address this policy question. We used a cooperative game theory framework to analyze the impacts of three different policy approaches frequently used to incentivize the uptake of green infrastructure by private landowners: municipal regulation, direct grants, and stormwater fees. The results indicate that municipal regulation leads to the greatest environmental benefits; however, the choice of "best" regulatory approach is dependent on a variety of different factors including political and financial considerations. Policy impacts are also highly dependent on agents' spatial positions within the stormwater network. This finding leads to important questions of social equity and environmental justice.
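A standard cooperative-game-theory tool for splitting a jointly produced benefit among agents is the Shapley value, computed here for a hypothetical three-landowner green-infrastructure game; the characteristic function is invented and is not taken from the study.

```python
# Sketch: Shapley-value benefit sharing, a standard cooperative-game
# construct of the kind the study's framework builds on. The three
# landowners and the characteristic function v are hypothetical.
from itertools import permutations

def shapley(players, value):
    """Average marginal contribution of each player over all orderings."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += value(with_p) - value(coalition)
            coalition = with_p
    return {p: phi[p] / len(orders) for p in phi}

# v(S): runoff reduction achieved if the landowners in S install rain
# gardens; cooperation yields superadditive gains.
v = {frozenset(): 0, frozenset("A"): 1, frozenset("B"): 1, frozenset("C"): 2,
     frozenset("AB"): 3, frozenset("AC"): 3, frozenset("BC"): 3,
     frozenset("ABC"): 5}
shares = shapley("ABC", lambda s: v[frozenset(s)])
print(shares)  # shares sum to v(ABC)
```

The Shapley shares always sum to the grand-coalition value, which makes them a natural candidate for allocating stormwater-fee credits among cooperating landowners.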

  12. Optical vector network analyzer based on double-sideband modulation.

    Science.gov (United States)

    Jun, Wen; Wang, Ling; Yang, Chengwu; Li, Ming; Zhu, Ning Hua; Guo, Jinjin; Xiong, Liangming; Li, Wei

    2017-11-01

    We report an optical vector network analyzer (OVNA) based on double-sideband (DSB) modulation using a dual-parallel Mach-Zehnder modulator. The device under test (DUT) is measured twice with different modulation schemes. By post-processing the measurement results, the response of the DUT can be obtained accurately. Since DSB modulation is used in our approach, the measurement range is doubled compared with conventional single-sideband (SSB) modulation-based OVNA. Moreover, the measurement accuracy is improved by eliminating the even-order sidebands. The key advantage of the proposed scheme is that the measurement of a DUT with bandpass response can also be simply realized, which is a big challenge for the SSB-based OVNA. The proposed method is theoretically and experimentally demonstrated.

  13. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  14. Analyzing the User Behavior toward Electronic Commerce Stimuli

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies, which constitute the web atmosphere or webmosphere of a website, on shopping behavior (i.e., users’ internal states: affective, cognitive, and satisfaction; and behavioral responses: approach responses and real shopping outcomes) within a computer-generated online retail store, taking into account several mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 (“free” versus “hierarchical” navigational structure) × 2 (“on” versus “off” music) × 2 (“moving” versus “static” images) between-subjects experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking, and recording of virtual user behavior within an online shopping environment. As a main conclusion, this study suggests that the positive responses of online consumers may increase when they are allowed to freely navigate the online store and their experience is enriched by animated GIFs and background music. The effects caused by the mediator variables modify the final shopping behavior to some extent. PMID:27965549

  15. Using Fourier transform IR spectroscopy to analyze biological materials.

    Science.gov (United States)

    Baker, Matthew J; Trevisan, Júlio; Bassan, Paul; Bhargava, Rohit; Butler, Holly J; Dorling, Konrad M; Fielden, Peter R; Fogarty, Simon W; Fullwood, Nigel J; Heys, Kelly A; Hughes, Caryn; Lasch, Peter; Martin-Hirsch, Pierre L; Obinaju, Blessing; Sockalingum, Ganesh D; Sulé-Suso, Josep; Strong, Rebecca J; Walsh, Michael J; Wood, Bayden R; Gardner, Peter; Martin, Francis L

    2014-08-01

    IR spectroscopy is an excellent method for biological analyses. It enables the nonperturbative, label-free extraction of biochemical information and images toward diagnosis and the assessment of cell functionality. Although not strictly microscopy in the conventional sense, it allows the construction of images of tissue or cell architecture by the passing of spectral data through a variety of computational algorithms. Because such images are constructed from fingerprint spectra, the notion is that they can be an objective reflection of the underlying health status of the analyzed sample. One of the major difficulties in the field has been determining a consensus on spectral pre-processing and data analysis. This manuscript brings together as coauthors some of the leaders in this field to allow the standardization of methods and procedures for adapting a multistage approach to a methodology that can be applied to a variety of cell biological questions or used within a clinical setting for disease screening or diagnosis. We describe a protocol for collecting IR spectra and images from biological samples (e.g., fixed cytology and tissue sections, live cells or biofluids) that assesses the instrumental options available, appropriate sample preparation, different sampling modes as well as important advances in spectral data acquisition. After acquisition, data processing consists of a sequence of steps including quality control, spectral pre-processing, feature extraction and classification of the supervised or unsupervised type. A typical experiment can be completed and analyzed within hours. Example results are presented on the use of IR spectra combined with multivariate data processing.
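Two of the pre-processing steps in such a pipeline, baseline correction and normalization, can be sketched in a minimal form. The linear two-endpoint baseline and the five-point "spectrum" below are simplifications for illustration; real protocols use more elaborate schemes such as rubber-band baselines or EMSC.

```python
# Sketch: two common IR pre-processing steps in minimal form, applied to
# an invented five-point "spectrum". A simple straight-line baseline is
# subtracted, then the spectrum is scaled to unit Euclidean norm.
import math

def linear_baseline(spectrum):
    """Subtract the straight line joining the first and last points."""
    n = len(spectrum)
    a, b = spectrum[0], spectrum[-1]
    return [y - (a + (b - a) * i / (n - 1)) for i, y in enumerate(spectrum)]

def vector_normalize(spectrum):
    """Scale so the spectrum has unit Euclidean norm."""
    norm = math.sqrt(sum(y * y for y in spectrum))
    return [y / norm for y in spectrum] if norm else spectrum

raw = [1.0, 2.0, 4.0, 2.5, 2.0]
corrected = linear_baseline(raw)        # endpoints become 0.0
norm_spec = vector_normalize(corrected)
print([round(y, 3) for y in norm_spec])
```

Normalization of this kind is what makes spectra from different samples comparable before feature extraction and classification.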

  16. The evaluation of skin friction using a frictional feel analyzer.

    Science.gov (United States)

    Egawa, Mariko; Oguri, Motoki; Hirao, Tetsuji; Takahashi, Motoji; Miyakawa, Michio

    2002-02-01

Sensory evaluation is an important factor for cosmetic products, and several devices for the measurement of sensory properties have been developed in recent years. The objective here is to measure skin surface friction using such a device and examine the correlation with other physiological parameters in order to evaluate the potential of physical measurement of tactile sensation. A KES-SE Frictional Analyzer, a commercial device for measurement of surface frictional characteristics, was used in this study; an arm holder was added to this device for measurements on the human forearm. The frictional coefficient (MIU) and its mean deviation (MMD) were used as the parameters indicating surface friction. The moisture content of the stratum corneum was measured with a Corneometer CM825, transepidermal water loss with a Tewameter TM210, the viscoelastic properties of the skin with a Cutometer SEM575, and the skin surface pattern by observing a negative replica made with silicone rubber. The MIU was not influenced by load; however, it increased after water application to the skin. Relationships between MIU and the moisture content of the stratum corneum, between MMD and the skin surface pattern, and between MMD and viscosity, for both normal human forearm skin and SDS (sodium dodecyl sulfate)-induced dry skin, were confirmed by statistical analysis in a test on human subjects. There was also a correlation between either MIU or MMD and sensory evaluation on the morning after the application of moisturizing products. Human skin surface friction was thus measured using a KES-SE Frictional Analyzer, and judging from the correlation between either MIU or MMD and sensory evaluation, we consider this instrumental analysis useful for evaluating the tactile impression of human skin.
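The two KES parameters can be sketched directly from their definitions: MIU is the mean friction coefficient over the measured trace and MMD its mean absolute deviation. The trace values below are invented.

```python
# Sketch: computing MIU (mean friction coefficient) and MMD (its mean
# absolute deviation) from a friction trace, following the standard KES
# definitions. The trace values are invented for illustration.
trace = [0.42, 0.45, 0.40, 0.47, 0.43, 0.44]

miu = sum(trace) / len(trace)
mmd = sum(abs(x - miu) for x in trace) / len(trace)
print(round(miu, 3), round(mmd, 3))
```

MIU tracks the overall "slipperiness" of the surface, while MMD captures the roughness-like fluctuation around it, which is why the two correlate with different physiological parameters in the study.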

  17. Polarized 3He Gas Circulating Technologies for Neutron Analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David [Xemed LLC, Durham, NH (United States); Hersman, Bill [Xemed LLC, Durham, NH (United States)

    2014-12-10

    We describe the development of an integrated system for quasi-continuous operation of a large volume neutron analyzer. The system consists of a non-magnetic diaphragm compressor, a prototype large volume helium polarizer, a surrogate neutron analyzer, a non-depolarizing gas storage reservoir, a non-ferrous valve manifold for handling gas distribution, a custom rubidium-vapor gas return purifier, and wire-wound transfer lines, all of which are immersed in a two-meter external magnetic field. Over the Phase II period we focused on three major tasks required for the successful deployment of these types of systems: 1) design and implementation of gas handling hardware, 2) automation for long-term operation, and 3) improvements in polarizer performance, specifically fabrication of aluminosilicate optical pumping cells. In this report we describe the design, implementation, and testing of the gas handling hardware. We describe improved polarizer performance resulting from improved cell materials and fabrication methods. These improvements yielded valved 8.5 liter cells with relaxation times greater than 12 hours. Pumping this cell with 1500W laser power with 1.25nm linewidth yielded peak polarizations of 60%, measured both inside and outside the polarizer. Fully narrowing this laser to 0.25nm, demonstrated separately on one stack of the four, would have allowed 70% polarization with this cell. We demonstrated the removal of 5 liters of polarized helium from the polarizer with no measured loss of polarization. We circulated the gas through a titanium-clad compressor with polarization loss below 3% per pass. We also prepared for the next phase of development by refining the design of the polarizer so that it can be engineer-certified for pressurized operation. The performance of our system far exceeds comparable efforts elsewhere.

  18. Chromatin immunoprecipitation to analyze DNA binding sites of HMGA2.

    Directory of Open Access Journals (Sweden)

    Nina Winter

    Full Text Available BACKGROUND: HMGA2 is an architectural transcription factor abundantly expressed during embryonic and fetal development and is associated with the progression of malignant tumors. The protein harbours three basic DNA-binding domains and an acidic, protein-binding C-terminal domain. DNA binding induces changes in DNA conformation and hence global changes in gene expression patterns. Recently, two consensus sequences for HMGA2 binding were identified using a PCR-based SELEX (Systematic Evolution of Ligands by Exponential Enrichment) procedure. METHODOLOGY/PRINCIPAL FINDINGS: In this investigation, chromatin immunoprecipitation (ChIP) experiments and bioinformatic methods were used to analyze whether these binding sequences can also be verified on the chromatin of living cells. CONCLUSION: After quantification of HMGA2 protein in different cell lines, the colon cancer-derived cell line HCT116 was chosen for further ChIP experiments because of its 3.4-fold higher HMGA2 protein level. 49 DNA fragments were obtained by ChIP. These fragments, which contain HMGA2 binding sites, were analyzed for their AT content, their location in the human genome, and their similarity to the sequences generated by the SELEX study. The sequences show a significantly higher AT content than the average of the human genome. The artificially generated SELEX sequences and short BLAST alignments (11 and 12 bp) of the ChIP fragments from living cells show similarities in their organization: the flanking regions are AT-rich, whereas conservation is lower in the center of the sequences.

  19. The Size Spectrum as Tool for Analyzing Marine Plastic Pollution

    KAUST Repository

    Martí, E.

    2016-12-02

    Marine plastic debris spans six orders of magnitude in linear size, from microns to meters. The broad range of plastic sizes arises mainly from the continuous photodegradation and fragmentation affecting plastic objects. Interestingly, this time-dependent process links, to some degree, the size to the age of the debris. The variety of plastic sizes allows marine biota to interact with, and possibly take up, microplastics through numerous pathways. Physical processes such as sinking and wind-induced transport, as well as the chemical adsorption of contaminants, are also closely related to the size and shape of the plastic items. Likewise, available sampling techniques should be considered partial views of the marine plastic size range. Given that size is also one of the most easily measurable plastic traits, the size spectrum appears to be an ideal frame in which to arrange, integrate, and analyze plastic data of diverse nature. In this work, we examined tens of thousands of plastic items sampled from across the world with the aim of (1) developing and standardizing the size-spectrum tool for studying marine plastics, and (2) assembling a global plastic size spectrum (GPSS) database, relating individual size measurements to abundance, color, polymer type, and category (rigid fragments, films, threads, foam, pellets, and microbeads). Using the GPSS database, we show, for instance, the dependence of plastic composition on item size, with a high diversity of categories for items larger than 1 cm and a clear dominance (~90%) of rigid fragments below that size, except for the size interval corresponding to microbeads (around 0.5 mm). The GPSS database provides a comprehensive size-based framework for analyzing marine plastic pollution, enabling the comparison of size-related studies and the testing of hypotheses.
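A size spectrum of the kind described above bins item counts into logarithmically spaced size classes and normalizes each count by its bin width, so that intervals spanning very different size ranges remain comparable. A minimal sketch under assumed bin limits and synthetic sample data, neither taken from the study:

```python
import math
import random

def size_spectrum(sizes_mm, n_bins=8, lo=0.1, hi=1000.0):
    """Abundance per unit size over log-spaced bins from lo to hi (mm).

    Returns geometric-mean bin centers and counts divided by linear
    bin width; items outside [lo, hi) are ignored.
    """
    step = (math.log10(hi) - math.log10(lo)) / n_bins
    edges = [10 ** (math.log10(lo) + i * step) for i in range(n_bins + 1)]
    counts = [0] * n_bins
    for s in sizes_mm:
        if lo <= s < hi:
            i = min(int((math.log10(s) - math.log10(lo)) / step), n_bins - 1)
            counts[i] += 1
    widths = [edges[i + 1] - edges[i] for i in range(n_bins)]
    centers = [math.sqrt(edges[i] * edges[i + 1]) for i in range(n_bins)]
    return centers, [c / w for c, w in zip(counts, widths)]

random.seed(0)
sizes = [random.lognormvariate(0.0, 1.5) for _ in range(10_000)]  # synthetic items, mm
centers, spectrum = size_spectrum(sizes)
```

Plotting `spectrum` against `centers` on log-log axes gives the familiar size-spectrum view in which deviations from a power law, such as the microbead peak near 0.5 mm, stand out.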

  20. Dissecting and analyzing key residues in protein-DNA complexes.

    Science.gov (United States)

    Kulandaisamy, A; Srivastava, Ambuj; Nagarajan, R; Gromiha, M Michael

    2018-04-01

    Protein-DNA interactions are involved in various fundamental biological processes such as replication, transcription, DNA repair, and gene regulation. To understand the interactions in protein-DNA complexes, an integrative study of binding and stabilizing residues is important. In the present study, we have identified key residues that play a dual role in both binding and stability from a nonredundant dataset of 319 protein-DNA complexes. We observed that key residues occur in only a small fraction of complexes (29%), and only about 4% of stabilizing/binding residues are identified as key residues. Specifically, stabilizing residues have a higher preference to be key residues than binding residues. These key residues include polar, nonpolar, aliphatic, aromatic, and charged amino acids. Moreover, we have analyzed and discussed the key residues in different protein-DNA complexes, classified based on protein structural class, function, DNA strand, and conformation. In particular, Ser, Thr, Tyr, Arg, and Lys residues are commonly found in most of the subclasses of protein-DNA complexes. Further, we analyzed atomic contacts, which show that polar-nonpolar contacts are more enriched than other types. In addition, charged contacts are highly preferred in protein-DNA complexes compared with protein-protein and protein-RNA complexes. Finally, we discuss the sequence and structural features of key residues, such as conservation score, surrounding hydrophobicity, solvent accessibility, secondary structure, and long-range order. This study will be helpful for understanding the recognition mechanism and the structural and functional aspects of protein-DNA complexes. Copyright © 2017 John Wiley & Sons, Ltd.
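The dual-role notion above can be expressed as a set intersection over per-complex residue annotations: a key residue is one that appears in both the binding set and the stabilizing set. A minimal sketch with hypothetical residue identifiers; the percentages in the abstract come from the authors' dataset, not from this toy data:

```python
def key_residues(binding, stabilizing):
    """Residues that both bind DNA and stabilize the complex."""
    return set(binding) & set(stabilizing)

# Hypothetical annotations for one complex: (chain, residue number, amino acid)
binding = {("A", 12, "ARG"), ("A", 45, "SER"), ("B", 7, "LYS")}
stabilizing = {("A", 45, "SER"), ("A", 80, "TYR"), ("B", 7, "LYS")}

keys = key_residues(binding, stabilizing)
# keys == {("A", 45, "SER"), ("B", 7, "LYS")}
```

Repeating this per complex and tallying `len(keys) / (len(binding | stabilizing))` over a dataset would yield fractions analogous to the ~4% figure reported in the abstract.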