WorldWideScience

Sample records for analyzing pmoa mmox

  1. Linking carbon and nitrogen cycling: Environmental transcription of mmoX, pmoA, and nifH by methane oxidizing Proteobacteria in a Sub-Arctic palsa peatland

    Science.gov (United States)

    Liebner, Susanne; Svenning, Mette M.

    2013-04-01

    Sub-Arctic terrestrial ecosystems are currently affected by climate change, which causes degradation of stored organic carbon and emissions of greenhouse gases from microbial processes. Methane oxidizing bacteria (MOB) mitigate methane emissions and perform an important function in the soil-atmosphere interaction. In this study we investigated the presence and environmental transcription of functional genes of MOB along the permafrost degradation gradient in a Sub-Arctic palsa peatland using molecular approaches. The acidic and oligotrophic peatland hosts a small number of active MOB among a seemingly specialized community. The methanotrophic community displayed a broad functional potential by transcribing genes for key enzymes involved in both carbon and nitrogen metabolism, including particulate and soluble methane monooxygenase (pMMO and sMMO) as well as nitrogenase. Transcription of mmoX, which encodes a subunit of the sMMO, suggests an ecological importance of the broad-substrate-range sMMO in this peatland. In situ transcripts of mmoX were traced mainly to Methylocella-related Beijerinckiaceae and to relatives of Methylomonas, while Methylocystis constituted the dominant group utilizing pMMO. These results raise interesting questions concerning in-situ substrate preferences of MOB and the general importance of species that lack a pMMO for mitigating methane emissions. The importance of MOB for the nitrogen budget in this low-pH, nitrogen-limited habitat was identified by nifH transcripts of native methanotrophs. Hence, methane oxidizing Proteobacteria show an extended functional repertoire and importance for biogeochemical cycling in this dynamic ecosystem of degrading permafrost.

  2. Complex community of nitrite-dependent anaerobic methane oxidation bacteria in coastal sediments of the Mai Po wetland by PCR amplification of both 16S rRNA and pmoA genes.

    Science.gov (United States)

    Chen, Jing; Zhou, Zhichao; Gu, Ji-Dong

    2015-02-01

    In the present work, both 16S rRNA and pmoA gene-based PCR primers were employed successfully to study the diversity and distribution of n-damo bacteria in the surface and lower-layer sediments of the coastal Mai Po wetland. The occurrence of n-damo bacteria in both the surface and subsurface sediments with high diversity was confirmed in this study. Unlike the two other known n-damo communities from coastal areas, the pmoA gene-amplified sequences in the present work clustered not only with some freshwater subclusters but mostly within three newly erected marine subclusters, indicating the unique niche specificity of n-damo bacteria in this wetland. Results suggested that vegetation affected the distribution and community structures of n-damo bacteria in the sediments and that n-damo bacteria could coexist with sulfate-reducing methanotrophs in the coastal ecosystem. Community structures of the Mai Po n-damo bacteria based on the 16S rRNA gene were different from those of both the freshwater and the marine communities. In contrast, structures of the Mai Po n-damo communities based on the pmoA gene grouped with the marine ones and were clearly distinguished from the freshwater ones. The abundance of n-damo bacteria at this wetland was quantified using 16S rRNA gene PCR primers to be 2.65-6.71 × 10^5 copies/g dry sediment. Ammonium and nitrite strongly affected the community structures and distribution of n-damo bacteria in the coastal Mai Po wetland sediments.
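The abundance figure above (2.65-6.71 × 10^5 copies/g dry sediment) comes from quantitative PCR against a standard curve. As a sketch of the underlying arithmetic only, the function and all numbers below are illustrative assumptions, not values from the study:

```python
def copies_per_gram(cq, slope, intercept, elution_ul, template_ul, dry_mass_g):
    """Convert a qPCR quantification cycle (Cq) to gene copies per gram
    of dry sediment using a log-linear standard curve:
        Cq = slope * log10(copies per reaction) + intercept
    """
    copies_per_rxn = 10 ** ((cq - intercept) / slope)
    # Scale from the template aliquot up to the whole DNA extract,
    # then normalize by the dry mass of sediment extracted.
    return copies_per_rxn * (elution_ul / template_ul) / dry_mass_g

# Illustrative values: Cq of 24 on a curve with slope -3.32
# (~100% amplification efficiency) and intercept 38.
n = copies_per_gram(cq=24.0, slope=-3.32, intercept=38.0,
                    elution_ul=100.0, template_ul=2.0, dry_mass_g=0.5)
```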

  3. Identity of active methanotrophs in landfill cover soil as revealed by DNA-stable isotope probing.

    Science.gov (United States)

    Cébron, Aurélie; Bodrossy, Levente; Chen, Yin; Singer, Andrew C; Thompson, Ian P; Prosser, James I; Murrell, J Colin

    2007-10-01

    A considerable amount of the methane produced during decomposition of landfill waste can be oxidized in landfill cover soil by methane-oxidizing bacteria (methanotrophs), thus reducing greenhouse gas emissions to the atmosphere. The identity of active methanotrophs in Roscommon landfill cover soil, a slightly acidic peat soil, was assessed by DNA-stable isotope probing (SIP). Landfill cover soil slurries were incubated with 13C-labelled methane in either nutrient-rich nitrate mineral salts medium or water. The identity of active methanotrophs was revealed by analysis of 13C-labelled DNA fractions. The diversity of functional genes (pmoA and mmoX) and 16S rRNA genes was analyzed using clone libraries, microarrays and denaturing gradient gel electrophoresis. 16S rRNA gene analysis revealed that the cover soil was dominated mainly by Type II methanotrophs closely related to the genera Methylocella and Methylocapsa and to Methylocystis species. These results were supported by analysis of mmoX genes in 13C-DNA. Analysis of pmoA gene diversity indicated that a significant proportion of active bacteria were also closely related to the Type I methanotrophs Methylobacter and Methylomonas. Environmental conditions in the slightly acidic peat soil of the Roscommon landfill cover allow establishment of both Type I and Type II methanotrophs.

  4. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high-performance, extremely versatile transient analyzer are described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high-voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample-rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation, both at PPPL and in general.
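The block modes mentioned above (normal, peak-to-peak, integrate) can be sketched as follows. The exact semantics of the PPPL firmware are not given in the abstract, so this is an assumed, generic interpretation of how a digitizer might reduce one acquisition block:

```python
def reduce_block(samples, mode, dt=1.0):
    """Reduce one acquisition block: 'normal' keeps the raw samples,
    'peak' keeps only the peak-to-peak excursion, and 'integrate'
    returns the time integral (rectangle rule with sample spacing dt)."""
    if mode == "normal":
        return list(samples)
    if mode == "peak":
        return max(samples) - min(samples)
    if mode == "integrate":
        return sum(samples) * dt
    raise ValueError(f"unknown mode: {mode}")

block = [0.0, 1.5, -0.5, 2.0, 0.5]
pp = reduce_block(block, "peak")         # peak-to-peak excursion
area = reduce_block(block, "integrate")  # time integral with dt = 1.0
```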

  5. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits whose characteristics are the inverse functions of the calibration functions of the corresponding radiation measuring subsystems. A weighting adder forms a desired linear combination of the outputs of the linearizing circuits. Operators for switching between two or more different linear combinations are included.

  6. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  7. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...

  8. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e.g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature-resistant radiation source (85Kr, 3H, or 63Ni).

  9. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of nuclear power plant analyzers in the USA is described. Two different types of analyzers are under development there: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal-hydraulics are described. (author)

  10. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  11. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  12. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  13. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concepts of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analyses needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid the drawbacks caused by sample alteration during sampling and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can all be successfully shrunk; for each of them some performance is sacrificed, so one must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  14. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation is discussed of extraction spectrophotometric determination of uranium in a solution. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer performance promises increased productivity of labour, improved operating and hygiene conditions, and mainly more accurate results of analyses. (J.C.)

  15. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze American options in a discrete-time context and with a finite outcome space, starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working...

  16. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  17. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  18. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
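The advantage of soft decisions over a bare BER statistic can be illustrated with a minimal sketch for BPSK. The symbol mapping and the SNR estimator below are common conventions assumed for illustration, not the SDA's actual algorithms:

```python
import statistics

def analyze_soft_decisions(soft, sent_bits):
    """Hard-decision BER plus a soft-decision SNR estimate for BPSK
    (+1 maps to bit 0, -1 to bit 1).  The SNR estimate uses the mean
    and spread of the soft values folded onto one symbol, insight a
    hard-decision count alone cannot provide."""
    hard = [0 if s >= 0 else 1 for s in soft]
    ber = sum(h != b for h, b in zip(hard, sent_bits)) / len(sent_bits)
    # Fold both symbols onto +1 using the known transmitted bits.
    folded = [s if b == 0 else -s for s, b in zip(soft, sent_bits)]
    mu = statistics.mean(folded)
    sigma = statistics.stdev(folded)
    snr = (mu / sigma) ** 2  # linear SNR estimate
    return ber, snr

soft = [0.9, 1.1, -1.0, 0.8, -1.2, 1.05, -0.95, 1.0]
bits = [0, 0, 1, 0, 1, 0, 1, 0]
ber, snr = analyze_soft_decisions(soft, bits)
```

Note that this short run has zero bit errors, yet the soft statistics still quantify the link margin, which is the point the abstract makes about closed-loop soft-decision measurement.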

  19. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real-time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available at comparably low cost right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, with no control room equipment; and (5) coupling of computers by telecommunication via telephone.

  20. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more pronounced when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images.
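For context, the widely used two-image DEI reconstruction separates apparent absorption from refraction by measuring on both slopes of the analyzer rocking curve. The statistical-moments approach of this abstract differs, so the sketch below is background on the standard algorithm, not the authors' method:

```python
def dei_two_image(i_low, i_high, r_low, r_high, dr_low, dr_high):
    """Two-image DEI reconstruction: from intensities measured on the
    low- and high-angle slopes of the analyzer rocking curve, recover
    the apparent-absorption intensity I_R and refraction angle dtheta.
    Model: I = I_R * (R(theta) + dR/dtheta * dtheta) at each slope.
    r_*  : rocking-curve reflectivity at each working point
    dr_* : rocking-curve slope dR/dtheta at each working point
    """
    denom = i_low * dr_high - i_high * dr_low
    i_r = denom / (r_low * dr_high - r_high * dr_low)
    dtheta = (i_high * r_low - i_low * r_high) / denom
    return i_r, dtheta

# Synthetic check: a pixel with true I_R = 2.0 and a refraction angle of
# 1 microradian, measured at symmetric working points (R = 0.5,
# slopes +/- 3e5 per radian) gives I_low = 1.6 and I_high = 0.4.
i_r, dtheta = dei_two_image(i_low=1.6, i_high=0.4,
                            r_low=0.5, r_high=0.5,
                            dr_low=3e5, dr_high=-3e5)
```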

  1. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  2. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites. PhosphoSiteAnalyzer applies an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and thereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net.

  3. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), was examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  4. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world's capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China's Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  5. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  6. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-O.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)
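The consumption figures quoted above imply that sample uptake is a negligible fraction of the waste stream; a quick check using only the abstract's own numbers:

```python
# Consumption budget from the figures quoted in the abstract:
samples_per_hour = 35      # "typical rate of 30 to 40 samples per hour"
sample_uptake_ml = 0.074   # consumed per sample and standard
waste_ml_per_min = 1.5     # stated waste generation rate

sample_volume_per_hour = samples_per_hour * sample_uptake_ml  # ~2.6 mL/h
waste_per_hour = waste_ml_per_min * 60                        # 90 mL/h

# The waste stream is therefore dominated by carrier and reagent
# flow, not by the samples themselves:
fraction_from_samples = sample_volume_per_hour / waste_per_hour
```

At roughly 3% of the hourly waste volume, the sample uptake itself contributes little, which is why waste minimization in such segmented-flow systems focuses on the carrier and reagent streams.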

  7. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  9. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability of long-term reliability of the large-scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities at levels of a few ppm. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppm in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described.

  10. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.
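The sensitivity gain from user-selectable long path lengths follows directly from the Beer-Lambert law. A minimal sketch with illustrative numbers (the absorptivity, concentration, and the 2 m path are assumptions typical of liquid-core waveguide cells, not quoted UltraPath specifications):

```python
def absorbance(epsilon, path_cm, conc):
    """Beer-Lambert law: A = epsilon * l * c, with epsilon in
    L/(mol*cm), path length l in cm, and concentration c in mol/L."""
    return epsilon * path_cm * conc

# A chromophore at 1 nM with molar absorptivity 1e4 L/(mol*cm):
a_1cm = absorbance(1e4, 1.0, 1e-9)    # in a standard 1 cm cuvette
a_2m = absorbance(1e4, 200.0, 1e-9)   # in an assumed 2 m waveguide path
```

Extending the path from a 1 cm cuvette to a 2 m waveguide multiplies the measured absorbance, and hence the attainable sensitivity, 200-fold.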

  11. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost-prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer), to analyze pooled DNA data. Results: We develop the software PDA for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
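The coefficient of preferential amplification mentioned above corrects pooled allele-signal ratios. A common form of that correction from the pooled-DNA literature (PDA's exact formulas may differ) looks like:

```python
def corrected_allele_freq(height_a, height_b, k):
    """Estimate the frequency of allele A in a DNA pool from the two
    allele signal heights, correcting for preferential amplification.
    k is the mean A/B signal ratio observed in heterozygous individuals
    (k = 1 means no preferential amplification)."""
    return height_a / (height_a + k * height_b)

# With no correction (k = 1) a 60:40 signal reads as frequency 0.6;
# if allele A amplifies 1.5x better than B, the corrected estimate
# is 0.5, i.e. the pool is actually balanced.
naive = corrected_allele_freq(60.0, 40.0, 1.0)
corrected = corrected_allele_freq(60.0, 40.0, 1.5)
```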

  12. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter, which are controlled by software performing irradiation control, loss-free gamma spectrometry, spectrum evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time, or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals, the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample-separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials with excellent agreement with consensus values. (author)
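Loss-free counting compensates dead-time losses by adding each recorded event to the spectrum with a weight. A minimal sketch of the principle (an illustration, not the implementation described in the record):

```python
def lfc_weight(live_fraction):
    """In loss-free counting, each recorded event is added to the
    spectrum with weight 1/live_fraction, so counts lost while the
    system was busy are statistically restored."""
    if not 0.0 < live_fraction <= 1.0:
        raise ValueError("live fraction must be in (0, 1]")
    return 1.0 / live_fraction

# At 99% losses only 1% of events are recorded, so each recorded
# event carries a weight of ~100 to restore the true count rate.
w = lfc_weight(0.01)
```

This is why the abstract can claim quantitative correction even at 99% losses: the corrected spectrum has the right areas, at the cost of increased statistical variance in each channel.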

  13. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  14. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  15. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level-of-detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the extracted feature vectors. Our method enables perceptually motivated level-of-detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
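
A toy version of the adapted co-occurrence matrix, here over a 1-D chain of surface clusters labeled visible (1) or occluded (0); the actual method operates on clusters of triangles and feeds the matrix-derived feature vector to a classifier:

```python
def cooccurrence(labels, num_states=2):
    """Symmetric co-occurrence matrix of visibility states over
    neighboring clusters (0 = occluded, 1 = visible)."""
    M = [[0] * num_states for _ in range(num_states)]
    for a, b in zip(labels, labels[1:]):
        M[a][b] += 1
        M[b][a] += 1  # keep the matrix symmetric
    return M

# Visibility labels along a strip of surface clusters:
vis = [1, 1, 0, 0, 1, 0, 1, 1]
M = cooccurrence(vis)  # flatten(M) would be the classifier's feature vector
```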

  16. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  17. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD converters

  18. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have the advantage of a small operating voltage, compared to the Proca and Green analyzer, where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for precise energy measurement of high-energy particles in the MeV range. (author)

  19. Biodegradation of trichloroethylene (TCE) by methanotrophic community.

    Science.gov (United States)

    Shukla, Awadhesh K; Vishwakarma, Pranjali; Upadhyay, S N; Tripathi, Anil K; Prasana, H C; Dubey, Suresh K

    2009-05-01

    Laboratory incubation experiments were carried out to assess the potential of a methanotrophic culture for degrading TCE. Measurements of the growth rate and TCE degradation showed that the methanotrophs not only grew in the presence of TCE but also degraded it. The rate of TCE degradation was found to be 0.19 ppm h⁻¹. A reverse transcriptase PCR test was conducted to quantify expression of the pmoA and mmoX genes. RT-PCR revealed expression of the pmoA gene only. This observation provides evidence that the pmoA gene was functionally active for the pMMO enzyme during the study. The diversity of the methanotrophs involved in TCE degradation was assessed by PCR amplification, cloning, restriction fragment length polymorphism and phylogenetic analysis of pmoA genes. Results suggested the occurrence of nine different phylotypes belonging to Type II methanotrophs in the enriched cultures. Of the nine, five clustered with the genus Methylocystis and the rest clustered into a separate group.

  20. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost multichannel analyzer. The performance of the developed system had a severe problem: the resulting spectrum suffered from a lack of smoothness; it was very noisy and full of spikes and surges, and therefore it was impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance.

  1. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  2. Identification of active methanotrophs in a landfill cover soil through detection of expression of 16S rRNA and functional genes.

    Science.gov (United States)

    Chen, Yin; Dumont, Marc G; Cébron, Aurélie; Murrell, J Colin

    2007-11-01

    Active methanotrophs in a landfill soil were revealed by detecting the 16S rRNA of methanotrophs and the mRNA transcripts of key genes involved in methane oxidation. New 16S rRNA primers targeting type I and type II methanotrophs were designed and optimized for analysis by denaturing gradient gel electrophoresis. Direct extraction of RNA from soil enabled the analysis of the expression of the functional genes: mmoX, pmoA and mxaF, which encode subunits of soluble methane monooxygenase, particulate methane monooxygenase and methanol dehydrogenase respectively. The 16S rRNA polymerase chain reaction (PCR) primers for type I methanotrophs detected Methylomonas, Methylosarcina and Methylobacter sequences from both soil DNA and cDNA which was generated from RNA extracted directly from the landfill cover soil. The 16S rRNA primers for type II methanotrophs detected primarily Methylocella and some Methylocystis 16S rRNA genes. Phylogenetic analysis of mRNA recovered from the soil indicated that Methylobacter, Methylosarcina, Methylomonas, Methylocystis and Methylocella were actively expressing genes involved in methane and methanol oxidation. Transcripts of pmoA but not mmoX were readily detected by reverse transcription polymerase chain reaction (RT-PCR), indicating that particulate methane monooxygenase may be largely responsible for methane oxidation in situ.

  3. A novel methanotroph in the genus Methylomonas that contains a distinct clade of soluble methane monooxygenase.

    Science.gov (United States)

    Nguyen, Ngoc-Loi; Yu, Woon-Jong; Yang, Hye-Young; Kim, Jong-Geol; Jung, Man-Young; Park, Soo-Je; Roh, Seong-Woon; Rhee, Sung-Keun

    2017-10-01

    Aerobic methane oxidation is a key process in the global carbon cycle that acts as a major sink of methane. In this study, we describe a novel methanotroph designated EMGL16-1 that was isolated from a freshwater lake using the floating filter culture technique. Based on a phylogenetic analysis of 16S rRNA gene sequences, the isolate was found to be closely related to the genus Methylomonas in the family Methylococcaceae of the class Gammaproteobacteria, with 94.2-97.4% 16S rRNA gene similarity to Methylomonas type strains. Comparison of chemotaxonomic and physiological properties further suggested that strain EMGL16-1 is taxonomically distinct from other species in the genus Methylomonas. The isolate was versatile in utilizing nitrogen sources such as molecular nitrogen, nitrate, nitrite, urea, and ammonium. The genes coding for a subunit of the particulate methane monooxygenase (pmoA), soluble methane monooxygenase (mmoX), and methanol dehydrogenase (mxaF) were detected in strain EMGL16-1. Phylogenetic analysis of mmoX indicated that the mmoX of strain EMGL16-1 is distinct from those of other strains in the genus Methylomonas. This isolate probably represents a novel species in the genus. Our study provides new insights into the diversity of species in the genus Methylomonas and their environmental adaptations.

  4. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  5. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach. It aims to make this a productive and interactive environment through the combination of FCC and SWAN software.

  6. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, detection methods, the number of tests per unit time, and analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  7. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built onto large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of usability and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive license.

  8. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input
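
The "monitor" category can be made concrete with a minimal run-time line monitor; this sketch uses Python's standard sys.settrace hook to count executed source lines of one function (illustrative only, not any tool described in the record):

```python
import sys
from collections import Counter

executed = Counter()

def tracer(frame, event, arg):
    # Count every executed source line in traced frames.
    if event == "line":
        executed[frame.f_lineno] += 1
    return tracer

def monitored(n):
    total = 0
    for i in range(n):
        total += i
    return total

sys.settrace(tracer)       # start dynamic monitoring
result = monitored(5)      # run-time behavior is observed, not inferred
sys.settrace(None)         # stop monitoring
```

Note how the monitor needs an actual execution with concrete input, which is exactly the dependence on input that distinguishes dynamic from static analysis.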

  9. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  10. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  11. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problems of antigen-antibody complex and free antigen separation. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and its lower end. The component has a large developed surface and is so shaped that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to the biological compound [fr]

  12. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many applications, so it is a very significant and useful tool, which in turn can be dangerous for living beings exposed to uncontrolled doses. However, it cannot be perceived by any of the human senses, so radiation detectors and additional devices are required to detect its presence and to quantify and classify it. A multichannel analyzer is responsible for separating the different pulse heights that are generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, and for the internal control of the multichannel analyzer the application for the ARM processor was written in C. In the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained from the system were displayed graphically in a histogram showing the measured spectrum. The multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable with those from commercial multichannel analyzers. (Author)
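
The core multichannel-analyzer operation described above, sorting digitized pulse heights into 2^n channels, can be sketched in a few lines (parameters are illustrative, not the embedded design's actual values):

```python
def accumulate_spectrum(pulse_heights, adc_bits=10, full_scale=10.0):
    """Sort pulse heights (in volts) into 2**adc_bits channels,
    accumulating the histogram that forms the measured spectrum."""
    channels = 2 ** adc_bits
    spectrum = [0] * channels
    for v in pulse_heights:
        ch = int(v / full_scale * channels)
        if 0 <= ch < channels:   # discard under- and over-range pulses
            spectrum[ch] += 1
    return spectrum

spec = accumulate_spectrum([1.0, 1.01, 5.0, 9.99], adc_bits=10)
```

In the FPGA implementation this loop becomes a hardware pipeline (peak detection, ADC, channel increment), but the channel arithmetic is the same.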

  13. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  14. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming computer language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen ''intruder detection'' probability and elapsed time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
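
The ''weakest path'' search can be illustrated in a procedural language (the original is Prolog): treat each link's detection probability and traversal time, and find the path that maximizes the intruder's overall non-detection probability within a time budget, i.e. a shortest path on edge costs -log(1 - p). The facility graph and all numbers below are invented for illustration:

```python
import heapq
import math
from itertools import count

# Hypothetical facility graph: edge -> (detection_probability, traverse_seconds)
edges = {
    ("gate", "yard"):  (0.6, 30),
    ("yard", "door"):  (0.4, 20),
    ("yard", "fence"): (0.1, 60),
    ("fence", "door"): (0.2, 40),
    ("door", "vault"): (0.9, 10),
}

def weakest_path(start, goal, max_time):
    """Path maximizing the intruder's non-detection probability
    within an elapsed-time budget (Dijkstra on -log(1 - p))."""
    graph = {}
    for (a, b), (p, t) in edges.items():
        graph.setdefault(a, []).append((b, -math.log(1.0 - p), t))
    tie = count()  # tie-breaker so the heap never compares lists
    heap = [(0.0, 0, next(tie), start, [start])]
    best = {}
    while heap:
        cost, elapsed, _, node, path = heapq.heappop(heap)
        if node == goal:
            return path, math.exp(-cost)
        if best.get(node, float("inf")) <= cost:
            continue
        best[node] = cost
        for nxt, c, t in graph.get(node, []):
            if elapsed + t <= max_time:
                heapq.heappush(heap, (cost + c, elapsed + t, next(tie),
                                      nxt, path + [nxt]))
    return None, 0.0

path, p_undetected = weakest_path("gate", "vault", max_time=180)
```

Here the fence route is the weakest path: its non-detection probability (0.4 x 0.9 x 0.8 x 0.1) beats the direct route despite taking longer. Note that with a hard time budget this greedy pruning is only a heuristic sketch, not a complete constrained-path solver.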

  15. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  16. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  17. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  18. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  19. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film
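
The energy reconstruction behind such a magnetic analyzer follows from the bending relation p = qBr and the relativistic energy-momentum relation; a sketch for protons (field and radius values are invented for illustration):

```python
import math

PROTON_REST_MEV = 938.272   # proton rest energy, MeV
C = 299_792_458.0           # speed of light, m/s

def kinetic_energy_mev(b_tesla, radius_m):
    """Proton kinetic energy from its bending radius in a uniform
    transverse magnetic field: pc = qBrc, then E_k = sqrt((pc)^2 +
    (mc^2)^2) - mc^2."""
    pc_mev = b_tesla * radius_m * C * 1e-6  # pc in MeV for unit charge
    return math.hypot(pc_mev, PROTON_REST_MEV) - PROTON_REST_MEV

# A 0.5 T field bending protons on a 1 m radius gives roughly 12 MeV:
ek = kinetic_energy_mev(0.5, 1.0)
```

Measuring the deflection on the scintillation screen fixes r, from which the energy spectrum follows channel by channel.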

  20. [Effects of copper on biodegradation mechanism of trichloroethylene by mixed microorganisms].

    Science.gov (United States)

    Gao, Yanhui; Zhao, Tiantao; Xing, Zhilin; He, Zhi; Zhang, Lijie; Peng, Xuya

    2016-05-25

    We isolated and enriched a mixed microbial community, SWA1, from landfill cover soils supplemented with trichloroethylene (TCE). The microbial mixture could degrade TCE effectively under aerobic conditions. We then investigated the effect of copper ion (0 to 15 μmol/L) on TCE biodegradation. Results show that the maximum TCE degradation rate was 29.60 nmol/min, with 95.75% degradation, when copper ion was at 0.03 μmol/L. In addition, genes encoding key enzymes during biodegradation were analyzed by real-time quantitative reverse transcription PCR (RT-qPCR). The relative expression abundance of the pmoA gene (4.22E-03) and the mmoX gene (9.30E-06) was highest when copper ion was at 0.03 μmol/L. Finally, we used MiSeq pyrosequencing to investigate the diversity of the microbial community. Methylocystaceae, which can co-metabolically degrade TCE, were the dominant microorganisms; other microorganisms able to oxidize TCE directly were also present in SWA1, and the microbial diversity decreased significantly with increasing copper ion concentration. Based on the above results, variation of copper ion concentration affected both the composition of SWA1 and the degradation mechanism of TCE. At a copper ion concentration of 0.03 μmol/L, the degradation mechanism of TCE included co-metabolic degradation by methanotrophs and direct oxidative metabolism. At 5 μmol/L copper ion (84.75% biodegradation), the degradation mechanism included direct degradation and co-metabolic degradation by methanotrophs and by microorganisms containing phenol hydroxylase. Therefore, biodegradation of TCE by microorganisms is a complicated process whose mechanism includes co-metabolic degradation by methanotrophs and bio-oxidation by non-methanotrophs.
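
Relative transcript abundances like those reported for pmoA and mmoX are commonly derived from qPCR cycle thresholds with the 2^-ΔCt method; an illustrative sketch (the Ct values are invented, not the study's data):

```python
def relative_expression(ct_target, ct_reference):
    """Relative transcript abundance by the 2^-dCt method:
    the target gene normalized to a reference gene, assuming
    ~100% amplification efficiency for both."""
    return 2.0 ** -(ct_target - ct_reference)

# e.g. a target gene crossing threshold at cycle 24.9 versus a
# reference (such as 16S rRNA) at cycle 17.0:
rel = relative_expression(24.9, 17.0)  # on the order of 4e-3
```

Each additional cycle of delay relative to the reference halves the inferred abundance, which is why small ΔCt differences translate into large expression ratios.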

  1. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities, and operates on the four head position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. The system includes analog processing of the position signals from the camera, digitization, subsequent processing of the energy signal in a multichannel analyzer, transmission of data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits comprise an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)
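
The energy signal in such systems is conventionally the sum of the four head position signals, and classic Anger arithmetic also recovers the event position; a minimal sketch of that relation (illustrative only, the analyzer's actual analog chain is more involved):

```python
def anger_logic(a, b, c, d):
    """Energy and normalized (x, y) event position from the four
    head position signals, per classic Anger-camera arithmetic."""
    energy = a + b + c + d
    if energy == 0:
        raise ValueError("no signal")
    x = ((b + d) - (a + c)) / energy
    y = ((a + b) - (c + d)) / energy
    return energy, x, y

e, x, y = anger_logic(1.0, 1.0, 1.0, 1.0)  # equal signals: centered event
```

The energy sum is what feeds the multichannel analyzer stage; the position ratios are what a full camera would use for imaging.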

  2. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  3. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility.

  4. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSN) during last years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to build a solid substrate of connections and relationships among people using the Web. In this preliminary work paper, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  5. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabVIEW that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabVIEW VI (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.
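
The proportionality between emitted fluorescence and SO2 concentration is what turns the raw detector signal into a reading. A minimal sketch of such a linear calibration (the span concentrations and count values below are invented for illustration, not 43i-TLE data):

```python
def fit_calibration(conc_ppb, counts):
    """Least-squares fit of fluorescence counts vs. known SO2 concentrations.
    Returns (slope, intercept) so that counts ~= slope * conc + intercept."""
    n = len(conc_ppb)
    mx = sum(conc_ppb) / n
    my = sum(counts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc_ppb, counts))
    sxx = sum((x - mx) ** 2 for x in conc_ppb)
    slope = sxy / sxx
    return slope, my - slope * mx

def counts_to_ppb(signal, slope, intercept):
    """Invert the calibration to report a concentration."""
    return (signal - intercept) / slope

# Hypothetical zero/span calibration points: (concentration in ppb, counts)
slope, intercept = fit_calibration([0.0, 50.0, 100.0], [12.0, 512.0, 1012.0])
so2 = counts_to_ppb(762.0, slope, intercept)   # -> 75.0 ppb
```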

  6. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the value of the fundamental engineering data gained. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  7. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed; it can also switch among FIA operation modes, giving it the functions of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution after adding cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput of 30-90 h⁻¹ and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant.

  8. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the value of the fundamental engineering data gained. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  9. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists, and probably the most crucial one, is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
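
The split/analyze/meta-analyze idea fits in a few lines: split the data into chunks, compute an estimate and its sampling variance per chunk, then pool the chunk estimates with fixed-effect (inverse-variance) weights. The article demonstrates the procedures in R; the following is an illustrative Python sketch of the same three steps on synthetic data:

```python
import random
import statistics

def split_analyze_meta(data, k):
    """Split -> analyze -> meta-analyze on a single machine.

    Split the data into k chunks, analyze each chunk independently
    (here: estimate its mean and the variance of that mean), then pool
    the chunk estimates with inverse-variance (fixed-effect) weights."""
    chunks = [data[i::k] for i in range(k)]                  # split
    estimates = []
    for chunk in chunks:                                     # analyze
        est = statistics.fmean(chunk)
        var = statistics.variance(chunk) / len(chunk)        # variance of the mean
        estimates.append((est, var))
    weights = [1.0 / var for _, var in estimates]            # meta-analyze
    pooled = sum(w * est for w, (est, _) in zip(weights, estimates)) / sum(weights)
    return pooled

random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(10_000)]
pooled = split_analyze_meta(data, k=10)   # close to the true mean of 5.0
```

In a real big-data setting each chunk would be analyzed on a separate worker (or with a full psychometric model per chunk) and only the per-chunk estimates and variances would be pooled.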

  10. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  11. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the Prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff.
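
The analyzer's core task, enumerating attack paths that meet search criteria as sensors drop out, can be mimicked with an ordinary graph search. A toy sketch in Python rather than Prolog; every zone name, sensor, and delay below is invented:

```python
# Facility zones form a graph; each edge carries a traversal delay (seconds)
# and an optional sensor. We enumerate routes that trip no *active* sensor.
EDGES = {
    ("gate", "yard"): (30, "fence_sensor"),
    ("yard", "hall"): (60, "door_alarm"),
    ("yard", "dock"): (20, None),
    ("dock", "hall"): (45, "motion_det"),
    ("hall", "vault"): (90, "vault_alarm"),
}

def unalarmed_paths(src, dst, inactive, path=None, delay=0):
    """Yield (route, total_delay) for every route that trips no active sensor."""
    path = path or [src]
    if src == dst:
        yield path, delay
        return
    for (a, b), (d, sensor) in EDGES.items():
        if a == src and b not in path and (sensor is None or sensor in inactive):
            yield from unalarmed_paths(b, dst, inactive, path + [b], delay + d)

# With three sensors out of service, one 185-second route to the vault opens;
# with every sensor active, the search finds none.
down = {"fence_sensor", "motion_det", "vault_alarm"}
paths = list(unalarmed_paths("gate", "vault", down))
```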

  12. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  13. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The 'COMBUSTIMETRO' technology examines fuel through the performance of an engine: since the role of the fuel is to produce energy for the combustion engine, the energy output is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel intake, and a fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM, and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  14. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.
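
The Fourier-transform step common to all such spectrometers, recovering a frequency spectrum from a sampled record, can be illustrated independently of the Bessel-beam optics. A sketch with a synthetic two-tone record (record length and bin positions chosen arbitrarily):

```python
import cmath
import math

def dft_mag(x):
    """Magnitude spectrum via a direct DFT (fine for a short illustration)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n)]

# Synthetic record containing two tones centered on bins 5 and 12 of 64 samples
n = 64
sig = [math.cos(2 * math.pi * 5 * t / n) + 0.5 * math.cos(2 * math.pi * 12 * t / n)
       for t in range(n)]
mag = dft_mag(sig)

# The two largest bins below the Nyquist bin recover the two line positions
peaks = sorted(sorted(range(1, n // 2), key=lambda k: mag[k], reverse=True)[:2])
```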

  15. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

    The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation when underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit behavior similar to that of underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a world-wide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local

  16. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of each battery with respect to an adjacent battery is unrestricted, allowing a reduction in component parts of the assembly and reducing the overall stack length. Additionally, a test jig or chamber for allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used for rapidly converting these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation, and it is not necessary to make rapid measurements as is now done.

  17. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer-based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics, allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by each code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5.

  18. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    While Modern Standard Arabic (MSA) has many resources, Arabic dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology). ADAM is a poor man's solution for quickly developing morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  19. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal in time intervals of nearly 22 ns, records it in a memory unit, and then slowly reads the information out to a computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude, rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs.

  20. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  1. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  2. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  3. New high voltage parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Kawasumi, Y.; Masai, K.; Iguchi, H.; Fujisawa, A.; Abe, Y.

    1992-01-01

    A new modification of the parallel plate analyzer for 500 keV heavy ions, which eliminates the effect of intense UV and visible radiation, has been successfully implemented. Its principle and results are discussed. (author)

  4. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  5. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectrum analyzer is intended for dynamic spectral analysis of signals from physical installations and for noise filtering. A recurrence Fourier transformation algorithm is used in the digital dynamic analyzer. It is realized on the basis of a fast-logic FPGA matrix and a dedicated ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional speed is 20 ns [ru]
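
The abstract does not give the actual recurrence, but a standard way to compute a Fourier transform recursively, well suited to FPGA implementation, is the sliding DFT, which updates every bin from the previous window instead of recomputing it. An illustrative Python sketch of that recurrence:

```python
import cmath
import math

def sliding_dft_step(X, x_old, x_new, N):
    """Update all N DFT bins when the analysis window slides by one sample:
    X_k <- (X_k - x_old + x_new) * exp(+2j*pi*k/N)."""
    return [(Xk - x_old + x_new) * cmath.exp(2j * math.pi * k / N)
            for k, Xk in enumerate(X)]

N = 8
samples = [math.sin(2 * math.pi * 2 * t / N) for t in range(3 * N)]

# Seed the recurrence with a direct DFT of the first window...
X = [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / N) for t in range(N))
     for k in range(N)]

# ...then slide across the record one sample at a time instead of recomputing.
for m in range(N, 2 * N):
    X = sliding_dft_step(X, samples[m - N], samples[m], N)
# X now equals the DFT of samples[N:2*N]; the tone appears in bin 2.
```

Each slide costs O(N) instead of the O(N log N) of a fresh FFT, which is why the recurrence form suits continuous, sample-by-sample hardware processing.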

  6. FST Based Morphological Analyzer for Hindi Language

    OpenAIRE

    Deepak Kumar; Manjeet Singh; Seema Shukla

    2012-01-01

    Hindi being a highly inflectional language, an FST (finite state transducer) based approach is the most efficient for developing a morphological analyzer for it. The work presented in this paper uses the SFST (Stuttgart Finite State Transducer) tool for generating the FST. A lexicon of root words is created. Rules are then added for generating inflectional and derivational words from these root words. The morphological analyzer developed was used in a Part of Speech (POS) tagger based on Stanford...

  7. A new automatic analyzer for uranium determination

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyan; Zhang Lan

    1992-08-01

    An intelligent automatic analyzer for uranium based on the principle of flow injection analysis (FIA) has been developed. It can directly determine uranium in solution in the range of 0.02 to 500 mg/L without any pretreatment. A chromatographic column with extractant, in which the trace uranium is concentrated and separated, giving it a special ability to enrich uranium, is connected to the manifold of the analyzer. The analyzer is suited for trace uranium determination in various samples. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) is used as the color reagent. Uranium is determined in aqueous solution after adding the cationic surfactant cetylpyridinium bromide (CPB). The rate of analysis is 30 to 90 samples per hour. The relative standard deviation of determination is 1-2%. The analyzer has been used in factories and the laboratory, and the results are satisfactory. The determination range can easily be changed by using a multi-function auto-injection valve that changes the injection volume of the sample and the channels, so it can adopt various FIA operation modes to meet the needs of FIA determination for other substances. The analyzer has universal functions.

  8. Scintiscans data analyzer model AS-10

    International Nuclear Information System (INIS)

    Malesa, J.; Wierzbicki, W.

    1975-01-01

    The principle of operation and the construction of a device for analyzing scintiscan data by "square root scaling" are presented. The device is equipped with a cassette tape recorder (type MK-125, made in Poland) serving as a scintiscan data bank, and with three programs for scintiscan data analysis. Two cassette types, C-60 and C-90, are used, with working times of 2 x 30 min and 2 x 45 min, respectively. Results of the scintiscan data analysis are printed by an electric typewriter as figures in the form of a digital scintigram. (author)

  9. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    To facilitate occupational safety and health there is a need to develop instruments to monitor and analyze nanoparticles in industry, research and urban environments. The aim of this Ph.D. project was to develop new sensors that can analyze engineered nanoparticles. Two sensors were studied: (i) a miniaturized toxicity sensor based on electrochemistry and (ii) a photothermal spectrometer based on tensile-stressed mechanical resonators (string resonators). Miniaturization of a toxicity sensor targeting engineered nanoparticles was explored. This concept was based on the results of the biodurability test...

  10. Analyzing Web Behavior in Indoor Retail Spaces

    OpenAIRE

    Ren, Yongli; Tomko, Martin; Salim, Flora; Ong, Kevin; Sanderson, Mark

    2015-01-01

    We analyze 18 million rows of Wi-Fi access logs collected over a one year period from over 120,000 anonymized users at an inner-city shopping mall. The anonymized dataset gathered from an opt-in system provides users' approximate physical location, as well as Web browsing and some search history. Such data provides a unique opportunity to analyze the interaction between people's behavior in physical retail spaces and their Web behavior, serving as a proxy to their information needs. We find: ...

  11. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an 'analyzing instrument' based on the open source software Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
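
Before a tool like Weka can mine a log, the raw lines are usually reduced to an attribute/frequency table. A minimal Python sketch of that preprocessing step; the log lines and the "timestamp level source message" field layout are invented, and real Windows event logs are richer:

```python
from collections import Counter

# Hypothetical log lines in a generic "date time level source message" shape
LOG = """\
2008-01-12 10:01:03 ERROR disk Write failure on volume C:
2008-01-12 10:01:07 INFO  net  Link up
2008-01-12 10:02:41 WARN  disk Retrying write
2008-01-12 10:02:44 ERROR disk Write failure on volume C:
"""

def summarize(log_text):
    """Turn raw log lines into (level, source) counts - the kind of
    attribute table a data-mining tool would then cluster or classify."""
    counts = Counter()
    for line in log_text.splitlines():
        _date, _time, level, source, *_msg = line.split()
        counts[(level, source)] += 1
    return counts

summary = summarize(LOG)   # e.g. summary[("ERROR", "disk")] == 2
```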

  12. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  13. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  14. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up, and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI).
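
The processing chain described above, estimate each pulse's peak and bin it into a histogram channel, can be sketched in software. A real FPGA implementation would filter or fit the samples and reject pile-up; the simple maximum-minus-baseline estimator below is a stand-in, and all numbers are synthetic:

```python
def pulse_peak(samples, baseline):
    """Estimate a shaped pulse's height as its maximum sample minus baseline."""
    return max(samples) - baseline

def build_histogram(pulses, baseline, n_channels, full_scale):
    """Bin each estimated peak height into one of n_channels energy channels."""
    hist = [0] * n_channels
    for p in pulses:
        h = pulse_peak(p, baseline)
        ch = min(int(h / full_scale * n_channels), n_channels - 1)
        hist[ch] += 1
    return hist

# Three synthetic shaped pulses riding on a baseline of 10 ADC counts
pulses = [
    [10, 40, 110, 40, 10],    # peak height 100
    [10, 35, 108, 42, 10],    # peak height 98
    [10, 80, 210, 75, 10],    # peak height 200
]
hist = build_histogram(pulses, baseline=10, n_channels=256, full_scale=256.0)
```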

  15. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilize time-varying pressure-drop measurements. A fast-response pressure transducer measures the pressure drop across the overall bed, or across some segment of the bed, and the pressure-drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.
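
One simple proxy for "fluidization turbulence" in a time-varying pressure-drop signal is its fluctuation intensity (standard deviation); the patent's analyzer maps such a statistic to an output voltage, but the exact processing is not given in the abstract. A sketch with invented numbers:

```python
import math

def fluctuation_intensity(dp_samples):
    """Standard deviation of the bed pressure-drop signal - a simple proxy
    for the degree of fluidization turbulence."""
    n = len(dp_samples)
    mean = sum(dp_samples) / n
    return math.sqrt(sum((x - mean) ** 2 for x in dp_samples) / n)

# A quiescent bed vs. a bubbling one (synthetic pressure drops, kPa)
quiet = [5.0, 5.1, 4.9, 5.0, 5.05, 4.95]
bubbling = [5.0, 6.2, 3.9, 5.8, 4.1, 6.0]
more_turbulent = fluctuation_intensity(bubbling) > fluctuation_intensity(quiet)
```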

  16. SINDA, Systems Improved Numerical Differencing Analyzer

    Science.gov (United States)

    Fink, L. C.; Pan, H. M. Y.; Ishimoto, T.

    1972-01-01

    A computer program has been written to analyze a group of 100-node areas and then provide for summation of any number of 100-node areas to obtain a temperature profile. SINDA program options offer the user a variety of methods for solution of the thermal analog models presented in network format.
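
A lumped-parameter thermal network of the kind SINDA solves can be illustrated with a single explicit finite-difference step over nodes joined by conductances; this is a generic sketch of the network idea, not SINDA's actual numerics, and all values are arbitrary:

```python
def step_temperatures(T, conductances, capacitance, dt):
    """One explicit finite-difference step for a lumped thermal network:
    C_i dT_i/dt = sum_j G_ij (T_j - T_i)."""
    new = list(T)
    for (i, j), G in conductances.items():
        q = G * (T[j] - T[i])          # heat flow from node j into node i
        new[i] += q * dt / capacitance[i]
        new[j] -= q * dt / capacitance[j]
    return new

# Three nodes in a chain; the ends start hot and cold (units arbitrary)
T = [100.0, 50.0, 0.0]
G = {(0, 1): 1.0, (1, 2): 1.0}
C = [1.0, 1.0, 1.0]
for _ in range(2000):
    T = step_temperatures(T, G, C, dt=0.01)
# With no sources, the chain relaxes toward a uniform 50.0
```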

  17. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology continues to require integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments, all together, to analyze the biological architecture on various levels, which is the origin of the systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology,...

  18. Analyzing the Acoustic Beat with Mobile Devices

    Science.gov (United States)

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-01-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we will be exploring the acoustic phenomenon of beats, which is produced by the overlapping of two tones with a small difference in frequency Δf. The…
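
The beat phenomenon is easy to reproduce numerically: superposing two tones f1 and f2 gives a carrier at the mean frequency whose amplitude envelope pulses at Δf = |f2 - f1| beats per second. A Python sketch with arbitrarily chosen tone and sample-rate values:

```python
import math

f1, f2, rate = 440.0, 444.0, 8000      # two tones, delta-f = 4 Hz (invented values)
n = rate                               # one second of samples
sig = [math.sin(2 * math.pi * f1 * t / rate) + math.sin(2 * math.pi * f2 * t / rate)
       for t in range(n)]

# Amplitude envelope: peak magnitude over short windows. The envelope pulses
# at |f2 - f1| Hz, so counting its dips counts the audible beats.
win = 40
env = [max(abs(s) for s in sig[i:i + win]) for i in range(0, n, win)]
low = [e < 0.5 for e in env]                       # envelope near a beat minimum
beats = sum(1 for prev, cur in zip([False] + low, low) if cur and not prev)
# beats == 4 over one second, matching delta-f = |f2 - f1| = 4 Hz
```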

  19. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that from nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.

  20. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. This paper presents a new way to analyze company structure by utilizing the natural social network that exists within the company, together with an example of its application to Enron.

  1. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine-language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer; in others, new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than those performed with conventional instrumentation.

  2. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  3. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  4. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting the factors that determine risk were developed. • Attributes covering NMAC and nuclear security culture are included in the analysis. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risk. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are assigned so that the risk value can be calculated quantitatively. Questionnaires are drawn up on whether a state has properly established a legal and regulatory framework (based on international standards); these questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), is analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities. This measure comprises radiological, economic, and social damage. Social and

  5. Detection, Isolation, and Characterization of Acidophilic Methanotrophs from Sphagnum Mosses ▿ †

    Science.gov (United States)

    Kip, Nardy; Ouyang, Wenjing; van Winden, Julia; Raghoebarsing, Ashna; van Niftrik, Laura; Pol, Arjan; Pan, Yao; Bodrossy, Levente; van Donselaar, Elly G.; Reichart, Gert-Jan; Jetten, Mike S. M.; Sinninghe Damsté, Jaap S.; Op den Camp, Huub J. M.

    2011-01-01

    Sphagnum peatlands are important ecosystems in the methane cycle. Methane-oxidizing bacteria in these ecosystems serve as a methane filter and limit methane emissions. Yet little is known about the diversity and identity of the methanotrophs present in and on Sphagnum mosses of peatlands, and only a few isolates are known. The methanotrophic community in Sphagnum mosses, originating from a Dutch peat bog, was investigated using a pmoA microarray. A high biodiversity of both gamma- and alphaproteobacterial methanotrophs was found. With Sphagnum mosses as the inoculum, alpha- and gammaproteobacterial acidophilic methanotrophs were isolated using established and newly designed media. The 16S rRNA, pmoA, pxmA, and mmoX gene sequences showed that the alphaproteobacterial isolates belonged to the Methylocystis and Methylosinus genera. The Methylosinus species isolated are the first acid-tolerant members of this genus. Of the acidophilic gammaproteobacterial strains isolated, strain M5 was affiliated with the Methylomonas genus, and the other strain, M200, may represent a novel genus, most closely related to the genera Methylosoma and Methylovulum. So far, no acidophilic or acid-tolerant methanotrophs in the Gammaproteobacteria class are known. All strains showed the typical features of either type I or II methanotrophs and are, to the best of our knowledge, the first isolated (acidophilic or acid-tolerant) methanotrophs from Sphagnum mosses. PMID:21724892

  6. Detection, isolation, and characterization of acidophilic methanotrophs from Sphagnum mosses.

    Science.gov (United States)

    Kip, Nardy; Ouyang, Wenjing; van Winden, Julia; Raghoebarsing, Ashna; van Niftrik, Laura; Pol, Arjan; Pan, Yao; Bodrossy, Levente; van Donselaar, Elly G; Reichart, Gert-Jan; Jetten, Mike S M; Damsté, Jaap S Sinninghe; Op den Camp, Huub J M

    2011-08-15

    Sphagnum peatlands are important ecosystems in the methane cycle. Methane-oxidizing bacteria in these ecosystems serve as a methane filter and limit methane emissions. Yet little is known about the diversity and identity of the methanotrophs present in and on Sphagnum mosses of peatlands, and only a few isolates are known. The methanotrophic community in Sphagnum mosses, originating from a Dutch peat bog, was investigated using a pmoA microarray. A high biodiversity of both gamma- and alphaproteobacterial methanotrophs was found. With Sphagnum mosses as the inoculum, alpha- and gammaproteobacterial acidophilic methanotrophs were isolated using established and newly designed media. The 16S rRNA, pmoA, pxmA, and mmoX gene sequences showed that the alphaproteobacterial isolates belonged to the Methylocystis and Methylosinus genera. The Methylosinus species isolated are the first acid-tolerant members of this genus. Of the acidophilic gammaproteobacterial strains isolated, strain M5 was affiliated with the Methylomonas genus, and the other strain, M200, may represent a novel genus, most closely related to the genera Methylosoma and Methylovulum. So far, no acidophilic or acid-tolerant methanotrophs in the Gammaproteobacteria class are known. All strains showed the typical features of either type I or II methanotrophs and are, to the best of our knowledge, the first isolated (acidophilic or acid-tolerant) methanotrophs from Sphagnum mosses.

  7. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwei; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao

    2005-01-01

    This article describes the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel multichannel analyzer were applied in this system. The multiple linear regression method employed to process the data resolved the problem of interference among multiple elements. The prototype (model MZ-MKFY) has been operated in the Changshan and Jilin power plants for about a year. Results of measuring the main parameters of coal, such as low calorific value, total moisture, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented.
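
The interference correction by multiple linear regression that the record mentions can be sketched as an ordinary least-squares fit of laboratory-assayed values against counts in several overlapping spectral windows. The calibration numbers below are synthetic placeholders, not data from the instrument:

```python
import numpy as np

# Each row: gamma-ray counts in three spectral windows whose lines overlap,
# so no single window maps cleanly to one element. y: lab-assayed sulfur (%).
X = np.array([[120.0, 30.0, 5.0],
              [150.0, 25.0, 8.0],
              [90.0,  40.0, 6.0],
              [130.0, 35.0, 7.0],
              [110.0, 28.0, 9.0]])
y = np.array([1.2, 1.6, 0.9, 1.3, 1.1])

# Least-squares fit with an intercept resolves the overlapping contributions.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_sulfur(counts):
    """Predict sulfur content (%) from window counts of a new measurement."""
    return float(np.append(counts, 1.0) @ coef)
```

With real calibration data, each coal parameter (calorific value, moisture, ash, sulfur) would get its own fitted coefficient vector against the same set of spectral windows.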

  8. Real time speech formant analyzer and display

    Science.gov (United States)

    Holland, George E.; Struve, Walter S.; Homer, John F.

    1987-01-01

    A speech analyzer for the interpretation of sound includes a sound input which converts the sound into a signal representing it. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and are then prepared for visual display in continuous real time. Parameters of the input sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor interfaced with a television set for displaying the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.

  9. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged as an area in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much by engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding of and involvement in policy work.

  10. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers a solution to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. Detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-size, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA application include on-line continuous (process) monitoring, process material holdup measurements, and field inspections.

  11. Testing the Application for Analyzing Structured Entities

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2011-01-01

    Full Text Available The paper presents the testing process of an application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps in building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the application, both by component and as a whole, are established. A testing strategy for different objectives is proposed. The behavior of users during the testing period is analyzed. A statistical analysis of user behavior in processes of infinite resource access is carried out.

  12. A new approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up, the facility will fabricate driver fuel for the Fast Flux Test Facility on the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method verifies a previous vulnerability assessment and introduces a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of the various possible routes to and from a target within a facility.

  13. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into the gas-filled conduit, which reduces the electrical charge and the speed of the ablated ions as they collide and mix with buffer gases. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  14. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit, and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, the command and control logic and reactor protection systems. The run-time unit carries out the dialog between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility.

  15. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    The radionuclide analysis performed in nuclear power plants for the purposes of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system, and monitoring radioactive waste effluent is an important job. Important as it is, it requires considerable labor from experts, because the samples to be analyzed are multifarious and very large in number, and in addition, the job depends heavily on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has largely accomplished its objectives and that the system is indeed useful. The developmental work was carried out in cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, from 1974 to this year. (auth.)

  16. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV, and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  17. Analyzing Gender Stereotyping in Bollywood Movies

    OpenAIRE

    Madaan, Nishtha; Mehta, Sameep; Agrawaal, Taneea S; Malhotra, Vrinda; Aggarwal, Aditi; Saxena, Mayank

    2017-01-01

    The presence of gender stereotypes in many aspects of society is a well-known phenomenon. In this paper, we focus on studying such stereotypes and bias in Hindi movie industry (Bollywood). We analyze movie plots and posters for all movies released since 1970. The gender bias is detected by semantic modeling of plots at inter-sentence and intra-sentence level. Different features like occupation, introduction of cast in text, associated actions and descriptions are captured to show the pervasiv...

  18. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV, and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  19. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single- or double-sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical, and pharmaceutical industries.

  20. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed

  1. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  2. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing a proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are the inflated GAMLSS model and the generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  3. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo, Yaohui

    2015-01-01

    1. Introduction: In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce the product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. Having entered China in 1989, it has become China's leading brand of chocolate in

  4. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement, and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust, and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring, and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various types of network analysis techniques used for examining ties, such as status, centrality, and power measures. Due to the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. The paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. It is found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further new measures should be developed based on the negative clique concept.
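
Of the measures the survey singles out, PN centrality is one of the few designed specifically for signed networks. A minimal sketch of the Everett and Borgatti formulation, PN = (I - A/(2n - 2))^-1 * 1 with A = P - 2N, on a toy four-node graph (the example network is invented for illustration):

```python
import numpy as np

def pn_centrality(P, N):
    """PN centrality for signed networks (Everett and Borgatti):
    solves (I - A / (2n - 2)) x = 1 with A = P - 2N, where P and N are
    the positive- and negative-tie adjacency matrices."""
    n = P.shape[0]
    A = P - 2.0 * N
    return np.linalg.solve(np.eye(n) - A / (2.0 * n - 2.0), np.ones(n))

# Toy network: nodes 0-2 form a positive clique; node 3 is distrusted by 0 and 1.
P = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 0, 0, 0]], dtype=float)
N = np.array([[0, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0],
              [1, 1, 0, 0]], dtype=float)
scores = pn_centrality(P, N)
```

Node 3, the target of the negative ties, receives the lowest score, which is exactly the "outsider identification" behavior the survey credits to PN centrality.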

  5. Testing the Application for Analyzing Structured Entities

    OpenAIRE

    Ion IVAN; Bogdan VINTILA

    2011-01-01

    The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the applicat...

  6. Evaluation of the Air Void Analyzer

    Science.gov (United States)

    2013-07-01

    …concrete using image analysis: Petrography of cementitious materials. ASTM STP 1215. S.M. DeHayes and D. Stark, eds. Philadelphia, PA: American… Administration (FHWA). 2006. Priority, market-ready technologies and innovations: Air Void Analyzer. Washington, D.C. PDF file. Germann Instruments (GI). 2011. …tests and properties of concrete and concrete-making materials. STP 169D. West Conshohocken, PA: ASTM International. Magura, D.D. 1996. Air void…

  7. Semantic analyzability in children's understanding of idioms.

    Science.gov (United States)

    Gibbs, R W

    1991-06-01

    This study investigated the role of semantic analyzability in children's understanding of idioms. Kindergartners and first, third, and fourth graders listened to idiomatic expressions either alone or at the end of short story contexts. Their task was to explain verbally the intended meanings of these phrases and then to choose their correct idiomatic interpretations. The idioms presented to the children differed in their degree of analyzability. Some idioms were highly analyzable or decomposable, with the meanings of their parts contributing independently to their overall figurative meanings. Other idioms were nondecomposable because it was difficult to see any relation between a phrase's individual components and the idiom's figurative meaning. The results showed that younger children (kindergartners and first graders) understood decomposable idioms better than they did nondecomposable phrases. Older children (third and fourth graders) understood both kinds of idioms equally well in supporting contexts, but were better at interpreting decomposable idioms than they were at understanding nondecomposable idioms without contextual information. These findings demonstrate that young children better understand idiomatic phrases whose individual parts independently contribute to their overall figurative meanings.

  8. Handheld Fluorescence Microscopy based Flow Analyzer.

    Science.gov (United States)

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay of modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive, and time-consuming procedure. This article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence-microscopy-based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating its high-throughput characteristics in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
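
Computational deblurring of the kind mentioned can be illustrated with a one-dimensional Wiener deconvolution along the flow direction. The box point-spread function and the SNR constant below are assumptions for the sketch, not details of the authors' system:

```python
import numpy as np

def wiener_deblur(row, blur_len, snr=100.0):
    """1-D Wiener deconvolution of motion blur along one image row.
    blur_len: blur extent in pixels, assumed known from flow speed and exposure."""
    n = len(row)
    psf = np.zeros(n)
    psf[:blur_len] = 1.0 / blur_len                     # box PSF of uniform motion
    H = np.fft.fft(psf)                                 # blur transfer function
    G = np.fft.fft(row)                                 # observed (blurred) row
    F = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr) * G   # Wiener estimate
    return np.real(np.fft.ifft(F))

# A sharp "bead" intensity profile, blurred by 5-pixel motion, then restored.
sharp = np.zeros(64)
sharp[30:34] = 1.0
psf = np.zeros(64)
psf[:5] = 1.0 / 5
blurred = np.real(np.fft.ifft(np.fft.fft(sharp) * np.fft.fft(psf)))
restored = wiener_deblur(blurred, 5)
```

The 1/SNR term regularizes frequencies where the box PSF's transfer function is near zero, which is what keeps the inversion stable in the presence of noise.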

  9. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US military is employing over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory; unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that establish fitness for use. The analyzer uses Raman spectroscopy to measure fuel samples without preparation in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot simply be correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach is laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm⁻¹ wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
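
Chemometric calibration of the kind described (whole spectra in, property value out) can be sketched with a linear model. Ridge regression stands in here for the partial-least-squares models typical in chemometric practice, and the spectra are simulated, not Raman data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training set: 300 "spectra" of 100 wavenumber bins each, with a
# fuel property encoded linearly in three bands plus measurement noise.
n_samples, n_bins = 300, 100
spectra = rng.normal(1.0, 0.1, (n_samples, n_bins))
true_w = np.zeros(n_bins)
true_w[[10, 45, 80]] = [3.0, -2.0, 1.5]
prop = spectra @ true_w + rng.normal(0.0, 0.05, n_samples)

# Ridge regression: w = (X'X + lam*I)^-1 X'y; lam stabilizes collinear bins.
lam = 1e-3
w_hat = np.linalg.solve(spectra.T @ spectra + lam * np.eye(n_bins),
                        spectra.T @ prop)

def predict_property(spectrum):
    """Predict the calibrated fuel property of an unseen spectrum."""
    return float(spectrum @ w_hat)
```

The point of the sketch is that no single bin predicts the property; the model distributes weight across the whole spectrum, which mirrors why the abstract says properties cannot be read off individual Raman peaks.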

  10. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    Fasching, G.E.; Patton, G.H.

    1975-01-01

    A method and apparatus to reduce the drift of single-channel analyzers are described. Essentially, this invention employs a time-sharing, or multiplexing, technique to ensure that the outputs from two single-channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source. The multiplexing is accomplished when a flip-flop, actuated by a clock, changes state to switch the outputs from the individual SCAs before these outputs are sent to a ratio-counting scaler. In the particular system embodiment disclosed to illustrate this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends that signal, after amplification, to an SCA system that contains the invention. Therein, at least two single-channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the multiplexer output contains counts falling within two distinct segments of the region. By dividing these counts by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)
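
The drift cancellation claimed here can be demonstrated with a toy model: two SCA windows derived from one threshold source count the same pulse train, and a common scale drift of that source moves both windows together, leaving the count ratio unchanged. The window bounds and pulse heights below are invented for illustration, not taken from the patent:

```python
def sca_count(pulse_heights, lo, hi):
    """One single-channel analyzer: count pulses whose height falls in [lo, hi)."""
    return sum(lo <= v < hi for v in pulse_heights)

def window_ratio(pulse_heights, threshold_scale=1.0):
    """Both windows derive from the same threshold source, so a drift in that
    source (threshold_scale != 1) rescales both windows identically and the
    A/B count ratio is preserved."""
    a = sca_count(pulse_heights, 2.0 * threshold_scale, 3.0 * threshold_scale)
    b = sca_count(pulse_heights, 4.0 * threshold_scale, 5.0 * threshold_scale)
    return a / b

# Pulse train with peaks well inside each window: the ratio survives a 10% drift.
pulses = [2.5] * 10 + [4.5] * 5
```

As long as the spectral peaks sit well inside their windows, a modest common drift changes neither count, so the downstream ratio-counting scaler sees a stable value.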

  11. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10^-8 to 10^-5 times the particle mass of positive-valence, singly charged ions. These are analyzed via a TOF MS. Initial tests, in which a pulsed N2 laser acted on samples of kamacite, pyrrhotite, serpentine, olivine, and Murchison meteorite, induced bursts of ions that were detected with a microchannel plate and a charge sensitive amplifier (CSA). Pulses from the N2 laser (10^11 W/cm^2) are assumed to simulate particle impact. Using aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al+1), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe+1) charge. Thus the present system yields ~10^-5% of the laser energy in resulting ions. A CSA signal indicates that at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg+1), 27 (Al+1), and 64 (Cu+1), and at 56 (Fe+1), 58 (Ni+1), and 60 (Ni+1) dalton, respectively.
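    The TOF relation underlying such a mass spectrum is t = L * sqrt(m / (2qV)) for an ion of mass m and charge q accelerated through potential V over drift length L; inverting it recovers the mass-to-charge ratio from the arrival time. The voltage and drift length below are illustrative, not CDA flight values.

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def flight_time(m_amu, charge, accel_volts, drift_len_m):
    """Drift time of an ion: t = L * sqrt(m / (2 q V))."""
    m = m_amu * AMU
    return drift_len_m * math.sqrt(m / (2 * charge * E_CHARGE * accel_volts))

def mass_from_time(t, charge, accel_volts, drift_len_m):
    """Invert the TOF relation to recover the ion mass in amu."""
    m = 2 * charge * E_CHARGE * accel_volts * (t / drift_len_m) ** 2
    return m / AMU

# A singly charged Fe+ ion (56 amu) through 1 kV over a 0.2 m drift tube
t_fe = flight_time(56.0, 1, 1000.0, 0.2)
print(round(mass_from_time(t_fe, 1, 1000.0, 0.2)))  # → 56
```

    Because t scales with sqrt(m/q), the Fe+1 (56), Ni+1 (58), and Ni+1 (60) peaks arrive as closely spaced but resolvable bursts.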

  12. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant), which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as of suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  13. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analyzing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  14. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    Full Text Available This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the times of events, and has prompted efforts to identify events and to solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment must be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  15. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    Matsuyama, Yuji; Orii, Shigeo; Ota, Toshiro; Kume, Etsuo; Aikawa, Hiroshi.

    1997-03-01

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, which shows how much execution time is spent in program segments. The second is ptool, which shows parallelization overhead on the Paragon system. The last is xtool, which shows parallelization overhead on the VPP system. The kpx, designed to work for any FORTRAN code on any UNIX computer, is confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  16. A low power Multi-Channel Analyzer

    International Nuclear Information System (INIS)

    Anderson, G.A.; Brackenbush, L.W.

    1993-06-01

    The instrumentation used in nuclear spectroscopy is generally large, is not portable, and requires a lot of power. Key components of these counting systems are the computer and the Multi-Channel Analyzer (MCA). To assist in performing measurements requiring portable systems, a small, very low power MCA has been developed at Pacific Northwest Laboratory (PNL). This MCA is interfaced with a Hewlett-Packard palmtop computer for portable applications. The MCA can also be connected to an IBM/PC for data storage and analysis. In addition, a real-time display mode allows the user to view the spectra as they are collected

  17. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.

  18. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a light-weight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable, and built for field use. Both visual and auditory cues are provided to the operator.

  19. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  20. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    Full Text Available The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  1. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  2. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  3. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  4. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    Full Text Available A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable for differentiating laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a major concern for public health, especially affecting people living in high-burden, resource-limited settings.

  5. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  6. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often reach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar amount of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the leaf sides with higher stomatal density faced outside. These results showed a clear effect of the position of stomata on ΔCO2. Therefore, it can be concluded that leaf position is important to guarantee improved respiration measurements, increasing ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.
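    The open-flow arithmetic behind this can be sketched as Rd = ΔCO2 * flow / area. The values below are invented, and the doubling of ΔCO2 is produced here by halving the flow rather than by re-orienting the leaf, but the point is the same: the measured differential grows while the computed rate is unchanged, lifting the signal further above the instrument's precision limit.

```python
# Minimal sketch of the open-flow respiration calculation (invented numbers).
def respiration(delta_co2_umol_mol, flow_umol_s, area_m2):
    """Apparent Rd in umol CO2 m^-2 s^-1 from an open-flow CO2 differential."""
    return delta_co2_umol_mol * flow_umol_s / (area_m2 * 1e6)

r1 = respiration(0.8, 5000.0, 0.0004)   # small differential, high flow
r2 = respiration(1.6, 2500.0, 0.0004)   # doubled differential, halved flow
print(r1 == r2)  # → True: same rate, but r2's differential is easier to resolve
```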

  8. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date, with an initial perihelion of 35 solar radii (RS) and a final perihelion near 10 RS. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. The main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a time-of-flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1-60 amu/q, and a field of view of 240° x 120°. Here we will show flight calibration results and performance.

  9. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts who represent owners, consultants, and contractors' organizations. The Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there is significant difference between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
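    One common convention in such delay-cause surveys (assumed here; the paper's exact formulas may differ) normalizes each cause's mean frequency and severity scores to 0-100 indices and combines them as Importance = Frequency * Severity / 100. The cause names and respondent scores below are invented.

```python
# Hypothetical sketch of questionnaire index ranking for delay causes.
def index(scores, max_score=4):
    """Normalize mean respondent score (0..max_score) to a 0-100 index."""
    return 100 * sum(scores) / (max_score * len(scores))

responses = {
    "finance delays by owner": {"freq": [4, 3, 4, 4], "sev": [4, 4, 3, 4]},
    "design changes":          {"freq": [2, 3, 2, 2], "sev": [3, 2, 3, 3]},
}

# Rank causes by Importance Index = Frequency Index * Severity Index / 100
ranked = sorted(
    responses,
    key=lambda c: index(responses[c]["freq"]) * index(responses[c]["sev"]) / 100,
    reverse=True,
)
print(ranked[0])  # → finance delays by owner
```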

  10. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including the lack of access to correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge, and developing tools for extracting and sharing knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders, and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, we analyzed the overlap among sources; at the second level, the presence of rare disease terms in the target sources included in UMLS, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.
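    The overlap analysis at the term level can be sketched by treating each terminology as a set of terms and intersecting them pairwise. The terms below are invented stand-ins, not the study's data, and the overlap measure shown is one simple choice among several.

```python
# Hypothetical sketch: pairwise term overlap between terminology sources.
mesh   = {"marfan syndrome", "gaucher disease", "fabry disease"}
icd10  = {"marfan syndrome", "fabry disease", "pompe disease"}
snomed = {"marfan syndrome", "gaucher disease", "pompe disease"}

def overlap(a, b):
    """Fraction of the smaller source's terms also found in the other source."""
    return len(a & b) / min(len(a), len(b))

print(sorted(mesh & icd10))            # terms present in both sources
print(round(overlap(mesh, icd10), 2))  # → 0.67
```

    Working at the concept level instead would first map terms to shared concept identifiers (as UMLS does), so synonyms across sources count as the same entry.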

  11. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recording or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.

  12. Automatic analyzing device for chlorine ion

    International Nuclear Information System (INIS)

    Sugibayashi, Shinji; Morikawa, Yoshitake; Fukase, Kazuo; Kashima, Hiromasa.

    1997-01-01

    The present invention provides a device for automatically analyzing trace amounts of chlorine ions contained in the feedwater, condensate and reactor water of a BWR type power plant. Namely, zero-adjustment or span calibration in this device is conducted as follows. (1) A standard chlorine ion liquid is supplied from a tank to a mixer by a constant volume pump, and the liquid is diluted and mixed with purified water to form a standard liquid. (2) The pH of the standard liquid is adjusted by a pH adjuster. (3) The standard liquid is supplied to an electrode cell to conduct zero adjustment or span calibration. Chlorine ions in a specimen are measured by the device of the present invention as follows. (1) The specimen is supplied to a head tank through a line filter. (2) The pH of the specimen is adjusted by a pH adjuster. (3) The specimen is supplied to an electrode cell to electrically measure the concentration of the chlorine ions in the specimen. The device of the present invention can automatically analyze trace amounts of chlorine ions at high accuracy, thereby improving sensitivity and reducing the operator's burden and radiation exposure. (I.S.)

  13. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated); the sample stream is instead handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  14. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after administering an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of this isotope in the breath of about 500 ppm.
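    The before/after comparison underlying such breath tests is commonly expressed as a delta-over-baseline (DOB) value in per-mil, i.e., the relative change of the 13CO2/12CO2 ratio between the two samples. The sketch below uses that convention with the abstract's numbers (a 1% rise on a ~500 ppm 13CO2 baseline); the 12CO2 fraction assumed is illustrative.

```python
# Sketch of the 13C-breath-test arithmetic (delta-over-baseline convention).
def delta_over_baseline(ratio_post, ratio_pre):
    """DOB in per-mil: relative change of the 13C/12C ratio vs. baseline."""
    return 1000 * (ratio_post - ratio_pre) / ratio_pre

# Baseline ~500 ppm 13CO2 against an assumed ~4% 12CO2 in breath;
# a 1% rise in 13CO2 corresponds to a 10 per-mil DOB.
pre = 500e-6 / 0.04
post = 505e-6 / 0.04
print(round(delta_over_baseline(post, pre), 1))  # → 10.0
```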

  15. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and involve mouse-clicks in each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
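    The path-selection step described above, elongating each root along an optimal path through the existing graph, can be sketched with a standard shortest-path search. The snippet below uses Dijkstra's algorithm as a stand-in for the optimal-path criterion; the skeleton graph, node names, and edge weights are invented.

```python
import heapq

# Hypothetical sketch: the root-system skeleton as a weighted graph,
# with each root elongated along the lowest-cost path through it.
def shortest_path(graph, start, goal):
    """Dijkstra over {node: [(neighbor, cost), ...]}; returns the node list."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + w, nbr, path + [nbr]))
    return None

skeleton = {
    "tip": [("a", 1.0), ("b", 2.5)],
    "a":   [("b", 0.5), ("base", 3.0)],
    "b":   [("base", 1.0)],
}
print(shortest_path(skeleton, "tip", "base"))  # → ['tip', 'a', 'b', 'base']
```

    In the actual algorithm the cost would encode the growth model (expected elongation increment and direction), not plain edge length.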

  16. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and an Aggregatibacter actinomycetemcomitans strain clinically isolated from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial strain. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time of fastidious buccal microorganisms associated with the etiology of periodontitis, compared to traditional methods.
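    Distance-based classification of spectra, as in the cluster analysis above, can be sketched with synthetic data. The snippet uses a nearest-centroid assignment as a stand-in for hierarchical clustering; the two "strains", their band positions, and the noise level are all invented.

```python
import numpy as np

# Hypothetical sketch: each synthetic "FTIR spectrum" is a vector of
# absorbances, and strains are separated by spectral distance.
rng = np.random.default_rng(1)

# Two invented strains with distinct absorption bands plus noise
strain_a_band = np.zeros(50); strain_a_band[10:15] = 1.0
strain_b_band = np.zeros(50); strain_b_band[30:35] = 1.0
spectra_a = strain_a_band + 0.05 * rng.normal(size=(5, 50))
spectra_b = strain_b_band + 0.05 * rng.normal(size=(5, 50))

centroids = {"A": spectra_a.mean(axis=0), "B": spectra_b.mean(axis=0)}

def classify(spectrum):
    """Assign a spectrum to the strain with the nearest mean spectrum."""
    return min(centroids, key=lambda k: np.linalg.norm(spectrum - centroids[k]))

unknown = strain_b_band + 0.05 * rng.normal(size=50)
print(classify(unknown))  # → B
```

    Full hierarchical cluster analysis would instead build a dendrogram from all pairwise spectral distances, but the separation criterion is the same.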

  17. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  18. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays

  19. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  20. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzes basic infrastructure requirements, identifies related infrastructure issues, concerns, and vulnerabilities, and offers recommended solutions.

  1. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the sample survey Eurobarometer, which has been requested by the European Commission. The social climate index is used to measure the perceptions of the population, taking into account their personal situation and their outlook at the national level. The paper analyzes the evolution of social climate indices for the countries of the European Union and offers information about the expectations of the population of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, on a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  2. Analyzing Demand: Hegemonic Masculinity and Feminine Prostitution

    Directory of Open Access Journals (Sweden)

    Beatriz Ranea Triviño

    2016-12-01

    Full Text Available In this article, we present exploratory research analyzing the relationship between the construction of hegemonic masculinity and the consumption of female prostitution. We have focused our attention on the experiences, attitudes and perceptions of young heterosexual men who have ever paid for sex. Following a qualitative method of analysis, we conducted six semi-structured interviews with men between 18 and 35 years old. The analysis of the interviews shows differences in the frequency of payment for sexual services, the diversity of motivations, the spaces where prostitutes are sought, and opinions on prostitution and prostitutes. The main conclusion of this study is that the discourses of the interviewees reproduce gender stereotypes and sexual gender roles. It is suggested that prostitution can be interpreted as a scenario where these men perform their hegemonic masculinity.

  3. Using wavelet features for analyzing gamma lines

    International Nuclear Information System (INIS)

    Medhat, M.E.; Abdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Uzhinskii, V.V.

    2004-01-01

    Data processing methods for analyzing gamma-ray spectra with symmetric bell-shaped peaks are considered. In many cases the peak shape is a symmetric bell; in particular, the Gaussian is most often used, for many physical reasons. The problem is how to evaluate the parameters of such peaks, i.e. their positions, amplitudes and half-widths, both for single and for overlapping peaks. Using the Marr wavelet (Mexican hat) in a correlation method, it is possible to estimate the optimal wavelet parameters and to locate peaks in the spectrum. Comparison with other approaches shows the better quality of the wavelet transform method
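
    The correlation idea can be sketched directly (illustrative toy spectrum and a fixed scale, not the paper's implementation): correlating a spectrum with a Marr ("Mexican hat") wavelet gives its maximum response at the centre of a Gaussian peak, which locates the peak position.

```python
import math

# Hedged sketch of Mexican-hat peak location: the wavelet has zero mean, so a
# flat background correlates to ~0 while a Gaussian line produces a strong
# response centred on the peak. Spectrum and scale are invented for the demo.

def marr(t):
    """Marr / Mexican-hat wavelet (unnormalized)."""
    return (1.0 - t * t) * math.exp(-t * t / 2.0)

def cwt_response(spectrum, scale):
    """Correlate the spectrum with the wavelet centred at every channel."""
    n = len(spectrum)
    return [sum(spectrum[i] * marr((i - c) / scale) for i in range(n))
            for c in range(n)]

# Synthetic gamma line: Gaussian peak centred at channel 50 on a flat background.
spec = [5.0 + 40.0 * math.exp(-((i - 50) ** 2) / (2.0 * 4.0 ** 2))
        for i in range(101)]
resp = cwt_response(spec, scale=4.0)
print(resp.index(max(resp)))  # -> 50
```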

  4. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  5. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Rojas S, A.S.; Carrillo M, R.A.; Balderas, E.G.

    1999-01-01

    Based on the study of real neutron flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where core oscillation phenomena lie in the 0 to 2.5 Hz range, the possibility was identified of developing surveillance and diagnostic equipment capable of analyzing core behavior in this frequency range in real time. An important method for monitoring the stability of the reactor core is the power spectral density, which allows the frequencies and amplitudes contained in the signals to be determined. The instrument is implemented in LabVIEW graphical programming with a 16-channel data acquisition card and runs in the Windows 95/98 environment. (Author)
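
    As a hedged sketch of the power-spectral-density idea (synthetic signal, not plant data, and a plain DFT rather than the LabVIEW implementation): the PSD of a 1.2 Hz oscillation recovers its frequency inside the 0-2.5 Hz band of interest for core stability.

```python
import math

# Hedged sketch: DFT-based power spectral density of a synthetic oscillation.
# Sampling rate, length, and the 1.2 Hz test frequency are illustrative.

fs = 10.0   # sampling rate in Hz
n = 200     # number of samples -> 0.05 Hz frequency resolution

signal = [math.sin(2 * math.pi * 1.2 * k / fs) for k in range(n)]

def psd(x):
    """Power at each DFT bin below the Nyquist frequency."""
    m = len(x)
    out = []
    for f in range(m // 2):
        re = sum(x[k] * math.cos(2 * math.pi * f * k / m) for k in range(m))
        im = sum(x[k] * math.sin(2 * math.pi * f * k / m) for k in range(m))
        out.append((re * re + im * im) / m)
    return out

spectrum = psd(signal)
peak_bin = spectrum.index(max(spectrum))
print(peak_bin * fs / n)  # -> 1.2 (Hz)
```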

  6. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-10-01

    The Nuclear Plant Analyzer (NPA) is being developed as the US Nuclear Regulatory Commission's (NRC's) state of the art safety analysis and engineering tool to address key nuclear plant safety issues. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures, during off-normal operation, for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper

  7. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  8. Nuclear plant analyzer development at INEL

    International Nuclear Information System (INIS)

    Laats, E.T.; Russell, K.D.; Stewart, H.D.

    1983-01-01

    The Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC) has sponsored development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes the status of the NPA project at the INEL after one year of development. When completed, the NPA will be an integrated network of analytical tools for performing reactor plant analyses. Development of the NPA in FY-1983 progressed along two parallel pathways, namely conceptual planning and software development. Regarding NPA planning, an extensive effort was conducted to define the functional requirements of the NPA, the conceptual design, and hardware needs. Regarding software development conducted in FY-1983, all development was aimed toward demonstrating the basic concept and feasibility of the NPA. Nearly all software was developed and resides on the INEL twin Control Data Corporation 176 mainframe computers

  9. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  10. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
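
    The complete-enumeration baseline the abstract compares against can be sketched on a made-up 3-edge network (not one from the paper): every edge takes one of a few lengths with known probabilities, and enumerating all edge-length combinations gives the exact distribution of the s-t shortest path length.

```python
import itertools
from collections import Counter

# Hedged sketch: exact shortest-path-length distribution by brute-force
# enumeration. Two routes from s to t: a direct edge, or via middle node m.
# Edge lengths and probabilities are invented for illustration.

edges = {
    ("s", "t"): [(3, 0.5), (6, 0.5)],   # direct edge, random length
    ("s", "m"): [(1, 1.0)],             # fixed-length edge to middle node
    ("m", "t"): [(1, 0.5), (9, 0.5)],   # random edge from middle node
}

dist = Counter()
names = list(edges)
for combo in itertools.product(*(edges[e] for e in names)):
    length = dict(zip(names, (c[0] for c in combo)))
    prob = 1.0
    for _, p in combo:
        prob *= p
    shortest = min(length[("s", "t")],                        # direct route
                   length[("s", "m")] + length[("m", "t")])   # route via m
    dist[shortest] += prob

print(sorted(dist.items()))  # -> [(2, 0.5), (3, 0.25), (6, 0.25)]
```

    Conditional factoring, as described above, avoids this exponential enumeration by decomposing the network into smaller subnetworks, but it must reproduce exactly this distribution.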

  11. RELAP5 nuclear plant analyzer capabilities

    International Nuclear Information System (INIS)

    Wagner, R.J.; Ransom, V.H.

    1982-01-01

    An interactive execution capability has been developed for the RELAP5 code which permits it to be used as a Nuclear Plant Analyzer. This capability has been demonstrated using a simplified primary and secondary loop model of a PWR. A variety of loss-of-feedwater accidents have been simulated using this model. The computer execution time on a CDC Cyber 176 is one half of the transient simulation time, so the results can be displayed in real time. The results of the demonstration problems are displayed in digital form on a color schematic of the plant model using a Tektronix 4027 CRT terminal. The interactive feature allows the user to enter commands in much the same manner as a reactor operator

  12. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach

  13. Diffractive interference optical analyzer (DiOPTER)

    Science.gov (United States)

    Sasikumar, Harish; Prasad, Vishnu; Pal, Parama; Varma, Manoj M.

    2016-03-01

    This report demonstrates a method for high-resolution refractometric measurements using what we have termed a Diffractive Interference Optical Analyzer (DiOpter). The setup consists of a laser, a polarizer, a transparent diffraction grating and Si photodetectors. The sensor is based on the differential response of diffracted orders to bulk refractive index changes. In these setups, the differential read-out of the diffracted orders suppresses signal drifts and enables time-resolved determination of refractive index changes in the sample cell. A remarkable feature of this device is that under appropriate conditions, the measurement sensitivity of the sensor can be enhanced by more than two orders of magnitude due to interference between multiply reflected diffracted orders. A noise-equivalent limit of detection (LoD) of 6×10⁻⁷ RIU was achieved in glass. This work focuses on devices with an integrated sample well, made on low-cost PDMS. As the detection methodology is experimentally straightforward, it can be used across a wide array of applications, ranging from detecting changes in surface adsorbates via binding reactions to estimating refractive index (and hence concentration) variations in bulk samples. An exciting prospect of this technique is the potential integration of this device with smartphones using a simple interface based on a transmission-mode configuration. In a transmission configuration, we were able to achieve an LoD of 4×10⁻⁴ RIU, which is sufficient to explore several applications in food quality testing and related fields. We envision the future of this platform as a personal handheld optical analyzer for applications ranging from environmental sensing to healthcare and quality testing of food products.

  14. Relativistic effects in the calibration of electrostatic electron analyzers. I. Toroidal analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Keski-Rahkonen, O. [Helsinki University of Technology, Espoo (Finland). Laboratory of Physics; Krause, M.O. [Oak Ridge National Lab., Tenn. (USA)

    1978-02-01

    Relativistic correction terms up to second order are derived for the kinetic energy of an electron travelling along the circular central trajectory of a toroidal analyzer. Furthermore, a practical energy calibration equation for the spherical sector plate analyzer is written for the variable-plate-voltage recording mode. Accurate measurements with a spherical analyzer performed using kinetic energies from 600 to 2100 eV are in good agreement with this theory, showing that our approximation (neglect of fringing fields, and of source and detector geometry) is realistic enough for actual calibration purposes.
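
    For orientation, the size of such corrections can be gauged from the standard textbook expansion of relativistic kinetic energy in powers of momentum (a generic result, not the paper's analyzer-specific calibration equation):

```latex
T = \sqrt{p^{2}c^{2} + m^{2}c^{4}} - mc^{2}
  = \frac{p^{2}}{2m}\left(1 - \frac{p^{2}}{4m^{2}c^{2}} + \cdots\right)
```

    Since T ≈ p²/2m to lowest order, the leading fractional correction is about T/(2mc²); for a 2100 eV electron (mc² ≈ 511 keV) this is roughly 0.2%, small but relevant at calibration accuracy.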

  15. Plasma diagnostics with a retarding potential analyzer

    International Nuclear Information System (INIS)

    Jack, T.M.

    1996-01-01

    The plasma rocket is located at NASA Johnson Space Center. To produce a thrust in space, an inert gas is ionized into a plasma and heated in the linear section of a tokamak fusion device. The magnetic field used to contain the plasma has a magnitude of 2-10 kGauss. The plasma plume has a variable thrust and specific impulse. A high temperature retarding potential analyzer (RPA) is being developed to characterize the plasma in the plume and at the edge of the magnetically contained plasma. The RPA measures the energy and density of ions or electrons entering into its solid angle of collection. An oscilloscope displays the ion flux versus the collected current. All measurements are made relative to the facility ground. Testing of this device involves the determination of its output parameters, sensitivity, and responses to a wide range of energies and densities. Each grid will be tested individually by changing only its voltage and observing the output from the RPA. To verify that the RPA is providing proper output, it is compared to the output from a Langmuir or Faraday probe

  16. Analyzing the development of Indonesia shrimp industry

    Science.gov (United States)

    Wati, L. A.

    2018-04-01

    This research aimed to analyze the development of the shrimp industry in Indonesia. Porter's Diamond Theory was used as the framework for industry analysis and business strategy development. The related Five Forces model identifies five forces that determine the competitive intensity in an industry, namely (1) the threat of substitute products, (2) the threat of competition, (3) the threat of new entrants, (4) the bargaining power of suppliers, and (5) the bargaining power of consumers. The development of the Indonesian shrimp industry is fairly good, as explained by the Porter Diamond Theory analysis. The analysis proceeds through four main components, namely factor conditions; demand conditions; related and supporting industries; and firm strategy, structure and rivalry, coupled with two supporting components (government regulation and the factor of chance). The results of this research show that the two supporting components have a positive effect; related and supporting industries have a negative effect; firm strategy and structure have a negative effect; rivalry has a positive effect; and factor conditions have a positive effect (except for science and technology resources).

  17. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size, but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. We are motivated by this rising challenge and have built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it over publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
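
    A minimal sketch of the "anomaly grid" idea (a simple per-cell z-score with invented counts and threshold; the tool's actual mining algorithms are not specified in the abstract): records are aggregated into spatial grid cells per time step, and a cell is flagged when its current value deviates strongly from that cell's own history.

```python
import math

# Hedged sketch: flag grid cells whose current count is an outlier relative
# to the cell's historical mean and standard deviation. All numbers invented.

def zscore_anomalies(history, current, threshold=3.0):
    """history: {cell: [past counts]}, current: {cell: count} -> flagged cells."""
    flagged = []
    for cell, past in history.items():
        mean = sum(past) / len(past)
        var = sum((x - mean) ** 2 for x in past) / len(past)
        std = math.sqrt(var) or 1.0   # guard against zero-variance history
        if abs(current.get(cell, 0) - mean) / std > threshold:
            flagged.append(cell)
    return flagged

history = {(0, 0): [10, 12, 11, 9, 10], (0, 1): [3, 4, 3, 5, 4]}
current = {(0, 0): 11, (0, 1): 40}        # cell (0, 1) spikes
print(zscore_anomalies(history, current))  # -> [(0, 1)]
```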

  18. Alternative approach to analyzing occupational mortality data

    International Nuclear Information System (INIS)

    Gilbert, E.S.; Buchanan, J.A.

    1984-01-01

    It is widely recognized that analyzing occupational mortality by calculating standardized mortality ratios based on death rates from the general population is subject to a number of limitations. An alternative approach described in this report takes advantage of the fact that comparisons of mortality by subgroups and assessments of trends in mortality are often of equal or greater interest than overall assessments, and that such comparisons do not require an external control. A computer program MOX (Mortality and Occupational Exposure) is available for performing the needed calculations for several diseases. MOX was written to assess the effect of radiation exposure on Hanford nuclear workers. For this application, analyses have been based on cumulative exposure computed (by MOX) from annual records of radiation exposure obtained from personal dosimeter readings. This program provides tests for differences and trends among subcategories defined by variables such as length of employment, job category, or exposure measurements and also provides control for age, calendar year, and several other potentially confounding variables. 29 references, 2 tables
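
    The conventional SMR calculation that the report identifies as limited can be sketched as follows (all rates and person-years are invented): expected deaths come from applying general-population rates to the cohort's person-years in each age stratum, and the SMR is observed over expected.

```python
# Hedged sketch of a standardized mortality ratio (SMR). The age strata,
# reference rates, and cohort sizes below are illustrative, not Hanford data.

def smr(observed_deaths, person_years, reference_rates):
    """person_years and reference_rates are keyed by age stratum."""
    expected = sum(person_years[a] * reference_rates[a] for a in person_years)
    return observed_deaths / expected

person_years = {"40-49": 10_000, "50-59": 8_000, "60-69": 5_000}
rates = {"40-49": 0.002, "50-59": 0.005, "60-69": 0.012}  # deaths per person-year
# expected = 20 + 40 + 60 = 120 deaths
print(smr(96, person_years, rates))  # -> 0.8 (fewer deaths than expected)
```

    An SMR below 1 indicates fewer deaths than the general population would predict; the limitation noted above is that the external reference population may not be comparable to the worker cohort.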

  19. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft2 to 1,664 ft2 and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
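
    The core of any design-load method, MJ8 included, is a UA·ΔT sum plus infiltration. The sketch below is a hedged illustration of that core only; the U-values, areas, air-change rate, and design temperatures are invented, not CARB's monitored data or the Manual J procedure.

```python
# Hedged sketch of a steady-state design heat-loss estimate for a
# superinsulated envelope. All inputs are illustrative assumptions.

def design_heat_loss(ua_items, ach, volume_ft3, t_in=70.0, t_out=-3.0):
    """ua_items: [(U in Btu/h·ft2·F, area in ft2)]; ach: air changes per hour."""
    dt = t_in - t_out
    envelope = sum(u * a for u, a in ua_items) * dt
    # 0.018 Btu/(ft3·F) is the approximate volumetric heat capacity of air.
    infiltration = 0.018 * ach * volume_ft3 * dt
    return envelope + infiltration   # Btu/h

ua = [(0.015, 1200),   # superinsulated walls
      (0.010, 800),    # ceiling
      (0.180, 150)]    # triple-glazed windows
load = design_heat_loss(ua, ach=0.05, volume_ft3=12_000)
print(round(load))     # a few thousand Btu/h -- small for a whole house
```

    Note what this steady-state sum ignores: thermal inertia and internal gains, exactly the terms the study found should be considered to avoid oversizing.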

  20. A framework to analyze emissions implications of ...

    Science.gov (United States)

    Future year emissions depend highly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal change, while considering existing regulations and future uncertainty in regulations, and to evaluate the resulting emissions growth patterns. The framework integrates EPA’s energy systems model with an economic Input-Output (I/O) Life Cycle Assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the US economy is constructed to link up with MARKAL. The I/O model enables the user to change input requirements (e.g. energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end-users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
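
    The I/O linkage described above can be illustrated with a hedged two-sector Leontief sketch (coefficients and demands are invented; the real model is far larger): total output x solves x = Ax + d, so changing d (consumer preferences) or A (e.g. energy intensity) changes the sectoral activity that drives energy demand in MARKAL.

```python
# Hedged sketch of a 2-sector Leontief input-output model, solved by Cramer's
# rule for (I - A) x = d. All coefficients and final demands are illustrative.

def leontief_output(a, d):
    """Total output x for a 2x2 technical-coefficient matrix A and demand d."""
    m = [[1 - a[0][0], -a[0][1]],
         [-a[1][0], 1 - a[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [(m[1][1] * d[0] - m[0][1] * d[1]) / det,
            (m[0][0] * d[1] - m[1][0] * d[0]) / det]

A = [[0.2, 0.3],   # inputs of sector 1 per unit output of sectors 1 and 2
     [0.4, 0.1]]   # inputs of sector 2 per unit output of sectors 1 and 2
demand = [100.0, 50.0]
x = leontief_output(A, demand)
print([round(v, 1) for v in x])  # -> [175.0, 133.3]
```

    Total output exceeds final demand because each sector consumes the other's output as an intermediate input; raising a coefficient in A (higher energy intensity) raises the output, and hence the emissions, needed to satisfy the same demand.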

  1. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory and the loss of performance due to many more simultaneous users are shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator

  2. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    Full Text Available The article under the title "ANALYSIS OF THE PENSION SYSTEM OF THE USSR" deals with numerous aspects of the development of the pension system of the former USSR. Since the improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is essential in order to create a sound and efficient pension system in Russia. The study presented in the article aims to execute an in-depth analysis of legislation on the Soviet pension system with a view to recreating the architecture of the pension system of the USSR. In addition, the study draws on the official statistics for the period in question to reach a qualified and fundamental conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, demonstrates the efficiency of the Soviet pension system. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the current pension system of the Russian Federation.

  3. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our initial results, after using content analysis, were poor due to spontaneous recall. Therefore, we repeated the study in a more focused way. The information gathered this time allowed us not only to better understand how all these services are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  4. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

    Full Text Available Purpose: Studying and analyzing the undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students were surveyed in two undergraduate engineering programs to discover their leadership skills. The results in both programs revealed that undergraduate engineering students lag behind in visionary leadership skills compared to the directing, including, and cultivating leadership styles. Recommendation: A practical framework has been proposed to enhance the lacking leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and design a virtual simulation environment targeting the lacking leadership skills, which in this case are the visionary leadership skills. After that, the virtual simulation will be used to provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  5. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to most relevant established algorithms, PSAIA offers a new method PIADA (Protein Interaction Atom Distance Algorithm for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of location and type for protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of properties of large data sets led to new information useful in the prediction of protein-protein interaction sites.

  6. A Methodology to Analyze Photovoltaic Tracker Uptime

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Matthew T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Dan [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-17

    A metric is developed to analyze the daily performance of single-axis photovoltaic (PV) trackers. The metric relies on comparing correlations between the daily time series of the PV power output and an array of simulated plane-of-array irradiances for the given day. Mathematical thresholds and a logic sequence are presented so that the daily tracking metric can be applied in an automated fashion on large-scale PV systems. The results of applying the metric are visually examined against the time series of the power output data for a large number of days and for various systems. The visual inspection results suggest that, overall, the algorithm is accurate in identifying stuck or functioning trackers on clear-sky days. Visual inspection also shows that there are days not classified by the metric where the power output data may be sufficient to identify a stuck tracker. Based on the daily tracking metric, uptime results are calculated for 83 different inverters at 34 PV sites. The mean tracker uptime is calculated at 99% based on 2 different calculation methods. The daily tracking metric clearly has limitations, but as there are no existing metrics in the literature, it provides a valuable tool for flagging stuck trackers.
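The report gives the idea of the metric but this abstract does not reproduce its exact thresholds or logic sequence. A minimal sketch of the correlation comparison, in Python, with hypothetical profile names and an arbitrary classification margin (both assumptions, not the paper's values), might look like this:

```python
import numpy as np

def classify_tracker_day(power, poa_tracking, poa_stuck, margin=0.05):
    """Classify one day of single-axis tracker operation by correlating the
    measured power time series against two simulated plane-of-array (POA)
    irradiance profiles: one for a functioning tracker, one for a stuck
    (fixed-orientation) tracker. The margin value is illustrative only."""
    r_track = np.corrcoef(power, poa_tracking)[0, 1]
    r_stuck = np.corrcoef(power, poa_stuck)[0, 1]
    if r_track - r_stuck > margin:
        return "tracking"
    if r_stuck - r_track > margin:
        return "stuck"
    return "unclassified"  # e.g. heavily clouded or ambiguous days

# Synthetic clear-sky day: a tracking plant shows a flat-topped daily
# profile, while a stuck tracker sees a narrower bell-shaped profile.
hours = np.linspace(0.0, 1.0, 49)            # fraction of daylight period
bell = np.sin(np.pi * hours)                 # stuck / fixed-orientation
flat_top = np.clip(1.6 * bell, 0.0, 0.9)     # functioning tracker
rng = np.random.default_rng(0)
power = 4.0 * flat_top + rng.normal(0.0, 0.05, hours.size)

print(classify_tracker_day(power, flat_top, bell))  # -> tracking
```

On a cloudy day both correlations degrade and the margin test fails, which mirrors the paper's observation that some days are left unclassified.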

  7. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes, such as RELAP5 and TRAC-BWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experiment data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and of how it supported these analyses is the topic of this paper.

  8. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map.
We have implemented the

  9. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time algorithm (MTT) to analyze wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin, and in the last fifty years has experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests and natural areas, and agricultural production. Historical fire and environmental data for the period 1995-2009 were used as input to estimate fine-scale burn probability, conditional flame length, and potential fire size in the study area. For this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distribution of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92 × 10⁻³, with a mean value of 6.48 × 10⁻⁵. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in modeled outputs revealed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat terrain were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to inform wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean Basin.
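Burn probability estimated from many simulated fires reduces to a per-cell frequency: the fraction of the simulated events that burned each pixel. A toy sketch of that estimator (the function name and array layout are illustrative, not taken from the paper):

```python
import numpy as np

def burn_probability(burn_masks):
    """Estimate per-cell burn probability from a stack of simulated fires.

    burn_masks -- boolean array of shape (n_fires, rows, cols); True where
                  a simulated fire burned the cell.
    Returns the fraction of simulations in which each cell burned.
    """
    burn_masks = np.asarray(burn_masks, dtype=bool)
    return burn_masks.mean(axis=0)

# Toy landscape: 3 simulated fires on a 2x2 grid.
fires = np.array([
    [[1, 0], [0, 0]],
    [[1, 1], [0, 0]],
    [[1, 0], [1, 0]],
], dtype=bool)
bp = burn_probability(fires)
print(bp)   # cell (0, 0) burned in all 3 fires -> probability 1.0
```

With 100,000 simulated events, as in the study, the same one-line mean over the fire axis yields landscape-scale burn probability surfaces.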

  10. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely in academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc.) or non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also takes a different academic stance in his/her preferences regarding academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers, to improving the standard of research and education, and to boosting academia-industry collaboration. In particular, as open innovation increasingly needs the involvement of university researchers, to establish a successful approach to enticing researchers into enterprises' research, companies must comprehend the behavior of university researchers, who have multiple, complex motivations. The paper explores academic researchers' behavior by optimizing their utility functions, i.e. the satisfaction obtained from their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most of the previous research utilized empirical methods to study researchers' motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researcher behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researcher behavior. Keywords: Academia-Industry, researcher behavior, Ulrich model's 3C.

  11. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-off between gaining significantly greater computational speed and central memory and losing performance due to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full-scope nuclear power plant simulator.

  12. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user-defined, digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed.

  13. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
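The core decision rule described here is a likelihood-ratio test on the similarity scores. As a sketch only: the paper fits a joint probabilistic model over all 12 scores, whereas the example below makes the simplifying assumption of independent Gaussian score models with made-up parameters.

```python
import math

def gaussian_pdf(mu, sigma):
    """Return a Gaussian density function with the given mean and std."""
    def pdf(x):
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))
    return pdf

def likelihood_ratio(scores, genuine_pdfs, imposter_pdfs):
    """Likelihood ratio under independent per-biometric score models:
    product over scores of P(score | genuine) / P(score | imposter)."""
    lr = 1.0
    for s, g, i in zip(scores, genuine_pdfs, imposter_pdfs):
        lr *= g(s) / i(s)
    return lr

# Hypothetical score models: genuine scores cluster high, imposter low.
genuine = [gaussian_pdf(0.8, 0.1), gaussian_pdf(0.7, 0.15)]
imposter = [gaussian_pdf(0.3, 0.1), gaussian_pdf(0.35, 0.15)]

scores = [0.75, 0.72]
lr = likelihood_ratio(scores, genuine, imposter)
decision = "genuine" if lr > 1.0 else "imposter"
print(decision)  # -> genuine
```

In practice the decision threshold is not 1.0 but is tuned so the false accept rate meets its constraint; the paper additionally chooses *which* biometrics to acquire per resident, which this sketch omits.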

  14. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
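One simple ingredient of such entropic measures is the Shannon entropy of the label distribution itself, which is zero when all vertices carry the same label and grows as labels diversify. A minimal illustration (the paper's actual descriptors are more elaborate and combine labels with topology):

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy (in bits) of a vertex- or edge-label distribution.
    A basic building block of entropic descriptors for labeled graphs."""
    counts = Counter(labels)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Ethanol-like toy molecule: atom types serve as vertex labels.
atoms = ["C", "C", "O", "H", "H", "H", "H", "H", "H"]
print(round(label_entropy(atoms), 3))  # -> 1.224
```

An unlabeled graph corresponds to a single-label distribution with entropy 0, which is one way to see why labeled extensions of an index can discriminate (be "unique" for) more structures.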

  15. Thromboelastography platelet mapping in healthy dogs using 1 analyzer versus 2 analyzers.

    Science.gov (United States)

    Blois, Shauna L; Banerjee, Amrita; Wood, R Darren; Park, Fiona M

    2013-07-01

    The objective of this study was to describe the results of thromboelastography platelet mapping (TEG-PM) carried out using 2 techniques in 20 healthy dogs. Maximum amplitudes (MA) generated by thrombin (MAthrombin), fibrin (MAfibrin), adenosine diphosphate (ADP) receptor activity (MAADP), and thromboxane A2 (TxA2) receptor activity (stimulated by arachidonic acid, MAAA) were recorded. Thromboelastography platelet mapping was carried out according to the manufacturer's guidelines (2-analyzer technique) and using a variation of this method employing only 1 analyzer (1-analyzer technique) on 2 separate blood samples obtained from each dog. Mean [± standard deviation (SD)] MA values for the 1-analyzer/2-analyzer techniques were: MAthrombin = 51.9 mm (± 7.1)/52.5 mm (± 8.0); MAfibrin = 20.7 mm (± 21.8)/23.0 mm (± 26.1); MAADP = 44.5 mm (± 15.6)/45.6 mm (± 17.0); and MAAA = 45.7 mm (± 11.6)/45.0 mm (± 15.4). Mean (± SD) percentage aggregation due to ADP receptor activity was 70.4% (± 32.8)/67.6% (± 33.7). Mean percentage aggregation due to TxA2 receptor activity was 77.3% (± 31.6)/78.1% (± 50.2). Results of TEG-PM were not significantly different for the 1-analyzer and 2-analyzer methods. High correlation was found between the 2 methods for MAfibrin [concordance correlation coefficient (r) = 0.930]; moderate correlation was found for MAthrombin (r = 0.70) and MAADP (r = 0.57); correlation between the 2 methods for MAAA was lower (r = 0.32). Thromboelastography platelet mapping (TEG-PM) should be further investigated to determine if it is a suitable method for measuring platelet dysfunction in dogs with thrombopathy.
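The agreement statistic reported here (concordance correlation coefficient) can be computed directly from Lin's formula, which penalizes both scatter and systematic offset between the two techniques. A small sketch with hypothetical paired MA values (the numbers below are invented, not the study's data):

```python
def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements
    from two methods: 2*cov(x, y) / (var(x) + var(y) + (mean_x - mean_y)^2).
    Unlike Pearson's r, it measures agreement, not just linear association."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)

# Hypothetical paired MA values (mm) from 1-analyzer and 2-analyzer runs.
one_analyzer = [50.0, 55.0, 48.0, 60.0, 52.0]
two_analyzer = [51.0, 54.0, 47.5, 61.0, 53.0]
print(round(concordance_ccc(one_analyzer, two_analyzer), 3))  # -> 0.977
```

Identical measurements give a coefficient of exactly 1; a constant bias between analyzers lowers it even when Pearson's r stays at 1.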

  16. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  17. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and a BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers' instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI [17-34 kg/m2 (mean 26.6 ± 5)]. A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16), acceptable correlations were obtained for TBW (r = 0.99; p<0.01), ICW (r = 0.92; p<0.01), BCM (r = 0.68; p<0.01), and ECW (r = 0.96; p<0.05), but the group with BMI ≥25 (n=23) showed a discrepancy (lower correlations) for TBW (r = 0.82; p<0.05), ICW (r = 0.78; p<0.05), BCM (r = 0.52; p<0.05), and ECW (r = 0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.

  18. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
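Of the analysis capabilities listed, conditional sampling (capability 5) is the easiest to illustrate: select one variable's values wherever a second variable falls in a chosen range. A sketch with invented variable names; CMDA's actual interface is a web service, not this function:

```python
import numpy as np

def conditional_sample(x, y, y_low, y_high):
    """Conditional sampling: return the values of variable x at the
    times/places where a second variable y lies in [y_low, y_high)."""
    x = np.asarray(x)
    y = np.asarray(y)
    mask = (y >= y_low) & (y < y_high)
    return x[mask]

# Toy example: cloud fraction conditioned on warm SST (values invented).
sst = np.array([298.0, 301.5, 299.2, 302.3, 300.1])       # kelvin
cloud = np.array([0.6, 0.2, 0.5, 0.1, 0.3])               # fraction
warm = conditional_sample(cloud, sst, 300.0, 305.0)
print(warm.mean())   # mean cloud fraction over the warm-SST samples
```

The same masking pattern underlies the other listed capabilities (regional means, differences), with the mask built from space/time bounds instead of a second variable.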

  19. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data, to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run in a scheduled task or on demand by way of a config file. It will compute metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. 
As Seedscan is scheduled to run every night, data quality analysts are able to then use the
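The hash-comparison idea described for Seedscan, recompute a metric only when the data or metadata digest has changed, can be sketched generically. SHA-256 and the function names below are assumptions for illustration, not Seedscan's actual implementation:

```python
import hashlib

def digest(payload: bytes) -> str:
    """Content hash used to decide whether recalculation is needed."""
    return hashlib.sha256(payload).hexdigest()

def needs_recalculation(stored_digest: str, data: bytes) -> bool:
    """Recompute a metric only when the data's hash differs from the
    hash stored alongside the previously computed metric value."""
    return stored_digest != digest(data)

# One channel-day of (stand-in) waveform bytes and its stored digest.
day_of_data = b"miniSEED bytes for one channel-day"
stored = digest(day_of_data)

print(needs_recalculation(stored, day_of_data))          # unchanged -> False
print(needs_recalculation(stored, day_of_data + b"!"))   # changed   -> True
```

Storing the digest next to each metric value in the database is what lets a nightly run skip the (much more expensive) metric computation for unchanged channel-days.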

  20. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    To summarize the information, there is a data-grouping mechanism that provides, for different groups of records, the number of entries and the maximum, minimum, and average values. Results. This technology has been tested in monitoring demand for additional professional education services and in identifying the educational needs of teachers and executives of educational organizations in the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours. The survey was conducted over the course of a month. Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without programming, with flexible assignment of the form's operating logic.
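The grouping mechanism described (count, minimum, maximum, and average per group of records) can be sketched as follows; the record fields and municipality names are hypothetical:

```python
from collections import defaultdict

def group_summary(records, group_key, value_key):
    """Per-group count, minimum, maximum and average over a value field,
    mirroring the data-grouping mechanism described in the article."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[group_key]].append(rec[value_key])
    return {
        g: {"count": len(v), "min": min(v), "max": max(v),
            "avg": sum(v) / len(v)}
        for g, v in groups.items()
    }

# Hypothetical respondent records from a municipality survey.
records = [
    {"municipality": "Irkutsk", "score": 4},
    {"municipality": "Irkutsk", "score": 2},
    {"municipality": "Bratsk",  "score": 5},
]
print(group_summary(records, "municipality", "score"))
```

In a relational setting the same summary is a single GROUP BY query; the in-memory version above just makes the aggregation logic explicit.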

  1. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. Basically all organisms on our planet generate energy through the Krebs Cycle, explains Mike Flynn, research scientist at NASA's Ames Research Center. 
This metabolic process breaks down sugars for energy

  2. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    Science.gov (United States)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurement angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions, and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. Angular measurements beyond these only spread the total dose across the measurements without improving or worsening the CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using the CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging, and scatter diffraction enhanced imaging estimation techniques
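
The CRLB idea above can be made concrete with a hedged numeric sketch: model the angular intensity profile (rocking curve) as a Gaussian whose amplitude, center, and width stand in for absorption, refraction, and scatter, assume Poisson counting noise, and bound any unbiased estimator's variance by the inverse Fisher information. The model and numbers here are illustrative, not the paper's actual formulation.

```python
# Illustrative CRLB computation for a Gaussian rocking-curve model under
# Poisson noise. Parameters (amp, center, width) are stand-ins for the
# absorption/refraction/scatter parametric images discussed above.
import numpy as np

def crlb_gaussian_profile(angles, amp, center, width):
    mu = amp * np.exp(-(angles - center) ** 2 / (2 * width ** 2))
    # Partial derivatives of the mean counts with respect to each parameter.
    d_amp = mu / amp
    d_center = mu * (angles - center) / width ** 2
    d_width = mu * (angles - center) ** 2 / width ** 3
    J = np.stack([d_amp, d_center, d_width], axis=1)   # Jacobian, shape (K, 3)
    fisher = (J / mu[:, None]).T @ J                    # Poisson Fisher information
    return np.diag(np.linalg.inv(fisher))               # CRLB per parameter

angles = np.linspace(-3.0, 3.0, 11)   # eleven analyzer positions, as in the abstract
bound = crlb_gaussian_profile(angles, amp=1e4, center=0.0, width=1.0)
```

Doubling the per-measurement flux doubles the Fisher information for the center parameter, so its variance bound halves, which is the kind of intensity/dose trade-off the paper quantifies.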

  3. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurement angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions, and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. Angular measurements beyond these only spread the total dose across the measurements without improving or worsening the CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using the CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques

  4. Comparative evaluation of Plateletworks, Multiplate analyzer and Platelet function analyzer-200 in cardiology patients.

    Science.gov (United States)

    Kim, Jeeyong; Cho, Chi Hyun; Jung, Bo Kyeung; Nam, Jeonghun; Seo, Hong Seog; Shin, Sehyun; Lim, Chae Seung

    2018-04-14

    The objective of this study was to comparatively evaluate three commercial whole-blood platelet function analyzer systems: Platelet Function Analyzer-200 (PFA; Siemens Canada, Mississauga, Ontario, Canada), Multiplate analyzer (MP; Roche Diagnostics International Ltd., Rotkreuz, Switzerland), and Plateletworks Combo-25 kit (PLW; Helena Laboratories, Beaumont, TX, USA). Venipuncture was performed on 160 patients who visited a department of cardiology. Pairwise agreement among the three platelet function assays was assessed using Cohen's kappa coefficient and percent agreement within the reference limit. Kappa values with the same agonists were poor between PFA-collagen (COL; agonist)/adenosine diphosphate (ADP) and MP-ADP (-0.147), PFA-COL/ADP and PLW-ADP (0.089), MP-ADP and PLW-ADP (0.039), PFA-COL/ADP and MP-COL (-0.039), and between PFA-COL/ADP and PLW-COL (-0.067). Nonetheless, kappa values for the same assay principle with a different agonist were slightly higher between PFA-COL/ADP and PFA-COL/EPI (0.352), MP-ADP and MP-COL (0.235), and between PLW-ADP and PLW-COL (0.247). The range of percent agreement values was 38.7% to 73.8%. Therefore, given the low kappa coefficients and modest percent agreement rates among the three platelet function tests, measurement of platelet function by more than one method is needed to obtain a reliable interpretation.
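
Cohen's kappa, used above for pairwise agreement, corrects the observed agreement rate for the agreement expected by chance. A minimal sketch with illustrative labels (dichotomized as normal/abnormal), not the study's data:

```python
# Cohen's kappa for two raters/assays over the same samples:
# kappa = (observed agreement - chance agreement) / (1 - chance agreement).
# The example labels are illustrative, not the study's measurements.

def cohens_kappa(labels_a, labels_b):
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    categories = sorted(set(labels_a) | set(labels_b))
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    expected = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

a = ["abn", "abn", "norm", "norm", "norm", "abn", "norm", "norm"]
b = ["abn", "norm", "norm", "norm", "abn", "abn", "norm", "norm"]
kappa = cohens_kappa(a, b)
```

Values near 0 (or below) like those reported above mean agreement is hardly better than chance, which is why the authors recommend more than one method.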

  5. Development of remote controlled electron probe micro analyzer with crystal orientation analyzer

    International Nuclear Information System (INIS)

    Honda, Junichi; Matsui, Hiroki; Harada, Akio; Obata, Hiroki; Tomita, Takeshi

    2012-07-01

    Advanced utilization of Light Water Reactor (LWR) fuel is progressing in Japan to reduce power generation costs and the volume of nuclear waste. The electric power companies have continued to pursue burnup extension and thermal power uprating of commercial fuel. The government needs to accumulate detailed information on the newest technologies to establish regulations and guidelines for the safety of advanced nuclear fuels. A remote controlled Electron Probe Micro Analyzer (EPMA) equipped with a crystal orientation analyzer has been developed at the Japan Atomic Energy Agency (JAEA) to study the behavior of high burnup fuels under accident conditions. The effects of the cladding microstructure on fuel behavior can be evaluated more conveniently and quantitatively with this EPMA. A commercial model of EPMA has been modified to be airtight and earthquake-resistant in compliance with the government safety regulations for handling highly radioactive elements. This paper describes the specifications of the EPMA, which were specialized for post-irradiation examination, and the results of cold mock-up tests that confirmed its performance and reliability. (author)

  6. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  7. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  8. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  9. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  10. Portable Programmable Multifunction Body Fluids Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both simple...

  11. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indirect pacemaker generator function analyzer... Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or...

  12. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  13. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  14. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of...

  15. Hardware Realization of an Ethernet Packet Analyzer Search Engine

    Science.gov (United States)

    2000-06-30

    specific for the home automation industry. This analyzer will be at the gateway of a network and analyze Ethernet packets as they go by. It will keep... home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The

  16. 40 CFR 86.1322-84 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3... columns is one form of corrective action which may be taken.) (b) Initial and periodic calibration. Prior... calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with...

  17. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
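
One common quality-control approach for automated analyzers, among the methods such a review surveys, is Levey-Jennings charting with Westgard rules. The sketch below flags the 1-3s rule (one control result beyond ±3 SD) and the 2-2s rule (two consecutive results beyond the same ±2 SD limit); the target and SD values are illustrative, and the abstract does not prescribe this exact rule set.

```python
# Hedged sketch of two Westgard QC rules applied to daily control results.
# Target/SD are hypothetical values for, e.g., hemoglobin (g/dL).

def westgard_flags(results, target, sd):
    z = [(x - target) / sd for x in results]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                       # 1-3s: single result beyond 3 SD
            flags.append((i, "1-3s"))
        if i >= 1 and z[i - 1] > 2 and zi > 2:    # 2-2s: two consecutive > +2 SD
            flags.append((i, "2-2s"))
        if i >= 1 and z[i - 1] < -2 and zi < -2:  # 2-2s: two consecutive < -2 SD
            flags.append((i, "2-2s"))
    return flags

flags = westgard_flags([14.0, 14.2, 15.1, 15.2, 12.2], target=14.0, sd=0.5)
```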

  18. Whole community genome amplification (WCGA) leads to compositional bias in methane oxidizing communities as assessed by pmoA based microarray analyses and QPCR

    NARCIS (Netherlands)

    Bodelier, P.L.E.; Kamst, M.; Meima-Franke, M.; Stralis-Pavese, N.; Bodrossy, L.

    2009-01-01

    Whole-genome amplification (WGA) using multiple displacement amplification (MDA) has recently been introduced to the field of environmental microbiology. The amplification of single-cell genomes or whole-community metagenomes decreases the minimum amount of DNA needed for subsequent molecular

  19. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    OpenAIRE

    Jaehyo Jung; Jihoon Lee; Siho Shin; Youn Tae Kim

    2017-01-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signa...

  20. Faraday cup for analyzing multi-ion plasma

    International Nuclear Information System (INIS)

    Fujita, Takao

    1987-01-01

    A compact and convenient ion analyzer (a kind of a Faraday cup) is developed in order to analyze weakly ionized multi-ion plasmas. This Faraday cup consists of three mesh electrodes and a movable ion collector. With a negative gate pulse superimposed on the ion retarding bias, ions are analyzed by means of time-of-flight. The identification of ion species and measurements of ion density and ion temperature are studied. (author)
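
The time-of-flight identification described above rests on a simple relation: an ion of mass m and charge q accelerated through potential V drifts a length L in time t = L·sqrt(m / (2qV)), so heavier species arrive later. A sketch with illustrative numbers, not the paper's apparatus parameters:

```python
# Hedged sketch of ion species identification by time-of-flight.
# Acceleration voltage and drift length below are hypothetical.
import math

E_CHARGE = 1.602e-19   # elementary charge, C
AMU = 1.661e-27        # atomic mass unit, kg

def flight_time(mass_amu, charge_e, accel_volts, drift_m):
    """Drift time of an ion after acceleration through accel_volts."""
    v = math.sqrt(2 * charge_e * E_CHARGE * accel_volts / (mass_amu * AMU))
    return drift_m / v

t_he = flight_time(4.0, 1, 100.0, 0.2)    # singly ionized helium
t_ar = flight_time(40.0, 1, 100.0, 0.2)   # singly ionized argon
```

Since t scales as sqrt(m) at fixed charge and voltage, argon (mass 40) arrives sqrt(10) times later than helium (mass 4), which is what lets the gated collector separate the species.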

  1. L G-2 Scintrex manual.Fluorescence analyzer

    International Nuclear Information System (INIS)

    Pirelli, H.

    1987-01-01

    The Scintrex LG-2 Fluorescence Analyzer selectively detects the presence of certain fluorescent minerals through induced UV photoluminescence and provides quantitative information on their distribution.

  2. 40 CFR 91.314 - Analyzer accuracy and specifications.

    Science.gov (United States)

    2010-07-01

    .... (3) Zero drift. The analyzer zero-response drift during a one-hour period must be less than two percent of full-scale chart deflection on the lowest range used. The zero-response is defined as the mean... calibration or span gas. (2) Noise. The analyzer peak-to-peak response to zero and calibration or span gases...

  3. A data mining approach to analyze occupant behavior motivation

    NARCIS (Netherlands)

    Ren, X.; Zhao, Y.; Zeiler, W.; Boxem, G.; Li, T.

    2017-01-01

    Occupants' behavior can have a significant impact on the performance of the built environment, yet methods of analyzing occupant behavior have not been adequately developed. Traditional methods such as surveys and interviews are not efficient. This study proposed a data-driven method to analyze the

  4. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... any flow rate into the reaction chamber. This includes, but is not limited to, sample capillary, ozone... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new...

  5. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  6. Control of a pulse height analyzer using an RDX workstation

    International Nuclear Information System (INIS)

    Montelongo, S.; Hunt, D.N.

    1984-12-01

    The Nuclear Chemistry Division of Lawrence Livermore National Laboratory is in the midst of upgrading its radiation counting facilities to automate data acquisition and quality control. This upgrade requires control of a pulse height analyzer (PHA) from an interactive LSI-11/23 workstation running RSX-11M. The PHA is a microcomputer-based multichannel analyzer system providing data acquisition, storage, display, manipulation, and input/output from up to four independent acquisition interfaces. Control of the analyzer includes reading and writing energy spectra, issuing commands, and servicing device interrupts. The analyzer communicates with the host system over a 9600-baud serial line using the Digital Data Communications Message Protocol (DDCMP). We relieved the RSX workstation CPU of the DDCMP overhead by implementing an in-house designed, DEC-compatible DMA serial line board (the ISL-11) to communicate with the analyzer. An RSX I/O device driver was written to complete the path between the analyzer and the RSX system by providing the link between the communication board and an application task. The I/O driver is written to handle several ISL-11 cards all operating in parallel, thus providing support for control of multiple analyzers from a single workstation. The RSX device driver, its design and use by application code controlling the analyzer, and its operating environment will be discussed
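
DDCMP protects its messages with CRC-16 (polynomial x^16 + x^15 + x^2 + 1). The sketch below shows that checksum in its common reflected form (init 0, reflected polynomial 0xA001); the framing details of the setup above are not given in the abstract, so only the CRC itself is illustrated, over a hypothetical payload.

```python
# Hedged sketch of the CRC-16 used by DDCMP (poly x^16 + x^15 + x^2 + 1),
# bit-reflected implementation with initial value 0. Payload is hypothetical.

def crc16(data: bytes) -> int:
    crc = 0x0000
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001   # reflected 0x8005 polynomial
            else:
                crc >>= 1
    return crc

payload = b"spectrum block"
check = crc16(payload)
# Appending the CRC low byte first makes the CRC of the whole frame zero,
# which lets a receiver validate a message in a single pass.
frame = payload + bytes([check & 0xFF, check >> 8])
```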

  7. Analyzing FCS Professionals in Higher Education: A Case Study

    Science.gov (United States)

    Hall, Scott S.; Harden, Amy; Pucciarelli, Deanna L.

    2016-01-01

    A national study of family and consumer sciences (FCS) professionals in higher education was analyzed as a case study to illustrate procedures useful for investigating issues related to FCS. The authors analyzed response rates of more than 1,900 FCS faculty and administrators by comparing those invited to participate and the 345 individuals who…

  8. A multichannel analyzer computer system for simultaneously measuring 64 spectra

    International Nuclear Information System (INIS)

    Jin Yuheng; Wan Yuqing; Zhang Jiahong; Li Li; Chen Guozhu

    2000-01-01

    The author introduces a multichannel analyzer computer system for simultaneously measuring 64 spectra with 64 coded independent inputs. The system is developed for a double-chopper neutron scattering time-of-flight spectrometer. The system structure, coding method, operating principle and performance are presented. The system can also be used for other nuclear physics experiments which need a multichannel analyzer with independent coded inputs.
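
The coded-input idea can be sketched simply: each event carries an input code (0-63) identifying its source plus a pulse-height channel, and the system accumulates one histogram (spectrum) per input. The sizes and event tuples below are illustrative, not the system's actual format.

```python
# Hedged sketch of per-input spectrum accumulation for a 64-input
# multichannel analyzer. Channel count and event encoding are hypothetical.

N_INPUTS, N_CHANNELS = 64, 1024

def accumulate(events):
    """events: iterable of (input_code, pulse_height_channel) tuples."""
    spectra = [[0] * N_CHANNELS for _ in range(N_INPUTS)]
    for input_code, channel in events:
        if 0 <= input_code < N_INPUTS and 0 <= channel < N_CHANNELS:
            spectra[input_code][channel] += 1
    return spectra

events = [(0, 100), (0, 100), (63, 512), (7, 0)]
spectra = accumulate(events)
```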

  9. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with the computer-based multichannel analyzer, a new kind of browser-based multichannel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed

  10. A Morphological Analyzer for Vocalized or Not Vocalized Arabic Language

    Science.gov (United States)

    El Amine Abderrahim, Med; Breksi Reguig, Fethi

    This research shows the realization of a morphological analyzer for the Arabic language (vocalized or unvocalized). This analyzer is based upon our object model for Arabic Natural Language Processing (NLP) and can be exploited by NLP applications such as machine translation, orthographic correction, and information retrieval.

  11. Analyzing Population Genetics Data: A Comparison of the Software

    Science.gov (United States)

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  12. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 90.318 Section 90.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the...

  13. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 91.318 Section 91.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of...

  14. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 89.321 Section 89.321 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent...

  15. THE EXPERIENCE OF COMPARISON OF STATIC SECURITY CODE ANALYZERS

    Directory of Open Access Journals (Sweden)

    Alexey Markov

    2015-09-01

    This work presents a methodological approach to the comparison of static security code analyzers. It substantiates the comparison of static analyzers with respect to efficiency and functionality indicators, which are stipulated in international regulatory documents. The test data for assessing static analyzer efficiency are synthetic sets of open-source software which contain vulnerabilities. We substantiated certain criteria for quality assessment of static security code analyzers subject to the standards NIST SP 500-268 and SATEC. We carried out experiments that allowed us to assess a number of Russian proprietary software tools and open-source tools. We came to the conclusion that it is of paramount importance to develop a Russian regulatory framework for testing software security (firstly, for controlling undocumented features and evaluating the quality of static security code analyzers).
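
When analyzers are run over synthetic suites with known seeded vulnerabilities, efficiency is commonly summarized by precision and recall over the reported findings. A generic sketch under that assumption; the study's actual indicators follow NIST SP 500-268 / SATEC and may differ, and the file/line pairs below are hypothetical.

```python
# Hedged sketch: score an analyzer's findings against the known (seeded)
# vulnerabilities of a synthetic test set, keyed by (file, line).

def precision_recall(reported, seeded):
    reported, seeded = set(reported), set(seeded)
    true_pos = len(reported & seeded)
    precision = true_pos / len(reported) if reported else 0.0
    recall = true_pos / len(seeded) if seeded else 0.0
    return precision, recall

seeded = {("a.c", 10), ("a.c", 42), ("b.c", 7), ("c.c", 3)}
reported = {("a.c", 10), ("b.c", 7), ("d.c", 99)}   # one false positive
p, r = precision_recall(reported, seeded)
```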

  16. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). This analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated and escaping. The direct-current (DC) beams of the K-band space TWTs with the MDC removed can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The slope of the current curve caused by the reflection of secondary electrons from the analyzer's copper collector is discussed. The experimental analysis shows this RFEA has a good energy resolution and satisfies the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The effect of secondary electrons reflected from the analyzer's copper collector is discussed.
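
A retarding field analyzer yields a collector current I(V) that falls as the retarding voltage V cuts off progressively slower electrons; the beam's energy distribution is proportional to -dI/dV. A hedged numerical sketch with a synthetic, error-function-like I-V curve centered near the 6850 V helix voltage mentioned above, not measured data from the K-band TWT:

```python
# Hedged sketch: recover an energy distribution from a retarding-curve
# measurement as -dI/dV via central differences. The I-V data are synthetic.
import math

def energy_distribution(voltages, currents):
    """Central-difference estimate of -dI/dV at interior sample points."""
    dist = []
    for i in range(1, len(voltages) - 1):
        dIdV = (currents[i + 1] - currents[i - 1]) / (voltages[i + 1] - voltages[i - 1])
        dist.append((voltages[i], -dIdV))
    return dist

# Synthetic retarding curve: current drops around 6850 V as the retarding
# potential cuts off the beam (illustrative 30 V energy spread).
volts = [6700 + 10 * k for k in range(31)]                 # 6700..7000 V
curr = [0.5 * (1 - math.erf((v - 6850) / 30)) for v in volts]
dist = energy_distribution(volts, curr)
peak_v = max(dist, key=lambda p: p[1])[0]
```

The peak of -dI/dV lands at the beam energy, here 6850 eV for singly charged electrons retarded at 6850 V.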

  17. Effect of nutrient and selective inhibitor amendments on methane oxidation, nitrous oxide production, and key gene presence and expression in landfill cover soils: characterization of the role of methanotrophs, nitrifiers, and denitrifiers.

    Science.gov (United States)

    Lee, Sung-Woo; Im, Jeongdae; Dispirito, Alan A; Bodrossy, Levente; Barcelona, Michael J; Semrau, Jeremy D

    2009-11-01

    Methane and nitrous oxide are both potent greenhouse gases, with global warming potentials approximately 25 and 298 times that of carbon dioxide. A matrix of soil microcosms was constructed with landfill cover soils collected from the King Highway Landfill in Kalamazoo, Michigan and exposed to geochemical parameters known to affect methane consumption by methanotrophs while also examining their impact on biogenic nitrous oxide production. It was found that relatively dry soils (5% moisture content) along with 15 mg NH4+ (kg soil)^-1 and 0.1 mg phenylacetylene (kg soil)^-1 provided the greatest stimulation of methane oxidation while minimizing nitrous oxide production. Microarray analyses of pmoA showed that the methanotrophic community structure was dominated by Type II organisms, but Type I genera were more evident with the addition of ammonia. When phenylacetylene was added in conjunction with ammonia, the methanotrophic community structure was more similar to that observed in the presence of no amendments. PCR analyses showed the presence of amoA from both ammonia-oxidizing bacteria and archaea, and that the presence of key genes associated with these cells was reduced with the addition of phenylacetylene. Messenger RNA analyses found transcripts of pmoA, but not of mmoX, nirK, norB, or amoA from either ammonia-oxidizing bacteria or archaea. Pure culture analyses showed that methanotrophs could produce significant amounts of nitrous oxide, particularly when expressing the particulate methane monooxygenase (pMMO). Collectively, these data suggest that methanotrophs expressing pMMO played a role in nitrous oxide production in these microcosms.

  18. Analysis and discussion on the experimental data of electrolyte analyzer

    Science.gov (United States)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    In the subsequent verification of electrolyte analyzers, we found that the instruments achieve good repeatability and stability in repeated measurements over a short period of time, in line with the verification regulation's requirements for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of different manufacturers differ considerably. In order to identify and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.
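
    The contrast the abstract draws, good repeatability but large indication error, can be illustrated with two simple statistics: the coefficient of variation of replicate readings, and the bias of their mean against a reference value. A sketch with hypothetical potassium readings (the function names and data are illustrative, not from the study):

```python
import statistics

def repeatability_cv(readings):
    """Coefficient of variation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(readings) / statistics.mean(readings)

def indication_error(readings, reference):
    """Relative error (%) of the mean reading against a reference value."""
    return 100.0 * (statistics.mean(readings) - reference) / reference

# Hypothetical K+ readings (mmol/L) against a 5.00 mmol/L reference material:
k_readings = [5.21, 5.19, 5.23, 5.20, 5.22]
print(round(repeatability_cv(k_readings), 2))        # tight spread: repeatable
print(round(indication_error(k_readings, 5.00), 1))  # → 4.2 (large bias)
```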

  19. Transit time spreads in biased paracentric hemispherical deflection analyzers

    International Nuclear Information System (INIS)

    Sise, Omer; Zouros, Theo J.M.

    2016-01-01

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  20. Transit time spreads in biased paracentric hemispherical deflection analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Sise, Omer, E-mail: omersise@sdu.edu.tr [Dept. of Science Education, Faculty of Education, Suleyman Demirel Univ., 32260 Isparta (Turkey); Zouros, Theo J.M. [Dept. of Physics, Univ. of Crete, P.O. Box 2208, GR 71003 Heraklion (Greece); Tandem Lab, INPP, NCSR Demokritos, P.O. Box 60228, GR 15310 Ag. Paraskevi (Greece)

    2016-02-15

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  1. Transit time spreads in biased paracentric hemispherical deflection analyzers

    Science.gov (United States)

    Sise, Omer; Zouros, Theo J. M.

    2016-02-01

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  2. Emergency response training with the BNL plant analyzer

    International Nuclear Information System (INIS)

    Cheng, H.S.; Guppy, J.G.; Mallen, A.N.; Wulff, W.

    1987-01-01

    Presented is the experience gained in using the BNL Plant Analyzer for NRC emergency response training on simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for emergency response training are summarized. A closed-loop simulation of all the key systems of the power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster-than-real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of emergency response training.

  3. Intermittency in multiparticle production analyzed by means of stochastic theories

    International Nuclear Information System (INIS)

    Bartl, A.; Suzuki, N.

    1990-01-01

    Intermittency in multiparticle production is described by means of probability distributions derived from pure birth stochastic equations. The UA1, TASSO, NA22 and cosmic ray data are analyzed. 24 refs., 1 fig. (Authors)

  4. Automated Real-Time Clearance Analyzer (ARCA), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The Automated Real-Time Clearance Analyzer (ARCA) addresses the future safety need for Real-Time System-Wide Safety Assurance (RSSA) in aviation and progressively...

  5. Triple Isotope Water Analyzer for Extraplanetary Studies, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  6. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults. The faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  7. Mini Total Organic Carbon Analyzer (miniTOCA)

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this development is to create a prototype hand-held, 1 to 2 liter size battery-powered Total Organic Carbon Analyzer (TOCA). The majority of...

  8. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  9. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    International Nuclear Information System (INIS)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V.

    1994-01-01

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented

  10. Analyzing radial acceleration with a smartphone acceleration sensor

    Science.gov (United States)

    Vogt, Patrik; Kuhn, Jochen

    2013-03-01

    This paper continues the sequence of experiments using the acceleration sensor of smartphones (for description of the function and the use of the acceleration sensor, see Ref. 1) within this column, in this case for analyzing the radial acceleration.

  11. The Photo-Pneumatic CO2 Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  12. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  13. Methyl-Analyzer--whole genome DNA methylation profiling.

    Science.gov (United States)

    Xin, Yurong; Ge, Yongchao; Haghighi, Fatemeh G

    2011-08-15

    Methyl-Analyzer is a Python package that analyzes genome-wide DNA methylation data produced by the Methyl-MAPS (methylation mapping analysis by paired-end sequencing) method. Methyl-MAPS is an enzymatic-based method that uses both methylation-sensitive and -dependent enzymes covering >80% of CpG dinucleotides within mammalian genomes. It combines enzymatic-based approaches with high-throughput next-generation sequencing technology to provide whole genome DNA methylation profiles. Methyl-Analyzer processes and integrates sequencing reads from methylated and unmethylated compartments and estimates CpG methylation probabilities at single base resolution. Methyl-Analyzer is available at http://github.com/epigenomics/methylmaps. A sample dataset is available for download at http://epigenomicspub.columbia.edu/methylanalyzer_data.html. Contact: fgh3@columbia.edu. Supplementary data are available at Bioinformatics online.
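
    As a rough illustration of estimating a per-CpG methylation probability from read counts in the two compartments (this is not Methyl-Analyzer's actual algorithm, just a Laplace-smoothed read-count ratio with an assumed pseudocount):

```python
def methylation_probability(meth_reads: int, unmeth_reads: int,
                            pseudocount: float = 1.0) -> float:
    """Estimate the probability that a CpG site is methylated from the
    number of reads covering it in the methylated and unmethylated
    compartments (smoothed read-count ratio; illustrative only)."""
    total = meth_reads + unmeth_reads + 2 * pseudocount
    return (meth_reads + pseudocount) / total

# A site covered by 18 methylated- and 2 unmethylated-compartment reads:
print(round(methylation_probability(18, 2), 2))  # → 0.86
```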

  14. Analyzed method for calculating the distribution of electrostatic field

    International Nuclear Information System (INIS)

    Lai, W.

    1981-01-01

    An analytical method for calculating the distribution of the electrostatic field under any given axial gradient in tandem accelerators is described. The method possesses satisfactory accuracy compared with the results of numerical calculation.

  15. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  16. NRC nuclear-plant-analyzer concept and status at INEL

    International Nuclear Information System (INIS)

    Aguilar, F.; Wagner, R.J.

    1982-01-01

    The Office of Research of the US NRC has proposed development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes how we at the INEL envision the nuclear-plant analyzer. The paper also describes a pilot RELAP5 plant-analyzer project completed during the past year, as well as current work. A great deal of analysis is underway to determine nuclear-steam-system response. System transient analysis being so complex, there is a need to present analytical results in a way that makes the interconnections among phenomena and all the nuances of the transient apparent. There is also a need for the analyst to dynamically control system calculations to simulate plant operation in order to perform "what if" studies, as well as a need to perform system analysis within hours of a plant emergency to diagnose the state of the stricken plant and formulate recovery actions. The NRC-proposed nuclear-plant analyzer can meet these needs.

  17. AmAMorph: Finite State Morphological Analyzer for Amazighe

    Directory of Open Access Journals (Sweden)

    Fatima Zahra Nejme

    2016-03-01

    Full Text Available This paper presents AmAMorph, a morphological analyzer for the Amazighe language using a system based on the NooJ linguistic development environment. The paper begins with the development of Amazighe lexicons with large-coverage formalization. The built electronic lexicons, named 'NAmLex', 'VAmLex' and 'PAmLex' (for 'Noun Amazighe Lexicon', 'Verb Amazighe Lexicon' and 'Particles Amazighe Lexicon'), link inflectional, morphological, and syntactic-semantic information to the list of lemmas. Automated inflectional and derivational routines are applied to each lemma, producing the inflected forms. To our knowledge, AmAMorph is the first morphological analyzer for Amazighe. It identifies the component morphemes of the forms using large-coverage morphological grammars. Along with the description of how the analyzer is implemented, this paper gives an evaluation of the analyzer.
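
    The generate-then-look-up pattern described above can be sketched in miniature. This is a toy illustration only: the real AmAMorph compiles large-coverage NooJ grammars into finite-state machinery, whereas the lexicon and paradigm rule below are hypothetical and reduced to a single noun:

```python
# Toy generate-and-look-up morphological analysis (illustrative only).
LEXICON = {
    "argaz": ("noun", "masc"),  # Amazighe 'man'
}
PARADIGMS = {
    # Hypothetical rule set: tag -> surface-form builder.
    "masc": {"sg": lambda s: s, "pl": lambda s: "i" + s[1:] + "en"},
}

def inflected_forms(lemma):
    """Generate every surface form of a lemma with its analysis."""
    pos, paradigm = LEXICON[lemma]
    return {PARADIGMS[paradigm][tag](lemma): (lemma, pos, tag)
            for tag in PARADIGMS[paradigm]}

# Analysis is then a reverse lookup over all generated forms:
FORMS = {form: analysis
         for lemma in LEXICON
         for form, analysis in inflected_forms(lemma).items()}

print(FORMS["irgazen"])  # → ('argaz', 'noun', 'pl')
```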

  18. Radiometric flow injection analysis with an ASIA (Ismatec) analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Win, N; San, K; Han, B; Myoe, K M [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-07-01

    Radiometric flow injection analysis of a radioactive (¹³¹I) sample is described. For the analysis, an ASIA (Ismatec) analyzer with a NaI(Tl) scintillation detector was used. (author) 5 refs.; 3 figs.

  19. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  20. Quality Performance of Drugs Analyzed in the Drug Analysis and ...

    African Journals Online (AJOL)

    ICT TEAM

    performance of drug samples analyzed therein. Previous reports have ... wholesalers, non-governmental organizations, hospitals, analytical ... a dispute concerning discharge of waste water ... Healthcare Industry in Kenya, December 2008.

  1. Generating and analyzing non-diffracting vector vortex beams

    CSIR Research Space (South Africa)

    Li, Y

    2013-08-01

    Full Text Available single order Bessel beam and superposition cases are studied. The polarization and the azimuthal modes of the generated beams are analyzed. The results of modal decompositions on polarization components are in good agreement with theory. We demonstrate...

  2. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  3. Analyzing Spread of Influence in Social Networks for Transportation Applications

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  4. Analyzing Spread of Influence in Social Networks for Transportation Application.

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  5. Airspace Analyzer for Assessing Airspace Directional Permeability, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level permeability...

  6. Giessen polarization facility. III. Multi-detector analyzing system

    Energy Technology Data Exchange (ETDEWEB)

    Krause, H H; Stock, R; Arnold, W; Berg, H; Huttel, E; Ulbricht, J; Clausnitzer, G [Giessen Univ. (Germany, F.R.). Strahlenzentrum

    1977-06-15

    An analyzing system with a PDP 11 computer and a digital multiplexer is described. It accepts signals from 16 detectors with individual ADCs simultaneously. For measurements of analyzing powers, the polarization of the ion beam can be switched to zero at a frequency of 1 kHz. The switching operation additionally controls the handling of the detector pulses. The software contains special programs for the analysis of polarization experiments.

  7. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

    The purpose of our work is to analyze travel websites, more exactly, whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism websites for the Romanian market have the features that we found listed on similar websites from France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  8. Tests of the Royce ultrasonic interface level analyzer

    International Nuclear Information System (INIS)

    WITWER, K.S.

    1999-01-01

    This document describes testing carried out in 1995 on the Royce Interface Level Analyzer. The testing was carried out in the 305 Bldg., Engineering Testing Laboratory, 300 Area. The Level Analyzer was shown to be able to effectively locate the solid-liquid interface layer of two different simulants under various conditions, and was able to do so after being irradiated with over 5 million rads of gamma radiation from a cobalt-60 source.

  9. Evaluation of haematology analyzer CELL-DYN 3700 SL

    Directory of Open Access Journals (Sweden)

    Enver Suljević

    2003-05-01

    Full Text Available Research on the parameters of the full blood count and differential white blood cell count is included in the program of all medical laboratories at the primary, secondary and tertiary health care levels. Today, all haematological tests are performed exclusively on haematology analyzers. The automation of haematology laboratories is a result of the huge demand for haematological testing, the need for timely issuing of haematological findings, and the possibility of using modern techniques. This work is an evaluation of the laser haematology analyzer Cell-Dyn 3700 SL. It investigates the reliability of test results through the following parameters: precision, accuracy, sensitivity and specificity of the determination methods. It also explores the influence of sample transfer and the correlation with the haematology analyzer MAXM Retti. The haematology parameters investigated are: white blood cells (WBC), neutrophils (NEU), lymphocytes (LYM), monocytes (MONO), eosinophils (EOS), basophils (BASO), red blood cells (RBC), haemoglobin (HGB), haematocrit (HCT), mean corpuscular volume (MCV), mean corpuscular haemoglobin (MCH), mean corpuscular haemoglobin concentration (MCHC), red cell distribution width (RDW), platelets (PLT), mean platelet volume (MPV), plateletcrit (PCT), and platelet distribution width (PDW). The results confirm that the precision of the analyzer fulfils the reproducibility requirements for the parameters WBC, RBC, HGB, MCV, MCH, MCHC, and PLT. The correlation coefficient values (r) obtained through statistical analysis, i.e. the linear regression results from the comparison of the two analyzers, are adequate except for MCHC (r = 0.64), which is in accordance with literature data. Accuracy was tested by comparing the haematology analyzer method with the microscopic differentiation method; the correlation coefficient results for granulocytes, lymphocytes and monocytes indicate the accuracy of the methods. The sensitivity and specificity parameters fulfil the analytical criteria. It is confirmed that the haematology analyzer Cell-Dyn 3700 SL is reliable for
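
    Method-comparison figures like the correlation coefficient r and the linear regression between two analyzers can be computed as follows. A sketch with synthetic paired white-cell counts, not the study's data:

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def linear_fit(x, y):
    """Least-squares slope and intercept of y regressed on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical paired WBC counts (10^9/L) from two analyzers:
wbc_a = [4.5, 6.2, 7.8, 9.1, 11.4]
wbc_b = [4.6, 6.0, 7.9, 9.3, 11.2]
print(round(pearson_r(wbc_a, wbc_b), 3))  # close to 1: methods agree
```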

  10. Magnetic systems for wide-aperture neutron polarizers and analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Gilev, A.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Pleshanov, N.K., E-mail: pnk@pnpi.spb.ru [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Bazarov, B.A.; Bulkin, A.P.; Schebetov, A.F. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Syromyatnikov, V.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Physical Department, St. Petersburg State University, Ulyanovskaya, 1, Petrodvorets, St. Petersburg 198504 (Russian Federation); Tarnavich, V.V.; Ulyanov, V.A. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation)

    2016-10-11

    Requirements on the field uniformity in neutron polarizers are analyzed in view of the fact that neutron polarizing coatings have been improved during the past decade. The design of magnetic systems that meet the new requirements is optimized by numerical simulations. Magnetic systems for wide-aperture multichannel polarizers and analyzers are represented, including (a) the polarizer to be built at channel 4-4′ of the reactor PIK (Gatchina, Russia) for high-flux experiments with a 100×150 mm² beam of polarized cold neutrons; (b) the fan analyzer covering a 150×100 mm² window of the detector at the Magnetism Reflectometer (SNS, ORNL, USA); (c) the polarizer and (d) the fan analyzer covering a 220×110 mm² window of the detector at the reflectometer NERO, which is transferred to PNPI (Russia) from HZG (Germany). Deviations of the field from the vertical did not exceed 2°. The polarizing efficiency of the analyzer at the Magnetism Reflectometer reached 99%, a record level for wide-aperture supermirror analyzers.

  11. The comparison of automated urine analyzers with manual microscopic examination for urinalysis automated urine analyzers and manual urinalysis

    OpenAIRE

    İnce, Fatma Demet; Ellidağ, Hamit Yaşar; Koseoğlu, Mehmet; Şimşek, Neşe; Yalçın, Hülya; Zengin, Mustafa Osman

    2016-01-01

    Objectives: Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. Design and methods: 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA), Dirui...

  12. Applications of Electrostatic Lenses to Electron Gun and Energy Analyzers

    International Nuclear Information System (INIS)

    Sise, O.

    2004-01-01

    Focal properties and geometries are given for several types of electrostatic lens systems commonly needed in electron impact studies. One type is an electron gun, which focuses electrons over a wide range of energies onto a fixed point, such as a target; the other type is an analyzer system, which focuses scattered electrons of variable energy onto a fixed position, such as the entrance plane of an analyzer. There are many different types and geometries of these lenses for controlling and focusing electron beams. In this presentation we discussed the criteria used for the design of the electrostatic lenses associated with the electron gun and energy analyzers, and determined the fundamental relationships between the operation and behaviour of multi-element electrostatic lenses containing five, six and seven elements. The focusing of the electron beam was achieved by applying suitable voltages to the series of lens elements. The design of the lens system for the electron gun was based on our requirements that the beam at the target have a small spot size and zero beam angle, that is, the afocal mode. For energy analyzer systems we considered the entrance of the hemispherical analyzer, which determines the energy of the electron beam, and discussed the focusing condition of this lens system.

  13. Health Services Cost Analyzing in Tabriz Health Centers 2008

    Directory of Open Access Journals (Sweden)

    Massumeh gholizadeh

    2015-08-01

    Full Text Available Background and objectives: Analyzing the cost of health services is an important management tool for evidence-based decision making in the health system. This study was conducted with the purpose of analyzing costs and identifying the proportion of different factors in the total cost of the health services provided in urban health centers in Tabriz. Material and Methods: This was a descriptive and analytic study. The Activity-Based Costing (ABC) method was used for the cost analysis. This cross-sectional survey analyzed and identified the proportion of different factors in the total cost of the health services provided in Tabriz urban health centers. The statistical population of this study comprised the urban community health centers in Tabriz. A multi-stage sampling method was used to collect data, and Excel software was used for data analysis. The results are described with tables and graphs. Results: The study results showed the share of different factors in the cost of various health services. Human resources (58%), physical space (8%), and medical equipment (1.3%) accounted for the largest portions of the expenditures of the health services in Tabriz urban health centers. Conclusion: Since human resources accounted for the highest portion of health service costs in Tabriz urban health centers, balancing workload with staff numbers, institutionalizing performance-based management, and using multidisciplinary staff may reduce the cost of services.
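
    Under Activity-Based Costing, the share each cost pool contributes is simply its fraction of the summed pool costs. A minimal sketch, with hypothetical pool values chosen to echo the reported proportions (58% staff, 8% space, 1.3% equipment):

```python
def cost_shares(cost_pools: dict) -> dict:
    """Percentage share of each cost pool in the total cost."""
    total = sum(cost_pools.values())
    return {name: 100.0 * cost / total for name, cost in cost_pools.items()}

# Hypothetical annual cost pools for one health centre (arbitrary currency):
pools = {"staff": 580_000, "physical_space": 80_000,
         "medical_equipment": 13_000, "other": 327_000}
shares = cost_shares(pools)
print(round(shares["staff"], 1))  # → 58.0
```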

  14. Test of a two-dimensional neutron spin analyzer

    International Nuclear Information System (INIS)

    Falus, Peter; Vorobiev, Alexei; Krist, Thomas

    2006-01-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at the Institute Laue Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline, followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and the small-angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area, and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.

  15. Test of a two-dimensional neutron spin analyzer

    Science.gov (United States)

    Falus, Péter; Vorobiev, Alexei; Krist, Thomas

    2006-11-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at the Institute Laue Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline, followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and the small-angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area, and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.

  16. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    Full Text Available This research study focused on the development of software that helps students design, write, validate and run their pseudocode in a semi Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer (lexer), a syntax analyzer (parser) using a recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters called lexemes. The lexemes are analyzed by the lexer, which matches patterns for valid tokens and passes them to the syntax analyzer. The syntax analyzer, or parser, takes those valid tokens and builds meaningful commands using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on a grammar rule specified by the researcher and expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and evaluates it to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO) 9126 software quality standards. The overall results of the evaluation, both white-box and black-box, were described as "excellent in terms of functionality, reliability, usability, efficiency, maintainability and portability".
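
    The lexer → parser → interpreter pipeline described above can be sketched for a tiny arithmetic grammar. This is an illustration of recursive descent over an EBNF-style grammar, not the study's pseudocode language:

```python
import operator
import re

# Lexer: turn source text into (kind, value) tokens.
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    tokens = []
    for num, op in TOKEN_RE.findall(src):
        tokens.append(("NUM", int(num)) if num else ("OP", op))
    tokens.append(("EOF", None))
    return tokens

# Recursive descent parser building an abstract syntax tree of nested
# tuples, one method per EBNF rule.
class Parser:
    def __init__(self, tokens):
        self.tokens, self.pos = tokens, 0

    def peek(self):
        return self.tokens[self.pos]

    def advance(self):
        tok = self.tokens[self.pos]
        self.pos += 1
        return tok

    def expr(self):    # expr = term {("+" | "-") term}
        node = self.term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            node = (self.advance()[1], node, self.term())
        return node

    def term(self):    # term = factor {("*" | "/") factor}
        node = self.factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            node = (self.advance()[1], node, self.factor())
        return node

    def factor(self):  # factor = NUM | "(" expr ")"
        kind, value = self.advance()
        if kind == "NUM":
            return value
        if (kind, value) == ("OP", "("):
            node = self.expr()
            if self.advance() != ("OP", ")"):
                raise SyntaxError("missing ')'")
            return node
        raise SyntaxError(f"unexpected token {value!r}")

# Interpreter: evaluate the abstract syntax tree.
OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def evaluate(node):
    if isinstance(node, int):
        return node
    op, left, right = node
    return OPS[op](evaluate(left), evaluate(right))

print(evaluate(Parser(tokenize("2 + 3 * (4 - 1)")).expr()))  # → 11
```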

  17. The Common Technique for Analyzing the Financial Results Report

    Directory of Open Access Journals (Sweden)

    Pasternak Maria M.

    2017-04-01

    Full Text Available The article is aimed at generalizing the theoretical approaches to the structure and elements of a technique for analyzing the Financial results report (Cumulative income report) and providing suggestions for its improvement. The current methods are analyzed, and the relevance of applying a common technique for such analysis is substantiated. A common technique for analyzing the Financial results report is proposed, which includes definition of the objectives and tasks of the analysis, its subjects and objects, and the sources of its information. The stages of such an analysis are allocated and described. The findings of the article can be used to theoretically substantiate and practically develop a technique for analyzing the Financial results report in the branches of the Ukrainian economy.

  18. Development of a low energy neutral analyzer (LENA). Final report

    International Nuclear Information System (INIS)

    Curtis, C.C.; Fan, C.Y.; Hsieh, K.C.; McCullen, J.D.

    1986-05-01

    A low energy neutral particle analyzer (LENA) has been developed at the University of Arizona to detect particles originating in the edge plasma of fusion reactors. LENA was designed to perform energy analysis and measure flux levels of neutrals having energies between 5 and 50 eV (with possible extension to 500 eV neutrals), and do this with 1 to 10 ms time resolution. The instrument uses hot filaments to produce a 10 mA diffusion electron beam which ionizes incoming neutrals in a nearly field free region so that their velocity distribution is nearly undisturbed. The resultant ions are energy analyzed in a hyperbolic electrostatic analyzer, and detected by an MCP detector. LENA has been installed and operated on the ALCATOR C tokamak at the MIT Plasma Fusion Center. Results to date are discussed. At present, the LENA exhibits excessive sensitivity to the extremely high ultraviolet photon flux emanating from the plasma. Measures to correct this are suggested

  19. Demonstration of analyzers for multimode photonic time-bin qubits

    Science.gov (United States)

    Jin, Jeongwan; Agne, Sascha; Bourgoin, Jean-Philippe; Zhang, Yanbao; Lütkenhaus, Norbert; Jennewein, Thomas

    2018-04-01

    We demonstrate two approaches for unbalanced interferometers as time-bin qubit analyzers for quantum communication, robust against mode distortions and polarization effects as expected from free-space quantum communication systems, including wavefront deformations, path fluctuations, pointing errors, and optical elements. Despite strong spatial and temporal distortions of the optical mode of a time-bin qubit, entangled with a separate polarization qubit, we verify entanglement using the Negative Partial Transpose, with a measured visibility of up to 0.85 ± 0.01. The robustness of the analyzers is further demonstrated for various angles of incidence up to 0.2°. The output of the interferometers is coupled into multimode fiber, yielding a high system throughput of 0.74. Therefore, these analyzers are suitable and efficient for quantum communication over multimode optical channels.

  20. Coherent error study in a retarding field energy analyzer

    International Nuclear Information System (INIS)

    Cui, Y.; Zou, Y.; Reiser, M.; Kishek, R.A.; Haber, I.; Bernal, S.; O'Shea, P.G.

    2005-01-01

    A novel cylindrical retarding electrostatic field energy analyzer for low-energy beams has been designed, simulated, and tested with electron beams of several keV, in which space charge effects play an important role. A cylindrical focusing electrode is used to overcome the beam expansion inside the device due to space-charge forces, beam emittance, etc. In this paper, we present the coherent error analysis for this energy analyzer using the beam envelope equation, including space charge and emittance effects. The study shows that this energy analyzer can achieve very high resolution (with a relative error of around 10⁻⁵) once the coherent errors are removed by applying proper focusing voltages. The theoretical analysis is compared with experimental results.

  1. Atmospheric analyzer, carbon monoxide monitor and toluene diisocyanate monitor

    Science.gov (United States)

    Shannon, A. V.

    1977-01-01

    The purpose of the atmospheric analyzer and the carbon monoxide and toluene diisocyanate monitors is to analyze the atmospheric volatiles and to monitor carbon monoxide and toluene diisocyanate levels in the cabin atmosphere of Skylab. The carbon monoxide monitor was used on Skylab 2, 3, and 4 to detect any carbon monoxide levels above 25 ppm. Air samples were taken once each week. The toluene diisocyanate monitor was used only on Skylab 2. The loss of a micrometeoroid shield following the launch of Skylab 1 resulted in overheating of the interior walls of the Orbital Workshop. A potential hazard existed from outgassing of an isocyanate derivative resulting from heat decomposition of the rigid polyurethane wall insulation. The toluene diisocyanate monitor was used to detect any polymer decomposition. The atmospheric analyzer was used on Skylab 4 because of a suspected leak in the Skylab cabin. An air sample was taken at the beginning, middle, and end of the mission.

  2. A cascading failure model for analyzing railway accident causation

    Science.gov (United States)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model describes the cascading process of a railway accident more accurately than previous models, and can quantitatively analyze the sensitivities and the influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and thereby reduce the occurrence of railway accidents.
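The load-redistribution mechanism described above can be sketched as a threshold-triggered cascade on a weighted causation graph: when a node's load exceeds the critical threshold it fails, and its load is passed to downstream nodes in proportion to the strength of the causal links. This is a minimal illustration, not the paper's actual model; the node names, weights, and threshold are invented.

```python
# Minimal sketch of threshold-triggered cascading failure on a weighted
# causation graph. A node whose load exceeds the threshold fails; its load
# is redistributed to not-yet-failed successors in proportion to the
# causal-link strengths (all values below are illustrative).

def cascade(loads, weights, threshold):
    """loads: {node: load}; weights: {(src, dst): causal strength}."""
    failed = set()
    frontier = [n for n, v in loads.items() if v > threshold]
    while frontier:
        node = frontier.pop()
        if node in failed:
            continue
        failed.add(node)
        out = {(s, d): w for (s, d), w in weights.items()
               if s == node and d not in failed}
        total = sum(out.values())
        for (_, dst), w in out.items():
            loads[dst] += loads[node] * w / total  # proportional share
            if loads[dst] > threshold:
                frontier.append(dst)
    return failed

loads = {"signal fault": 1.2, "driver error": 0.6, "collision": 0.3}
weights = {("signal fault", "driver error"): 0.7,
           ("signal fault", "collision"): 0.3,
           ("driver error", "collision"): 1.0}
print(cascade(loads, weights, threshold=1.0))
```

With these illustrative numbers the initial overload propagates through all three nodes, which is the kind of full-cascade scenario the simulated train collision represents.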

  3. Set-up with electrostatic analyzer for mass spectrometers

    International Nuclear Information System (INIS)

    Ivanov, V.P.; Sysoev, A.A.; Samsonov, G.A.

    1977-01-01

    An attachment with an electrostatic analyzer is suggested which, used in conjunction with a magnetic analyzer, implements double focusing of ion beams. The electrostatic analyzer is a cylindrical capacitor placed in a vacuum chamber. Apart from this, the attachment includes a vacuum pump, a nitrogen trap, a battery supply unit, single-beam ion receivers and a bellows inlet for capacitor adjustment. All assemblies and parts of the attachment are made of stainless steel. Tests of the combined operation of the mass spectrometer and the attachment indicate that the attachment enables the use of sources which form ion beams with an energy dispersion of up to 1.5%, with the mass spectrometer's resolving power unchanged.

  4. Research on key techniques in portable XRF analyzers

    International Nuclear Information System (INIS)

    Li Guodong; Jia Wenyi; Zhou Rongsheng; Tang Hong

    1999-01-01

    To address the low sensitivity, poor detection limits, small number of determinable elements and poor matrix-effect correction of current field-portable X-ray fluorescence (XRF) analyzers, research was carried out on the key units: excitation source, detector, measurement circuit and microcomputerization. A miniature, low-power X-ray tube excitation source was developed; a low-dissipation 1024-channel analyzer suited to high-resolution detectors was prepared; and microcomputerization based on a notebook computer was realized. On this basis, a highly sensitive field XRF system was constructed. With this system, multiple elements can be determined with detection limits below 20 μg/g for elements of medium or lower atomic number, one order of magnitude or more lower than those of current portable XRF analyzers. The capabilities for matrix-effect correction and data processing are enhanced. This system dispenses with radionuclide sources, making it safe and convenient to use and carry.
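Detection limits like the 20 μg/g figure above are conventionally estimated from counting statistics as three times the standard deviation of the background divided by the sensitivity. The formula below is the standard 3σ convention, not necessarily the one used in this work, and the counts are invented for illustration.

```python
import math

# Conventional 3-sigma XRF detection limit: 3 * sqrt(background counts)
# divided by the sensitivity (net counts per unit concentration).
# Both numbers below are illustrative, not from the cited study.

def detection_limit(background_counts, sensitivity_counts_per_ppm):
    return 3 * math.sqrt(background_counts) / sensitivity_counts_per_ppm

# 2500 background counts under the peak, 10 net counts per ug/g:
print(detection_limit(2500, 10))  # in ug/g
```

Raising the sensitivity (counts per μg/g) or suppressing background both lower the limit, which is why the abstract emphasizes the excitation source and detector as the key units.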

  5. np elastic scattering analyzing power characteristics at intermediate energies

    International Nuclear Information System (INIS)

    Abegg, R.; Davis, C.A.; Delheij, P.P.J.; Green, P.W.; Greeniaus, L.G.; Healey, D.C.; Miller, C.A.; Rodning, N.L.; Wait, G.D.; Ahmad, M.; Cairns, E.B.; Coombes, G.H.; Lapointe, C.; McDonald, W.J.; Moss, G.A.; Roy, G.; Soukup, J.; Tkachuk, R.R.; Ye, Y.; Watson, J.W.

    1989-06-01

    Recent measurements of charge symmetry breaking in the np system at 477 MeV, and of A_oonn for np elastic scattering at 220, 325 and 425 MeV, also yield accurate analyzing power data. These data allow the energy dependence of the analyzing power zero-crossing angle and the slope of the analyzing power at the zero-crossing to be determined. The incident neutron energies span a region where the zero-crossing angle is strongly energy dependent (E_n < 350 MeV) and one where it is not (E_n > 350 MeV). The results are compared to current phase shift analysis predictions, recently published LAMPF data, and the predictions of the Bonn and Paris potentials. (Author) 13 refs., 2 tabs., 2 figs
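Extracting the zero-crossing angle and the local slope from discrete analyzing-power measurements A(θ) amounts to locating the sign change and interpolating linearly between the bracketing points. This is a generic sketch; the angles and analyzing-power values below are made up, not data from the study.

```python
# Linear interpolation of the analyzing-power zero-crossing angle from
# discrete (angle, A) measurements. The slope at the crossing is the same
# bracketing slope. All numbers are illustrative.

def zero_crossing(angles, values):
    pts = list(zip(angles, values))
    for (t1, a1), (t2, a2) in zip(pts, pts[1:]):
        if a1 == 0:
            return t1
        if a1 * a2 < 0:  # sign change brackets the zero
            slope = (a2 - a1) / (t2 - t1)
            return t1 - a1 / slope
    raise ValueError("no sign change in data")

print(zero_crossing([70, 80, 90, 100], [0.06, 0.02, -0.02, -0.05]))
```

In practice a fit over several neighboring points would be used to propagate the measurement uncertainties into the crossing angle and slope.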

  6. On-line analyzers to distributed control system linking

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, S.F.; Buchanan, B.R.; Sanders, M.A.

    1990-01-01

    The Analytical Development Section (ADS) of the Savannah River Laboratory is developing on-line analyzers to monitor various site processes. Data from some of the on-line analyzers (OLA's) will be used for process control by distributed control systems (DCS's) such as the Fisher PRoVOX. A problem in the past has been finding an efficient and cost-effective way to get analyzer data onto the DCS data highway. ADS is developing a system to accomplish the linking of OLA's to PRoVOX DCS's. The system will be described, and results of operation in a research and development environment given. Plans for installation in the production environment will be discussed.

  7. Novel Approach to Analyzing MFE of Noncoding RNA Sequences.

    Science.gov (United States)

    George, Tina P; Thomas, Tessamma

    2016-01-01

    Genomic studies have become noncoding RNA (ncRNA) centric after the study of different genomes provided enormous information on ncRNA over the past decades. The function of ncRNA is decided by its secondary structure, and across organisms, the secondary structure is more conserved than the sequence itself. In this study, the optimal secondary structure or the minimum free energy (MFE) structure of ncRNA was found based on the thermodynamic nearest neighbor model. MFE of over 2600 ncRNA sequences was analyzed in view of its signal properties. Mathematical models linking MFE to the signal properties were found for each of the four classes of ncRNA analyzed. MFE values computed with the proposed models were in concordance with those obtained with the standard web servers. A total of 95% of the sequences analyzed had deviation of MFE values within ±15% relative to those obtained from standard web servers.
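The MFE structures in the study come from the thermodynamic nearest-neighbor model, which requires full stacking-energy tables. As a stand-in that fits in a few lines, the Nussinov-style dynamic program below maximizes base pairs instead of minimizing free energy; it illustrates the same O(n³) recursion over subsequences that MFE folding algorithms are built on, but it is explicitly a simplified substitute, not the method of the paper.

```python
# Nussinov-style base-pair maximization: dp[i][j] is the maximum number of
# nested base pairs in seq[i..j], with a minimum hairpin-loop length.
# This is a teaching substitute for true nearest-neighbor MFE folding.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
         ("G", "U"), ("U", "G")}  # Watson-Crick plus G-U wobble

def max_pairs(seq, min_loop=3):
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                 # j left unpaired
            for k in range(i, j - min_loop):    # try pairing k with j
                if (seq[k], seq[j]) in PAIRS:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1]

print(max_pairs("GGGAAAUCC"))
```

Real MFE computation (as in standard folding servers) replaces the "+1 per pair" score with tabulated stacking and loop energies, but the subsequence decomposition is the same.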

  8. A nuclear facility Security Analyzer written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1987-01-01

    The Security Analyzer project was undertaken to use the Prolog artificial intelligence programming language and Entity-Relationship database construction techniques to produce an intelligent database computer program capable of analyzing the effectiveness of a nuclear facility's security systems. The Security Analyzer program can search through a facility to find all possible surreptitious entry paths that meet various user-selected time and detection probability criteria. The program can also respond to user-formulated queries concerning the database information. The intelligent database approach allows the program to perform a more comprehensive path search than other programs that only find a single optimal path. The program also is more flexible in that the database, once constructed, can be interrogated and used for purposes independent of the searching function

  9. A nuclear facility Security Analyzer written in PROLOG

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1987-08-01

    The Security Analyzer project was undertaken to use the Prolog ''artificial intelligence'' programming language and Entity-Relationship database construction techniques to produce an intelligent database computer program capable of analyzing the effectiveness of a nuclear facility's security systems. The Security Analyzer program can search through a facility to find all possible surreptitious entry paths that meet various user-selected time and detection probability criteria. The program can also respond to user-formulated queries concerning the database information. The intelligent database approach allows the program to perform a more comprehensive path search than other programs that only find a single ''optimal'' path. The program also is more flexible in that the database, once constructed, can be interrogated and used for purposes independent of the searching function

  10. Multichannel analyzer embedded in FPGA; Analizador multicanal embebido en FPGA

    Energy Technology Data Exchange (ETDEWEB)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98060 Zacatecas, Zac. (Mexico); Ordaz G, O. O. [Universidad de Cordoba, Departamento de Arquitectura de Computadores, Electronica y Tecnologia Electronica, Campus de Rabanales, Ctra. N-IVa Km 396, 14071 Cordoba (Spain); Bravo M, I., E-mail: angelogarciad@hotmail.com [Universidad de Alcala de Henares, Departamento de Electronica, Campus Universitario, Carretera Madrid-Barcelona Km 33.600, 28801 Alcala de Henares, Madrid (Spain)

    2017-10-15

    Ionizing radiation has many applications, making it a significant and useful tool, but it can be dangerous for living beings exposed to uncontrolled doses. Because it cannot be perceived by any human sense, radiation detectors and additional devices are required to quantify and classify it. A multichannel analyzer is responsible for separating the different pulse heights generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, with the application for internal control of the multichannel analyzer generated for the ARM processor in C. In the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained were displayed graphically in a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), showing that the spectra obtained are comparable with those of commercial multichannel analyzers. (Author)
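The core operation an MCA performs, sorting pulse heights into 2^bits channels and counting, can be sketched in a few lines. This is a conceptual model of the histogramming step only (the real design does this in VHDL on the FPGA); the pulse heights below are normalized, invented values.

```python
# Conceptual MCA histogramming: each pulse height (normalized to the ADC
# full-scale range) is mapped to one of 2**bits channels and that channel's
# counter is incremented, building the spectrum.

def mca_histogram(pulse_heights, bits=10, full_scale=1.0):
    n_channels = 1 << bits
    channels = [0] * n_channels
    for v in pulse_heights:
        ch = min(int(v / full_scale * n_channels), n_channels - 1)
        channels[ch] += 1  # clamp full-scale hits into the top channel
    return channels

spectrum = mca_histogram([0.10, 0.11, 0.50, 0.999, 1.0], bits=3)
print(spectrum)  # 8 channels for a 3-bit example
```

A real spectrum uses 10 or more bits (1024+ channels), and peaks in the histogram correspond to characteristic energies of the measured source.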

  11. Evaluation of performance of veterinary in-clinic hematology analyzers.

    Science.gov (United States)

    Rishniw, Mark; Pion, Paul D

    2016-12-01

    A previous study provided information regarding the quality of in-clinic veterinary biochemistry testing. However, no similar studies for in-clinic veterinary hematology testing have been conducted. The objective of this study was to assess the quality of hematology testing in veterinary in-clinic laboratories using results obtained from testing 3 levels of canine EDTA blood samples. Clinicians prepared blood samples to achieve measurand concentrations within, below, and above their RIs and evaluated the samples in triplicate using their in-clinic analyzers. Quality was assessed by comparison of calculated total error with quality requirements, determination of sigma metrics, use of a quality goal index, and agreement between in-clinic and reference laboratory instruments. Suitability for statistical quality control was determined using adaptations from the computerized program, EZRules3. Evaluation of 10 veterinary in-clinic hematology analyzers showed that these instruments often fail to meet quality requirements. At least 60% of analyzers reasonably determined RBC, WBC, HCT, and HGB, when assessed by most quality goal criteria; platelets were less reliably measured, with 80% deemed suitable for low platelet counts, but only 30% for high platelet counts, and automated differential leukocyte counts were generally considered unsuitable for clinical use with fewer than 40% of analyzers meeting the least stringent quality goal requirements. Fewer than 50% of analyzers were able to meet requirements for statistical quality control for any measurand. These findings reflect the current status of in-clinic hematology analyzer performance and provide a basis for future evaluations of the quality of veterinary laboratory testing. © 2016 American Society for Veterinary Clinical Pathology.
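The sigma metrics used in the assessment above are conventionally computed from the allowable total error, bias, and imprecision of each measurand (the Westgard convention); the formula below is that standard convention, and the example numbers are illustrative rather than values from the study.

```python
# Standard sigma-metric calculation for laboratory quality assessment:
# sigma = (allowable total error - |bias|) / CV, all expressed in percent.
# The WBC figures below are invented for illustration.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    return (tea_pct - abs(bias_pct)) / cv_pct

# e.g. a measurand with 15% allowable error, 2% bias, 3% imprecision:
print(round(sigma_metric(15.0, 2.0, 3.0), 2))
```

Sigma values of 6 or more indicate world-class performance, while values below 3 are generally considered unsuitable for simple statistical quality control, which is the context for the "fewer than 50% of analyzers" finding.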

  12. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer.

    Science.gov (United States)

    Jung, Jaehyo; Lee, Jihoon; Shin, Siho; Kim, Youn Tae

    2017-10-23

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signal is then digitized by the processor. The configuration of the power and ground of the printed circuit board (PCB) layer is divided into digital and analog parts to minimize the noise interference of each part. The proposed analyzer occupies an area of 5.9 × 3.25 cm² with a current resolution of 0.4 nA. A potential of 0~2.1 V can be applied between the working and the counter electrodes. The results of this study showed the accuracy of the proposed analyzer by measuring the Ruthenium(III) chloride ( Ru III ) concentration in 10 mM phosphate-buffered saline (PBS) solution with a pH of 7.4. The measured data can be transmitted to a PC or a mobile such as a smartphone or a tablet PC using the included Bluetooth module. The proposed analyzer uses a 3.7 V, 120 mAh lithium polymer battery and can be operated for 60 min when fully charged, including data processing and wireless communication.
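A current resolution like 0.4 nA follows directly from the I/V conversion chain: the redox current through the feedback resistance of the transimpedance stage becomes a voltage, and one ADC step then corresponds to V_ref / (2^bits × R_F) amperes. The feedback resistance, reference voltage, and bit depth below are assumptions chosen to reproduce a similar figure, not the paper's actual component values.

```python
# How a current resolution on the order of 0.4 nA arises from an I/V
# (transimpedance) stage followed by an ADC. R_F, V_ref and the bit depth
# are assumed values for illustration, not the instrument's real ones.

def current_resolution(v_ref, bits, r_feedback):
    """Smallest resolvable current step, in amperes per ADC count."""
    return v_ref / (2 ** bits) / r_feedback

res = current_resolution(v_ref=2.1, bits=12, r_feedback=1.28e6)
print(f"{res * 1e9:.2f} nA")
```

Increasing the feedback resistance or the ADC bit depth improves the current resolution, at the cost of bandwidth and dynamic range respectively.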

  13. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    Directory of Open Access Journals (Sweden)

    Jaehyo Jung

    2017-10-01

    Full Text Available In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signal is then digitized by the processor. The configuration of the power and ground of the printed circuit board (PCB) layer is divided into digital and analog parts to minimize the noise interference of each part. The proposed analyzer occupies an area of 5.9 × 3.25 cm² with a current resolution of 0.4 nA. A potential of 0~2.1 V can be applied between the working and the counter electrodes. The results of this study showed the accuracy of the proposed analyzer by measuring the Ruthenium(III) chloride (RuIII) concentration in 10 mM phosphate-buffered saline (PBS) solution with a pH of 7.4. The measured data can be transmitted to a PC or a mobile device such as a smartphone or a tablet using the included Bluetooth module. The proposed analyzer uses a 3.7 V, 120 mAh lithium polymer battery and can be operated for 60 min when fully charged, including data processing and wireless communication.

  14. Time asymmetry: Polarization and analyzing power in the nuclear reactions

    International Nuclear Information System (INIS)

    Rioux, C.; Roy, R.; Slobodrian, R.J.; Conzett, H.E.

    1983-01-01

    Measurements of the proton polarization in the reactions ⁷Li(³He,p⃗)⁹Be and ⁹Be(³He,p⃗)¹¹B and of the analyzing powers of the inverse reactions, initiated by polarized protons at the same c.m. energies, show significant differences which imply the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction ²H(³He,p⃗)⁴He and its inverse have also been investigated and show some smaller differences. A discussion of the instrumental asymmetries is presented. (orig.)

  15. Effects of intense ultraviolet radiation on electrostatic energy analyzers

    International Nuclear Information System (INIS)

    Mathew, J.; Jennings, W.C.; Hickok, R.L.; Connor, K.A.; Schoch, P.M.; Hallock, G.A.

    1984-01-01

    Intense ultraviolet radiation from the plasma poses a significant problem for the implementation of heavy ion beam probe diagnostic systems on fusion-oriented confinement devices. The radiation enters the electrostatic energy analyzer used to detect secondary ions, resulting in both a distortion of the electric field inside the analyzer and noise generation in the detector channels. Data acquisition procedures and mechanical design techniques have been developed to significantly reduce these effects. We have also been successful in modelling the electric field distortion and have developed a data correction procedure based on this model. Methods for approaching the problems anticipated in future devices are also suggested

  16. Pulse shape analyzer/timing-SCA application to beta measurement

    International Nuclear Information System (INIS)

    Selvi, S.; Celiktas, C.

    2001-01-01

    Electrical noise contributions to the pulse height distributions from beta sources, arising from the BC-400 plastic scintillator (PS), the preamplifier and the spectroscopy amplifier, were rejected by configuring the electronic setup of the modified beta spectrometer, which consists of a pulse shape analyzer/timing single channel analyzer (PSA/SCA) and related complementary equipment. The improved noise rejection performance was evaluated in terms of the elimination of practically the entire noise band of the C-14 and Tl-204 spectra obtained using the two alternate beta spectrometer configurations.

  17. Wind energy system time-domain (WEST) analyzers

    Science.gov (United States)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.

  18. Analyzing solid waste management practices for the hotel industry

    OpenAIRE

    S.T. Pham Phu; M.G. Hoang; T. Fujiwara

    2018-01-01

    The current study aims to analyze waste characteristics and management practices of the hotel industry in Hoi An, a tourism city in the center of Vietnam. Solid wastes from 120 hotels were sampled, face-to-face interviews were conducted, and statistical methods were carried out to analyze the data. The results showed that the mean waste generation rate of the hotels was 2.28 kg/guest/day and was strongly correlated to internal influencing factors such as the capacity, the price of the room...

  19. Analyzing the use of pins in safety bearings

    DEFF Research Database (Denmark)

    da Fonseca, Cesar A. L. L.; Weber, Hans I.; Fleischer, Philip F.

    2015-01-01

    A new concept for safety bearings is analyzed: useful in emergency situations, it shall protect the bearing from destruction by the use of pins which impact with a disc, both capable of good energy dissipation. Results of work in progress are presented by validating partial stages… A Runge–Kutta method is validated with experimental results. Simulations of rotor orbits due to the impact condition are analyzed and compared to data obtained from the experiment, giving a good perspective on the use of pins. The contact interaction between rotor and pins uses an elastic-dissipative model. In addition…

  20. Analyzing Double Image Illusion through Double Indiscernibility and Lattice Theory

    Directory of Open Access Journals (Sweden)

    Kohei Sonoda

    2011-10-01

    Full Text Available The figure-ground division plays a fundamental role in all image perception. Although there are many studies on the extraction of a figure, such as edge detection or texture grouping, there are few discussions of the relationship between the obtained figure and the ground. We focused on double image illusions, which have two complementary relationships between figure and ground, and analyzed them. We divided the double image illusions according to two different interpretations, and using these divisions we extracted and analyzed their logical structures by lattices derived from rough sets that we had developed. As a result we discovered unusual logical structures in double image illusions.

  1. The comparison of automated urine analyzers with manual microscopic examination for urinalysis automated urine analyzers and manual urinalysis

    Directory of Open Access Journals (Sweden)

    Fatma Demet İnce

    2016-08-01

    Full Text Available Objectives: Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. Design and methods: 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA) and Dirui FUS-200 (DIRUI Industrial Co., China) automatic urine sediment analyzers and by manual microscopic examination. The degree of concordance (Kappa coefficient) and the rates within the same grading were evaluated. Results: For erythrocytes, leukocytes, epithelial cells, bacteria, crystals and yeasts, the degree of concordance between the two instruments was better than the degree of concordance between the manual microscopic method and the individual devices. There was no concordance between all methods for casts. Conclusion: The results from the automated analyzers for erythrocytes, leukocytes and epithelial cells were similar to the result of microscopic examination. However, in order to avoid any error or uncertainty, some images (particularly dysmorphic cells, bacteria, yeasts, casts and crystals) have to be analyzed by manual microscopic examination by trained staff. Therefore, the software programs which are used in automatic urine sediment analysers need further development to recognize urinary shaped elements more accurately. Automated systems are important in terms of time saving and standardization. Keywords: Urinalysis, Autoanalysis, Microscopy

  2. The comparison of automated urine analyzers with manual microscopic examination for urinalysis automated urine analyzers and manual urinalysis.

    Science.gov (United States)

    İnce, Fatma Demet; Ellidağ, Hamit Yaşar; Koseoğlu, Mehmet; Şimşek, Neşe; Yalçın, Hülya; Zengin, Mustafa Osman

    2016-08-01

    Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA), Dirui FUS-200 (DIRUI Industrial Co., China) automatic urine sediment analyzers and by manual microscopic examination. The degree of concordance (Kappa coefficient) and the rates within the same grading were evaluated. For erythrocytes, leukocytes, epithelial cells, bacteria, crystals and yeasts, the degree of concordance between the two instruments was better than the degree of concordance between the manual microscopic method and the individual devices. There was no concordance between all methods for casts. The results from the automated analyzers for erythrocytes, leukocytes and epithelial cells were similar to the result of microscopic examination. However, in order to avoid any error or uncertainty, some images (particularly: dysmorphic cells, bacteria, yeasts, casts and crystals) have to be analyzed by manual microscopic examination by trained staff. Therefore, the software programs which are used in automatic urine sediment analysers need further development to recognize urinary shaped elements more accurately. Automated systems are important in terms of time saving and standardization.
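The Kappa coefficient used above, Cohen's kappa, compares the observed agreement between two graders with the agreement expected by chance from each grader's marginal category frequencies. The sketch below uses invented toy gradings, not data from the study.

```python
from collections import Counter

# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance
# agreement). Chance agreement comes from the product of each category's
# marginal frequencies. The gradings below are toy data for illustration.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (observed - expected) / (1 - expected)

manual = ["neg", "neg", "1+", "2+", "neg", "1+", "2+", "neg"]
device = ["neg", "1+", "1+", "2+", "neg", "1+", "1+", "neg"]
print(round(cohens_kappa(manual, device), 3))
```

Values near 1 indicate near-perfect concordance and values near 0 indicate agreement no better than chance, which is how "no concordance between all methods for casts" is read.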

  3. Analyzing Counsel/Witness Discourse in Nnewi, Anambra State ...

    African Journals Online (AJOL)

    This paper analyzed counsel/witness discourse using the High Court in Nnewi Municipal Council. Specifically, it described the structure and organization of counsel/witness discourse in the courtroom context, highlighting some discourse features inherent in them, and observed the communication strategies and motivation ...

  4. Analyzing Perceptions of Prospective Teachers about Their Media Literacy Competencies

    Science.gov (United States)

    Recepoglu, Ergun; Ergun, Muammer

    2013-01-01

    The purpose of this study is to analyze perceptions of prospective teachers about their media literacy competencies in terms of different variables. This is a descriptive research in the survey model which tries to detect the current situation. Study group includes 580 prospective teachers from Turkish, Primary School, Social Studies, Science,…

  5. 40 CFR 90.314 - Analyzer accuracy and specifications.

    Science.gov (United States)

    2010-07-01

    ... zero and calibration or span gases over any 10-second period must not exceed two percent of full-scale chart deflection on all ranges used. (3) Zero drift. The analyzer zero-response drift during a one-hour period must be less than two percent of full-scale chart deflection on the lowest range used. The zero...
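A drift limit like the one in this excerpt (zero-response drift under two percent of full-scale chart deflection over one hour) can be checked directly against a series of recorded zero readings. The readings and full-scale value below are illustrative, not regulatory examples.

```python
# Check a zero-drift limit of the kind quoted in 40 CFR 90.314: the spread
# of zero-gas readings over the period must stay within a fraction (here
# 2%) of full-scale deflection. Readings and full scale are invented.

def passes_zero_drift(zero_readings, full_scale, limit_fraction=0.02):
    drift = max(zero_readings) - min(zero_readings)
    return drift <= limit_fraction * full_scale

# one hour of zero-gas readings on a 100-unit full-scale range:
print(passes_zero_drift([0.1, 0.4, -0.3, 0.9], full_scale=100.0))
```

The span (peak-to-peak) form used here is a simple interpretation; an actual compliance procedure would follow the exact measurement schedule the regulation specifies.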

  6. Differentiation and Integration: Guiding Principles for Analyzing Cognitive Change

    Science.gov (United States)

    Siegler, Robert S.; Chen, Zhe

    2008-01-01

    Differentiation and integration played large roles within classic developmental theories but have been relegated to obscurity within contemporary theories. However, they may have a useful role to play in modern theories as well, if conceptualized as guiding principles for analyzing change rather than as real-time mechanisms. In the present study,…

  7. 21 CFR 868.1720 - Oxygen gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... gases by techniques such as mass spectrometry, polarography, thermal conductivity, or gas chromatography... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Oxygen gas analyzer. 868.1720 Section 868.1720 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...

  8. 21 CFR 868.1075 - Argon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... thermal conductivity. (b) Classification. Class II (performance standards). ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Argon gas analyzer. 868.1075 Section 868.1075 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL...

  9. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... patient. The device may use techniques such as mass spectrometry or thermal conductivity. (b... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL...

  10. 21 CFR 868.1640 - Helium gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... mixture during pulmonary function testing. The device may use techniques such as thermal conductivity, gas... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Helium gas analyzer. 868.1640 Section 868.1640 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...

  11. PLT and PDX perpendicular charge-exchange analyzers

    International Nuclear Information System (INIS)

    Mueller, D.; Hammett, G.W.; McCune, D.C.

    1986-01-01

    The perpendicular charge-exchange systems used on the Poloidal Divertor Experiment (PDX) and the Princeton Large Torus (PLT) consist of ten-channel, mass-resolved, charge-exchange analyzers. Results from these systems indicate that instrumental effects can lead to erroneous temperature measurements during deuterium neutral beam injection or at low hydrogen concentrations

  12. A high-speed interface for multi-channel analyzer

    International Nuclear Information System (INIS)

    Shen Ji; Zheng Zhong; Qiao Chong; Chen Ziyu; Ye Yunxiu; Ye Zhenyu

    2003-01-01

    This paper presents a high-speed computer interface for multi-channel analyzers based on the DMA technique. Its essential principle and operating procedure are introduced. Detection of the γ spectrum of ¹³⁷Cs with the interface verified that it meets the requirements of high-speed data acquisition

  13. Analyzing the Hidden Curriculum of Screen Media Advertising

    Science.gov (United States)

    Mason, Lance E.

    2015-01-01

    This media literacy article introduces a questioning framework for analyzing screen media with students and provides an example analysis of two contemporary commercials. Investigating screen conventions can help students understand the persuasive functions of commercials, as well as how the unique sensory experience of screen viewing affects how…

  14. Analyzing the drivers of green manufacturing with fuzzy approach

    DEFF Research Database (Denmark)

    Govindan, Kannan; Diabat, Ali; Madan Shankar, K.

    2015-01-01

    India, and aided by their replies, a pair-wise comparison was made among the drivers. The pair-wise comparisons were used as input data for analyzing the drivers. The analysis used a fuzzy Multi-Criteria Decision Making (MCDM) approach. The obtained results...

  15. Analyzing Parental Involvement Dimensions in Early Childhood Education

    Science.gov (United States)

    Kurtulmus, Zeynep

    2016-01-01

    The importance of parental involvement in children's academic and social development has been widely accepted. The first years are crucial for children's later school success. The majority of the research focuses on enhancing and supporting parental involvement in educational settings. The purpose of this study was to analyze dimensions of parental…

  16. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  17. Analyzing International Letters in a Business Communication Class.

    Science.gov (United States)

    Devet, Bonnie

    1998-01-01

    Shows how students can use persuasive principles of communication (format and writer's purpose) and of classical rhetoric (organization, ethos, pathos, logos, and style) to improve their ability to analyze business letters. Shows how applying these principles to the analysis of business letters from other countries helps students write better and…

  18. Development and performance of on-line uranium analyzers

    International Nuclear Information System (INIS)

    Ofalt, A.E.; O'Rourke, P.E.

    1985-10-01

    A diode-array spectrophotometer and an x-ray fluorescence analyzer were installed online in a full-scale prototype facility to monitor uranium loading and breakthrough of ion exchange columns. Uranium concentrations of 10 ppm in uranyl nitrate solutions can be detected online to improve process control and material accountability. 9 figs

  19. Analyzing State Security Risks in South China Sea Conflict

    Directory of Open Access Journals (Sweden)

    Дмитрий Владимирович Пивоваров

    2009-09-01

    Full Text Available The article is devoted to regional security issues in South East Asia. The author analyses international relations issues closely tied to foreign policy and foreign-policy strategy, and proposes risk analysis as a new and promising method in political science for generating foreign-policy plans and analyzing international conflicts and problems.

  20. Analyzing the security posture of South African websites

    CSIR Research Space (South Africa)

    Mtsweni, Jabu, S

    2015-08-12

    Full Text Available observed. Research studies also suggest that over 80% of the active websites are vulnerable to a myriad of attacks. This paper reports on a study conducted to passively analyze and determine the security posture of over 70 South African websites from...

  1. Analyzing the performance index for a hybrid electric vehicle

    NARCIS (Netherlands)

    Ngo, D. V.; Hofman, T.; Steinbuch, M.; Serrarens, A. F A

    2011-01-01

    The definition of a performance index for the design optimization and optimal control problem of a Hybrid Electric Vehicle is not often considered and analyzed explicitly. The literature contains no study proposing a method for building a performance index or for evaluating whether a given performance index is appropriate.

  2. A Linguistic Technique for Marking and Analyzing Syntactic Parallelism.

    Science.gov (United States)

    Sackler, Jessie Brome

    Sentences in rhetoric texts were used in this study to determine a way in which rhetorical syntactic parallelism can be analyzed. A tagmemic analysis determined tagmas which were parallel, i.e., identical or similar to one another. These were distinguished from tagmas which were identical because of the syntactic constraints of the language…

  3. Success Is Cheesecake: A Guide to Analyzing Student Discourse

    Science.gov (United States)

    Andrelchik, Hillary

    2016-01-01

    Action research, conducted by teachers in their own classrooms with the goal of improving practice, is invaluable and can provide insight into students' lives. One of the many challenges associated with action research is knowing how to analyze and interpret data. In this manuscript, written as a "how to" of sorts, I…

  4. Analyzing Properties of Stochastic Business Processes By Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    This chapter presents an approach to precise formal analysis of business processes with stochastic properties. The method presented here allows both qualitative and quantitative properties to be individually analyzed at design time without requiring a full specification. This provides an effective means to explore possible designs for a business process and to debug any flaws.

  5. Resource values in analyzing fire management programs for economic efficiency

    Science.gov (United States)

    Irene A. Althaus; Thomas J. Mills

    1982-01-01

    In analyzing fire management programs for their economic efficiency, it is necessary to assign monetary values to the changes in resource outputs caused by fire. The derivation of resource values is complicated by imperfect or nonexistent commercial market structures. The valuation concept recommended for fire program analyses is willingness-to-pay because it permits...

  6. Literally better : Analyzing and improving the quality of literals

    NARCIS (Netherlands)

    Beek, Wouter; Ilievski, Filip; Debattista, Jeremy; Schlobach, Stefan; Wielemaker, Jan

    2018-01-01

    Quality is a complicated and multifarious topic in contemporary Linked Data research. The aspect of literal quality in particular has not yet been rigorously studied. Nevertheless, analyzing and improving the quality of literals is important since literals form a substantial (one in seven

  7. A computerised EEG-analyzing system for small laboratory animals

    NARCIS (Netherlands)

    Kropveld, D.; Chamuleau, R. A.; Popken, R. J.; Smith, J.

    1983-01-01

    The experimental setup, including instrumentation and software package, is described for the use of a minicomputer as an on-line EEG-analyzing system in rats. Complete fast Fourier transformation of the EEG, sampled in 15 episodes of 10 s each, is plotted out within 7 min after the start of

  8. Performance Evaluation of the Sysmex CS-5100 Automated Coagulation Analyzer.

    Science.gov (United States)

    Chen, Liming; Chen, Yu

    2015-01-01

    Coagulation testing is widely applied clinically, and laboratories increasingly demand automated coagulation analyzers with short turnaround times and high throughput. The purpose of this study was to evaluate the performance of the Sysmex CS-5100 automated coagulation analyzer for routine use in a clinical laboratory. The prothrombin time (PT), international normalized ratio (INR), activated partial thromboplastin time (APTT), fibrinogen (Fbg), and D-dimer were compared between the Sysmex CS-5100 and Sysmex CA-7000 analyzers, and the imprecision, comparison, throughput, STAT function, and performance for abnormal samples were measured in each. The within-run and between-run coefficients of variation (CV) for the PT, APTT, INR, and D-dimer analyses showed excellent results in both the normal and pathologic ranges. Results from the Sysmex CS-5100 and Sysmex CA-7000 were highly correlated. The throughput of the Sysmex CS-5100 was faster than that of the Sysmex CA-7000, and there was no interference from total bilirubin or triglyceride concentrations in the Sysmex CS-5100 analyzer. We demonstrated that the Sysmex CS-5100 performs with satisfactory imprecision and is well suited for coagulation analysis in laboratories processing large sample numbers and icteric and lipemic samples.
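
    The two headline statistics in an evaluation like this one, within-run imprecision (CV%) and between-analyzer correlation, can be computed as follows. This is a minimal illustrative sketch; all numbers are invented, not taken from the study:

```python
import statistics

def coefficient_of_variation(replicates):
    """Within-run CV (%) = SD / mean * 100 over repeated measurements."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100

def pearson_r(xs, ys):
    """Pearson correlation coefficient for a paired method comparison."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

pt_replicates = [11.8, 12.0, 11.9, 12.1, 12.0]  # PT (s), one sample, invented
analyzer_a = [11.9, 25.3, 33.1, 14.7, 18.2]     # PT (s), invented patient panel
analyzer_b = [12.1, 25.0, 33.5, 14.5, 18.0]
print(round(coefficient_of_variation(pt_replicates), 2))
print(round(pearson_r(analyzer_a, analyzer_b), 4))
```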

  9. Fluorescence x-ray analyzer for plating-bath solution

    International Nuclear Information System (INIS)

    Komatsu, Shigemi; Hato, Yoshio; Tono, Miki; Ishijima, Hiroshi

    1984-01-01

    This analyzer was developed to analyze plating-solution composition and measure plating thickness at the same time in the noble- and base-metal plating applied to electronic components. The analyzer operates on the principle of fluorescence X-ray measurement, which offers high-accuracy, non-destructive, multi-element simultaneous analysis. In this paper, the principle of measurement, the configuration of the model SFA 875 analyzer, its main specifications, and measurement examples are described. Among the measurement examples, it is described in detail how the model SFA 875 expanded its range of application and improved its accuracy, including the accuracy of simple repeated measurement, by combining a digital filter with the linear method of least squares. The digital filter eliminates noise in the data, smooths spectra, and subtracts background. The linear method of least squares separates spectra with overlapping peaks into individual peaks. Combining the two allows the analyzer to perform various analyses even on spectra obtained with proportional counters, including analysis of elements whose atomic numbers are close to each other. The accuracy of plating-thickness measurement is about 0.2 to 2.0 μm, and that of solution composition is about 0.4 to 0.7 g/l. (Wakatsuki, Y.)
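
    As an illustration of the two techniques named above, the sketch below smooths a noisy synthetic spectrum with a simple moving-average digital filter and then separates two overlapping peaks by linear least squares, assuming known Gaussian peak positions and widths (all values are hypothetical, not the SFA 875's):

```python
import numpy as np

def smooth(counts, width=5):
    """Moving-average 'digital filter' for noise suppression."""
    kernel = np.ones(width) / width
    return np.convolve(counts, kernel, mode="same")

def gaussian(x, center, sigma):
    """Unit-amplitude Gaussian peak shape."""
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

# Synthetic spectrum: two overlapping peaks plus noise (invented values).
x = np.arange(200.0)
true_a, true_b = 120.0, 80.0
spectrum = (true_a * gaussian(x, 90, 8) + true_b * gaussian(x, 110, 8)
            + np.random.default_rng(0).normal(0, 2, x.size))
y = smooth(spectrum)

# Design matrix: one column per assumed peak shape; solve for amplitudes.
A = np.column_stack([gaussian(x, 90, 8), gaussian(x, 110, 8)])
amp_a, amp_b = np.linalg.lstsq(A, y, rcond=None)[0]
print(round(amp_a, 1), round(amp_b, 1))  # close to 120 and 80
```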

  10. Analyzing Oscillations of a Rolling Cart Using Smartphones and Tablets

    Science.gov (United States)

    Egri, Sandor; Szabo, Lorant

    2015-01-01

    It is well known that "interactive engagement" helps students to understand basic concepts in physics. Performing experiments and analyzing measured data are effective ways to realize interactive engagement, in our view. Some experiments need special equipment, measuring instruments, or laboratories, but in this activity we advocate…

  11. Evaluation of the three-nucleon analyzing power puzzle

    International Nuclear Information System (INIS)

    Tornow, W.; Witala, H.

    1998-01-01

    The current status of the three-nucleon analyzing power puzzle is reviewed. Applying tight constraints on the allowed deviations between calculated predictions and accepted values for relevant nucleon-nucleon observables reveals that energy-independent correction factors applied to the ³Pⱼ nucleon-nucleon interactions cannot solve the puzzle. Furthermore, using the same constraints, charge-independence breaking in the ³Pⱼ nucleon-nucleon interactions can be ruled out as a possible tool to improve the agreement between three-nucleon calculations and data. The study of the energy dependence of the three-nucleon analyzing power puzzle gives clear evidence that the ³Pⱼ nucleon-nucleon interactions obtained from phase-shift analyses and used in potential models are correct above about 25 MeV, i.e., the ³Pⱼ nucleon-nucleon interactions have to be modified only at lower energies in order to solve the three-nucleon analyzing power puzzle, unless new three-nucleon forces can be found that account for the three-nucleon analyzing power puzzle without destroying the beautiful agreement between rigorous three-nucleon calculations and a large body of accurate three-nucleon data. (orig.)

  12. Computer program analyzes and monitors electrical power systems (POSIMO)

    Science.gov (United States)

    Jaeger, K.

    1972-01-01

    Requirements for monitoring and/or simulating electric power distribution, power balance, and charge budget are discussed. A computer program that analyzes the power system and generates a set of characteristic power system data is described. The application of status indicators to denote different exclusive conditions is presented.

  13. Using the Analytic Hierarchy Process to Analyze Multiattribute Decisions.

    Science.gov (United States)

    Spires, Eric E.

    1991-01-01

    The use of the Analytic Hierarchy Process (AHP) in assisting researchers to analyze decisions is discussed. The AHP is compared with other decision-analysis techniques, including multiattribute utility measurement, conjoint analysis, and general linear models. Insights that AHP can provide are illustrated with data gathered in an auditing context.…
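
    For readers unfamiliar with AHP mechanics, the sketch below shows the core computation: priority weights are derived as the normalized principal eigenvector of a pairwise comparison matrix, together with Saaty's consistency index. The 3x3 matrix is illustrative, not taken from the article:

```python
import numpy as np

# Pairwise comparison matrix (Saaty 1-9 scale, reciprocal): entry (i, j)
# says how much criterion i is preferred over criterion j. Invented values.
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority weights = normalized principal (Perron) eigenvector.
eigvals, eigvecs = np.linalg.eig(pairwise)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); CI near 0 means
# the judgments are close to perfectly consistent.
n = pairwise.shape[0]
consistency_index = (eigvals.real[k] - n) / (n - 1)
print(np.round(weights, 3), round(consistency_index, 3))
```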

  14. Using UPPAAL to Analyze an MPEG-2 Algorithm

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Ravn, Anders Peter; Valero, Valentin

    2005-01-01

    The performance of a parallel algorithm for an MPEG-2 encoding is analyzed using timed automata models in the UppAal tool. We have constructed both a sequential model of MPEG-2, and a parallel model of MPEG-2 and then, a comparison of the results obtained for both models is made. We show how...

  15. Analyzing discussions on twitter: Case study on HPV vaccinations

    NARCIS (Netherlands)

    Kaptein, R.; Boertjes, E.; Langley, D.

    2014-01-01

    In this work we analyze the discussions on Twitter around the Human papillomavirus (HPV) vaccinations. We collect a dataset consisting of tweets related to the HPV vaccinations by searching for relevant keywords, by retrieving the conversations on Twitter, and by retrieving tweets from our user

  16. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    ... technique involves model structure, system representation, and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and... [Thesis: "Agent-Based Modeling Methodology for Analyzing Weapons Systems," Casey D. Connors, Major, USA]

  17. Probability model for analyzing fire management alternatives: theory and structure

    Science.gov (United States)

    Frederick W. Bratten

    1982-01-01

    A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...

  18. 21 CFR 868.1620 - Halothane gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... infrared or ultraviolet radiation. (b) Classification. Class II (performance standards). ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Halothane gas analyzer. 868.1620 Section 868.1620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...

  19. 40 CFR 90.325 - Analyzer interference checks.

    Science.gov (United States)

    2010-07-01

    ... broadening the absorption band of the measured gas, and in CLD instruments by the interfering gas quenching... quench check. The two gases of concern for CLD (and HCLD) analyzers are CO2 and water vapor. Quench... the CO2 NDIR and CLD (or HCLD). Record the CO2 and NO values as “b” and “c” respectively. (iii) Shut...

  20. 40 CFR 91.325 - Analyzer interference checks.

    Science.gov (United States)

    2010-07-01

    ... the measured gas, and in CLD instruments by the interfering gas quenching the radiation. The... concern for CLD (and HCLD) analyzers are CO2 and water vapor. Quench responses to these two gases are... gas approximately 50 percent with NO span gas and pass through the CO2 NDIR and CLD (or HCLD). Record...

  1. 40 CFR 89.318 - Analyzer interference checks.

    Science.gov (United States)

    2010-07-01

    ... broadening the absorption band of the measured gas and in CLD instruments by the interfering gas quenching... gases of concern for CLD (and HCLD) analyzers are CO2 and water vapor. Quench responses to these two... CLD (or HCLD), with the CO2 and NO values recorded as b and c respectively. The CO2 shall then be shut...

  2. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  3. Analyzing Digital Library Initiatives: 5S Theory Perspective

    Science.gov (United States)

    Isah, Abdulmumin; Mutshewa, Athulang; Serema, Batlang; Kenosi, Lekoko

    2015-01-01

    This article traces the historical development of Digital Libraries (DLs), examines some DL initiatives in developed and developing countries and uses 5S Theory as a lens for analyzing the focused DLs. The analysis shows that present-day systems, in both developed and developing nations, are essentially content and user centric, with low level…

  4. Analyzing Single-Event Gate Ruptures In Power MOSFET's

    Science.gov (United States)

    Zoutendyk, John A.

    1993-01-01

    The susceptibilities of power metal-oxide-semiconductor field-effect transistors (MOSFETs) to single-event gate ruptures were analyzed by exposing devices to beams of energetic bromine ions while applying appropriate bias voltages to the source, gate, and drain terminals and measuring the current flowing into or out of each terminal.

  5. Processing of Mining Induced Seismic Events by Spectra Analyzer Software

    Czech Academy of Sciences Publication Activity Database

    Kaláb, Zdeněk; Lednická, Markéta; Lyubushin, A. A.

    2011-01-01

    Roč. 6, č. 1 (2011), s. 75-83 ISSN 1896-3145. [Ochrona środowiska w górnictwie podziemnym, odkrywkowym i otworowym. Walbrzych, 18.05.2011-20.05.2011] Institutional research plan: CEZ:AV0Z30860518 Keywords: mining seismicity * Spectra Analyzer Software * wavelet decomposition * time-frequency map Subject RIV: DC - Seismology, Volcanology, Earth Structure

  6. Analyzing the Structure of the International Business Curriculum in India

    Science.gov (United States)

    Srivastava, Deepak K.

    2012-01-01

    This article analyzes the structure of the international business curriculum through a questionnaire-based survey among current students and young managers who are studying or have studied international business courses in one of the top B-Schools of India. Respondents have the opinion that international business is more than internationalization…

  7. Analyzing the aesthetics of participation of media architecture

    DEFF Research Database (Denmark)

    Fritsch, Jonas; Grönvall, Erik; Breinbjerg, Morten

    2016-01-01

    This paper presents a theoretical framework for analyzing the aesthetics of participation of media architecture. The framework is based on a close reading of French philosopher Jacques Rancière and provides four points of emphasis: modes of sense perception, forms of engagement, community...

  8. Analyzing composability of applications on MPSoC platforms

    NARCIS (Netherlands)

    Kumar, A.; Mesman, B.; Theelen, B.D.; Corporaal, H.; Yajun, H.

    2008-01-01

    Modern day applications require the use of multi-processor systems for reasons of performance, scalability and power efficiency. As more and more applications are integrated in a single system, mapping and analyzing them on a multi-processor platform becomes a multidimensional problem. Each possible set

  9. Evaluation of the three-nucleon analyzing power puzzle

    Energy Technology Data Exchange (ETDEWEB)

    Tornow, W. [Duke Univ., Durham, NC (United States). Dept. of Physics]|[Triangle Univ. Nuclear Lab., Durham, NC (United States); Witala, H. [Uniwersytet Jagiellonski, Cracow (Poland). Inst. Fizyki

    1998-07-20

    The current status of the three-nucleon analyzing power puzzle is reviewed. Applying tight constraints on the allowed deviations between calculated predictions and accepted values for relevant nucleon-nucleon observables reveals that energy independent correction factors applied to the {sup 3}P{sub j} nucleon-nucleon interactions can not solve the puzzle. Furthermore, using the same constraints, charge-independence breaking in the {sup 3}P{sub j} nucleon-nucleon interactions can be ruled out as a possible tool to improve the agreement between three-nucleon calculations and data. The study of the energy dependence of the three-nucleon analyzing power puzzle gives clear evidence that the {sup 3}P{sub j} nucleon-nucleon interaction obtained from phase-shift analyses and used in potential models are correct above about 25 MeV, i.e., the {sup 3}P{sub j} nucleon-nucleon interactions have to be modified only at lower energies in order to solve the three-nucleon analyzing power puzzle, unless new three-nucleon forces can be found that account for the three-nucleon analyzing power puzzle without destroying the beautiful agreement between rigorous three-nucleon calculations and a large body of accurate three-nucleon data. (orig.) 18 refs.

  10. A Study Analyzing the Career Path of Librarians

    Science.gov (United States)

    Noh, Younghee

    2010-01-01

    This study aims to identify the career movement patterns of librarians, analyze factors influencing their career movements, and compare differences in such factors between librarians and chief librarians. Findings showed that the jobs with the highest retention rate were those in public libraries, that library automation system developers showed…

  11. CTG Analyzer: A graphical user interface for cardiotocography.

    Science.gov (United States)

    Sbrollini, Agnese; Agostinelli, Angela; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most commonly used test for establishing the good health of the fetus during pregnancy and labor. CTG consists in the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions (UC; mmHg). FHR is characterized by baseline, baseline variability, tachycardia, bradycardia, accelerations and decelerations. The UC signal, in turn, is characterized by the presence and period of contractions. Such parameters are usually evaluated by visual inspection. However, visual analysis of CTG recordings has well-demonstrated poor reproducibility, owing to the complexity of the physiological phenomena affecting fetal heart rhythm and to its dependence on the clinician's experience. Computerized tools in support of clinicians represent a possible solution for improving correctness in CTG interpretation. This paper proposes CTG Analyzer as a graphical tool for automatic and objective analysis of CTG tracings. CTG Analyzer was developed under MATLAB®; it is a very intuitive and user-friendly graphical user interface. The FHR time series and the UC signal are displayed one under the other, on a grid with reference lines, as in CTG reports printed on paper. Colors aid identification of FHR and UC features. Automatic analysis is based on fixed feature definitions provided by the FIGO guidelines and on settings whose default values can be changed by the user. Finally, CTG Analyzer provides a report file listing all the quantitative results of the analysis. Thus, CTG Analyzer represents a potentially useful graphical tool for automatic and objective analysis of CTG tracings.
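
    As a rough illustration of the kind of feature extraction described above (not CTG Analyzer's actual algorithm), the sketch below estimates an FHR baseline from a trace and classifies it against commonly cited tachycardia/bradycardia thresholds; the thresholds and trace are illustrative:

```python
def fhr_baseline(fhr_bpm):
    """Crude baseline estimate: median of the trace, ignoring
    signal dropouts (recorded as zeros)."""
    valid = sorted(v for v in fhr_bpm if v > 0)
    mid = len(valid) // 2
    return valid[mid] if len(valid) % 2 else (valid[mid - 1] + valid[mid]) / 2

def classify_baseline(baseline, tachy=160, brady=110):
    """Classify baseline against tachycardia/bradycardia limits (bpm)."""
    if baseline > tachy:
        return "tachycardia"
    if baseline < brady:
        return "bradycardia"
    return "normal"

trace = [140, 142, 0, 138, 145, 141, 139]  # bpm samples; 0 = signal loss
baseline = fhr_baseline(trace)
print(baseline, classify_baseline(baseline))
```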

  12. Analyzing Exertion of Hardy's Tragic Effect in "Tess"

    Science.gov (United States)

    Yang, Wei

    2009-01-01

    This paper begins with a brief introduction to [Thomas] Hardy's whole life and his works, especially this novel "Tess [of the D'Urbervilles]" and points out the tragic effect's importance and Hardy's tragic idea. Linked to this tragic effect, this paper analyzes the nice application in "Tess." At last, we can understand more…

  13. Using Outcomes to Analyze Patients Rather than Patients to Analyze Outcomes: A Step toward Pragmatism in Benefit:risk Evaluation

    Science.gov (United States)

    Evans, Scott R.; Follmann, Dean

    2016-01-01

    In the future, clinical trials will have an increased emphasis on pragmatism, providing a practical description of the effects of new treatments in realistic clinical settings. Accomplishing pragmatism requires better summaries of the totality of the evidence in ways that consumers of clinical trials (patients, physicians, insurers) find transparent and that allow for informed benefit:risk decision-making. The current approach to the analysis of clinical trials is to analyze efficacy and safety separately and then combine these analyses into a benefit:risk assessment. Many assume that this will effectively describe the impact on patients, but this approach is suboptimal for evaluating the totality of effects on patients. We discuss methods for benefit:risk assessment that have greater pragmatism than methods that separately analyze efficacy and safety. These include the concepts of within-patient analyses and composite benefit:risk endpoints, with the goal of understanding how to analyze one patient before trying to figure out how to analyze many. We discuss the desirability of outcome ranking (DOOR) and introduce the partial credit strategy using an example from a clinical trial evaluating the effects of a new antibiotic. As part of the example we introduce a strategy to engage patients as a resource to inform benefit:risk analyses, consistent with the goal of measuring and weighing the outcomes that are most important from the patient's perspective. We describe a broad vision for the future of clinical trials consistent with increased pragmatism. A greater focus on using endpoints to analyze patients, rather than patients to analyze endpoints, particularly in late-phase/stage clinical trials, is an important part of this vision. PMID:28435515
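
    The core DOOR estimand, the probability that a randomly chosen treatment patient has a more desirable outcome than a randomly chosen control patient, with ties counted as one half, can be sketched as follows (the ordinal ranks below are illustrative, not trial data):

```python
def door_probability(treatment_ranks, control_ranks):
    """Estimate P(treatment outcome more desirable than control),
    counting ties as 1/2, over all treatment-control pairs."""
    wins = ties = 0
    for t in treatment_ranks:
        for c in control_ranks:
            if t > c:
                wins += 1
            elif t == c:
                ties += 1
    total = len(treatment_ranks) * len(control_ranks)
    return (wins + 0.5 * ties) / total

treat = [3, 4, 4, 2, 5]  # higher rank = more desirable overall outcome
ctrl = [2, 3, 3, 1, 4]
print(door_probability(treat, ctrl))  # > 0.5 favors treatment
```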

  14. Active methanotrophs in two contrasting North American peatland ecosystems revealed using DNA-SIP.

    Science.gov (United States)

    Gupta, Varun; Smemo, Kurt A; Yavitt, Joseph B; Basiliko, Nathan

    2012-02-01

    The active methanotroph community was investigated in two contrasting North American peatlands, a nutrient-rich sedge fen and nutrient-poor Sphagnum bog using in vitro incubations and (13)C-DNA stable-isotope probing (SIP) to measure methane (CH(4)) oxidation rates and label active microbes followed by fingerprinting and sequencing of bacterial and archaeal 16S rDNA and methane monooxygenase (pmoA and mmoX) genes. Rates of CH(4) oxidation were slightly, but significantly, faster in the bog and methanotrophs belonged to the class Alphaproteobacteria and were similar to other methanotrophs of the genera Methylocystis, Methylosinus, and Methylocapsa or Methylocella detected in, or isolated from, European bogs. The fen had a greater phylogenetic diversity of organisms that had assimilated (13)C, including methanotrophs from both the Alpha- and Gammaproteobacteria classes and other potentially non-methanotrophic organisms that were similar to bacteria detected in a UK and Finnish fen. Based on similarities between bacteria in our sites and those in Europe, including Russia, we conclude that site physicochemical characteristics rather than biogeography controlled the phylogenetic diversity of active methanotrophs and that differences in phylogenetic diversity between the bog and fen did not relate to measured CH(4) oxidation rates. A single crenarchaeon in the bog site appeared to be assimilating (13)C in 16S rDNA; however, its phylogenetic similarity to other CO(2)-utilizing archaea probably indicates that this organism is not directly involved in CH(4) oxidation in peat.

  15. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    Science.gov (United States)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  16. 40 CFR 1065.309 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers...

    Science.gov (United States)

    2010-07-01

    ... not apply to any processing of individual analyzer signals that are time aligned to their t 50 times... for water removed from the sample done in post-processing according to § 1065.659 and it does not... used during emission testing. You may not use interpolation or filtering to alter the recorded values...

  17. Organization of a multichannel analyzer for gamma ray spectrometry

    International Nuclear Information System (INIS)

    Robinet, Genevieve

    1988-06-01

    This report describes the software organization of a medium scale multichannel analyzer for qualitative and quantitative measurements of the gamma rays emitted by radioactive samples. The first part reminds basis of radioactivity, principle of gamma ray detection, and data processing used for interpretation of a nuclear spectrum. The second part describes first the general organization of the software and then gives some details on interactivity, multidetector capabilites, and integration of complex algorithms for peak search and nuclide identification;problems encountered during the design phase are mentioned and solutions are given. Basic ideas are presented for further developments, such as expert system which should improve interpretation of the results. This present software has been integrated in a manufactured multichannel analyzer named 'POLYGAM NU416'. [fr

  18. Analyzing Big Data with the Hybrid Interval Regression Methods

    Directory of Open Access Journals (Sweden)

    Chia-Hui Huang

    2014-01-01

    Full Text Available Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets, which often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been shown to be more efficient than the traditional SVM in processing large-scale data. In addition, a soft-margin method is proposed to adjust the separation margin so that it remains effective in the gray zone, where the distribution of the data is hard to describe and the separation margin between classes is unclear.
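
    The smoothing idea the SSVM is built on can be sketched as follows (an illustration under standard assumptions: only the smooth plus-function p(x, a) = x + (1/a)·ln(1 + e^(-ax)) that replaces the hinge term max(x, 0) is shown; the function names are ours, not the paper's):

    ```python
    import math

    # Smooth approximation of the plus-function used in the SSVM objective
    # (sketch under standard assumptions; not the paper's full formulation).
    def plus(x):
        """Hinge plus-function max(x, 0)."""
        return max(x, 0.0)

    def smooth_plus(x, a=5.0):
        """p(x, a) = x + (1/a) * ln(1 + exp(-a*x)); tends to plus(x) as a grows."""
        return x + math.log(1.0 + math.exp(-a * x)) / a
    ```

    For a large smoothing parameter `a` the approximation error vanishes, which is what makes the SSVM objective twice differentiable and amenable to fast Newton-type solvers.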

  19. Radiometric analyzer with plural radiation sources and detectors

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring characteristics of a material by radiation comprises a plurality of radiation source/detector systems, equal in number to the number of elements in the molecule of the material, and a linear calibration circuit having the inverse response characteristics (calibration curves) of the respective detector systems, whereby the measurement is carried out by the four fundamental arithmetic operations on the mutual outputs of the detector systems obtained through the linear calibration circuit. One typical embodiment is a radiometric analyzer for hydrocarbons which measures the density of heavy oil, the sulfur content, and the calorific value with three detector systems comprising a gamma-ray source (Eγ > 50 keV), a soft X-ray source (Ex ≈ 20 keV), and a neutron source. 2 claims, 6 figures

  20. Analyzing Oil Futures with a Dynamic Nelson-Siegel Model

    DEFF Research Database (Denmark)

    Hansen, Niels Strange; Lunde, Asger

    In this paper we are interested in the term structure of futures contracts on oil. The objective is to specify a relatively parsimonious model which explains the data well and performs well in real-time out-of-sample forecasting. The dynamic Nelson-Siegel model is normally used to analyze and forecast interest rates of different maturities. The structure of oil futures resembles the structure of interest rates, and this motivates the use of the model for our purposes. The data set is vast, and the dynamic Nelson-Siegel model allows for a significant dimension reduction by introducing three factors.
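
    The three-factor dimension reduction can be sketched via the standard Nelson-Siegel loadings (a generic sketch; `lam` is the decay parameter, and the function names are ours, not the paper's):

    ```python
    import math

    # Standard (dynamic) Nelson-Siegel loadings: level, slope, and curvature
    # factors evaluated at maturity tau with decay parameter lam.
    def ns_loadings(tau, lam):
        x = lam * tau
        slope = (1.0 - math.exp(-x)) / x
        return 1.0, slope, slope - math.exp(-x)

    # Fitted value of the curve at maturity tau given the factor vector beta.
    def ns_yield(tau, beta, lam):
        level, slope, curv = ns_loadings(tau, lam)
        return beta[0] * level + beta[1] * slope + beta[2] * curv
    ```

    The slope loading tends to 1 at short maturities and 0 at long ones, so beta[0] pins down the long end while beta[1] and beta[2] shape the short and middle segments; the same three numbers can summarize a whole strip of contract maturities.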

  1. Formulation of Generic Simulation Models for Analyzing Construction Claims

    Directory of Open Access Journals (Sweden)

    Rifat Rustom

    2012-11-01

    Full Text Available While there are several techniques for analyzing the impact of claims on time schedules and productivity, very few are considered adequate and comprehensive enough to account for risks and uncertainties. A generic approach to claims analysis using simulation is proposed. The formulation of the generic methodology presented in this paper depends on three simulation models: an As-Planned Model (APM), an As-Built Model (ABM), and a What-Would-Have-Been Model (WWHBM). The proposed generic methodology provides a good basis for a more elaborate approach to analyzing claims and their impacts on project time and productivity, utilizing discrete event simulation. The approach allows for scenario analysis to account for disputed events and workflow disruptions. The proposed models will assist claimants in presenting their cases effectively and professionally.

  2. Analyzing the financial crisis using the entropy density function

    Science.gov (United States)

    Oh, Gabjin; Kim, Ho-yong; Ahn, Seok-Won; Kwak, Wooseop

    2015-02-01

    The risk created by nonlinear interactions among subjects in economic systems is assumed to increase during abnormal states of a financial market. Nevertheless, research on systemic risk in financial markets following the global financial crisis remains insufficient. In this paper, we analyze the entropy density function of the return time series for several financial markets, such as the S&P500, KOSPI, and DAX indices, from October 2002 to December 2011, and analyze the variability of the entropy value over time. We find that the entropy density function of the S&P500 index during the subprime crisis exhibits a significant decrease compared to that in other periods, whereas the other markets, such as those in Germany and Korea, exhibit no significant decrease during the market crisis. These findings demonstrate that the S&P500 index generated a regular pattern in the return time series during the financial crisis.
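
    A minimal sketch of an entropy estimate for a return series (our simplified histogram-based estimator, not necessarily the authors' exact entropy density construction):

    ```python
    import math

    # Shannon entropy (nats) of a return series from a normalized histogram.
    # Lower entropy indicates a more regular return pattern, the signature
    # reported for the S&P500 during the crisis period.
    def shannon_entropy(returns, n_bins=10):
        lo, hi = min(returns), max(returns)
        width = (hi - lo) / n_bins or 1.0   # guard against a constant series
        counts = [0] * n_bins
        for r in returns:
            counts[min(int((r - lo) / width), n_bins - 1)] += 1
        n = len(returns)
        return -sum(c / n * math.log(c / n) for c in counts if c)
    ```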

  3. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Nino/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
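
    The dissimilarity measure named above can be sketched directly (a minimal implementation of the square root of the Jensen-Shannon divergence for discrete distributions; natural-log units assumed):

    ```python
    import math

    # Kullback-Leibler divergence for discrete distributions (0*log 0 := 0).
    def kl(p, q):
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Square root of the Jensen-Shannon divergence, a true metric between
    # probability distributions, used to quantify network-state changes.
    def js_distance(p, q):
        m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
        return math.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))
    ```

    It is symmetric, zero only for identical distributions, and bounded above by sqrt(ln 2) for distributions with disjoint support.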

  4. Analyzing and Driving Cluster Formation in Atomistic Simulations.

    Science.gov (United States)

    Tribello, Gareth A; Giberti, Federico; Sosso, Gabriele C; Salvalaglio, Matteo; Parrinello, Michele

    2017-03-14

    In this paper a set of computational tools for identifying the phases contained in a system composed of atoms or molecules is introduced. The method is rooted in graph theory and combines atom-centered symmetry functions, adjacency matrices, and clustering algorithms to identify regions of space where the properties of the system's constituents can be considered uniform. We show how this method can be used to define collective variables and how these collective variables can be used to enhance the sampling of nucleation events. We then show how the method can be used to analyze simulations of crystal nucleation and growth, applying it to simulations of the nucleation of the molecular crystal urea and of nucleation in a semiconducting alloy. The semiconducting alloy example is particularly challenging, as multiple nucleation centers are formed. We show, however, that our algorithm is able to detect the grain boundaries in the resulting polycrystal.
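
    One building block of such a pipeline can be sketched as follows (our illustration, not the authors' code: once an adjacency matrix marks which atoms count as connected, clusters fall out as the connected components of that graph):

    ```python
    # Connected components of an adjacency matrix via depth-first search;
    # in the cluster-analysis setting each component is one nucleus/grain.
    def find_clusters(adj):
        n = len(adj)
        seen, clusters = set(), []
        for start in range(n):
            if start in seen:
                continue
            seen.add(start)
            component, stack = [], [start]
            while stack:
                i = stack.pop()
                component.append(i)
                for j in range(n):
                    if adj[i][j] and j not in seen:
                        seen.add(j)
                        stack.append(j)
            clusters.append(sorted(component))
        return clusters
    ```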

  5. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Nino/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  6. Analyzing Resiliency of the Smart Grid Communication Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-08-01

    Smart grids are susceptible to cyber-attack as a result of the new communication, control, and computation techniques employed in the grid. In this paper, we characterize and analyze the resiliency of a smart grid communication architecture, specifically an RF-mesh-based architecture, under cyber attacks. We analyze the resiliency of the communication architecture by studying the performance of high-level smart grid functions, such as metering and demand response, which depend on communication. Disrupting the operation of these functions impacts the operational resiliency of the smart grid. Our analysis shows that an attacker needs to compromise only a small fraction of the meters to undermine the communication resiliency of the smart grid. We discuss the implications of our result for critical smart grid functions and for the overall security of the smart grid.

  7. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  8. Monitoring machining conditions by analyzing cutting force vibration

    Energy Technology Data Exchange (ETDEWEB)

    Piao, Chun Guang; Kim, Ju Wan; Kim, Jin Oh; Shin, Yoan [Soongsil University, Seoul (Korea, Republic of)

    2015-09-15

    This paper deals with an experimental technique for monitoring machining conditions by analyzing cutting-force vibration measured at a milling machine. This technique is based on the relationship of the cutting-force vibrations with the feed rate and cutting depth as reported earlier. The measurement system consists of dynamic force transducers and a signal amplifier. The analysis system includes an oscilloscope and a computer with a LabVIEW program. Experiments were carried out at various feed rates and cutting depths, while the rotating speed was kept constant. The magnitude of the cutting force vibration component corresponding to the number of cutting edges multiplied by the frequency of rotation was linearly correlated with the machining conditions. When one condition of machining is known, another condition can be identified by analyzing the cutting-force vibration.

  9. Analyzing the BBOB results by means of benchmarking concepts.

    Science.gov (United States)

    Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C

    2015-01-01

    We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first is: which algorithm is the "best" one? The second is: which algorithm should I use for my real-world problem? The two are connected, and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step toward answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework, and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background, and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.
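
    A naive consensus aggregation, of exactly the kind whose pitfalls the paper examines, can be sketched as a Borda-style rank sum (illustrative only; this is not the consensus method the paper recommends):

    ```python
    # Borda-style consensus: sum each algorithm's position across per-problem
    # rankings (0 = best) and order the algorithms by the total.
    def borda_consensus(rankings):
        totals = {}
        for ranking in rankings:
            for position, algorithm in enumerate(ranking):
                totals[algorithm] = totals.get(algorithm, 0) + position
        return sorted(totals, key=lambda algorithm: totals[algorithm])
    ```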

  10. Methods for Analyzing Multivariate Phenotypes in Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Qiong Yang

    2012-01-01

    Full Text Available Multivariate phenotypes are frequently encountered in genetic association studies. The purpose of analyzing multivariate phenotypes usually includes the discovery of novel genetic variants with pleiotropic effects, that is, affecting multiple phenotypes, with the ultimate goal of uncovering the underlying genetic mechanism. In recent years, there has been new method development as well as application of existing statistical methods to such phenotypes. In this paper, we provide a review of the available methods for analyzing association between a single marker and a multivariate phenotype consisting of the same type of components (e.g., all continuous or all categorical) or different types of components (e.g., some continuous and others categorical). We also review causal inference methods designed to test whether a detected association with the multivariate phenotype reflects true pleiotropy or whether the genetic marker exerts its effects on some phenotypes through affecting the others.

  11. Data Auditor: Analyzing Data Quality Using Pattern Tableaux

    Science.gov (United States)

    Srivastava, Divesh

    Monitoring databases maintain configuration and measurement tables about computer systems, such as networks and computing clusters, and serve important business functions, such as troubleshooting customer problems, analyzing equipment failures, planning system upgrades, etc. These databases are prone to many data quality issues: configuration tables may be incorrect due to data entry errors, while measurement tables may be affected by incorrect, missing, duplicate and delayed polls. We describe Data Auditor, a tool for analyzing data quality and exploring data semantics of monitoring databases. Given a user-supplied constraint, such as a boolean predicate expected to be satisfied by every tuple, a functional dependency, or an inclusion dependency, Data Auditor computes "pattern tableaux", which are concise summaries of subsets of the data that satisfy or fail the constraint. We discuss the architecture of Data Auditor, including the supported types of constraints and the tableau generation mechanism. We also show the utility of our approach on an operational network monitoring database.
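
    The core computation can be sketched in miniature (hypothetical field names; a real pattern tableau generalizes this to multi-attribute patterns with support and confidence thresholds):

    ```python
    # For a user-supplied row predicate, compute per-pattern confidence: the
    # fraction of rows matching each pattern value that satisfy the predicate.
    def pattern_confidence(rows, group_key, predicate):
        stats = {}
        for row in rows:
            total, ok = stats.get(row[group_key], (0, 0))
            stats[row[group_key]] = (total + 1, ok + bool(predicate(row)))
        return {key: ok / total for key, (total, ok) in stats.items()}
    ```

    Patterns with confidence well below 1 are the "concise summaries of subsets that fail the constraint" a tableau surfaces.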

  12. Analyzing Bullwhip Effect in Supply Networks under Exogenous Uncertainty

    Directory of Open Access Journals (Sweden)

    Mitra Darvish

    2014-05-01

    Full Text Available This paper presents a model for analyzing and measuring the propagation of order amplification (i.e., the bullwhip effect) in a single-product supply network topology, considering exogenous uncertainty and linear, time-invariant inventory management policies for the network entities. The stream of orders placed by each entity of the network is characterized under the assumption that customer demand is ergodic. In fact, we propose an exact formula for measuring the bullwhip effect in the addressed supply network topology, considering the system in a Markov chain framework and presenting a matrix of network member relationships and the relevant order sequences. The formula is derived using frequency-domain analysis. The major contribution of this paper is analyzing the bullwhip effect under exogenous uncertainty in supply networks, using the Fourier transform to simplify the relevant calculations. We present a number of numerical examples to assess the accuracy of the analytical results in quantifying the bullwhip effect.
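
    The quantity being measured can be sketched with the textbook definition (a sketch only; the paper's exact formula additionally accounts for the network topology and the exogenous uncertainty):

    ```python
    # Bullwhip ratio: variance of orders placed upstream divided by variance
    # of customer demand; a value above 1 signals amplification.
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    def bullwhip_ratio(orders, demand):
        return variance(orders) / variance(demand)
    ```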

  13. Time asymmetry: Polarization and analyzing power in the nuclear reactions

    Energy Technology Data Exchange (ETDEWEB)

    Rioux, C.; Roy, R.; Slobodrian, R.J. (Laval Univ., Quebec City (Canada). Lab. de Physique Nucleaire); Conzett, H.E. (California Univ., Berkeley (USA). Lawrence Berkeley Lab.)

    1983-02-28

    Measurements of the proton polarization in the reactions ⁷Li(³He,p⃗)⁹Be and ⁹Be(³He,p⃗)¹¹B, and of the analyzing powers of the inverse reactions, initiated by polarized protons at the same c.m. energies, show significant differences which imply the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction ²H(³He,p⃗)⁴He and its inverse have also been investigated and show some smaller differences. A discussion of the instrumental asymmetries is presented.

  14. Monitoring machining conditions by analyzing cutting force vibration

    International Nuclear Information System (INIS)

    Piao, Chun Guang; Kim, Ju Wan; Kim, Jin Oh; Shin, Yoan

    2015-01-01

    This paper deals with an experimental technique for monitoring machining conditions by analyzing cutting-force vibration measured at a milling machine. This technique is based on the relationship of the cutting-force vibrations with the feed rate and cutting depth as reported earlier. The measurement system consists of dynamic force transducers and a signal amplifier. The analysis system includes an oscilloscope and a computer with a LabVIEW program. Experiments were carried out at various feed rates and cutting depths, while the rotating speed was kept constant. The magnitude of the cutting force vibration component corresponding to the number of cutting edges multiplied by the frequency of rotation was linearly correlated with the machining conditions. When one condition of machining is known, another condition can be identified by analyzing the cutting-force vibration.

  15. [Integrated Development of Full-automatic Fluorescence Analyzer].

    Science.gov (United States)

    Zhang, Mei; Lin, Zhibo; Yuan, Peng; Yao, Zhifeng; Hu, Yueming

    2015-10-01

    In view of the fact that medical inspection equipment sold on the domestic market is mainly imported from abroad and very expensive, we developed the fully automatic fluorescence analyzer presented in this paper. The paper introduces in detail the hardware architecture, consisting of an FPGA/DSP motion control card, a PC, and an STM32 embedded microprocessor unit; the software system, based on multithreaded C#; and the design and implementation of communication between the two units. By simplifying the hardware structure, selecting hardware appropriately, and adopting object-oriented techniques in the control system software, we improved the precision and speed of the control system significantly. Finally, performance tests showed that the control system meets the needs of an automated fluorescence analyzer in functionality, performance, and cost.

  16. ANALYZER OF QUANTITY AND QUALITY OF THE ELECTRIC POWER

    Directory of Open Access Journals (Sweden)

    A. I. Semilyak

    2013-01-01

    Full Text Available One of the activities of the research center for "Energy Saving Technologies and Smart Metering in Electrical Power Engineering" is research on electronic devices and intelligent power distribution systems produced by Analog Devices and equipped with accurate energy-consumption measurement. The article focuses on the development of an analyzer of the quantity and quality of electric energy. The main part of the analyzer is the Analog Devices ADE7878 metering IC, designed for use in commercial and industrial smart electricity meters. Such meters measure the amount of consumed or produced electric energy with high accuracy and provide the means for remote meter reading.

  17. Analyzing force concept inventory with item response theory

    Science.gov (United States)

    Wang, Jing; Bao, Lei

    2010-10-01

    Item response theory is a popular assessment method used in education. It rests on the assumption of a probability framework that relates students' innate ability and their performance on test questions. Item response theory transforms students' raw test scores into a scaled proficiency score, which can be used to compare results obtained with different test questions. The scaled score also addresses the issues of ceiling effects and guessing, which commonly exist in quantitative assessment. We used item response theory to analyze the force concept inventory (FCI). Our results show that item response theory can be useful for analyzing physics concept surveys such as the FCI and produces results about the individual questions and student performance that are beyond the capability of classical statistics. The theory yields detailed measurement parameters regarding the difficulty, discrimination features, and probability of correct guess for each of the FCI questions.
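
    The probability framework underlying such an analysis can be sketched with the standard three-parameter logistic (3PL) model (a generic IRT sketch, not necessarily the exact parameterization the authors fitted):

    ```python
    import math

    # 3PL item response function: probability that a student of ability theta
    # answers correctly an item with discrimination a, difficulty b, and
    # guessing parameter c (the floor for very low-ability students).
    def p_correct(theta, a, b, c):
        return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))
    ```

    The guessing floor c is what lets IRT separate lucky guesses from genuine understanding on multiple-choice items such as the FCI's.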

  18. Dehydration process of fish analyzed by neutron beam imaging

    International Nuclear Information System (INIS)

    Tanoi, K.; Hamada, Y.; Seyama, S.; Saito, T.; Iikura, H.; Nakanishi, T.M.

    2009-01-01

    Since control of the water content of dried fish is an important factor in its quality, the water-loss process during drying of squid and Japanese horse mackerel was analyzed by neutron beam imaging. The neutron images showed that around the shoulder of the mackerel there was a region where the water content tended to remain high during drying. To analyze the water-loss process in more detail, spatial images were produced. The images clearly indicated that the decrease in water content was limited around the shoulder. This suggests that preventing deterioration around the shoulder of the dried fish is an important factor in keeping the quality of the dried fish during storage.

  19. Photoelectric equipment type MFS-7 for analyzing oils

    International Nuclear Information System (INIS)

    Orlova, S.A.; Fridman, M.G.; Kholosha, T.V.; Ezhoda, G.D.; Nechitailov, V.V.

    1987-01-01

    The authors describe the type MFS-7 equipment, which is intended for analyzing used oils for the wear products of motors. The difference between the type MFS-7 and its predecessors lies in the application of computer techniques to control the equipment and process the output data, and in the design of the sample container, which allows two methods of introducing the sample into the discharge. The photoelectric equipment consists of an excitation spectrum source IVS-28, providing an AC arc mode and a low-voltage spark, a polychromator, a special sample holder for analyzing liquid samples, an electronic recording apparatus with a digital voltmeter type ERU-18, and a control computer system Spectr 2.2 based on a minicomputer with its own printer. The type MFS-7 equipment has been tested and put into mass production.

  20. Textbooks in the EFL classroom: Defining, assessing and analyzing

    Directory of Open Access Journals (Sweden)

    Radić-Bojanić Biljana B.

    2016-01-01

    Full Text Available The aim of this paper is to define textbooks, analyze their advantages and disadvantages, and explicate the process of textbook selection and the reasons for analyzing textbooks. The paper describes two reasons for performing a textbook analysis, evaluating for potential and evaluating for suitability, and further states various processes of textbook content analysis, including the analysis of the stated aims and objectives, learner needs, abilities, and preferences, as well as the establishment of criteria in relation to previously set objectives. The paper concludes by stating that the task teachers face when selecting and evaluating textbooks is not an easy one, but it is crucial. With the assistance of clear guidelines and detailed criteria, they should be able to make an informed decision and choose the textbook most suitable for the requirements of their specific classroom context.

  1. New ultra small battery operated portable multi-channel analyzer

    International Nuclear Information System (INIS)

    Wolf, M.A.; Umbarger, C.J.

    1979-01-01

    A newly designed portable multichannel analyzer (MCA) has been developed at Los Alamos that has much improved physical and performance characteristics over previous designs. The instrument is very compact (25 cm wide x 14 cm deep x 21 cm high) and has a mass of 4.2 kg (9.2 lb). The device has 1024 channels and is microprocessor controlled. The instrument has most of the standard features of present laboratory-based pulse-height analyzers, including a CRT display, region-of-interest integration, etc. Battery life of the MCA is nearly eight hours, with full charging overnight. An accessory case carries a small audio cassette recorder for data storage. The case also contains two different NaI(Tl) detectors.

  2. What Hold us Together? Analyzing Biotech Field Formation

    Directory of Open Access Journals (Sweden)

    Jackeline Amantino de Andrade

    2011-03-01

    Full Text Available This article analyzes the formation of the biotechnology field through the lens of actor-network theory. Building on the conclusions of studies developed by Walter Powell and colleagues, a study was carried out to analyze the diversity of institutional relations activated by hemophilia therapies; the principle of generalized symmetry adopted by actor-network theory is highlighted to identify how socio-technical associations are assembled. Beyond the interorganizational relations, the research findings indicate that scientific and technological contents play a significant mediating role in creating and sustaining those connections of knowledge. Thus, the need for a broader theoretical discussion is emphasized in order to enlarge explanations of the dynamics of organizational fields as well as of innovation processes.

  3. A New ABCD Technique to Analyze Business Models & Concepts

    OpenAIRE

    Aithal P. S.; Shailasri V. T.; Suresh Kumar P. M.

    2015-01-01

    Various techniques are used to analyze individual characteristics or organizational effectiveness, such as SWOT analysis, SWOC analysis, and PEST analysis. These techniques provide an easy and systematic way of identifying the various issues affecting a system and provide an opportunity for further development. Whereas they offer a broad-based assessment of individual institutions and systems, they suffer limitations when applied in a business context. The success of any business model depends on ...

  4. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite.

  5. SEAscan 3.5: A simulator performance analyzer

    International Nuclear Information System (INIS)

    Dennis, T.; Eisenmann, S.

    1990-01-01

    SEAscan 3.5 is a personal-computer-based tool developed to analyze the dynamic performance of nuclear power plant training simulators. The system has integrated features to support its own human-oriented performance analysis. In this paper, the program is described as a tool for the analysis of training-simulator performance. The structure and operating characteristics of SEAscan 3.5 are described. The hardcopy documents are shown to aid in verification of conformance to ANSI/ANS-3.5-1985.

  6. Multi-channel amplitude analyzer CMA-1 and CMA-2

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1977-01-01

    The CMA analyzer is implemented in the CAMAC system. A single crate contains the required modules and is controlled by a PDP-11/10 minicomputer with an 8k 16-bit-word memory. Spectra can be accumulated in full 4k, 2k, 1k, or 0.5k. The system provides display of stored data over the full memory, very accurate representation of any part of it (44 channels) on an alphanumeric display, and readout of the data by paper tape punch or printer. (author)

  7. A fully integrated standalone portable cavity ringdown breath acetone analyzer

    Science.gov (United States)

    Sun, Meixiu; Jiang, Chenyu; Gong, Zhiyong; Zhao, Xiaomeng; Chen, Zhuying; Wang, Zhennan; Kang, Meiling; Li, Yingxin; Wang, Chuji

    2015-09-01

    Breath analysis is a promising new technique for nonintrusive disease diagnosis and metabolic status monitoring. One challenging issue in using a breath biomarker for screening for a particular disease is finding a quantitative relationship between the concentration of the breath biomarker and the clinical diagnostic parameters of the specific disease. To address this issue, we need a new instrument capable of conducting real-time, online breath analysis with high data throughput, so that a large-scale clinical test (more subjects) can be completed in a short period of time. In this work, we report a fully integrated, standalone, portable analyzer based on the cavity ringdown spectroscopy technique for near-real-time, online breath acetone measurements. The performance of the portable analyzer in measurements of breath acetone was interrogated and validated against certified gas chromatography-mass spectrometry. The results show that this new analyzer is useful for reliable online breath acetone analysis (online introduction of a breath sample without pre-treatment) with high sensitivity (57 ppb) and high data throughput (one data point per second). Subsequently, the validated breath analyzer was employed for acetone measurements in 119 human subjects under various situations. The instrument design, packaging, specifications, and future improvements are also described. From an optical ringdown cavity operated by the previously reported lab-built electronics to this fully integrated standalone instrument, we have enabled a new scientific tool suited for large-scale breath acetone analysis and created an instrument platform that can be adapted for the study of other breath biomarkers by using different lasers and ringdown mirrors covering the corresponding spectral fingerprints.
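
    The measurement principle behind such an analyzer can be sketched with the basic ringdown relation (a textbook sketch with assumed variable names, not the instrument's actual calibration):

    ```python
    # Basic cavity ringdown relation: the absorption coefficient follows from
    # the difference of decay rates with (tau) and without (tau0) the absorber,
    # scaled by cavity length over absorber path length and the speed of light.
    C_LIGHT = 2.99792458e8  # speed of light, m/s

    def absorption_coefficient(tau, tau0, cavity_length, absorber_length):
        return (cavity_length / (C_LIGHT * absorber_length)) * (1.0 / tau - 1.0 / tau0)
    ```

    Concentration then follows from the absorption coefficient via the known line strength of the spectral band being probed.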

  8. Wavelength encoding technique for particle analyses in hematology analyzer

    Science.gov (United States)

    Rongeat, Nelly; Brunel, Patrick; Gineys, Jean-Philippe; Cremien, Didier; Couderc, Vincent; Nérin, Philippe

    2011-07-01

    The aim of this study is to combine multiple excitation wavelengths in order to improve the accuracy of fluorescence characterization of labeled cells. The experimental demonstration is carried out with a hematology analyzer based on flow cytometry and a CW laser source emitting two visible wavelengths. A specific optical encoding associated with each wavelength allows fluorescence from individual fluorochromes to be identified while avoiding the use of a noisy compensation method.

  9. Emission pathway modeling to analyze national ambition levels of decarbonization

    International Nuclear Information System (INIS)

    Kainuma, Mikiko; Waisman, Henri

    2015-01-01

    The Deep Decarbonization Pathways Project (DDPP) is a knowledge network comprising 15 Country Research Teams and several Partner Organizations that develop and share methods, assumptions, and findings related to deep decarbonization. It analyzes the technical decarbonization potential, exploring options for deep decarbonization while also taking existing infrastructure stocks into account. Based on bottom-up analyses by the 15 Country Research Teams, it shows the possibility of reducing total CO2-energy emissions by 45% by 2050.

  10. Development of a Multichannel Analyzer for modular ADCs

    International Nuclear Information System (INIS)

    Bannos Rodriguez, U.; Diaz Castro, M.; Rivero Ramirez, D.

    2013-01-01

    This paper describes the design and construction of a multichannel analyzer with a USB interface, built around the ADC module of the Fast ComTec 7074. A PIC18F4550 microcontroller from Microchip, an ispLSI 1032E CPLD from Lattice, and a 32Kx8 SRAM memory were used. The paper also includes details of the programming circuitry and of the development of the software for acquisition, storage and display of the spectra using the Qt libraries. Finally, preliminary tests of the device are shown. (Author)

  11. Analyzing China Smart Water Meter Industry Cluster Competitiveness

    OpenAIRE

    Chan, Parker

    2013-01-01

    Sustainable development is a top issue nowadays, and smart water management is one method of achieving it. This paper focuses on analyzing the competitiveness of industrial clusters (Guangzhou, Ningbo and Shanghai) in China, specifically in the smart water meter industry. It is part of the CEMIS sourcing work package under the KVTELIOS project with Mr. Al Natsheh Anas, and is supervised by Ms. Komulainen Ruey. Porter Diamond Theory is used ...

  12. A framework for analyzing performance in higher education

    OpenAIRE

    Lola C. Duque

    2013-01-01

    Drawing on Tinto’s dropout intentions model (1975), Bean’s socialization model (1985), Astin’s involvement theory (1999), and the service marketing literature, this research presents a conceptual framework for analyzing students’ satisfaction, perceived learning outcomes, and dropout intentions. This framework allows for a better understanding of how students assess the university experience and how these perceptions affect future intentions. This article presents four studies ...

  13. A Fuzzy Logic System to Analyze a Student's Lifestyle

    OpenAIRE

    Ghosh, Sourish; Boob, Aaditya Sanjay; Nikhil, Nishant; Vysyaraju, Nayan Raju; Kumar, Ankit

    2016-01-01

    A college student's life can be primarily categorized into domains such as education, health, social life and other activities, which may include daily chores and travelling time. Time management is crucial for every student. A self-realisation of one's daily time expenditure in the various domains is therefore essential to maximize one's effective output. This paper presents how a mobile application using Fuzzy Logic and the Global Positioning System (GPS) analyzes a student's lifestyle and provides recom...

  14. Analyzing and synthesizing phylogenies using tree alignment graphs.

    Directory of Open Access Journals (Sweden)

    Stephen A Smith

    Full Text Available Phylogenetic trees are used to analyze and visualize evolution. However, trees can be imperfect datatypes when summarizing multiple trees. This is especially problematic when accommodating biological phenomena such as horizontal gene transfer, incomplete lineage sorting, and hybridization, as well as topological conflict between datasets. Additionally, researchers may want to combine information from sets of trees that have partially overlapping taxon sets. To address the problem of analyzing sets of trees with conflicting relationships and partially overlapping taxon sets, we introduce methods for aligning, synthesizing and analyzing rooted phylogenetic trees within a graph, called a tree alignment graph (TAG). The TAG can be queried and analyzed to explore uncertainty and conflict. It can also be synthesized to construct trees, presenting an alternative to supertree approaches. We demonstrate these methods with two empirical datasets. In order to explore uncertainty, we constructed a TAG of the bootstrap trees from the Angiosperm Tree of Life project. Analysis of the resulting graph demonstrates that areas of the dataset that are unresolved in majority-rule consensus tree analyses can be understood in more detail within the context of a graph structure, using measures incorporating node degree and adjacency support. As an exercise in synthesis (i.e., summarization of a TAG constructed from the alignment trees), we also construct a TAG consisting of the taxonomy and source trees from a recent comprehensive bird study. We synthesized this graph into a tree that can be reconstructed in a repeatable fashion and where the underlying source information can be updated. The methods presented here are tractable for large-scale analyses and serve as a basis for an alternative to consensus tree and supertree methods. Furthermore, the exploration of these graphs can expose structures and patterns within the dataset that are otherwise difficult to observe.

  15. Analyzing and synthesizing phylogenies using tree alignment graphs.

    Science.gov (United States)

    Smith, Stephen A; Brown, Joseph W; Hinchliff, Cody E

    2013-01-01

    Phylogenetic trees are used to analyze and visualize evolution. However, trees can be imperfect datatypes when summarizing multiple trees. This is especially problematic when accommodating biological phenomena such as horizontal gene transfer, incomplete lineage sorting, and hybridization, as well as topological conflict between datasets. Additionally, researchers may want to combine information from sets of trees that have partially overlapping taxon sets. To address the problem of analyzing sets of trees with conflicting relationships and partially overlapping taxon sets, we introduce methods for aligning, synthesizing and analyzing rooted phylogenetic trees within a graph, called a tree alignment graph (TAG). The TAG can be queried and analyzed to explore uncertainty and conflict. It can also be synthesized to construct trees, presenting an alternative to supertree approaches. We demonstrate these methods with two empirical datasets. In order to explore uncertainty, we constructed a TAG of the bootstrap trees from the Angiosperm Tree of Life project. Analysis of the resulting graph demonstrates that areas of the dataset that are unresolved in majority-rule consensus tree analyses can be understood in more detail within the context of a graph structure, using measures incorporating node degree and adjacency support. As an exercise in synthesis (i.e., summarization of a TAG constructed from the alignment trees), we also construct a TAG consisting of the taxonomy and source trees from a recent comprehensive bird study. We synthesized this graph into a tree that can be reconstructed in a repeatable fashion and where the underlying source information can be updated. The methods presented here are tractable for large-scale analyses and serve as a basis for an alternative to consensus tree and supertree methods. Furthermore, the exploration of these graphs can expose structures and patterns within the dataset that are otherwise difficult to observe.
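The TAG construction described above can be caricatured in a few lines: if each rooted tree node is identified by its set of descendant tips, nodes from trees with partially overlapping taxon sets can be merged into one graph whose edges record which input trees support them. A toy sketch; the nested-tuple tree encoding and the merge-by-tip-set rule are simplifications for illustration, not the paper's actual algorithm:

```python
def tree_to_edges(tree):
    """Yield (parent_tipset, child_tipset) pairs for a nested-tuple rooted tree."""
    def tips(node):
        if isinstance(node, tuple):
            return frozenset().union(*(tips(c) for c in node))
        return frozenset([node])
    def walk(node):
        if isinstance(node, tuple):
            parent = tips(node)
            for child in node:
                yield parent, tips(child)
                yield from walk(child)
    return walk(tree)

def build_tag(trees):
    """Toy tree alignment graph: nodes are taxon sets, edges merged across trees."""
    edges = {}
    for i, t in enumerate(trees):
        for parent, child in tree_to_edges(t):
            edges.setdefault((parent, child), set()).add(i)
    return edges

t1 = (("A", "B"), "C")
t2 = (("A", "B"), "D")   # partially overlapping taxon set
tag = build_tag([t1, t2])
# The shared clade {A, B} is supported by both input trees.
support = tag[(frozenset("AB"), frozenset("A"))]
print(len(tag), sorted(support))
```

Querying the edge-to-tree map is the toy analogue of the "adjacency support" measure the abstract mentions.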

  16. Resolution of VISION, a crystal-analyzer spectrometer

    International Nuclear Information System (INIS)

    Seeger, Philip A.; Daemen, Luke L.; Larese, John Z.

    2009-01-01

    We present both analytic and Monte Carlo calculations of the resolution of VISION, which is a crystal-analyzer spectrometer based on the TOSCA design. The analyzer crystal in VISION is configured to focus in time, radial, and transverse directions ('triple focused'). Previously published analytical results have two serious flaws in the handling of the statistics, which gave misleading results. First, Gaussian distributions were assumed for all resolution components, so that full-width-half-maximum could be used. Not only is this a very poor approximation for most terms, it is also completely unnecessary because standard deviations can be combined in quadrature for any shape distribution (except Lorentzian). The second flaw was the choice of variables that are not independent, so that significant correlations were ignored. An example of the effect of including correlations is that the mosaic spread of the analyzer crystals does not contribute to the resolution in first order. Monte Carlo simulation is not limited to first order, and we find a mild optimum value for mosaic spread. A complete set of six independent variables is: neutron emission time, incident flight-path variation (due to moderator tilt), sample thickness, mean path in the analyzer (due to multiple reflections), sample-to-detector radial distance, and detector thickness. We treat separately the resolution contributions from histogramming and rebinning during data acquisition and reduction, and describe a scheme for VISION that minimizes the effect on resolution. We compare the contributions of the six variables to the total resolution, both analytically and by Monte Carlo simulations of a complete VISION model using the Neutron Instrument Simulation Package (NISP).
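The abstract's statistical point, that standard deviations of independent contributions combine in quadrature regardless of distribution shape, can be shown directly; the six component values below are hypothetical, not VISION's actual figures:

```python
import math

def combined_sigma(components):
    """Combine independent resolution contributions in quadrature.

    Valid for any distribution shape with finite variance (not Lorentzian),
    since variances of independent variables add.
    """
    return math.sqrt(sum(s * s for s in components))

# Hypothetical timing contributions (microseconds) for the six independent
# variables named in the abstract: emission time, incident flight path,
# sample thickness, analyzer path, sample-detector distance, detector thickness.
sigmas = [1.2, 0.8, 0.5, 0.9, 0.4, 0.6]
print(f"total resolution width: {combined_sigma(sigmas):.3f} us")
```

This is exactly why the abstract notes that full-width-half-maximum and Gaussian assumptions are unnecessary: only the variances enter.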

  17. X-ray chemical analyzer for field applications

    International Nuclear Information System (INIS)

    Gamba, O.O.M.

    1977-01-01

    A self-supporting portable field multichannel x-ray chemical analyzer system is claimed. It comprises a lightweight, flexibly connected, remotely locatable, radioisotope-excited sensing probe utilizing a cryogenically-cooled solid state semi-conductor crystal detector for fast in situ non-destructive, qualitative and quantitative analysis of elements in solid, powder, liquid or slurried form, utilizing an x-ray energy dispersive spectrometry technique

  18. R Package clickstream: Analyzing Clickstream Data with Markov Chains

    Directory of Open Access Journals (Sweden)

    Michael Scholz

    2016-10-01

    Full Text Available Clickstream analysis is a useful tool for investigating consumer behavior, market research and software testing. I present the clickstream package which provides functionality for reading, clustering, analyzing and writing clickstreams in R. The package allows for a modeling of lists of clickstreams as zero-, first- and higher-order Markov chains. I illustrate the application of clickstream for a list of representative clickstreams from an online store.
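A first-order Markov model of the kind the package fits can be sketched in a few lines; the sessions below are invented, and this is not the clickstream package's implementation:

```python
from collections import defaultdict

def fit_first_order(clickstreams):
    """Estimate first-order Markov transition probabilities from clickstreams."""
    counts = defaultdict(lambda: defaultdict(int))
    for stream in clickstreams:
        for a, b in zip(stream, stream[1:]):
            counts[a][b] += 1
    return {
        a: {b: n / sum(nexts.values()) for b, n in nexts.items()}
        for a, nexts in counts.items()
    }

# Hypothetical sessions from an online store.
sessions = [
    ["home", "product", "cart", "checkout"],
    ["home", "product", "home", "product", "cart"],
    ["home", "search", "product", "cart", "checkout"],
]
P = fit_first_order(sessions)
print(P["product"])
```

Higher-order chains replace the single predecessor state with a tuple of the last k pages, at the cost of many more parameters.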

  19. Analyze of the Measuring Performance for Artificially Business Intelligent Systems

    OpenAIRE

    Vatuiu, Teodora

    2007-01-01

    This paper analyzes the measurement of performance for artificially intelligent business systems. Thousands of person-years have been devoted to research and development in the various aspects of artificially intelligent systems. Much progress has been attained. However, there has been no means of evaluating the progress of the field. How can we assess the current state of the science? Most business intelligent systems are beginning to be deployed commercially. How can a commercial buyer ...

  20. Analyzing the User Behavior toward Electronic Commerce Stimuli

    OpenAIRE

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this res...

  1. Analyzing the user behavior towards Electronic Commerce stimuli

    OpenAIRE

    Carlota Lorenzo-Romero; María-del-Carmen Alarcón-del-Amo

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this resea...

  2. Analyzing Equity Capital Programs of Banks for Cooperatives

    OpenAIRE

    Ismail Ahmad; Ken D. Duft; Ron C. Mittelhammer

    1986-01-01

    Characteristics of the Banks for Cooperatives' term loan and equity capital programs contribute to complex intermittent exchanges of positive and negative cash flows between the cooperative lender and borrower, and complicate the analysis of the net present value and effective interest rate of the financing project. A multiperiod linear program was developed to analyze the effect of variations in equity capital program components on the present value of the financing project. Furthermore, the concep...
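The analysis described above rests on net-present-value discounting of the intermittent cash-flow exchanges; a minimal sketch with hypothetical figures, not the paper's linear program:

```python
def npv(cashflows, rate):
    """Net present value of periodic cash flows; cashflows[0] is at time zero."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical lender-borrower exchange: an outlay now, then mixed
# positive and negative flows (the "intermittent exchanges" in the abstract).
flows = [-1000.0, 420.0, -80.0, 450.0, 500.0]
print(f"NPV at 8%: {npv(flows, 0.08):.2f}")
```

The effective interest rate of the project is the rate at which this NPV crosses zero, which a root finder or the linear program's dual values would supply.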

  3. A fully integrated standalone portable cavity ringdown breath acetone analyzer.

    Science.gov (United States)

    Sun, Meixiu; Jiang, Chenyu; Gong, Zhiyong; Zhao, Xiaomeng; Chen, Zhuying; Wang, Zhennan; Kang, Meiling; Li, Yingxin; Wang, Chuji

    2015-09-01

    Breath analysis is a promising new technique for nonintrusive disease diagnosis and metabolic status monitoring. One challenging issue in using a breath biomarker for disease screening is finding a quantitative relationship between the concentration of the breath biomarker and the clinical diagnostic parameters of the specific disease. Addressing this issue requires a new instrument capable of real-time, online breath analysis with high data throughput, so that a large-scale clinical test (more subjects) can be completed in a short period of time. In this work, we report a fully integrated, standalone, portable analyzer based on the cavity ringdown spectroscopy technique for near-real-time, online breath acetone measurements. The performance of the portable analyzer in measurements of breath acetone was interrogated and validated against certified gas chromatography-mass spectrometry. The results show that this new analyzer is suitable for reliable online (introduction of a breath sample without pre-treatment) breath acetone analysis with high sensitivity (57 ppb) and high data throughput (one data point per second). Subsequently, the validated breath analyzer was employed for acetone measurements in 119 human subjects under various conditions. The instrument design, packaging, specifications, and future improvements are also described. From an optical ringdown cavity operated by the lab-built electronics reported previously to this fully integrated standalone instrument, we have enabled a new scientific tool suited for large-scale breath acetone analysis and created an instrument platform that can be adapted to study other breath biomarkers by using different lasers and ringdown mirrors covering the corresponding spectral fingerprints.
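In cavity ringdown spectroscopy, the absorber concentration follows from the ringdown times measured with and without the sample, alpha = (1/c)(1/tau - 1/tau0); a sketch with hypothetical numbers (the cross-section and ringdown times below are illustrative, not values from the paper):

```python
C_LIGHT = 2.99792458e10  # speed of light, cm/s

def absorption_coefficient(tau, tau0):
    """Absorber-induced loss (cm^-1) from ringdown times (s):
    alpha = (1/c) * (1/tau - 1/tau0)."""
    return (1.0 / C_LIGHT) * (1.0 / tau - 1.0 / tau0)

def number_density(tau, tau0, sigma):
    """Number density (cm^-3) given an absorption cross-section sigma (cm^2)."""
    return absorption_coefficient(tau, tau0) / sigma

# Hypothetical measurements: empty-cavity ringdown vs. ringdown with breath.
tau0 = 40e-6     # s
tau = 39.2e-6    # s
sigma = 4.5e-19  # assumed effective cross-section, cm^2
print(f"n = {number_density(tau, tau0, sigma):.3e} cm^-3")
```

Because only the decay times enter, the method is insensitive to laser intensity fluctuations, which is part of why a compact, portable implementation is feasible.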

  4. Analyzing E-mail communication of prospective learners

    Directory of Open Access Journals (Sweden)

    Anurag SAXENA

    2005-10-01

    Full Text Available Vibha JOSHI and Anurag SAXENA, School of Management, IGNOU, New Delhi, India. Today, the computer has significantly changed all means of communication. E-mail is the most popular means of communication through computers; it has erased the boundaries between cities, countries and continents. Earlier studies that used this medium showed evidence of a higher quality of responses and also significant cost savings (mainly the convenience of dispatch for research purposes). In this communication, an attempt has been made to utilize the e-mail responses of prospective learners interested in education-discipline programs offered through the open learning system. The inception of this study is linked to the creation of the website (http://www.ignou.ac.in/) and the availability of information on all active and prospective programs of study, the various schools of study, faculty e-mail addresses, and faculty vis-à-vis program coordinators in the University. The present study analyzed 65 e-mail responses received from July 2002 to date by the researcher on her e-mail ID. These respondents obtained the e-mail ID from the University website and, in their eagerness to attain qualifications in the field of education/guidance/educational management, sent an e-mail to the program coordinator(s). These e-mail responses were analyzed in order to visualize the learner profiles and the viability of ongoing as well as prospective programs of study. This paper analyzes the e-mail responses of people who, either by surfing IGNOU's website or from other sources, came to know about the various programs of study offered by the School of Education. The study validates that there is an amount of hidden information even in the curiosity of the learners.

  5. RELIABILITY ASSESSMENTS OF INFANT INCUBATOR AND THE ANALYZER

    OpenAIRE

    Özdemirci, Emre; Özarslan Yatak, Meral; Duran, Fecir; Canal, Mehmet Rahmi

    2014-01-01

    Approximately 80% of newborns in Turkey are placed in neonatal incubators because of medical complications. Incubators used for treatment may seriously harm a baby's health if they adjust or measure the parameters incorrectly. In this study, complications arising from inaccurate adjustment and measurement of incubator parameters were investigated. Current infant incubator analyzers were surveyed and their deficiencies were evaluated considering the standards and clin...

  6. Dependencies in event trees analyzed by Petri nets

    International Nuclear Information System (INIS)

    Nývlt, Ondřej; Rausand, Marvin

    2012-01-01

    This paper discusses how non-marked Petri nets can be used to model and analyze event trees where the pivotal (branching) events are dependent and modeled by fault trees. The dependencies may, for example, be caused by shared utilities, shared components, or general common cause failures that are modeled by beta-factor models. These dependencies are cumbersome to take into account when using standard event-/fault tree modeling techniques, and may lead to significant errors in the calculated end-state probabilities of the event tree if they are not properly analyzed. A new approach is proposed in this paper, where the whole event tree is modeled by a non-marked Petri net and where P-invariants, representing the structural properties of the Petri net, are used to obtain the frequency of each end-state of the event tree with dependencies. The new approach is applied to a real example of an event tree analysis of the Strahov highway tunnel in Prague, Czech Republic, including two types of dependencies (shared Programmable Logic Controllers and Common Cause Failures). - Highlights: ► In this paper, we model and analyze event trees (ET) using Petri nets. ► The pivotal events of the modeled event trees are dependent (e.g., shared PLCs, CCF). ► A new method based on P-invariants to obtain probabilities of end states is proposed. ► The method is demonstrated in the case study of the Strahov tunnel in the Czech Republic.
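The beta-factor dependency the paper handles with P-invariants can be illustrated numerically for a pair of redundant barriers: ignoring the common-cause fraction understates the probability that both fail. A first-order sketch (the probabilities are hypothetical, and this is not the paper's Petri-net method):

```python
def dual_barrier_failure(p, beta):
    """P(both redundant barriers fail) under the beta-factor model.

    A fraction beta of each barrier's failure probability p is a common
    cause that takes out both barriers at once; the independent remainders
    must coincide. First-order approximation (overlap terms neglected).
    """
    return beta * p + ((1.0 - beta) * p) ** 2

p = 0.01      # hypothetical per-barrier failure probability
beta = 0.1    # hypothetical common-cause fraction
print("independent model:", dual_barrier_failure(p, 0.0))
print("beta-factor model:", dual_barrier_failure(p, beta))
```

With these numbers the common-cause term dominates by an order of magnitude, which is exactly the error an event-tree analysis makes if it treats the branches as independent.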

  7. Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization

    Science.gov (United States)

    Daglis, I.; Balasis, G.; Bourdarie, S.; Horne, R.; Khotyaintsev, Y.; Mann, I.; Santolik, O.; Turner, D.; Anastasiadis, A.; Georgiou, M.; Giannakis, O.; Papadimitriou, C.; Ropokis, G.; Sandberg, I.; Angelopoulos, V.; Glauert, S.; Grison, B.; Kersten, T.; Kolmasova, I.; Lazaro, D.; Mella, M.; Ozeke, L.; Usanova, M.

    2013-09-01

    We present the concept, objectives and expected impact of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project, which is being implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employs multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energization and loss. Particular attention is paid to the role of ULF/VLF waves. A database containing properties of the waves is being created and will be made available to the scientific community. Based on the wave database, a statistical model of the wave activity, dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region, will be developed. Multi-spacecraft particle measurements will be incorporated into data assimilation tools, leading to a new understanding of the causal relationships between ULF/VLF waves and radiation belt dynamics. Data assimilation techniques have proven to be a valuable tool in the field of radiation belts, able to guide 'the best' estimate of the state of a complex system. The MAARBLE collaborative research project has received funding from the European Union's Seventh Framework Programme (FP7-SPACE-2011-1) under grant agreement no. 284520.

  8. Performance evaluation of Samsung LABGEO(HC10) Hematology Analyzer.

    Science.gov (United States)

    Park, Il Joong; Ahn, Sunhyun; Kim, Young In; Kang, Seon Joo; Cho, Sung Ran

    2014-08-01

    The Samsung LABGEO(HC10) Hematology Analyzer (LABGEO(HC10)) is a recently developed automated hematology analyzer that uses impedance technologies. The analyzer provides 18 parameters including a 3-part differential at a maximum rate of 80 samples per hour. The aim of this study was to evaluate the performance of the LABGEO(HC10). We evaluated precision, linearity, carryover, and the relationship for complete blood cell count parameters between the LABGEO(HC10) and the LH780 (Beckman Coulter Inc) in a university hospital in Korea according to the Clinical and Laboratory Standards Institute guidelines. Sample stability and differences due to the anticoagulant used (K₂EDTA versus K₃EDTA) were also evaluated. The LABGEO(HC10) showed linearity over a wide range and minimal carryover, and it correlated well with the LH780 (r > 0.92) except for mean corpuscular hemoglobin concentration. The estimated bias was acceptable for all parameters investigated except for monocyte count. Most parameters were stable for up to 24 hours both at room temperature and at 4°C. The difference by anticoagulant type was statistically insignificant for all parameters except a few red cell parameters. The achievable accuracy and simplicity of operation make the unit recommendable for small to medium-sized laboratories.
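Carryover of the kind evaluated here is commonly estimated by running three high samples followed by three low samples; a sketch of that calculation (the counts are invented, and the exact protocol and acceptance limits used in the study may differ):

```python
def carryover_percent(high, low):
    """Carryover estimate from three high then three low replicates:
    (L1 - L3) / (H3 - L3) * 100.

    A common protocol: the first low replicate (L1) is inflated by residue
    from the high run; the third (L3) is taken as carryover-free.
    """
    return (low[0] - low[2]) / (high[2] - low[2]) * 100.0

# Hypothetical WBC counts (10^3/uL), not data from the study.
high = [98.5, 98.9, 98.7]
low = [2.41, 2.38, 2.37]
print(f"carryover: {carryover_percent(high, low):.3f}%")
```

A result well under 1% is typically considered negligible for cell counts, though each laboratory sets its own acceptance criterion.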

  9. Free neutron-proton analyzing power at medium energies

    International Nuclear Information System (INIS)

    Newsom, C.R.

    1980-01-01

    In recent years, increasing efforts have been made to measure the nucleon-nucleon polarization parameters. To date, no free neutron-proton spin-correlation parameters have been published in the energy range 500 to 800 MeV. Existing analyzing-power data are of low precision and in most cases were obtained by quasi-free proton scattering. As a first step in determining the neutron-proton scattering matrix, the free neutron-proton analyzing power has been measured at the Los Alamos Meson Physics Facility as a function of energy and angle. The experiment was performed by scattering a neutron beam from a polarized proton target. The neutron beam was generated by scattering 800 MeV protons from a beryllium target and using the neutrons produced at 0 degrees. The incident energy ranged from 300 MeV to 800 MeV. The energy spread of the neutron beam made it possible to measure the analyzing power at different energies simultaneously. Angular distributions were taken from 60 to 170 degrees in the center-of-mass system (c.m.)
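An analyzing power of this kind is extracted from the left-right asymmetry of scattering off the polarized target, A_y = (1/P_t)(L - R)/(L + R); a simplified one-arm sketch with hypothetical counts (the actual experiment's normalizations and corrections are more involved):

```python
import math

def analyzing_power(n_left, n_right, target_polarization):
    """Analyzing power and its counting error from a left-right asymmetry.

    A_y = (1/P_t) * (L - R) / (L + R); the error follows from Poisson
    statistics on the counts (binomial approximation).
    """
    L, R = float(n_left), float(n_right)
    eps = (L - R) / (L + R)
    d_eps = 2.0 * math.sqrt(L * R / (L + R) ** 3)
    return eps / target_polarization, d_eps / target_polarization

# Hypothetical detector counts and target polarization.
A, dA = analyzing_power(12500, 9800, 0.75)
print(f"A_y = {A:.4f} +/- {dA:.4f}")
```

Dividing by the target polarization is what turns a raw asymmetry into the physics observable, so the polarization measurement directly limits the systematic error.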

  10. Analyzing the Investment Behaviour of Households at the Microlevel

    Directory of Open Access Journals (Sweden)

    Lepeyko Tetyana I.

    2017-10-01

    Full Text Available The article is aimed at analyzing the characteristics of the investment behaviour of households at the microlevel. The essence of the investment behaviour of households was considered, substantiating that it differs in relation to the social indicators of households as well as to their income and costs. In order to analyze the investment behaviour of households at the microlevel, it was proposed to conduct an expert survey of economic agents (staff of enterprises, private entrepreneurs, etc.). Using the hierarchy analysis method, it has been substantiated that the most appropriate method for the selected criteria (minimizing the time and cost of the survey, improving the truthfulness and completeness of the responses) is the anonymous respondent survey. To implement this method, a list of questions was proposed that would allow analysis of the social indicators of households and the structure of their incomes and costs. On the basis of the survey conducted at enterprises of the Kharkiv region, the main prerequisites for a possible improvement of the microlevel investment behaviour of households have been identified.
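The hierarchy analysis method (AHP) used above to rank survey methods derives priority weights from a pairwise-comparison matrix; a sketch using the geometric-mean approximation (the comparison values below are invented, not the article's data):

```python
import math

def ahp_priorities(M):
    """Priority weights from a pairwise-comparison matrix
    (geometric-mean approximation of the principal eigenvector)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical comparison of three survey methods against the article's
# criteria (time, cost, truthfulness of responses):
# anonymous survey vs. face-to-face interview vs. expert panel.
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
print([round(w, 3) for w in ahp_priorities(M)])
```

The largest weight goes to the anonymous survey, mirroring the article's conclusion; a full AHP analysis would also check the consistency ratio of the matrix.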

  11. ANALYZING CONSUMER BEHAVIOR IN BANKING SECTOR OF KOSOVO

    Directory of Open Access Journals (Sweden)

    Vjosa Fejza

    2017-12-01

    Full Text Available Considering the importance of understanding, analyzing and studying consumer behavior and behavior models, it was deemed necessary to conduct research on this issue. As part of this research, consumer behavior models in the banking system of Kosovo were studied and analyzed. The first part of the study is characterized by a review of the various literature, publications and scientific journals related to understanding the role and importance of consumer behavior in enterprises. The second part of the study includes a survey questionnaire with a sample of 500 individual clients, randomly selected from commercial banks in Kosovo. The survey was conducted to collect data for determining the behavior models of existing consumers in the banking sector and to analyze the various internal and external factors which influence such behaviors. Finally, the data obtained from the questionnaire surveys were used to draw conclusions on the issues central to this research and to issue recommendations which may be useful to commercial banks currently operating in Kosovo, as well as other financial institutions interested in this field.

  12. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Science.gov (United States)

    Bass, Adam; Geddes, Colin; Wright, Bruce; Coderre, Sylvain; Rikers, Remy; McLaughlin, Kevin

    2013-01-01

    Background: Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on the diagnostic performance of nephrologists and nephrology residents. Methods: We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results: After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of their final diagnoses (31.1% vs. 65.6%). Residents benefited more from scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.07), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.20). Discussion: Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or to changes in knowledge structure with experience. PMID:26451203

  13. A Systematic Method to Analyze Force Majeure in Construction Claims

    Directory of Open Access Journals (Sweden)

    Saud Alshammari

    2017-12-01

    Full Text Available In construction delay claims, force majeure is normally recognized as an excusable risk that entitles contractors only to time extensions, but neither of the contracting parties is entitled to monetary compensation to recover delay damages. However, there are instances where contractors are entitled to both time and cost compensations, as evidenced by some court cases relating to force majeure claims. Such instances involve attributing the occurrence of the force majeure to the effect of other prior delay events that pushed project performance into the period of the force majeure. Existing delay analysis methods are not capable of addressing this issue, as none take the impact of other delays into consideration when analyzing force majeure claims. Stimulated by this gap, this study proposes an improved and sound method for fairly analyzing the responsibility of force majeure delay claims amongst contracting parties. This method was implemented on a case project to help demonstrate its application and also ascertain its practicability. The contribution of this paper is twofold. First, it has highlighted the situation of force majeure delay that can be compensable, creating more awareness among researchers and industry practitioners. The second is a proposed systematic process to appropriately analyze its responsibility, which equitably addresses claims from such delays with little or no chance of dispute ensuing.

  14. Design and Construction of an Autonomous Low-Cost Pulse Height Analyzer and a Single Channel Analyzer for Moessbauer Spectroscopy

    International Nuclear Information System (INIS)

    Velasquez, A.A.; Trujillo, J.M.; Morales, A.L.; Tobon, J.E.; Gancedo, J.R.; Reyes, L.

    2005-01-01

    A multichannel analyzer (MCA) and a single-channel analyzer (SCA) for Moessbauer spectrometry applications have been designed and built. Both systems include low-cost digital and analog components. A microcontroller manages, in either PHA or MCS mode, the data acquisition, data storage, and setting of the pulse-discriminator limits. The user can monitor the system from an external PC through the serial port using the RS232 communication protocol. A graphical interface made with the LabVIEW software allows the user to digitally adjust the lower and upper limits of the pulse discriminator, and to visualize as well as save the PHA spectra in a file. The system has been tested using a 57Co radioactive source and several iron compounds, yielding satisfactory results. The low cost of its design, construction and maintenance makes this equipment an attractive choice when assembling a Moessbauer spectrometer
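PHA mode, as managed by the microcontroller described above, amounts to binning pulse amplitudes into channels while discarding pulses outside the discriminator window; a software sketch (the channel count and window values are illustrative, not the instrument's settings):

```python
def pulse_height_histogram(samples, n_channels=1024, lld=0.05, uld=0.95, vmax=1.0):
    """Software model of PHA mode: bin pulse amplitudes (volts) into channels,
    discarding pulses outside the discriminator window [lld, uld]."""
    spectrum = [0] * n_channels
    for v in samples:
        if lld <= v <= uld:
            ch = min(int(v / vmax * n_channels), n_channels - 1)
            spectrum[ch] += 1
    return spectrum

# Hypothetical pulse amplitudes; the 0.02 V pulse falls below the lower
# discriminator and the 0.97 V pulse above the upper one.
pulses = [0.02, 0.10, 0.11, 0.50, 0.51, 0.52, 0.97]
spec = pulse_height_histogram(pulses, n_channels=100)
print(sum(spec), "pulses accepted")
```

MCS mode, by contrast, histograms accepted counts against time (one bin per dwell interval) rather than against amplitude.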

  15. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of cell mechanisms using different technologies, in order to explain the relationships among genes, molecular processes, and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated to be an effective instrument in clinical practice. Consequently, different kinds of microarrays may be used in a single experiment, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge for emerging data analysis platforms is the ability to treat, in a combined way, those different microarray formats coupled with clinical data. The resulting integrated data may include both numerical and symbolic molecular data (e.g. gene expression and SNPs) as well as temporal clinical data (e.g. the response to a drug, time to progression, and survival rate). Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus, novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays, which were not supported in μ-CS. Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  16. Development of turbine cycle performance analyzer using intelligent data mining

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Gyun Young

    2004-02-15

    In recent years, performance enhancement of the turbine cycle in nuclear power plants has been highlighted because of the worldwide deregulation environment; the first target of operating plants has become the reduction of operating cost to compete with other power plants. It is known that the overhaul interval is closely related to operating cost. Through field investigation, the author identified that rapid and reliable performance tests, analysis, and diagnosis play an important role in the control of the overhaul interval. First, a technical road map was proposed to clearly set up the objectives. The controversial issues were summarized into data gathering, analysis tools, and diagnosis methods, and the author proposed an integrated solution on the basis of intelligent data mining techniques. For reliable data gathering, a state analyzer composed of statistical regression, wavelet analysis, and a neural network was developed. The role of the state analyzer is to estimate unmeasured data and to increase the reliability of the collected data. For advanced performance analysis, a performance analysis toolbox was developed. This tool makes the analysis process easier and more accurate by providing three novel heat balance diagrams, and it includes the state analyzer and a turbine cycle simulation code. In the diagnosis module, a probabilistic technique based on a Bayesian network model and a deterministic technique based on an algebraic model are provided together; this combination handles the uncertainty in the diagnosis process while retaining pin-point capability. All the modules were validated with simulated as well as actual test data, and some are used in industrial applications. Much remains to be improved in the turbine cycle in order to increase plant availability; this study was accomplished to draw attention to the importance of the turbine cycle and to propose solutions on the basis of academic as well as industrial needs.

  17. Development of turbine cycle performance analyzer using intelligent data mining

    International Nuclear Information System (INIS)

    Heo, Gyun Young

    2004-02-01

    In recent years, performance enhancement of the turbine cycle in nuclear power plants has been highlighted because of the worldwide deregulation environment; the first target of operating plants has become the reduction of operating cost to compete with other power plants. It is known that the overhaul interval is closely related to operating cost. Through field investigation, the author identified that rapid and reliable performance tests, analysis, and diagnosis play an important role in the control of the overhaul interval. First, a technical road map was proposed to clearly set up the objectives. The controversial issues were summarized into data gathering, analysis tools, and diagnosis methods, and the author proposed an integrated solution on the basis of intelligent data mining techniques. For reliable data gathering, a state analyzer composed of statistical regression, wavelet analysis, and a neural network was developed. The role of the state analyzer is to estimate unmeasured data and to increase the reliability of the collected data. For advanced performance analysis, a performance analysis toolbox was developed. This tool makes the analysis process easier and more accurate by providing three novel heat balance diagrams, and it includes the state analyzer and a turbine cycle simulation code. In the diagnosis module, a probabilistic technique based on a Bayesian network model and a deterministic technique based on an algebraic model are provided together; this combination handles the uncertainty in the diagnosis process while retaining pin-point capability. All the modules were validated with simulated as well as actual test data, and some are used in industrial applications. Much remains to be improved in the turbine cycle in order to increase plant availability; this study was accomplished to draw attention to the importance of the turbine cycle and to propose solutions on the basis of academic as well as industrial needs.
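
The "state analyzer" above estimates unmeasured quantities from correlated measurements. Its regression leg can be sketched in miniature — the variable names and data below are hypothetical, not taken from the thesis:

```python
# Estimate an unmeasured quantity from a correlated, measured one via
# ordinary least squares (closed-form slope and intercept).
def fit_ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical historical test data: feedwater flow vs. electrical power.
flow = [100.0, 110.0, 120.0, 130.0, 140.0]
power = [500.0, 552.0, 601.0, 649.0, 702.0]
slope, intercept = fit_ols(flow, power)

# Estimate the (unmeasured) power at a flow of 125 units.
estimate = slope * 125.0 + intercept
assert 620.0 < estimate < 630.0
```

The actual state analyzer layers wavelet denoising and a neural network on top of this regression idea.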

  18. A coastal surface seawater analyzer for nitrogenous nutrient mapping

    Science.gov (United States)

    Masserini, Robert T.; Fanning, Kent A.; Hendrix, Steven A.; Kleiman, Brittany M.

    2017-11-01

    Satellite-data-based modeling of chlorophyll indicates that ocean waters in the mesotrophic category are responsible for the majority of oceanic net primary productivity. Coastal waters frequently have surface chlorophyll values in the mesotrophic range, strong horizontal chlorophyll gradients, and large temporal variations. Programs of detailed coastal nutrient surveys are therefore essential to the study of the dynamics of oceanic net primary productivity, along with land-use impacts on estuarine and coastal ecosystems. The degree of variability in these regions necessitates flexible instrumentation capable of near-real-time analysis to detect and monitor analytes of interest. This work describes the development of a portable coastal surface seawater analyzer for nutrient mapping that can simultaneously elucidate, with high resolution, the distributions of nitrate, nitrite, and ammonium - the three principal nitrogenous inorganic nutrients in coastal systems. The approach focuses on the use of pulsed xenon flash lamps to construct an analyzer that can be adapted to any automated chemistry with fluorescence detection. The system has two heaters, on-the-fly standardization, on-board data logging, an independent 24 V direct-current power supply, an internal local operating network, a 12-channel peristaltic pump, four rotary injection/selection valves, and an intuitive graphical user interface. Using the methodology of Masserini and Fanning (2000), the detection limits for ammonium, nitrite, and nitrate plus nitrite were 11, 10, and 22 nM, respectively. A field test of the analyzer in Gulf of Mexico coastal waters demonstrated its ability to monitor and delineate the complexity of inorganic nitrogen nutrient enrichments within a coastal system.

  19. MIR hollow waveguide (HWG) isotope ratio analyzer for environmental applications

    Science.gov (United States)

    Wang, Zhenyou; Zhuang, Yan; Deev, Andrei; Wu, Sheng

    2017-05-01

    An advanced commercial Mid-InfraRed Isotope Ratio (IR2) analyzer based on a hollow waveguide (HWG) sample tube was developed at Arrow Grand Technologies. The stable carbon isotope ratio, δ13C, is obtained by measuring selected CO2 absorption peaks in the MIR. Combined with a GC and a combustor, the analyzer has been successfully employed to measure compound-specific δ13C isotope ratios in the field. By using both the 1-pass HWG and the 5-pass HWG, we are able to measure the δ13C isotope ratio over a broad CO2 concentration range of 300 ppm-37,500 ppm. Here, we demonstrate its applications in environmental studies. The δ13C isotope ratio and concentration of CO2 exhaled by soil samples were measured in real time with the isotope analyzer; the concentration was found to change with time. We also convert Dissolved Inorganic Carbon (DIC) into CO2 and then measure the δ13C isotope ratio with an accuracy of better than 0.3 ‰ (1 σ), a 6 min test time, and 1 ml sample usage. Tap water, NaHCO3 solution, cola, and even beer were tested. Lastly, the 13C isotope ratio of CO2 exhaled by human subjects was obtained <10 seconds after simply blowing the exhaled CO2 into a tube, with an accuracy of 0.5‰ (1 σ) and without sample preconditioning. In summary, a commercial HWG isotope analyzer was demonstrated to be able to perform environmental and health studies with high accuracy (0.3 ‰/Hz^(1/2), 1 σ), a fast sampling rate (up to 10 Hz), low sample consumption (1 ml), and a broad CO2 concentration range (300 ppm-37,500 ppm).
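
The reported δ13C values follow the standard delta notation, δ13C = (R_sample/R_VPDB − 1) × 1000‰. A minimal sketch of that conversion (the VPDB ratio below is the conventional reference value; nothing here comes from the analyzer itself):

```python
R_VPDB = 0.0112372  # conventional 13C/12C ratio of the VPDB standard

def delta13C(r_sample: float) -> float:
    """Per-mil deviation of a sample's 13C/12C ratio from VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A sample whose ratio equals the standard is 0 permil by definition.
assert abs(delta13C(R_VPDB)) < 1e-12
# A ratio 1% above VPDB corresponds to +10 permil.
assert abs(delta13C(R_VPDB * 1.01) - 10.0) < 1e-9
```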

  20. An R package for analyzing and modeling ranking data.

    Science.gov (United States)

    Lee, Paul H; Yu, Philip L H

    2013-05-14

    In medical informatics, psychology, market research, and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty's and Koczkodaj's inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) for the computerization of clinical practice. The mean rank showed that item 4 was the most preferred item and item 3 the least preferred, and a significant difference was found between physicians' preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance: the first can be interpreted as the overall preference for the seven items (labeled "internal/external"), and the second as their overall variance (labeled "push/pull factors"). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman's footrule distance. In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying a thought
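
Two of the descriptive statistics pmr provides — mean rank and pairwise frequencies — are simple to compute. A Python sketch on a toy dataset (pmr itself is an R package, and the rankings below are invented, not the Hong Kong physician data):

```python
from itertools import combinations

# Toy rankings of items 1..4: rankings[k][i] is the rank that respondent k
# gave to item i+1 (1 = most preferred). Hypothetical data.
rankings = [
    [1, 2, 3, 4],
    [2, 1, 4, 3],
    [1, 3, 2, 4],
]
n_items = 4

# Mean rank of each item (lower = more preferred overall).
mean_rank = [sum(r[i] for r in rankings) / len(rankings) for i in range(n_items)]

# Pairwise frequency: how often item i is ranked above (better than) item j.
above = {(i, j): sum(1 for r in rankings if r[i] < r[j])
         for i, j in combinations(range(n_items), 2)}

assert mean_rank[0] == min(mean_rank)   # item 1 is most preferred here
assert above[(0, 3)] == 3               # item 1 beats item 4 in every ranking
```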

  1. A Portable, Field-Deployable Analyzer for Isotopic Water Measurements

    Science.gov (United States)

    Berman, E. S.; Gupta, M.; Huang, Y. W.; Lacelle, D.; McKay, C. P.; Fortson, S.

    2015-12-01

    Water stable isotopes have for many years been used to study the hydrological cycle, catchment hydrology, and polar climate, among other applications. Typically, discrete water samples are collected and transported to a laboratory for isotope analysis. Due to the expense and labor associated with such sampling, isotope studies have generally been limited in scope and time resolution. Field sampling of water isotopes has been shown in recent years to provide dense data sets, with the increased time resolution illuminating substantially greater short-term variability than is generally observed during discrete sampling. A truly portable instrument also opens the possibility of using the instrument as a tool for identifying which water samples would be particularly interesting for further laboratory investigation. To make such field measurements of liquid water isotopes possible, Los Gatos Research has developed a miniaturized, field-deployable liquid water isotope analyzer. The prototype miniature liquid water isotope analyzer (mini-LWIA) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology in a rugged, Pelican case housing for easy transport and field operations. The analyzer simultaneously measures both δ2H and δ18O from liquid water, with both manual and automatic water introduction options. The laboratory precision for δ2H is 0.6 ‰, and for δ18O is 0.3 ‰. The mini-LWIA was deployed in the high Arctic during the summer of 2015 at Inuvik in the Canadian Northwest Territories. Samples were collected from Sachs Harbor, on the southwest coast of Banks Island, including buried basal ice from the Laurentide Ice Sheet, ice wedges, and other types of ground ice. Methodology and water analysis results from this extreme field deployment will be presented.

  2. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Directory of Open Access Journals (Sweden)

    Adam Bass

    2013-03-01

    Background: Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on the diagnostic performance of nephrologists and nephrology residents. Methods: We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results: After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of their final diagnoses (31.1% vs. 65.6%, p < 0.001, and 40.0% vs. 70.0%, p < 0.001, respectively). We found a significant interaction between experience and analytic processing strategy (p = 0.002): nephrology residents had significantly increased odds of diagnostic success when using scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.007), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.2). Discussion: Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or to changes in knowledge structure with experience.
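
The odds ratios quoted above summarize 2×2 tables of reasoning strategy versus diagnostic success. A minimal sketch of how such an odds ratio and its Wald confidence interval are computed — the counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI.
    Rows: strategy A vs. strategy B; columns: correct vs. incorrect."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 21/30 correct with strategy A, 12/30 with strategy B.
or_, lo, hi = odds_ratio_ci(21, 9, 12, 18)
assert abs(or_ - 3.5) < 1e-9     # (21*18)/(9*12) = 3.5
assert lo < or_ < hi
```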

  3. Analyzing Motives, Preferences, and Experiences in Video Game Play

    Directory of Open Access Journals (Sweden)

    Donald Loffredo

    2017-04-01

    This paper presents the results of analyzing motives, preferences, and experiences in video game play. A sample of 112 students (64 male and 48 female) completed the Gaming Attitudes, Motives, and Experiences Scales (GAMES) online. Separate one-way independent-measures multivariate analyses of variance (MANOVAs) were used to determine whether there were statistically significant differences by gender, age category, hours of video game play, and ethnicity on the nine factor subscales of the GAMES. The results supported two of the proposed hypotheses: there were statistically significant differences by gender and by hours of video game play on some of the factor subscales of the GAMES.

  4. SIMULAND - A CODE TO ANALYZE DYNAMIC EFFECTS DURING LANDING

    Directory of Open Access Journals (Sweden)

    Marcel STERE

    2010-03-01

    The landing gear of an aircraft is part of the aircraft structure. Landing is the most critical phase of the flight mission, and the landing gear is the component most likely to cause problems in the aircraft design. Landing gear design combines the best of mechanical, structural, and hydraulic design. The designed landing gear should be able to meet the specifications and requirements imposed by CS-23. SIMULAND-01 is a program intended to analyze a reduced model (4-30 DoF) of the aircraft under transient dynamic loads during the landing phase (touchdown).
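
A reduced landing model of the kind SIMULAND-01 analyzes can be caricatured with a single degree of freedom. The sketch below integrates an aircraft mass on a spring-damper strut through touchdown; all parameters are illustrative and not taken from the code, and gravity and lift are omitted, so this is the free decay about static equilibrium:

```python
# 1-DoF touchdown: mass on a linear spring-damper strut, arriving with a
# given sink rate. Illustrative parameters, not from SIMULAND-01.
m, k, c = 5000.0, 2.0e5, 2.0e4   # mass [kg], stiffness [N/m], damping [N s/m]
x, v = 0.0, -3.0                 # strut deflection [m] and sink rate [m/s]
dt = 1e-4
peak_load = 0.0
for _ in range(30000):           # 3 s of semi-implicit Euler integration
    a = -(k * x + c * v) / m     # strut force divided by mass
    v += a * dt
    x += v * dt
    peak_load = max(peak_load, abs(k * x + c * v))

assert abs(v) < 0.1 and abs(x) < 0.01   # touchdown transient has decayed
assert peak_load > 50000.0              # peak strut load during the impact
```

A full reduced model would replace this single mass with 4-30 coupled DoF and a nonlinear oleo strut, but the time-marching structure is the same.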

  5. Designing experiments and analyzing data a model comparison perspective

    CERN Document Server

    Maxwell, Scott E

    2013-01-01

    Through this book's unique model comparison approach, students and researchers are introduced to a set of fundamental principles for analyzing data. After seeing how these principles can be applied in simple designs, students are shown how these same principles also apply in more complicated designs. Drs. Maxwell and Delaney believe that the model comparison approach better prepares students to understand the logic behind a general strategy of data analysis appropriate for various designs; and builds a stronger foundation, which allows for the introduction of more complex topics omitt
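
The model comparison idea — judging a parameter by how much error it removes, relative to the error the full model leaves over — can be shown in a few lines. The data here are made up for illustration:

```python
# Compare a restricted model (intercept only) against a full model
# (intercept + slope) via the reduction in error sum of squares.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

# Restricted model: predict the mean for every case.
ybar = sum(y) / n
sse_r = sum((yi - ybar) ** 2 for yi in y)

# Full model: ordinary least-squares line.
xbar = sum(x) / n
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
    sum((xi - xbar) ** 2 for xi in x)
a = ybar - b * xbar
sse_f = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# F = (drop in SSE per extra parameter) / (full-model error per df)
F = ((sse_r - sse_f) / 1) / (sse_f / (n - 2))
assert sse_f < sse_r and F > 10.0   # the slope clearly earns its parameter
```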

  6. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze...... programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  7. AmAMorph: Finite State Morphological Analyzer for Amazighe

    OpenAIRE

    Fatima Zahra Nejme; Siham Boulaknadel; Driss Aboutajdine

    2016-01-01

    This paper presents AmAMorph, a morphological analyzer for the Amazighe language built with the NooJ linguistic development environment. The paper begins with the development of Amazighe lexicons with large-coverage formalization. The electronic lexicons, named 'NAmLex', 'VAmLex' and 'PAmLex' ('Noun Amazighe Lexicon', 'Verb Amazighe Lexicon' and 'Particles Amazighe Lexicon'), link inflectional, morphological, and syntactic-semantic information to the list of lemmas...
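
The lexicon-plus-rules architecture can be illustrated with a toy analyzer: a small lemma lexicon and suffix rules that map a surface form to lemma, category, and features. This is English-like toy data for illustration only — not NooJ, and not the Amazighe lexicons:

```python
# Toy lexicon + suffix-rule analyzer (illustrative, not NAmLex/VAmLex).
LEXICON = {"walk": "V", "book": "N"}
SUFFIXES = [("ed", {"tense": "past"}), ("s", {"num": "pl"}), ("", {})]

def analyze(surface):
    """Return every (lemma, category, features) reading of a surface form."""
    readings = []
    for suf, feats in SUFFIXES:
        stem = surface[: len(surface) - len(suf)] if suf else surface
        if surface.endswith(suf) and stem in LEXICON:
            readings.append((stem, LEXICON[stem], feats))
    return readings

assert ("walk", "V", {"tense": "past"}) in analyze("walked")
assert ("book", "N", {"num": "pl"}) in analyze("books")
```

A real finite-state analyzer compiles the lexicon and rules into a single transducer rather than looping over suffixes, but the lexicon-rule factorization is the same.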

  8. Multi-faceted data gathering and analyzing system

    International Nuclear Information System (INIS)

    Gustavson, D.B.; Rich, K.

    1977-10-01

    A low-cost general-purpose data gathering and analyzing system based on a microprocessor, an interface to CAMAC, and a phone link to a time-sharing system was implemented. The parts cost for the microprocessor system was about $6000. The microprocessor buffers the data so that the variable response of the time-sharing system is acceptable for real-time data acquisition. Once this buffering problem is solved, the full power and flexibility of the time-sharing system excels at the task of on-line data analysis. 4 figures

  9. Modeling and Analyzing Transient Military Air Traffic Control

    Science.gov (United States)

    2010-12-01

    arrive and be serviced. In general, for n flights, the number of ways that flights can enter and leave the ATC is given by the nth Catalan number ...
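
The Catalan numbers mentioned in the fragment above have the closed form C(n) = binom(2n, n)/(n + 1); a quick sketch:

```python
from math import comb

def catalan(n: int) -> int:
    """nth Catalan number: C(n) = binom(2n, n) // (n + 1)."""
    return comb(2 * n, n) // (n + 1)

# e.g. 3 flights can enter and leave the queue in 5 distinct orders.
assert [catalan(n) for n in range(6)] == [1, 1, 2, 5, 14, 42]
```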

  10. Systems and methods for modeling and analyzing networks

    Science.gov (United States)

    Hill, Colin C; Church, Bruce W; McDonagh, Paul D; Khalil, Iya G; Neyarapally, Thomas A; Pitluk, Zachary W

    2013-10-29

    The systems and methods described herein utilize a probabilistic modeling framework for reverse engineering an ensemble of causal models from data and then forward simulating the ensemble of models to analyze and predict the behavior of the network. In certain embodiments, the systems and methods described herein include data-driven techniques for developing causal models for biological networks. Causal network models include computational representations of the causal relationships between independent variables, such as a compound of interest, and dependent variables ranging from measured DNA alterations and changes in mRNA, protein, and metabolites to phenotypic readouts of efficacy and toxicity.

  11. Mass analyzer "MASHA" high temperature target and plasma ion source

    Science.gov (United States)

    Semchenkov, A. G.; Rassadov, D. N.; Bekhterev, V. V.; Bystrov, V. A.; Chizov, A. Yu.; Dmitriev, S. N.; Efremov, A. A.; Guljaev, A. V.; Kozulin, E. M.; Oganessian, Yu. Ts.; Starodub, G. Ya.; Voskresensky, V. M.; Bogomolov, S. L.; Paschenko, S. V.; Zelenak, A.; Tikhonov, V. I.

    2004-05-01

    A new separator and mass analyzer of super-heavy atoms (MASHA) has been created at the FLNR JINR, Dubna, to separate nuclei and molecules and measure their masses with a precision better than 10⁻³. First experiments with the FEBIAD plasma ion source have been performed and give an ionization efficiency of up to 20% for Kr with a low-flow test leak (6 particle μA). We propose magnetic field optimization, the use of an additional electrode (einzel-lens type) in the extraction system, and improved vacuum conditions in order to increase the ion source efficiency.

  12. Mass analyzer 'MASHA' high temperature target and plasma ion source

    International Nuclear Information System (INIS)

    Semchenkov, A.G.; Rassadov, D.N.; Bekhterev, V.V.; Bystrov, V.A.; Chizov, A.Yu.; Dmitriev, S.N.; Efremov, A.A.; Guljaev, A.V.; Kozulin, E.M.; Oganessian, Yu.Ts.; Starodub, G.Ya.; Voskresensky, V.M.; Bogomolov, S.L.; Paschenko, S.V.; Zelenak, A.; Tikhonov, V.I.

    2004-01-01

    A new separator and mass analyzer of super-heavy atoms (MASHA) has been created at the FLNR JINR, Dubna, to separate nuclei and molecules and measure their masses with a precision better than 10⁻³. First experiments with the FEBIAD plasma ion source have been performed and give an ionization efficiency of up to 20% for Kr with a low-flow test leak (6 particle μA). We propose magnetic field optimization, the use of an additional electrode (einzel-lens type) in the extraction system, and improved vacuum conditions in order to increase the ion source efficiency.

  13. Uncertainty in Analyzed Water and Energy Budgets at Continental Scales

    Science.gov (United States)

    Bosilovich, Michael G.; Robertson, F. R.; Mocko, D.; Chen, J.

    2011-01-01

    Operational analyses and retrospective analyses provide all the physical terms of the water and energy budgets, guided by the assimilation of atmospheric observations. However, there is significant reliance on the numerical models, and so uncertainty in the budget terms is always present. Here, we use a recently developed data set consisting of a mix of 10 analyses (both operational and retrospective) to quantify the uncertainty of analyzed water and energy budget terms for GEWEX continental-scale regions, following the evaluation of Dr. John Roads using individual reanalysis data sets.

  14. Using linear programming to analyze and optimize stochastic flow lines

    DEFF Research Database (Denmark)

    Helber, Stefan; Schimmelpfeng, Katja; Stolletz, Raik

    2011-01-01

    This paper presents a linear programming approach to analyze and optimize flow lines with limited buffer capacities and stochastic processing times. The basic idea is to solve a huge but simple linear program that models an entire simulation run of a multi-stage production process in discrete time...... programming and hence allows us to solve buffer allocation problems. We show under which conditions our method works well by comparing its results to exact values for two-machine models and approximate simulation results for longer lines....
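
The core of the approach is that one simulation run's material flow obeys linear constraints: each station's output per period is capped by its (sampled) processing capacity, the material available upstream, and the buffer space downstream. A toy two-station simulation applying exactly those caps directly (parameters invented; the paper's LP encodes the same constraints as linear inequalities instead of simulating):

```python
import random

# Two stations separated by a finite buffer; per-period capacities are
# random. Each period, a station's output is the minimum of its capacity,
# the material available to it, and the space left to put its output.
random.seed(42)
BUFFER_CAP, PERIODS = 3, 10_000
buffer_level, throughput = 0, 0
for _ in range(PERIODS):
    cap1 = random.randint(0, 2)                   # stochastic capacity, station 1
    cap2 = random.randint(0, 2)                   # stochastic capacity, station 2
    out1 = min(cap1, BUFFER_CAP - buffer_level)   # blocked by a full buffer
    out2 = min(cap2, buffer_level + out1)         # starved by an empty buffer
    buffer_level += out1 - out2
    throughput += out2

assert 0 <= buffer_level <= BUFFER_CAP
assert 0 < throughput / PERIODS < 1.0   # long-run rate below mean capacity
```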

  15. The Future is Hera! Analyzing Astronomical Data Over the Internet

    Science.gov (United States)

    Valencic, L. A.; Chai, P.; Pence, W.; Shafer, R.; Snowden, S.

    2008-01-01

    Hera is the data processing facility provided by the High Energy Astrophysics Science Archive Research Center (HEASARC) at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the pre-installed software packages, local disk space, and computing resources needed to do general processing of FITS-format data files residing on the user's local computer, and to do research using the publicly available data from the High Energy Astrophysics Division. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.

  16. The wireshark field guide analyzing and troubleshooting network traffic

    CERN Document Server

    Shimonski, Robert

    2013-01-01

    The Wireshark Field Guide provides hackers, pen testers, and network administrators with practical guidance on capturing and interactively browsing computer network traffic. Wireshark is the world's foremost network protocol analyzer, with a rich feature set that includes deep inspection of hundreds of protocols, live capture, offline analysis, and many other features. The Wireshark Field Guide covers the installation, configuration, and use of this powerful multi-platform tool. The book gives readers the hands-on skills to be more productive with Wireshark as they drill

  17. Value Encounters - Modeling and Analyzing Co-creation of Value

    Science.gov (United States)

    Weigand, Hans

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value encounters are defined as interaction spaces where a group of actors meet and derive value by each one bringing in some of its own resources. They can be analyzed from multiple strategic perspectives, including knowledge management, social network management and operational management. Value encounter modeling can be instrumental in the context of service analysis and design.

  18. Modelling of the new FLNR magnetic analyzer vacuum channel

    International Nuclear Information System (INIS)

    Bashevoj, V.V.; Majdikov, V.Z.

    1998-01-01

    The quality of any magnetic analyzer depends directly on the area of the radial cross section of its volume filled with ion trajectories. The design of the new magnetic spectrometer vacuum channel is based on computer modelling of the maximum filling of the spectrometer acceptance, given the pole-piece width and gap height of the magnetic dipole, together with maximum transmission of the emission from the target that is undeflected by the magnetic field at the measurement angle. The correct balance of the vacuum channel aperture against durability, engineering, and ease-of-handling characteristics, combined with the ion-optical properties of the spectrometer, determines its construction as a whole.

  19. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patients' genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. their different responses to drugs. The proposed system allows: (i) automation of the DMET-SNP analysis workflow, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies regarding the analysis of
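
For a single marker, "testing the association of a SNP with drug response" reduces to a contingency-table test. A minimal Pearson chi-square on a 2×2 table (variant carrier yes/no versus responder yes/no) — the counts below are invented for illustration, and this is a generic sketch, not DMET-Analyzer's implementation:

```python
# Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]].
def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: carriers respond 30/40, non-carriers 15/40.
chi2 = chi_square_2x2(30, 10, 15, 25)
assert chi2 > 3.84   # exceeds the 5% critical value for 1 df
```

In practice one would use an exact test for small counts and correct for testing many SNPs at once.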

  20. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Abstract. Background: Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patients' genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results: We developed DMET-Analyzer, a tool for the automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. their different responses to drugs. The proposed system allows: (i) automation of the DMET-SNP analysis workflow, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different

  1. Constrained Fisher Scoring for a Mixture of Factor Analyzers

    Science.gov (United States)

    2016-09-01

    where ω ∈ [0, 4π). Each observation of the spiral is corrupted by additive white Gaussian noise with unit variance. This model was used in previous works... ...from different aspects and then learn a joint statistical model for the object manifold. We employ a mixture of factor analyzers model and derive a

  2. Analyzing clinical and electrophysiological characteristics of Paroxysmal Dyskinesia

    Directory of Open Access Journals (Sweden)

    Jue-qian Zhou

    2011-01-01

    Full Text Available The classification, clinical and electrophysiological characteristics, treatment outcome and pathogenesis of paroxysmal dyskinesia were summarized and analyzed. Paroxysmal dyskinesia was classified into three types, each with different triggers in clinical practice. Patients were mostly male adolescents, and the attacks, which took various forms, manifested as dysmyotonia with choreoathetosis, body torsion and grimacing, with no disturbance of consciousness during attacks. Electroencephalogram and other examinations showed no specific abnormalities during either the attacks or the interictal period. Paroxysmal dyskinesia is an independent disease distinct from epilepsy.

  3. A parametric model for analyzing anticipation in genetically predisposed families

    DEFF Research Database (Denmark)

    Larsen, Klaus; Petersen, Janne; Bernstein, Inge

    2009-01-01

    and are sensitive to right truncation of the data. We propose a normal random effects model that allows for right-censored observations and includes covariates, and draw statistical inference based on the likelihood function. We applied the model to the hereditary nonpolyposis colorectal cancer (HNPCC)/Lynch...... syndrome family cohort from the national Danish HNPCC register. Age-at-onset was analyzed in 824 individuals from 2-4 generations in 125 families with proved disease-predisposing mutations. A significant effect from anticipation was identified with a mean of 3 years earlier age-at-onset per generation...

  4. Liver plasma membranes: an effective method to analyze membrane proteome.

    Science.gov (United States)

    Cao, Rui; Liang, Songping

    2012-01-01

    Plasma membrane proteins are critical for the maintenance of biological systems and represent important targets for the treatment of disease. The hydrophobicity and low abundance of plasma membrane proteins make them difficult to analyze. The protocols given here are efficient isolation/digestion procedures for liver plasma membrane proteomic analysis. Both a protocol for the isolation of plasma membranes and a protocol for the in-gel digestion of gel-embedded plasma membrane proteins are presented. The latter method allows the use of a high detergent concentration to achieve efficient solubilization of hydrophobic plasma membrane proteins while avoiding interference with the subsequent LC-MS/MS analysis.

  5. Analyzing high school students' reasoning about electromagnetic induction

    Science.gov (United States)

    Jelicic, Katarina; Planinic, Maja; Planinsic, Gorazd

    2017-06-01

    Electromagnetic induction is an important, yet complex, physics topic that is a part of Croatian high school curriculum. Nine Croatian high school students of different abilities in physics were interviewed using six demonstration experiments from electromagnetism (three of them concerned the topic of electromagnetic induction). Students were asked to observe, describe, and explain the experiments. The analysis of students' explanations indicated the existence of many conceptual and reasoning difficulties with the basic concepts of electromagnetism, and especially with recognizing and explaining the phenomenon of electromagnetic induction. Three student mental models of electromagnetic induction, formed during the interviews, which reoccurred among students, are described and analyzed within the knowledge-in-pieces framework.

  6. Development of an Adolescent Depression Ontology for Analyzing Social Data.

    Science.gov (United States)

    Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min; Jeon, Eunjoo; Kim, Ae Ran; Lee, Joo Yun

    2015-01-01

    Depression in adolescence is associated with significant suicidality. Therefore, it is important to detect the risk for depression and provide timely care to adolescents. This study aims to develop an ontology for collecting and analyzing social media data about adolescent depression. The ontology was developed using the 'Ontology Development 101' methodology. The important terms were extracted from several clinical practice guidelines and postings on social networking services. We extracted 777 terms, which were categorized into 'risk factors', 'signs and symptoms', 'screening', 'diagnosis', 'treatment', and 'prevention'. The ontology developed in this study can be used as a framework to understand adolescent depression using unstructured data from social media.

  7. Interactive nuclear plant analyzer for VVER-440 reactor

    International Nuclear Information System (INIS)

    Shier, W.; Horak, W.; Kennett, R.

    1992-05-01

    This document discusses an interactive nuclear plant analyzer (NPA) which has been developed for a VVER-440, Model 213 reactor for use in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. This NPA is operational on an IBM RISC-6000 workstation and utilizes the RELAP5/MOD2 computer code for the calculation of the VVER-440 reactor response to the interactive commands initiated by the NPA operator

  8. Methods for Analyzing Electric Load Shape and its Variability

    Energy Technology Data Exchange (ETDEWEB)

    Price, Philip

    2010-05-12

    Current methods of summarizing and analyzing electric load shape are discussed briefly and compared. Simple rules of thumb for graphical display of load shapes are suggested. We propose a set of parameters that quantitatively describe the load shape in many buildings. Using the example of a linear regression model to predict load shape from time and temperature, we show how quantities such as the load's sensitivity to outdoor temperature, and the effectiveness of demand response (DR), can be quantified. Examples are presented using real building data.
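
    The regression idea in the abstract can be sketched in a few lines: fit load against outdoor temperature by ordinary least squares and read the temperature sensitivity off the slope. The data below are synthetic, for illustration only; real work would use metered interval data and additional time-of-day terms:

```python
# Synthetic (temperature, load) pairs with a known sensitivity of 2 kW per degree
temps = [60.0, 70.0, 80.0, 90.0]
loads = [100.0 + 2.0 * t for t in temps]

# Ordinary least squares: slope = cov(t, l) / var(t)
n = len(temps)
mean_t = sum(temps) / n
mean_l = sum(loads) / n
slope = sum((t - mean_t) * (l - mean_l) for t, l in zip(temps, loads)) / \
        sum((t - mean_t) ** 2 for t in temps)
intercept = mean_l - slope * mean_t
print(slope, intercept)  # 2.0 100.0 -> temperature sensitivity and base load
```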

  9. Blood culture cross contamination associated with a radiometric analyzer

    International Nuclear Information System (INIS)

    Griffin, M.R.; Miller, A.D.; Davis, A.C.

    1982-01-01

    During a 9-day period in August 1980 in a New Jersey hospital, three pairs of consecutively numbered blood cultures from different patients were identified as positive for the same organism. For each pair, both cultures were positive in the same atmosphere, both organisms had the same sensitivities, and the second of each pair grew at least 2 days after the first and was the only positive blood culture obtained from the patient. When the hospital laboratory discontinued use of its radiometric culture analyzer for 15 days, no more consecutive pairs of positive cultures occurred. Subsequent use of the machine for 9 days with a new power unit but the original circuit boards resulted in one more similar consecutive pair (Staphylococcus epidermidis). After replacement of the entire power unit, there were no further such pairs. Examination of the machine by the manufacturer revealed a defective circuit board which resulted in inadequate needle sterilization. Laboratories which utilize radiometric analyzers should be aware of the potential for cross contamination. Recognition of such events requires alert microbiologists and infection control practitioners and a record system in the bacteriology laboratory designed to identify such clusters

  10. Modeling and Analyzing the Slipping of the Ball Screw

    Directory of Open Access Journals (Sweden)

    Nannan Xu

    Full Text Available Abstract This paper aims to set up a systematic model of ball slipping and to analyze the slipping characteristics caused by different factors for a ball screw operating at high speeds. To investigate the ball screw slipping mechanism, a transformed coordinate system is first established. This system is then used to build a mathematical model of the ball slipping caused by the three main factors, from which the slipping speed can be calculated. Next, the influence of the contact angle, helix angle and screw diameter on ball screw slipping is analyzed using the slipping model and the slipping-speed equations. Finally, the resulting slipping-analysis curve is compared with the curve of the mechanical-efficiency analysis of the ball screw by Lin, which indirectly verifies the correctness of the slipping model. The slipping model and slipping-analysis curve established in this paper provide a theoretical basis for reducing slipping and improving the mechanical efficiency of a ball screw operating at high speeds.

  11. Historical civilian nuclear accident based Nuclear Reactor Condition Analyzer

    Science.gov (United States)

    McCoy, Kaylyn Marie

    There are significant challenges to successfully monitoring multiple processes within a nuclear reactor facility. The evidence for this observation can be seen in the historical civilian nuclear incidents that have occurred with similar initiating conditions and sequences of events. Because the nuclear industry currently lacks monitoring of internal sensors across multiple processes for patterns of failure, this study has developed a program that addresses that gap by monitoring these systems simultaneously. The inclusion of digital sensor technology within the nuclear industry has appreciably increased computer systems' capabilities to manipulate sensor signals, making it possible to meet these monitoring challenges. One such manipulation of signal data has been explored in this study. The Nuclear Reactor Condition Analyzer (NRCA) program developed for this research, with the assistance of the Nuclear Regulatory Commission's Graduate Fellowship, utilizes one-norm distance and kernel weighting equations to normalize all nuclear reactor parameters under the program's analysis. This normalization allows the program to set more consistent parameter value thresholds for a more simplified approach to analyzing the condition of the nuclear reactor under its scrutiny. The product of this research provides a means for the nuclear industry to implement a safety and monitoring program that can oversee the system parameters of a nuclear power reactor facility, like that of a nuclear power plant.
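
    The abstract names its normalization ingredients, a one-norm distance and a kernel weighting, without giving the equations. One plausible reading, sketched below with entirely hypothetical parameter names and values (the NRCA's actual equations are not given in the record), is an exponential kernel on the L1 distance from a reference plant state, so that the weight is 1 at nominal conditions and decays as sensors drift:

```python
import math

def l1_distance(readings, reference):
    """One-norm distance between a sensor-reading vector and a reference state."""
    return sum(abs(x, ) if False else abs(x - r) for x, r in zip(readings, reference))

def kernel_weight(readings, reference, bandwidth=1.0):
    """Kernel weight in (0, 1]: 1.0 at the reference state, decaying with L1 distance."""
    return math.exp(-l1_distance(readings, reference) / bandwidth)

# Hypothetical parameters: coolant temperature, pressure, flow
reference = [550.0, 15.5, 2200.0]
nominal = [550.0, 15.5, 2200.0]
drifted = [560.0, 15.5, 2200.0]

print(kernel_weight(nominal, reference))                # 1.0
print(kernel_weight(drifted, reference, bandwidth=20.0))  # exp(-0.5) ~ 0.61
```

    A single threshold on this normalized weight can then flag departures from nominal across all parameters at once, which matches the "more consistent parameter value thresholds" the abstract describes.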

  12. AACSD: An atomistic analyzer for crystal structure and defects

    Science.gov (United States)

    Liu, Z. R.; Zhang, R. F.

    2018-01-01

    We have developed an efficient command-line program named AACSD (Atomistic Analyzer for Crystal Structure and Defects) for the post-analysis of atomic configurations generated by various atomistic simulation codes. The program implements not only the traditional filter methods like the excess potential energy (EPE), the centrosymmetry parameter (CSP), the common neighbor analysis (CNA), the common neighborhood parameter (CNP), the bond angle analysis (BAA), and the neighbor distance analysis (NDA), but also newly developed ones including the modified centrosymmetry parameter (m-CSP), the orientation imaging map (OIM) and the local crystallographic orientation (LCO). The newly proposed OIM and LCO methods have been extended to all three crystal structures: face-centered cubic, body-centered cubic and hexagonal close-packed. More specifically, AACSD can easily be used for the atomistic analysis of metallic nanocomposites with each phase analyzed independently, which provides a unique pathway to capture the dynamic evolution of various defects on the fly. In this paper, we provide not only a thorough overview of the various theoretical methods and their implementation in the AACSD program, but also some critical evaluations, specific tests and applications, demonstrating the capability of the program for each functionality.
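
    Of the filters listed, the centrosymmetry parameter is the simplest to illustrate. As commonly defined, it pairs each neighbor vector with the one closest to its mirror image and sums the residuals; it is zero at a perfect centrosymmetric lattice site and positive near defects. The greedy-pairing sketch below is a simplification (it assumes an even neighbor count), not AACSD's implementation:

```python
def centrosymmetry(neighbors):
    """Centrosymmetry parameter: greedily pair each neighbor vector r_i with the
    r_j minimizing |r_i + r_j|^2 and sum those residuals. Zero for a perfectly
    centrosymmetric neighbor shell; positive near defects and surfaces.
    Assumes an even number of neighbor vectors."""
    remaining = list(neighbors)
    csp = 0.0
    while remaining:
        ri = remaining.pop(0)
        best = min(range(len(remaining)),
                   key=lambda k: sum((a + b) ** 2 for a, b in zip(ri, remaining[k])))
        rj = remaining.pop(best)
        csp += sum((a + b) ** 2 for a, b in zip(ri, rj))
    return csp

# Perfectly centrosymmetric neighbor shell -> CSP = 0
perfect = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0)]
print(centrosymmetry(perfect))  # 0.0

# Displace one neighbor -> CSP > 0, flagging a local defect
defective = [(1.2, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0)]
print(centrosymmetry(defective) > 0)  # True
```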

  13. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  14. Improving new product development (NPD process by analyzing failure cases

    Directory of Open Access Journals (Sweden)

    Yeon-Hak Kim

    2017-01-01

    Full Text Available Purpose - The purpose of this study is to develop an appropriate new product development (NPD) process for Company “T”, a medium-sized firm, by analyzing the existing NPD process and failure cases of the Company. Design/methodology/approach - The proposed research framework is as follows: first, prospective studies of the NPD process are performed using the existing literature and preliminary references; second, comparative analysis between the current processes and a NPD process is performed; third, phase-based evaluations of failed product cases are conducted with a NPD process so as to identify the abridged steps and root causes of failures; finally, renewed priorities are set forth by utilizing analytic hierarchy process analysis and questionnaire analysis of the above identified causes of failures. Findings - The resulting accomplishments include the establishment of NPD processes that resonate with the current state of Company “T”, which, in turn, ensures an increase in efficiency, a decrease in development duration and a strategy of capacity-concentration and priority-selection. Originality/value - As Company “T”’s development process is outdated and products are developed without adequate market information research and feasibility analysis, the percentage of failed development projects is as high as 87 per cent. Thus, this study aims to develop an appropriate NPD process for Company “T” by analyzing the existing NPD process and failure cases of the Company.

  15. Bicycle: a bioinformatics pipeline to analyze bisulfite sequencing data.

    Science.gov (United States)

    Graña, Osvaldo; López-Fernández, Hugo; Fdez-Riverola, Florentino; González Pisano, David; Glez-Peña, Daniel

    2018-04-15

    High-throughput sequencing of bisulfite-converted DNA is a technique used to measure DNA methylation levels. Although a considerable number of computational pipelines have been developed to analyze such data, none of them tackles all the peculiarities of the analysis together, revealing limitations that can force the user to manually perform additional steps needed for a complete processing of the data. This article presents bicycle, an integrated, flexible analysis pipeline for bisulfite sequencing data. Bicycle analyzes whole-genome bisulfite sequencing data, targeted bisulfite sequencing data and hydroxymethylation data. To show how bicycle improves on other available pipelines, we compared them on a defined set of features that is summarized in a table. We also tested bicycle with both simulated and real datasets to show its level of performance, and compared it to different state-of-the-art methylation analysis pipelines. Bicycle is publicly available under the GNU LGPL v3.0 license at http://www.sing-group.org/bicycle. Users can also download a customized Ubuntu LiveCD including bicycle and the other bisulfite sequencing data pipelines compared here. In addition, a docker image with bicycle and its dependencies, which allows a straightforward use of bicycle on any platform (e.g. Linux, OS X or Windows), is also available. ograna@cnio.es or dgpena@uvigo.es. Supplementary data are available at Bioinformatics online.
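
    The core quantity any bisulfite pipeline ultimately reports is the per-cytosine methylation level: after bisulfite conversion, unmethylated cytosines read as T while methylated ones remain C, so the level is simply the fraction of C-calls at each position. A sketch with hypothetical pileup counts, not bicycle's actual API:

```python
def methylation_level(meth_reads, unmeth_reads):
    """Per-cytosine methylation level from bisulfite read counts.
    meth_reads = C-calls (methylated), unmeth_reads = T-calls (converted)."""
    total = meth_reads + unmeth_reads
    if total == 0:
        return None  # no coverage at this cytosine
    return meth_reads / total

# Hypothetical (C-call, T-call) counts at three CpG positions
positions = [(18, 2), (5, 15), (0, 0)]
levels = [methylation_level(c, t) for c, t in positions]
print(levels)  # [0.9, 0.25, None]
```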

  16. Analyzing Multimode Wireless Sensor Networks Using the Network Calculus

    Directory of Open Access Journals (Sweden)

    Xi Jin

    2015-01-01

    Full Text Available The network calculus is a powerful tool to analyze the performance of wireless sensor networks. But the original network calculus can only model single-mode wireless sensor networks. In this paper, we combine the original network calculus with the multimode model to analyze the maximum delay bound of the flow of interest in a multimode wireless sensor network. There are two combined methods, A-MM and N-MM. The method A-MM models the whole network as a multimode component, and the method N-MM models each node as a multimode component. We prove that the maximum delay bound computed by the method A-MM is tighter than or equal to that computed by the method N-MM. Experiments show that our proposed methods can significantly decrease the analytical delay bound compared with the separate flow analysis method. For a large-scale wireless sensor network with 32 thousand sensor nodes, our proposed methods can decrease the analytical delay bound by about 70%.
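
    For readers unfamiliar with the network calculus, the flavor of such delay bounds is easy to show in the single-mode case the authors extend: a token-bucket flow (arrival curve α(t) = b + rt) crossing a rate-latency server (service curve β(t) = R(t − T) for t ≥ T) has worst-case delay at most T + b/R, provided r ≤ R. This standard single-node bound is background, not the paper's A-MM/N-MM result:

```python
def delay_bound(b, r, R, T):
    """Worst-case delay for a token-bucket flow (burst b bits, sustained rate
    r bit/s) served by a rate-latency node (rate R bit/s, latency T s).
    Classic network-calculus bound: D <= T + b/R, valid only when r <= R."""
    if r > R:
        raise ValueError("flow rate exceeds service rate: no finite delay bound")
    return T + b / R

# Example: 5000-bit burst at 200 kbit/s into a 1 Mbit/s server with 10 ms latency
print(delay_bound(b=5000, r=200_000, R=1_000_000, T=0.010))  # ~0.015 s
```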

  17. Analyzing B-vitamins in Human Milk: Methodological Approaches.

    Science.gov (United States)

    Hampel, Daniela; Allen, Lindsay H

    2016-01-01

    According to the World Health Organization (WHO), infants should be exclusively breastfed for the first six months of life. However, there is insufficient information about the concentration of nutrients in human milk. For some nutrients, including B-vitamins, maternal intake affects their concentration in human milk, but the extent to which inadequate maternal diets affect milk B-vitamin content is poorly documented. Little is known about infant requirements for B-vitamins; recommendations are generally set as Adequate Intakes (AI), calculated on the basis of the mean volume of milk (0.78 L/day) consumed by infants exclusively fed with human milk from well-nourished mothers during the first six months, and the concentration of each vitamin in milk based on reported values. Methods used for analyzing B-vitamins, commonly microbiological, radioisotope dilution or, more recently, chromatographic, coupled with UV, fluorometric and MS detection, have rarely been validated for the complex human milk matrix. Thus the validity, accuracy, and sensitivity of analytical methods are important for understanding infant requirements for these nutrients and the maternal intakes needed to support adequate concentrations in breast milk. This review summarizes current knowledge on methods used for analyzing the B-vitamins thiamin, riboflavin, niacin, vitamin B-6, pantothenic acid, vitamin B-12, folate, biotin, and choline in human milk, their chemical and physical properties, the different forms and changes in concentration during lactation, and the effects of deficiency on the infant.

  18. A software tool for analyzing multichannel cochlear implant signals.

    Science.gov (United States)

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  19. Detecting Android Malwares with High-Efficient Hybrid Analyzing Methods

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2018-01-01

    Full Text Available In order to tackle the security issues caused by malware on Android OS, we propose a highly efficient hybrid detection scheme for Android malware. Our scheme employs different analysis methods (static and dynamic) to construct a flexible detection scheme. In this paper, we propose detection techniques such as the Com+ feature, based on traditional Permission and API call features, to improve the performance of static detection. The collapsing issue of traditional function-call-graph-based malware detection is also avoided, as we adopt feature selection and a clustering method to unify function call graph features of various dimensions into the same dimension. In order to verify the performance of our scheme, we built an open-access malware dataset for our experiments. The experimental results show that the suggested scheme achieves high malware-detection accuracy, and the scheme can be used to establish Android malware-detection cloud services, which can automatically adopt high-efficiency analysis methods according to the properties of the Android applications.

  20. QA practice for online analyzers in water steam cycles

    International Nuclear Information System (INIS)

    Staub, L.

    2010-01-01

    The liberalization of power markets throughout the world has resulted in more and more power stations being operated in cycling mode, with frequent load changes and multiple daily start-up and shut-down cycles. This more flexible operation also calls for better automation and poses new challenges to water chemistry in water steam cycles, to avoid subsequent damage to vital plant components such as turbines, boilers or condensers. But automation for the most important chemistry control tool, the sampling and online analyzer system, is only possible if chemists can rely on their online analysis equipment. Proof of plausibility as well as reliability and availability of online analysis results becomes a major focus. While SOP and standard QA procedures for laboratory equipment are well established and daily practice, such measures are widely neglected for online process analyzers. This paper aims to establish a roadmap for the implementation of SOP and QA/QC procedures for online instruments in water steam cycles, leading to reliable chemical information that is trustworthy for process automation and chemistry control in water steam cycles. (author)

  2. Analyzing students' attitudes towards science during inquiry-based lessons

    Science.gov (United States)

    Kostenbader, Tracy C.

    Due to the logistics of guided-inquiry lessons, students learn to problem solve and develop critical thinking skills. This mixed-methods study analyzed the students' attitudes towards science during inquiry lessons. My quantitative results from a repeated measures survey showed no significant difference between student attitudes when taught with either structured-inquiry or guided-inquiry lessons. The qualitative results, analyzed through a constant-comparative method, did show that students generate positive interest, critical thinking and low-level stress during guided-inquiry lessons. The qualitative research also gave insight into a teacher's transition to guided-inquiry. This study showed that with my students, their attitudes did not change during this transition according to the quantitative data; however, the qualitative data did show high levels of excitement. The results imply that students like guided-inquiry laboratories, even though they require more work, just as much as they like traditional laboratories with less work and less opportunity for creativity.

  3. APROS 3-D core models for simulators and plant analyzers

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    The 3-D core models of the APROS simulation environment can be used in simulator and plant analyzer applications, as well as in safety analysis. The key feature of APROS models is that the same physical models can be used in all applications. For three-dimensional reactor cores the APROS models cover both the quadratic BWR and PWR cores and the hexagonal-lattice VVER-type cores. In the APROS environment the user can select the number of flow channels in the core and either a five- or six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the channel description have a decisive effect on the calculation time of the 3-D core model, and thus these selections at present make the major difference between a safety analysis model and a training simulator model. The paper presents examples of various types of 3-D LWR-type core descriptions for simulator and plant analyzer use and discusses the differences in calculation speed and physical results between a typical safety analysis model description and a real-time simulator model description in transients. (author)

  4. Development of a process analyzer for trace uranium

    International Nuclear Information System (INIS)

    Hiller, J.M.

    1990-01-01

    A process analyzer, based on time-resolved laser-induced luminescence, is being developed for the Department of Energy's Oak Ridge Y-12 Plant for the ultra-trace determination of uranium. The present instrument has a detection limit of 1 μg/L; the final instrument will have a detection limit near 1 ng/L for continuous environmental monitoring. Time-resolved luminescence decay is used to enhance sensitivity, reduce interferences, and eliminate the need for standard addition. The basic analyzer sequence is: a pulse generator triggers the laser; the laser beam strikes a photodiode, which initiates data acquisition and synchronizes the timing; nearly simultaneously, laser light strikes the sample; intensity data are collected under control of the gated photon counter; and the cycle repeats as necessary. Typically, data are collected in 10 μs intervals over 700 μs (several luminescence half-lives). The final instrument will also collect and prepare samples, calibrate itself, reduce the raw data, and transmit reduced data to the control station(s)
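
    The time-resolved scheme described above, gated photon counts every 10 μs over a 700 μs window, lends itself to a simple lifetime extraction: since I(t) = I0·exp(−t/τ), a log-linear least-squares fit gives the decay constant. The sketch below uses synthetic noiseless counts with an illustrative 200 μs lifetime; the actual instrument's values and fitting routine are not given in the record:

```python
import math

# Synthetic gated photon counts: I(t) = I0 * exp(-t / tau), one gate per 10 us
tau_true = 200e-6                          # illustrative 200-us lifetime
times = [i * 10e-6 for i in range(70)]     # 0 .. 690 us, matching the 700-us window
counts = [1000.0 * math.exp(-t / tau_true) for t in times]

# Log-linear least squares: ln I(t) = ln I0 - t/tau, so slope = -1/tau
logs = [math.log(c) for c in counts]
n = len(times)
mt = sum(times) / n
ml = sum(logs) / n
slope = sum((t - mt) * (l - ml) for t, l in zip(times, logs)) / \
        sum((t - mt) ** 2 for t in times)
tau_fit = -1.0 / slope
print(tau_fit)  # recovers ~200e-6 s, the lifetime we put in
```

    Fitting the decay rather than a single gate is what lets the analyzer separate the uranium luminescence from short-lived interferences, as the abstract notes.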

  5. Mammography imaging studies using a Laue crystal analyzer

    International Nuclear Information System (INIS)

    Chapman, D.; Thomlinson, W.; Arfelli, F.

    1995-01-01

    Synchrotron-based mammography imaging experiments have been performed with monochromatic x-rays in which a Laue crystal placed after the object being imaged has been used to split the beam transmitted through the object. The X27C R&D beamline at the National Synchrotron Light Source was used, with the white beam monochromatized by a double-crystal Si(111) monochromator tuned to 18 keV. The imaging beam was a thin horizontal line approximately 0.5 mm high by 100 mm wide. Images were acquired in line-scan mode with the phantom and detector both scanned together. The detector for these experiments was an image plate. A thin Si(111) Laue analyzer was used to diffract a portion of the beam transmitted through the phantom before the image plate detector. This ''scatter free'' diffracted beam was then recorded on the image plate during the phantom scan. Since the thin Laue crystal also transmitted a fraction of the incident beam, this beam was also simultaneously recorded on the image plate. The imaging results are interpreted in terms of x-ray schlieren, or refractive-index inhomogeneities. The analyzer images taken at various points in the rocking curve will be presented

  6. An image analyzer system for the analysis of nuclear traces

    International Nuclear Information System (INIS)

    Cuapio O, A.

    1990-10-01

    As part of a project on nuclear tracks and their application techniques, methods are being developed for the detection of nuclear reactions with low cross sections (not detectable by conventional methods), for the study of accidental and personal neutron dosimeters, and for other purposes. All these studies are based on the fact that charged particles leave latent tracks in dielectrics which, when etched with appropriate chemical solutions, become visible under an optical microscope. From the analysis of the different track shapes, it is possible to obtain information on the characteristic parameters of the incident particles (charge, mass and energy). From the track density, it is possible to obtain information on the flux of the incident radiation and consequently on the received dose. To carry out this analysis, different systems have been designed and assembled, which has allowed the solution of diverse problems. Nevertheless, it has become clear that to make this activity more versatile it is necessary to have an image analyzer system that allows images to be digitized, processed and displayed more rapidly. The present document presents a proposal to acquire the components necessary to assemble such an image analyzer system, in support of the aforementioned project. (Author)

  7. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on human shopping behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses -approach responses and real shopping outcomes-) within a computer-generated online retail store, taking into account several mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking, and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers may increase when they are allowed to navigate online stores freely and their experience is enriched by animated GIFs and background music. The effect of the mediator variables somewhat modifies the final shopping behavior.

  8. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on human shopping behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses -approach responses and real shopping outcomes-) within a computer-generated online retail store, taking into account several mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) × 2 (on versus off music) × 2 (moving versus static images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking, and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers may increase when they are allowed to navigate online stores freely and their experience is enriched by animated GIFs and background music. The effect of the mediator variables somewhat modifies the final shopping behavior.

  9. Sensor gas analyzer for acetone determination in expired air

    Science.gov (United States)

    Baranov, Vitaly V.

    2001-05-01

    Diseases and changes in lifestyle alter the concentration and composition of expired air. Our adaptable gas analyzer is intended for the selective analysis of expired air and can be adapted by the user (a physician or a patient) to current diagnostic and analytical tasks. Having analyzed the existing trends in the development of noninvasive diagnostics, we have chosen the method of noninvasive acetone detection in expired air, where the acetone concentration correlates with blood and urine glucose concentrations. The appearance of acetone in expired air is indicative of disorders that may be caused not only by diabetes but also by an improper diet, incorrect athletic training, etc. To control these disorders one should know the acetone concentration in the human body. This knowledge allows one to judge the state of the patient, choose a correct diet that will not damage the patient's health, assess the efficiency and results of athletic training, and address the artificial pancreas problem. Our device provides highly accurate analysis, rapid diagnostics, and reliable acetone quantification in the patient's body at any time, aimed at predicting the patient's state and assessing the efficiency of the therapy used. Clinical implementation of the device will improve the health and save the lives of many thousands of diabetes sufferers.

  10. 3002 Humidified Tandem Differential Mobility Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Uin, Janek [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Brechtel Manufacturing Inc. (BMI) Humidified Tandem Differential Mobility Analyzer (HT-DMA Model 3002) (Brechtel and Kreidenweis 2000a,b, Henning et al. 2005, Xerxes et al. 2014) measures how aerosol particles of different initial dry sizes grow or shrink when exposed to changing relative humidity (RH) conditions. It uses two differential mobility analyzers (DMAs) and a humidification system to make the measurements. The first DMA selects a narrow size range of dry aerosol particles, which are exposed to varying RH conditions in the humidification system. The second (humidified) DMA scans the particle size distribution output from the humidification system. Scanning a wide range of particle sizes enables the second DMA to measure changes in size, or growth factor (growth factor = humidified size/dry size), due to water uptake by the particles. A Condensation Particle Counter (CPC) downstream of the second DMA counts particles as a function of selected size in order to obtain the number size distribution of particles exposed to different RH conditions.
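The growth-factor definition above, combined with the widely used single-parameter κ-Köhler approximation (a standard simplification from the aerosol literature, not part of this handbook; the Kelvin term is neglected and the example sizes are hypothetical), can be sketched as:

```python
def growth_factor(d_humid_nm, d_dry_nm):
    """Hygroscopic growth factor = humidified size / dry size."""
    return d_humid_nm / d_dry_nm

def kappa_from_gf(gf, rh_percent):
    """Hygroscopicity parameter from a measured growth factor, using the
    kappa-Koehler relation with the Kelvin (curvature) effect neglected."""
    aw = rh_percent / 100.0  # water activity approximated by RH
    return (gf**3 - 1.0) * (1.0 - aw) / aw

# Hypothetical HT-DMA scan point: 100 nm dry particles grow to 150 nm at 90% RH.
gf = growth_factor(150.0, 100.0)
kappa = kappa_from_gf(gf, 90.0)
print(f"gf = {gf:.2f}, kappa = {kappa:.3f}")  # gf = 1.50, kappa = 0.264
```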

  11. Multimode laser beam analyzer instrument using electrically programmable optics.

    Science.gov (United States)

    Marraccini, Philip J; Riza, Nabeel A

    2011-12-01

    Presented is a novel design of a multimode laser beam analyzer using a digital micromirror device (DMD) and an electronically controlled variable focus lens (ECVFL) that serve as the digital and analog agile optics, respectively. The proposed analyzer is a broadband laser characterization instrument that uses the agile optics to smartly direct light to the required point photodetectors to enable beam measurements of minimum beam waist size, minimum waist location, divergence, and the beam propagation parameter M(2). Experimental results successfully demonstrate these measurements for a 500 mW multimode test laser beam with a wavelength of 532 nm. The minimum beam waist, divergence, and M(2) experimental results for the test laser are found to be 257.61 μm, 2.103 mrad, 1.600 and 326.67 μm, 2.682 mrad, 2.587 for the vertical and horizontal directions, respectively. These measurements are compared to a traditional scan method and the results of the beam waist are found to be within error tolerance of the demonstrated instrument.
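The reported numbers are mutually consistent under one common convention: if the quoted waist is the waist radius w0 and the quoted divergence is the full angle θ, then M² = π·w0·(θ/2)/λ. A quick numerical check (my reconstruction, not the authors' code):

```python
import math

WAVELENGTH_M = 532e-9  # test laser wavelength

def m_squared(waist_radius_m, full_divergence_rad, wavelength_m=WAVELENGTH_M):
    """Beam propagation parameter from waist radius and full-angle divergence."""
    return math.pi * waist_radius_m * (full_divergence_rad / 2.0) / wavelength_m

m2_vertical = m_squared(257.61e-6, 2.103e-3)
m2_horizontal = m_squared(326.67e-6, 2.682e-3)
print(f"M^2 vertical   = {m2_vertical:.3f}")   # ~1.600, matching the paper
print(f"M^2 horizontal = {m2_horizontal:.3f}")  # ~2.587, matching the paper
```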

  12. Very low cost multichannel analyzer with some additional features

    Energy Technology Data Exchange (ETDEWEB)

    Tudyka, Konrad, E-mail: konrad.tudyka@polsl.pl [Centre of Excellence-Gliwice Absolute Dating Methods Centre, Institute of Physics, Silesian University of Technology (Poland); Bluszcz, Andrzej [Centre of Excellence-Gliwice Absolute Dating Methods Centre, Institute of Physics, Silesian University of Technology (Poland)

    2011-12-11

    In this paper we present a multichannel analyzer (MCA) based on a digital signal controller (DSC). The multichannel analyzer is characterized by a very low cost and an additional feature of recording time intervals between pulses. The total cost of the electronic parts used in construction of the MCA is around 50 USD. The electronic circuit is based on the dsPIC30F2020 DSC unit from Microchip. The device has a 10-bit analogue-to-digital converter (ADC) which can sample and convert 2 samples per µs. The DSC samples the input voltage continuously and detects incoming pulses. The data belonging to a detected pulse and its time stamp are sent to a PC on-line. The analysis of data stored on the PC is performed off-line with the help of a genetic algorithm (GA) used to fit the pulse shape function. This allows determination of the amplitude of each individual pulse. The effective resolution varies with the pulse length and is typically 1000 channels for pulses approximately 4 µs long.
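The paper fits the pulse-shape function with a genetic algorithm; as a much simpler stand-in (illustrative only, with a hypothetical normalized pulse template and known pulse timing), the amplitude of a sampled pulse can be recovered in closed form by linear least squares against the template:

```python
import math

def pulse_template(n_samples, tau=4.0):
    """Hypothetical normalized pulse shape t*exp(-t/tau), peak scaled to 1."""
    shape = [t * math.exp(-t / tau) for t in range(n_samples)]
    peak = max(shape)
    return [s / peak for s in shape]

def fit_amplitude(samples, template):
    """Closed-form least-squares amplitude: A = sum(s*p) / sum(p*p)."""
    num = sum(s * p for s, p in zip(samples, template))
    den = sum(p * p for p in template)
    return num / den

template = pulse_template(32)
true_amplitude = 713.0  # hypothetical ADC pulse height
samples = [true_amplitude * p for p in template]
print(f"fitted amplitude: {fit_amplitude(samples, template):.1f}")  # 713.0
```

A GA becomes useful when the pulse timing and shape parameters must be fitted simultaneously with the amplitude; for a known template and timing, the closed form above suffices.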

  13. Fault Diagnosis of Motor Bearing by Analyzing a Video Clip

    Directory of Open Access Journals (Sweden)

    Siliang Lu

    2016-01-01

    Full Text Available Conventional bearing fault diagnosis methods require specialized instruments to acquire signals that can reflect the health condition of the bearing. For instance, an accelerometer is used to acquire vibration signals, whereas an encoder is used to measure motor shaft speed. This study proposes a new method for simplifying the instruments for motor bearing fault diagnosis. Specifically, a video clip recording of a running bearing system is captured using a cellphone that is equipped with a camera and a microphone. The recorded video is subsequently analyzed to obtain the instantaneous frequency of rotation (IFR. The instantaneous fault characteristic frequency (IFCF of the defective bearing is obtained by analyzing the sound signal that is recorded by the microphone. The fault characteristic order is calculated by dividing IFCF by IFR to identify the fault type of the bearing. The effectiveness and robustness of the proposed method are verified by a series of experiments. This study provides a simple, flexible, and effective solution for motor bearing fault diagnosis. Given that the signals are gathered using an affordable and accessible cellphone, the proposed method is proven suitable for diagnosing the health conditions of bearing systems that are located in remote areas where specialized instruments are unavailable or limited.
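The classification step described above, dividing the instantaneous fault characteristic frequency (IFCF) by the instantaneous frequency of rotation (IFR) and matching the result against the bearing's characteristic orders, can be sketched as follows (the frequencies and characteristic orders below are hypothetical, not from the paper):

```python
def classify_fault(ifcf_hz, ifr_hz, characteristic_orders, tolerance=0.05):
    """Identify the fault type whose characteristic order is closest to
    IFCF / IFR, within a relative tolerance; otherwise report unknown."""
    order = ifcf_hz / ifr_hz
    best = min(characteristic_orders,
               key=lambda k: abs(characteristic_orders[k] - order))
    if abs(characteristic_orders[best] - order) / characteristic_orders[best] <= tolerance:
        return best, order
    return "unknown", order

# Hypothetical characteristic orders for one bearing geometry:
# outer race (BPFO), inner race (BPFI), ball spin (BSF), cage (FTF).
orders = {"BPFO": 3.59, "BPFI": 5.41, "BSF": 2.38, "FTF": 0.40}

fault, order = classify_fault(ifcf_hz=106.2, ifr_hz=29.5, characteristic_orders=orders)
print(fault, round(order, 2))  # BPFO 3.6
```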

  14. Analyzing a 35-Year Hourly Data Record: Why So Difficult?

    Science.gov (United States)

    Lynnes, Chris

    2014-01-01

    At the Goddard Distributed Active Archive Center, we have recently added a 35-Year record of output data from the North American Land Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers a variety of data summarization and visualization to users that operate at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data has proven surprisingly resistant to application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations both at the algorithm and system level for 35 years of hourly data. Failures arose, sometimes unexpectedly, from command line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These serve as an early warning sign for the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid the issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving the user communities that try to scale up their current techniques to analyze Big Data.
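One standard mitigation for the memory overflows described above is to stream the record instead of loading it whole; for example, a running mean and variance over decades of hourly values can be accumulated one chunk at a time with Welford's online algorithm (an illustration of the general approach, not Giovanni's actual code):

```python
def welford_update(state, value):
    """One step of Welford's online mean/variance algorithm."""
    count, mean, m2 = state
    count += 1
    delta = value - mean
    mean += delta / count
    m2 += delta * (value - mean)
    return count, mean, m2

def streaming_stats(chunks):
    """Consume an iterable of chunks without holding the full record in memory."""
    state = (0, 0.0, 0.0)
    for chunk in chunks:
        for value in chunk:
            state = welford_update(state, value)
    count, mean, m2 = state
    return mean, m2 / (count - 1)  # sample variance

# Simulated hourly record delivered in year-sized chunks (hour-of-day values).
record = [[float(h % 24) for h in range(8760)] for _ in range(5)]
mean, variance = streaming_stats(record)
print(round(mean, 2))  # 11.5
```

Because only a few scalars are carried between chunks, the same code handles 35 years of hourly data in constant memory, sidestepping the buffer overflows the abstract describes.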

  15. Analyzing the impact of human capital factors on competitiveness

    Directory of Open Access Journals (Sweden)

    Óhegyi Katalin

    2014-01-01

    Full Text Available There are a number of approaches to measuring national competitiveness. However, in these reports human capital typically appears only indirectly. The author's purpose is to uncover how human capital contributes to the competitiveness of economies and to propose an approach to identifying the most effective improvement opportunities for countries, illustrated with the example of Hungary. The analysis is based on the data of the Global Talent Index Report (2011) and the Global Competitiveness Report 2012-2013. The components of the Global Talent Index (GTI) and their relation to the Global Competitiveness Index (GCI) were analyzed with a linear programming based similarity analysis method, component-based object comparison for objectivity (COCO). Based on the output of the analysis, it was identified how sensitive the Global Competitiveness Index is to the components of the GTI. Hungary's position was analyzed further to quantify improvement opportunities and threats based on the step function resulting from the COCO analysis. The author concludes that a country's human resources are a pivotal element of national competitiveness. By developing the country's human capital, its overall competitive position may be improved. Priority areas may be identified and the level of intervention may be quantified for a specific country. This could help policy makers decide on the allocation of resources to maximize effectiveness, leading to an improved (or protected) overall competitive position for the country in the global arena.

  16. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed the educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years since 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA has been developed to keep track of students' usage of CMDA and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described and listed in terms of a scientific goal, the datasets used, the analysis tools used, scientific results discovered from the use case, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is Difference Plot Service of

  17. Ion-Exclusion Chromatography for Analyzing Organics in Water

    Science.gov (United States)

    Sauer, Richard; Rutz, Jeffrey A.; Schultz, John R.

    2006-01-01

    A liquid-chromatography technique has been developed for use in the quantitative analysis of urea (and of other nonvolatile organic compounds typically found with urea) dissolved in water. The technique involves the use of a column that contains an ion-exclusion resin; heretofore, this column has been sold for use in analyzing monosaccharides and food softeners, but not for analyzing water supplies. The prior technique commonly used to analyze water for urea content has been one of high-performance liquid chromatography (HPLC), with reliance on hydrophobic interactions between analytes in a water sample and long-chain alkyl groups bonded to an HPLC column. The prior technique has proven inadequate because of a strong tendency toward co-elution of urea with other compounds. Co-elution often causes the urea and other compounds to be crowded into a narrow region of the chromatogram (see left part of figure), thereby giving rise to low chromatographic resolution and misidentification of compounds. It is possible to quantitate urea or another analyte via ultraviolet- and visible-light absorbance measurements, but in order to perform such measurements, it is necessary to dilute the sample, causing a significant loss of sensitivity. The ion-exclusion resin used in the improved technique is sulfonated polystyrene in the calcium form. Whereas the alkyl-chain column used in the prior technique separates compounds on the basis of polarity only, the ion-exclusion-resin column used in the improved technique separates compounds on the basis of both molecular size and electric charge. As a result, the degree of separation is increased: instead of being crowded together into a single chromatographic peak only about 1 to 2 minutes wide as in the prior technique, the chromatographic peaks of different compounds are now separated from each other and spread out over a range about 33 minutes wide (see right part of figure), and the urea peak can readily be distinguished from the other

  18. NASGRO 3.0: A Software for Analyzing Aging Aircraft

    Science.gov (United States)

    Mettu, S. R.; Shivakumar, V.; Beek, J. M.; Yeh, F.; Williams, L. C.; Forman, R. G.; McMahon, J. J.; Newman, J. C., Jr.

    1999-01-01

    Structural integrity analysis of aging aircraft is a critical necessity in view of the increasing numbers of such aircraft in general aviation, the airlines and the military. Efforts are in progress by NASA, the FAA and the DoD to focus attention on aging aircraft safety. The present paper describes the NASGRO software which is well-suited for effectively analyzing the behavior of defects that may be found in aging aircraft. The newly revised Version 3.0 has many features specifically implemented to suit the needs of the aircraft community. The fatigue crack growth computer program NASA/FLAGRO 2.0 was originally developed to analyze space hardware such as the Space Shuttle, the International Space Station and the associated payloads. Due to popular demand, the software was enhanced to suit the needs of the aircraft industry. Major improvements in Version 3.0 are the incorporation of the ability to read aircraft spectra of unlimited size, generation of common aircraft fatigue load blocks, and the incorporation of crack-growth models which include load-interaction effects such as retardation due to overloads and acceleration due to underloads. Five new crack-growth models, viz., generalized Willenborg, modified generalized Willenborg, constant closure model, Walker-Chang model and the deKoning-Newman strip-yield model, have been implemented. To facilitate easier input of geometry, material properties and load spectra, a Windows-style graphical user interface has been developed. Features to quickly change the input and rerun the problem as well as examine the output are incorporated. NASGRO has been organized into three modules, the crack-growth module being the primary one. The other two modules are the boundary element module and the material properties module. The boundary-element module provides the ability to model and analyze complex two-dimensional problems to obtain stresses and stress-intensity factors. The material properties module allows users to store and
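As context for what a crack-growth module computes, here is a generic Paris-law integration under constant-amplitude loading (an illustrative sketch with made-up material constants; NASGRO's actual models, including the Willenborg retardation variants named above, are considerably more elaborate):

```python
import math

def grow_crack(a0_m, a_crit_m, delta_sigma_mpa, c=1e-11, m=3.0, y=1.0):
    """Integrate da/dN = C * (dK)^m block by block, with the stress-intensity
    range dK = Y * d_sigma * sqrt(pi * a) in MPa*sqrt(m). Returns the cycle
    count at which the crack reaches a_crit. C, m and Y are illustrative."""
    a, cycles, block = a0_m, 0, 1000
    while a < a_crit_m:
        dk = y * delta_sigma_mpa * math.sqrt(math.pi * a)
        a += c * dk**m * block  # growth over one block of cycles
        cycles += block
    return cycles

# Hypothetical case: 1 mm initial crack, 10 mm critical, 100 MPa stress range.
cycles = grow_crack(a0_m=0.001, a_crit_m=0.01, delta_sigma_mpa=100.0)
print(f"cycles to critical crack length: ~{cycles}")
```

Load-interaction models like generalized Willenborg then modulate the effective dK after an overload, retarding growth relative to this constant-amplitude baseline.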

  19. An UI Layout Files Analyzer for Test Data Generation

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2014-01-01

    Full Text Available Prevention actions (trainings, audits) and inspections (tests, validations, code reviews) are crucial factors in achieving a high quality level for any software application, simply because low investment in this area leads to significant expenses in terms of the corrective actions needed for defect fixing. Mobile application testing involves the use of various tools and scenarios. An important process is test data generation. This paper proposes a test data generator (TDG) system for mobile applications using several sources of test data, and it focuses on the UI layout file analyzer module. The proposed architecture aims to reduce time-to-market for mobile applications. The focus is on test data generators based on the source code, user interface layout files (using markup languages like XML or XAML), and application specifications. In order to ensure a common interface for test data generators, an XML- or JSON-based language called Data Specification Language (DSL) is proposed.
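A minimal sketch of the layout-analyzer idea (the layout fragment, attribute names, and generation rules below are hypothetical; the paper's DSL is not reproduced here): parse the UI layout XML, find the input widgets, and emit test values matching each widget's declared input type:

```python
import random
import xml.etree.ElementTree as ET

# Hypothetical Android-style layout fragment.
LAYOUT = """
<LinearLayout>
  <EditText id="name" inputType="text"/>
  <EditText id="age" inputType="number"/>
  <EditText id="email" inputType="textEmailAddress"/>
</LinearLayout>
"""

# One simple generation rule per input type.
GENERATORS = {
    "text": lambda rng: "user" + str(rng.randint(0, 999)),
    "number": lambda rng: str(rng.randint(0, 120)),
    "textEmailAddress": lambda rng: f"test{rng.randint(0, 99)}@example.com",
}

def generate_test_data(layout_xml, seed=0):
    """Map each EditText id to a generated value based on its inputType."""
    rng = random.Random(seed)
    root = ET.fromstring(layout_xml)
    data = {}
    for widget in root.iter("EditText"):
        gen = GENERATORS.get(widget.get("inputType"), GENERATORS["text"])
        data[widget.get("id")] = gen(rng)
    return data

print(generate_test_data(LAYOUT))
```

In the paper's architecture, the analyzer's output would be expressed in the proposed DSL so that generators driven by source code or specifications can feed the same test harness.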

  20. Governance of Aquatic Agricultural Systems: Analyzing Representation, Power, and Accountability

    Directory of Open Access Journals (Sweden)

    Blake D. Ratner

    2013-12-01

    Full Text Available Aquatic agricultural systems in developing countries face increasing competition from multiple stakeholders over rights to access and use natural resources, land, water, wetlands, and fisheries, essential to rural livelihoods. A key implication is the need to strengthen governance to enable equitable decision making amidst competition that spans sectors and scales, building capacities for resilience, and for transformations in institutions that perpetuate poverty. In this paper we provide a simple framework to analyze the governance context for aquatic agricultural system development focused on three dimensions: stakeholder representation, distribution of power, and mechanisms of accountability. Case studies from Cambodia, Bangladesh, Malawi/Mozambique, and Solomon Islands illustrate the application of these concepts to fisheries and aquaculture livelihoods in the broader context of intersectoral and cross-scale governance interactions. Comparing these cases, we demonstrate how assessing governance dimensions yields practical insights into opportunities for transforming the institutions that constrain resilience in local livelihoods.

  1. Analyzing Enterprise Networks Needs: Action Research from the Mechatronics Sector

    Science.gov (United States)

    Cagnazzo, Luca; Taticchi, Paolo; Bidini, Gianni; Baglieri, Enzo

    New business models and theories are nowadays developing in the direction of collaborative environments, and many new tools for sustaining the companies involved in these organizations are emerging. Among them, a plethora of methodologies for analyzing the needs of single companies has already been developed. Few academic works are available on Enterprise Network (EN) needs analysis. This paper presents the learning from an action research (AR) project in the mechatronics sector: AR has been used in order to experience the issue of evaluating network needs and thereby define, develop, and test a complete framework for network evaluation. Reflection on the story in the light of the experience and the theory is presented, as well as extrapolation to a broader context and articulation of usable knowledge.

  2. USING THE FACTORIAL CORRESPONDENCES FOR ANALYZING TOURIST FLOWS

    Directory of Open Access Journals (Sweden)

    Kamer Ainur M. AIVAZ

    2016-06-01

    Full Text Available This study aims to analyze the distribution of each flow of non-resident tourists, coming from 33 countries, across the six main categories of tourist destinations in Romania in 2015, and assumes that there are differences or similarities between the tourists' country of origin and the destinations they chose. Romania's performance in attracting foreign tourists has been relatively modest during the past three decades, for various reasons, ranging from poor access infrastructure to deficient and sometimes inadequate tourism promotion. The statistical method used is factorial correspondence analysis. The data processing, significance testing of the indicators, and graph representations were performed using SPSS statistical software. We consider that the use of this method allows indirect knowledge of tourist preferences, and the results may be useful in developing a tourism promotion strategy customized for each country that sends tourists.

  3. Improving and analyzing signage within a healthcare setting.

    Science.gov (United States)

    Rousek, J B; Hallbeck, M S

    2011-11-01

    Healthcare facilities are increasingly utilizing pictograms rather than text signs to help direct people. The purpose of this study was to analyze a wide variety of standardized healthcare pictograms and the effects of color contrasts and complexity for participants with both normal and impaired vision. Fifty (25 males, 25 females) participants completed a signage recognition questionnaire and identified pictograms while wearing vision simulators to represent specific visual impairment. The study showed that certain color contrasts, complexities and orientations can help or hinder comprehension of signage for people with and without visual impairment. High contrast signage with consistent pictograms involving human figures (not too detailed or too abstract) is most identifiable. Standardization of healthcare signage is recommended to speed up and aid the cognitive thought process in detecting signage and determining meaning. These fundamental signage principles are critical in producing an efficient, universal wayfinding system for healthcare facilities. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Nuclear Plant Analyzer development at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Laats, E.T.

    1986-10-01

    The Nuclear Plant Analyzer (NPA) is a state-of-the-art safety analysis and engineering tool being used to address key nuclear power plant safety issues. Under the sponsorship of the US Nuclear Regulatory Commission (NRC), the NPA has been developed to integrate the NRC's computerized reactor behavior simulation codes such as RELAP5, TRAC-BWR and TRAC-PWR, with well-developed computer color graphics programs and large repositories of reactor design and experimental data. An important feature of the NPA is the capability to allow an analyst to redirect a RELAP5 or TRAC calculation as it progresses through its simulated scenario. The analyst can have the same power plant control capabilities as the operator of an actual plant. The NPA resides on the dual Control Data Corporation Cyber 176 mainframe computers at the Idaho National Engineering Laboratory and Cray-1S computers at the Los Alamos National Laboratory (LANL) and Kirtland Air Force Weapons Laboratory (KAFWL)

  5. Measuring and analyzing the causes of problematic Internet use.

    Science.gov (United States)

    Chiang, I-Ping; Su, Yung-Hsiang

    2012-11-01

    Since Internet surfing became a daily activity, people have changed their behavior. This research analyzes the causes of problematic Internet use through an online survey, where 1,094 samples were collected. Based on the results of structural equation modeling analysis, the following conclusions are reached: First, novelty, security, and efficiency increase users' online trust. Second, information and efficiency enhance users' sharing and anonymity online. Third, greater trust in Internet environments leads to an increase in a user's cognitive bias toward online behavioral responsibility and Internet addiction. Fourth, a user's attitude toward online sharing further increases the cognitive bias toward online copyright. Fifth, a user's attitude toward anonymity increases cognitive bias toward online copyright, online behavioral responsibility, and deepens Internet addiction.

  6. Guiding center model to interpret neutral particle analyzer results

    Science.gov (United States)

    Englert, G. W.; Reinmann, J. J.; Lauver, M. R.

    1974-01-01

    The theoretical model is discussed, which accounts for drift and cyclotron components of ion motion in a partially ionized plasma. Density and velocity distributions are systematically prescribed. The flux into the neutral particle analyzer (NPA) from this plasma is determined by summing over all charge-exchange neutrals in phase space which are directed into the apertures. Especially detailed data, obtained by sweeping the line of sight of the apertures across the plasma of the NASA Lewis HIP-1 burnout device, are presented. Selection of randomized cyclotron velocity distributions about the mean azimuthal drift yields energy distributions which compare well with experiment. Data obtained with a bending magnet on the NPA showed that the separation between the energy distribution curves of the various mass species correlates well with the theory's ratio of drift to mean cyclotron energy. Use of the guiding center model in conjunction with NPA scans across the plasma aids in estimating the ion density and E-field variation with plasma radius.

  7. Comparison of electron cloud mitigating coatings using retarding field analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Calvey, J.R., E-mail: jrc97@cornell.edu; Hartung, W.; Li, Y.; Livezey, J.A.; Makita, J.; Palmer, M.A.; Rubin, D.

    2014-10-01

    In 2008, the Cornell Electron Storage Ring (CESR) was reconfigured to serve as a test accelerator (CESRTA) for next generation lepton colliders, in particular for the ILC damping ring. A significant part of this program has been the installation of diagnostic devices to measure and quantify the electron cloud effect, a potential limiting factor in these machines. One such device is the Retarding Field Analyzer (RFA), which provides information on the local electron cloud density and energy distribution. Several different styles of RFAs have been designed, tested, and deployed throughout the CESR ring. They have been used to study the growth of the cloud in different beam conditions, and to evaluate the efficacy of different mitigation techniques. This paper will provide an overview of RFA results obtained in a magnetic field free environment.
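The principle behind an RFA measurement, that the electron energy distribution is proportional to the negative derivative of collector current with respect to retarding voltage, can be illustrated on synthetic data (a simplified sketch with an assumed distribution shape, not the CESRTA analysis code):

```python
import math

# Synthetic electron energy distribution f(E) ~ E * exp(-E/T), arbitrary units.
T_EV = 5.0                                   # characteristic energy
energies = [0.1 * i for i in range(1, 401)]  # 0.1 .. 40 eV grid
f_true = [e * math.exp(-e / T_EV) for e in energies]

# Collector current at retarding voltage V: only electrons with E > eV pass,
# so I(V) is the integral of f(E) above V.
step = 0.1
current = [sum(f_true[i:]) * step for i in range(len(energies))]

# Recover the distribution as -dI/dV by finite differences.
f_recovered = [-(current[i + 1] - current[i]) / step
               for i in range(len(current) - 1)]

peak_ev = energies[max(range(len(f_recovered)), key=f_recovered.__getitem__)]
print(f"recovered peak near {peak_ev:.1f} eV")  # true peak of E*exp(-E/T) is at E = T = 5 eV
```

Real RFA data additionally require corrections for grid transparency and secondary emission from the collector, but the derivative step above is the core of the analysis.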

  8. ANALYZING FAT-TAILED DISTRIBUTIONS IN EMERGING CAPITAL MARKETS

    Directory of Open Access Journals (Sweden)

    FELICIA RAMONA BIRĂU

    2013-05-01

    Full Text Available This article focuses on analyzing the implications of fat-tailed distributions in emerging capital markets. An essential aspect highlighted by most empirical research, especially on emerging capital markets, is the fact that extreme financial events cannot be accurately predicted by the normal distribution. Fat-tailed distributions constitute a very effective econometric tool in the analysis of rare events characterized by extreme values that occur with relatively high frequency. The importance of exploring this particular issue derives from the fact that it is fundamental for optimal portfolio selection, derivatives valuation, financial hedging, and risk management strategies. The implications of fat-tailed distributions for the investment process are significant, especially in the turbulent context of the global financial crisis.
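The point about extreme events can be made concrete by comparing tail probabilities. Using the closed-form CDF of a Student-t distribution with 3 degrees of freedom against the normal distribution (a standard textbook comparison, not taken from the article):

```python
import math

def normal_two_sided_tail(x):
    """P(|Z| > x) for a standard normal variable."""
    return math.erfc(x / math.sqrt(2.0))

def t3_two_sided_tail(x):
    """P(|T| > x) for a Student-t variable with 3 degrees of freedom,
    using the closed-form CDF available for nu = 3."""
    s = math.sqrt(3.0)
    cdf = 0.5 + (1.0 / math.pi) * (x * s / (3.0 + x * x) + math.atan(x / s))
    return 2.0 * (1.0 - cdf)

x = 4.0  # a "4-sigma" move
print(f"normal tail: {normal_two_sided_tail(x):.2e}")  # ~6.3e-05
print(f"t(3) tail:   {t3_two_sided_tail(x):.2e}")      # ~2.8e-02, hundreds of times larger
```

Under the fat-tailed model, a 4-sigma event is an everyday risk rather than a once-in-decades anomaly, which is exactly why the normal assumption misprices rare events in portfolio and risk calculations.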

  9. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1993-01-01

This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non-safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  10. Interlaboratory variation in a commercial bone mineral analyzer

    International Nuclear Information System (INIS)

    Mazess, R.B.; Witt, R.

    1983-01-01

Measurements of bone mineral content (BMC) were made in 14 different laboratories in the US and four in Europe using commercially produced instrumentation (Norland Bone Mineral Analyzer) for ¹²⁵I absorptiometry. A three-chambered standard (dipotassium hydrogen phosphate) was measured in each laboratory following its own calibration. The BMC values in the middle range (0.6 g/cm) were all adequate (within ±2%), but BMC was underestimated by 5% or more in five laboratories for the largest chamber and in three laboratories for the smallest chamber. Width values were accurate (±3%) over 0.7 to 1.6 cm. The effect of underestimating large values in clinical studies is to reduce the difference between normals and abnormals. Calibration error may also be responsible for the variable normal values found in the US and Europe by some users of this instrument.

  11. Development of a useful technique for analyzing behavioral teratologic data

    International Nuclear Information System (INIS)

    Jensh, R.P.; Brent, R.L.

    1989-01-01

Two measures of postnatal development are described in this paper: the PAC50 and AD50. These measures proved to be more sensitive than the use of means in the evaluation of three radiation studies involving postnatal developmental evaluation. PAC50 is the percent of achievement of a goal by litters or offspring in an experimental group at the age when 50% of the control litters or offspring attain that goal. AD50 is the age (acquisition day) at which 50% of the litters or offspring in each group attain a specified developmental goal. This methodology is a useful technique for analyzing selected behavioral data following in utero X-irradiation and may prove to be a sensitive means of determining postnatal alteration due to prenatal exposure to a variety of suspect agents.
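The two measures defined in the abstract are straightforward to compute from per-litter acquisition days. The following sketch (an illustration assuming one acquisition day per litter, not the authors' original code) takes AD50 as the earliest day by which at least 50% of litters have attained the goal, and PAC50 as the percentage of experimental litters that have attained it by the control group's AD50.

```python
import math

def ad50(acquisition_days):
    # AD50: earliest age (day) at which >= 50% of litters
    # have attained the developmental goal
    ordered = sorted(acquisition_days)
    return ordered[math.ceil(len(ordered) / 2) - 1]

def pac50(experimental_days, control_days):
    # PAC50: percent of experimental litters attaining the goal
    # by the control group's AD50
    cutoff = ad50(control_days)
    attained = sum(1 for day in experimental_days if day <= cutoff)
    return 100.0 * attained / len(experimental_days)

# Hypothetical data: exposed litters acquire the reflex two days late
control = [10, 11, 12, 13, 14]
exposed = [12, 13, 14, 15, 16]
```

With these illustrative numbers the control AD50 is day 12, and only one of five exposed litters (20%) has attained the goal by then, showing how PAC50 exposes a shift that group means might understate.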

  12. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

This slide presentation reviews a project for developing an approach to analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks. These systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  13. Monitoring aspirin therapy with the Platelet Function Analyzer-100

    DEFF Research Database (Denmark)

    Mortensen, Jette; Poulsen, Tina Svenstrup; Grove, Erik Lerkevang

    2008-01-01

OBJECTIVE: Low platelet response to aspirin has been reported to be associated with a high incidence of vascular events. The reported prevalence of aspirin low-responsiveness varies, which may be explained by poor reproducibility of the methods used to evaluate aspirin response and by low compliance. The Platelet Function Analyzer-100 (PFA-100) is a commonly used platelet function test. We aimed to assess the reproducibility of the PFA-100 and its agreement with optical platelet aggregometry (OPA) in healthy volunteers and in patients with coronary artery disease (CAD) treated with low-dose aspirin. MATERIAL AND METHODS: Twenty-one healthy volunteers and 43 patients with CAD took part in the study. During treatment with aspirin 75 mg daily, all participants had platelet function assessed in duplicate with the PFA-100 and OPA on 4 consecutive days. Additionally, platelet function was assessed before…

  14. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicates the interpretation … understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug…

  15. Outline of fast analyzer for MHD equilibrium 'FAME'

    International Nuclear Information System (INIS)

    Sakata, Shinya; Haginoya, Hirofumi; Tsuruoka, Takuya; Aoyagi, Tetsuo; Saito, Naoyuki; Harada, Hiroo; Tani, Keiji; Watanabe, Hideto.

    1994-03-01

The FAME (Fast Analyzer for Magnetohydrodynamic (MHD) Equilibrium) system has been developed in order to provide more than 100 MHD equilibria in time series, enough for the non-stationary analysis of JT-60 experimental data, within the shot interval of about 20 minutes. The FAME is an MIMD-type small-scale parallel computer with 20 microprocessors connected by a multi-stage switching system. The maximum theoretical speed is 250 MFLOPS. For the software system of FAME, the MHD equilibrium analysis code SELENE and its input data production code FBI were tuned with parallel processing in mind. Consequently, the computational performance of the FAME system is more than 7 times that of the existing general purpose computer FACOM M780-10s. This report summarizes the outline of the FAME system, including hardware, software and peripheral equipment. (author)

  16. Solid State Neutral Particle Analyzer Array on NSTX

    International Nuclear Information System (INIS)

    Shinohara, K.; Darrow, D.S.; Roquemore, A.L.; Medley, S.S.; Cecil, F.E.

    2004-01-01

A Solid State Neutral Particle Analyzer (SSNPA) array has been installed on the National Spherical Torus Experiment (NSTX). The array consists of four chords viewing through a common vacuum flange. The tangency radii of the viewing chords are 60, 90, 100, and 120 cm. They view across the three co-injection neutral beam lines (deuterium, 80 keV (typ.), with tangency radii 48.7, 59.2, and 69.4 cm) on NSTX and detect co-going energetic ions. The silicon photodiode used was calibrated with a mono-energetic deuteron beam source. Deuterons with energy above 40 keV can be detected with the present setup. The degradation of the performance was also investigated. Lead shot and epoxy are used for neutron shielding, avoiding the handling of hazardous heavy metal. This method also allows the shielding to be formed into an arbitrary shape to fit the complex flight tube.

  17. UVB DNA dosimeters analyzed by polymerase chain reaction

    International Nuclear Information System (INIS)

    Yoshida, Hiroko; Regan, J.D.; Florida Inst. of Tech., Melbourne, FL

    1997-01-01

Purified bacteriophage λ DNA was dried on a UV-transparent polymer film and served as a UVB dosimeter for personal and ecological applications. Bacteriophage λ DNA was chosen because it is commercially available and inexpensive, and its entire sequence is known. Each dosimeter contained two sets of DNA sandwiched between UV-transparent polymer films, one exposed to solar radiation (experimental) and another protected from UV radiation by black paper (control). The DNA dosimeter was then analyzed by a polymerase chain reaction (PCR) that amplifies a specific 500-base-pair region of λ DNA. Photoinduced damage in DNA blocks the polymerase from synthesizing a new strand; therefore, the amount of amplified product in UV-exposed DNA was reduced from that found in control DNA. The dried λ DNA dosimeter is compact, robust, safe and transportable, stable over long storage times, and provides the total UVB dose integrated over the exposure time. (author)
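The dose readout described here rests on a standard quantitative-PCR damage model: if lesions are distributed randomly along the template, only lesion-free fragments amplify, so the exposed/control yield ratio equals the Poisson zero class. The sketch below illustrates that model in general terms; it is not the authors' analysis pipeline, and the sample numbers are invented.

```python
import math

def lesions_per_fragment(exposed_yield, control_yield):
    # Poisson model for randomly placed photoproducts: a fragment
    # amplifies only if it carries zero lesions, so
    #   exposed/control = exp(-L)  =>  L = -ln(exposed/control)
    # where L is the mean number of lesions per amplified fragment.
    return -math.log(exposed_yield / control_yield)

# Hypothetical example: exposed DNA gives half the PCR product
# of the shielded control over the 500 bp amplicon.
L = lesions_per_fragment(0.5, 1.0)  # mean lesions per 500 bp fragment
```

A halved yield thus corresponds to ln 2, about 0.69 lesions per 500 bp fragment; converting L to an absolute UVB dose would additionally require an empirical calibration curve.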

  18. ACTIVITY-BASED COST ALLOCATION AND FUNCTION ANALYSIS IN TRADE

    Directory of Open Access Journals (Sweden)

    TÜNDE VERES

    2011-01-01

Full Text Available In this paper the author considers the analysis of trading efficiency. The most important evaluation factors in trade are sales value, volume and margin. The easiest and fastest way to follow the market situation is through turnover, but for long-term thinking, sales companies also need to concentrate on efficiency. Trading activity has functions that can deeply affect the final result, so calculating their costs clearly and reliably is an important condition for decision making. The author reviews the cost categories and the basic functions in trading activity to find possible ways of obtaining reliable information.

  19. Using game theory to analyze green stormwater infrastructure implementation policies

    Science.gov (United States)

    William, R. K.; Garg, J.; Stillwell, A. S.

    2017-12-01

    While green stormwater infrastructure is a useful approach in addressing multiple challenges facing the urban environment, little consensus exists on how to best incentivize its adoption by private landowners. Game theory, a field of study designed to model conflict and cooperation between two or more agents, is well-suited to address this policy question. We used a cooperative game theory framework to analyze the impacts of three different policy approaches frequently used to incentivize the uptake of green infrastructure by private landowners: municipal regulation, direct grants, and stormwater fees. The results indicate that municipal regulation leads to the greatest environmental benefits; however, the choice of "best" regulatory approach is dependent on a variety of different factors including political and financial considerations. Policy impacts are also highly dependent on agents' spatial positions within the stormwater network. This finding leads to important questions of social equity and environmental justice.
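A standard device in the cooperative game theory framework the abstract invokes is the Shapley value, which splits a coalition's joint benefit among players according to their average marginal contributions. The sketch below applies it to a toy stormwater game; the characteristic-function numbers (fee savings for each coalition of three hypothetical landowners A, B, C) are invented for illustration and are not from the paper.

```python
from itertools import permutations

# Hypothetical characteristic function: v[S] = stormwater-fee savings
# when coalition S of landowners jointly installs green infrastructure.
v = {
    frozenset(): 0,
    frozenset("A"): 100, frozenset("B"): 120, frozenset("C"): 80,
    frozenset("AB"): 260, frozenset("AC"): 220, frozenset("BC"): 240,
    frozenset("ABC"): 400,
}

def shapley(players, v):
    # Average each player's marginal contribution over all join orders.
    phi = dict.fromkeys(players, 0.0)
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition = coalition | {p}
    return {p: total / len(orders) for p, total in phi.items()}

allocation = shapley(("A", "B", "C"), v)
```

By construction the allocation is efficient (the shares sum to v of the grand coalition, 400 here), which is one reason Shapley-style splits are attractive when reasoning about how incentive payments should be divided among cooperating landowners.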

  20. Method of analyzing the shipping inspection water in nuclear fuels

    International Nuclear Information System (INIS)

    Ishikawa, Tatsuo; Izumoji, Yoshiaki.

    1984-01-01

Purpose: To automatically perform the removal of radioactive corrosion products and thereby enable automatic analysis of shipping inspection water. Method: A radioactive-corrosion-product removing device has a column filled with a chelate resin layer. Shipping inspection water, as a specimen, is injected from a sample-water inlet into the column, passes through the chelate resin layer, issues through the sample-water exit, and is then stored in a sample collector. The shipping inspection water in the sample collector is sent to an inspection port, on-line or off-line, and gamma-ray determination is made with a gamma-ray spectrum analyzer. (Horiuchi, T.)