WorldWideScience

Sample records for analyzing one-way experiments

  1. An experiment to measure the one-way velocity of propagation of electromagnetic radiation

    Science.gov (United States)

    Kolen, P.; Torr, D. G.

    1982-01-01

    An experiment involving commercially available instrumentation to measure the velocity of the earth with respect to absolute space is described. The experiment involves the measurement of the one-way propagation velocity of electromagnetic radiation down a high-quality coaxial cable. It is demonstrated that the experiment is both physically meaningful and exceedingly simple in concept and in implementation. It is shown that with currently available commercial equipment one might expect to detect a threshold value for the component of velocity of the earth's motion with respect to absolute space in the equatorial plane of approximately 10 km/s, which greatly exceeds the velocity resolution required to detect the motion of the solar system with respect to the center of the galaxy.

  2. Sample size calculation for microarray experiments with blocked one-way design

    Directory of Open Access Journals (Sweden)

    Jung Sin-Ho

    2009-05-01

    Full Text Available Abstract Background One of the main objectives of microarray analysis is to identify differentially expressed genes for different types of cells or treatments. Many statistical methods have been proposed to assess the treatment effects in microarray experiments. Results In this paper, we consider discovery of the genes that are differentially expressed among K (> 2) treatments when each set of K arrays constitutes a block. In this case, the array data among the K treatments tend to be correlated because of the block effect. We propose to use the blocked one-way ANOVA F-statistic to test whether each gene is differentially expressed among the K treatments. The marginal p-values are calculated using a permutation method accounting for the block effect, adjusting for the multiplicity of the testing procedure by controlling the false discovery rate (FDR). We propose a sample size calculation method for microarray experiments with a blocked one-way design. With the FDR level and effect sizes of genes specified, our formula provides a sample size for a given number of true discoveries. Conclusion The calculated sample size is shown via simulations to provide an accurate number of true discoveries while controlling the FDR at the desired level.
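
    As an illustration of the test statistic described above, here is a minimal sketch (Python, hypothetical data, not the authors' code) of the blocked one-way ANOVA F-statistic for a single gene together with a within-block permutation p-value; the per-gene p-values would then be passed to an FDR-controlling procedure such as Benjamini-Hochberg:

        import numpy as np

        def blocked_anova_f(y):
            # y has shape (K, B): K treatments, each measured once in each of B blocks (arrays)
            K, B = y.shape
            grand = y.mean()
            ss_total = ((y - grand) ** 2).sum()
            ss_treat = B * ((y.mean(axis=1) - grand) ** 2).sum()
            ss_block = K * ((y.mean(axis=0) - grand) ** 2).sum()
            ss_error = ss_total - ss_treat - ss_block
            return (ss_treat / (K - 1)) / (ss_error / ((K - 1) * (B - 1)))

        def permutation_p(y, n_perm=2000, seed=0):
            # permute treatment labels within each block so the block effect is preserved
            rng = np.random.default_rng(seed)
            f_obs = blocked_anova_f(y)
            hits = sum(blocked_anova_f(np.apply_along_axis(rng.permutation, 0, y)) >= f_obs
                       for _ in range(n_perm))
            return (hits + 1) / (n_perm + 1)

        # hypothetical expression values for one gene: K = 3 treatments, B = 4 blocks
        y = np.array([[5.1, 4.8, 5.3, 5.0],
                      [5.9, 6.1, 6.0, 5.8],
                      [5.2, 5.0, 5.4, 5.1]])
        print(blocked_anova_f(y), permutation_p(y))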

  3. Through the one-way mirror.

    NARCIS (Netherlands)

    van Heerden, J.; Hoogstraten, J.

    1981-01-01

    Three experiments tested whether the presence of a one-way mirror raises feelings of vulnerability and test anxiety. In Exp I, 37 male undergraduates performed 5 tasks measuring social desirability, personal history, and ethics either in front of a one-way mirror or not. Exp II duplicated Exp I, but

  4. The Roland De Witte Experiment, R. T. Cahill, and the One-Way Speed of Light (Letters to Progress in Physics)

    Directory of Open Access Journals (Sweden)

    Catania J.

    2016-01-01

    Full Text Available In “The Roland De Witte 1991 Experiment (to the Memory of Roland De Witte)” (Progr. Phys., 2006, v. 2(3), 60–65), R. T. Cahill gives us a briefing on his view that interferometer measurements and one-way RF coaxial cable propagation-time measurements amount to a detection of the anisotropy in the speed of light. However, while I obtain first-order propagation delays in calculations for one-way transit which would show geometric modulation by Earth’s rotation, I do not agree with Cahill’s simplistic equation that relates the modulation solely to the projection of the absolute velocity vector v on the coaxial cable, called v_P by Cahill (ibid., p. 61–62). The reader should be warned that Cahill’s equation for Δt (ibid., p. 63) is crude compared with a full Special Relativistic derivation.

  5. A test of the one-way isotropy of the speed of light from the T2L2 space experiment

    Science.gov (United States)

    Belli, A.; Exertier, P.; Samain, E.

    2017-12-01

    The Time Transfer by Laser Link (T2L2) space experiment that is currently flying on-board Jason-2 (1335 km of altitude) provides an opportunity to make a test of the isotropy of the speed of light using one-way propagation on a non-laboratory scale. Following the general framework given by Mansouri and Sexl (1977), which permits violations of Einstein special relativity, we study the problem of deducing the isotropy of the speed between two clocks as the orientation path varies relative to an inertial reference frame. The short-term stability of the T2L2 ground-to-space time transfer has been established at 5-6 ps at 60 seconds between a hydrogen maser and the on-board oscillator in use on the Jason-2 satellite. Nevertheless, during the satellite pass above a laser ranging station (of around 1000 seconds), the stability of the space oscillator degrades as τ^{3/2}, which clearly impacts the expected performance of the present test. We thus give insights into certain modelling issues and processes, including time transfer problems which have a bearing on the global error budget. Our goal is to achieve an accuracy of δc/c ≈ 2-3×10^{-9} locally, with scope for improvement by accumulating numerous passes over the same laser ranging station.

  6. One-way unlocalizable information deficit

    International Nuclear Information System (INIS)

    Zhu Xuena; Fei Shaoming

    2013-01-01

    We introduce a one-way unlocalizable information deficit with respect to the one-way information deficit, similar to the definition of one-way unlocalizable quantum discord with respect to one-way quantum discord. The properties of the one-way unlocalizable information deficit and the relations among one-way unlocalizable information deficit, one-way unlocalizable quantum discord, one-way quantum discord, one-way information deficit and other quantum correlations are investigated. Analytical formulas of the one-way unlocalizable information deficit are given with detailed examples. (paper)

  7. Analyzing Motives, Preferences, and Experiences in Video Game Play

    Directory of Open Access Journals (Sweden)

    Donald Loffredo

    2017-04-01

    Full Text Available This paper presents the results of analyzing motives, preferences, and experiences in video game play. A sample of 112 (64 male and 48 female) students completed the Gaming Attitudes, Motives, and Experiences Scales (GAMES) online. Separate one-way independent-measures multivariate analyses of variance (MANOVAs) were used to determine if there were statistically significant differences by gender, age category, hours of videogame play, and ethnicity on the nine Factor Subscales of the GAMES. The results supported two of the proposed hypotheses. There were statistically significant differences by gender and hours of videogame play on some of the Factor Subscales of the GAMES.
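
    For readers unfamiliar with the analysis, a one-way independent-measures MANOVA can be run in a few lines of Python with statsmodels; the subscale names and scores below are hypothetical stand-ins, not the GAMES data:

        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        # hypothetical scores on three of the nine factor subscales, plus a grouping variable
        df = pd.DataFrame({
            "story":     [4.2, 3.8, 4.5, 2.9, 3.1, 3.7, 4.0, 2.5],
            "social":    [3.5, 4.1, 3.9, 2.8, 3.0, 3.6, 4.2, 2.7],
            "challenge": [4.0, 3.9, 4.4, 3.2, 3.3, 3.8, 4.1, 3.0],
            "gender":    ["m", "m", "m", "m", "f", "f", "f", "f"],
        })

        # one grouping factor, several dependent variables: a one-way MANOVA
        fit = MANOVA.from_formula("story + social + challenge ~ gender", data=df)
        print(fit.mv_test())  # Wilks' lambda, Pillai's trace, etc. for the gender effect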

  8. Efficient Constructions for One-way Hash Chains

    National Research Council Canada - National Science Library

    Hu, Yih-Chun; Jakobsson, Markus; Perrig, Adrian

    2003-01-01

    .... Our first construction, the Sandwich-chain, provides a smaller bandwidth overhead for one-way chain values, and enables efficient verification of one-way chain values if the trusted one-way chain value is far away...
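
    The record is truncated; as background, a plain one-way hash chain (a generic sketch, not the Sandwich-chain construction described in the report) looks like this in Python:

        import hashlib

        def build_chain(seed: bytes, n: int) -> list:
            # v[i+1] = H(v[i]); the final value is published as the trusted commitment
            chain = [seed]
            for _ in range(n):
                chain.append(hashlib.sha256(chain[-1]).digest())
            return chain

        def verify(value: bytes, trusted: bytes, max_steps: int) -> bool:
            # a revealed value is authentic if repeated hashing reaches the trusted value
            for _ in range(max_steps + 1):
                if value == trusted:
                    return True
                value = hashlib.sha256(value).digest()
            return False

        chain = build_chain(b"example-seed", 100)
        print(verify(chain[50], chain[-1], 100))   # True: 50 hashes reach the commitment

    Constructions such as the Sandwich-chain aim to reduce the bandwidth and verification cost of exactly this reveal-and-verify step.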

  9. Efficient one-way quantum computations for quantum error correction

    International Nuclear Information System (INIS)

    Huang Wei; Wei Zhaohui

    2009-01-01

    We show how to explicitly construct an O(nd) size and constant quantum depth circuit which encodes any given n-qubit stabilizer code with d generators. Our construction is derived using the graphic description for stabilizer codes and the one-way quantum computation model. Our result demonstrates how to use cluster states as scalable resources for many multi-qubit entangled states and how to use the one-way quantum computation model to improve the design of quantum algorithms.

  10. Efficient quantum circuits for one-way quantum computing.

    Science.gov (United States)

    Tanamoto, Tetsufumi; Liu, Yu-Xi; Hu, Xuedong; Nori, Franco

    2009-03-13

    While Ising-type interactions are ideal for implementing controlled phase flip gates in one-way quantum computing, natural interactions between solid-state qubits are most often described by either the XY or the Heisenberg models. We show an efficient way of generating cluster states directly using either the imaginary SWAP (iSWAP) gate for the XY model, or the sqrt[SWAP] gate for the Heisenberg model. Our approach thus makes one-way quantum computing more feasible for solid-state devices.

  11. Delay or anticipatory synchronization in one-way coupled systems ...

    Indian Academy of Sciences (India)

    Abstract. We present a mechanism for the synchronization of one-way coupled nonlinear systems in which the coupling uses a variable delay, that is reset at finite intervals. Here the delay varies in the same way as the system in time and so the coupling function remains constant for the reset interval at the end of which it is ...

  12. One-way quantum computing in superconducting circuits

    Science.gov (United States)

    Albarrán-Arriagada, F.; Alvarado Barrios, G.; Sanz, M.; Romero, G.; Lamata, L.; Retamal, J. C.; Solano, E.

    2018-03-01

    We propose a method for the implementation of one-way quantum computing in superconducting circuits. Measurement-based quantum computing is a universal quantum computation paradigm in which an initial cluster state provides the quantum resource, while the iteration of sequential measurements and local rotations encodes the quantum algorithm. Up to now, technical constraints have limited a scalable approach to this quantum computing alternative. The initial cluster state can be generated with available controlled-phase gates, while the quantum algorithm makes use of high-fidelity readout and coherent feedforward. With current technology, we estimate that quantum algorithms with above 20 qubits may be implemented in the path toward quantum supremacy. Moreover, we propose an alternative initial state with properties of maximal persistence and maximal connectedness, reducing the required resources of one-way quantum computing protocols.

  13. Fundamentals of universality in one-way quantum computation

    International Nuclear Information System (INIS)

    Nest, M van den; Duer, W; Miyake, A; Briegel, H J

    2007-01-01

    In this paper, we build a framework allowing for a systematic investigation of the fundamental issue: 'Which quantum states serve as universal resources for measurement-based (one-way) quantum computation?' We start our study by re-examining what is exactly meant by 'universality' in quantum computation, and what the implications are for universal one-way quantum computation. Given the framework of a measurement-based quantum computer, where quantum information is processed by local operations only, we find that the most general universal one-way quantum computer is one which is capable of accepting arbitrary classical inputs and producing arbitrary quantum outputs-we refer to this property as CQ-universality. We then show that a systematic study of CQ-universality in one-way quantum computation is possible by identifying entanglement features that are required to be present in every universal resource. In particular, we find that a large class of entanglement measures must reach its supremum on every universal resource. These insights are used to identify several families of states as being not universal, such as one-dimensional (1D) cluster states, Greenberger-Horne-Zeilinger (GHZ) states, W states, and ground states of non-critical 1D spin systems. Our criteria are strengthened by considering the efficiency of a quantum computation, and we find that entanglement measures must obey a certain scaling law with the system size for all efficient universal resources. This again leads to examples of non-universal resources, such as, e.g. ground states of critical 1D spin systems. On the other hand, we provide several examples of efficient universal resources, namely graph states corresponding to hexagonal, triangular and Kagome lattices. Finally, we consider the more general notion of encoded CQ-universality, where quantum outputs are allowed to be produced in an encoded form. Again we provide entanglement-based criteria for encoded universality. Moreover, we present a

  14. Modified one-way coupled map lattices as communication cryptosystems

    International Nuclear Information System (INIS)

    Zhao Mingchao; Li Kezan; Fu Xinchu

    2009-01-01

    In this paper, we modify the original communication cryptosystem based on OCML (one-way coupled map lattices), and present a modified OCML communication cryptosystem. The modified OCML communication cryptosystem is shown to have some additional advantages compared to the original one, e.g., it has a larger parameter space, and is more capable of anti-error analysis. And, we apply this modified OCML communication cryptosystem for multiplex OCML communication.
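
    As background, a generic (unmodified) one-way coupled map lattice of logistic maps, of the kind such cryptosystems build on, can be sketched as follows; the coupling strength and map parameter are illustrative, and the paper's modifications are not reproduced:

        import numpy as np

        def ocml_step(x, eps=0.95, a=3.9):
            # x_j(n+1) = (1 - eps) * f(x_j(n)) + eps * f(x_{j-1}(n)), with f the logistic map;
            # information flows one way only, from site j-1 to site j
            f = a * x * (1.0 - x)
            new = (1.0 - eps) * f
            new[1:] += eps * f[:-1]
            new[0] += eps * f[0]   # site 0 drives itself (in a cryptosystem it carries key/signal)
            return new

        x = np.random.default_rng(1).random(8)   # 8 lattice sites
        for _ in range(200):
            x = ocml_step(x)
        print(x)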

  15. One-way spatial integration of hyperbolic equations

    Science.gov (United States)

    Towne, Aaron; Colonius, Tim

    2015-11-01

    In this paper, we develop and demonstrate a method for constructing well-posed one-way approximations of linear hyperbolic systems. We use a semi-discrete approach that allows the method to be applied to a wider class of problems than existing methods based on analytical factorization of idealized dispersion relations. After establishing the existence of an exact one-way equation for systems whose coefficients do not vary along the axis of integration, efficient approximations of the one-way operator are constructed by generalizing techniques previously used to create nonreflecting boundary conditions. When physically justified, the method can be applied to systems with slowly varying coefficients in the direction of integration. To demonstrate the accuracy and computational efficiency of the approach, the method is applied to model problems in acoustics and fluid dynamics via the linearized Euler equations; in particular we consider the scattering of sound waves from a vortex and the evolution of hydrodynamic wavepackets in a spatially evolving jet. The latter problem shows the potential of the method to offer a systematic, convergent alternative to ad hoc regularizations such as the parabolized stability equations.

  16. A one-way text messaging intervention for obesity.

    Science.gov (United States)

    Ahn, Ahleum; Choi, Jaekyung

    2016-04-01

    Worldwide, there has been a startling increase in the number of people who are obese or overweight. Obesity increases the risk of cardiovascular disease and overall mortality. Mobile phone messaging is an important means of human communication globally. Because the mobile phone can be used anywhere at any time, mobile phone messaging has the potential to manage obesity. We investigated the effectiveness of a one-way text messaging intervention for obesity. Participants' body mass index and waist circumference were measured at the beginning of the programme and again after 12 weeks. The text message group received text messages about exercise, dietary intake, and general information about obesity three times a week, while the control group did not receive any text messages from the study. Of the 80 participants, 25 subjects in the text message group and 29 participants in the control group completed the study. After adjusting for baseline body mass index, the body mass index was significantly lower in the text message group than in the control group (27.9 vs. 28.3; p = 0.02). After adjusting for baseline waist circumference, the difference in waist circumference between the text message group and the control group was not significant (93.4 vs. 94.6; p = 0.13). The one-way text messaging intervention was a simple and effective way to manage obesity. It may be a useful method for lifestyle modification in obese subjects. © The Author(s) 2015.
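
    The baseline-adjusted comparison reported above is an analysis of covariance (ANCOVA); a minimal sketch with made-up numbers (not the study data) in Python:

        import pandas as pd
        import statsmodels.formula.api as smf

        # hypothetical 12-week and baseline BMI values for text-message (sms) and control groups
        df = pd.DataFrame({
            "bmi_12wk": [27.5, 28.0, 27.2, 28.8, 28.4, 29.0, 27.9, 28.6],
            "bmi_base": [28.1, 28.4, 27.9, 29.0, 28.3, 29.1, 28.0, 28.7],
            "group":    ["sms", "sms", "sms", "sms", "ctrl", "ctrl", "ctrl", "ctrl"],
        })

        # follow-up BMI modelled on group, adjusting for baseline BMI
        model = smf.ols("bmi_12wk ~ bmi_base + C(group)", data=df).fit()
        print(model.summary())   # the C(group) coefficient is the baseline-adjusted difference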

  17. One-way entangled-photon autocompensating quantum cryptography

    Science.gov (United States)

    Walton, Zachary D.; Abouraddy, Ayman F.; Sergienko, Alexander V.; Saleh, Bahaa E.; Teich, Malvin C.

    2003-06-01

    A quantum cryptography implementation is presented that uses entanglement to combine one-way operation with an autocompensating feature that has hitherto only been available in implementations that require the signal to make a round trip between the users. Using the concept of advanced waves, it is shown that this proposed implementation is related to the round-trip implementation in the same way that Ekert’s two-particle scheme is related to the original one-particle scheme of Bennett and Brassard. The practical advantages and disadvantages of the proposed implementation are discussed in the context of existing schemes.

  18. Acoustic topological insulator and robust one-way sound transport

    Science.gov (United States)

    He, Cheng; Ni, Xu; Ge, Hao; Sun, Xiao-Chen; Chen, Yan-Bin; Lu, Ming-Hui; Liu, Xiao-Ping; Chen, Yan-Feng

    2016-12-01

    Topological design of materials enables topological symmetries and facilitates unique backscattering-immune wave transport. In airborne acoustics, however, the intrinsic longitudinal nature of sound polarization makes the use of the conventional spin-orbital interaction mechanism impossible for achieving band inversion. The topological gauge flux is then typically introduced with a moving background in theoretical models. Its practical implementation is a serious challenge, though, due to inherent dynamic instabilities and noise. Here we realize the inversion of acoustic energy bands at a double Dirac cone and provide an experimental demonstration of an acoustic topological insulator. By manipulating the hopping interaction of neighbouring ’atoms’ in this new topological material, we successfully demonstrate the acoustic quantum spin Hall effect, characterized by robust pseudospin-dependent one-way edge sound transport. Our results are promising for the exploration of new routes for experimentally studying topological phenomena and related applications, for example, sound-noise reduction.

  19. Inference for One-Way ANOVA with Equicorrelation Error Structure

    Directory of Open Access Journals (Sweden)

    Weiyan Mu

    2014-01-01

    Full Text Available We consider inferences in a one-way ANOVA model with equicorrelation error structures. Hypotheses of the equality of the means are discussed. A generalized F-test has been proposed in the literature to compare the means of all populations, but its performance was not discussed. We propose two methods, a generalized pivotal quantity-based method and a parametric bootstrap method, to test the hypotheses of equality of the means. We compare the empirical performance of the proposed tests with the generalized F-test. It can be seen from the simulation results that the generalized F-test does not perform well in terms of Type I error rate, and the proposed tests perform much better. We also provide corresponding simultaneous confidence intervals for all pair-wise differences of the means, whose coverage probabilities are close to the confidence level.

  20. Study of the one-way speed of light anisotropy with particle beams

    Energy Technology Data Exchange (ETDEWEB)

    Wojtsekhowski, Bogdan B. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2017-04-01

    Concepts of high precision studies of the one-way speed of light anisotropy are discussed. The high energy particle beam allows measurement of a one-way speed of light anisotropy (SOLA) via analysis of the beam momentum variation with sidereal phase without the use of synchronized clocks. High precision beam position monitors could provide accurate monitoring of the beam orbit and determination of the particle beam momentum with relative accuracy on the level of 10^-10, which corresponds to a limit on SOLA of 10^-18 with existing storage rings. A few additional versions of the experiment are also presented.

  1. Epidemic spreading on one-way-coupled networks

    Science.gov (United States)

    Wang, Lingna; Sun, Mengfeng; Chen, Shanshan; Fu, Xinchu

    2016-09-01

    Numerous real-world networks (e.g., social, communication, and biological networks) have been observed to depend on each other, and this results in interconnected networks with different topological structures and dynamical functions. In this paper, we focus on the scenario of epidemic spreading on one-way-coupled networks comprised of two subnetworks, which can represent the transmission of some zoonotic diseases. By proposing a mathematical model through a mean-field approximation approach, we prove the global stability of the disease-free and endemic equilibria of this model. Through theoretical and numerical analysis, we obtain interesting results: the basic reproduction number R0 of the whole network is the maximum of the basic reproduction numbers of the two subnetworks; R0 is independent of the cross-infection rate and cross contact pattern; R0 increases rapidly with the growth of the inner infection rate if the inner contact pattern is scale-free; in order to eradicate zoonotic diseases from human beings, we must simultaneously eradicate them from animals; and the bird-to-bird infection rate has a bigger impact on the average infected density of humans than the bird-to-human infection rate.
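
    A back-of-the-envelope check of the reported result that R0 of the whole network is the larger of the two subnetwork values, using the standard degree-based mean-field expression R0 = (beta/gamma) * <k^2>/<k> for each subnetwork (all parameters below are hypothetical):

        import numpy as np

        def r0_subnetwork(beta, gamma, degrees):
            # degree-based mean-field reproduction number: (beta / gamma) * <k^2> / <k>
            k = np.asarray(degrees, dtype=float)
            return beta / gamma * (k ** 2).mean() / k.mean()

        rng = np.random.default_rng(0)
        r0_animal = r0_subnetwork(0.05, 0.2, rng.zipf(2.5, 1000))        # scale-free-like degrees
        r0_human  = r0_subnetwork(0.02, 0.2, rng.poisson(6, 1000) + 1)   # homogeneous degrees

        # per the abstract, the coupled network has R0 = max of the two subnetwork values,
        # independent of the cross-infection rate
        print(r0_animal, r0_human, max(r0_animal, r0_human))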

  2. Analysis of One-Way Laser Ranging Data to LRO, Time Transfer and Clock Characterization

    Science.gov (United States)

    Bauer, S.; Hussmann, H.; Oberst, J.; Dirkx, D.; Mao, D.; Neumann, G. A.; Mazarico, E.; Torrence, M. H.; McGarry, J. F.; Smith, D. E.

    2016-01-01

    We processed and analyzed one-way laser ranging data from International Laser Ranging Service ground stations to NASA's Lunar Reconnaissance Orbiter (LRO), obtained from June 13, 2009 until September 30, 2014. We pair and analyze the one-way range observables from station laser fire and spacecraft laser arrival times by using nominal LRO orbit models based on the GRAIL gravity field. We apply corrections for instrument range walk, as well as for atmospheric and relativistic effects. In total we derived a tracking data volume of approximately 3000 hours featuring 64 million Full Rate and 1.5 million Normal Point observations. From a statistical analysis of the dataset we evaluate the experiment and the ground station performance. We observe a laser ranging measurement precision of 12.3 centimeters in the case of the Full Rate data, which surpasses the LOLA (Lunar Orbiter Laser Altimeter) timestamp precision of 15 centimeters. Averaging to Normal Point data further improves the measurement precision to 5.6 centimeters. We characterized the LRO clock with fits throughout the mission time and estimated the rate to 6.9×10^-8, the aging to 1.6×10^-12 per day, and the change of aging to 2.3×10^-14 per day squared over all mission phases. The fits also provide referencing of onboard time to the TDB (Barycentric Dynamical Time) time scale at a precision of 166 nanoseconds over two and 256 nanoseconds over all mission phases, representing ground-to-space time transfer. Furthermore we measure ground station clock differences from the fits as well as from simultaneous passes, which we use for ground-to-ground time transfer from common-view observations. We observed relative offsets ranging from 33 to 560 nanoseconds and relative rates ranging from 2×10^-13 to 6×10^-12 between the ground station clocks during selected mission phases. We study the results from the different methods and discuss their applicability for time
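
    As an illustration of the clock-characterization step (fitting rate and aging), here is a sketch on synthetic clock offsets rather than the actual LRO data:

        import numpy as np

        # synthetic clock offsets (seconds), one sample per day over 200 days, built from
        # a rate of 6.9e-8 s/s and an aging of 1.6e-12 per day, plus 200 ns of noise
        t = np.arange(200.0)                       # days
        rng = np.random.default_rng(42)
        offset = (1e-3 + 6.9e-8 * 86400 * t
                  + 0.5 * (1.6e-12 * 86400) * t**2
                  + rng.normal(0.0, 200e-9, t.size))

        # quadratic clock model: offset(t) = a0 + a1*t + a2*t^2, with t in days
        a2, a1, a0 = np.polyfit(t, offset, deg=2)
        print("rate  ~", a1 / 86400, "s/s")          # recovers ~6.9e-8
        print("aging ~", 2 * a2 / 86400, "per day")  # recovers ~1.6e-12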

  3. Comparative analysis of the performance of One-Way and Two-Way urban road networks

    Science.gov (United States)

    Gheorghe, Carmen

    2017-10-01

    The fact that the number of vehicles is increasing year after year represents a challenge in road traffic management, because it is necessary to adjust road traffic, in order to prevent incidents, using mostly the same road infrastructure. At this moment a one-way road network provides efficient traffic flow for vehicles but it is not ideal for pedestrians. Therefore, a proper solution must be found and applied when and where it is necessary. Replacing a one-way road network with a two-way road network may be a viable solution, especially if the area has high pedestrian traffic. The paper aims to highlight the influence of both one-way and two-way urban road networks through experimental research performed using traffic data collected in the field. Each of the two scenarios analyzed was based on the same traffic data, the same geometrical conditions of the road (lane width, total road segment width, road slopes, total length of the road network) and also the same signaling conditions (signalised intersection or roundabout). The analysis of the two-way scenario reveals changes in performance parameters such as average delay, average stops, average stop delay and average vehicle speed. Based on the values obtained, it was possible to perform a comparative analysis between the real (one-way) scenario and the theoretical (two-way) scenario.

  4. On the experimental determination of the one-way speed of light

    International Nuclear Information System (INIS)

    Perez, Israel

    2011-01-01

    In this paper the question of the isotropy of the one-way speed of light is addressed from an experimental perspective. In particular, we analyse two experimental methods commonly used in its determination. The analysis is aimed at clarifying the view that the one-way speed of light cannot be determined by techniques in which physical entities close paths. The procedure employed here will provide epistemological tools so that physicists understand that a direct measurement of the speed not only of light but of any physical entity is by no means trivial. Our results shed light on the physics behind the experiments which may be of interest for both physicists with an elemental knowledge in special relativity and philosophers of science.

  5. Experimental realization of a one-way quantum computer algorithm solving Simon's problem.

    Science.gov (United States)

    Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G

    2014-11-14

    We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem-a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.

  6. FVCOM one-way and two-way nesting using ESMF: Development and validation

    Science.gov (United States)

    Qi, Jianhua; Chen, Changsheng; Beardsley, Robert C.

    2018-04-01

    Built on the Earth System Modeling Framework (ESMF), one-way and two-way nesting methods were implemented into the unstructured-grid Finite-Volume Community Ocean Model (FVCOM). These methods help utilize the unstructured-grid multi-domain nesting of FVCOM with the aim of resolving multi-scale physical and ecosystem processes. The procedures for implementing FVCOM into ESMF are described in detail. Experiments were conducted to validate and evaluate the performance of the nested-grid FVCOM system. The first was a wave-current interaction case with two-domain nesting, emphasizing the critical need for nesting to resolve high-resolution features near the coast and harbor with little loss in computational efficiency. The second was conducted for pseudo river plume cases to examine the differences in the model-simulated salinity between one-way and two-way nesting approaches and to evaluate the performance of the mass-conservative two-way nesting method. The third was carried out for the river plume case in the realistic geometric domain in Mass Bay, supporting the importance of two-way nesting for integrated coastal-estuarine modeling. The nesting method described in this paper has been used in the Northeast Coastal Ocean Forecast System (NECOFS), a global-regional-coastal nested FVCOM system that has been in end-to-end forecast and hindcast operation since 2007.

  7. Planning, performing and analyzing X-ray Raman scattering experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sahle, Ch. J., E-mail: christoph.sahle@esrf.fr [Department of Physics, PO Box 64, FI-00014 University of Helsinki, Helsinki (Finland); European Synchrotron Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Mirone, A. [European Synchrotron Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Niskanen, J.; Inkinen, J. [Department of Physics, PO Box 64, FI-00014 University of Helsinki, Helsinki (Finland); Krisch, M. [European Synchrotron Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Huotari, S. [Department of Physics, PO Box 64, FI-00014 University of Helsinki, Helsinki (Finland)

    2015-02-03

    A summarising review of data treatment for non-resonant inelastic X-ray scattering data from modern synchrotron-based multi-analyzer spectrometers. A compilation of procedures for planning and performing X-ray Raman scattering (XRS) experiments and analyzing data obtained from them is presented. In particular, it is demonstrated how to predict the overall shape of the spectra, estimate detection limits for dilute samples, and how to normalize the recorded spectra to absolute units. In addition, methods for processing data from multiple-crystal XRS spectrometers with imaging capability are presented, including a super-resolution method that can be used for direct tomography using XRS spectra as the contrast. An open-source software package with these procedures implemented is also made available.

  8. Radioactive beam experiments using the Fragment Mass Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Davids, C.N.

    1994-04-01

    The Fragment Mass Analyzer (FMA) is a recoil mass spectrometer that has many potential applications in experiments with radioactive beams. The FMA can be used for spectroscopic studies of nuclei produced in reactions with radioactive beams. The FMA is also an ideal tool for studying radiative capture reactions of astrophysical interest, using inverse kinematics. The FMA has both mass and energy dispersion, which can be used to efficiently separate the reaction recoils from the primary beam. When used with radioactive beams, the FMA allows the recoils from radiative capture reactions to be detected in a low-background environment.

  9. Remark on the One-Way Quantum Deficit for General Two-Qubit States

    International Nuclear Information System (INIS)

    Shao Lian-He; Xi Zheng-Jun; Li Yong-Ming

    2013-01-01

    We reinvestigate the one-way quantum deficit, which is a measure of quantum correlation emerging from a thermodynamical approach. We give a tight upper bound of the one-way quantum deficit for general mixed states, and give a sufficient condition for this bound. Finally, we discuss a universal way to evaluate the one-way quantum deficit for general two-qubit states. (general)

  10. On the Experimental Determination of the One-Way Speed of Light

    Science.gov (United States)

    Perez, Israel

    2011-01-01

    In this paper the question of the isotropy of the one-way speed of light is addressed from an experimental perspective. In particular, we analyse two experimental methods commonly used in its determination. The analysis is aimed at clarifying the view that the one-way speed of light cannot be determined by techniques in which physical entities…

  11. One-way wave propagation with amplitude based on pseudo-differential operators

    NARCIS (Netherlands)

    op 't Root, T.J.P.M.; Stolk, C.C.

    2010-01-01

    The one-way wave equation is widely used in seismic migration. Equipped with wave amplitudes, the migration can be provided with the reconstruction of the strength of reflectivity. We derive the one-way wave equation with geometrical amplitude by using a symmetric square root operator and a wave

  12. Evaluation of one-way valves used in medical devices for prevention of cross-contamination.

    Science.gov (United States)

    Nandy, Poulomi; Young, Megan; Haugen, Shanil P; Katzenmeyer-Pleuss, Kristy; Gordon, Edward A; Retta, Stephen M; Wood, Steven C; Lucas, Anne D

    2017-07-01

    One-way valves used in day use devices (used on multiple patients throughout a day without reprocessing between patients) are intended to reduce the potential for cross-contamination between patients resulting from the backflow of patient fluids. One-way valves are typically designed to withstand high levels of back pressure before failure; however, they may not be explicitly designed as a means of infection control as used in medical device applications. Five different medical grade one-way valves were placed in low pressure configurations. After flushing in the intended direction of flow, bacteriophage, bacteria, or dye was placed patient side for 24 hours. The upstream device side of the valve was then evaluated for microbial growth or presence of visible dye. Leakage (ie, backflow) of the microorganisms occurred with a variety of one-way valve designs across a range of fluid properties tested. This study describes testing of the one-way valves (component-level testing) for the potential of cross-contamination. Although day use medical device systems may use numerous other factors to prevent patient cross-contamination, this work demonstrates that one-way valves themselves may not prevent leakage of contaminated fluid if the fluid is able to reach the upstream side of the one-way valve. Published by Elsevier Inc.

  13. Moments Based Framework for Performance Analysis of One-Way/Two-Way CSI-Assisted AF Relaying

    KAUST Repository

    Xia, Minghua

    2012-09-01

    When analyzing system performance of conventional one-way relaying or advanced two-way relaying, these two techniques are always dealt with separately and, thus, their performance cannot be compared efficiently. Moreover, for ease of mathematical tractability, channels considered in such studies are generally assumed to be subject to Rayleigh fading or to be Nakagami-m channels with integer fading parameters, which is impractical in typical urban environments. In this paper, we propose a unified moments-based framework for general performance analysis of channel-state-information (CSI) assisted amplify-and-forward (AF) relaying systems. The framework is applicable to both one-way and two-way relaying over arbitrary Nakagami-m fading channels, and it includes previously reported results as special cases. Specifically, the mathematical framework is firstly developed under the umbrella of the weighted harmonic mean of two Gamma-distributed variables in conjunction with the theory of Padé approximants. Then, general expressions for the received signal-to-noise ratios of the users in one-way/two-way relaying systems and the corresponding moments, moment generation function, and cumulative density function are established. Subsequently, the mathematical framework is applied to analyze, compare, and gain insights into system performance of one-way and two-way relaying techniques, in terms of outage probability, average symbol error probability, and achievable data rate. All analytical results are corroborated by simulation results as well as previously reported results whenever available, and they are shown to be efficient tools to evaluate and compare system performance of one-way and two-way relaying.
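
    The central quantity of the framework can be checked numerically; the sketch below draws per-hop SNRs for arbitrary (non-integer) Nakagami-m fading and estimates raw moments of the commonly used CSI-assisted AF end-to-end SNR (all parameters are illustrative):

        import numpy as np

        rng = np.random.default_rng(7)

        def nakagami_m_snr(m, mean_snr, n):
            # the instantaneous SNR of a Nakagami-m hop is Gamma-distributed: shape m, scale mean/m
            return rng.gamma(shape=m, scale=mean_snr / m, size=n)

        g1 = nakagami_m_snr(1.7, 10.0, 1_000_000)   # hop 1, non-integer fading parameter
        g2 = nakagami_m_snr(2.3,  8.0, 1_000_000)   # hop 2

        # widely used end-to-end SNR of CSI-assisted AF relaying:
        # g1*g2/(g1+g2+1), i.e. (up to the +1) half the harmonic mean of the two hop SNRs
        g_end = g1 * g2 / (g1 + g2 + 1.0)

        print([np.mean(g_end ** k) for k in (1, 2, 3)])   # raw moments feeding the framework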

  14. Bimodal behavior of post-measured entropy and one-way quantum deficit for two-qubit X states

    Science.gov (United States)

    Yurischev, Mikhail A.

    2018-01-01

    A method for calculating the one-way quantum deficit is developed. It involves a careful study of post-measured entropy shapes. We discovered that in some regions of X-state space the post-measured entropy S̃ as a function of the measurement angle θ ∈ [0, π/2] exhibits a bimodal behavior inside the open interval (0, π/2), i.e., it has two interior extrema: one minimum and one maximum. Furthermore, cases are found when the interior minimum of such a bimodal function S̃(θ) is less than its value at the endpoint θ = 0 or π/2. This leads to the formation of a boundary between the phases of the one-way quantum deficit via finite jumps of the optimal measurement angle from the endpoint to the interior minimum. A phase diagram is built up for a two-parameter family of X states. The subregions with variable optimal measurement angle are around 1% of the total region, with their relative linear sizes achieving 17.5%, and the fidelity between the states of those subregions can be reduced to F = 0.968. In addition, a correction to the one-way deficit due to the interior minimum can achieve 2.3%. Such conditions are favorable for detecting the subregions with variable optimal measurement angle of the one-way quantum deficit in an experiment.

  15. One-way optical transmission in silicon photonic crystal heterojunction with circular and square scatterers

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Dan, E-mail: liudanhu725@126.com [School of Physics and Mechanical & Electrical Engineering, Hubei University of Education, Wuhan, 430205 (China); Hu, Sen [School of Physics and Mechanical & Electrical Engineering, Hubei University of Education, Wuhan, 430205 (China); Gao, Yihua [Wuhan National Laboratory for Optoelectronics (WNLO), School of Physics, Huazhong University of Science and Technology (HUST), Wuhan, 430074 (China)

    2017-07-12

    A 2D orthogonal square-lattice photonic crystal (PC) heterojunction consisting of circular and square air holes in silicon is presented. Band structures are calculated using the plane wave expansion method, and the transmission properties are investigated by finite-difference time-domain simulations. Thanks to the higher diffraction orders excited when the circular and square holes are interlaced along the interface, one-way transmission phenomena can exist within wide frequency regions. The higher-order diffraction is further enhanced through two different interface optimization designs proposed by modifying the PC structure of the hetero-interface. An orthogonal PC heterojunction for wide-band and efficient one-way transmission is constructed, and the maximum transmissivity is up to 78%. - Highlights: • Photonic crystal heterojunction with circular and square scatterers is first studied. • One-way transmission efficiency is closely related to the hetero-interface. • Wide-band and efficient one-way transmission is realized.

  16. Measuring the user experience collecting, analyzing, and presenting usability metrics

    CERN Document Server

    Tullis, Thomas

    2013-01-01

    Measuring the User Experience was the first book that focused on how to quantify the user experience. Now in the second edition, the authors include new material on how recent technologies have made it easier and more effective to collect a broader range of data about the user experience. As more UX and web professionals need to justify their design decisions with solid, reliable data, Measuring the User Experience provides the quantitative analysis training that these professionals need. The second edition presents new metrics such as emotional engagement, personas, k

  17. "Imprisoned" in pain: analyzing personal experiences of phantom pain.

    Science.gov (United States)

    Nortvedt, Finn; Engelsrud, Gunn

    2014-11-01

    This article explores the phenomenon of "phantom pain." The analysis is based on personal experiences elicited from individuals who have lost a limb or live with a paralyzed body part. Our study reveals that the ways in which these individuals express their pain experience are an integral aspect of that experience. The material consists of interviews undertaken with men who are living with phantom pain resulting from a traumatic injury. The phenomenological analysis is inspired by Zahavi (J Conscious Stud 8(5-7):151-167, 2001) and Merleau-Ponty (Phenomenology of Perception. Routledge and Kegan Paul, London, 1962/2000). On a descriptive level, the metaphors these patients invoke to describe their condition reveal immense suffering, such as a feeling of being invaded by insects or of their skin being scorched and stripped from their body. Such metaphors express a dimension of experience concerning the self that is in pain and the others whom the sufferer relates to through this pain, as well as the agony that this pain inflicts in the world of lived experience. This pain has had a profound impact on their lives and altered their relationship with self (body), others and the world. Their phantom pain has become a reminder of their formerly intact and functioning body; they describe the contrast between their past and present body as an ambiguous and disturbing experience. We conclude that these sensitive and personalized experiences of phantom pain illuminate how acts of expression--spoken pain--constitute a fundamental dimension of a first-person perspective which contributes to the field of knowledge about "phantom pain".

  18. Performance Analysis of a Forecasting Relocation Model for One-Way Carsharing

    Directory of Open Access Journals (Sweden)

    Ganjar Alfian

    2017-06-01

    Full Text Available A carsharing service can be seen as a transport alternative between private and public transport that enables a group of people to share vehicles based at certain stations. The advanced carsharing service, one-way carsharing, enables customers to return the car to another station. However, one-way implementation generates an imbalanced distribution of cars in each station. Thus, this paper proposes forecasting relocation to solve car distribution imbalances for one-way carsharing services. A discrete event simulation model was developed to help evaluate the proposed model performance. A real case dataset was used to find the best simulation result. The results provide a clear insight into the impact of forecasting relocation on high system utilization and the reservation acceptance ratio compared to traditional relocation methods.

  19. Trellis plots as visual aids for analyzing split plot experiments

    DEFF Research Database (Denmark)

    Kulahci, Murat; Menon, Anil

    2017-01-01

    The analysis of split plot experiments can be challenging due to a complicated error structure resulting from restrictions on complete randomization. Similarly, standard visualization methods do not provide the insight practitioners desire to understand the data, think of explanations, generate h...

  20. Soliton creation, propagation, and annihilation in aeromechanical arrays of one-way coupled bistable elements

    Science.gov (United States)

    Rosenberger, Tessa; Lindner, John F.

    We study the dynamics of mechanical arrays of bistable elements coupled one-way by wind. Unlike earlier hydromechanical unidirectional arrays, our aeromechanical one-way arrays are simpler, easier to study, and exhibit a broader range of phenomena. Soliton-like waves propagate in one direction at speeds proportional to wind speeds. Periodic boundaries enable solitons to annihilate in pairs in even arrays where adjacent elements are attracted to opposite stable states. Solitons propagate indefinitely in odd arrays where pairing is frustrated. Large noise spontaneously creates soliton-antisoliton pairs, as predicted by prior computer simulations. Soliton annihilation times increase quadratically with initial separations, as expected for random walk models of soliton collisions.
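
    A schematic numerical analogue of such a one-way coupled bistable array (a toy model, not the aeromechanical apparatus itself), showing a localized flipped region travelling around a periodic ring:

        import numpy as np

        def step(x, dt=0.01, c=2.0):
            # overdamped double-well elements, each driven only by its left neighbour:
            # dx_j/dt = x_j - x_j**3 + c * (x_{j-1} - x_j), with periodic boundaries
            return x + dt * (x - x**3 + c * (np.roll(x, 1) - x))

        x = -np.ones(32)
        x[:4] = 1.0                 # a localized "soliton-like" kink of flipped elements
        for _ in range(5000):
            x = step(x)
        print(np.sign(x))           # the flipped region has moved around the ring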

  1. Experimental realization of a quantum game on a one-way quantum computer

    International Nuclear Information System (INIS)

    Prevedel, Robert; Stefanov, Andre; Walther, Philip; Zeilinger, Anton

    2007-01-01

    We report the first demonstration of a quantum game on an all-optical one-way quantum computer. Following a recent theoretical proposal we implement a quantum version of Prisoner's Dilemma, where the quantum circuit is realized by a four-qubit box-cluster configuration and the player's local strategies by measurements performed on the physical qubits of the cluster. This demonstration underlines the strength and versatility of the one-way model and we expect that this will trigger further interest in designing quantum protocols and algorithms to be tested in state-of-the-art cluster resources

  2. Designing experiments and analyzing data a model comparison perspective

    CERN Document Server

    Maxwell, Scott E

    2013-01-01

    Through this book's unique model comparison approach, students and researchers are introduced to a set of fundamental principles for analyzing data. After seeing how these principles can be applied in simple designs, students are shown how these same principles also apply in more complicated designs. Drs. Maxwell and Delaney believe that the model comparison approach better prepares students to understand the logic behind a general strategy of data analysis appropriate for various designs; and builds a stronger foundation, which allows for the introduction of more complex topics omitt

  3. Cruising tourism: Novi Sad and Belgrade residents' experience analysis

    Directory of Open Access Journals (Sweden)

    Berić Dejan

    2012-01-01

    Full Text Available The subject of this paper is the previous nautical experience of the local population of Novi Sad and Belgrade. The aim of this paper is to determine how the local population (both those who have already cruised and those who have not) experiences our country in terms of cruises. The research was based on conducting interviews with the local population in Novi Sad and Belgrade. Interviews with a structured questionnaire were performed from March to June 2010 with the task of determining past and potential cruise destinations, as well as attitudes about the potential of Serbia in terms of nautical tourism. The importance of this paper is primarily based on the enrichment of knowledge on the segment of tourism that is its subject. The obtained results may help further studies of the causal link between cruises and the experiences of the local population in nautical tourism. [Projekat Ministarstva nauke Republike Srbije, br. 176020: Transformations of Geo Area of Serbia - past, current problems and suggestions of the solutions]

  4. Historiography taking issue: analyzing an experiment with heroin abusers.

    Science.gov (United States)

    Dehue, Trudy

    2004-01-01

    This article discusses the predicament of historians becoming part of the history they are investigating and illustrates the issue in a particular case. The case is that of the randomized controlled trial (RCT)-more specifically, its use for testing the effects of providing heroin to severe heroin abusers. I counter the established view of the RCT as a matter of timeless logic and argue that this research design was developed in the context of administrative knowledge making under twentieth-century economic liberalism of which it epitomizes some central values. I also argue that the applicability of the RCT depends on the degree to which its advocates can define the issue to be studied according to its inherent values. Next, I demonstrate how advocates of an RCT with heroin provision in the Netherlands steered the political discussion on heroin provision and how the values of economic liberalism also shaped the results of the Dutch maintenance experiment. In addition, I relate how my analysis of this experiment became part of political debates in the Netherlands. Contrary to my intentions, adversaries of heroin maintenance used my critique on the heroin RCT as an argument against heroin maintenance. Such risks are inherent to historiography and sociology of science aiming at practical relevance while challenging treasured scientific beliefs. I conclude that it still seems better to expose arguments on unjustified certainties than to suppress them for strategic reasons. Copyright 2004 Wiley Periodicals, Inc.

  5. Teaching Principles of One-Way Analysis of Variance Using M&M's Candy

    Science.gov (United States)

    Schwartz, Todd A.

    2013-01-01

    I present an active learning classroom exercise illustrating essential principles of one-way analysis of variance (ANOVA) methods. The exercise is easily conducted by the instructor and is instructive (as well as enjoyable) for the students. This is conducive for demonstrating many theoretical and practical issues related to ANOVA and lends itself…
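
    The classroom computation reduces to a standard one-way ANOVA, which in Python is a single call (the candy counts below are made up):

        from scipy import stats

        # hypothetical counts of three colours per fun-size bag of candy
        blue   = [4, 6, 5, 7, 5, 6]
        green  = [3, 4, 5, 3, 4, 4]
        orange = [6, 7, 8, 6, 7, 9]

        f_stat, p_value = stats.f_oneway(blue, green, orange)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")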

  6. Thermal hydraulic studies of spallation target for one-way coupled ...

    Indian Academy of Sciences (India)

    Laser and Plasma Technology Division and Nuclear Physics Division, Bhabha Atomic Research Centre ... Abstract. BARC has recently proposed a one-way coupled ADS reactor. This reactor requires typically ~1 GeV proton beam with 2 mA of current. Approximately 8 kW of .... The overall target module dimensions and flow ...

  7. Using hyperspherical coordinates to analyze many-particle fragmentation experiments

    Science.gov (United States)

    Feizollah, Peyman; Rajput, Jyoti; Berry, Ben; Jochim, Bethany; Severt, T.; Kanaka Raju, P.; Carnes, K. D.; Ben-Itzhak, I.; Esry, B. D.

    2016-05-01

    Analyzing and plotting the distribution of momenta for processes producing more than two fragments has long been a challenge for two reasons: lack of an appropriate representation and our inability to effectively plot and visualize more than two or three dimensions. While there is little that can be done about the latter, we propose using hyperspherical coordinates to address the former. Existing methods such as Newton and Dalitz plots give us information about the fragmentation process for three or four bodies, but neither can be easily generalized to fragmentation with an arbitrary number of particles. We will show that hyperspherical coordinates provide a systematic framework for doing exactly this. We will compare the suggested method with Newton and Dalitz plots for three-body breakup and discuss the similarities and differences between them. This work was supported by the Chemical Sciences, Geosciences, and Biosciences Division, Office of Basic Energy Sciences, Office of Science, U. S. Department of Energy. BJ was also supported in part by DOE-SCGF (DE-AC05-06OR23100).
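
    One simple way to map a set of fragment momenta onto a single hyperradius plus hyperangles is sketched below; the convention used in the work presented (e.g. mass-weighted Jacobi coordinates) may differ:

        import numpy as np

        def hyperspherical(momenta):
            # flatten the N momentum vectors into one 3N-component vector and express it
            # as a hyperradius plus 3N-1 hyperangles (standard spherical-coordinate recursion)
            x = np.asarray(momenta, dtype=float).ravel()
            rho = np.linalg.norm(x)
            angles = [np.arctan2(np.linalg.norm(x[i + 1:]), x[i]) for i in range(x.size - 2)]
            angles.append(np.arctan2(x[-1], x[-2]))   # last angle keeps the sign of the final component
            return rho, np.array(angles)

        # hypothetical three-body breakup in the centre-of-mass frame (momenta sum to zero)
        p1 = np.array([1.0, 0.2, 0.0])
        p2 = np.array([-0.6, 0.5, 0.1])
        p3 = -(p1 + p2)
        print(hyperspherical([p1, p2, p3]))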

  8. Managing expatriates: analyzing the experience of an internationalized Brazilian Company

    Directory of Open Access Journals (Sweden)

    Bernardo Meyer

    2016-12-01

    Full Text Available The management of expatriate employees is an important topic in the field of international business. Most of the studies on expatriation management are based on the experience of North American and Western European corporations. Few studies focus on corporations from developing countries. The purpose of this study was to examine, in the cultural dimension, how expatriates were managed by a Brazilian multinational corporation. This is a single case study based on qualitative research. The focus was a Brazilian telecommunications company operating in China. The research investigated the preparedness of expatriates prior to departure and the kinds of difficulties they faced in daily life abroad. The analysis revealed that the lack of preparation of expatriates before departure was an important barrier to overcome in order to achieve better performance. The findings indicated that psychic distance was the key factor responsible for major problems facing expatriate workers. The contribution of the study is that it shows how a Brazilian corporation addressed the challenges of managing its expatriates in China.

  9. Semi-device-independent security of one-way quantum key distribution

    International Nuclear Information System (INIS)

    Pawlowski, Marcin; Brunner, Nicolas

    2011-01-01

    By testing nonlocality, the security of entanglement-based quantum key distribution (QKD) can be enhanced to being "device-independent." Here we ask whether such a strong form of security could also be established for one-way (prepare and measure) QKD. While fully device-independent security is impossible, we show that security can be guaranteed against individual attacks in a semi-device-independent scenario. In the latter, the devices used by the trusted parties are noncharacterized, but the dimensionality of the quantum systems used in the protocol is assumed to be bounded. Our security proof relies on the analogies between one-way QKD, dimension witnesses, and random-access codes.

  10. Topological one-way quantum computation on verified logical cluster states

    International Nuclear Information System (INIS)

    Fujii, Keisuke; Yamamoto, Katsuji

    2010-01-01

    We present a scheme to improve the noise threshold for fault-tolerant topological one-way computation with constant overhead. Certain cluster states of finite size, say star clusters, are constructed with logical qubits through an efficient verification process to achieve high fidelity. Then, the star clusters are connected near-deterministically with verification to form a three-dimensional cluster state to implement topological one-way computation. The necessary postselection for verification is localized within the star clusters, ensuring the scalability of computation. By using the Steane seven-qubit code for the logical qubits, this scheme works with a high error rate of 2% and reasonable resources comparable to or less than those for the other fault-tolerant schemes. A higher noise threshold would be achieved by adopting a larger code.

  11. [Extracorporeal ventriculoatrial shunt with the use of one-way ball valve].

    Science.gov (United States)

    Kubo, Shigeki; Takimoto, Hiroshi; Hosoi, Kazuki; Toyota, Shingo; Takakura, Shuji; Hayashi, Yasuhiro; Ueno, Masato; Morisako, Toshitaka; Karasawa, Jun; Ninaga, Hideo; Yoshimine, Toshiki

    2002-04-01

    We developed a simple system of an "extracorporeal" ventriculoatrial (VA) shunt using a one-way ball valve (Acty valve II, Kaneka Medix) to release the patient from postoperative constraint during the ventricular drainage. The system is constructed in such a way that the ventricular drainage tube is connected to the central venous catheter via a one-way valve. The CSF is regulated by using the valve and is diverted into the systemic circulation as in the conventional ventriculoatrial shunt. After 2 or 3 weeks of CSF diversion through the extracorporeal VA shunt, a ventriculoperitoneal shunt is placed if hydrocephalus is apparent by temporary occlusion of the system. We applied this system to 4 patients with hydrocephalus, and we found it useful and free from adverse effects. The patient was freed from physical constraint involved in conventional ventricular drainage and an effective program of early rehabilitation was able to be started.

  12. Statistical Tests for One-way/Two-way Translation in Translational Medicine

    Directory of Open Access Journals (Sweden)

    Siu-Keung Tse

    2008-12-01

    Full Text Available Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician/scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable, with statistical assurance. For this purpose, statistical criteria for assessment of one-way and two-way translation are proposed. Under a well established and validated translational model, statistical tests for one-way and two-way translation are discussed. Some discussion on lost in translation is also given.

  13. Location of Stations in a One-Way Electric Car Sharing System

    OpenAIRE

    Çalik, Hatice; Fortz, Bernard

    2017-01-01

    We introduce a strategic decision problem in a one-way electric car sharing system. We propose a mixed integer linear programming formulation for solving this problem. We conduct an extensive computational study to test the performance of our formulation and its relaxations by using real data instances. The results turn out to be encouraging for diving into more challenging extensions of the problem under consideration.

  14. A Class of Non Invertible Matrices in GF (2) for Practical One Way Hash Algorithm

    OpenAIRE

    Berisha, Artan; Baxhaku, Behar; Alidema, Artan

    2012-01-01

    In this paper, we describe a non-invertible matrix in GF(2) which can be used as the multiplication matrix in the Hill cipher technique for a one-way hash algorithm. The matrices proposed are permutation matrices with exactly one entry 1 in each row and each column and 0 elsewhere. Such matrices represent a permutation of m elements. Since its invention, the Hill cipher algorithm has been used for symmetric encryption, where the multiplication matrix is the key. The Hill cipher requires the inverse of the matrix t...
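    To make the construction above concrete, the following minimal Python/NumPy sketch builds a permutation matrix (exactly one 1 per row and column, 0 elsewhere) and applies it to a bit block over GF(2), in the spirit of the Hill-cipher mixing step described in the abstract. The block size, permutation and input bits are illustrative choices, not values from the paper.

      import numpy as np

      def permutation_matrix(perm):
          # m x m matrix with exactly one entry 1 in each row and column.
          m = len(perm)
          P = np.zeros((m, m), dtype=np.uint8)
          P[np.arange(m), perm] = 1
          return P

      def gf2_matvec(M, v):
          # Matrix-vector product over GF(2): ordinary product reduced mod 2.
          return (M @ v) % 2

      # Illustrative 8-bit block and a fixed permutation acting as the key matrix.
      perm = [3, 0, 7, 1, 5, 2, 6, 4]
      P = permutation_matrix(perm)
      block = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
      print(gf2_matvec(P, block))   # the mixing step simply permutes the bit positions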

  15. Experimental realization of Deutsch's algorithm in a one-way quantum computer.

    Science.gov (United States)

    Tame, M S; Prevedel, R; Paternostro, M; Böhi, P; Kim, M S; Zeilinger, A

    2007-04-06

    We report the first experimental demonstration of an all-optical one-way implementation of Deutsch's quantum algorithm on a four-qubit cluster state. All the possible configurations of a balanced or constant function acting on a two-qubit register are realized within the measurement-based model for quantum computation. The experimental results are in excellent agreement with the theoretical model, therefore demonstrating the successful performance of the algorithm.
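    As background for readers unfamiliar with the algorithm, the sketch below simulates Deutsch's algorithm in the standard gate model with a NumPy statevector; it is only meant to show what the cluster-state experiment computes, not the measurement-based implementation reported above. The oracle construction and the four test functions are the usual textbook choices.

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      I = np.eye(2)

      def oracle(f):
          # U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix.
          U = np.zeros((4, 4))
          for x in (0, 1):
              for y in (0, 1):
                  U[2 * x + (y ^ f(x)), 2 * x + y] = 1
          return U

      def deutsch(f):
          state = np.kron([1, 0], [0, 1]).astype(float)   # |0>|1>
          state = np.kron(H, H) @ state                    # Hadamard both qubits
          state = oracle(f) @ state                        # single oracle query
          state = np.kron(H, I) @ state                    # Hadamard the query qubit
          p0 = state[0] ** 2 + state[1] ** 2               # P(first qubit measured as 0)
          return "constant" if p0 > 0.5 else "balanced"

      tests = {"f(x)=0": lambda x: 0, "f(x)=1": lambda x: 1,
               "f(x)=x": lambda x: x, "f(x)=1-x": lambda x: 1 - x}
      for name, f in tests.items():
          print(name, "->", deutsch(f))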

  16. More Than One Way to Catch a Fish: Effective Translation of Ocean Science for the Public

    National Research Council Canada - National Science Library

    Meeson, B. W; McDonnell, J; Kohut, J; Litchenwahler, S; Helling, H

    2006-01-01

    .... Surprisingly little is known about effective translation of complex scientific and technical information into information that is easy to understand, easy to use, and of interest to the public and educators. Information translation is one way to make science and technical information more accessible to the public and thereby, improve scientific literacy of many Americans. We present three information translation models that promote scientific and technical literacy.

  17. Periodic responses of a pulley-belt system with one-way clutch under inertia excitation

    Science.gov (United States)

    Ding, Hu

    2015-09-01

    The stable steady-state periodic response of a two-pulley belt drive system coupled with an accessory by a one-way clutch is presented. For the first time, the pulley-belt system is studied under double excitations. Specifically, the dual excitations consist of harmonic motion of the driving pulley and inertia excitation. The belt spans are modeled as axially moving viscoelastic beams by considering belt bending stiffness. Therefore, integro-partial-differential equations are derived for governing the transverse vibrations of the belt spans. Moreover, the transverse vibrations of the moving belt are coupled with the rotation vibrations of the pulleys by nonlinear dynamic tension. For describing the unidirectional decoupling function of the one-way device, rotation vibrations of the driven pulley and accessory are modeled as coupled piecewise ordinary differential equations. In order to eliminate the influence of the boundary of the belt spans, the non-trivial equilibria of the pulley-belt system are numerically determined. Furthermore, a nonlinear piecewise discrete-continuous dynamical system is derived by introducing a coordinate transform. Coupled vibrations of the pulley-belt system are investigated via the Galerkin truncation. The natural frequencies of the coupled vibrations are obtained by using the fast Fourier transform. Moreover, frequency-response curves are extracted from time histories. Therefore, resonance areas of the belt spans, the driven pulley and the accessory are presented. Furthermore, the validity of the Galerkin method is examined by comparing with the differential and integral quadrature methods (DQM & IQM). By comparing the results with and without the one-way device, a significant damping effect of the clutch on the dynamic response is discovered. Furthermore, the effects of the intensity of the driving pulley excitation and the inertia excitation are studied. Moreover, numerical results demonstrate that the two excitations interact on the steady

  18. Aperiodicity in one-way Markov cycles and repeat times of large earthquakes in faults

    OpenAIRE

    Tejedor, Alejandro; Gómez, Javier; Pacheco, Amalio F.

    2011-01-01

    A common use of Markov Chains is the simulation of the seismic cycle in a fault, i.e. as a renewal model for the repetition of its characteristic earthquakes. This representation is consistent with Reid's elastic rebound theory. Here it is proved that in any one-way Markov cycle, the aperiodicity of the corresponding distribution of cycle lengths is always lower than one. This fact concurs with observations of large earthquakes in faults all over the world.
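    Assuming the renewal picture above, in which the cycle length is the sum of independent geometric sojourn times in the states of the one-way cycle, the aperiodicity (coefficient of variation of the cycle-length distribution) has a simple closed form; the short sketch below evaluates it for random cycles and illustrates the claim that it stays below one. The transition structure and parameter ranges are illustrative, not taken from the paper.

      import numpy as np

      def one_way_cycle_aperiodicity(p):
          # One-way Markov cycle: from state i the chain stays with prob 1 - p[i]
          # or advances to state i+1 (mod N) with prob p[i].  The sojourn time in
          # each state is geometric, so the cycle length is a sum of geometrics.
          p = np.asarray(p, dtype=float)
          mean = np.sum(1.0 / p)
          var = np.sum((1.0 - p) / p ** 2)
          return np.sqrt(var) / mean        # coefficient of variation (aperiodicity)

      rng = np.random.default_rng(0)
      for _ in range(5):
          p = rng.uniform(0.05, 1.0, size=6)
          print(round(one_way_cycle_aperiodicity(p), 3))   # always below one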

  19. Fish ladder of Lajeado Dam: migrations on one-way routes?

    Directory of Open Access Journals (Sweden)

    Angelo Antônio Agostinho

    half of the reservoir; those found in water flowing through the spillways, turbines or fish ladder of Lajeado Dam belonged essentially to non-migratory clupeids that spawn in the inner part of the reservoir. In addition, results showed that in a reservoir with no fish-passage mechanism, migrants select habitats that still maintain riverine characteristics, in the upper parts of the impounded area. The downward movements are negligible compared to those upward, in the experiments conducted in the fish ladder. It is concluded, therefore, that the Lajeado fish ladder, and possibly those at other dams, is essentially a one-way route that promotes upstream movements of migrants, without the necessary return of adults or their offspring. Thus, the low permeability of the connections provided by these management tools can drastically increase the level of environmental impact that they were actually intended to reduce.

  20. Relay Selection and Resource Allocation in One-Way and Two-Way Cognitive Relay Networks

    KAUST Repository

    Alsharoa, Ahmad M.

    2013-05-08

    In this work, the problem of relay selection and resource power allocation in one-way and two-way cognitive relay networks using half duplex channels with different relaying protocols is investigated. Optimization problems for both single and multiple relay selection that maximize the sum rate of the secondary network without degrading the quality of service of the primary network by respecting a tolerated interference threshold were formulated. Single relay selection and optimal power allocation for two-way relaying cognitive radio networks using decode-and-forward and amplify-and-forward protocols were studied. Dual decomposition and subgradient methods were used to find the optimal power allocation. The transmission process to exchange two different messages between two transceivers for the two-way relaying technique takes place in two time slots. In the first slot, the transceivers transmit their signals simultaneously to the relay. Then, during the second slot the relay broadcasts its signal to the terminals. Moreover, improvement of both spectral and energy efficiency can be achieved compared with the one-way relaying technique. As an extension, multiple relay selection for both one-way and two-way relaying under a cognitive radio scenario using amplify-and-forward was discussed. A strong optimization tool based on genetic and iterative algorithms was employed to solve the formulated optimization problems for both single and multiple relay selection, where discrete relay power levels were considered. Simulation results show that the practical and low-complexity heuristic approaches achieve almost the same performance as the optimal relay selection schemes either with discrete or continuous power distributions while providing a considerable saving in terms of computational complexity.
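    The dual decomposition and subgradient idea mentioned above can be illustrated on a much simpler stand-in problem: choose relay powers to maximize a secondary sum rate subject to an interference threshold at the primary receiver. The sketch below is a generic projected-subgradient solver for that toy problem, with made-up channel gains and thresholds; it is not the paper's formulation or relay-selection scheme.

      import numpy as np

      def allocate_powers(g, h, I_th, p_max, iters=500, step=0.1):
          # Maximize sum_i log2(1 + g[i] p[i])  s.t.  sum_i h[i] p[i] <= I_th,
          # 0 <= p[i] <= p_max, via a subgradient update on the dual variable
          # lam associated with the interference constraint.
          lam = 1.0
          for k in range(iters):
              # Primal update: water-filling-like closed form for fixed lam.
              p = np.clip(1.0 / (lam * h * np.log(2)) - 1.0 / g, 0.0, p_max)
              # Projected subgradient step on lam with a diminishing step size.
              lam = max(1e-9, lam + step / np.sqrt(k + 1.0) * (float(np.sum(h * p)) - I_th))
          return p, lam

      rng = np.random.default_rng(1)
      g = rng.exponential(1.0, size=4)    # secondary (relay) channel gains
      h = rng.exponential(0.5, size=4)    # interference gains toward the primary user
      p, lam = allocate_powers(g, h, I_th=1.0, p_max=2.0)
      print("powers:", np.round(p, 3), "interference:", round(float(np.sum(h * p)), 3))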

  1. Semi-device-independent security of one-way quantum key distribution

    OpenAIRE

    Pawlowski, Marcin; Brunner, Nicolas

    2011-01-01

    By testing nonlocality, the security of entanglement-based quantum key distribution (QKD) can be enhanced to being 'device-independent'. Here we ask whether such a strong form of security could also be established for one-way (prepare and measure) QKD. While fully device-independent security is impossible, we show that security can be guaranteed against individual attacks in a semi-device-independent scenario. In the latter, the devices used by the trusted parties are non-characterized, but t...

  2. One-way acoustic mirror based on anisotropic zero-index media

    Energy Technology Data Exchange (ETDEWEB)

    Gu, Zhong-ming; Liang, Bin, E-mail: liangbin@nju.edu.cn, E-mail: jccheng@nju.edu.cn; Yang, Jing; Cheng, Jian-chun, E-mail: liangbin@nju.edu.cn, E-mail: jccheng@nju.edu.cn [Key Laboratory of Modern Acoustics, MOE, Institute of Acoustics, Department of Physics, Nanjing University, Nanjing 210093 (China); Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing 210093 (China); Zou, Xin-ye [Key Laboratory of Modern Acoustics, MOE, Institute of Acoustics, Department of Physics, Nanjing University, Nanjing 210093 (China); Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing 210093 (China); State Key Laboratory of Acoustics, Chinese Academy of Sciences, Beijing 100190 (China); Li, Yong [CNRS, Institut Jean Lamour, Vandoeuvre-lès-Nancy F-54506, France and Institut Jean Lamour, Université de Lorraine, Boulevard des Aiguillettes, BP: 70239, 54506 Vandoeuvre-lès-Nancy (France); Yang, Jun [Key Laboratory of Noise and Vibration Research, Institute of Acoustics, Chinese Academy of Sciences, Beijing 100190 (China)

    2015-11-23

    We have designed a one-way acoustic mirror comprising anisotropic zero-index media. For an acoustic beam incident at a particular angle, the designed structure behaves like a highly efficient mirror that redirects almost all the incident energy into another direction predicted by Snell's law, while becoming virtually transparent to beams propagating in reverse along this output path. Furthermore, the mirror can be tailored to work at an arbitrary incident angle by simply adjusting its geometry. Our design, with unidirectional reflection functionality and a flexible working angle, may offer possibilities for spatial isolation and has deep implications in various scenarios like ultrasound imaging or noise control.

  3. Valence bond solid formalism for d-level one-way quantum computation

    International Nuclear Information System (INIS)

    Clark, Sean

    2006-01-01

    The d-level or qudit one-way quantum computer (d1WQC) is described using the valence bond solid formalism and the generalized Pauli group. This formalism provides a transparent means of deriving measurement patterns for the implementation of quantum gates in the computational model. We introduce a new universal set of qudit gates and use it to give a constructive proof of the universality of d1WQC. We characterize the set of gates that can be performed in one parallel time step in this model.

  4. One-way propagation of acoustic waves through a periodic structure

    Science.gov (United States)

    Xu, Zheng; Xu, Wei; Yan, Xu; Qian, Menglu; Cheng, Qian

    2018-02-01

    One-way acoustic transmission is achieved through a brass plate with a periodic grating on the surface. Using the Schlieren imaging technique, the positive and negative propagation processes of acoustic waves through the periodic structure were experimentally observed. Simulations were performed using the finite-element method. Both the experimental and simulation results revealed a very large transmission ratio between positive and negative incidence, thus demonstrating the feasibility of using this structure as an acoustic rectifier. The results indicate that the structure has a broadband working frequency. The structure has potential applications in ultrasonic medical devices and sonochemical reactors.

  5. One-Way Information Deficit and Geometry for a Class of Two-Qubit States

    International Nuclear Information System (INIS)

    Wang Yaokun; Ma Teng; Wang Zhixi; Li Bo

    2013-01-01

    The work deficit, as introduced by Jonathan Oppenheim et al. [Phys. Rev. Lett. 89 (2002) 180402], is a good measure of the quantum correlations in a state and provides a new standpoint for understanding quantum non-locality. In this paper, we analytically evaluate the one-way information deficit (OWID) for the Bell-diagonal states and a class of two-qubit states and further give the geometric picture for the OWID. The dynamic behavior of the OWID under a decoherence channel is investigated, and it is shown that the OWID of some classes of X states is more robust against decoherence than the entanglement.
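    For readers who want to reproduce the quantity numerically, the sketch below evaluates the one-way information deficit of a Bell-diagonal state by brute force, assuming the standard definition: the minimum, over projective measurements on one qubit, of the post-measurement entropy minus the entropy of the state. The correlation vector used as an example is arbitrary, and the grid search is a crude stand-in for the paper's analytic evaluation.

      import numpy as np

      I2 = np.eye(2)
      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sy = np.array([[0, -1j], [1j, 0]])
      sz = np.array([[1, 0], [0, -1]], dtype=complex)

      def entropy(rho):
          w = np.linalg.eigvalsh(rho)
          w = w[w > 1e-12]
          return float(-np.sum(w * np.log2(w)))

      def bell_diagonal(c):
          # rho = (I x I + sum_j c_j sigma_j x sigma_j) / 4
          rho = np.eye(4, dtype=complex)
          for cj, s in zip(c, (sx, sy, sz)):
              rho = rho + cj * np.kron(s, s)
          return rho / 4.0

      def one_way_deficit(rho, n_grid=60):
          # min over projective measurements on qubit B of S(post-measurement) - S(rho)
          best = np.inf
          for th in np.linspace(0.0, np.pi, n_grid):
              for ph in np.linspace(0.0, 2 * np.pi, 2 * n_grid, endpoint=False):
                  n = np.array([np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph), np.cos(th)])
                  proj_p = (I2 + n[0] * sx + n[1] * sy + n[2] * sz) / 2
                  proj_m = I2 - proj_p
                  post = sum(np.kron(I2, P) @ rho @ np.kron(I2, P) for P in (proj_p, proj_m))
                  best = min(best, entropy(post))
          return best - entropy(rho)

      rho = bell_diagonal([0.7, -0.5, 0.3])   # an example Bell-diagonal state
      print(round(one_way_deficit(rho), 4))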

  6. Exact Markov chain and approximate diffusion solution for haploid genetic drift with one-way mutation.

    Science.gov (United States)

    Hössjer, Ola; Tyvand, Peder A; Miloh, Touvia

    2016-02-01

    The classical Kimura solution of the diffusion equation is investigated for a haploid random mating (Wright-Fisher) model, with one-way mutations and initial-value specified by the founder population. The validity of the transient diffusion solution is checked by exact Markov chain computations, using a Jordan decomposition of the transition matrix. The conclusion is that the one-way diffusion model mostly works well, although the rate of convergence depends on the initial allele frequency and the mutation rate. The diffusion approximation is poor for mutation rates so low that the non-fixation boundary is regular. When this happens we perturb the diffusion solution around the non-fixation boundary and obtain a more accurate approximation that takes quasi-fixation of the mutant allele into account. The main application is to quantify how fast a specific genetic variant of the infinite alleles model is lost. We also discuss extensions of the quasi-fixation approach to other models with small mutation rates. Copyright © 2015 Elsevier Inc. All rights reserved.
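    The exact-chain side of this comparison is easy to reproduce. The sketch below builds the haploid Wright-Fisher transition matrix with one-way mutation (no back mutation) and iterates the allele-count distribution to track the probability that the mutant allele has been lost; the population size, mutation rate and initial frequency are illustrative, and the Jordan-decomposition and diffusion machinery of the paper are omitted.

      import numpy as np
      from scipy.stats import binom

      def wright_fisher_matrix(N, u):
          # Haploid Wright-Fisher model with one-way mutation A -> a at rate u.
          # State i = number of copies of allele A in a population of size N.
          i = np.arange(N + 1)
          p_next = (i / N) * (1.0 - u)      # expected A-frequency after mutation
          # T[i, j] = P(j copies next generation | i copies now)
          return binom.pmf(np.arange(N + 1)[None, :], N, p_next[:, None])

      def loss_probability(N, u, i0, generations):
          # Iterate the exact Markov chain and record P(allele A already lost).
          T = wright_fisher_matrix(N, u)
          dist = np.zeros(N + 1)
          dist[i0] = 1.0
          out = []
          for _ in range(generations):
              dist = dist @ T
              out.append(dist[0])           # state 0 is absorbing: A is lost
          return np.array(out)

      print(np.round(loss_probability(N=50, u=0.01, i0=25, generations=200)[[9, 49, 199]], 4))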

  7. Behavior of one-way reinforced concrete slabs subjected to fire

    Directory of Open Access Journals (Sweden)

    Said M. Allam

    2013-12-01

    Full Text Available A finite difference analysis was performed to investigate the behavior of one-way reinforced concrete slabs exposed to fire. The objective of the study was to investigate the fire resistance and the fire risk after extinguishing the fire. Firstly, the fire resistance was obtained using the ISO834 standard fire without cooling phase. Secondly, the ISO834 parametric fire with cooling phase was applied to study the effect of cooling time. Accordingly, the critical time for cooling was identified and the corresponding failure time was calculated. Moreover, the maximum risk time which is the time between the fire extinguishing and the collapse of slab was obtained. Sixteen one-way reinforced concrete slabs were considered to study the effect of important parameters namely: the concrete cover thickness; the plaster; and the live load ratio. Equations for heat transfer through the slab thickness were used in the fire resistance calculations. Studying the cooling time revealed that the slabs are still prone to collapse although they were cooled before their fire resistance. Moreover, increasing the concrete cover thickness and the presence of plaster led to an increase in the maximum risk time. However, the variation in the live load ratio has almost no effect on such time.
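    The heat-transfer part of such an analysis can be sketched with a few lines of explicit finite differences. The code below marches 1-D conduction through the slab thickness with the exposed face following the ISO 834 gas temperature and the unexposed face held at ambient; the thickness, diffusivity and the crude Dirichlet boundary conditions are assumptions for illustration, not the paper's thermal model (which would include convection, radiation and temperature-dependent properties).

      import numpy as np

      def iso834(t_min):
          # ISO 834 standard fire curve (deg C), time in minutes.
          return 20.0 + 345.0 * np.log10(8.0 * t_min + 1.0)

      def slab_temperature(thickness=0.15, n=31, alpha=6e-7, t_end_min=120.0):
          # Explicit 1-D conduction through the slab depth (FTCS scheme).
          dx = thickness / (n - 1)
          dt = 0.4 * dx ** 2 / alpha        # stable explicit time step
          T = np.full(n, 20.0)
          t = 0.0
          while t < t_end_min * 60.0:
              T[0] = iso834(t / 60.0)       # fire-exposed face
              T[-1] = 20.0                  # unexposed face (simplified)
              T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
              t += dt
          return T                          # temperature profile across the depth

      print(np.round(slab_temperature()[:5], 1))   # temperatures near the exposed face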

  8. Quantification of Information in a One-Way Plant-to-Animal Communication System

    Directory of Open Access Journals (Sweden)

    Laurance R. Doyle

    2009-08-01

    Full Text Available In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily obtain the location of its preferred prey—one of two types of parasitic herbivores feeding on the cotton plants. Specification of the plant-eating herbivore feeding on it by the cotton plants allows preferential attraction of the wasps to those individual plants. We interpret the emission of nine chemicals by the plants as individual signal differences (depending on the herbivore type), to be detected by the wasps as constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use fractional differences in the chemical abundances (emitted as a result of the two herbivore types) to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity, etc.) of the transmitted message. We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception), for possible insights into the history and actual working of this one-way communication system.
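    The entropic bookkeeping described above reduces to a few lines once the signal statistics are tabulated. The sketch below computes marginal, joint and mutual entropies plus the equivocation from a joint probability table of herbivore type versus emitted chemical; the 2 x 9 table of counts is purely hypothetical, standing in for the fractional abundance data of the study.

      import numpy as np

      def entropies(p_xy):
          # p_xy[x, y]: joint probability of herbivore type x and chemical signal y.
          p_xy = p_xy / p_xy.sum()
          px, py = p_xy.sum(axis=1), p_xy.sum(axis=0)
          h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
          H_x, H_y, H_xy = h(px), h(py), h(p_xy.ravel())
          mutual = H_x + H_y - H_xy
          equivocation = H_x - mutual       # H(X|Y): uncertainty left about the sender
          return H_x, H_y, H_xy, mutual, equivocation

      # Hypothetical joint counts: 2 herbivore types x 9 emitted chemicals.
      rng = np.random.default_rng(2)
      counts = rng.integers(1, 20, size=(2, 9)).astype(float)
      print([round(float(v), 3) for v in entropies(counts)])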

  9. One-way spatial integration of Navier-Stokes equations: stability of wall-bounded flows

    Science.gov (United States)

    Rigas, Georgios; Colonius, Tim; Towne, Aaron; Beyar, Michael

    2016-11-01

    For three-dimensional flows, questions of stability, receptivity, secondary flows, and coherent structures require the solution of large partial-derivative eigenvalue problems. Reduced-order approximations are thus required for engineering prediction since these problems are often computationally intractable or prohibitively expensive. For spatially slowly evolving flows, such as jets and boundary layers, a regularization of the equations of motion sometimes permits a fast spatial marching procedure that results in a huge reduction in computational cost. Recently, a novel one-way spatial marching algorithm has been developed by Towne & Colonius. The new method overcomes the principal flaw observed in Parabolized Stability Equations (PSE), namely the ad hoc regularization that removes upstream propagating modes. The one-way method correctly parabolizes the flow equations based on estimating, in a computationally efficient way, the local spectrum in each cross-stream plane, and an efficient spectral filter eliminates modes with upstream group velocity. Results from the application of the method to wall-bounded flows will be presented and compared with predictions from the full linearized compressible Navier-Stokes equations and PSE.

  10. Extending the CLAST sequential rule to one-way ANOVA under group sampling.

    Science.gov (United States)

    Ximénez, Carmen; Revuelta, Javier

    2007-02-01

    Several studies have demonstrated that the fixed-sample stopping rule (FSR), in which the sample size is determined in advance, is less practical and efficient than are sequential-stopping rules. The composite limited adaptive sequential test (CLAST) is one such sequential-stopping rule. Previous research has shown that CLAST is more efficient in terms of sample size and power than are the FSR and other sequential rules and that it reflects more realistically the practice of experimental psychology researchers. The CLAST rule has been applied only to the t test of mean differences with two matched samples and to the chi-square independence test for twofold contingency tables. The present work extends previous research on the efficiency of CLAST to multiple group statistical tests. Simulation studies were conducted to test the efficiency of the CLAST rule for the one-way ANOVA for fixed effects models. The ANOVA general test and two linear contrasts of multiple comparisons among treatment means are considered. The article also introduces four rules for allocating N observations to J groups under the general null hypothesis and three allocation rules for the linear contrasts. Results show that the CLAST rule is generally more efficient than the FSR in terms of sample size and power for one-way ANOVA tests. However, the allocation rules vary in their optimality and have a differential impact on sample size and power. Thus, selecting an allocation rule depends on the cost of sampling and the intended precision.
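    A schematic version of such a sequential-stopping rule is easy to simulate. The sketch below starts with a small sample per group, reruns the one-way ANOVA after every added batch, and stops early when the p-value crosses an inner (reject) or outer (accept) boundary; the boundaries, batch size and group generators are illustrative and do not reproduce the calibrated CLAST parameters or allocation rules.

      import numpy as np
      from scipy.stats import f_oneway

      def sequential_anova(group_gens, n_min=10, n_max=40, batch=5,
                           p_low=0.01, p_high=0.40):
          # Sequential rule: test after every batch; stop early on clear evidence.
          groups = [gen(n_min) for gen in group_gens]
          while True:
              p = f_oneway(*groups).pvalue
              n = len(groups[0])
              if p <= p_low:
                  return "reject H0", n
              if p >= p_high or n >= n_max:
                  return ("reject H0" if p <= 0.05 else "fail to reject H0"), n
              groups = [np.concatenate([g, gen(batch)])
                        for g, gen in zip(groups, group_gens)]

      rng = np.random.default_rng(3)
      gens = [lambda n: rng.normal(0.0, 1.0, n),
              lambda n: rng.normal(0.5, 1.0, n),
              lambda n: rng.normal(1.0, 1.0, n)]
      print(sequential_anova(gens))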

  11. Control of distributed interference in the one-way quantum cryptography system

    Science.gov (United States)

    Balygin, K. A.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.

    2017-07-01

    The possibility of controlling interference in two spaced fiber Mach-Zehnder interferometers and maintaining a nearly ideal visibility has been demonstrated for the one-way quantum cryptography system directly in the key distribution process through a communication channel with a length of 50 km. It has been shown that the deviation of the visibility from ideal is certainly due to the detected difference between the numbers of 0's and 1's in the raw (sifted) key. For this reason, an interferometer can be balanced only in the quasi-single-photon mode without the interruption of the process of key distribution by using the difference between the numbers of 0's and 1's in the raw key as an indicator of an error. The proposed approach reduces the balancing time and, furthermore, does not require additional exchanges through an open communication channel.

  12. “One-way ticket” Romanian Migration at the beginning of the 20th century

    Directory of Open Access Journals (Sweden)

    Silvia Bocancea

    2011-12-01

    Full Text Available Unlike the new age of migration that we are experiencing now, the social mobility specific to the end of the 19th century and the beginning of the 20th century may be defined by the expression “one way ticket”; the immigrant, usually a man, used to leave his country of origin and settle for good in his host-country (he did not look for his happiness from state to state, as it happens now). The Romanian migration of that time was directed mainly towards the New World (particularly to the USA, and less to Canada). This paper is an attempt to sketch the image of Romanian emigration by taking into account the peculiarities determined by: emigration causes, geographical predominance, social composition, occupational options in the host-country, and the structure of Romanian immigrant communities.

  13. One-Way Nested Large-Eddy Simulation over the Askervein Hill

    Directory of Open Access Journals (Sweden)

    James D. Doyle

    2009-07-01

    Full Text Available Large-eddy simulation (LES) models have been used extensively to study atmospheric boundary layer turbulence over flat surfaces; however, LES applications over topography are less common. We evaluate the ability of an existing model – COAMPS®-LES – to simulate flow over terrain using data from the Askervein Hill Project. A new approach is suggested for the treatment of the lateral boundaries using one-way grid nesting. LES wind profile and speed-up are compared with observations at various locations around the hill. The COAMPS-LES model performs generally well. This case could serve as a useful benchmark for evaluating LES models for applications over topography.

  14. File-Based One-Way BISON Coupling Through VERA: User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Stimpson, Shane G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-28

    Activities to incorporate fuel performance capabilities into the Virtual Environment for Reactor Applications (VERA) are receiving increasing attention [1–6]. The multiphysics emphasis is expanding as the neutronics (MPACT) and thermal-hydraulics (CTF) packages are becoming more mature. Capturing the finer details of fuel phenomena (swelling, densification, relocation, gap closure, etc.) is the natural next step in the VERA development process since these phenomena are currently not directly taken into account. While several codes could be used to accomplish this, the BISON fuel performance code [8,9] being developed by the Idaho National Laboratory (INL) is the focus of ongoing work in the Consortium for Advanced Simulation of Light Water Reactors (CASL). Built on INL’s MOOSE framework [10], BISON uses the finite element method for geometric representation and a Jacobian-free Newton-Krylov (JFNK) scheme to solve systems of partial differential equations for various fuel characteristic relationships. There are several modes of operation in BISON, but this work uses a 2D azimuthally symmetric (R-Z) smeared-pellet model. This manual is intended to cover (1) the procedure pertaining to the standalone BISON one-way coupling from VERA and (2) the procedure to generate BISON fuel temperature tables that VERA can use.

  15. Resolvent analysis of shear flows using One-Way Navier-Stokes equations

    Science.gov (United States)

    Rigas, Georgios; Schmidt, Oliver; Towne, Aaron; Colonius, Tim

    2017-11-01

    For three-dimensional flows, questions of stability, receptivity, secondary flows, and coherent structures require the solution of large partial-derivative eigenvalue problems. Reduced-order approximations are thus required for engineering prediction since these problems are often computationally intractable or prohibitively expensive. For spatially slowly evolving flows, such as jets and boundary layers, the One-Way Navier-Stokes (OWNS) equations permit a fast spatial marching procedure that results in a huge reduction in computational cost. Here, an adjoint-based optimization framework is proposed and demonstrated for calculating optimal boundary conditions and optimal volumetric forcing. The corresponding optimal response modes are validated against modes obtained in terms of global resolvent analysis. For laminar base flows, the optimal modes reveal modal and non-modal transition mechanisms. For turbulent base flows, they predict the evolution of coherent structures in a statistical sense. Results from the application of the method to three-dimensional laminar wall-bounded flows and turbulent jets will be presented. This research was supported by the Office of Naval Research (N00014-16-1-2445) and Boeing Company (CT-BA-GTA-1).

  16. Effect of non-normality on test statistics for one-way independent groups designs.

    Science.gov (United States)

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
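    The size of the problem, and the payoff of the Welch approach, can be seen with a short simulation. The sketch below compares the Type I error of the classical F test and of Welch's heteroscedastic ANOVA (implemented directly from the standard formulas) on skewed, unequal-variance data with equal population means; the trimmed-means and parametric-bootstrap variants studied in the paper are omitted, and the sample sizes and distributions are arbitrary.

      import numpy as np
      from scipy import stats

      def welch_anova_p(groups):
          # Welch's heteroscedastic one-way ANOVA; returns the p-value.
          k = len(groups)
          n = np.array([len(g) for g in groups], dtype=float)
          m = np.array([np.mean(g) for g in groups])
          v = np.array([np.var(g, ddof=1) for g in groups])
          w = n / v
          mw = np.sum(w * m) / np.sum(w)
          a = np.sum(w * (m - mw) ** 2) / (k - 1)
          tail = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
          b = 1 + 2 * (k - 2) / (k ** 2 - 1) * tail
          df2 = (k ** 2 - 1) / (3 * tail)
          return stats.f.sf(a / b, k - 1, df2)

      # Type I error under equal means but unequal variances and skewed data.
      rng = np.random.default_rng(4)
      reps, alpha = 2000, 0.05
      hits_f = hits_w = 0
      for _ in range(reps):
          groups = [rng.lognormal(0.0, s, n) - np.exp(s ** 2 / 2)   # mean-centred, skewed
                    for s, n in ((0.4, 20), (0.8, 15), (1.2, 10))]
          hits_f += stats.f_oneway(*groups).pvalue < alpha
          hits_w += welch_anova_p(groups) < alpha
      print("classic F:", hits_f / reps, " Welch:", hits_w / reps)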

  17. Joint Preprocesser-Based Detectors for One-Way and Two-Way Cooperative Communication Networks

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-05-01

    Efficient receiver designs for cooperative communication networks are becoming increasingly important. In previous work, cooperative networks communicated with the use of L relays. As the receiver is constrained, channel shortening and reduced-rank techniques were employed to design the preprocessing matrix that reduces the length of the received vector from L to U. In the first part of the work, a receiver structure is proposed which combines our proposed threshold selection criteria with the joint iterative optimization (JIO) algorithm that is based on the mean square error (MSE). Our receiver assists in determining the optimal U. Furthermore, this receiver provides the freedom to choose U for each frame depending on the tolerable difference allowed for MSE. Our study and simulation results show that by choosing an appropriate threshold, it is possible to gain in terms of complexity savings while having no or minimal effect on the BER performance of the system. Furthermore, the effect of channel estimation on the performance of the cooperative system is investigated. In the second part of the work, a joint preprocessor-based detector for cooperative communication networks is proposed for one-way and two-way relaying. This joint preprocessor-based detector operates on the principles of minimizing the symbol error rate (SER) instead of minimizing MSE. For a realistic assessment, pilot symbols are used to estimate the channel. From our simulations, it can be observed that our proposed detector achieves the same SER performance as that of the maximum likelihood (ML) detector with all participating relays. Additionally, our detector outperforms selection combining (SC), channel shortening (CS) scheme and reduced-rank techniques when using the same U. Finally, our proposed scheme has the lowest computational complexity.

  18. One-way-coupling simulation of cavitation accompanied by high-speed droplet impact

    Energy Technology Data Exchange (ETDEWEB)

    Kondo, Tomoki; Ando, Keita, E-mail: kando@mech.keio.ac.jp [Department of Mechanical Engineering, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522 (Japan)

    2016-03-15

    Erosion due to high-speed droplet impact is a crucial issue in industrial applications. The erosion is caused by the water-hammer loading on material surfaces and possibly by the reloading from collapsing cavitation bubbles that appear within the droplet. Here, we simulate the dynamics of cavitation bubbles accompanied by high-speed droplet impact against a deformable wall in order to see whether the bubble collapse is violent enough to give rise to cavitation erosion on the wall. The evolution of pressure waves in a single water (or gelatin) droplet to collide with a deformable wall at speed up to 110 m/s is inferred from simulations of multicomponent Euler flow where phase changes are not permitted. Then, we examine the dynamics of cavitation bubbles nucleated from micron/submicron-sized gas bubble nuclei that are supposed to exist inside the droplet. For simplicity, we perform Rayleigh–Plesset-type calculations in a one-way-coupling manner, namely, the bubble dynamics are determined according to the pressure variation obtained from the Euler flow simulation. In the simulation, the preexisting bubble nuclei whose size is either micron or submicron show large growth to submillimeters because tension inside the droplet is obtained through interaction of the pressure waves and the droplet interface; this supports the possibility of having cavitation due to the droplet impact. It is also found, in particular, for the case of cavitation arising from very small nuclei such as nanobubbles, that radiated pressure from the cavitation bubble collapse can overwhelm the water-hammer pressure directly created by the impact. Hence, cavitation may need to be accounted for when it comes to discussing erosion in the droplet impact problem.
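    The one-way-coupling step itself, a Rayleigh-Plesset-type bubble driven by a prescribed pressure history, can be sketched compactly. In the code below the far-field pressure is a synthetic tension pulse standing in for the signal that the Euler-flow simulation would supply, and the liquid properties, nucleus size and polytropic index are assumed textbook values rather than the paper's settings.

      import numpy as np
      from scipy.integrate import solve_ivp

      rho, mu, sigma = 998.0, 1.0e-3, 0.072     # water density, viscosity, surface tension (SI)
      p0, R0, kappa = 101325.0, 1.0e-6, 1.4     # ambient pressure, nucleus radius, polytropic index
      pg0 = p0 + 2 * sigma / R0                 # initial gas pressure inside the nucleus

      def p_far(t):
          # Prescribed far-field pressure (the one-way-coupling input): a short
          # synthetic tension pulse in place of the Euler-flow pressure signal.
          return p0 - 2.0e5 * np.exp(-((t - 2e-6) / 5e-7) ** 2)

      def rayleigh_plesset(t, y):
          R, Rdot = y
          p_bubble = pg0 * (R0 / R) ** (3 * kappa) - 2 * sigma / R - 4 * mu * Rdot / R
          Rddot = ((p_bubble - p_far(t)) / rho - 1.5 * Rdot ** 2) / R
          return [Rdot, Rddot]

      sol = solve_ivp(rayleigh_plesset, (0.0, 4e-6), [R0, 0.0],
                      rtol=1e-8, atol=1e-12, max_step=1e-9)
      print("max radius growth factor:", round(float(np.max(sol.y[0]) / R0), 2))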

  19. Randomized dynamical decoupling strategies and improved one-way key rates for quantum cryptography

    International Nuclear Information System (INIS)

    Kern, Oliver

    2009-01-01

    The present thesis deals with various methods of quantum error correction. It is divided into two parts. In the first part, dynamical decoupling methods are considered which have the task of suppressing the influence of residual imperfections in a quantum memory. Such imperfections might be given by couplings between the finite dimensional quantum systems (qudits) constituting the quantum memory, for instance. The suppression is achieved by altering the dynamics of an imperfect quantum memory with the help of a sequence of local unitary operations applied to the qudits. Whereas up to now the operations of such decoupling sequences have been constructed in a deterministic fashion, strategies are developed in this thesis which construct the operations by random selection from a suitable set. Formulas are derived which estimate the average performance of such strategies. As it turns out, randomized decoupling strategies offer advantages and disadvantages over deterministic ones. It is possible to benefit from the advantages of both kind of strategies by designing combined strategies. Furthermore, it is investigated if and how the discussed decoupling strategies can be employed to protect a quantum computation running on the quantum memory. It is shown that a purely randomized decoupling strategy may be used by applying the decoupling operations and adjusted gates of the quantum algorithm in an alternating fashion. Again this method can be enhanced by the means of deterministic methods in order to obtain a combined decoupling method for quantum computations analogously to the combining strategies for quantum memories. The second part of the thesis deals with quantum error-correcting codes and protocols for quantum key distribution. The focus is on the BB84 and the 6-state protocol making use of only one-way communication during the error correction and privacy amplification steps. It is shown that by adding additional errors to the preliminary key (a process called

  20. Randomized dynamical decoupling strategies and improved one-way key rates for quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Kern, Oliver

    2009-05-25

    The present thesis deals with various methods of quantum error correction. It is divided into two parts. In the first part, dynamical decoupling methods are considered which have the task of suppressing the influence of residual imperfections in a quantum memory. Such imperfections might be given by couplings between the finite dimensional quantum systems (qudits) constituting the quantum memory, for instance. The suppression is achieved by altering the dynamics of an imperfect quantum memory with the help of a sequence of local unitary operations applied to the qudits. Whereas up to now the operations of such decoupling sequences have been constructed in a deterministic fashion, strategies are developed in this thesis which construct the operations by random selection from a suitable set. Formulas are derived which estimate the average performance of such strategies. As it turns out, randomized decoupling strategies offer advantages and disadvantages over deterministic ones. It is possible to benefit from the advantages of both kind of strategies by designing combined strategies. Furthermore, it is investigated if and how the discussed decoupling strategies can be employed to protect a quantum computation running on the quantum memory. It is shown that a purely randomized decoupling strategy may be used by applying the decoupling operations and adjusted gates of the quantum algorithm in an alternating fashion. Again this method can be enhanced by the means of deterministic methods in order to obtain a combined decoupling method for quantum computations analogously to the combining strategies for quantum memories. The second part of the thesis deals with quantum error-correcting codes and protocols for quantum key distribution. The focus is on the BB84 and the 6-state protocol making use of only one-way communication during the error correction and privacy amplification steps. It is shown that by adding additional errors to the preliminary key (a process called

  1. Non-reciprocity and topology in optics: one-way road for light via surface magnon polariton

    Science.gov (United States)

    Ochiai, Tetsuyuki

    2015-01-01

    We show how non-reciprocity and topology are used to construct an optical one-way waveguide in the Voigt geometry. First, we present a traditional approach of the one-way waveguide of light using surface polaritons under a static magnetic field. Second, we explain a recent discovery of a topological approach using photonic crystals with the magneto-optical coupling. Third, we present a combination of the two approaches, toward a broadband one-way waveguide in the microwave range. PMID:27877739

  2. One way and the other: the bidirectional relationship between ambivalence and body movement.

    Science.gov (United States)

    Schneider, Iris K; Eerland, Anita; van Harreveld, Frenk; Rotteveel, Mark; van der Pligt, Joop; van der Stoep, Nathan; Zwaan, Rolf A

    2013-03-01

    Prior research exploring the relationship between evaluations and body movements has focused on one-sided evaluations. However, people regularly encounter objects or situations about which they simultaneously hold both positive and negative views, which results in the experience of ambivalence. Such experiences are often described in physical terms: For example, people say they are "wavering" between two sides of an issue or are "torn." Building on this observation, we designed two studies to explore the relationship between the experience of ambivalence and side-to-side movement, or wavering. In Study 1, we used a Wii Balance Board to measure movement and found that people who are experiencing ambivalence move from side to side more than people who are not experiencing ambivalence. In Study 2, we induced body movement to explore the reverse relationship and found that when people are made to move from side to side, their experiences of ambivalence are enhanced.

  3. Assessing Fan Flutter Stability in Presence of Inlet Distortion Using One-Way and Two-Way Coupled Methods

    Science.gov (United States)

    Herrick, Gregory P.

    2014-01-01

    Concerns regarding noise, propulsive efficiency, and fuel burn are inspiring aircraft designs wherein the propulsive turbomachines are partially (or fully) embedded within the airframe; such designs present serious concerns with regard to aerodynamic and aeromechanic performance of the compression system in response to inlet distortion. Previously, a preliminary design of a forward-swept high-speed fan exhibited flutter concerns in clean-inlet flows, and the present author then studied this fan further in the presence of off-design distorted in-flows. Continuing this research, a three-dimensional, unsteady, Navier-Stokes computational fluid dynamics code is again applied to analyze and corroborate fan performance with clean inlet flow and now with a simplified, sinusoidal distortion of total pressure at the aerodynamic interface plane. This code, already validated in its application to assess aerodynamic damping of vibrating blades at various flow conditions using a one-way coupled energy-exchange approach, is modified to include a two-way coupled time-marching aeroelastic simulation capability. The two coupling methods are compared in their evaluation of flutter stability in the presence of distorted in-flows.

  4. One way and the other: The bi-directional relationship between ambivalence and body movement

    NARCIS (Netherlands)

    Schneider, I.K.; Eerland, A.; van Harreveld, F.; Rotteveel, M.; van der Pligt, J.; van der Stoep, N.; Zwaan, R.A.

    2013-01-01

    Prior research exploring the relationship between evaluations and body movements has focused on one-sided evaluations. However, people regularly encounter objects or situations about which they simultaneously hold both positive and negative views, which results in the experience of ambivalence. Such

  5. One way and the other: the bidirectional relationship between ambivalence and body movement

    NARCIS (Netherlands)

    Schneider, I.K.; Eerland, A.; van Harreveld, F.; Rotteveel, M.; van der Pligt, J.; van der Stoep, N.; Zwaan, R.A.

    2013-01-01

    Prior research exploring the relationship between evaluations and body movements has focused on one-sided evaluations. However, people regularly encounter objects or situations about which they simultaneously hold both positive and negative views, which results in the experience of ambivalence. Such

  6. [Health impact assessment: one way to introduce health in all policies. SESPAS Report 2010].

    Science.gov (United States)

    Esnaola, Santiago; Bacigalupe, Amaia; Sanz, Elvira; Aldasoro, Elena; Calderón, Carlos; Zuazagoitia, Juan; Cambra, Koldo

    2010-12-01

    Health impact assessment is a predictive tool to support decisions in policy-making. Current experience shows that health impact assessment could play an important role in the development of the Health in All Policies strategy. This strategy has been extensively used in other European countries and in a wide range of policy and administrative sectors. Health impact assessment is hardly ever mandatory and is frequently carried out separately from other impact assessments. The use of this process in Spain is relatively new, limited and fundamentally based on local level experiences and the screening of regional interventions. The current normative and organizational reform of public health in Spain provides an excellent opportunity to promote the development of health impact assessment. Some of the barriers to the development of this process are related to the biomedical model of health prevailing among health professionals, politicians, and the general population, political disaffection, lack of assessment culture, underdevelopment of community participation processes, and insufficient intersectoral work. Health impact assessment provides an opportunity to move toward improving the population's health and reducing inequalities in health. Consequently, political commitment, as well as investment in education and research, is needed to introduce and develop health impact assessment in all administrative settings and policy sectors. Copyright © 2010 SESPAS. Published by Elsevier Espana. All rights reserved.

  7. More than one way to see it: Individual heuristics in avian visual computation.

    Science.gov (United States)

    Ravignani, Andrea; Westphal-Fitch, Gesche; Aust, Ulrike; Schlumpp, Martin M; Fitch, W Tecumseh

    2015-10-01

    Comparative pattern learning experiments investigate how different species find regularities in sensory input, providing insights into cognitive processing in humans and other animals. Past research has focused either on one species' ability to process pattern classes or different species' performance in recognizing the same pattern, with little attention to individual and species-specific heuristics and decision strategies. We trained and tested two bird species, pigeons (Columba livia) and kea (Nestor notabilis, a parrot species), on visual patterns using touch-screen technology. Patterns were composed of several abstract elements and had varying degrees of structural complexity. We developed a model selection paradigm, based on regular expressions, that allowed us to reconstruct the specific decision strategies and cognitive heuristics adopted by a given individual in our task. Individual birds showed considerable differences in the number, type and heterogeneity of heuristic strategies adopted. Birds' choices also exhibited consistent species-level differences. Kea adopted effective heuristic strategies, based on matching learned bigrams to stimulus edges. Individual pigeons, in contrast, adopted an idiosyncratic mix of strategies that included local transition probabilities and global string similarity. Although performance was above chance and quite high for kea, no individual of either species provided clear evidence of learning exactly the rule used to generate the training stimuli. Our results show that similar behavioral outcomes can be achieved using dramatically different strategies and highlight the dangers of combining multiple individuals in a group analysis. These findings, and our general approach, have implications for the design of future pattern learning experiments, and the interpretation of comparative cognition research more generally. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
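    The regular-expression model-selection idea can be illustrated with a toy reconstruction: write each candidate heuristic as a regex over the stimulus strings, score it by how many of an individual's accept/reject choices it reproduces, and keep the best-scoring one. The stimuli, choices and candidate set below are entirely hypothetical and are only meant to show the mechanics, not the study's actual models or scoring criterion.

      import re

      # Hypothetical stimuli over the alphabet {A, B} and one bird's accept/reject choices.
      stimuli = ["ABAB", "AABB", "ABBA", "BABA", "BBAA", "ABAA", "BAAB", "AABA"]
      choices = [True, True, False, False, True, True, False, True]

      candidates = {                      # candidate heuristics as regular expressions
          "starts with AB": r"^AB",
          "starts with AA": r"^AA",
          "ends with AB": r"AB$",
          "contains BB": r"BB",
          "strictly alternating": r"^(AB)+$|^(BA)+$",
      }

      def score(pattern):
          # Fraction of choices reproduced by the rule "accept iff the regex matches".
          hits = sum((re.search(pattern, s) is not None) == c
                     for s, c in zip(stimuli, choices))
          return hits / len(stimuli)

      for name, pat in candidates.items():
          print(f"{name:20s} accuracy = {score(pat):.2f}")
      print("best-fitting heuristic:", max(candidates, key=lambda k: score(candidates[k])))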

  8. Behind the scenes of GS: there’s only one way to go

    CERN Multimedia

    Anaïs Schaeffer

    2014-01-01

    At CERN, all of the Laboratory’s imports and exports are routed in the same way: through the Logistics Service. This GS-IS Group service is responsible for receiving, inspecting and distributing all goods sent to the Organization.   Whether products for the CERN Stores, components for the experiments, tools, machinery and materials for the workshops or equipment for users and members of the personnel, nothing escapes the attention of CERN’s Logistics Service, which every year processes nearly 70,000 incoming deliveries, 7,500 shipments and 160,000 distributed items. “The vast majority of our imports come from CERN Member States,” says imports and customs procedures manager Laurence Planque, “but we are receiving more and more goods for collaborators working at CERN from non-Member States such as China, India and Pakistan. All these imports are entitled to diplomatic exemption, so every day we have to manage the customs clearance procedures wit...

  9. Topologically-protected one-way leaky waves in nonreciprocal plasmonic structures

    Science.gov (United States)

    Hassani Gangaraj, S. Ali; Monticone, Francesco

    2018-03-01

    We investigate topologically-protected unidirectional leaky waves on magnetized plasmonic structures acting as homogeneous photonic topological insulators. Our theoretical analyses and numerical experiments aim at unveiling the general properties of these exotic surface waves, and their nonreciprocal and topological nature. In particular, we study the behavior of topological leaky modes in stratified structures composed of a magnetized plasma at the interface with isotropic conventional media, and we show how to engineer their propagation and radiation properties, leading to topologically-protected backscattering-immune wave propagation, and highly directive and tunable radiation. Taking advantage of the non-trivial topological properties of these leaky modes, we also theoretically demonstrate advanced functionalities, including arbitrary re-routing of leaky waves on the surface of bodies with complex shapes, as well as the realization of topological leaky-wave (nano)antennas with isolated channels of radiation that are completely independent and separately tunable. Our findings help shedding light on the behavior of topologically-protected modes in open wave-guiding structures, and may open intriguing directions for future antenna generations based on topological structures, at microwaves and optical frequencies.

  10. Thermomechanical characterization of one-way shape memory Nitinol as an actuator for active surgical needle

    Science.gov (United States)

    Honarvar, Mohammad

    tested and different ranges of critical stress were found for different wire diameters. The transformation temperatures of different wire diameters at zero stress have been achieved by performing the Differential Scanning Calorimetry (DSC) test. The actuation force created by Nitinol wire is measured through constant strain experiment. X-Ray Diffraction (XRD) study was also performed to investigate the phase of Nitinol wires under various thermomechanical loading conditions. In summary, the effect of wire diameter on the required critical stresses to avoid the unrecovered strain between first and second cycle of heating and cooling are presented and the results of both mechanical tests are justified by the results obtained from the XRD study.

  11. Analyzing through Resonant Experience - Becoming the one to understand the other

    DEFF Research Database (Denmark)

    Revsbæk, Line

    As the post-qualitative research argument intensifies (St. Pierre, 2011), critiquing data-reliant methods of analysis (Brinkmann, 2014; Jackson & St. Pierre, 2014), we draw on insights from G. H. Mead’s Philosophy of the Present (1932) to describe our situated experience of doing analysis...... in instances of listening to audio recorded case study material (Revsbæk & Tanggaard, 2015). Iterating the practice of ‘analyzing in the present’ (Ibid.), I take an (auto-)biographic approach to analyzing case study material, drawing on resonant experience of my own to understand that expressed by the (case...... academic merits, but also about resonant life experience and the discipline of becoming the one to understand the other....

  12. Neutrinoless double beta decay experiment DCBA using a magnetic momentum-analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Ishihara, N., E-mail: nobuhiro.ishihara@kek.jp [High Energy Accelerator Research Organization (KEK), Tsukuba, 305-0801 (Japan); Kato, Y.; Inagaki, T.; Ohama, T.; Takeda, S.; Yamada, Y. [High Energy Accelerator Research Organization (KEK), Tsukuba, 305-0801 Japan (Japan); Ukishima, N.; Teramoto, Y. [Osaka City University, Sumiyoshi, Osaka, 558-8585 (Japan); Morishima, Y.; Nakano, I. [Okayama University, Okayama, 700-8530 (Japan); Kitamura, S. [Tokyo Metropolitan University, Arakawa, Tokyo, 116-8551 (Japan); Sakamoto, Y. [Tohoku Gakuin University, Izumi, Sendai, 981-3193 (Japan); Nagasaka, Y. [Hiroshima Institute of Technology, Saeki, Hiroshima, 731-5193 (Japan); Tamura, N. [Niigata University, Niigata, 950-2181 (Japan); Tanaka, K. [BTE, Minato, Tokyo, 105-0011 (Japan); Ito, R. [ZTJ, Chiyoda, Tokyo, 101-0047 (Japan)

    2011-12-15

    A magnetic momentum-analyzer is being developed at KEK for a neutrinoless double beta decay experiment called DCBA (Drift Chamber Beta-ray Analyzer, inverted ABCD). Many thin plates of a ¹⁵⁰Nd compound are installed in tracking detectors located in a uniform magnetic field. The three-dimensional position information is obtained for the helical track of a beta ray. Further R&D will be carried out using the second test apparatus DCBA-T2, which is now under construction.

  13. One Way to Holland

    DEFF Research Database (Denmark)

    Marselis, Randi; Schütze, Laura Maria

    2013-01-01

    Museums in many parts of the world are challenged by increased diversity within the populations that make up their potential audiences, and many museums of cultural history now acknowledge the culture of ethnic minority groups as an important subject in multiethnic societies. A central issue... materials and the expertise of museum staff but are at the same time recognized as able to contribute with valuable perspectives on their own culture (Peers & Brown, 2003, p. 1). This chapter examines how the Tropenmuseum in the Netherlands, one of Europe’s largest ethnographic museums, has used Facebook, Twitter, Flickr as well as the museum’s blog to reach migrant communities in order to collect and share information and stories related to photographs of postcolonial migrants. Through combining these different social media with promotion of the related offline photo exhibition in print media, the museum...

  14. Effect of one-way clutch on the nonlinear vibration of belt-drive systems with a continuous belt model

    Science.gov (United States)

    Ding, Hu; Zu, Jean W.

    2013-11-01

    This study focuses on the nonlinear steady-state response of a belt-drive system with a one-way clutch. A dynamic model is established to describe the rotations of the driving pulley, the driven pulley, and the accessory shaft. Moreover, the model considers the transverse vibration of the translating belt spans for the first time in belt-drive systems coupled with a one-way clutch. The excitation of the belt-drive system is derived from periodic fluctuation of the driving pulley. In automotive systems, this kind of fluctuation is induced by the engine firing harmonic pulsations. The derived coupled discrete-continuous nonlinear equations consist of integro-partial-differential equations and piece-wise ordinary differential equations. Using the Galerkin truncation, a set of nonlinear ordinary differential equations is obtained from the integro-partial-differential equations. Applying the Runge-Kutta time discretization, the time histories of the dynamic response are numerically solved for the driven pulley and the accessory shaft and the translating belt spans. The resonance areas of the coupled belt-drive system are determined using the frequency sweep. The effects of the one-way clutch on the belt-drive system are studied by comparing the frequency-response curves of the translating belt with and without one-way clutch device. Furthermore, the results of 2-term and 4-term Galerkin truncation are compared to determine the numerical convergence. Moreover, parametric studies are conducted to understand the effects of the system parameters on the nonlinear steady-state response. It is concluded that one-way clutch not only decreases the resonance amplitude of the driven pulley and shaft's rotational vibration, but also reduces the resonance region of the belt's transverse vibration.
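    The unidirectional decoupling that drives these results can be captured in a two-degree-of-freedom toy model: a driven pulley and an accessory exchange torque through a clutch spring only when the pulley runs ahead, and are otherwise decoupled. The sketch below integrates that piecewise system with an assumed harmonic belt torque; all parameter values are illustrative and the continuous belt spans of the full model are left out.

      import numpy as np
      from scipy.integrate import solve_ivp

      J_p, J_a = 0.01, 0.005             # driven-pulley and accessory inertias (kg m^2)
      k_c, c_p, c_a = 50.0, 0.02, 0.01   # clutch spring stiffness and viscous damping
      T0, Omega = 1.0, 60.0              # amplitude and frequency of the harmonic belt torque

      def clutch_torque(theta_p, theta_a):
          # One-way clutch: transmits spring torque only while the driven pulley
          # is ahead of the accessory; otherwise the two are decoupled.
          rel = theta_p - theta_a
          return k_c * rel if rel > 0.0 else 0.0

      def rhs(t, y):
          theta_p, w_p, theta_a, w_a = y
          Tc = clutch_torque(theta_p, theta_a)
          dw_p = (T0 * np.sin(Omega * t) - c_p * w_p - Tc) / J_p
          dw_a = (Tc - c_a * w_a) / J_a
          return [w_p, dw_p, w_a, dw_a]

      sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
      print("accessory speed range:",
            round(float(sol.y[3].min()), 3), "to", round(float(sol.y[3].max()), 3))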

  15. Verbal communication with the Blom low profile and Passy-Muir one-way tracheotomy tube speaking valves.

    Science.gov (United States)

    Adam, Stewart I; Srinet, Prateek; Aronberg, Ryan M; Rosenberg, Graeme; Leder, Steven B

    2015-01-01

    To investigate physiologic parameters, voice production abilities, and functional verbal communication ratings of the Blom low profile voice inner cannula and Passy-Muir one-way tracheotomy tube speaking valves. Case series with planned data collection. Large, urban, tertiary care teaching hospital. Referred sample of 30 consecutively enrolled adults requiring a tracheotomy tube and tested with Blom and Passy-Muir valves. Physiologic parameters recorded were oxygen saturation, respiration rate, and heart rate. Voice production abilities included maximum voice intensity in relation to ambient room noise and maximum phonation duration of the vowel /a/. Functional verbal communication was determined from randomized and blinded listener ratings of counting 1-10, saying the days of the week, and reading aloud the sentence, "There is according to legend a boiling pot of gold at one end." There were no significant differences (p>0.05) between the Blom and Passy-Muir valves for the physiologic parameters of oxygen saturation, respiration rate, and heart rate; voice production abilities of both maximum intensity and duration of /a/; and functional verbal communication ratings. Both valves allowed for significantly greater maximum voice intensity over ambient room noise (p<0.05). Both the Blom and Passy-Muir one-way speaking valves exhibited equipoise regarding patient physiologic parameters, voice production abilities, and functional verbal communication ratings. Readers will understand the importance of verbal communication for patients who require a tracheotomy tube; will be able to determine the differences between the Blom low profile voice inner cannula and Passy-Muir one-way tracheotomy tube speaking valves; and will be confident in knowing that both the Blom and Passy-Muir one-way tracheotomy tube speaking valves are equivalent regarding physiological functioning and speech production abilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Using Labeled Choice Experiments to Analyze Demand Structure and Market Position among Seafood Products

    DEFF Research Database (Denmark)

    Nguyen, Thong Tien; Solgaard, Hans Stubbe; Haider, Wolfgang

    2017-01-01

    … of the elements with respect to the firm’s position in the market. In this paper we analyze the demand structure and market positions for a variety of seafood products in the French retail market. We use a labeled choice experiment (LCE) to analyze twelve seafood species. The choice options are labeled … and reveal that salmon and cod have the strongest market position while monkfish and pangasius have the weakest. In general, the demand for seafood is moderately sensitive to price (market elasticity of -1.31). Large and low-income households, female consumers, people in the age range 35-44 years and self-employed consumers are the most sensitive to price. Four segments are identified and described in terms of both consumer characteristics and preferences. Our results are meaningful for producers and retailers developing marketing strategies and production plans…

  17. Towards for Analyzing Alternatives of Interaction Design Based on Verbal Decision Analysis of User Experience

    Directory of Open Access Journals (Sweden)

    Marília Soares Mendes

    2010-04-01

    In domains (such as digital TV, smart home, and tangible interfaces) that represent a new paradigm of interactivity, deciding on the most appropriate interaction design solution is a challenge. HCI researchers have promoted in their work the validation of alternative design solutions with users before producing the final solution. User experience with technology has also gained ground in these works as a means of identifying the appropriate solution(s). Following this concept, a study was carried out with the objective of finding a better interaction solution for a mobile TV application. Three executable mobile TV prototypes were built. A Verbal Decision Analysis model was applied to the investigation of the preferred characteristics of each prototype, based on the users’ experience and their intentions of use. This model led to a qualitative analysis that guided the design of a new prototype.

  18. Spectral analysis of one-way and two-way downscaling applications for a tidally driven coastal ocean forecasting system

    Science.gov (United States)

    Solano, Miguel; Gonzalez, Juan; Canals, Miguel; Capella, Jorge; Morell, Julio; Leonardi, Stefano

    2017-04-01

    … ways: 1) using Rich Pawlowicz's t_tide package (classic harmonic analysis), 2) with traditional band-pass filters (e.g. Lanczos) and 3) using Proper Orthogonal Decomposition. The tide-filtering approach shows great improvement in the high-frequency response of tidal motions at the open boundaries. Results are validated with NOAA tide gauges, Acoustic Doppler Current Profilers, and High Frequency Radars (6 km and 2 km resolution). A floating drifter experiment was performed in which 12 drifters were deployed at different coastal zones and tracked for several days. The results show an improvement of the forecast skill with the proper implementation of the tide-filtering approach by adjusting the nudging time scales and adequately removing the tidal signals. Significant improvement is found in the tracking skill of the floating drifters for the one-way grid, and the two-way nested application also shows some improvement over the offline downscaling approach at higher resolutions.

  19. A likelihood-based biostatistical model for analyzing consumer movement in simultaneous choice experiments.

    Science.gov (United States)

    Zeilinger, Adam R; Olson, Dawn M; Andow, David A

    2014-08-01

    Consumer feeding preference among resource choices has critical implications for basic ecological and evolutionary processes, and can be highly relevant to applied problems such as ecological risk assessment and invasion biology. Within consumer choice experiments, also known as feeding preference or cafeteria experiments, measures of relative consumption and measures of consumer movement can provide distinct and complementary insights into the strength, causes, and consequences of preference. Despite the distinct value of inferring preference from measures of consumer movement, rigorous and biologically relevant analytical methods are lacking. We describe a simple, likelihood-based, biostatistical model for analyzing the transient dynamics of consumer movement in a paired-choice experiment. With experimental data consisting of repeated discrete measures of consumer location, the model can be used to estimate constant consumer attraction and leaving rates for two food choices, and differences in choice-specific attraction and leaving rates can be tested using model selection. The model enables calculation of transient and equilibrial probabilities of consumer-resource association, which could be incorporated into larger scale movement models. We explore the effect of experimental design on parameter estimation through stochastic simulation and describe methods to check that data meet model assumptions. Using a dataset of modest sample size, we illustrate the use of the model to draw inferences on consumer preference as well as underlying behavioral mechanisms. Finally, we include a user's guide and computer code scripts in R to facilitate use of the model by other researchers.
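    The model sketched in this abstract estimates constant attraction and leaving rates from repeated discrete observations of consumer location; the authors provide R scripts, which are not reproduced here. As a hedged illustration of the same idea, the following Python sketch fits a three-state continuous-time Markov chain (neither resource, choice A, choice B) by maximum likelihood; the rate names, census interval, and data are all hypothetical.

```python
# Minimal sketch of a likelihood-based movement model for a paired-choice experiment.
# Not the published model or its R scripts; rates, interval and data are hypothetical.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# States: 0 = on neither resource, 1 = on choice A, 2 = on choice B.
def generator(a1, a2, l1, l2):
    """CTMC generator built from attraction rates (a1, a2) and leaving rates (l1, l2)."""
    return np.array([[-(a1 + a2), a1,   a2 ],
                     [ l1,       -l1,  0.0 ],
                     [ l2,       0.0,  -l2 ]])

def neg_log_lik(log_rates, paths, dt):
    a1, a2, l1, l2 = np.exp(log_rates)            # log parameterization keeps rates positive
    P = expm(generator(a1, a2, l1, l2) * dt)      # transition matrix over one census interval
    return -sum(np.log(max(P[s, u], 1e-300))
                for path in paths for s, u in zip(path[:-1], path[1:]))

# Hypothetical data: three consumers observed every 0.5 h, location coded 0/1/2.
paths = [[0, 0, 1, 1, 1, 0, 2], [0, 2, 2, 2, 0, 1, 1], [0, 1, 0, 0, 2, 2, 2]]
dt = 0.5

fit = minimize(neg_log_lik, x0=np.zeros(4), args=(paths, dt), method="Nelder-Mead")
a1, a2, l1, l2 = np.exp(fit.x)
print("attraction rates:", a1, a2, "  leaving rates:", l1, l2)

# Transient probability of association with choice A for an initially unassociated consumer.
P = expm(generator(a1, a2, l1, l2) * dt)
p = np.array([1.0, 0.0, 0.0])
for _ in range(40):                               # 40 census intervals = 20 h
    p = p @ P
print("P(on choice A) after 20 h:", p[1])
```

    Testing whether choice-specific rates differ (e.g., constraining a1 = a2) would amount to refitting the constrained model and comparing likelihoods or AIC, which mirrors the model selection described above.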

  20. Analyzing price and efficiency dynamics of large appliances with the experience curve approach

    International Nuclear Information System (INIS)

    Weiss, Martin; Patel, Martin K.; Junginger, Martin; Blok, Kornelis

    2010-01-01

    Large appliances are major power consumers in households of industrialized countries. Although their energy efficiency has been increasing substantially in past decades, still additional energy efficiency potentials exist. Energy policy that aims at realizing these potentials faces, however, growing concerns about possible adverse effects on commodity prices. Here, we address these concerns by applying the experience curve approach to analyze long-term price and energy efficiency trends of three wet appliances (washing machines, laundry dryers, and dishwashers) and two cold appliances (refrigerators and freezers). We identify a robust long-term decline in both specific price and specific energy consumption of large appliances. Specific prices of wet appliances decline at learning rates (LR) of 29±8% and thereby much faster than those of cold appliances (LR of 9±4%). Our results demonstrate that technological learning leads to substantial price decline, thus indicating that the introduction of novel and initially expensive energy efficiency technologies does not necessarily imply adverse price effects in the long term. By extending the conventional experience curve approach, we find a steady decline in the specific energy consumption of wet appliances (LR of 20-35%) and cold appliances (LR of 13-17%). Our analysis suggests that energy policy might be able to bend down energy experience curves. (author)
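    For readers unfamiliar with the experience curve approach used above, the standard single-factor formulation (not restated in the abstract; the symbols below are the conventional ones, chosen here for illustration) relates the specific price or specific energy consumption P to cumulative production C:

    $$ P(C_t) \;=\; P(C_0)\left(\frac{C_t}{C_0}\right)^{-b}, \qquad \mathrm{LR} \;=\; 1 - 2^{-b}, $$

    where b is the experience index and LR is the learning rate, i.e. the fractional reduction in P for each doubling of cumulative production. For example, the reported LR of 29% for wet-appliance prices corresponds to prices falling to 71% of their previous level with each doubling of cumulative production, i.e. b = -log2(0.71) ≈ 0.49.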

  1. Structural Behavior of Fibrous Reinforced Concrete Hollow Core One-Way Slabs Strengthening by C.F.R.P

    Directory of Open Access Journals (Sweden)

    وصيف مجيد

    2016-02-01

    Full Text Available A reinforced concrete hollow core one-way slab is one of the types of slabs used widely around the world in residential and industrial buildings to take advantage of them Economic and thermal insulation as well as to reduce the self-weight of the construction. The aim of the present study is to examine the structural behavior of the reinforced concrete hollow core one-way slabs reduce failure using the normal concrete and fibrous concrete and then strengthened using carbon fiber(CFRPThis study include molding of ( 6 specimens differ in terms of the voids volume (Vv , volumetric percentage of steel fibers (ا, and then strengthened by using fibers of carbon , with the aim of rehabilitation by fibers, carbon polymer (CFRP is to find out how efficient element structural when treated after the occurrence of the failure and the validity of its use in the event of a failure has occurred entirely or partly in the roof, and re- examined using the same method and conditions that were examined ceilings is affected through it, knowing that these ceilings have been addressed and strengthened in the same way , the results of the tests of the models that have been rehabilitated using carbon fiber (CFRP, compared with the same models before strengthening and examined reduce failure, increased very high susceptibility endurance extreme , with the increase ranging from (51.6% to (96.2%, as has been observed decrease in deflection value of models after strengthening by (CFRP.It is concluded through this study the possibility of using its concrete hollow core one-way slab as a roofing system for buildings also proved the highly efficient for this slab after rehabilitation using carbon fiber (CFRP.

  2. An integrative conceptual framework for analyzing customer satisfaction with shopping trip experiences in grocery retailing

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Jensen, Birger Boutrup; Bech-Larsen, Tino

    2012-01-01

    Grocery retailers aim to satisfy customers, and because grocery shopping trips are frequently recurring, they must do so continuously. Surprisingly, little research has addressed satisfaction with individual grocery shopping trips. This article therefore develops a conceptual framework for analyzing … customer satisfaction with individual grocery shopping trip experiences within an overall ‘disconfirmation of expectations model’ of customer satisfaction. The contribution of the framework is twofold. First, by focusing on satisfaction with individual grocery shopping trips, previous research … on satisfaction is extended to a context marked by frequently recurring, often tedious and routine activities. Understanding what causes satisfaction/dissatisfaction with individual shopping trips is required to explain overall, cumulative satisfaction with a retailer, which has been the focus of prior research …

  3. On the sub-model errors of a generalized one-way coupling scheme for linking models at different scales

    Science.gov (United States)

    Zeng, Jicai; Zha, Yuanyuan; Zhang, Yonggen; Shi, Liangsheng; Zhu, Yan; Yang, Jinzhong

    2017-11-01

    Multi-scale modeling of the localized groundwater flow problems in a large-scale aquifer has been extensively investigated under the context of cost-benefit controversy. An alternative is to couple the parent and child models with different spatial and temporal scales, which may result in non-trivial sub-model errors in the local areas of interest. Basically, such errors in the child models originate from the deficiency in the coupling methods, as well as from the inadequacy in the spatial and temporal discretizations of the parent and child models. In this study, we investigate the sub-model errors within a generalized one-way coupling scheme given its numerical stability and efficiency, which enables more flexibility in choosing sub-models. To couple the models at different scales, the head solution at parent scale is delivered downward onto the child boundary nodes by means of the spatial and temporal head interpolation approaches. The efficiency of the coupling model is improved either by refining the grid or time step size in the parent and child models, or by carefully locating the sub-model boundary nodes. The temporal truncation errors in the sub-models can be significantly reduced by the adaptive local time-stepping scheme. The generalized one-way coupling scheme is promising to handle the multi-scale groundwater flow problems with complex stresses and heterogeneity.
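    The coupling step described above, in which the parent-scale head solution is delivered to the child boundary nodes by spatial and temporal interpolation, can be pictured with a deliberately small sketch. The 1-D grids, time levels and head fields below are hypothetical; real applications interpolate 2-D or 3-D head fields and use the adaptive local time stepping mentioned in the abstract.

```python
# Minimal sketch of the one-way coupling step: parent heads are interpolated in
# space and time onto the child-model boundary nodes. Grids and fields are hypothetical.
import numpy as np

xp = np.linspace(0.0, 1000.0, 11)        # coarse parent nodes (m)
h_parent_t0 = 50.0 - 0.010 * xp          # parent head field at time t0 (made up)
h_parent_t1 = 50.0 - 0.012 * xp          # parent head field at time t1
t0, t1 = 0.0, 10.0                       # parent time levels (days)

x_child_bnd = np.array([220.0, 225.0, 230.0, 775.0, 780.0, 785.0])  # child boundary nodes

def child_boundary_head(t):
    """Parent head interpolated in time, then in space, onto the child boundary nodes."""
    w = (t - t0) / (t1 - t0)                          # linear interpolation weight in time
    h_t = (1.0 - w) * h_parent_t0 + w * h_parent_t1
    return np.interp(x_child_bnd, xp, h_t)            # spatial interpolation (1-D here)

# The child model, running with a smaller time step, queries this at every step.
for t in np.arange(0.0, 10.5, 1.0):
    print(t, np.round(child_boundary_head(t), 3))
```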

  4. Resource cost results for one-way entanglement distillation and state merging of compound and arbitrarily varying quantum sources

    International Nuclear Information System (INIS)

    Boche, H.; Janßen, G.

    2014-01-01

    We consider one-way quantum state merging and entanglement distillation under compound and arbitrarily varying source models. Regarding quantum compound sources, where the source is memoryless but the source state is an unknown member of a certain set of density matrices, we continue investigations begun in the work of Bjelaković et al. [“Universal quantum state merging,” J. Math. Phys. 54, 032204 (2013)] and determine the classical as well as entanglement cost of state merging. We further investigate quantum state merging and entanglement distillation protocols for arbitrarily varying quantum sources (AVQS). In the AVQS model, the source state is assumed to vary in an arbitrary manner for each source output due to environmental fluctuations or adversarial manipulation. We determine the one-way entanglement distillation capacity for AVQS, where we invoke the famous robustification and elimination techniques introduced by Ahlswede. Regarding quantum state merging for AVQS, we show by example that the robustification- and elimination-based approach generally leads to suboptimal entanglement as well as classical communication rates.

  5. Multi-user quantum private comparison with scattered preparation and one-way convergent transmission of quantum states

    Science.gov (United States)

    Ye, TianYu; Ji, ZhaoXu

    2017-09-01

    Quantum private comparison (QPC) aims to accomplish the equality comparison of the secrets from different users without disclosing their genuine contents by using the principles of quantum mechanics. In this paper, we first summarize eight modes of quantum state preparation and transmission existing in current QPC protocols. Then, by using the mode of scattered preparation and one-way convergent transmission, we construct a new multi-user quantum private comparison (MQPC) protocol with two-particle maximally entangled states, which can accomplish arbitrary pairwise comparison of equality among K users within one execution. Analysis shows that its output correctness and its security against both the outside attack and the participant attack are guaranteed. The proposed MQPC protocol can be implemented with current technologies. It can be concluded that the mode of scattered preparation and one-way convergent transmission of quantum states is beneficial to designing MQPC protocols that can accomplish arbitrary pairwise comparison of equality among K users within one execution.

  6. Using symbolic interactionism to analyze a specialized STEM high school teacher's experience in curriculum reform

    Science.gov (United States)

    Teo, Tang Wee; Osborne, Margery

    2012-09-01

    In this paper, we present a microanalysis of a specialized STEM (science, technology, engineering, and mathematics) high school teacher's experience of self-initiated science inquiry curriculum reform. We examine the meanings of these two constructs: inquiry curriculum and curriculum change through the process lens of interactions, actions, and interpretations. Symbolic interactionism is the theoretical framework we used to frame our analysis of how this teacher, Darren Daley (a pseudonym) and various stakeholders purposefully and strategically engaged in "face-work" and act out lines of actions to advocate or oppose curriculum change. Symbols are used in this world of face-to-face encounters to communicate, imply, and assert, meanings through socially flexible and adjustable processes. We scrutinize how Daley (un)consciously engaged all of these to defend his decisions, actions, and outcomes and "look" to others as doing inquiry reform. The meanings of such work are not intrinsically driven or reactions to psychological and extraneous factors and forces, but emergent through interactions. The data collection methods include interviews with Daley, school administrators, students, and parents, lesson observations in Daley's class, and gathering of school website pages, brochures, and curriculum materials. We represent data in narratives describing storied history, voices, interactions, anecdotal accounts from individuals' experiences, and interpretations. The analysis and findings illuminate the nature of teacher agency—how it is reclaimed, sustained, reinforced, contested, exercised, and modified in more nuanced ways, hence offering an alternative lens to theorizing and empirically analyzing this construct.

  7. Quantum cryptography with finite resources: unconditional security bound for discrete-variable protocols with one-way postprocessing.

    Science.gov (United States)

    Scarani, Valerio; Renner, Renato

    2008-05-23

    We derive a bound for the security of quantum key distribution with finite resources under one-way postprocessing, based on a definition of security that is composable and has an operational meaning. While our proof relies on the assumption of collective attacks, unconditional security follows immediately for standard protocols such as Bennett-Brassard 1984 and the six-state protocol. For single-qubit implementations of such protocols, we find that the secret key rate becomes positive when at least N ≈ 10^5 signals are exchanged and processed. For any other discrete-variable protocol, unconditional security can be obtained using the exponential de Finetti theorem, but the additional overhead leads to very pessimistic estimates.
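    For orientation (this is background, not the finite-key bound derived in the paper): in the asymptotic limit the one-way secret key rate of the BB84 protocol reduces to the familiar expression in the quantum bit error rate Q,

    $$ r_\infty \;=\; 1 - 2\,h(Q), \qquad h(Q) = -Q\log_2 Q - (1-Q)\log_2(1-Q), $$

    and the finite-resource analysis summarized above subtracts statistical and security corrections from this, which is why the key rate only becomes positive once roughly 10^5 signals have been exchanged and processed.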

  8. MicrobiomeDB: a systems biology platform for integrating, mining and analyzing microbiome experiments.

    Science.gov (United States)

    Oliveira, Francislon S; Brestelli, John; Cade, Shon; Zheng, Jie; Iodice, John; Fischer, Steve; Aurrecoechea, Cristina; Kissinger, Jessica C; Brunk, Brian P; Stoeckert, Christian J; Fernandes, Gabriel R; Roos, David S; Beiting, Daniel P

    2018-01-04

    MicrobiomeDB (http://microbiomeDB.org) is a data discovery and analysis platform that empowers researchers to fully leverage experimental variables to interrogate microbiome datasets. MicrobiomeDB was developed in collaboration with the Eukaryotic Pathogens Bioinformatics Resource Center (http://EuPathDB.org) and leverages the infrastructure and user interface of EuPathDB, which allows users to construct in silico experiments using an intuitive graphical 'strategy' approach. The current release of the database integrates microbial census data with sample details for nearly 14 000 samples originating from human, animal and environmental sources, including over 9000 samples from healthy human subjects in the Human Microbiome Project (http://portal.ihmpdcc.org/). Query results can be statistically analyzed and graphically visualized via interactive web applications launched directly in the browser, providing insight into microbial community diversity and allowing users to identify taxa associated with any experimental covariate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. A Novel Through Capacity Model for One-way Channel Based on Characteristics of the Vessel Traffic Flow

    Directory of Open Access Journals (Sweden)

    Yuanyuan Nie

    2017-09-01

    Vessel traffic flow is a key parameter for channel-through capacity and is of great significance to vessel traffic management, channel and port design and navigational risk evaluation. Based on the study of parameters of characteristics of vessel traffic flow related to channel-through capacity, this paper puts forward a brand-new mathematical model for one-way channel-through capacity in which parameters of channel length, vessel arrival rate and velocity difference in different vessels are involved and a theoretical calculating mechanism for the channel-through capacity is provided. In order to verify availability and reliability of the model, extensive simulation studies have been carried out and based on the historical AIS data, an analytical case study on the Xiazhimen Channel validating the proposed model is presented. Both simulation studies and the case study show that the proposed model is valid and all relative parameters can be readjusted and optimized to further improve the channel-through capacity. Thus, all studies demonstrate that the model is valuable for channel design and vessel management.

  10. A preliminary clinical study of transbronchoscopic interventional treatment for severe emphysema with locally made one-way valvular stents

    International Nuclear Information System (INIS)

    Fan Yong; Wu Qi; Liang Chunbao; Wu Xianjie; Tian Jing; Du Zhongzhen; Li Ping; Wu Junping; Shi Lixia; Zhao Chongfa; Li Yuping; Yu Lei; He Nengshu

    2008-01-01

    Objective: To evaluate the safety and efficacy of bronchoscopic interventional lung volume reduction with domestic-made stents. Methods: The target areas of 7 patients with severe emphysema selected for valvular stenting from May 2006 to Aug. 2007 were prospectively identified on the basis of CT scans. Under general anesthesia, one-way valvular stenting was carried out over a guidewire under flexible bronchoscopy and fluoroscopic control. Symptoms, pulmonary function, blood gas analysis, B-mode ultrasound, 6-minute walk distance and thoracic CT were assessed. Results: Placement of 4-6 stents per patient took 136 ± 72.3 min to obstruct the upper-lobe segments unilaterally, but without obvious atelectasis on imaging. The patients could walk 2 hours after the operation with relief of dyspnea. No major change in radiologic findings or lung function occurred within 2 weeks; only the bronchi distal to the stents gathered together. The 6-min walk distance and Borg dyspnea scale fell, and the pulmonary arterial pressure showed a statistically significant difference (P<0.05). No major life-threatening complications were noted in the 15-day study period and no conspicuous change in lung function, blood gas analysis or lung volume. Lower-lobe pneumonia of a nontarget area developed in 1 patient and an acute episode of COPD occurred in another. Conclusions: Bronchoscopic interventional lung volume reduction may improve dyspnea and quality of life as a rather safe therapeutic measure. (authors)

  11. [The relationship between Ridit analysis and rank sum test for one-way ordinal contingency table in medical research].

    Science.gov (United States)

    Wang, Ling; Xia, Jie-lai; Yu, Li-li; Li, Chan-juan; Wang, Su-zhen

    2008-06-01

    To explore several numerical methods of ordinal variables in one-way ordinal contingency tables and their interrelationship, and to compare corresponding statistical analysis methods such as Ridit analysis and the rank sum test. Formula deduction was based on five simplified grading approaches including rank_r(i), ridit_r(i), ridit_r(ci), ridit_r(mi), and table scores. A practical data set from clinical practice (testing the effect of Shiwei solution in the treatment of chronic tracheitis) was verified with SAS 8.2. Because of the linear relationship rank_r(i) = N ridit_r(i) + 1/2 = N ridit_r(ci) = (N + 1) ridit_r(mi), the exact chi2 values in Ridit analysis based on ridit_r(i), ridit_r(ci), and ridit_r(mi) were completely the same, and they were equivalent to the Kruskal-Wallis H test. Traditional Ridit analysis was based on ridit_r(i), and its corresponding chi2 value calculated with an approximate variance (1/12) was conservative. The exact chi2 test of Ridit analysis should be used when comparing multiple groups in clinical research because of its special merits such as the distribution of mean ridit values on (0,1) and clear graphical expression. The exact chi2 test of Ridit analysis can be output directly by proc freq of SAS 8.2 with the ridit and modridit options (SCORES =). The exact chi2 test of Ridit analysis is equivalent to the Kruskal-Wallis H test, and should be used when comparing multiple groups in clinical research.
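    The linear relationship quoted above, and hence the equivalence between the exact test of Ridit analysis and the Kruskal-Wallis H test, is easy to check numerically. The sketch below uses a small hypothetical one-way ordinal contingency table; it illustrates the relation rank_r(i) = N ridit_r(i) + 1/2 and the H test, not the SAS 8.2 procedure mentioned in the abstract.

```python
# Numerical check (hypothetical table, not the SAS analysis): mid-ranks are a linear
# function of ridits, and the rank-based test is the Kruskal-Wallis H test.
import numpy as np
from scipy.stats import kruskal

# 3 treatment groups x 4 ordered response categories.
table = np.array([[10, 20, 15,  5],
                  [ 8, 12, 20, 10],
                  [ 5, 10, 18, 17]])

col_totals = table.sum(axis=0)
N = col_totals.sum()

# Ridit of each ordered category, referenced to the combined sample.
cum_below = np.concatenate(([0], np.cumsum(col_totals)[:-1]))
ridit = (cum_below + 0.5 * col_totals) / N

# Mid-rank of each category; verify rank = N * ridit + 1/2.
midrank = cum_below + (col_totals + 1) / 2.0
assert np.allclose(midrank, N * ridit + 0.5)

# Mean ridit per group, which lies in (0, 1) as used in Ridit analysis.
mean_ridit = (table * ridit).sum(axis=1) / table.sum(axis=1)
print("mean ridits:", np.round(mean_ridit, 3))

# Kruskal-Wallis H test on the same data, expanded to individual ordinal scores.
samples = [np.repeat(np.arange(table.shape[1]), row) for row in table]
H, p = kruskal(*samples)
print("Kruskal-Wallis H = %.3f, p = %.4f" % (H, p))
```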

  12. Combining NASA/JPL One-Way Optical-Fiber Light-Speed Data with Spacecraft Earth-Flyby Doppler-Shift Data to Characterise 3-Space Flow

    Directory of Open Access Journals (Sweden)

    Cahill R. T.

    2009-10-01

    We combine data from two high precision NASA/JPL experiments: (i) the one-way speed of light experiment using optical fibers: Krisher T.P., Maleki L., Lutes G.F., Primas L.E., Logan R.T., Anderson J.D. and Will C.M. Phys. Rev. D, 1990, v.42, 731-734, and (ii) the spacecraft earth-flyby Doppler shift data: Anderson J.D., Campbell J.K., Ekelund J.E., Ellis J. and Jordan J.F. Phys. Rev. Lett., 2008, v.100, 091102, to give the solar-system galactic 3-space average speed of 486 km/s in the direction RA = 4.29 h, Dec = -75.0 Deg. Turbulence effects (gravitational waves) are also evident. Data also reveals the 30 km/s orbital speed of the Earth and the Sun inflow component at 1AU of 42 km/s and also 615 km/s near the Sun, and for the first time, experimental measurement of the 3-space 11.2 km/s inflow of the Earth. The NASA/JPL data is in remarkable agreement with that determined in other light speed anisotropy experiments, such as Michelson-Morley (1887), Miller (1933), Torr and Kolen (1981), DeWitte (1991), Cahill (2006), Munera (2007), Cahill and Stokes (2008) and Cahill (2009).

  14. Challenging the One-Way Paradigm for More Effective Science Communication: A Critical Review of Two Public Campaigns Addressing Contentious Environmental Issues

    Science.gov (United States)

    McEntee, Marie; Mortimer, Claire

    2013-01-01

    This article examines two large-scale public communication campaigns to explore the appropriateness and effectiveness of using one-way communication in contentious environmental issues. The findings show while one-way communication can be successfully employed in contentious issues, it is not appropriate for all contexts and may contribute to…

  15. The application of the Stata multiple imputation command to analyze design of experiments with multiple regression

    OpenAIRE

    Clara Novoa; Suleima Alkusari

    2012-01-01

    This talk exemplifies the application of the multiple imputation technique available in Stata to analyze a design of experiments with multiple responses and missing data. No-imputation and multiple-imputation methodologies are compared.
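    The talk itself relies on Stata's mi commands, which are not reproduced here. As an illustration of the same workflow (impute the missing responses of a designed experiment several times, analyze each completed data set, and pool the estimates), the following sketch uses scikit-learn's IterativeImputer on a hypothetical 2^2 factorial data set; only the point estimates are pooled, and the between-imputation variance component of Rubin's rules is omitted for brevity.

```python
# Illustration of the multiple-imputation workflow (not the Stata mi commands):
# impute missing responses of a designed experiment, analyze each completed set, pool.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical 2^2 factorial design, 4 replicates, two lost responses.
A = np.tile([-1.0, -1.0, 1.0, 1.0], 4)
B = np.tile([-1.0, 1.0, -1.0, 1.0], 4)
y = 10 + 2.0 * A - 1.5 * B + 0.5 * A * B + rng.normal(0, 0.5, size=A.size)
y[[3, 9]] = np.nan
data = pd.DataFrame({"A": A, "B": B, "y": y})

coefs = []
for m in range(20):                                  # 20 stochastic imputations
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    completed = imp.fit_transform(data)
    X = np.column_stack([completed[:, 0], completed[:, 1],
                         completed[:, 0] * completed[:, 1]])
    coefs.append(LinearRegression().fit(X, completed[:, 2]).coef_)

pooled = np.mean(coefs, axis=0)                      # pooled point estimates (Rubin's rules)
print("pooled coefficients for A, B, A*B:", np.round(pooled, 3))
```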

  16. "One-way-street" streamlined admission of critically ill trauma patients reduces emergency department length of stay.

    Science.gov (United States)

    Fuentes, Eva; Shields, Jean-Francois; Chirumamilla, Nandan; Martinez, Myriam; Kaafarani, Haytham; Yeh, Daniel Dante; White, Benjamin; Filbin, Michael; DePesa, Christopher; Velmahos, George; Lee, Jarone

    2017-10-01

    Emergency department (ED) overcrowding remains a significant problem in many hospitals, and results in multiple negative effects on patient care outcomes and operational metrics. We sought to test whether implementing a quality improvement project could decrease ED LOS for trauma patients requiring an ICU admission from the ED, specifically by directly admitting critically ill trauma patients from the ED CT scanner to an ICU bed. This was a retrospective study comparing patients during the intervention period (2013-2014) to historical controls (2011-2013). Critically ill trauma patients requiring a CT scan, but not the operating room (OR) or Interventional Radiology (IR), were directly admitted from the CT scanner to the ICU, termed the "One-way street (OWS)". Controls from the 2011-2013 Trauma Registry were matched 1:1 based on the following criteria: Injury Severity Score; mechanism of injury; and age. Only patients who required emergent trauma consult were included. Our primary outcome was ED LOS, defined in minutes. Our secondary outcomes were ICU LOS, hospital LOS and mortality. Paired t test or Wilcoxon signed rank test were used for continuous univariate analysis and Chi square for categorical variables. Logistic regression and linear regressions were used for categorical and continuous multivariable analysis, respectively. 110 patients were enrolled in this study, with 55 in the OWS group and 55 matched controls. Matched controls had lower APACHE II score (12 vs. 15, p = 0.03) and a higher GCS (14 vs. 6, p = 0.04). ED LOS was 229 min shorter in the OWS group (82 vs. 311 min, p < 0.0001). The time between CT performed and ICU disposition decreased by 230 min in the OWS arm (30 vs. 300 min, p < 0.001). There was no difference in ED arrival to CT time between groups. Following multivariable analysis, mortality was primarily predicted by the APACHE II score (OR 1.29, p < 0.001), and not ISS, mechanism of injury, or age. After controlling for APACHE
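    The primary endpoint above is compared between matched pairs with a paired t test or Wilcoxon signed-rank test. A minimal sketch of that kind of analysis, on made-up ED length-of-stay values rather than the study data, is:

```python
# Matched-pairs comparison on made-up ED length-of-stay values (not the study data).
import numpy as np
from scipy.stats import ttest_rel, wilcoxon

rng = np.random.default_rng(1)
los_ows     = rng.normal(90, 25, size=55).clip(min=20)    # OWS patients (min)
los_control = rng.normal(310, 80, size=55).clip(min=30)   # matched historical controls (min)

t_stat, p_t = ttest_rel(los_ows, los_control)       # paired t test
w_stat, p_w = wilcoxon(los_ows, los_control)        # Wilcoxon signed-rank test

print("mean paired difference (min):", round((los_ows - los_control).mean(), 1))
print("paired t:  t = %.2f, p = %.2g" % (t_stat, p_t))
print("Wilcoxon:  W = %.1f, p = %.2g" % (w_stat, p_w))
```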

  17. The Large-Scale Biosphere-Atmosphere Experiment in Amazonia: Analyzing Regional Land Use Change Effects.

    Science.gov (United States)

    Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae

    2004-01-01

    The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context especially with regard to regional and global climate. Current development activities in Amazonia including deforestation, logging, cattle ranching, and agriculture...

  18. Experimenting with Spirituality: Analyzing "The God Gene" in a Nonmajors Laboratory Course

    Science.gov (United States)

    Silveira, Linda A.

    2008-01-01

    References linking genes to complex human traits, such as personality type or disease susceptibility, abound in the news media and popular culture. In his book "The God Gene: How Faith is Hardwired into Our Genes", Dean Hamer argues that a variation in the "VMAT2" gene plays a role in one's openness to spiritual experiences. In a nonmajors class,…

  19. Analyzing Exonuclease-Induced Hyperchromicity by Uv Spectroscopy: An Undergraduate Biochemistry Laboratory Experiment

    Science.gov (United States)

    Ackerman, Megan M.; Ricciardi, Christopher; Weiss, David; Chant, Alan; Kraemer-Chant, Christina M.

    2016-01-01

    An undergraduate biochemistry laboratory experiment is described that utilizes free online bioinformatics tools along with readily available exonucleases to study the effects of base stacking and hydrogen bonding on the UV absorbance of DNA samples. UV absorbance of double-stranded DNA at the λmax is decreased when the DNA bases are…

  20. Analyzing the Relationship Between Bus Pollution Policies and Morbidity Using a Quasi-Experiment

    OpenAIRE

    Ngo, Nicole S.

    2015-01-01

    Abstract Transit buses are used by millions of commuters every day, but they emit toxic diesel fumes. In 1988, the U.S. Environmental Protection Agency implemented emission standards for transit buses, which have been continually updated. Yet there is no quantitative evidence of the health benefits from these bus pollution policies due to data constraints and confounding variables. In this study, a quasi-experiment is used to exploit the geographic and temporal variation in emission standards...

  1. A two-compartment thermal-hydraulic experiment (LACE-LA4) analyzed by ESCADRE code

    International Nuclear Information System (INIS)

    Passalacqua, R.

    1994-01-01

    Large scale experiments show that whenever a Loss of Coolant Accident (LOCA) occurs, water pools are generated. Stratifications of steam saturated gas develop above water pools causing a two-compartment thermal-hydraulics. The LACE (LWR Advanced Containment Experiment) LA4 experiment, performed at the Hanford Engineering Development Laboratory (HEDL), exhibited a strong stratification, at all times, above a growing water pool. JERICHO and AEROSOLS-B2 are part of the ESCADRE code system (Ensemble de Systemes de Codes d'Analyse d'accident Des Reacteurs A Eau), a tool for evaluating the response of a nuclear plant to severe accidents. These two codes are here used to simulate respectively the thermal-hydraulics and the associated aerosol behavior. Code results have shown that modelling large containment thermal-hydraulics without taking account of the stratification phenomenon leads to large overpredictions of containment pressure and temperature. If the stratification is modelled as a zone with a higher steam condensation rate and a higher thermal resistance, ESCADRE predictions match experimental data quite well. The stratification thermal-hydraulics is controlled by power (heat fluxes) repartition in the lower compartment between the water pool and the nearby walls. Therefore the total, direct heat exchange between the two compartments is reduced. Stratification modelling is believed to be important for its influence on aerosol behavior: aerosol deposition through the interface of the two subcompartments is improved by diffusiophoresis and thermophoresis. In addition the aerosol concentration gradient, through the stratification, will cause a driving force for motion of smaller particles towards the pool. (author)

  2. Early Mission Orbit Determination Error Analysis Results for Low-Earth Orbiting Missions using TDRSS Differenced One-way Doppler Tracking Data

    Science.gov (United States)

    Marr, Greg C.

    2003-01-01

    Differencing multiple, simultaneous Tracking and Data Relay Satellite System (TDRSS) one-way Doppler passes can yield metric tracking data usable for orbit determination for (low-cost) spacecraft which do not have TDRSS transponders or local oscillators stable enough to allow the one-way TDRSS Doppler tracking data to be used for early mission orbit determination. Orbit determination error analysis results are provided for low Earth orbiting spacecraft for various early mission tracking scenarios.

  3. Using the factorial experiment method to analyze the corrosion protection process

    Directory of Open Access Journals (Sweden)

    Ţîţu Mihail Aurel

    2017-01-01

    The functions of an organization are research-development, production, commercial, financial-accounting, personnel and quality. In this paper the factorial experiment method is applied; it is currently one of the most widespread methods used in the research-development departments of organizations, due to its advantages and efficiency. The experiment was carried out at SC Coifer Impex SRL-Mirsa’s metal structures factory. The paper presents the modelling of the factors that influence two objective functions: ensuring the nominal thickness of the rough-cast film and limiting consumption. For data processing, the STATISTICA 7 software was used, which provides an accurate and effective way to determine the degree of influence of each variable on the stated objectives.
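    As a concrete illustration of the factorial experiment method referred to above (not the STATISTICA 7 analysis or the actual plant data), a two-level, two-factor design and its effect estimates can be set up as follows; the factor names and responses are hypothetical.

```python
# A 2^2 factorial design with hypothetical factor levels and responses.
import numpy as np

A = np.array([-1.0,  1.0, -1.0,  1.0])        # coded level of factor A
B = np.array([-1.0, -1.0,  1.0,  1.0])        # coded level of factor B
y = np.array([28.0, 35.0, 30.0, 41.0])        # measured response of each run (made up)

X = np.column_stack([np.ones(4), A, B, A * B])   # model matrix: mean, main effects, interaction
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# In a two-level design the conventional "effect" is twice the regression coefficient.
print("grand mean          :", beta[0])
print("effect of factor A  :", 2 * beta[1])
print("effect of factor B  :", 2 * beta[2])
print("A x B interaction   :", 2 * beta[3])
```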

  4. Context, Experience, Expectation, and Action—Towards an Empirically Grounded, General Model for Analyzing Biographical Uncertainty

    Directory of Open Access Journals (Sweden)

    Herwig Reiter

    2010-01-01

    The article proposes a general, empirically grounded model for analyzing biographical uncertainty. The model is based on findings from a qualitative-explorative study of transforming meanings of unemployment among young people in post-Soviet Lithuania. In a first step, the particular features of the uncertainty puzzle in post-communist youth transitions are briefly discussed. A historical event like the collapse of state socialism in Europe, similar to the recent financial and economic crisis, is a generator of uncertainty par excellence: it undermines the foundations of societies and the taken-for-grantedness of related expectations. Against this background, the case of a young woman and how she responds to the novel threat of unemployment in the transition to the world of work is introduced. Her uncertainty management in the specific time perspective of certainty production is then conceptually rephrased by distinguishing three types or levels of biographical uncertainty: knowledge, outcome, and recognition uncertainty. Biographical uncertainty, it is argued, is empirically observable through the analysis of acting and projecting at the biographical level. The final part synthesizes the empirical findings and the conceptual discussion into a stratification model of biographical uncertainty as a general tool for the biographical analysis of uncertainty phenomena. URN: urn:nbn:de:0114-fqs100120

  5. Analyzing the Relationship Between Bus Pollution Policies and Morbidity Using a Quasi-Experiment.

    Science.gov (United States)

    Ngo, Nicole S

    2015-09-01

    Transit buses are used by millions of commuters every day, but they emit toxic diesel fumes. In 1988, the U.S. Environmental Protection Agency implemented emission standards for transit buses, which have been continually updated. Yet there is no quantitative evidence of the health benefits from these bus pollution policies due to data constraints and confounding variables. In this study, a quasi-experiment is used to exploit the geographic and temporal variation in emission standards by using bus vintage as a proxy for bus emissions. This is accomplished using a unique, rich panel data set, which includes daily information on bus vintage and route for the New York City Transit bus fleet between 2006 and 2009. This information is merged with daily data on emergency department (ED) visits for respiratory illnesses, which include patients' residences at the census block level and exact admission date. Economic benefits resulting from these bus pollution policies are then estimated. Results show that stricter transit bus emission standards by the U.S. Environmental Protection Agency for particulate matter are associated with reduced ED visits for respiratory diseases for patients living within a few hundred feet of a bus route. These findings demonstrate that bus pollution policies have made critical improvements to public health.
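    The identification strategy summarized above, bus vintage as a proxy for emission standards combined with geographic and temporal variation, is typically estimated with a fixed-effects regression. The sketch below only illustrates that estimator on synthetic data; the variable names are hypothetical and it is not the author's specification.

```python
# Two-way fixed-effects sketch on synthetic data (hypothetical variables, not the
# author's specification or the NYC data set).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
blocks, days = 30, 200

df = pd.DataFrame({"block": np.repeat(np.arange(blocks), days),
                   "day":   np.tile(np.arange(days), blocks)})
df["old_bus_share"] = rng.uniform(0, 1, size=len(df))       # share of older-standard buses
block_effect = rng.normal(0, 1, blocks)[df["block"].to_numpy()]
day_trend = 0.01 * df["day"]
df["ed_visits"] = (2 + 1.5 * df["old_bus_share"] + block_effect
                   + day_trend + rng.normal(0, 1, len(df)))

# Block and day dummies absorb time-invariant local factors and citywide shocks.
fit = smf.ols("ed_visits ~ old_bus_share + C(block) + C(day)", data=df).fit()
print("effect of old-bus exposure:", round(fit.params["old_bus_share"], 3),
      "s.e.:", round(fit.bse["old_bus_share"], 3))
```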

  6. Peeking at ecosystem stability: making use of a natural disturbance experiment to analyze resistance and resilience.

    Science.gov (United States)

    Bruelheide, Helge; Luginbühl, Ute

    2009-05-01

    To determine which factors contribute most to the stability of species composition in a beech forest after profound disturbance, we made use of a natural experiment caused by a severe windthrow that occurred at a permanent monitoring site in an old beech forest in Lower Saxony (Germany). The floristic composition was recorded for the succeeding five years after the disturbance and used to derive measures of resistance and resilience for plots as well as for individual species. Due to the existence of previously established randomly distributed permanent plots, we had precise information of the pre-disturbance state, including initial cover of the herb layer, species richness, and species composition. Variables describing the floristic change, resistance, and resilience were derived from correspondence analysis allowing for partitioning the effects of variation among plots from those of temporal change. We asked to which degree these variables could be predicted by pre-disturbance state and disturbance intensity. We found that both the pre-disturbance state and the disturbance intensity were good predictors for floristic change and resistance, while they failed to predict resilience. Among the descriptors of the pre-disturbance state the initial cover of the herb layer turned out to be a useful predictor, which is explained by a high vegetation cover buffering against losses and preventing establishment of newcomers. In contrast, species number neither showed a relationship to floristic change nor to resistance. Putative positive effects of species number on stability according to the insurance hypothesis might have been counterbalanced by a disruption of niche complementarity in species-rich communities. Among the descriptors of disturbance intensity, the loss in canopy cover and the change in photosynthetically active radiation after the storm were equally good predictors for the change in floristic composition and resistance. The analysis of the responses of

  7. Determination of the extra buffering distance in the one-way nesting procedure for the regional ocean

    Science.gov (United States)

    Hwang, Jin Hwan; Pham, Van Sy

    2017-04-01

    The Big-Brother Experiment (BBE) evaluates the effect of domain size on ocean regional circulation models (ORCMs) in downscaling and nesting from ocean global circulation models (OGCMs). The BBE first establishes mimic ocean global circulation model (M-OGCM) data by employing an ORCM to simulate a highly resolved large domain. The M-OGCM results are then filtered to remove short scales and used as boundary and initial conditions for the nested ORCMs, which have the same resolution as the M-OGCM. Various domain sizes were embedded in the M-OGCM, and the cases were simulated to see the effect of domain size, with its extra buffering distance, on the results of the ORCMs. The diagnostic variables of the nested domain, including temperature, salinity and vorticity, are then compared with those of the M-OGCM before filtering. Differences between them can address the errors associated with the domain size, which cannot be attributed unambiguously to model errors or observational errors. The results showed that domain size significantly impacts the results of the ORCMs. As the domain size of the ORCM becomes larger, the extra distance between the area of interest and the updated LBCs increases, so the ORCM results become more highly correlated with the M-OGCM. However, there is an optimal domain size, which could be 2 to 10 times larger than the nested ORCM's domain, depending on the computational costs. Key words: domain size, error, ocean regional circulation model, Big-Brother Experiment. Acknowledgement: This research was supported by grants from the Korean Ministry of Oceans and Fisheries entitled "Development of integrated estuarine management system" and a National Research Foundation of Korea (NRF) Grant (No. 2015R1A5A 7037372) funded by MSIP of Korea. The authors thank the Integrated Research Institute of Construction and Environmental Engineering of Seoul National University for administrative support.

  8. Water diffusion through compacted clays analyzed by neutron scattering and tracer experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Sanchez, F

    2007-11-15

    … samples using various methodologies to better interpret the dynamic results. Parameters such as particle size, layer spacing, chemical composition, external and total surfaces and porosity were determined. The fundamental transport processes in compacted clay systems were studied over a broad range of temperatures, combining microscopic diffusion experiments (T ∼ 98 to -23 °C) with macroscopic measurements (T ∼ 70 to 0 °C). Moreover, the freezing behaviour of water in compacted clays and its dynamical properties in the supercooled regime were also investigated using microscopic techniques such as neutron scattering, time of flight and backscattering techniques. Such knowledge contributes to develop simplified models for water (and possibly also solute) transport through clays, as they will be used to assess the performance of radioactive waste repositories. From this point of view, it can be beneficial for the safety of radioactive waste repositories and thus the protection of the environment.

  9. Water diffusion through compacted clays analyzed by neutron scattering and tracer experiments

    International Nuclear Information System (INIS)

    Gonzalez Sanchez, F.

    2007-11-01

    … using various methodologies to better interpret the dynamic results. Parameters such as particle size, layer spacing, chemical composition, external and total surfaces and porosity were determined. The fundamental transport processes in compacted clay systems were studied over a broad range of temperatures, combining microscopic diffusion experiments (T ∼ 98 to -23 °C) with macroscopic measurements (T ∼ 70 to 0 °C). Moreover, the freezing behaviour of water in compacted clays and its dynamical properties in the supercooled regime were also investigated using microscopic techniques such as neutron scattering, time of flight and backscattering techniques. Such knowledge contributes to develop simplified models for water (and possibly also solute) transport through clays, as they will be used to assess the performance of radioactive waste repositories. From this point of view, it can be beneficial for the safety of radioactive waste repositories and thus the protection of the environment.

  10. Assessing Fan Flutter Stability in the Presence of Inlet Distortion Using One-way and Two-way Coupled Methods

    Science.gov (United States)

    Herrick, Gregory P.

    2014-01-01

    Concerns regarding noise, propulsive efficiency, and fuel burn are inspiring aircraft designs wherein the propulsive turbomachines are partially (or fully) embedded within the airframe; such designs present serious concerns with regard to aerodynamic and aeromechanic performance of the compression system in response to inlet distortion. Previously, a preliminary design of a forward-swept high-speed fan exhibited flutter concerns in clean-inlet flows, and the present author then studied this fan further in the presence of off-design distorted in-flows. A three-dimensional, unsteady, Navier-Stokes computational fluid dynamics code is applied to analyze and corroborate fan performance with clean inlet flow. This code, already validated in its application to assess aerodynamic damping of vibrating blades at various flow conditions using a loosely-coupled approach, is modified to include a tightly-coupled aeroelastic simulation capability, and then loosely-coupled and tightly-coupled methods are compared in their evaluation of flutter stability in distorted in-flows.
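    The distinction drawn above between loosely-coupled and tightly-coupled aeroelastic simulation can be illustrated on a toy single-mode problem: loose coupling evaluates the aerodynamic load from the previous step's structural state, while tight coupling sub-iterates the load and the structural update within each time step. The sketch below is only a conceptual stand-in with hypothetical coefficients, not the Navier-Stokes code discussed in the abstract.

```python
# Toy comparison of loosely- vs. tightly-coupled aeroelastic time stepping on a single
# structural mode with a quasi-steady load; coefficients are hypothetical.
import numpy as np

m, c, k = 1.0, 0.02, 25.0        # modal mass, damping, stiffness
ka, ca = 5.0, 0.3                # load sensitivity to displacement / velocity
dt, nsteps = 0.01, 2000

def aero_load(q, qdot):
    return -ka * q - ca * qdot   # stands in for the CFD solution

def advance_structure(q, qdot, load):
    """One backward-Euler step of m*q'' + c*q' + k*q = load."""
    A = np.array([[1.0, -dt], [k * dt / m, 1.0 + c * dt / m]])
    b = np.array([q, qdot + dt * load / m])
    return np.linalg.solve(A, b)

def simulate(tight, subiters=5):
    q, qdot = 0.1, 0.0           # initial modal perturbation
    hist = []
    for _ in range(nsteps):
        load = aero_load(q, qdot)                      # loose: load lags one step
        q_new, qdot_new = advance_structure(q, qdot, load)
        if tight:                                      # tight: sub-iterate within the step
            for _ in range(subiters):
                load = aero_load(q_new, qdot_new)
                q_new, qdot_new = advance_structure(q, qdot, load)
        q, qdot = q_new, qdot_new
        hist.append(q)
    return np.array(hist)

for label, tight in (("loosely coupled", False), ("tightly coupled", True)):
    print(label, "late-time amplitude:", np.abs(simulate(tight))[-200:].max())
```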

  11. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system which allows hands off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post trigger operations with variable length blocks using normal, peak to peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general

  12. Analyze Experiment For Vigas and Pertamax to Performance and Exhaust Gas Emission for Gasoline Motor 2000cc

    Science.gov (United States)

    As'adi, Muhamad; Chrisna Ayu Dwiharpini Tupan, Diachirta

    2018-02-01

    The purpose of this experimental analysis is to obtain the performance variables of a gasoline engine fuelled with LGV and with Pertamax, in order to show the community that LGV can be used as a fuel in the transportation industry and is more economical. The experimental method used a 2000 cc gasoline engine running on LGV and on Pertamax, with static tests carried out on a dyno test bench. The measured engine performance variables were torque, power and fuel consumption. Besides the static tests, exhaust gas emissions were also measured. The results show that using LGV, sold under the commercial brand Vigas, can increase the maximum engine power by 20.86% and the average power by 14.1%; the maximum torque of the engine fuelled with LGV is 0.94% lower than with Pertamax. Using Vigas can increase the mileage by up to 6.9% compared with Pertamax. The air-fuel ratio (AFR) for both fuels is still below the standard, so fuel is still wasted, especially at low compression. Using Vigas can also reduce exhaust gas emissions, especially CO2.

  13. Quantum computing with photons: introduction to the circuit model, the one-way quantum computer, and the fundamental principles of photonic experiments

    International Nuclear Information System (INIS)

    Barz, Stefanie

    2015-01-01

    Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. This tutorial reviews the fundamental tools of photonic quantum information processing. The basics of theoretical quantum computing are presented and the quantum circuit model as well as measurement-based models of quantum computing are introduced. Furthermore, it is shown how these concepts can be implemented experimentally using photonic qubits, where information is encoded in the photons’ polarization. (tutorial)

  14. Sticky or Slippery Wetting: Network Formation Conditions Can Provide a One-Way Street for Water Flow on Platinum-cured Silicone.

    Science.gov (United States)

    Wang, Chenyu; Nair, Sithara S; Veeravalli, Sharon; Moseh, Patricia; Wynne, Kenneth J

    2016-06-08

    … the high-temperature end (slippery surface) but became pinned at the low-temperature end (sticky surface) and did not move when the slide was rotated 180°. The surface was therefore a "one-way street" for water droplet flow. Theory provides fundamental understanding for slippery/sticky behavior for gradient S-PDMS and Pt-PDMS coatings. A model for network formation is based on hydrosilylation at high temperature and condensation curing of Si-OH from autoxidation of Si-H at low temperatures. In summary, network formation conditions strongly affect receding contact angles and water adhesion for Sylgard 184 and the filler-free mimic Pt-PDMS. These findings suggest careful control of curing conditions is important to silicones used in microfluidic devices or as biomedical materials. Network-forming conditions also impact bulk mechanical properties for Sylgard 184, but the range that can be obtained has not been critically examined for specific applications.

  15. In Situ Analysis of Organic Compounds on Mars by Gas Chromatography : Application to the Mars Organic Molecule Analyzer (MOMA) Experiment

    Science.gov (United States)

    Buch, Arnaud; Sternberg, R.; Freissinet, C.; Szopa, C.; Coll, P.; Garnier, C.; Rodier, C.; Phillipon, C.; El bekri, J.; Stambouli, M.; Goesmann, F.; Raulin, F.; MOMA GC-team

    2009-09-01

    The search for signs of past or present life is one of the primary goals of future Mars exploratory missions. With this aim, the Mars Organic Molecule Analyzer (MOMA) experiment of the upcoming ExoMars 2016 European Space Agency mission is designed for the in situ analysis of organic molecules of exobiological interest in the Martian soil, such as amino acids, carboxylic acids, nucleobases or polycyclic aromatic hydrocarbons (PAHs). In order to extract, separate and detect organic compounds from the soil, we have developed a sample processing system allowing gas chromatographic analysis, within space-compatible operating conditions, of the refractory organic compounds that may be present at trace level in the Martian soil. The sample processing is performed in the oven dedicated to the MOMA experiment containing the solid sample (200 mg). The internal temperature can range from 20 to 1000 °C. The extraction step is achieved by thermodesorption in the range of 100 to 300 °C for 5 to 20 min. Then, chemical derivatization or thermochemolysis of the extracted compounds is achieved directly on the soil with a mixture of MTBSTFA-DMF, TMAH, or DMF-DMA solution when enantiomeric separation is required. By decreasing the polarity of the target molecules, this step allows their volatilization at a temperature below 250 °C without any chemical degradation. Once derivatized, the target volatile molecules are trapped in a cold or chemical trap and promptly desorbed into the gas chromatograph coupled with a mass spectrometer. Organic compounds such as amino and carboxylic acids contained in a Martian analogue soil (Atacama) have been detected using our sample processing system.

  16. Improved One-Way Hash Chain and Revocation Polynomial-Based Self-Healing Group Key Distribution Schemes in Resource-Constrained Wireless Networks

    Directory of Open Access Journals (Sweden)

    Huifang Chen

    2014-12-01

    Self-healing group key distribution (SGKD) aims to deal with the key distribution problem over an unreliable wireless network. In this paper, we investigate the SGKD issue in resource-constrained wireless networks. We propose two improved SGKD schemes using the one-way hash chain (OHC) and the revocation polynomial (RP), the OHC&RP-SGKD schemes. In the proposed OHC&RP-SGKD schemes, by introducing the unique session identifier and binding the joining time with the capability of recovering previous session keys, the problem of the collusion attack between revoked users and newly joined users in existing hash chain-based SGKD schemes is resolved. Moreover, novel methods for utilizing the one-way hash chain and constructing the personal secret, the revocation polynomial and the key updating broadcast packet are presented. Hence, the proposed OHC&RP-SGKD schemes eliminate the limitation of the maximum allowed number of revoked users on the maximum allowed number of sessions, increase the maximum allowed number of revoked/colluding users, and reduce the redundancy in the key updating broadcast packet. Performance analysis and simulation results show that the proposed OHC&RP-SGKD schemes are practical for resource-constrained wireless networks in bad environments, where a strong collusion attack resistance is required and many users could be revoked.
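    The first building block named above, the one-way hash chain, is a standard construction and easy to illustrate: keys are generated by repeated hashing and disclosed in reverse order, so any disclosed key can be verified against the published anchor but future keys cannot be computed from it. The sketch below shows only this piece, not the paper's full OHC&RP-SGKD construction, its revocation polynomial, or its key-updating broadcast packet.

```python
# One-way hash chain sketch (chain generation and verification only; not the full
# OHC&RP-SGKD scheme). The seed below is hypothetical.
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, length: int) -> list:
    """Return [K_1, ..., K_L] with K_i = H(K_{i+1}); K_1 is the public anchor."""
    chain = [seed]                       # K_L
    for _ in range(length - 1):
        chain.append(H(chain[-1]))
    return chain[::-1]

def verify(anchor: bytes, key: bytes, session: int) -> bool:
    """Check that a key disclosed for `session` (1-based) hashes back to the anchor K_1."""
    h = key
    for _ in range(session - 1):
        h = H(h)
    return h == anchor

chain = make_chain(b"group-manager-secret-seed", 10)   # 10 sessions
anchor = chain[0]                                      # distributed to all members up front

print(verify(anchor, chain[6], 7))    # True: correct key for session 7
print(verify(anchor, chain[5], 7))    # False: key for session 6 presented as session 7
```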

  17. Solar wind interaction with Mars Upper atmosphere: Results from the one-way coupling between the Multi-fluid MHD model and the M-TGCM model

    Science.gov (United States)

    Dong, C.; Bougher, S. W.; Ma, Y.; Toth, G.; Nagy, A. F.; Brain, D. A.; Najib, D.

    2012-12-01

    The study of the solar wind interaction with Mars upper atmosphere/ionosphere has triggered great interest in recent years. Among the large number of topics in this research area, the investigation of ion escape rates has become increasingly important due to its potential impact on the long-term evolution of Mars atmosphere (e.g., loss of water) over its history. In the present work, we adopt the 3D Mars neutral atmosphere profiles from the well-regarded Mars Thermospheric Global Circulation Model (M-TGCM) and one-way couple it with the 3D BATS-R-US Mars multi-fluid MHD model that solves separate momentum equations for each ion species. The M-TGCM model takes into account the effects of the solar cycle (solar minimum: F10.7=70 and solar maximum: F10.7=200 with equinox condition: Ls=0), allowing us to investigate the effects of the solar cycle on the Mars upper atmosphere ion escape by using a one-way coupling, i.e., the M-TGCM model outputs are used as inputs for the multi-fluid MHD model. A case for solar maximum with extremely high solar wind parameters is also investigated to estimate how high the escape flux can be for such an extreme case. Moreover, the ion escape flux along a satellite trajectory will be studied. This has the potential to provide predictions of ion escape rates for comparison to future data to be returned by the MAVEN mission (2012-2016). In order to make the code run more efficiently, we adopt a more appropriate grid structure compared to the one used previously. This new grid structure will benefit us to investigate the effects of some dynamic events (such as CME and dust storm) on the ion escape flux.

  18. Mechanism of One-Way Traffic of Hexameric Phi29 DNA Packaging Motor with Four Electropositive Relaying Layers Facilitating Antiparallel Revolution

    Science.gov (United States)

    2013-01-01

    The importance of nanomotors in nanotechnology is akin to that of mechanical engines to daily life. The AAA+ superfamily is a class of nanomotors performing various functions. Their hexagonal arrangement facilitates bottom-up assembly for stable structures. The bacteriophage phi29 DNA translocation motor contains three coaxial rings: a dodecamer channel, a hexameric ATPase ring, and a hexameric pRNA ring. The viral DNA packaging motor has been believed to be a rotational machine. However, we discovered a revolution mechanism without rotation. By analogy, the earth revolves around the sun while rotating on its own axis. One-way traffic of dsDNA translocation is facilitated by five factors: (1) ATPase changes its conformation to revolve dsDNA within a hexameric channel in one direction; (2) the 30° tilt of the channel subunits causes an antiparallel arrangement between two helices of dsDNA and channel wall to advance one-way translocation; (3) unidirectional flow property of the internal channel loops serves as a ratchet valve to prevent reversal; (4) 5′–3′ single-direction movement of one DNA strand along the channel wall ensures single direction; and (5) four electropositive layers interact with one strand of the electronegative dsDNA phosphate backbone, resulting in four relaying transitional pauses during translocation. The discovery of a riding system along one strand provides a motion nanosystem for cargo transportation and a tool for studying force generation without coiling, friction, and torque. The revolution of dsDNA among 12 subunits offers a series of recognition sites on the DNA backbone to provide additional spatial variables for nucleotide discrimination for sensing applications. PMID:23510192

  19. Les aboutissements de la circulation à sens unique dans le discours médical [The Outcomes of one-way circulation in medical discourse]

    Directory of Open Access Journals (Sweden)

    Mª Dolores Vivero García

    2010-12-01

    Full Text Available This work analyses medical discourse on depression. Analysis of how the circulation of knowledge is staged in our corpus shows that this discourse rests on a scientific doxa constructed as the site of a one-way circulation. It also relies on a representation of the disorder that is assumed to be shared by its addressees, the general practitioners, so that the enunciation is implicitly tied to a doxic discourse about a particular model of depression.

  20. Multi-decadal scenario simulation over Korea using a one-way double-nested regional climate model system. Part 2: future climate projection (2021-2050)

    Energy Technology Data Exchange (ETDEWEB)

    Im, Eun-Soon; Kwon, Won-Tae [Meteorological Research Institute, Korea Meteorological Administration, Climate Research Lab, Seoul (Korea); Ahn, Joong-Bae [Pusan National University, Department of Atmospheric Sciences, Pusan (Korea); Giorgi, Filippo [Abdus Salam ICTP, Trieste (Italy)

    2008-02-15

    An analysis of simulated future surface climate change over the southern half of the Korean Peninsula using a RegCM3-based high-resolution one-way double-nested system is presented. Changes in mean climate as well as in the frequency and intensity of extreme climate events are discussed for the 30-year period 2021-2050 with respect to the reference period 1971-2000, based on the IPCC SRES B2 emission scenario. Warming in the range of 1-4 °C is found throughout the analysis region and in all seasons. The warming is largest at the higher latitudes of the South Korean Peninsula and in the cold season. A large reduction in snow depth is projected in response to the increase in winter minimum temperature induced by the greenhouse warming. The change in precipitation shows a distinct seasonal variation and substantial regional variability. In particular, we find a large increase of wintertime precipitation over Korea, especially on the upslope side of major mountain systems. Summer precipitation increases over the northern part of South Korea and decreases over the southern regions, indicating regional diversity. The precipitation change also shows marked intraseasonal variations throughout the monsoon season. The temperature change shows a positive trend throughout 2021-2050, while the precipitation change is characterized by pronounced interdecadal variations. The PDF of the daily temperature is shifted towards higher values and is somewhat narrower in the scenario run than in the reference run. The number of frost days decreases markedly and the number of hot days increases. The regional distribution of heavy precipitation (over 80 mm/day) changes considerably, indicating changes in flood-vulnerable regions. The climate change signal shows pronounced fine-scale structure over Korea, indicating the need for high-resolution climate simulations. (orig.)

  1. A simple computer-based method for performing and analyzing intracranial self-stimulation experiments in rats.

    Science.gov (United States)

    Kling-Petersen, T; Svensson, K

    1993-05-01

    Intracranial self-stimulation (ICSS) in the rat is a useful tool for studying the importance of various brain monoamines in positive reinforcement. The effects of compounds interacting with dopaminergic neurotransmission are measurable by studying changes in reward thresholds. By computerising the analysis of these thresholds, standardisation and reproducibility are greatly enhanced. The use of an object-oriented programming language simplifies the programming of a specific application and provides scientists without formal training in computer programming with the means to create their own software. A system for the acquisition, execution, analysis and storage of ICSS experiments is described. The hardware is based on Apple Macintosh computers, interfaced to the test chambers and physiological stimulators using a plug-in card supporting A/D, D/A, digital I/O and timer functions. The software, written in G (LabVIEW), provides the user with a graphically based 'virtual instrument' performing all aspects of the ICSS experiment. The software performs threshold analysis immediately after completion of the ICSS experiment, thereby greatly reducing the total time previously needed to evaluate these experiments. The graphical approach used in LabVIEW allows the programmer to make fast and simple alterations to suit different experimental problems.

  2. HIGH-TEMPERATURE EXAFS EXPERIMENTS ON LIQUID KPB ALLOYS ANALYZED WITH THE REVERSE MONTE-CARLO METHOD

    NARCIS (Netherlands)

    BRAS, W; XU, R; WICKS, JD; VANDERHORST, F; OVERSLUIZEN, M; MCGREEVY, RL; VANDERLUGT, W

    1994-01-01

    A new sample chamber has been designed which allows high-temperature Extended X-ray Absorption Fine Structure (EXAFS) experiments on metallic melts, which pose a number of special experimental problems: they are highly corrosive, have high vapour pressures and strongly absorb X-rays. The EXAFS

  3. One-way coupling of an integrated assessment model and a water resources model: evaluation and implications of future changes over the US Midwest

    Science.gov (United States)

    Voisin, N.; Liu, L.; Hejazi, M.; Tesfa, T.; Li, H.; Huang, M.; Liu, Y.; Leung, L. R.

    2013-11-01

    An integrated model is being developed to advance our understanding of the interactions between human activities, the terrestrial system and the water cycle, and to evaluate how system interactions will be affected by a changing climate at the regional scale. As a first step towards that goal, a global integrated assessment model, which includes a water-demand model driven by socioeconomics at regional and global scales, is coupled in a one-way fashion with a land surface hydrology-routing-water resources management model. To reconcile the scale differences between the models, a spatial and temporal disaggregation approach is developed to downscale the annual regional water demand simulations into a daily time step and subbasin representation. The model demonstrates a reasonable ability to represent the historical flow regulation and water supply over the US Midwest (Missouri, Upper Mississippi, and Ohio river basins). Implications for future flow regulation, water supply, and supply deficit are investigated using climate change projections with the B1 and A2 emission scenarios, which affect both natural flow and water demand. Although natural flow is projected to increase under climate change in both the B1 and A2 scenarios, there is larger uncertainty in the changes in regulated flow. Over the Ohio and Upper Mississippi river basins, changes in flow regulation are driven by the change in natural flow due to the limited storage capacity. However, changes in both flow and demand have effects on the Missouri River Basin summer regulated flow. Changes in demand are driven by socioeconomic factors, energy and food demands, and global markets and prices, with rainfed crop demand handled directly by the land surface modeling component. Even though most of the changes in supply deficit (unmet demand) and the actual supply (met demand) are driven primarily by the change in natural flow over the entire region, the integrated framework shows that supply deficit over the Missouri River
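
    A minimal sketch of the temporal part of such a disaggregation step (hypothetical seasonal profile and numbers, not the actual model's algorithm): an annual regional demand total is split into a daily series using normalized monthly weights, so the daily values sum back to the annual total.

      import numpy as np

      def disaggregate_annual_demand(annual_total, monthly_weights, days_in_month):
          # Normalize the seasonal profile, split the annual total by month,
          # then spread each monthly amount evenly over its days.
          w = np.asarray(monthly_weights, dtype=float)
          w /= w.sum()
          monthly = annual_total * w
          return np.concatenate([np.full(d, m / d)
                                 for m, d in zip(monthly, days_in_month)])

      days = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
      weights = [2, 2, 3, 5, 8, 12, 16, 16, 12, 8, 4, 2]   # assumed irrigation-driven profile
      daily = disaggregate_annual_demand(35.0, weights, days)   # e.g. 35 km^3 per year
      assert abs(daily.sum() - 35.0) < 1e-9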

  4. Malaysia's Experiences in Analyzing the Energy Policy and Strategies to Promote Sustainable Development using IAEA Tools

    International Nuclear Information System (INIS)

    Fairuz Suzana Mohd Chachuli; Faisal Izwan Abdul Rashid; Muhammed Zolfakar Zolkaffly; Siti Syarina Mat Sali; Noriah Jamal

    2016-01-01

    Full text: Malaysia has long experience in using the IAEA tools for energy planning and analysis, dating back to the 1980s. However, due to renewed interest in a nuclear power programme, Malaysia has again started developing our national capabilities in using the IAEA tools through our national project MAL4009, entitled Building Capacities in Nuclear Power Programme Planning. Under this project, Malaysia has successfully trained our researchers from various agencies through participation in national workshops and the development of case studies using the IAEA tools, particularly the Model for Analysis of Energy Demand (MAED), the Model of Energy Supply Strategy Alternatives and their General Environmental Impacts (MESSAGE), the Wien Automatic System Planning Package (WASP), the Model for Financial Analysis of Electric Sector Expansion Plans (FINPLAN), the Simplified Approach for Estimating Impacts of Electricity Generation (SIMPACTS) and the Indicators for Sustainable Energy Development (ISED). Through this project, Malaysia has developed various case studies to evaluate the competitiveness of nuclear power plants in comparison with non-nuclear energy technologies such as coal, natural gas, hydro and renewable energy. The IAEA energy planning tools have assisted Malaysia in assessing our energy situation and evaluating alternative energy strategies that take into account the techno-economic and environmental aspects of various energy options in relation to energy affordability, energy security, and environmental and climate change impacts in the context of sustainable development. In this regard, Malaysia, as a newcomer country wishing to embark on a nuclear power programme, has shown our interest in conducting a Nuclear Energy System Assessment (NESA) to consider possible future nuclear systems in a holistic and comprehensive manner and to determine whether or not this technology would meet our country's sustainable development objectives. (author)

  5. A full quantum analysis of the Stern–Gerlach experiment using the evolution operator method: analyzing current issues in teaching quantum mechanics

    International Nuclear Information System (INIS)

    Benítez Rodríguez, E; Aguilar, L M Arévalo; Martínez, E Piceno

    2017-01-01

    To the quantum mechanics specialists community it is a well-known fact that the famous original Stern–Gerlach experiment (SGE) produces entanglement between the external degrees of freedom (position) and the internal degree of freedom (spin) of silver atoms. Despite this fact, almost all textbooks on quantum mechanics explain this experiment using a semiclassical approach, where the external degrees of freedom are considered classical variables, the internal degree is treated as a quantum variable, and Newton's second law is used to describe the dynamics. In the literature there are some works that analyze this experiment in its full quantum mechanical form. However, astonishingly, to the best of our knowledge the original experiment, where the initial states of the spin degree of freedom are randomly oriented coming from the oven, has not been analyzed yet in the available textbooks using the Schrödinger equation (to the best of our knowledge there is only one paper that treats this case: Hsu et al (2011 Phys. Rev. A 83 012109)). Therefore, in this contribution we use the time-evolution operator to give a full quantum-mechanical analysis of the SGE when the initial state of the internal degree of freedom is completely random, i.e. when it is a statistical mixture. Additionally, as the SGE and the development of quantum mechanics are heavily intermingled, we analyze some features and drawbacks in the current teaching of quantum mechanics. We focus on textbooks that use the SGE as a starting point, based on the fact that most physicists do not use results from physics education research, and comment on traditional pedagogical attitudes in the physics community. (paper)
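
    For orientation, a compressed LaTeX sketch of what the evolution-operator treatment of a fully mixed (unpolarized) input looks like for the idealized SGE; the symbols (mu, B_0, b, sigma_z) and the neglect of the free-evolution term are simplifying assumptions, not the paper's notation.

      % Idealized SGE Hamiltonian with a field gradient b along z:
      H = \mu\,(B_0 + b\hat{z})\,\sigma_z , \qquad U(t) = e^{-iHt/\hbar} .
      % Acting on an unpolarized spin times a spatial wave packet,
      \rho(0) = \tfrac{1}{2}\,\mathbb{1}\otimes|\psi\rangle\langle\psi| ,
      \rho(t) = U(t)\,\rho(0)\,U^{\dagger}(t)
              = \tfrac{1}{2}\,|{\uparrow}\rangle\langle{\uparrow}|\otimes|\psi_{-}\rangle\langle\psi_{-}|
              + \tfrac{1}{2}\,|{\downarrow}\rangle\langle{\downarrow}|\otimes|\psi_{+}\rangle\langle\psi_{+}| ,
      % where |\psi_{\mp}\rangle is |\psi\rangle boosted by momentum \mp\mu b t along z:
      % the spin-position entanglement and the two separating packets that the
      % semiclassical textbook treatment reproduces with Newton's second law.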

  6. A full quantum analysis of the Stern-Gerlach experiment using the evolution operator method: analyzing current issues in teaching quantum mechanics

    Science.gov (United States)

    Benítez Rodríguez, E.; Arévalo Aguilar, L. M.; Piceno Martínez, E.

    2017-03-01

    To the quantum mechanics specialists community it is a well-known fact that the famous original Stern-Gerlach experiment (SGE) produces entanglement between the external degrees of freedom (position) and the internal degree of freedom (spin) of silver atoms. Despite this fact, almost all textbooks on quantum mechanics explain this experiment using a semiclassical approach, where the external degrees of freedom are considered classical variables, the internal degree is treated as a quantum variable, and Newton's second law is used to describe the dynamics. In the literature there are some works that analyze this experiment in its full quantum mechanical form. However, astonishingly, to the best of our knowledge the original experiment, where the initial states of the spin degree of freedom are randomly oriented coming from the oven, has not been analyzed yet in the available textbooks using the Schrödinger equation (to the best of our knowledge there is only one paper that treats this case: Hsu et al (2011 Phys. Rev. A 83 012109)). Therefore, in this contribution we use the time-evolution operator to give a full quantum-mechanical analysis of the SGE when the initial state of the internal degree of freedom is completely random, i.e. when it is a statistical mixture. Additionally, as the SGE and the development of quantum mechanics are heavily intermingled, we analyze some features and drawbacks in the current teaching of quantum mechanics. We focus on textbooks that use the SGE as a starting point, based on the fact that most physicists do not use results from physics education research, and comment on traditional pedagogical attitudes in the physics community.

  7. Response in electrostatic analyzers due to backscattered electrons: case study analysis with the Juno Jovian Auroral Distribution Experiment-Electron instrument.

    Science.gov (United States)

    Clark, G; Allegrini, F; Randol, B M; McComas, D J; Louarn, P

    2013-10-01

    In this study, we introduce a model to characterize electron scattering in an electrostatic analyzer. We show that electrons between 0.5 and 30 keV scatter from internal surfaces to produce a response up to ~20% of the ideal, unscattered response. We compare our model results to laboratory data from the Jovian Auroral Distribution Experiment-Electron sensor onboard the NASA Juno mission. Our model reproduces the measured energy-angle response of the instrument well. Understanding and quantifying this scattering process is beneficial to the analysis of scientific data as well as future instrument optimization.

  8. 'One Way to Holland'

    DEFF Research Database (Denmark)

    Marselis, Randi; Schütze, Laura Maria

    2013-01-01

    museums, but is also relevant to ongoing collection practices. An important theme in relation to source communities is ownership and repatriation of cultural objects. Furthermore, working with source communities implies a two-way information process through which groups are given access to memory...... materials and the expertise of museum staff but are at the same time recognized as able to contribute with valuable perspectives on their own culture (Peers & Brown, 2003, p. 1). This chapter examines how the Tropenmuseum in the Netherlands, one of Europe’s largest ethnographic museums, has used Facebook...

  9. ONE WAY ANOVA RANDOMIZED COMPLETE BLOCKS

    African Journals Online (AJOL)

    ******

    2012-04-24

    Apr 24, 2012 ... volume, 350 L, which was divided by a plastic sieve connected to an iron frame. Each tank was supplied with fresh water at a rate of 0.5 L/min with supplemental aeration. Fish were fed twice daily with a pellet diet (28% protein) to satiation six days a week, and weighed biweekly for 63 days. Injection of foreign ...
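
    To make the design named in the title concrete, here is a minimal sketch of a one-way ANOVA with randomized complete blocks in Python; the data, column names and the use of statsmodels are illustrative assumptions, not taken from the article. The treatment effect is tested after removing block (e.g., tank) differences.

      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      # Toy data: 3 treatments, each observed once in each of 4 blocks.
      df = pd.DataFrame({
          "gain":  [12.1, 13.4, 15.0, 11.8, 13.9, 14.6,
                    12.5, 13.1, 15.3, 11.9, 13.6, 14.8],
          "treat": ["A", "B", "C"] * 4,
          "block": sorted(["1", "2", "3", "4"] * 3),
      })

      # Randomized complete block model: response ~ treatment + block (no interaction).
      model = ols("gain ~ C(treat) + C(block)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))   # F-test for the treatment effect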

  10. Neutral Particle Analyzer Vertically Scanning Measurements of MHD-induced Energetic Ion Redistribution or Loss in the National Spherical Torus Experiment

    International Nuclear Information System (INIS)

    Medley, S.S.; Andre, R.; Bell, R.E.; Darrow, D.S.; Domier, C.W.; Fredrickson, E.D.; Gorelenkov, N.N.; Kaye, S.M.; LeBlanc, B.P.; Lee, K.C.; Levinton, F.M.; Liu, D.; Luhmann, N.C. Jr.; Menard, J.E.; Park, H.; Stutman, D.; Roquemore, A.L.; Tritz, K.; Yuh, H

    2007-01-01

    Observations of magneto-hydro-dynamic (MHD) induced redistribution or loss of energetic ions measured using the vertically scanning capability of the Neutral Particle Analyzer diagnostic on the National Spherical Torus Experiment (NSTX) are presented along with TRANSP and ORBIT code analysis of the results. Although redistribution or loss of energetic ions due to bursting fishbone-like and low-frequency (f ∼ 10 kHz) kink-type MHD activity has been reported previously, the primary goal of this work is to study redistribution or loss due to continuous Alfvénic (f ∼ 20-150 kHz) modes, a topic that heretofore has not been investigated in detail for NSTX plasmas. Initial indications are that the former drives energetic ion loss whereas the continuous Alfvénic modes only cause redistribution and the energetic ions remain confined.

  11. Neutral Particle Analyzer Vertically Scanning Measurements of MHD-induced Energetic Ion Redistribution or Loss in the National Spherical Torus Experiment

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley, R. Andre, R.E. Bell, D.S. Darrow, C.W. Domier, E.D. Fredrickson, N.N. Gorelenkov, S.M. Kaye, B.P. LeBlanc, K.C. Lee, F.M. Levinton, D. Liu, N.C. Luhmann, Jr., J.E. Menard, H. Park, D. Stutman, A.L. Roquemore, K. Tritz, H. Yuh and the NSTX Team

    2007-11-15

    Observations of magneto-hydro-dynamic (MHD) induced redistribution or loss of energetic ions measured using the vertically scanning capability of the Neutral Particle Analyzer diagnostic on the National Spherical Torus Experiment (NSTX) are presented along with TRANSP and ORBIT code analysis of the results. Although redistribution or loss of energetic ions due to bursting fishbone-like and low-frequency (f ~ 10 kHz) kink-type MHD activity has been reported previously, the primary goal of this work is to study redistribution or loss due to continuous Alfvénic (f ~ 20 – 150 kHz) modes, a topic that heretofore has not been investigated in detail for NSTX plasmas. Initial indications are that the former drives energetic ion loss whereas the continuous Alfvénic modes only cause redistribution and the energetic ions remain confined.

  12. Optimization of thermochemolysis analysis conditions for the in situ detection of organic compounds in Martian soil with the Mars Organic Molecule Analyzer (MOMA) experiment

    Science.gov (United States)

    Morisson, Marietta; Buch, Arnaud; Szopa, Cyril; Raulin, François; Stambouli, Moncef

    2017-04-01

    The Martian surface is exposed to harsh radiative and oxidative conditions which are destructive to organic molecules. That is why the future ExoMars rover will examine the molecular composition of samples acquired from depths down to two meters below the Martian surface, where organics may have been protected from radiative and oxidative degradation. The samples will then be analyzed by the Pyrolysis-Gas Chromatography-Mass Spectrometry (Pyr-GC-MS) operational mode of the Mars Organic Molecule Analyzer (MOMA) instrument. To prevent thermal alteration of organic molecules during pyrolysis, thermochemolysis with tetramethylammonium hydroxide (TMAH) will extract the organics from the mineral matrix and methylate the polar functional groups, allowing the volatilization of molecules at lower temperatures and protecting the most labile chemical groups from thermal degradation. This study has been carried out on a Martian regolith analogue (JSC-Mars-1) with a high organic content, with the aim of optimizing the thermochemolysis temperature within operating conditions similar to those of the MOMA experiment. We also performed Pyrolysis-GC-MS analysis for comparison. The results show that, unlike pyrolysis alone, which mainly produces aromatics (i.e., thermally altered molecules), thermochemolysis allows the extraction and identification of numerous organic molecules of astrobiological interest. They also show that the main compounds start to be detectable at low thermochemolysis temperatures ranging from 400°C to 600°C. However, we noticed that as the temperature increases, the chromatograms become increasingly saturated with thermally evolved molecules, leading to many coelutions and making identification difficult.

  13. Development of a Sample Processing System (SPS) for the in situ search of organic compounds on Mars : application to the Mars Organic Molecule Analyzer (MOMA) experiment

    Science.gov (United States)

    Buch, A.; Sternberg, R.; Garnier, C.; Fressinet, C.; Szopa, C.; El Bekri, J.; Coll, P.; Rodier, C.; Raulin, F.; Goesmann, F.

    2008-09-01

    The search for past or present life signs is one of the primary goals of the future Mars exploratory missions. With this aim, the Mars Organic Molecule Analyzer (MOMA) module of the upcoming ExoMars 2013 European space mission is designed for the in situ analysis, in the Martian soil, of organic molecules of exobiological interest such as amino acids, carboxylic acids, nucleobases or polycyclic aromatic hydrocarbons (PAHs). In the framework of the MOMA experiment, we have been developing a Sample Processing System (SPS) compatible with gas chromatography (GC) analysis. The main goal of the SPS is to allow the extraction and gas chromatographic separation of refractory organic compounds from a solid matrix at trace levels under space-compatible operating conditions. The SPS is a mini-reactor, containing the solid sample (~500 mg), able to increase (or decrease) the internal temperature from 20 to 500 °C within 13 s. The extraction step is therefore performed by thermodesorption, the best extraction yield being obtained at 300°C for 10 to 20 min. It should be noted that the temperature could be increased up to 500°C without a significant loss of efficiency if the heating run time is kept below 3 min. After the thermodesorption, the chemical derivatization of the extracted compounds is performed directly on the soil with a mixture of MTBSTFA and DMF [Buch et al.]. By decreasing the polarity of the target molecules, this step allows their volatilization at a temperature below 250°C without any chemical degradation. Once derivatized, the targeted volatile molecules are transferred through a heated transfer line into the gas chromatograph, which is coupled with a mass spectrometer for detection. The SPS is a "one step/one pot" sample preparation system which should allow the MOMA experiment to detect the refractory molecules absorbed in the Martian soil at a detection limit below the ppb level. A. Buch, R. Sternberg, C. Szopa, C. Freissinet, C. Garnier, J. El Bekri

  14. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, several CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse-peak analyzer memory, which contains the 32 K analyzer memory and eight A/D converters

  15. Effect of time walk in the use of single channel analyzer/discriminator for saturated pulses in the 4πβ–γ coincidence experiments

    International Nuclear Information System (INIS)

    Kawada, Yasushi; Yunoki, Akira; Yamada, Takahiro; Hino, Yoshio

    2016-01-01

    Using the TAC technique, the timing properties of a 4πβ–γ coincidence counting system were experimentally studied with an emphasis on saturated pulses. Experiments were performed for several discriminators (integral mode of TSCA), each with a different kind of timing technique. Timing spectra were measured at various voltages applied to the 4π proportional detector, covering the entire region of the plateau. Most of the timing discriminators show good timing properties while the pulses remain in the linear region, but deteriorate suddenly once the pulses are saturated, and in some types of timing discriminator the timing spectra broaden seriously, up to a few μs. To overcome this problem, two techniques were proposed. - Highlights: • Timing properties of several kinds of SCA/discriminators were studied, including trailing-edge CFT. • The study focused on saturated pulses, using a 4πβ–γ coincidence counting system and TAC. • The validity of two novel techniques to overcome this problem was shown.

  16. Solar wind interaction with Mars' upper atmosphere: Results from 3-D studies using one-way coupling between the Multi-fluid MHD, the M-GITM and the AMPS models

    Science.gov (United States)

    Dong, C.; Bougher, S. W.; Ma, Y.; Toth, G.; Lee, Y.; Nagy, A. F.; Tenishev, V.; Pawlowski, D. J.; Meng, X.; Combi, M. R.

    2013-12-01

    The study of the solar wind interaction with Mars' upper atmosphere/ionosphere has triggered a great deal of interest in recent years. Among the large number of topics in this research area, the investigation of ion escape fluxes has become increasingly important due to its potential impact on the long-term evolution of Mars' atmosphere (e.g., loss of water) over its history. In the present work, we adopt the 3-D Mars cold neutral atmosphere profiles (0-300 km) from the newly developed and validated Mars Global Ionosphere Thermosphere Model (M-GITM) and the 3-D hot oxygen profiles (100 km to 5 RM) from the exosphere Monte Carlo model Adaptive Mesh Particle Simulator (AMPS). We feed these 3-D model output fields into the 3-D BATS-R-US Mars multi-fluid MHD model (100 km to 20 RM) that can better simulate the interplay between the Mars upper atmosphere and the solar wind by considering the dynamics of individual ion species. The multi-fluid model solves separate continuity, momentum and energy equations for each ion species (H+, O+, O2+, CO2+). The M-GITM model together with the AMPS exosphere model take into account the effects of solar cycle and seasonal variations on both cold and hot neutral atmospheres, allowing us to investigate the corresponding effects on Mars upper atmosphere ion escape by using a one-way coupling approach, i.e., both the M-GITM and AMPS model outputs are used as inputs for the multi-fluid model, and M-GITM is used as input into the AMPS exosphere model. The calculations are carried out for selected cases with different nominal solar wind, solar cycle and crustal field orientation conditions. This work has the potential to provide predictions of ion escape rates for comparison to future data to be returned by the MAVEN primary mission (2014-2016) and thereby improve our understanding of present-day escape processes. Acknowledgments: The work presented here was supported by NASA grants NNH10CC04C, NNX09AL26G, NSF grant ATM-0535811.

  17. Design of modern experiments

    International Nuclear Information System (INIS)

    Park, Sung Hweon

    1984-03-01

    This book, written for researchers and engineers, focuses on the practical design of experiments. It covers the concepts of experimental design, basic statistical theory, one-way designs, two-way layouts without repetition, two-way layouts with repetition, partition, correlation and regression analysis, Latin squares, factorial designs, designs based on tables of orthogonal arrays, response surface designs, designs of experiments on compounds, EVOP, and the Taguchi method.
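
    As a concrete companion to the one-way design-of-experiments topic (not an example from the book), a minimal one-way fixed-effects ANOVA in Python with invented data:

      from scipy import stats

      # Responses from three treatment groups (invented numbers).
      g1 = [23.1, 24.5, 22.8, 25.0, 23.9]
      g2 = [26.2, 27.1, 25.8, 26.9, 27.5]
      g3 = [24.0, 23.5, 24.8, 25.1, 23.7]

      f_stat, p_value = stats.f_oneway(g1, g2, g3)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
      # A small p-value indicates that at least one group mean differs.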

  18. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform realtime closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content in relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
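
    A rough sketch of the data-content approach to slip detection (using hard decisions and a plain dot-product correlation; the SDA itself works on soft decisions with power or modified Massey correlation): correlate the received bits against the known transmitted pattern at several relative offsets and take the peak. Names and the simulated slip are illustrative.

      import numpy as np

      def detect_slip(tx_bits, rx_bits, max_slip=4):
          # Map bits to +/-1 so the correct alignment gives a large positive score.
          tx = 2 * np.asarray(tx_bits, dtype=int) - 1
          rx = 2 * np.asarray(rx_bits, dtype=int) - 1
          best_offset, best_score = 0, -np.inf
          for k in range(-max_slip, max_slip + 1):
              if k >= 0:
                  score = np.dot(tx[k:], rx[:len(rx) - k])
              else:
                  score = np.dot(tx[:k], rx[-k:])
              if score > best_score:
                  best_offset, best_score = k, score
          return best_offset        # non-zero offset => the receiver slipped bits

      rng = np.random.default_rng(0)
      tx = rng.integers(0, 2, 256)
      rx = np.roll(tx, 2)           # simulate a 2-bit slip
      print(detect_slip(tx, rx))    # -2: rx is delayed by two bits relative to tx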

  19. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  20. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but one must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  1. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...
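
    A digital toy version (the paper's analyzers are analog and continuous-time) of one of the coincidence events mentioned, the upward crossing of a threshold, counted per unit time; the test signal and names are illustrative.

      import numpy as np

      def upward_crossing_rate(x, threshold, fs):
          # Count samples that are below the threshold while the next sample
          # is at or above it, then divide by the record duration in seconds.
          crossings = np.count_nonzero((x[:-1] < threshold) & (x[1:] >= threshold))
          return crossings / ((len(x) - 1) / fs)

      fs = 1000.0                                       # sampling rate, Hz
      t = np.arange(0.0, 5.0, 1.0 / fs)
      x = np.sin(2.0 * np.pi * 3.0 * t + 0.1)           # 3 Hz test tone
      print(upward_crossing_rate(x, 0.0, fs))           # close to 3 crossings per second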

  2. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  3. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifugal fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with a microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  4. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze, in a discrete-time context and with a finite outcome space, American options, starting with the idea that the discounted price of every tradable asset should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working
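
    A standard discrete-time illustration of that idea (not the note's own construction): on a binomial tree the risk-neutral probability q is chosen so that the discounted stock price is a martingale, and an American put is then priced by backward induction, comparing the continuation value with immediate exercise at every node. All parameter values are invented.

      import numpy as np

      def american_put_binomial(S0, K, r, u, d, n_steps):
          # Risk-neutral probability: makes E[S_{t+1}] / (1 + r) = S_t (martingale).
          q = (1.0 + r - d) / (u - d)
          # Exercise values at maturity over the n_steps + 1 terminal nodes.
          S = S0 * u ** np.arange(n_steps, -1, -1) * d ** np.arange(0, n_steps + 1)
          V = np.maximum(K - S, 0.0)
          # Backward induction: max(immediate exercise, discounted expectation).
          for step in range(n_steps - 1, -1, -1):
              S = S0 * u ** np.arange(step, -1, -1) * d ** np.arange(0, step + 1)
              V = np.maximum(K - S, (q * V[:-1] + (1.0 - q) * V[1:]) / (1.0 + r))
          return V[0]

      print(american_put_binomial(S0=100.0, K=100.0, r=0.01, u=1.1, d=0.9, n_steps=50))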

  5. Digital, One Way, Acoustic Communication in the Ocean

    Science.gov (United States)

    1990-09-01

    phase is vital to the receiver in order to be able to detect the transmitted signal. For incoherent modulation schemes, only the knowledge of the carrier... [Figure: performance curves plotted against signal-to-noise ratio for several parameter values.]

  6. One-Way Quantum Authenticated Secure Communication Using Rotation Operation

    International Nuclear Information System (INIS)

    Tsai Chia-Wei; Wei Toung-Shang; Hwang Tzonelih

    2011-01-01

    This study proposes a theoretical quantum authenticated secure communication (QASC) protocol using Einstein-Podolsky-Rosen (EPR) entangled states, which enables a sender to send a secure as well as authenticated message to a receiver within a single quantum transmission, without requiring classical channels or a certification authority. (general)

  7. Guns and Fear: A One-Way Street?

    Science.gov (United States)

    Hauser, Will; Kleck, Gary

    2013-01-01

    Surveys show that more than one half of gun owners report owning their firearm for self-protection. Although research has examined the effect of fear of crime on gun ownership, the issue of reciprocity and temporal order has been largely ignored. Furthermore, the effect of firearm acquisition and relinquishment on fear has not been evaluated…

  8. Root hair growth: it's a one way street.

    Science.gov (United States)

    Mendrinna, Amelie; Persson, Staffan

    2015-01-01

    Over the last few decades, our understanding of directed cell growth in different organisms has substantially improved. Tip-growing cells in plants elongate rapidly via targeted deposition of cell wall and membrane material at the cell apex, and use turgor pressure as a driving force for expansion. This type of polar growth requires a high degree of coordination between a plethora of cellular and extracellular components and compounds, including calcium dynamics, apoplastic reactive oxygen species and pH, the cytoskeleton, and vesicular trafficking. In this review, we attempt to outline and summarize the factors that control root hair growth and how they work together as a team.

  9. Ants can learn to forage on one-way trails.

    Directory of Open Access Journals (Sweden)

    Pedro Leite Ribeiro

    Full Text Available The trails formed by many ant species between nest and food source are two-way roads on which outgoing and returning workers meet and touch each other all along. The way to get back home, after grasping a food load, is to take the same route on which they have arrived from the nest. In many species such trails are chemically marked by pheromones providing orientation cues for the ants to find their way. Other species rely on their vision and use landmarks as cues. We have developed a method to stop foraging ants from shuttling on two-way trails. The only way to forage is to take two separate roads, as they cannot go back on their steps after arriving at the food or at the nest. The condition qualifies as a problem because all their orientation cues -- chemical, visual or any other -- are disrupted, as all of them cannot but lead the ants back to the route on which they arrived. We have found that workers of the leaf-cutting ant Atta sexdens rubropilosa can solve the problem. They could not only find the alternative way, but also used the unidirectional traffic system to forage effectively. We suggest that their ability is an evolutionary consequence of the need to deal with environmental irregularities that cannot be negotiated by means of excessively stereotyped behavior, and that it is but an example of a widespread phenomenon. We also suggest that our method can be adapted to other species, invertebrate and vertebrate, in the study of orientation, memory, perception, learning and communication.

  10. Delay or anticipatory synchronization in one-way coupled systems ...

    Indian Academy of Sciences (India)

    ...systems with enhanced security. The synchronizing channel carries information about the transmitter dynamics only at intervals of the reset time and hence is not easily susceptible to reconstruction. The message channel, being separate, can be made complex enough by adding a nonlinear combination of the transmitter variables at ...

  11. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images

  12. Charged particle analyzer PLAZMAG

    International Nuclear Information System (INIS)

    Apathy, Istvan; Endroeczy, Gabor; Szemerey, Istvan; Szendroe, Sandor

    1985-01-01

    The scientific task of the charged particle analyzer PLAZMAG, a part of the VEGA space probe, and the physical background of the measurements are described. The sensors of the device face the Sun and comet Halley, measuring the energy and mass spectra of the ion and electron components at energies below 25 keV. The tasks of the individual electronic parts, the design aspects and the modes of operation in different phases of the flight are dealt with. (author)

  13. Fractional channel multichannel analyzer

    Science.gov (United States)

    Brackenbush, L.W.; Anderson, G.A.

    1994-08-23

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board. 9 figs.

  14. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  15. Ring Image Analyzer

    Science.gov (United States)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
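
    A rough sketch of one way such parameters can be estimated (not necessarily this program's algorithm): treat the bright pixels of a binarized image as a point cloud and read the centroid, axis ratio and tilt off its second moments. This works best for filled elliptical blobs and is only an approximation for thin fringes; all names and the synthetic test are illustrative.

      import numpy as np

      def ellipse_params(binary_image):
          # Second-moment (covariance) description of the 'on' pixels.
          ys, xs = np.nonzero(binary_image)
          cx, cy = xs.mean(), ys.mean()
          cov = np.cov(np.vstack([xs - cx, ys - cy]))
          eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
          short, long_ = np.sqrt(eigvals)                 # proportional to semi-axes
          major = eigvecs[:, 1]                           # long-axis direction
          tilt_deg = np.degrees(np.arctan2(major[1], major[0])) % 180.0
          return (cx, cy), short / long_, tilt_deg

      # Synthetic test: a filled ellipse, axis ratio 0.5, tilted 30 degrees.
      yy, xx = np.mgrid[0:200, 0:200]
      th = np.radians(30.0)
      xr = (xx - 100) * np.cos(th) + (yy - 100) * np.sin(th)
      yr = -(xx - 100) * np.sin(th) + (yy - 100) * np.cos(th)
      img = (xr / 60.0) ** 2 + (yr / 30.0) ** 2 <= 1.0
      print(ellipse_params(img))   # centroid near (100, 100), ratio near 0.5, tilt near 30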

  16. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  17. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  18. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer), to analyze pooled DNA data. Results We develop the software, PDA, for the analysis of pooled-DNA data. PDA was originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
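
    A minimal sketch of the sliding-window p-value combination idea mentioned above, here with Fisher's method (PDA's actual statistic, window handling and dependence corrections may differ; the marker p-values are invented and assumed independent):

      import numpy as np
      from scipy import stats

      def sliding_fisher(pvalues, window=3):
          # Fisher's method per window: X = -2 * sum(ln p) ~ chi-square with
          # 2*window degrees of freedom under the null, assuming independent tests.
          p = np.asarray(pvalues, dtype=float)
          out = []
          for start in range(len(p) - window + 1):
              x = -2.0 * np.sum(np.log(p[start:start + window]))
              out.append(stats.chi2.sf(x, df=2 * window))
          return np.array(out)

      # Single-marker p-values along a chromosome (invented); a dip in the
      # combined series flags a candidate region of association.
      single_p = [0.80, 0.45, 0.02, 0.01, 0.03, 0.60, 0.90, 0.70]
      print(sliding_fisher(single_p, window=3))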

  19. Robot task space analyzer

    International Nuclear Information System (INIS)

    Hamel, W.R.; Osborn, J.

    1997-01-01

    Many nuclear projects, such as environmental restoration and waste management challenges, involve radiation or other hazards that will necessitate the use of remote operations that protect human workers from dangerous exposures. Remote work is far more costly to execute than what workers could accomplish directly with conventional tools and practices, because task operations are slow and tedious due to the difficulties of remote manipulation and viewing. Decades of experience within the nuclear remote operations community show that remote tasks may take hundreds of times longer than hands-on work; even with state-of-the-art force-reflecting manipulators and television viewing, remote task execution is five to ten times slower than equivalent direct contact work. Thus the requirement to work remotely is a major cost driver in many projects. Modest improvements in the work efficiency of remote systems can have high payoffs by reducing the completion time of projects. Additional benefits will accrue from improved work quality and enhanced safety

  20. Building and analyzing models from data by stirred tank experiments for investigation of matrix effects caused by inorganic matrices and selection of internal standards in Inductively Coupled Plasma-Atomic Emission Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Grotti, Marco [Dipartimento di Chimica e Chimica Industriale, Via Dodecaneso 31, 16146 Genova (Italy)], E-mail: grotti@chimica.unige.it; Paredes, Eduardo; Maestre, Salvador; Todoli, Jose Luis [Departamento de Quimica Analitica, Nutricion y Bromatologia, Universidad de Alicante, 03080, Alicante (Spain)

    2008-05-15

    Interfering effects caused by inorganic matrices (inorganic acids as well as easily ionized elements) in inductively coupled plasma-atomic emission spectroscopy have been modeled by regression analysis of experimental data obtained using the 'stirred tank method'. The main components of the experimental set-up were a magnetically stirred container and two peristaltic pumps. In this way the matrix composition was gradually and automatically varied, while the analyte concentration remained unchanged throughout the experiment. An inductively coupled plasma spectrometer with multichannel detection based on a charge-coupled device was used to simultaneously measure the emission signal at several wavelengths while the matrix concentration was modified. Up to 50 different concentrations were evaluated in a period of 10 min. Both single interfering species (nitric, hydrochloric and sulphuric acids, sodium and calcium) and different mixtures (aqua regia, sulfonitric mixture, sodium-calcium mixture and sodium-nitric acid mixture) were investigated. The dependence of the emission signal on acid concentration was well fitted by logarithmic models. Conversely, for the easily ionized elements, third-order polynomial models were more suitable for describing the trends. Then, the coefficients of these models were used as 'signatures' of the matrix-related signal variations and analyzed by principal component analysis. Similarities and differences among the emission lines were highlighted and discussed, providing new insight into the interference phenomena, mainly with regard to the combined effect of concomitants. The combination of the huge amount of data obtained by the stirred tank method in a short period of time and the speed of analysis of principal component analysis provided a judicious means for the selection of the optimal internal standard in inductively coupled plasma-atomic emission spectroscopy.
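
    A schematic sketch of the data treatment described (array shapes, values and the exact model forms are invented): fit a logarithmic model to each emission line's signal-versus-acid-concentration trend, then run a PCA on the fitted coefficients so that lines with similar matrix-effect 'signatures' cluster together. For the easily ionized elements a third-order polynomial design matrix would be used instead.

      import numpy as np

      # Invented data: relative signal of 6 emission lines at 50 acid concentrations.
      rng = np.random.default_rng(0)
      conc = np.linspace(0.05, 2.0, 50)                        # mol/L
      signals = np.array([1.0 - a * np.log1p(conc) + 0.01 * rng.standard_normal(conc.size)
                          for a in (0.05, 0.07, 0.20, 0.22, 0.40, 0.42)])

      # Logarithmic model s = b0 + b1 * ln(1 + c), fitted line by line.
      X = np.column_stack([np.ones_like(conc), np.log1p(conc)])
      coeffs, *_ = np.linalg.lstsq(X, signals.T, rcond=None)    # shape (2, n_lines)

      # PCA on the coefficient 'signatures' (one row per emission line).
      centered = coeffs.T - coeffs.T.mean(axis=0)
      _, _, vt = np.linalg.svd(centered, full_matrices=False)
      scores = centered @ vt.T
      print(scores[:, 0])    # lines with similar matrix effects group along PC1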

  1. Transient Enhancement ('Spike-on-Tail') Observed on Neutral-Beam-Injected Energetic Ion Spectra Using the E||B Neutral Particle Analyzer in the National Spherical Torus Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Medley, S. S. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Gorelenkov, N. N. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Bell, R. E. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Fredrickson, E. D. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Gerhardt, S. P. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); LeBlanc, B. P. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Podesta, M. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Roquemore, A. L. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    2010-06-01

    An increase of up to four-fold in the E||B Neutral Particle Analyzer (NPA) charge exchange neutral flux localized at the Neutral Beam (NB) injection full energy is observed in the National Spherical Torus Experiment (NSTX). Termed the High-Energy Feature (HEF), it appears on the NB-injected energetic ion spectrum only in discharges where tearing or kink-type modes (f < 10 kHz) are absent, TAE activity (f ~ 10-150 kHz) is weak (δBrms < 75 mGauss) and CAE/GAE activity (f ~ 400 – 1200 kHz) is robust. The feature exhibits a growth time of ~ 20 - 80 ms and occasionally develops a slowing-down distribution that continues to evolve over periods of hundreds of milliseconds, a time scale long compared with the typical equilibration time of ~tens of milliseconds for the NB-injected particles. The HEF is observed only in H-mode (not L-mode) discharges with injected NB power of 4 MW or greater and in the field pitch range v||/v ~ 0.7 – 0.9; i.e. only for passing (never trapped) energetic ions. The HEF is suppressed by vessel conditioning using lithium deposition at rates ~ 100 mg/shot, a level sufficient to suppress ELM activity. Increases of ~ 10 - 30 % in the measured neutron yield and total stored energy are observed to coincide with the feature, along with broadening of the measured Te(r), Ti(r) and ne(r) profiles. However, TRANSP analysis shows that such increases are driven by plasma profile changes and not the HEF phenomenon itself. Though a definitive mechanism has yet to be developed, the HEF appears to be caused by a form of TAE/CAE wave-particle interaction that distorts the NB fast ion distribution in phase space.

  2. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  3. PULSE HEIGHT ANALYZER

    Science.gov (United States)

    Johnstone, C.W.

    1958-01-21

    An anticoincidence device is described for a pair of adjacent channels of a multi-channel pulse height analyzer for preventing the lower channel from generating a count pulse in response to an input pulse when the input pulse has sufficient magnitude to reach the upper level channel. The anticoincidence circuit comprises a window amplifier, upper and lower level discriminators, and a biased-off amplifier. The output of the window amplifier is coupled to the inputs of the discriminators, the output of the upper level discriminator is connected to the resistance end of a series R-C network, the output of the lower level discriminator is coupled to the capacitance end of the R-C network, and the grid of the biased-off amplifier is coupled to the junction of the R-C network. In operation each discriminator produces a negative pulse output when the input pulse traverses its voltage setting. As a result of the connections to the R-C network, a trigger pulse will be sent to the biased-off amplifier when the incoming pulse level is sufficient to trigger only the lower level discriminator.
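
    A minimal behavioral sketch of the decision the circuit implements, kept deliberately abstract (threshold values are arbitrary): a pulse is counted in the lower channel only when it crosses the lower-level discriminator setting but not the upper-level one.

    def channel_counts(pulse_heights, lower_threshold, upper_threshold):
        """Count pulses per channel with the anticoincidence rule described above."""
        lower_only, upper = 0, 0
        for h in pulse_heights:
            if h >= upper_threshold:
                upper += 1            # upper channel fires; lower channel is inhibited
            elif h >= lower_threshold:
                lower_only += 1       # pulse falls inside the lower window
        return lower_only, upper

    pulses = [0.4, 1.2, 2.7, 1.9, 3.5, 0.9]   # arbitrary pulse amplitudes (V)
    print(channel_counts(pulses, lower_threshold=1.0, upper_threshold=2.5))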

  4. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
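
    MMTAT's computational models and API are not documented in this record, so the sketch below only illustrates the kind of link calculation such a tool automates: free-space path loss and the resulting link margin. All parameter names and values are assumptions, not MMTAT inputs or API calls.

    import math

    def free_space_path_loss_db(distance_km, frequency_ghz):
        # Standard FSPL expression with distance in km and frequency in GHz.
        return 20 * math.log10(distance_km) + 20 * math.log10(frequency_ghz) + 92.45

    def link_margin_db(tx_power_dbw, tx_gain_dbi, rx_gain_dbi,
                       distance_km, frequency_ghz, required_power_dbw):
        received = (tx_power_dbw + tx_gain_dbi + rx_gain_dbi
                    - free_space_path_loss_db(distance_km, frequency_ghz))
        return received - required_power_dbw

    # Illustrative X-band deep-space downlink to a large ground antenna.
    margin = link_margin_db(tx_power_dbw=13, tx_gain_dbi=48, rx_gain_dbi=68,
                            distance_km=2.0e8, frequency_ghz=8.4,
                            required_power_dbw=-160)
    print(f"link margin: {margin:.1f} dB")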

  5. Experience

    Directory of Open Access Journals (Sweden)

    Michele Grieco

    2015-01-01

    Full Text Available Abdominoplasty is one of the most popular body-contouring procedures. It is associated with a significant number of complications: the most common ones are seroma, hematoma, infection, wound-healing problems, and skin flap necrosis. From January 2012 to December 2014, 25 patients (18 women and 7 men; mean age: 51 years) underwent abdominoplastic surgery at the Plastic Surgery Section, Department of Surgical Sciences, University of Parma, Italy. All patients reported a weight loss between 15 kg and 47 kg. All of the 25 patients were included in the study; minor and major complications were seen in 17 (68%) and 8 (32%) patients, respectively. The percentage of complications in our patients was as follows: 9 patients with seroma (36%); 4 patients with wound dehiscence with delayed wound healing (16%); 3 cases with hematoma (12%); 2 patients with postoperative bleeding (8%); 1 patient (4%) with an umbilical necrosis; 1 patient (4%) with a deep vein thrombosis; 3 patients with infected seroma (12%); and 2 patients with wound infection (8%). There were no cases of postoperative mortality. The aim of this study is to analyze our complications in postbariatric abdominoplasty.

  6. Utilization of the BARC critical facility for ADS related experiments

    Indian Academy of Sciences (India)

    It has been proposed to utilize the CF for ADSS experiments also, with some appropriate neutron source at the center. Recently BARC has proposed a one-way coupled core configuration for ADSS application. This coupled core consists.

  7. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  8. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The developed system had a severe performance problem: the resulting spectrum lacked smoothness and was very noisy, full of spikes and surges, so it was impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance.
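
    The enhancement actually implemented is not described above; purely as an illustration, one common way to suppress isolated surges and statistical noise in a pulse-height spectrum is a median filter followed by Savitzky-Golay smoothing (synthetic spectrum below, arbitrary parameters).

    import numpy as np
    from scipy.signal import medfilt, savgol_filter

    rng = np.random.default_rng(0)
    channels = np.arange(1024)
    clean = 1000 * np.exp(-0.5 * ((channels - 400) / 30.0) ** 2) + 50   # synthetic peak + background
    noisy = rng.poisson(clean).astype(float)
    noisy[rng.choice(channels.size, 20)] += 5000                        # artificial spikes/surges

    despiked = medfilt(noisy, kernel_size=5)                            # removes isolated spikes
    smoothed = savgol_filter(despiked, window_length=21, polyorder=3)   # smooths statistical noise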

  9. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  10. Comparison of fiber length analyzers

    Science.gov (United States)

    Don Guay; Nancy Ross Sutherland; Walter Rantanen; Nicole Malandri; Aimee Stephens; Kathleen Mattingly; Matt Schneider

    2005-01-01

    In recent years, several new fiber length analyzers have been developed and brought to market. The new instruments provide faster measurements and the capability of both laboratory and on-line analysis. Do the various fiber analyzers provide the same length, coarseness, width, and fines measurements for a given fiber sample? This paper provides a comparison of...

  11. Apitherapy: Usage and Experience in German Beekeepers

    Directory of Open Access Journals (Sweden)

    Markus Hellner

    2008-01-01

    Full Text Available This study aimed to investigate the practice of apitherapy - using bee products such as honey, pollen, propolis, royal jelly and bee venom to prevent or treat illness and promote healing - among German beekeepers and to evaluate their experiences with these therapies. A questionnaire incorporating two instruments on beekeepers’ physical and mental health and working practice was included in three German beekeeping journals and readers were asked to complete it. The instrument included questions on the use of apitherapy. Simple descriptive methods, bivariate correlation, cross-tabulation and one-way ANOVA were used to analyze the data. Altogether 1059 completed questionnaires were received. The beekeepers reported the most effective and favorable therapeutic effects with honey, followed by propolis, pollen and royal jelly. The factors associated with successful experiences were: age, number of hives tended, health consciousness, positive experiences with one product and self-administration of treatment. Beekeepers were asked for which condition they would employ propolis and pollen. They reported that they used propolis most frequently to treat colds, wounds and burns, sore throats, gum disorders and also as a general prophylactic, while pollen was most commonly used as a general prophylactic and, less frequently, in treating prostate diseases. No adverse experiences were reported. The potential benefit of bee products is supported by the positive experiences of a large group of beekeepers who use some of these products to treat a wide range of conditions. The indications and treatments given here may be important in selecting bee products and designing future trials.
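
    For readers unfamiliar with the one-way ANOVA used in the survey analysis, the sketch below runs one on made-up effectiveness ratings grouped by age band; the numbers are not from the study.

    from scipy import stats

    # Hypothetical effectiveness ratings (1-5) for three age groups of respondents.
    young  = [4.1, 3.8, 4.5, 4.0, 3.9]
    middle = [4.4, 4.6, 4.2, 4.8, 4.5]
    older  = [4.7, 4.9, 4.6, 5.0, 4.8]

    f_stat, p_value = stats.f_oneway(young, middle, older)   # one-way ANOVA across groups
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")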

  12. Nuclear fuel microsphere gamma analyzer

    International Nuclear Information System (INIS)

    Valentine, K.H.; Long, E.L. Jr.; Willey, M.G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample in one section; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties. 4 claims, 3 figures

  13. Market study: Whole blood analyzer

    Science.gov (United States)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  14. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  15. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  16. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  17. Compact analyzer: an interactive simulator

    International Nuclear Information System (INIS)

    Ipakchi, A.; Khadem, M.; Colley, R.W.

    1985-01-01

    Compact Analyzer is a computer system that combines dynamic simulation models with interactive and color graphics user interface functions to provide a cost-effective simulator for dynamic analysis and evaluation of power plant operation, with engineering and training applications. Most dynamic simulation packages such as RETRAN and TRAC are designed for a batch-mode operation. With advancements in computer technology and man/machine interface capabilities, it is possible to integrate such codes with interactive and graphic functions into advanced simulators. The US Nuclear Regulatory Commission has sponsored development of plant analyzers with such characteristics. The Compact Analyzer is an Electric Power Research Institute (EPRI)-sponsored project, which currently utilizes the EPRI modular modeling system (MMS) for process simulation, and uses an adaptable color graphic package for dynamic display of the simulation results

  18. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  19. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many applications and is a very significant and useful tool, but it can be dangerous for living beings exposed to uncontrolled doses. Because it cannot be perceived by any of the human senses, radiation detectors and additional devices are required to detect, quantify and classify it. A multichannel analyzer separates the different pulse heights generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were written in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA; for the internal control of the multichannel analyzer, an application was written for the ARM processor in C. In the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained were displayed graphically as a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable with those from commercial multichannel analyzers. (Author)
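
    The core operation the FPGA logic implements can be modeled in a few lines of software: map each digitized pulse height to a channel and accumulate a histogram. ADC resolution, channel count and input data below are assumptions for illustration, not the parameters of the reported design.

    import numpy as np

    ADC_BITS, N_CHANNELS = 12, 1024
    SHIFT = ADC_BITS - int(np.log2(N_CHANNELS))      # drop low-order bits of the pulse height

    def accumulate(spectrum, adc_samples):
        for sample in adc_samples:
            spectrum[sample >> SHIFT] += 1           # channel number from pulse height
        return spectrum

    spectrum = np.zeros(N_CHANNELS, dtype=np.uint32)
    samples = np.random.default_rng(1).integers(0, 2 ** ADC_BITS, size=10_000)
    accumulate(spectrum, samples)                    # 'spectrum' is the accumulated histogram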

  20. Analyzing Generation Y Workforce Motivation

    Science.gov (United States)

    2011-03-01

    Analyzing Generation Y Workforce Motivation, by Ian N. Barford and Patrick T. Hester. Defense AT&L: Special Edition, March-April 2011. The article examines workforce motivation across generational cohorts, including Generation X (born between 1965 and 1979) and Generation Y (born between 1980 and 2000).

  1. Historiography taking issue : Analyzing an experiment with heroin abusers

    NARCIS (Netherlands)

    Dehue, T.

    2004-01-01

    This article discusses the predicament of historians becoming part of the history they are investigating and illustrates the issue in a particular case. The case is that of the randomized controlled trial (RCT)-more specifically, its use for testing the effects of providing heroin to severe heroin

  2. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming computer language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location which satisfy user chosen ''intruder detection'' probability and elapsed time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straight-forward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
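
    The Prolog knowledge base itself is not reproduced in the record; the sketch below only illustrates the idea behind the third function ("weakest paths") in Python: enumerate simple paths between two locations, accumulate per-edge detection probability and traversal time, and keep the paths that stay below user-chosen thresholds. The graph and all numbers are invented.

    def weak_paths(graph, start, goal, max_p_detect, max_time, path=None, p_miss=1.0, t=0.0):
        """Yield (path, detection probability, time) for paths meeting both criteria."""
        path = (path or []) + [start]
        if start == goal:
            p_detect = 1.0 - p_miss
            if p_detect <= max_p_detect and t <= max_time:
                yield path, p_detect, t
            return
        for nxt, (p_edge, t_edge) in graph.get(start, {}).items():
            if nxt not in path:          # simple paths only
                yield from weak_paths(graph, nxt, goal, max_p_detect, max_time,
                                      path, p_miss * (1.0 - p_edge), t + t_edge)

    # neighbor -> (per-edge detection probability, traversal time in seconds)
    graph = {
        "fence":  {"yard": (0.3, 20)},
        "yard":   {"door": (0.5, 30), "window": (0.2, 45)},
        "door":   {"vault": (0.9, 15)},
        "window": {"vault": (0.6, 25)},
    }
    for route, p_detect, t in weak_paths(graph, "fence", "vault", max_p_detect=0.8, max_time=120):
        print(route, round(p_detect, 2), t)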

  3. Historical Thinking: Analyzing Student and Teacher Ability to Analyze Sources

    OpenAIRE

    Cowgill II, Daniel Armond; Waring, Scott M.

    2017-01-01

    The purpose of this study was to partially replicate the Historical Problem Solving: A Study of the Cognitive Process Using Historical Evidence study conducted by Sam Wineburg in 1991. The Historical Problem Solving study conducted by Wineburg (1991) sought to compare the ability of historians and top level students, as they analyzed pictures and written documents centered on the Battle of Lexington Green. In this version of the study, rather than compare historians and students, we sought ...

  4. Pollution Analyzing and Monitoring Instruments.

    Science.gov (United States)

    1972

    Compiled in this book is basic, technical information useful in a systems approach to pollution control. Descriptions and specifications are given of what is available in ready made, on-the-line commercial equipment for sampling, monitoring, measuring and continuously analyzing the multitudinous types of pollutants found in the air, water, soil,…

  5. Methods of analyzing crude oil

    Energy Technology Data Exchange (ETDEWEB)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  6. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  7. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  8. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities, and operates from the acquisition of the 4 head position signals of a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. This system includes analog processing of position signals from the camera, digitization and the subsequent processing of the energy signal in a multichannel analyzer, sending data to a computer via a standard USB port and processing of data in a personal computer to obtain the final histogram. The circuits are composed of an analog processing board and a universal kit with microcontroller and programmable gate array. (Author)
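
    The analog chain is not detailed in the record, but the standard Anger arithmetic shows how an energy signal can be derived from four head position signals: the energy is their sum and the interaction position comes from their normalized differences. This is an illustration of the principle only, not the analyzer's actual circuitry.

    def anger(x_plus, x_minus, y_plus, y_minus):
        energy = x_plus + x_minus + y_plus + y_minus       # proportional to deposited energy
        x = (x_plus - x_minus) / energy                    # normalized interaction coordinates
        y = (y_plus - y_minus) / energy
        return energy, x, y

    print(anger(2.1, 1.4, 1.8, 1.7))                       # arbitrary signal amplitudes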

  9. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operation Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  10. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many...... new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research...... strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject....

  11. Portable Tandem Mass Spectrometer Analyzer

    Science.gov (United States)

    1991-07-01

    500 ng of caffeine in 1 µL of chloroform were analyzed by GC/MS using negative ions. Also analyzed were barbiturates, extracted from urine, in the 3-5 µg

  12. Remote Laser Diffraction PSD Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable - making this technology far superior to the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.
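
    As a worked illustration of the kind of output such an instrument reports, the snippet below reads D10/D50/D90 off a volume-weighted particle size distribution by interpolating its cumulative curve; bin edges and volume fractions are invented, not measured data.

    import numpy as np

    bin_edges = np.array([0.1, 1, 5, 10, 50, 100, 300, 600])          # microns
    vol_fraction = np.array([0.02, 0.08, 0.15, 0.35, 0.25, 0.10, 0.05])

    cumulative = np.concatenate([[0.0], np.cumsum(vol_fraction)])     # cumulative volume fraction

    def d_percentile(p):
        return float(np.interp(p, cumulative, bin_edges))             # diameter at cumulative p

    print({f"D{int(p * 100)}": round(d_percentile(p), 1) for p in (0.1, 0.5, 0.9)})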

  13. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable--making this technology far superior to the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  14. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.
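
    Since the emitted fluorescence is proportional to the SO2 concentration, a simple zero/span calibration converts detector counts to a mixing ratio. The counts and span value below are invented; the actual 43i-TLE calibration and data handling are done by the instrument and the BNL LabView interface.

    # Two-point (zero gas / span gas) calibration of a fluorescence signal.
    zero_counts, span_counts, span_ppb = 120.0, 5120.0, 100.0
    slope_ppb_per_count = span_ppb / (span_counts - zero_counts)

    def counts_to_ppb(counts):
        return slope_ppb_per_count * (counts - zero_counts)

    print(round(counts_to_ppb(1370.0), 1), "ppb SO2")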

  15. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable - making this technology far superior to the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  16. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  17. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Science.gov (United States)

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  18. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
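
    The paper demonstrates its procedures in R; purely as a language-neutral sketch of the split/analyze/meta-analyze idea, the snippet below splits a large synthetic dataset, estimates a correlation in each split, and pools the Fisher-z transformed estimates with inverse-variance (fixed-effect) weights.

    import numpy as np

    rng = np.random.default_rng(42)
    n_total, r_true = 1_000_000, 0.30
    x = rng.standard_normal(n_total)
    y = r_true * x + np.sqrt(1 - r_true ** 2) * rng.standard_normal(n_total)

    z_vals, weights = [], []
    for xs, ys in zip(np.array_split(x, 100), np.array_split(y, 100)):   # split
        r = np.corrcoef(xs, ys)[0, 1]                                    # analyze each split
        z_vals.append(np.arctanh(r))                                     # Fisher z transform
        weights.append(len(xs) - 3)                                      # 1 / Var(z) = n - 3

    z_pooled = np.average(z_vals, weights=weights)                       # meta-analyze (fixed effect)
    print("pooled r =", round(float(np.tanh(z_pooled)), 4))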

  19. Charged particle mobility refrigerant analyzer

    Science.gov (United States)

    Allman, S.L.; Chunghsuan Chen; Chen, F.C.

    1993-02-02

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.
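
    The quantity being measured reduces to simple arithmetic: mobility is drift velocity divided by the applied field. The cell geometry, voltage and drift time below are assumptions chosen only to illustrate the calculation.

    drift_length_m = 0.05                 # assumed electrode spacing
    applied_voltage_v = 1000.0
    drift_time_s = 2.5e-3                 # assumed arrival time of the charged swarm

    field_v_per_m = applied_voltage_v / drift_length_m
    drift_velocity = drift_length_m / drift_time_s
    mobility = drift_velocity / field_v_per_m       # m^2 V^-1 s^-1
    print(f"mobility = {mobility:.2e} m^2/(V*s)")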

  20. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The 'COMBUSTIMETRO' technology examines the fuel through the performance of an engine: the role of the fuel is to produce energy for the combustion engine, and the energy produced is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same intake of air and fuel and a fixed ignition point. Its operation is monitored by sensors (lambda sensor, RPM and gas analyzer) connected to a processor that performs calculations and records the information, and generates reports and graphs. (author)

  1. A chemical analyzer for charged ultrafine particles

    Science.gov (United States)

    Gonser, S. G.; Held, A.

    2013-09-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of defined masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  2. Historical Thinking: Analyzing Student and Teacher Ability to Analyze Sources

    Directory of Open Access Journals (Sweden)

    Daniel Armond Cowgill II

    2017-05-01

    Full Text Available The purpose of this study was to partially replicate the Historical Problem Solving: A Study of the Cognitive Process Using Historical Evidence study conducted by Sam Wineburg in 1991. The Historical Problem Solving study conducted by Wineburg (1991 sought to compare the ability of historians and top level students, as they analyzed pictures and written documents centered on the Battle of Lexington Green. In this version of the study, rather than compare historians and students, we sought out to compare the analytical skills of teachers and students. The main findings relate to the fact that the participants lacked the ability to engage in the very complex activities associated with historical inquiry and the utilization of primary sources in learning about the past. This lack of ability should be used to improve teacher professional development programs and help them develop the skills needed to not only engage in historical evaluation themselves but to also develop skills that will allow them to instruct students to do the same.

  3. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets to enable remote analysis and determination of chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state-of-the-art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.
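
    The photonic front end is beyond a few lines of code, but the Fourier-transform step common to any FT spectrometer is easy to show: recover the spectrum of a recorded time-domain signal with an FFT. The signal here is simulated (two tones plus noise) with an assumed sample rate.

    import numpy as np
    from scipy.signal import find_peaks

    fs = 4.0e9                                        # assumed sample rate, Hz
    t = np.arange(16384) / fs
    signal = (np.sin(2 * np.pi * 0.9e9 * t) + 0.3 * np.sin(2 * np.pi * 1.4e9 * t)
              + 0.1 * np.random.default_rng(0).standard_normal(t.size))

    spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)

    peaks, _ = find_peaks(spectrum, height=0.1 * spectrum.max())
    print(np.round(freqs[peaks] / 1e9, 2), "GHz")     # recovered tone frequencies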

  4. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of the batteries with respect to an adjacent battery is unrestricted, allowing a reduction in component parts of the assembly and reducing the overall stack length. Additionally, a test jig or chamber for allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used for rapidly converting these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation, and it is not necessary to make rapid measurements as is now done

  5. Thomson parabola ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cobble, James A [Los Alamos National Laboratory; Flippo, Kirk A [Los Alamos National Laboratory; Letzring, Samuel A [Los Alamos National Laboratory; Lopez, Frank E [Los Alamos National Laboratory; Offermann, Dustin T [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Mastrosimone, Dino [UNIV OF ROCHESTER

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a (5- or 8-kG) magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C6+ and C5+ may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004-inch or 0.010-inch. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.
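
    A first-order, non-relativistic estimate of the parabola coordinates helps in reading such spectra: for a field region of length L followed by a drift D, the magnetic deflection scales as qB/(mv) and the electric deflection as qE/(mv^2). The field-region length and the example ion below are assumptions; the instrument's own design code is fully relativistic.

    import math

    QE, AMU = 1.602e-19, 1.661e-27        # elementary charge (C), atomic mass unit (kg)

    def deflections(energy_mev, charge_state, mass_amu, B=0.8, E=3.0e6, L=0.1, D=0.8):
        # B = 8 kG, E = 30 kV/cm, D = 80 cm from the record; L = 10 cm is assumed.
        q, m = charge_state * QE, mass_amu * AMU
        v = math.sqrt(2 * energy_mev * 1e6 * QE / m)          # non-relativistic speed
        lever = L * (L / 2 + D)
        x_mag = q * B * lever / (m * v)                       # disperses as q/(m v)
        y_ele = q * E * lever / (m * v ** 2)                  # disperses as q/(m v^2)
        return x_mag, y_ele

    x, y = deflections(100.0, 6, 12.0)                        # 100 MeV C6+ ion
    print(f"magnetic: {x*1000:.1f} mm, electric: {y*1000:.1f} mm")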

  6. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  7. Visual analyzer as anticipatory system (functional organization)

    Science.gov (United States)

    Kirvelis, Dobilas

    2000-05-01

    Hypothetical functional organization of the visual analyzer is presented. Visual perception, the anatomical and morphological structure of animal visual systems, and neurophysiological, psychological and psycho-physiological data are interpreted in the light of a number of theoretical solutions for image recognition and visual-process simulation, pointing to active information processing. The activities in special areas of cortex are as follows: focused attention, prediction with analysis and synthesis of visual scenes, and predictive mental images. In the projection zone of the visual cortex, Area Striata or V1, a "sensory" screen (SS) and a "reconstruction" screen (RS) are supposed to exist. The functional structure of the visual analyzer consists of: analysis of visual scenes projected onto SS; "tracing" of images; preliminary recognition; reversive image reconstruction onto RS; comparison of images projected onto SS with images reconstructed onto RS; and "correction" of preliminary recognition. Special attention is paid to the quasiholographical principles of the neuronal organization within the brain, of the image "tracing," and of reverse image reconstruction. Tachistoscopic experiments revealed that the duration of one such hypothesis-testing cycle of the human visual analyzer is about 8-10 milliseconds.

  8. Earth Exploration Toolbook Workshops: Helping Teachers and Students Analyze Web-based Scientific Data

    Science.gov (United States)

    McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.

    2007-12-01

    One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. They also receive support via Elluminate, a Web

  9. Analyzing Oscillations of a Rolling Cart Using Smartphones and Tablets

    Science.gov (United States)

    Egri, Sandor; Szabo, Lorant

    2015-01-01

    It is well known that "interactive engagement" helps students to understand basic concepts in physics. Performing experiments and analyzing measured data are effective ways to realize interactive engagement, in our view. Some experiments need special equipment, measuring instruments, or laboratories, but in this activity we advocate…

  10. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recording or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
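
    As a concrete example of the kind of simulated motion data that can then be analyzed in Tracker's data tool, the sketch below integrates a uniform cylinder launched sliding on a level rough surface: kinetic friction decelerates the center of mass and spins the cylinder up until it rolls without slipping. All parameters are assumed for illustration.

    import numpy as np

    mu, g, R, v0 = 0.3, 9.81, 0.05, 2.0        # friction, gravity, radius (m), launch speed (m/s)
    dt, steps = 1e-3, 1500
    x, v, omega = 0.0, v0, 0.0
    rows = []
    for i in range(steps):
        if v > omega * R + 1e-9:               # still slipping
            a, alpha = -mu * g, 2 * mu * g / R # I = 0.5*m*R^2 for a uniform cylinder
        else:                                  # rolling without slipping on a level surface
            a, alpha = 0.0, 0.0
            omega = v / R
        v += a * dt
        omega += alpha * dt
        x += v * dt
        rows.append((i * dt, x))

    np.savetxt("cylinder.csv", rows, delimiter=",", header="t,x", comments="")
    print(f"rolling begins near t = {v0 / (3 * mu * g):.2f} s; v settles near {2 * v0 / 3:.2f} m/s")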

  11. An Investigation of the Effects of Authentic Science Experiences Among Urban High School Students

    Science.gov (United States)

    Chapman, Angela

    Providing equitable learning opportunities for all students has been a persistent issue for some time. This is evidenced by the science achievement gap that still exists between male and female students as well as between White and many non-White student populations (NCES, 2007, 2009, 2009b) and an underrepresentation of female, African-American, Hispanic, and Native American students in many science, technology, engineering, and mathematics (STEM) related careers (NCES, 2009b). In addition to gender and ethnicity, socioeconomic status and linguistic differences are also factors that can marginalize students in the science classroom. One factor attributed to the achievement gap and low participation in STEM careers is unequal access to resources including textbooks, laboratory equipment, qualified science teachers, and type of instruction. Extensive literature supports authentic science as one way of improving science learning. However, the majority of students do not have access to this type of resource. Additionally, extensive literature posits that culturally relevant pedagogy is one way of improving education. This study examines students' participation in an authentic science experience and argues that this is one way of providing culturally relevant pedagogy in science classrooms. The purpose of this study was to better understand how marginalized students were affected by their participation in an authentic science experience, within the context of an algae biofuel project. Accordingly, an interpretivist approach was taken. Data were collected from pre/post surveys and tests, semi-structured interviews, student journals, and classroom observations. Data analysis used a mixed methods approach. The data from this study were analyzed to better understand whether students perceived the experience to be one of authentic science, as well as how students' science identities, perceptions about who can do science, attitudes toward science, and learning of science practices

  12. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes, such as RELAP5 and TRAC-BWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experiment data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures, during off-normal operation, for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  13. The development of a new uranium analyzer and its application

    International Nuclear Information System (INIS)

    Hou Shengli; Zeng Weihua; Deng Yonghui; Li Fubin

    2012-01-01

    A new uranium analyzer and its application are introduced in this paper. The analyzer consists of a β-ray detector, a γ-ray detector, signal-processing circuits and software for data collection and processing. The uranium content is measured by simultaneously counting the β-rays and γ-rays radiated by the sample. The analyzer is easy to operate and delivers results quickly. Experiments show that the deviation between this analyzer and chemical analysis meets the requirements of uranium mines for the determination of uranium content. (authors)

  14. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, and the loss of performance due to many more simultaneous users, are shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator.

  15. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units—15 apartments and 25 single-family residences. Units range in size from 450 ft2 to 1,664 ft2 and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine if changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
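
    For orientation, the arithmetic that both MJ8 and the Passive House Planning Package elaborate on is an envelope UA times design delta-T plus an infiltration term. The sketch below is a back-of-the-envelope estimate only; all areas, R-values, air-change rate and design temperatures are assumptions, not values from the monitored homes.

    components = {                  # area (ft^2), R-value (h*ft^2*F/Btu)
        "walls":      (1800, 40),
        "roof":       (1000, 60),
        "windows":    (200,  5),
        "floor/slab": (1000, 20),
    }
    design_dt_f = 70 - (-2)         # indoor minus outdoor design temperature (F)
    volume_ft3, ach_natural = 12000, 0.05

    ua = sum(area / r for area, r in components.values())                 # Btu/h/F
    infiltration_cfm = ach_natural * volume_ft3 / 60
    q_design = ua * design_dt_f + 1.08 * infiltration_cfm * design_dt_f   # Btu/h
    print(f"UA = {ua:.0f} Btu/h-F, design heat loss ~ {q_design:,.0f} Btu/h")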

  16. Numerical methods for analyzing electromagnetic scattering

    Science.gov (United States)

    Lee, S. W.; Lo, Y. T.; Chuang, S. L.; Lee, C. S.

    1985-01-01

    Attenuation properties of the normal modes in an overmoded waveguide coated with a lossy material were analyzed. It is found that the low-order modes can be significantly attenuated even with a thin layer of coating if the coating material is not too lossy. A thinner layer of coating is required for large attenuation of the low-order modes if the coating material is magnetic rather than dielectric. The Radar Cross Section (RCS) from an uncoated circular guide terminated by a perfect electric conductor was calculated and compared with available experimental data. It is confirmed that the interior irradiation contributes to the RCS. The equivalent-current method based on the geometrical theory of diffraction (GTD) was chosen for the calculation of the contribution from the rim diffraction. The RCS reduction from a coated circular guide terminated by a PEC, together with planned schemes for the experiments, is included. The waveguide coated with a lossy magnetic material is suggested as a substitute for the corrugated waveguide.

  17. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    Full Text Available The article under the title "ANALYSIS OF THE PENSION SYSTEM OF THE USSR" deals with numerous aspects of the development of the pension system of the former USSR. Since the improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is essential, first of all, in order to create a sound and efficient pension system in Russia. The study presented in the article aims to execute an in-depth analysis of legislation on the Soviet pension system with a view to recreating the architecture of the pension system of the USSR. In addition, the study also draws on the official statistics for the said period to reach a qualified and well-founded conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, evidently proves the efficiency of the Soviet pension system. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the current pension system of the Russian Federation.

  18. Engaging the Shopping Experience

    DEFF Research Database (Denmark)

    Sørensen, Sanne Dollerup

    The revenues in brick-and-mortar stores have declined in the last decade, not least due to competition from online shopping. This thesis investigates how traditional stores might use principles from experience design to reverse this tendency. Brick-and-mortar stores are very important...... research, experience design, literary theory, the history and sociology of shopping. The method used is observation studies of cases that more or less successfully have been able to engage customers. This has resulted in an in-depth study of one such store and a typology of different types of engaging...... the interest in brick-and-mortar stores by engaging the customers emotionally. This thesis suggests that using insights from Possible World Theory in designing stores is one way to do this. Theoretically the thesis is interdisciplinary by drawing on knowledge from a wide spectrum of fields such as consumer...

  19. Emergency response training with the BNL plant analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, H.S.; Guppy, J.G.; Mallen, A.N.; Wulff, W.

    1987-01-01

    Presented is the experience in the use of the BNL Plant Analyzer for NRC emergency response training with simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for the emergency response training are summarized. A closed-loop simulation of all the key systems of the power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster-than-real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of the emergency response training.

  20. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  1. A novel method for one-way hash function construction based on spatiotemporal chaos

    Energy Technology Data Exchange (ETDEWEB)

    Ren Haijun [College of Software Engineering, Chongqing University, Chongqing 400044 (China); State Key Laboratory of Power Transmission Equipment and System Security and New Technology, Chongqing University, Chongqing 400044 (China)], E-mail: jhren@cqu.edu.cn; Wang Yong; Xie Qing [Key Laboratory of Electronic Commerce and Logistics of Chongqing, Chongqing University of Posts and Telecommunications, Chongqing 400065 (China); Yang Huaqian [Department of Computer and Modern Education Technology, Chongqing Education of College, Chongqing 400067 (China)

    2009-11-30

    A novel hash algorithm based on a spatiotemporal chaos is proposed. The original message is first padded with zeros if needed. Then it is divided into a number of blocks, each of which contains 32 bytes. In the hashing process, each block is partitioned into eight 32-bit values and input into the spatiotemporal chaotic system. Then, after iterating the system four times, the next block is processed in the same way. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. The hash value is obtained from the final state value of the spatiotemporal chaotic system. Theoretic analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high efficiency, as required by practical keyed hash functions.
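
    To make the construction above concrete, the following is a minimal, illustrative Python sketch of a CBC-chained hash built on a coupled tent-map lattice. The block size and zero padding follow the abstract, but the specific map, coupling constant, iteration count and key handling are assumptions for demonstration only, not the authors' exact scheme.

      # Toy keyed hash in the spirit of the CBC-chained spatiotemporal-chaos scheme.
      # The lattice, coupling and quantization below are illustrative assumptions.
      import struct

      def tent_map(x, mu=1.99):
          # Tent map on [0, 1); mu close to 2 gives chaotic behaviour.
          return mu * x if x < 0.5 else mu * (1.0 - x)

      def lattice_step(state, eps=0.3):
          # One step of a coupled map lattice: each site is mixed with its neighbours.
          n = len(state)
          return [(1 - eps) * tent_map(state[i])
                  + 0.5 * eps * (tent_map(state[(i - 1) % n]) + tent_map(state[(i + 1) % n]))
                  for i in range(n)]

      def chaotic_hash(message: bytes, key: float = 0.123456) -> bytes:
          if len(message) % 32:                      # zero-pad to a multiple of 32 bytes
              message += b"\x00" * (32 - len(message) % 32)
          chain = [key] * 8                          # CBC-style chaining value
          for off in range(0, len(message), 32):
              words = struct.unpack(">8I", message[off:off + 32])   # eight 32-bit values
              state = [((w / 2**32) + c) % 1.0 or 0.25 for w, c in zip(words, chain)]
              for _ in range(4):                     # iterate the chaotic system four times
                  state = lattice_step(state)
              chain = state                          # feed the final state forward
          # Quantize the final lattice state into a 256-bit digest.
          return b"".join(struct.pack(">I", int(x * (2**32 - 1))) for x in chain)

      if __name__ == "__main__":
          print(chaotic_hash(b"hello world").hex())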

  2. Competition-based reform of the National Health Service in England: a one-way street?

    Science.gov (United States)

    Reynolds, Lucy; Attaran, Amir; Hervey, Tamara; McKee, Martin

    2012-01-01

    The Conservative-led government in the United Kingdom is embarking on massive changes to the National Health Service in England. These changes will create a competitive market in both purchasing and provision. Although the opposition Labour Party has stated its intention to repeal the legislation when it regains power, this may be difficult because of provisions of competition law derived from international treaties. Yet there is an alternative, illustrated by the decision of the devolved Scottish government to reject competitive markets in health care.

  3. More than one way to toy with ChAT and VAChT

    Czech Academy of Sciences Publication Activity Database

    Castell, X.; Diebler, M. F.; Tomasi, M.; Bigari, C.; De Gois, S.; Berrard, S.; Mallet, J.; Israel, M.; Doležal, Vladimír

    2002-01-01

    Roč. 96, 1-2 (2002), s. 61-72 ISSN 0928-4257 R&D Projects: GA ČR GA305/01/0283 Grant - others:AFM(FR) - Institutional research plan: CEZ:AV0Z5011922 Keywords : Choline acetyltransferase * vesicular acetylcholine transporter -cholinergic differentiation Subject RIV: FH - Neurology Impact factor: 1.070, year: 2002

  4. Serviceability assessment of reinforced concrete one-way slab using probabilistic SBRA method

    Czech Academy of Sciences Publication Activity Database

    Pustka, D.; Marek, Pavel

    2007-01-01

    Roč. 2007, č. 7 (2007), s. 95-98 ISSN 1505-8425 R&D Projects: GA ČR(CZ) GA103/07/0557 Institutional research plan: CEZ:AV0Z20710524 Keywords : Serviceability * Concrete * dead load * live load Subject RIV: JM - Building Engineering

  5. Creating and Sharing Annotated Bibliographies: One Way to Become Familiar with Exemplary Multicultural Literature

    Science.gov (United States)

    Richards, Janet C.

    2015-01-01

    Teachers who meet Common Core recommendations about teaching with multicultural literature need to recognize the traditions, values, and customs of various cultures so they can make sound decisions about the authenticity of the books they select for their students. Furthermore, teachers who support students' literacy achievements with exemplary…

  6. Residual stresses analysis of friction stir welding using one-way FSI simulation

    International Nuclear Information System (INIS)

    Kang, Sung Wook; Jang, Beom Seon; Song, Ha Cheol

    2015-01-01

    When certain mechanisms, such as plastic deformations and temperature gradients, occur and are released in a structure, stresses remain because of the shape of the structure and external constraints. These stresses are referred to as residual stresses. The base material locally expands during heating in the welding process. When the welding is completed and cooled to room temperature, the residual stresses are left at nearly the yield strength level. In the case of friction stir welding, the maximum temperature is 80% to 90% of the melting point of the materials. Thus, the residual stresses in the welding process are smaller than those in other fusion welding processes; these stresses have not been considered previously. However, friction stir welding residual stresses are sometimes measured at approximately 70% of the yield strength or above. These residual stresses significantly affect fatigue behavior and lifetime. The present study investigates the residual stress distributions in various welding conditions and shapes of friction stir welding. In addition, the asymmetry of the temperature and residual stress distributions is considered. Heat transfer analysis is conducted using the commercial computational fluid dynamics program Fluent, and results are used in the finite element structural analysis with the ANSYS Multiphysics software. The calculated residual stresses are compared with experimental values using the X-ray diffraction method.
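
    As a rough numerical illustration of why welding residual stresses saturate near the yield strength, the sketch below evaluates the fully constrained thermal-stress estimate sigma = E*alpha*dT and caps it at yield. The material constants are generic aluminium-like values chosen for the example, not data from this study, and the calculation is not the paper's FSI model.

      # Back-of-the-envelope illustration: for a fully constrained bar, the elastic
      # thermal stress E*alpha*dT is capped at the yield strength by plasticity.
      import numpy as np

      E = 70e9            # Young's modulus, Pa (assumed, aluminium-like)
      alpha = 23e-6       # thermal expansion coefficient, 1/K (assumed)
      sigma_y = 270e6     # yield strength, Pa (assumed)

      delta_T = np.linspace(0, 500, 6)                 # cooling below the stress-free temperature, K
      thermal_stress = E * alpha * delta_T             # elastic estimate under full constraint
      residual = np.minimum(thermal_stress, sigma_y)   # plasticity limits it to the yield level

      for dT, s in zip(delta_T, residual):
          print(f"dT = {dT:5.0f} K  ->  residual stress ~ {s / 1e6:6.1f} MPa")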

  7. Numerical Study of Structural Vibration Induced by Combustion Noise - One Way Interaction

    NARCIS (Netherlands)

    Pozarlik, Artur Krzysztof; Kok, Jacobus B.W.

    2007-01-01

    The turbulent flame in the lean combustion regime in a gas turbine combustor generates significant thermo-acoustic noise. The thermo-acoustic noise induces liner vibrations that may lead to fatigue damage of the combustion system. This phenomenon is investigated in the project FLUISTCOM using both

  8. MANCOVA for one way classification with homogeneity of regression coefficient vectors

    Science.gov (United States)

    Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.

    2017-11-01

    The MANOVA and MANCOVA are the extensions of the univariate ANOVA and ANCOVA techniques to multidimensional or vector-valued observations. The assumption of a Gaussian distribution is replaced with the assumption of a multivariate Gaussian distribution for the vector-valued data and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether there are statistically reliable mean differences between groups after adjusting the dependent variables for the covariates. When randomized assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting dependent variables as if all subjects scored the same on the covariates. In this research article, an extension has been made to the MANCOVA technique with a larger number of covariates, and the homogeneity of the regression coefficient vectors is also tested.
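
    A minimal sketch of a one-way MANCOVA of this kind in Python, assuming the statsmodels package is available; the data, group labels and column names are fabricated purely for illustration. The interaction term checks the homogeneity of the regression coefficient vectors mentioned above.

      # Illustrative one-way MANCOVA with two dependent variables and one covariate.
      import numpy as np
      import pandas as pd
      from statsmodels.multivariate.manova import MANOVA

      rng = np.random.default_rng(0)
      n = 90
      df = pd.DataFrame({
          "group": np.repeat(["A", "B", "C"], n // 3),   # one-way classification
          "x": rng.normal(size=n),                        # covariate
      })
      df["y1"] = 0.8 * df["x"] + (df["group"] == "B") * 0.5 + rng.normal(size=n)
      df["y2"] = 0.4 * df["x"] + (df["group"] == "C") * 0.7 + rng.normal(size=n)

      # Homogeneity of regression coefficient vectors: the group:x interaction should
      # be non-significant before adjusting group means for the covariate.
      print(MANOVA.from_formula("y1 + y2 ~ group * x", data=df).mv_test())

      # One-way MANCOVA proper: multivariate group effect adjusted for the covariate.
      print(MANOVA.from_formula("y1 + y2 ~ group + x", data=df).mv_test())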

  9. More than One Way to Tell a Story: Integrating Storytelling into Your Law Course

    Science.gov (United States)

    Steslow, Donna M.; Gardner, Carolyn

    2011-01-01

    Storytelling has been used in diverse educational settings. It is employed at all educational levels, from elementary schools to graduate schools. Approximately twenty years ago, law school professors began writing about the application of storytelling to various law school subjects as an alternative to the traditional case method. Legal scholars…

  10. One Way Multimedia Broadcasting as a Tool for Education and Development in Developing Nations

    Science.gov (United States)

    Chandrasekhar, M. G.; Venugopal, D.; Sebastian, M.; Chari, B.

    2000-07-01

    An improved quality of life through education and developmental communication is an important necessity of societal upliftment in the new millennium, especially in the developing nations. The population explosion and the associated pressure on the scarce resources to meet the basic necessities have made it more or less impossible for most of the nations to invest reasonable resources in realizing adequate channels of formal education. Thanks to the developments in satellite communication and associated technologies, new vistas are available today to provide education and developmental communication opportunities to millions of people, spread across the globe. Satellite based Digital Audio and Multimedia Broadcasting is one such new development that is being viewed as an innovative space application in the coming decades. The potential of DAB technology to reach education, information and entertainment directly to the user through a specially designed receiver could be efficiently utilized by the developing nations to overcome their difficulties in realizing formal channels of education and information dissemination. WorldSpace plans to launch three geo-stationary satellites that would cover most of the developing economies in Africa, the Mediterranean, the Middle East, Asia, Latin America and the Caribbean. Apart from a variety of digital, high quality audio channels providing news, views, education and entertainment opportunities, the end users can also receive responsive multimedia content. This multimedia service is being planned as a specially packaged offering that can meet the demands of students, professionals, as well as certain special groups who have specific data and information requirements. Apart from WorldSpace, renowned agencies/firms from different parts of the world shall provide the required content to meet these requirements. Though the Internet option is available, higher telephone charges and the difficulty in getting access have made this option less attractive and unpopular in most of the developing countries. The proposed digital audio and multimedia offering from WorldSpace to millions of consumers spread across more than 120 countries is considered a unique tool for education and development, particularly in the developing nations. In this paper, an attempt is made to briefly describe the issues associated with education and development in developing countries, the WorldSpace offering and how a developing nation can benefit from this offering in the coming decades.

  11. 49 CFR 232.403 - Design standards for one-way end-of-train devices.

    Science.gov (United States)

    2010-10-01

    ... brake pipe pressure variations of ±1 psig; (2) Equipped with a “bleeder valve” that permits the release... undesired emergency brake application; (4) Equipped with either an air gauge or a means of visually... valve to prevent explosion from a high pressure air leak inside the rear unit. (c) Reporting rate...

  12. Cohesive cracked-hinge model for simulation of fracture in one-way slabs on grade

    DEFF Research Database (Denmark)

    Skar, Asmus; Poulsen, Peter Noe; Olesen, John Forbes

    2017-01-01

    Numerical analysis of slab on grade structures subjected to mechanical loads is a complex matter often requiring computationally expensive models. In order to develop a simplified and general concept for non-linear analysis of slab on grade structures, this paper presents a cohesive cracked......-hinge model aimed at the analysis of the bending fracture of the cemented material. The model is based on the fracture mechanics concepts of the fictitious crack model with a linear stress–crack opening relationship. Moreover, the paper presents a two-parameter spring foundation model applied to realistically...

  13. 47 CFR 22.589 - One-way or two-way application requirements.

    Science.gov (United States)

    2010-10-01

    ... exceeds 76.4 kilometers (47.5 miles); and identify each protected Basic Exchange Telephone Radio System... applications to operate a transmitter with ERP exceeding the basic power and height-power limits of § 22.565...

  14. [Teenagers' suicides and suicide attempts: finding one's way in epidemiologic data].

    Science.gov (United States)

    de Tournemire, R

    2010-08-01

    This paper engages in a global assessment of statistics and their potential uses, ranging from 19th-century accounts of "suicidal acts" by the central services of criminal justice in France, to European comparative data on suicide in the years 2000. The most recent facts and figures about teenagers' suicides in France are taken into account, thanks to the Inserm records on deaths by suicide and to public surveys on suicide attempts. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.

  15. One-way gates based on EPR, GHZ and decoherence-free states of W class

    International Nuclear Information System (INIS)

    Basharov, A.M.; Gorbachev, V.N.; Trubilko, A.I.; Yakovleva, E.S.

    2009-01-01

    Logical gates using quantum measurement as a primitive of quantum computation are considered. It is found that these gates, achieved with EPR, GHZ and W entangled states, have the same structure, allow encoding of classical information into the states of a quantum system, and can perform any calculation. A particular case of decoherence-free W states is discussed, since in this very case the logical gate is decoherence-free.

  16. The AFRL MITLL WMT15 System: There's More than One Way to Decode It!

    Science.gov (United States)

    2015-06-25

    ...models (Galley and Manning, 2008) were also trained for use in the phrase-based systems. An additional phrase table was trained on the...system. 2.7 Neural Network Joint Models: NNJM (Devlin et al., 2014) models were used to rescore n-best lists. We trained these models on the alignments...Callison-Burch, Marcello Federico, Nicola Bertoldi, Brooke Cowan, Wade Shen, Christine Moran, Richard Zens, Chris Dyer, Ondřej Bojar, Alexandra

  17. Group Presentation as One Way of Increasing Students’ Participation in the Classroom

    Directory of Open Access Journals (Sweden)

    Clara Herlina Karjo

    2008-11-01

    Full Text Available Teaching English (TOEFL) to a class of 50 students or more is a difficult task for a lecturer. Some problems will occur, for example, the improbability for all students to get equal teacher’s attention and equal chance for learning and studying in class. To overcome these problems, the writer conducts a quasi-experimental research involving 100 students in her two classes in Bina Nusantara University. In this research, the writer applies the group presentation method for teaching TOEFL for one semester. The research shows that group scores are slightly higher than individual students’ scores.

  18. Balance in Representation - More Than One Way to Tip the Scales

    Science.gov (United States)

    Rowell, D. S.; Carlis, D. L.

    2017-12-01

    Too often diversity balance is seen as purely a numbers issue - one of influx and totals - and thus recruiting and retention are seen as two sides of the same valuable coin. However, balance is not just numbers. There is more than one means to add diversity value to an organization. Numbers is one, but power is another. `Balance' is also reached through empowerment and voice. Whereas recruiting and retention can be seen as hard currency buying balanced demographics, empowerment is an investment that takes various forms and yields various dividends. Empowerment includes investment in people, and training, and systems, and pays dividends in performance, and voice, and careers, and service. In this way space is created for the benefits of diversity even where numbers and composition are statically unyielding. Equilibrium comes from tipping the scales with involvement, impact and affect. And finally, coming back full circle, empowerment has a bonus direct effect on the numbers, as it does improve diverse recruiting and retention. This presentation places a focus on balance in terms of gender efficacy. It will highlight the requirements to grow and mature gender empowerment investments and show how these investments have diversity dividends. And finally the presentation will note how NWS is taking a multi-faceted approach to diversity to achieve `balance' through empowerment, involvement, (gender) mainstreaming, training, etc…, and of course recruiting and retention.

  19. GROUP PRESENTATION AS ONE WAY OF INCREASING STUDENTS PARTICIPATION IN THE CLASSROOM

    OpenAIRE

    Clara Herlina Karjo

    2008-01-01

    Teaching English (TOEFL) to a class of 50 students or more is a difficult task for a lecturer. Some problems will occur, for example, the improbability for all students to get equal teacher’s attention and equal chance for learning and studying in class. To overcome these problems, the writer conducts a quasi-experimental research involving 100 students in her two classes in Bina Nusantara University. In this research, the writer applies the group presentation method for teaching TOEFL for on...

  20. A Modest Proposal: One Way to Save Journalism and Journalism Education

    Science.gov (United States)

    John, Jeffrey Alan

    2013-01-01

    This essay suggests that because anyone and everyone can now be a "journalist," the standards of the field of journalism have been greatly diminished. To regain respect for the profession and retain stature in the academy, journalism education should offer an assurance of the legitimacy of journalism program graduates by recognizing only…

  1. A novel method for one-way hash function construction based on spatiotemporal chaos

    International Nuclear Information System (INIS)

    Ren Haijun; Wang Yong; Xie Qing; Yang Huaqian

    2009-01-01

    A novel hash algorithm based on a spatiotemporal chaos is proposed. The original message is first padded with zeros if needed. Then it is divided into a number of blocks, each of which contains 32 bytes. In the hashing process, each block is partitioned into eight 32-bit values and input into the spatiotemporal chaotic system. Then, after iterating the system four times, the next block is processed in the same way. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. The hash value is obtained from the final state value of the spatiotemporal chaotic system. Theoretic analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high efficiency, as required by practical keyed hash functions.

  2. One-way hash function construction based on chaotic map network

    International Nuclear Information System (INIS)

    Yang Huaqian; Wong, K.-W.; Liao Xiaofeng; Wang Yong; Yang Degang

    2009-01-01

    A novel chaotic hash algorithm based on a network structure formed by 16 chaotic maps is proposed. The original message is first padded with zeros to make its length a multiple of four. Then it is divided into a number of blocks, each of which contains 4 bytes. In the hashing process, the blocks are mixed together by the chaotic map network, since the initial value and the control parameter of each tent map are dynamically determined by the output of its neighbors. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. Theoretic analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high flexibility, as required by practical keyed hash functions.

  3. Thermal hydraulic studies of spallation target for one-way coupled ...

    Indian Academy of Sciences (India)

    FEM) based CFD code, to determine the temperature distribution for ... Higher grid refinement was ensured for the fluid zone near the window, which is the region of interest. 4.3 Flow analysis. The analysis is carried out for 30 MeV proton beam of ...

  4. More Than Just One Way: Teaching Complex Thinking in Addictions Work.

    Science.gov (United States)

    Babcock, Marguerite

    2002-01-01

    This paper describes several topics common in addictions work that illustrate how univariate thinking distorts professional practice. The teacher or supervisor must model intellectual integrity and humility to persuade workers in the field to adopt more sophisticated thinking. (Contains 38 references.) (Author)

  5. One-way quantum computation with four-dimensional photonic qudits

    International Nuclear Information System (INIS)

    Joo, Jaewoo; Knight, Peter L.; O'Brien, Jeremy L.; Rudolph, Terry

    2007-01-01

    We consider the possibility of performing linear optical quantum computations making use of extra photonic degrees of freedom. In particular, we focus on the case where we use photons as quadbits, four-dimensional photonic qudits. The basic 2-quadbit cluster state is a hyperentangled state across polarization and two spatial mode degrees of freedom. We examine the nondeterministic methods whereby such states can be created from single photons and/or Bell pairs and then give some mechanisms for performing higher-dimensional fusion gates

  6. Monitoring Method and Apparatus Using Asynchronous, One-Way Transmission from Sensor to Base Station

    Science.gov (United States)

    Jensen, Scott L. (Inventor); Drouant, George J. (Inventor)

    2013-01-01

    A monitoring system is disclosed, which includes a base station and at least one sensor unit that is separate from the base station. The at least one sensor unit resides in a dormant state until it is awakened by the triggering of a vibration-sensitive switch. Once awakened, the sensor may take a measurement and then transmit the measurement to the base station. Once the data are transmitted from the sensor to the base station, the sensor may return to its dormant state. There may be multiple sensors for each base station, and these sensors may optionally measure different quantities, such as current, voltage, and single-axis and/or three-axis magnetic fields.
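
    The cycle described in this record can be summarized in a short Python sketch. The measurement and radio functions below are placeholders invented for illustration; they are not the patented hardware or its actual interfaces.

      # Sketch of the wake-on-vibration, one-way reporting cycle: dormant -> triggered
      # -> measure -> transmit -> dormant. All functions are illustrative stand-ins.
      import random
      import time

      def read_magnetic_field():
          return random.gauss(50.0, 2.0)        # placeholder single-axis measurement

      def transmit_to_base(value):
          # One-way transmission: the sensor does not wait for an acknowledgement.
          print(f"one-way packet -> base station: {value:.2f}")

      def sensor_loop(vibration_events):
          for triggered in vibration_events:    # dormant until the vibration switch fires
              if not triggered:
                  continue                      # stay dormant, drawing minimal power
              value = read_magnetic_field()     # wake and take a single measurement
              transmit_to_base(value)           # asynchronous one-way transmission
              time.sleep(0.01)                  # then return to the dormant state

      sensor_loop([False, False, True, False, True])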

  7. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, with the loss of performance due to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator

  8. Analyzing the Hidden Curriculum of Screen Media Advertising

    Science.gov (United States)

    Mason, Lance E.

    2015-01-01

    This media literacy article introduces a questioning framework for analyzing screen media with students and provides an example analysis of two contemporary commercials. Investigating screen conventions can help students understand the persuasive functions of commercials, as well as how the unique sensory experience of screen viewing affects how…

  9. Analyzing user demographics and user behavior for trust assessment

    NARCIS (Netherlands)

    Ceolin, D.; Groth, P.T.; Nottamkandath, A.; Fokkink, W.J.; van Hage, W.R.

    2014-01-01

    In many systems, the determination of trust is reduced to reputation estimation. However, reputation is just one way of determining trust. The estimation of trust can be tackled from a variety of other perspectives. In this chapter, we model trust relying on user reputation, user demographics and

  10. Using expert systems to analyze ATE data

    Science.gov (United States)

    Harrington, Jim

    1994-01-01

    The proliferation of automatic test equipment (ATE) is resulting in the generation of large amounts of component data. Some of this component data is not accurate due to the presence of noise. Analyzing this data requires the use of new techniques. This paper describes the process of developing an expert system to analyze ATE data and provides an example rule in the CLIPS language for analyzing trip thresholds for high gain/high speed comparators.
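
    The paper's example rule is written in CLIPS; as a language-neutral illustration of the same idea, the Python stand-in below applies a noise-aware pass/fail/retest rule to comparator trip-threshold readings. The limits and noise tolerance are invented for the example and are not taken from the paper.

      # Not the paper's CLIPS rule: a rule-style check of a comparator trip threshold
      # that defers judgement when the ATE measurement itself is too noisy.
      import statistics

      def check_trip_threshold(readings_mV, lo=495.0, hi=505.0, max_noise_mV=2.0):
          mean = statistics.mean(readings_mV)
          spread = statistics.pstdev(readings_mV)
          if spread > max_noise_mV:
              return "RETEST: measurement noise too high for a reliable verdict"
          return "PASS" if lo <= mean <= hi else "FAIL: trip threshold out of spec"

      print(check_trip_threshold([501.2, 500.8, 499.9, 501.5]))   # expected: PASS
      print(check_trip_threshold([510.3, 509.8, 510.1, 510.4]))   # expected: FAIL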

  11. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key features: presents numerous practical examples, including actual spectrum analyzer circuits; instruction on how to us

  12. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  13. Airborne ground penetrating radar: practical field experiments

    CSIR Research Space (South Africa)

    Van Schoor, Michael

    2013-10-01

    Full Text Available application of the technique are often challenged. The reasons for experimenting with non-traditional applications may vary, but common themes are productivity and logistics: Ways of overcoming logistical obstacles (for example, survey sites... that are difficult to access on foot) and of acquiring data more productively (for example, where large survey areas need to be covered) are often sought. One way of increasing GPR productivity is to employ multiple sensors simultaneously. Another way...

  14. Analyzing metabolomics-based challenge test

    NARCIS (Netherlands)

    Vis, D.J.; Westerhuis, J.A.; Jacobs, D.M.; van Duynhoven, J.P.M.; Wopereis, S.; van Ommen, B.; Hendriks, M.M.W.B.; Smilde, A.K.

    2015-01-01

    Challenge tests are used to assess the resilience of human beings to perturbations by analyzing responses to detect functional abnormalities. Well known examples are allergy tests and glucose tolerance tests. Increasingly, metabolomics analysis of blood or serum samples is used to analyze the

  15. Neighborhood Walking and Social Capital: The Correlation between Walking Experience and Individual Perception of Social Capital

    Directory of Open Access Journals (Sweden)

    Heechul Kim

    2017-04-01

    Full Text Available The purpose of this study was to analyze the relationship between people’s actual walking experience and their social capital levels in order to examine the possibility of restoring the weakened social functions of streets and public spaces in a walking-friendly urban environment. Based on the survey data of 591 residents of Seoul, we empirically analyzed the relationship between walking experience for various purposes and individual perceptions of social capital using one-way ANOVA and OLS regression models. As a result of the analysis, we found that the levels of neighborly trust and networking of people who experienced leisure walking were higher than those of people who did not, while there was no difference in the level of social capital according to walking experiences for other purposes. This result is significant in that it points to a basis for restoring the social function of neighborhoods through the social capital that people form as an effect of walking. Hence, it is important to create a walking environment that supports leisure activities.
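
    For readers who want to reproduce this type of analysis, the sketch below runs a one-way ANOVA and an OLS regression on simulated data with SciPy and statsmodels. The variable names and effect sizes are hypothetical and do not come from the Seoul survey.

      # Illustrative one-way ANOVA and OLS regression on fabricated survey-like data.
      import numpy as np
      import pandas as pd
      from scipy.stats import f_oneway
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n = 591
      df = pd.DataFrame({
          "leisure_walking": rng.integers(0, 2, n),     # 1 = has walked for leisure
          "age": rng.integers(20, 70, n),
      })
      df["trust"] = 3.0 + 0.3 * df["leisure_walking"] + rng.normal(0, 1, n)

      # One-way ANOVA: does mean neighborly trust differ by leisure-walking experience?
      groups = [g["trust"].values for _, g in df.groupby("leisure_walking")]
      print(f_oneway(*groups))

      # OLS regression: the same question with a covariate adjustment.
      print(smf.ols("trust ~ leisure_walking + age", data=df).fit().summary())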

  16. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal within nearly 22 ns time intervals, records the result in a memory unit, and then slowly reads the information out to the computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  17. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  18. A novel design for a small retractable cylindrical mirror analyzer

    International Nuclear Information System (INIS)

    McIlroy, D.N.; Dowben, P.A.; Knop, A.; Ruehl, E.

    1995-01-01

    In this paper we will review the performance of a ''miniature'' single pass cylindrical mirror analyzer (CMA) which we have used successfully in a variety of experiments. The underlying premise behind this CMA design was to minimize spatial requirements while maintaining an acceptable level of instrumental resolution. While we are presenting the results of a single pass cylindrical mirror analyzer, improvements on the present design, such as going to a double pass design, will undoubtedly improve the instrumental resolution. copyright 1995 American Vacuum Society

  19. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  20. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  1. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  2. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  3. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  4. Guide to analyzing investment options using TWIGS.

    Science.gov (United States)

    Charles R Blinn; Dietmar W. Rose; Monique L. Belli

    1988-01-01

    Describes methods for analyzing economic return of simulated stand management alternatives in TWIGS. Defines and discusses net present value, equivalent annual income, soil expectation value, and real vs. nominal analyses. Discusses risk and sensitivity analysis when comparing alternatives.

  5. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  6. Analyzing Protein Dynamics Using Dimensionality Reduction

    OpenAIRE

    Eryol, Atahan

    2015-01-01

    This thesis investigates dimensionality reduction for analyzing the dynamics of protein simulations, particularly disordered proteins which do not fold into a fixed shape but are thought to perform their functions through their movements. Rather than analyze the movement of the proteins in 3D space, we use dimensionality reduction to project the molecular structure of the proteins into a target space in which each structure is represented as a point. All that is needed to do this are the pairwise dis...
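
    A small sketch of the idea, assuming scikit-learn is available: each structure is reduced to a point using only a matrix of pairwise dissimilarities. The toy trajectory and the plain coordinate RMSD used below are placeholders, not the thesis' data or metric.

      # Embed each structure as one 2-D point from pairwise dissimilarities (MDS).
      import numpy as np
      from sklearn.manifold import MDS

      rng = np.random.default_rng(2)
      n_frames, n_atoms = 50, 20
      frames = rng.normal(size=(n_frames, n_atoms, 3))   # toy "trajectory"

      # Pairwise distance between structures (plain coordinate RMSD for illustration).
      d = np.zeros((n_frames, n_frames))
      for i in range(n_frames):
          for j in range(i + 1, n_frames):
              d[i, j] = d[j, i] = np.sqrt(((frames[i] - frames[j]) ** 2).mean())

      embedding = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
      points = embedding.fit_transform(d)                # one point per structure
      print(points.shape)                                # (50, 2)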

  7. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectrum analyzer is intended for the dynamic spectral analysis of signals from physical installations and for noise filtering. A recurrent Fourier transform algorithm is used in the digital dynamic analyzer. It is implemented on the basis of a fast-logic FPGA matrix and a dedicated ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional response time is 20 ns [ru]

  8. Analyzing Oscillations of a Rolling Cart Using Smartphones and Tablets

    Science.gov (United States)

    Egri, Sándor; Szabó, Lóránt

    2015-03-01

    It is well known that "interactive engagement" helps students to understand basic concepts in physics.1 Performing experiments and analyzing measured data are effective ways to realize interactive engagement, in our view. Some experiments need special equipment, measuring instruments, or laboratories, but in this activity we advocate student use of mobile phones or tablets to take experimental data. Applying their own devices and measuring simple phenomena from everyday life can improve student interest, while still allowing precise analysis of data, which can give deeper insight into scientific thinking and provide a good opportunity for inquiry-based learning.2

  9. Analyzing high school students' reasoning about electromagnetic induction

    Science.gov (United States)

    Jelicic, Katarina; Planinic, Maja; Planinsic, Gorazd

    2017-06-01

    Electromagnetic induction is an important, yet complex, physics topic that is a part of Croatian high school curriculum. Nine Croatian high school students of different abilities in physics were interviewed using six demonstration experiments from electromagnetism (three of them concerned the topic of electromagnetic induction). Students were asked to observe, describe, and explain the experiments. The analysis of students' explanations indicated the existence of many conceptual and reasoning difficulties with the basic concepts of electromagnetism, and especially with recognizing and explaining the phenomenon of electromagnetic induction. Three student mental models of electromagnetic induction, formed during the interviews, which reoccurred among students, are described and analyzed within the knowledge-in-pieces framework.

  10. Swedish ambulance nurses' experiences of nursing patients suffering cardiac arrest.

    Science.gov (United States)

    Larsson, Ricard; Engström, Åsa

    2013-04-01

    Effective pre-hospital treatment of a person suffering cardiac arrest is a challenging task for the ambulance nurses. The aim of this study was to describe ambulance nurses' experiences of nursing patients suffering cardiac arrest. Qualitative personal interviews were conducted during 2011 in Sweden with seven ambulance nurses with experience of nursing patients suffering cardiac arrests. The interview texts were analyzed using qualitative thematic content analysis, which resulted in the formulation of one theme with six categories. Mutual preparation, regular training and education were important factors in the nursing of patients suffering cardiac arrest. Ambulance nurses are placed in ethically demanding situations regarding whether and for how long they should continue cardio-pulmonary resuscitation (CPR) to accord with pre-hospital cardiac guidelines and patients' wishes. When a cardiac arrest patient is nursed, their relatives also need the attention of ambulance nurses. Reflection is one way for ambulance nurses to learn from, and talk about, their experiences. This study provides knowledge of ambulance nurses' experiences in the care of people with cardiac arrest. Better feedback about the care given by the ambulance nurses, and about the diagnosis and nursing care the patients received after they were admitted to the hospital, are suggested as improvements that would allow ambulance nurses to learn more from their experience. Further development and research concerning the technical equipment might improve the situation for both the ambulance nurses and the patients. Ambulance nurses need regular training and education to be prepared for saving people's lives and also to be able to make the right decisions. © 2013 Wiley Publishing Asia Pty Ltd.

  11. Neutral Particle Analyzer Measurements of Ion Behavior in NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; R.E. Bell; D.S. Darrow; A.L. Roquemore

    2002-02-06

    Initial results obtained with the Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) are presented. Magnetohydrodynamic activity and reconnection events cause depletion of the deuterium energetic ion distribution created by neutral-beam injection. Adding High Harmonic Fast Wave Heating to neutral-beam-heated discharges results in the generation of an energetic ion tail above the beam injection energy. NPA measurements of the residual hydrogen ion temperature are in good agreement with those from recombination spectroscopy.

  12. Mixture Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.

    2007-12-01

    A mixture experiment involves combining two or more components in various proportions or amounts and then measuring one or more responses for the resulting end products. Other factors that affect the response(s), such as process variables and/or the total amount of the mixture, may also be studied in the experiment. A mixture experiment design specifies the combinations of mixture components and other experimental factors (if any) to be studied and the response variable(s) to be measured. Mixture experiment data analyses are then used to achieve the desired goals, which may include (i) understanding the effects of components and other factors on the response(s), (ii) identifying components and other factors with significant and nonsignificant effects on the response(s), (iii) developing models for predicting the response(s) as functions of the mixture components and any other factors, and (iv) developing end-products with desired values and uncertainties of the response(s). Given a mixture experiment problem, a practitioner must consider the possible approaches for designing the experiment and analyzing the data, and then select the approach best suited to the problem. Eight possible approaches include 1) component proportions, 2) mathematically independent variables, 3) slack variable, 4) mixture amount, 5) component amounts, 6) mixture process variable, 7) mixture of mixtures, and 8) multi-factor mixture. The article provides an overview of the mixture experiment designs, models, and data analyses for these approaches.
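
    As a concrete illustration of the design step mentioned above, the sketch below generates a {q, m} simplex-lattice design, i.e. all mixtures whose component proportions are multiples of 1/m. It is a generic textbook construction offered for illustration, not code from this article.

      # Generate a {q, m} simplex-lattice mixture design.
      from itertools import combinations_with_replacement

      def simplex_lattice(q, m):
          """All q-component mixtures whose proportions are multiples of 1/m and sum to 1."""
          points = set()
          for combo in combinations_with_replacement(range(q), m):
              pt = [0] * q
              for c in combo:
                  pt[c] += 1
              points.add(tuple(x / m for x in pt))
          return sorted(points)

      for run in simplex_lattice(q=3, m=2):     # the classic {3, 2} lattice: 6 runs
          print(run)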

  13. A cascading failure model for analyzing railway accident causation

    Science.gov (United States)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing the railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model can describe the cascading process of the railway accident more accurately than the previous models, and can quantitatively analyze the sensitivities and the influence of the causes. In conclusion, this model can assist us to reveal the latent rules of accident causation to reduce the occurrence of railway accidents.
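
    A generic load-redistribution cascade of the kind described above can be sketched on a small causation graph with networkx. The nodes, edge weights, initial loads and tolerance parameter below are invented for illustration; this shows the class of model, not the authors' exact formulation.

      # Cascading failure on a weighted causation graph: a failed node's load is
      # redistributed to its successors in proportion to edge weight (causal strength).
      import networkx as nx

      def cascade(G, start, alpha=0.1):
          load = {n: G.nodes[n]["load"] for n in G}
          capacity = {n: (1 + alpha) * load[n] for n in G}   # alpha acts as the tolerance/threshold
          failed, queue = set(), [start]
          while queue:
              n = queue.pop(0)
              if n in failed:
                  continue
              failed.add(n)
              succ = [m for m in G.successors(n) if m not in failed]
              total_w = sum(G[n][m]["weight"] for m in succ)
              for m in succ:                                 # redistribution step
                  load[m] += load[n] * G[n][m]["weight"] / total_w
                  if load[m] > capacity[m]:
                      queue.append(m)                        # overloaded cause fails next
          return failed

      G = nx.DiGraph()
      for u, v, w in [("signal fault", "driver error", 0.7),
                      ("driver error", "overspeed", 0.9),
                      ("overspeed", "collision", 0.8),
                      ("signal fault", "dispatch delay", 0.3)]:
          G.add_edge(u, v, weight=w)
      for n in G:
          G.nodes[n]["load"] = 1.0

      print(cascade(G, "signal fault"))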

  14. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an ‘analyzing instrument’, based on an open-source software package called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.

  15. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate, using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up and to eliminate processing dead time. The estimated peaks of the valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI)
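
    The processing chain named above (peak estimation, pile-up rejection, histogramming) can be mimicked in a few lines of NumPy. This is only a schematic software stand-in for the FPGA/microcontroller pipeline; the pulse shape, thresholds and the two spectral lines are invented for the example.

      # Schematic digital MCA chain: estimate pulse height, reject pile-up, histogram.
      import numpy as np

      rng = np.random.default_rng(3)

      def shaped_pulse(amplitude, length=64, tau=8.0):
          t = np.arange(length)
          return amplitude * (t / tau) * np.exp(1 - t / tau)   # simple CR-RC-like shape

      def estimate_peak(samples, width_limit=45):
          """Return the pulse height, or None if the pulse looks piled up."""
          above = samples > 0.1 * samples.max()
          if above.sum() > width_limit:        # crude pile-up rejection: stays high too long
              return None
          return samples.max()                 # peak estimate

      channels = np.zeros(1024, dtype=int)     # the MCA histogram
      for _ in range(5000):
          amp = rng.choice([200.0, 662.0]) + rng.normal(0, 5)  # two "lines" plus jitter
          samples = shaped_pulse(amp) + rng.normal(0, 1, 64)   # sampled, noisy pulse
          height = estimate_peak(samples)
          if height is not None:
              channels[min(int(height), 1023)] += 1            # increment the channel

      print("counts near channel 662:", channels[650:675].sum())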

  16. Advances on CT analyzing urolithiasis constituents

    International Nuclear Information System (INIS)

    Feng Qiang; Ma Zhijun

    2009-01-01

    Urolithiasis is a common and frequently occurring disease in urology. The treatment of lithiasis is relevant not only to the size, location, brittleness and infection of the calculi, but is also affected by the urolithiasis constituents. Knowing the urolithiasis constituents in advance undoubtedly helps to guide treatment. But so far no reliable inspection method has been found to accurately analyze urolithiasis constituents in vivo. CT can precisely judge the size and location of calculi and roughly analyze the urolithiasis constituents in vivo; in particular, the appearance of dual-source CT provides a new method for studying urolithiasis constituents. It may be helpful for finding the cause of calculi and for their prevention and therapy. (authors)

  17. Empirical mode decomposition for analyzing acoustical signals

    Science.gov (United States)

    Huang, Norden E. (Inventor)

    2005-01-01

    The present invention discloses a computer-implemented signal analysis method through the Hilbert-Huang Transformation (HHT) for analyzing acoustical signals, which are assumed to be nonlinear and nonstationary. The Empirical Mode Decomposition (EMD) and the Hilbert Spectral Analysis (HSA) are used to obtain the HHT. Essentially, the acoustical signal will be decomposed into the Intrinsic Mode Function Components (IMFs). Once the invention decomposes the acoustic signal into its constituent components, all operations such as analyzing, identifying, and removing unwanted signals can be performed on these components. Upon transforming the IMFs into the Hilbert spectrum, the acoustical signal may be compared with other acoustical signals.
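
    A hedged sketch of the HHT pipeline on a toy signal, assuming the third-party PyEMD package (distributed as "EMD-signal") and SciPy are installed; it is not the patented implementation. EMD splits the signal into IMFs, and the Hilbert transform then yields an instantaneous-frequency estimate for each IMF.

      # EMD -> IMFs, then Hilbert spectral analysis of each IMF (toy two-tone signal).
      import numpy as np
      from PyEMD import EMD                 # assumption: PyEMD ("EMD-signal") is installed
      from scipy.signal import hilbert

      fs = 1000.0
      t = np.arange(0, 1, 1 / fs)
      signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

      imfs = EMD().emd(signal)              # Empirical Mode Decomposition

      for k, imf in enumerate(imfs):
          analytic = hilbert(imf)                               # Hilbert spectral analysis
          phase = np.unwrap(np.angle(analytic))
          inst_freq = np.diff(phase) / (2 * np.pi) * fs         # instantaneous frequency, Hz
          print(f"IMF {k}: mean instantaneous frequency {inst_freq.mean():.1f} Hz")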

  18. Electrical aerosol analyzer: calibration and performance

    Energy Technology Data Exchange (ETDEWEB)

    Pui, D.Y.H.; Liu, B.Y.H.

    1976-01-01

    The Electrical Aerosol Analyzer (EAA) was calibrated by means of monodisperse aerosols generated by two independent techniques. In the 0.02 to 1 μm diameter range, the aerosol was generated by electrostatic classification. In the range between 0.007 and 0.03 μm, the aerosols were generated by the photo-oxidation of SO₂ in a smog chamber. Calibration data are presented showing the performance of the EAA as an aerosol detector and as a size distribution analyzer.

  19. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  20. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer, in others new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation

  1. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the overall bed pressure drop, or over some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.

  2. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. A new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its usage on the Enron company, is presented in this paper.

  3. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  4. Images & Issues: How to Analyze Election Rhetoric.

    Science.gov (United States)

    Rank, Hugh

    Although it is impossible to know in advance the credibility of political messages, such persuasive discourse can be analyzed in a non-partisan, common sense way using predictable patterns in content and form. The content of a candidate's message can be summarized as "I am competent and trustworthy; from me, you'll get 'more good' and 'less…

  5. Thermal and Evolved-Gas Analyzer Illustration

    Science.gov (United States)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  6. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs...... has much to offer in analyzing the policy process....

  7. Consideration Regarding Diagnosis Analyze of Corporate Management

    Directory of Open Access Journals (Sweden)

    Mihaela Ciopi OPREA

    2009-01-01

    Full Text Available Diagnosis management aims to identify critical situations and positive aspects of corporate management. An effective diagnosis, made by a team with a status of independence from the organization’s management, provides managers with useful feedback necessary to improve performance. The work presented focuses on the methodology to achieve an effective diagnosis, considering the multitude of criteria and variables to be analyzed.

  8. Analyzing and Interpreting Research in Health Education ...

    African Journals Online (AJOL)

    While qualitative research is used when little or nothing is known about the subject, quantitative research is required when there are quantifiable variables to be measured. By implication, health education research is based on phenomenological, ethnographical and/or grounded theoretical approaches that are analyzable ...

  9. Analyzing Languages for Specific Purposes Discourse

    Science.gov (United States)

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  10. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating the radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m3 of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.
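
    As an illustration of how the half-lives quoted above enter the data reduction, the following sketch decay-corrects measured activity concentrations back to the end of collection and forms a simple 135Xe/133Xe ratio. The numbers are hypothetical, and the discrimination thresholds actually used for CTBT screening are not reproduced here.

        import math

        HALF_LIFE_H = {          # half-lives quoted in the abstract, converted to hours
            "Xe-131m": 11.9 * 24,
            "Xe-133m": 2.19 * 24,
            "Xe-133":  5.24 * 24,
            "Xe-135":  9.10,
        }

        def decay_correct(activity, isotope, delay_h):
            """Correct a measured activity concentration back to the collection stop time."""
            lam = math.log(2) / HALF_LIFE_H[isotope]
            return activity * math.exp(lam * delay_h)

        # Hypothetical measured concentrations (uBq/m3), counted 6 h after collection stopped
        measured = {"Xe-133": 80.0, "Xe-135": 45.0}
        corrected = {iso: decay_correct(a, iso, 6.0) for iso, a in measured.items()}
        print(corrected)
        print("Xe-135 / Xe-133 activity ratio:", corrected["Xe-135"] / corrected["Xe-133"])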

  11. Performance optimization of spectroscopic process analyzers

    NARCIS (Netherlands)

    Boelens, Hans F. M.; Kok, Wim Th; de Noord, Onno E.; Smilde, Age K.

    2004-01-01

    To increase the power and the robustness of spectroscopic process analyzers, methods are needed that suppress the spectral variation that is not related to the property of interest in the process stream. An approach for the selection of a suitable method is presented. The approach uses the net

  12. ITK and ANALYZE: a synergistic integration

    Science.gov (United States)

    Augustine, Kurt E.; Holmes, David R., III; Robb, Richard A.

    2004-05-01

    The Insight Toolkit (ITK) is a C++ open-source software toolkit developed under sponsorship of the National Library of Medicine. It provides advanced algorithms for performing image registration and segmentation, but does not provide support for visualization and analysis, nor does it offer any graphical user interface (GUI). The purpose of this integration project is to make ITK readily accessible to end-users with little or no programming skills, and provide interactive processing, visualization and measurement capabilities. This is achieved through the integration of ITK with ANALYZE, a multi-dimension image visualization/analysis application installed in over 300 institutions around the world, with a user-base in excess of 4000. This integration is carried out at both the software foundation and GUI levels. The foundation technology upon which ANALYZE is built is a comprehensive C-function library called AVW. A new set of AVW-ITK functions have been developed and integrated into the AVW library, and four new ITK modules have been added to the ANALYZE interface. Since ITK is a software developer's toolkit, the only way to access its intrinsic power is to write programs that incorporate it. Integrating ITK with ANALYZE opens the ITK algorithms to end-users who otherwise might never be able to take advantage of the toolkit's advanced functionality. In addition, this integration provides end-to-end interactive problem solving capabilities which allow all users, including programmers, an integrated system to readily display and quantitatively evaluate the results from the segmentation and registration routines in ITK, regardless of the type or format of input images, which are comprehensively supported in ANALYZE.

  13. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting all factors relevant to assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included among the attributes analyzed. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risks. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards); these questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities; this measure comprises radiological, economic, and social damage. Social and
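
    For the physical protection measure, a commonly used relation, assumed here because the abstract does not give the formula, is P(E) = P(I) x P(N). The sketch below simply combines TESS-style outputs in that way for a hypothetical facility.

        def system_effectiveness(p_interruption, p_neutralization):
            """Commonly used relation P(E) = P(I) * P(N); assumed, not quoted from the paper."""
            return p_interruption * p_neutralization

        # Hypothetical values such as might be produced by a tool like TESS
        p_i, p_n = 0.85, 0.90
        print("P(E) = %.3f" % system_effectiveness(p_i, p_n))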

  14. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

    Contents include: Introductory Statistical Inference and Regression Analysis; Elementary Statistical Inference; Regression Analysis; Experiments, the Completely Randomized Design (CRD) - Classical and Regression Approaches; Experiments; Experiments to Compare Treatments; Some Basic Ideas; Requirements of a Good Experiment; One-Way Experimental Layout or the CRD: Design and Analysis; Analysis of Experimental Data (Fixed Effects Model); Expected Values for the Sums of Squares; The Analysis of Variance (ANOVA) Table; Follow-Up Analysis to Check fo
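
    As a concrete illustration of the one-way (CRD) analysis listed in these contents, the sketch below computes the classical ANOVA F statistic for hypothetical data from three treatments; it is an independent Python illustration, not code from the book (which uses SAS).

        import numpy as np
        from scipy import stats

        # Hypothetical responses for a completely randomized design with three treatments
        groups = [
            np.array([12.1, 11.8, 12.6, 12.3]),
            np.array([13.4, 13.1, 13.9, 13.5]),
            np.array([11.2, 11.5, 10.9, 11.4]),
        ]

        # Classical fixed-effects sums of squares
        grand_mean = np.concatenate(groups).mean()
        ss_treat = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
        ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups)
        df_treat = len(groups) - 1
        df_error = sum(len(g) for g in groups) - len(groups)
        F = (ss_treat / df_treat) / (ss_error / df_error)
        p_value = stats.f.sf(F, df_treat, df_error)

        print("F =", F, "p =", p_value)
        print(stats.f_oneway(*groups))     # same result from SciPy's built-in routine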

  15. Modeling extreme ultraviolet suppression of electrostatic analyzers

    International Nuclear Information System (INIS)

    Gershman, Daniel J.; Zurbuchen, Thomas H.

    2010-01-01

    In addition to analyzing energy-per-charge ratios of incident ions, electrostatic analyzers (ESAs) for spaceborne time-of-flight mass spectrometers must also protect detectors from extreme ultraviolet (EUV) photons from the Sun. The required suppression rate often exceeds 1:10^7 and is generally established in tests upon instrument design and integration. This paper describes a novel technique to model the EUV suppression of ESAs using photon ray tracing integrated into SIMION, the most commonly used ion optics design software for such instruments. The paper compares simulation results with measurements taken from the ESA of the Mass instrument flying onboard the Wind spacecraft. This novel technique enables an active inclusion of EUV suppression requirements in the ESA design process. Furthermore, the simulation results also motivate design rules for such instruments.

  16. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  17. Analyzing Technique of Power Systems Under Deregulation

    Science.gov (United States)

    Miyauchi, Hajime; Kita, Hiroyuki; Ishigame, Atsushi

    Deregulation of the electric utilities has been progressing. Even under deregulation, reliability remains the most important concern for power systems. However, with deregulation, the operation and scheduling of power systems are changing, and new techniques for analyzing power systems are being introduced. Adequacy and security have recently become the standard viewpoints for evaluating the reliability of power systems. This paper presents, from the viewpoint of adequacy and security, new analysis techniques that are expected to come into use in the near future. First, a simulation tool to evaluate adequacy is described; MARS and other methods are mentioned as examples of this tool. Next, to evaluate security, security constrained unit commitment (SCUC) and security constrained optimal power flow (SCOPF) are discussed. Finally, some topics concerning ancillary services are described.

  18. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit, and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, the command-and-control logic and reactor protection systems. The run-time unit manages the dialogue between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility.

  19. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    The radionuclide analysis performed in nuclear power plants, for the purposes of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system, and monitoring the radioactive waste effluent, is an important job. Important as it is, it requires considerable labor by experts, because the samples to be analyzed are varied and very numerous, and the work depends heavily on manual effort. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been developed. The results of its performance test at an operating power plant have shown that the development largely accomplished its objectives and that the system is indeed useful. The development was carried out in cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, from 1974 to this year. (auth.)

  20. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance how we teach engineering leadership. The research offers insights that might assist engineering programs in improving their curricula so as to better prepare engineers to meet industry demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  1. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each type of tie. This paper presents a survey on analyzing negative ties in social networks through various types of network analysis techniques that are used for examining ties, such as status, centrality and power measures. Due to the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. This paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. It has been found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified and, further, new measures should be developed based upon the negative clique concept.
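
    As a small illustration of the simplest of the measures discussed, positive degree, negative degree, and their difference on a signed network, the sketch below uses NetworkX with a hypothetical graph and a 'sign' edge attribute. PN centrality itself involves a matrix computation over the signed adjacency structure and is not reproduced here.

        import networkx as nx

        # Hypothetical signed network: sign=+1 for support/friendship, -1 for distrust
        G = nx.Graph()
        G.add_edge("a", "b", sign=+1)
        G.add_edge("a", "c", sign=-1)
        G.add_edge("b", "c", sign=+1)
        G.add_edge("c", "d", sign=-1)
        G.add_edge("d", "e", sign=-1)

        def signed_degrees(graph):
            """Positive degree, negative degree and their difference for every node."""
            result = {}
            for node in graph:
                signs = [d["sign"] for _, _, d in graph.edges(node, data=True)]
                pos = sum(1 for s in signs if s > 0)
                neg = sum(1 for s in signs if s < 0)
                result[node] = (pos, neg, pos - neg)
            return result

        for node, (pos, neg, net) in signed_degrees(G).items():
            print(node, "positive:", pos, "negative:", neg, "net:", net)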

  2. Analyzing Architecture of Mithraism Rock Temples

    OpenAIRE

    Zohre AliJabbari

    2017-01-01

    This paper analyzes the architecture of the rock temples of western and northwestern Iran, as well as the factors influencing their formation. The creation of rock architecture in this area of Iran was influenced by the religious, geographical and political atmosphere of the time. Most of these structures were formed under the empires that dominated the region in the first millennium BC, and in some works we observe their continuity into later periods together with changes in their functions. One of the reasons that have attracted man to ...

  3. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing a proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are the inflated GAMLSS model and the generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  4. Development of a Portable Water Quality Analyzer

    OpenAIRE

    Germán COMINA; Martin NISSFOLK; José Luís SOLÍS

    2010-01-01

    A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable to differentiate laboratory made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacteria is not only one of the main indicators for water qualit...

  5. Moving Block Bootstrap for Analyzing Longitudinal Data.

    Science.gov (United States)

    Ju, Hyunsu

    In a longitudinal study subjects are followed over time. I focus on a case where the number of replications over time is large relative to the number of subjects in the study. I investigate the use of moving block bootstrap methods for analyzing such data. Asymptotic properties of the bootstrap methods in this setting are derived. The effectiveness of these resampling methods is also demonstrated through a simulation study.
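
    A minimal version of the moving block bootstrap for a single subject's series is sketched below, with synthetic data and an arbitrarily chosen block length; the asymptotic analysis in the paper is of course not reproduced.

        import numpy as np

        def moving_block_bootstrap(x, block_len, n_boot, statistic=np.mean, seed=None):
            """Resample overlapping blocks of length `block_len` to rebuild series of the
            same length as x, and return the bootstrap distribution of `statistic`."""
            rng = np.random.default_rng(seed)
            x = np.asarray(x, dtype=float)
            n = len(x)
            blocks = np.array([x[i:i + block_len] for i in range(n - block_len + 1)])
            n_blocks_needed = int(np.ceil(n / block_len))
            out = np.empty(n_boot)
            for b in range(n_boot):
                idx = rng.integers(0, len(blocks), n_blocks_needed)
                resampled = np.concatenate(blocks[idx])[:n]
                out[b] = statistic(resampled)
            return out

        # Hypothetical AR(1)-like longitudinal series for one subject
        rng = np.random.default_rng(0)
        x = np.zeros(200)
        for t in range(1, 200):
            x[t] = 0.6 * x[t - 1] + rng.normal()

        boot = moving_block_bootstrap(x, block_len=10, n_boot=1000, seed=1)
        print("bootstrap SE of the mean:", boot.std(ddof=1))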

  6. A chemical analyzer for charged ultrafine particles

    OpenAIRE

    S. G. Gonser; A. Held

    2013-01-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable...

  7. A chemical analyzer for charged ultrafine particles

    OpenAIRE

    S. G. Gonser; A. Held

    2013-01-01

    New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of ana...

  8. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single- or double-sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical and pharmaceutical industries.

  9. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal the latent periodicity of any symbolical sequence. The ID method is shown to have advantages over the Fourier transformation, the wavelet transform and the dynamic programming method for finding latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. The possible origin of latent periodicity in different symbolical sequences is discussed.

  10. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US Military is employing over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally, and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that allow fuel usage to be determined. The analyzer uses Raman spectroscopy to measure the fuel samples without preparation in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot be simply correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach is laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm-1 wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
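
    A chemometric correlation of this kind is typically built with a multivariate regression such as partial least squares; the sketch below shows that general approach with scikit-learn on synthetic spectra. The analyzer's actual model, spectral library, and preprocessing are not described in this abstract and are not reproduced here.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-in for Raman spectra (800 samples x 1000 wavenumber channels)
        # and one fuel property; this is NOT real fuel data.
        X = rng.normal(size=(800, 1000))
        true_coef = np.zeros(1000)
        true_coef[100:110] = 0.5              # pretend a few bands carry the property
        y = X @ true_coef + rng.normal(scale=0.1, size=800)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        pls = PLSRegression(n_components=8)
        pls.fit(X_train, y_train)
        print("R^2 on held-out spectra:", pls.score(X_test, y_test))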

  11. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  12. Analyzing communication skills of Pediatric Postgraduate Residents in Clinical Encounter by using video recordings.

    Science.gov (United States)

    Bari, Attia; Khan, Rehan Ahmed; Jabeen, Uzma; Rathore, Ahsan Waheed

    2017-01-01

    To analyze the communication skills of pediatric postgraduate residents in clinical encounters by using video recordings. This qualitative exploratory research was conducted through video recording at The Children's Hospital Lahore, Pakistan. Residents who had attended the mandatory communication skills workshop offered by CPSP were included. The video recording of each clinical encounter was done by a trained audiovisual person while the resident was interacting with the patient. Data were analyzed by thematic analysis. Initially, 36 codes emerged from open coding; through axial and selective coding these were condensed to 17 subthemes, from which four main themes emerged: (1) courteous and polite attitude, (2) marginal nonverbal communication skills, (3) power game/ignoring child participation and (4) patient as medical object/instrumental behaviour. All residents treated the patient as a medical object to reach a correct diagnosis and ignored the patient as a human being. Doctors played the dominant role, and the residents displayed only marginal nonverbal communication skills, in the form of a lack of social touch and of appropriate eye contact while documenting notes. A brief non-medical interaction for rapport building at the beginning of the encounter was missing, and there was a lack of child involvement. Pediatric postgraduate residents were polite while communicating with parents and children but lacked good nonverbal communication skills. The communication pattern in our study was mostly one-way, showing the doctors' instrumental behaviour and ignoring child participation.

  13. ANALYZING MANAGERS’ PERCEPTION OF CREATIVITY IN TOURISM

    OpenAIRE

    Anamaria Sidonia RĂVAR; Maria-Cristina IORGULESCU

    2014-01-01

    The past decades brought new meanings to creativity as the decline of mass tourism created impetus for the emergence of creative behavior as a major source of competitive advantage in the tourism industry. This led, in turn, to the development of a new type of tourism – creative tourism – which translates into new products and services, new collaboration and partnership structures, new forms of organization and ultimately into new experiences for consumers of tourism services. However, there...

  14. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    Directory of Open Access Journals (Sweden)

    Jaehyo Jung

    2017-10-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution into a voltage signal, which is then digitized by the processor. The configuration of the power and ground of the printed circuit board (PCB) layers is divided into digital and analog parts to minimize the noise interference of each part. The proposed analyzer occupies an area of 5.9 × 3.25 cm2 with a current resolution of 0.4 nA. A potential of 0~2.1 V can be applied between the working and the counter electrodes. The results of this study demonstrated the accuracy of the proposed analyzer by measuring the ruthenium(III) chloride (Ru(III)) concentration in a 10 mM phosphate-buffered saline (PBS) solution with a pH of 7.4. The measured data can be transmitted to a PC or to a mobile device such as a smartphone or a tablet using the included Bluetooth module. The proposed analyzer uses a 3.7 V, 120 mAh lithium polymer battery and can be operated for 60 min when fully charged, including data processing and wireless communication.

  15. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine scale pointing corrections. IRISpy’s code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy’s functionality and future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  16. Air sampling unit for breath analyzers

    Science.gov (United States)

    Szabra, Dariusz; Prokopiuk, Artur; Mikołajczyk, Janusz; Ligor, Tomasz; Buszewski, Bogusław; Bielecki, Zbigniew

    2017-11-01

    The paper presents a portable breath sampling unit (BSU) for human breath analyzers. The developed unit can be used to probe air from the upper airway and alveolar for clinical and science studies. The BSU is able to operate as a patient interface device for most types of breath analyzers. Its main task is to separate and to collect the selected phases of the exhaled air. To monitor the so-called I, II, or III phase and to identify the airflow from the upper and lower parts of the human respiratory system, the unit performs measurements of the exhaled CO2 (ECO2) in the concentration range of 0%-20% (0-150 mm Hg). It can work in both on-line and off-line modes according to American Thoracic Society/European Respiratory Society standards. A Tedlar bag with a volume of 5 dm3 is mounted as a BSU sample container. This volume allows us to collect ca. 1-25 selected breath phases. At the user panel, each step of the unit operation is visualized by LED indicators. This helps us to regulate the natural breathing cycle of the patient. There is also an operator's panel to ensure monitoring and configuration setup of the unit parameters. The operation of the breath sampling unit was preliminarily verified using the gas chromatography/mass spectrometry (GC/MS) laboratory setup. At this setup, volatile organic compounds were extracted by solid phase microextraction. The tests were performed by the comparison of GC/MS signals from both exhaled nitric oxide and isoprene analyses for three breath phases. The functionality of the unit was proven because there was an observed increase in the signal level in the case of the III phase (approximately 40%). The described work made it possible to construct a prototype of a very efficient breath sampling unit dedicated to breath sample analyzers.

  17. A computer program for analyzing channel geometry

    Science.gov (United States)

    Regan, R.S.; Schaffranek, R.W.

    1985-01-01

    The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems, respectively. (Lantz-PTT)
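
    The core computation performed for each stage, deriving area, top width, wetted perimeter, and hydraulic radius from coordinate pairs of cross-channel distance and bed elevation, can be sketched as follows. The cross section is hypothetical and this is an illustration of the geometry, not CGAP's code.

        import numpy as np

        def section_properties(x, z, stage):
            """Area, top width, wetted perimeter and hydraulic radius of a cross section
            (x = cross-channel distance, z = bed elevation) at a given water-surface stage."""
            area = width = perimeter = 0.0
            for i in range(len(x) - 1):
                x1, x2, z1, z2 = x[i], x[i + 1], z[i], z[i + 1]
                d1, d2 = stage - z1, stage - z2            # depths at segment ends
                if d1 <= 0 and d2 <= 0:
                    continue                               # segment entirely dry
                if d1 < 0 or d2 < 0:                       # segment crosses the waterline
                    xc = x1 + (x2 - x1) * d1 / (d1 - d2)   # interpolated intersection
                    if d1 < 0:
                        x1, z1, d1 = xc, stage, 0.0
                    else:
                        x2, z2, d2 = xc, stage, 0.0
                dx, dz = x2 - x1, z2 - z1
                area += 0.5 * (d1 + d2) * dx               # trapezoidal depth integration
                width += dx
                perimeter += np.hypot(dx, dz)              # wetted bed length
            radius = area / perimeter if perimeter > 0 else 0.0
            return area, width, perimeter, radius

        # Hypothetical trapezoidal channel, stage at elevation 4.0
        x = [0.0, 2.0, 8.0, 10.0]
        z = [5.0, 2.0, 2.0, 5.0]
        print(section_properties(x, z, stage=4.0))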

  18. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC, formerly recognized as the Idaho Chemical Processing Plant), which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  19. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  20. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  1. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the times of events, and has prompted efforts to identify events and to solve the problems of keeping stored information up to date and documented. In this regard, the data distribution systems in a network environment should be accurate, and a series of continuous, up-to-date data must be at hand. In this case, the Grid is the best answer for using the data and resources of organizations through common processing.

  2. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable to differentiate laboratory made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a major public health concern, especially affecting people living in high-burden, resource-limited settings.

  3. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a lightweight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable and built for field use. Both visual and auditory cues are provided to the operator.

  4. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  5. Monitoring machining conditions by analyzing cutting force vibration

    International Nuclear Information System (INIS)

    Piao, Chun Guang; Kim, Ju Wan; Kim, Jin Oh; Shin, Yoan

    2015-01-01

    This paper deals with an experimental technique for monitoring machining conditions by analyzing cutting-force vibration measured at a milling machine. This technique is based on the relationship of the cutting-force vibrations with the feed rate and cutting depth as reported earlier. The measurement system consists of dynamic force transducers and a signal amplifier. The analysis system includes an oscilloscope and a computer with a LabVIEW program. Experiments were carried out at various feed rates and cutting depths, while the rotating speed was kept constant. The magnitude of the cutting force vibration component corresponding to the number of cutting edges multiplied by the frequency of rotation was linearly correlated with the machining conditions. When one condition of machining is known, another condition can be identified by analyzing the cutting-force vibration
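
    The quantity tracked in that experiment, the magnitude of the cutting-force vibration component at the number of cutting edges multiplied by the rotation frequency, can be extracted from a measured force signal roughly as below (synthetic signal, hypothetical cutting parameters).

        import numpy as np

        def tooth_passing_amplitude(force, fs, rpm, n_edges):
            """Amplitude of the force-vibration component at n_edges * rotation frequency."""
            f_tp = n_edges * rpm / 60.0                      # tooth-passing frequency, Hz
            spectrum = np.fft.rfft(force - np.mean(force))
            freqs = np.fft.rfftfreq(len(force), d=1.0 / fs)
            k = np.argmin(np.abs(freqs - f_tp))              # nearest FFT bin
            return 2.0 * np.abs(spectrum[k]) / len(force)

        # Synthetic cutting-force signal: 2-flute tool at 1200 rpm, sampled at 5 kHz
        fs, rpm, n_edges = 5000.0, 1200, 2
        t = np.arange(0.0, 2.0, 1.0 / fs)
        force = 80.0 * np.sin(2 * np.pi * n_edges * rpm / 60.0 * t) + np.random.normal(0.0, 5.0, t.size)
        print("tooth-passing amplitude ~", tooth_passing_amplitude(force, fs, rpm, n_edges))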

  6. Presenting and analyzing movie stimuli for psychocinematic research

    Directory of Open Access Journals (Sweden)

    Arthur P. Shimamura

    2013-02-01

    Movies have an extraordinary way of capturing our perceptual, conceptual and emotional processes. As such, they offer a useful means of exploring psychological phenomenon in the laboratory. Until recently, it has been rather difficult to present animated stimuli and collect behavioral responses online. However, with advances in digital technology and commercially available software to construct, present, and analyze movies for behavioral investigations, there is a growing interest in psychocinematic research. A rather simple, yet useful procedure is described that presents movie clips and collects multiple behavioral responses during its presentation. It uses E-prime 2.0 Professional software to run the experiment and Microsoft Excel to sort and analyze the data.

  7. An electron density measurement using an analyzer based imaging system

    International Nuclear Information System (INIS)

    Bewer, Brian

    2011-01-01

    Using a monochromatic X-ray beam from a synchrotron source, the electron density of a homogeneous target was determined by measuring the refraction that occurs at the air-target interface for a known angle of incidence. The angle of deviation that these X-rays undergo at the transition between materials is micro-radian to submicro-radian in scale. Existing analyzer based imaging systems are designed to measure submicro-radian angle changes and commonly use monochromatic hard X-ray beams generated from synchrotron sources. A preliminary experiment using the analyzer based imaging apparatus at the Canadian Light Source Biomedical Imaging and Therapy beamline and a half-cylinder-shaped plastic target will be presented. By measuring the angle of deviation of the photon beam at several discrete angular positions of the target, the electron density of the target material was determined.
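
    The underlying relation is standard X-ray refraction: for a small refractive index decrement delta, the deviation at a flat interface is approximately delta times tan(theta), with theta the angle of incidence from the surface normal, and delta = r_e * lambda^2 * rho_e / (2*pi) links it to the electron density rho_e. The sketch below inverts those two relations for hypothetical measured values; it is not the authors' analysis code.

        import math

        R_E = 2.8179403262e-15      # classical electron radius, m

        def electron_density(deviation_rad, incidence_rad, wavelength_m):
            """Electron density (electrons/m^3) from the refraction angle at a flat
            air-target interface: deviation ~ delta*tan(theta), delta = r_e*lambda^2*rho_e/(2*pi)."""
            delta = deviation_rad / math.tan(incidence_rad)
            return 2.0 * math.pi * delta / (R_E * wavelength_m ** 2)

        # Hypothetical numbers: 2.5 urad deviation at 85 deg incidence, 30 keV photons
        wavelength = 1.23984e-6 / 30e3          # lambda[m] = 1239.84 eV*nm / E[eV]
        theta = math.radians(85.0)
        print("rho_e ~ %.2e electrons/m^3" % electron_density(2.5e-6, theta, wavelength))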

  8. Analyzing delay causes in Egyptian construction projects.

    Science.gov (United States)

    Marzouk, Mohamed M; El-Rasas, Tarek I

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and a questionnaire survey was subsequently prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and the top ten delay causes of construction projects in Egypt are determined according to the highest values of these indices. A case study is analyzed and compared with the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
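
    The three indices named above are normally computed from Likert-type survey ratings. One common formulation is sketched below, with the Importance Index taken as FI x SI / 100; this is an assumption for illustration, and the exact weighting used in the paper may differ.

        import numpy as np

        def delay_indices(freq_scores, sev_scores, scale_max=4):
            """Frequency, Severity and Importance Index (%) for one delay cause.

            freq_scores / sev_scores: per-respondent ratings on a 1..scale_max scale.
            One common formulation; the paper's exact weighting may differ."""
            fi = 100.0 * np.mean(freq_scores) / scale_max
            si = 100.0 * np.mean(sev_scores) / scale_max
            ii = fi * si / 100.0
            return fi, si, ii

        # Hypothetical ratings from 33 experts for a single delay cause
        rng = np.random.default_rng(0)
        freq = rng.integers(1, 5, size=33)
        sev = rng.integers(1, 5, size=33)
        print("FI=%.1f  SI=%.1f  II=%.1f" % delay_indices(freq, sev))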

  9. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  10. Mango: combining and analyzing heterogeneous biological networks.

    Science.gov (United States)

    Chang, Jennifer; Cho, Hyejin; Chou, Hui-Hsien

    2016-01-01

    Heterogeneous biological data such as sequence matches, gene expression correlations, protein-protein interactions, and biochemical pathways can be merged and analyzed via graphs, or networks. Existing software for network analysis has limited scalability to large data sets or is only accessible to software developers as libraries. In addition, the polymorphic nature of the data sets requires a more standardized method for integration and exploration. Mango facilitates large network analyses with its Graph Exploration Language, automatic graph attribute handling, and real-time 3-dimensional visualization. On a personal computer Mango can load, merge, and analyze networks with millions of links and can connect to online databases to fetch and merge biological pathways. Mango is written in C++ and runs on Mac OS, Windows, and Linux. The stand-alone distributions, including the Graph Exploration Language integrated development environment, are freely available for download from http://www.complex.iastate.edu/download/Mango. The Mango User Guide listing all features can be found at http://www.gitbook.com/book/j23414/mango-user-guide.

  11. Analyzing block placement errors in SADP patterning

    Science.gov (United States)

    Kobayashi, Shinji; Okada, Soichiro; Shimura, Satoru; Nafus, Kathleen; Fonseca, Carlos; Demand, Marc; Biesemans, Serge; Versluijs, Janko; Ercken, Monique; Foubert, Philippe; Miyazaki, Shinobu

    2016-03-01

    We discuss edge placement errors (EPE) for multi-patterning of Mx critical layers using ArF lithography. Specific focus is placed on the block formation part of the process. While plenty of literature characterization data exist on spacer formation, only limited published data is available on block processes. We analyze the accuracy of placing blocks relative to narrow spacers. Many publications calculate EPE assuming Gaussian distributions for the key process variations contributing to EPE. For practical reasons, each contributor is measured on dedicated test structures. In this work, we complement such analysis and directly measure the EPE in product. We perform high density sampling of blocks using CDSEM images and analyze all feature edges of interest. We find that block placement errors can be very different depending on their local design context. Specifically we report on 2 block populations (further called block A and B) whose standard deviations differ by 4x. We attribute this to differences in local topography (spacer shape) and interaction with the plasma-etch process design. Block A (on top of the 'core space' S1) has excellent EPE uniformity of ~1 nm while block B (on top of the 'gap space' S2) has degraded EPE control of ~4 nm. Finally, we suggest that the SOC etch process plays a central role in positioning blocks accurately on slim spacers, helping the manufacturability of spacer-based patterning techniques, and helping its extension toward the 5nm node.

  12. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Rare disease patients too often face common problems, including the lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases, increasing international cooperation in scientific research, gaining and sharing scientific knowledge, and developing tools for extracting and sharing knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed have been acquired from the Office of Rare Diseases Research, the National Organization of Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at the second level, the presence of rare disease terms in the target sources included in the UMLS was analyzed, working at both the term and the concept level. We found that MeSH has the best representation of rare disease terms.

  14. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    Energy Technology Data Exchange (ETDEWEB)

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the Americium and Cesium counts to the ash content. A total of 104 samples were collected from the mine, with 47 being from screened coal and the rest from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick stop training procedure; therefore, the samples were split into training, calibration and prediction subsets. Special techniques, using genetic algorithms, were developed to split the samples representatively into the three subsets. Two separate approaches were tried. In one approach, the screened and unscreened coal was modeled separately. In another, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e. though each prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minute intervals (averages of 6-9 samples), but not at 20 seconds (each prediction).
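
    The "quick stop" training with separate training, calibration and prediction subsets corresponds to what is now usually called early stopping on a validation set. A minimal scikit-learn sketch on synthetic count data is given below; the mine data and the genetic-algorithm splitting are not reproduced.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-in for (Americium count, Cesium count) -> ash content (%)
        counts = rng.uniform(0.5, 2.0, size=(104, 2))
        ash = 5.0 + 8.0 * counts[:, 0] - 3.0 * counts[:, 1] + rng.normal(0.0, 0.3, 104)

        X_train, X_test, y_train, y_test = train_test_split(counts, ash, test_size=0.25,
                                                            random_state=0)
        # early_stopping=True holds out an internal validation ("calibration") subset
        model = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                             validation_fraction=0.2, max_iter=5000, random_state=0)
        model.fit(X_train, y_train)
        pred = model.predict(X_test)

        # Individual predictions are noisy; averages over several samples are more stable
        print("mean absolute error per sample:", np.mean(np.abs(pred - y_test)))
        block_pred = pred[:24].reshape(4, 6).mean(axis=1)
        block_true = y_test[:24].reshape(4, 6).mean(axis=1)
        print("mean absolute error of 6-sample averages:", np.mean(np.abs(block_pred - block_true)))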

  15. Calibration of the portable wear metal analyzer

    Science.gov (United States)

    Quinn, Michael J.

    1987-12-01

    The Portable Wear Metal Analyzer (PWMA), a graphite furnace atomic absorption (AA) spectrometer, developed under a contract for this laboratory, was evaluated using powdered metal particles suspended in oil. The PWMA is a microprocessor controlled automatic sequential multielement AA spectrometer designed to support the deployed aircraft requirement for spectrometric oil analysis. The PWMA will analyze for nine elements (Ni, Fe, Cu, Cr, Ag, Mg, Si, Ti, Al) at a rate of 4 min per sample. The graphite tube and modified sample introduction system increase the detection of particles in oil when compared to the currently used techniques of flame AA or spark atomic emission (AE) spectroscopy. The PWMA shows good-to-excellent response for particles in sizes of 0 to 5 and 5 to 10 micrometers and fair response to particles of 10 to 20 and 20 to 30 micrometers. All trends in statistical variations are easily explained by system considerations. Correction factors to the calibration curves are necessary to correlate the analytical capability of the PWMA to the performance of existing spectrometric oil analysis (SOA) instruments.

  16. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date with a perihelion of 35 solar radii (RS) and an aphelion of 10 RS. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. The main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a Time-of-Flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1-60 [amu/q] and a field of view of 240° x 120°. Here we will show flight calibration results and performance.

  17. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after ingestion of an isotope-marked substrate in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures the relative CO2 isotopologue concentrations in both samples simultaneously by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of 1% in the 13CO2 concentration of the exhaled breath have to be detected at a concentration level of this isotopologue of about 500 ppm.
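
    A back-of-the-envelope check of the detection requirement quoted above: resolving a 1% change at a 13CO2 level of roughly 500 ppm implies an absolute resolution of about 5 ppm. The short sketch below simply encodes this arithmetic; the 500 ppm baseline is the approximate figure from the text, used here for illustration.

    ```python
    # Detection requirement implied by the abstract (illustrative arithmetic only).
    co2_13_baseline_ppm = 500.0          # approximate 13CO2 level in exhaled breath
    relative_change = 0.01               # 1% change to be resolved for a reliable diagnosis

    absolute_change_ppm = co2_13_baseline_ppm * relative_change
    print(f"Required resolution: about {absolute_change_ppm:.1f} ppm of 13CO2")  # ~5 ppm
    ```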

  18. Analyzing surface coatings in situ: High-temperature surface film analyzer developed

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    Scientists at Argonne National Laboratory (ANL) have devised a new instrument that can analyze surface coatings under operating conditions. The High-Temperature Surface Film Analyzer is the first such instrument to analyze the molecular composition and structure of surface coatings on metals and solids at high temperature and pressure in liquid environments. Corrosion layers, oxide coatings, polymer or paint films, and adsorbed molecules are examples of the films that can be analyzed with this instrument. Film thicknesses may vary from a few molecular layers to several microns or more. The instrument was originally developed to study metal corrosion in aqueous solutions similar to the cooling water systems of light-water nuclear reactors. It may therefore be useful to the nuclear power industry, where coolant pipes degrade due to stress corrosion cracking, which often leads to plant shutdown. Key determinants in the occurrence of stress corrosion cracking are the properties and composition of the corrosion scales that form inside pipes. The High-Temperature Surface Film Analyzer can analyze these coatings under laboratory conditions that simulate the hostile environment of high temperature, pressure, and solution chemistry that exists during plant operation. The ability to analyze these scales in hostile liquid environments is unique to the instrument. Other applications include analyzing paint composition, corrosion of materials in geothermal power systems, the integrity of canisters for radioactive waste storage, corrosion-inhibitor films on piping and drilling systems, and surface scales on condenser tubes in industrial hot-water heat exchangers. The device is not patented.

  19. Analyzing Enterprise Networks Needs: Action Research from the Mechatronics Sector

    Science.gov (United States)

    Cagnazzo, Luca; Taticchi, Paolo; Bidini, Gianni; Baglieri, Enzo

    New business models and theories are currently developing in the direction of collaborative environments, and many new tools for sustaining the companies involved in these organizations are emerging. Among them, a plethora of methodologies for analyzing needs have already been developed for single companies, but few academic works are available on Enterprise Network (EN) needs analysis. This paper presents the learning from an action research (AR) project in the mechatronics sector: AR has been used to explore the issue of evaluating network needs and thereby to define, develop, and test a complete framework for network evaluation. Reflection on the story in the light of the experience and the theory is presented, as well as extrapolation to a broader context and articulation of usable knowledge.

  20. Guidelines and precautions in collecting and analyzing for mixed wastes

    International Nuclear Information System (INIS)

    Hall, J.R.; Stagg, D.D.; Clark, S.L.

    1987-01-01

    Regulatory requirements mandated by the US Environmental Protection Agency (EPA) under the Resource Conservation and Recovery Act, the Superfund, and the EPA/US Nuclear Regulatory Commission guidance document have generated an increasing demand for hazardous-materials analysis of radioactively contaminated samples (mixed waste). The analysis of these samples, which contain both radioactive and hazardous materials, using EPA methods requires that both sample collection and laboratory analysis be performed using guidelines and precautions different from those normally used for radioanalytical work. The paper discusses the unique procedures, guidelines, and precautions one must use in collecting and analyzing mixed waste samples in order to achieve accurate and reliable results, reports the recent experience of International Technology (IT) Corporation in constructing and operating a mixed waste laboratory, and describes the experience of IT personnel in collecting samples in the field.

  1. Analyzing high school students’ reasoning about electromagnetic induction

    Directory of Open Access Journals (Sweden)

    Katarina Jelicic

    2017-02-01

    Electromagnetic induction is an important, yet complex, physics topic that is part of the Croatian high school curriculum. Nine Croatian high school students of different abilities in physics were interviewed using six demonstration experiments from electromagnetism (three of them concerned the topic of electromagnetic induction). Students were asked to observe, describe, and explain the experiments. The analysis of students' explanations indicated the existence of many conceptual and reasoning difficulties with the basic concepts of electromagnetism, and especially with recognizing and explaining the phenomenon of electromagnetic induction. Three student mental models of electromagnetic induction that formed during the interviews and recurred among students are described and analyzed within the knowledge-in-pieces framework.

  2. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and involve mouse-clicks on each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
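
    As a rough illustration of the first two steps of the pipeline described above (skeletonization of a segmented image and conversion of the skeleton into a graph), the following sketch uses scikit-image and networkx; the dynamic root-tracking model and the path-optimal elongation step of Root System Analyzer are not reproduced here, and the toy image is an assumption.

    ```python
    import numpy as np
    import networkx as nx
    from skimage.morphology import skeletonize

    def skeleton_graph(binary_image):
        """Build a pixel-adjacency graph from the skeleton of a segmented root image."""
        skel = skeletonize(binary_image.astype(bool))
        graph = nx.Graph()
        ys, xs = np.nonzero(skel)
        pixels = set(zip(ys.tolist(), xs.tolist()))
        for y, x in pixels:
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if (dy, dx) != (0, 0) and (y + dy, x + dx) in pixels:
                        graph.add_edge((y, x), (y + dy, x + dx))
        return graph

    # toy example: a straight "root" segment
    img = np.zeros((20, 20), dtype=bool)
    img[2:18, 9:12] = True
    g = skeleton_graph(img)
    print(g.number_of_nodes(), "skeleton pixels,", g.number_of_edges(), "edges")
    ```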

  3. Analyzing acoustic phenomena with a smartphone microphone

    Science.gov (United States)

    Kuhn, Jochen; Vogt, Patrik

    2013-02-01

    This paper describes how different sound types can be explored using the microphone of a smartphone and a suitable app. Vibrating bodies, such as strings, membranes, or bars, generate air pressure fluctuations in their immediate vicinity, which propagate through the room in the form of sound waves. Depending on the triggering mechanism, it is possible to differentiate between four types of sound waves: tone, sound, noise, and bang. In everyday language, non-experts use the terms "tone" and "sound" synonymously; however, from a physics perspective there are very clear differences between the two terms. This paper presents experiments that enable learners to explore and understand these differences. Tuning forks and musical instruments (e.g., recorders and guitars) can be used as equipment for the experiments. The data are captured using a smartphone equipped with the appropriate app (in this paper we describe the app Audio Kit for iOS systems). The values captured by the smartphone are displayed in a screenshot and then viewed directly on the smartphone or exported to a computer graphics program for printing.
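
    The distinction between a tone (a single frequency) and a sound (a fundamental with harmonics) can be made visible with a simple spectrum, which is essentially what such an app computes. The sketch below uses synthetic signals rather than real microphone recordings; the 440 Hz fundamental and the harmonic amplitudes are assumptions for illustration.

    ```python
    import numpy as np

    fs = 44100                      # sampling rate typical of a smartphone microphone
    t = np.arange(0, 0.5, 1 / fs)

    tone = np.sin(2 * np.pi * 440 * t)                         # tuning fork: one frequency
    sound = sum(a * np.sin(2 * np.pi * 440 * k * t)            # guitar-like: harmonics
                for k, a in enumerate([1.0, 0.5, 0.25, 0.1], start=1))

    def dominant_frequencies(signal, n=4):
        """Return the n strongest spectral components of the signal, in Hz."""
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), 1 / fs)
        return freqs[np.argsort(spectrum)[-n:]][::-1]

    print("tone peaks: ", dominant_frequencies(tone))   # essentially only 440 Hz
    print("sound peaks:", dominant_frequencies(sound))  # 440 Hz plus harmonics
    ```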

  4. Analyzing and reducing plagiarism at university

    Directory of Open Access Journals (Sweden)

    Jorge López Puga

    2014-12-01

    Plagiarism is one of the least desirable practices in the academic context. This paper presents an experience of massive plagiarism detection at university and the steps taken to prevent its subsequent occurrence. Plagiarism was detected in the first assessment phase of a research project practical. As a result, students were required to arrange ethical group discussions with the professor to prevent plagiarism in the future. A substantial reduction in the rate of plagiarism was observed from the first practical assessment to the second one, t(16)=2.5, p=.02, d=0.83, 1-β=.63, one-tailed contrast. Additionally, a survey was developed to analyse students' opinions and attitudes about plagiarism. A sample of 64 students (15 boys and 49 girls) with an average age of 22.69 (SD=2.8) filled in an electronic questionnaire. More than half of the sample (56.92%) admitted that they had plagiarised before, but most of the students (83.08%) agreed they would not like someone else plagiarising their reports. A preliminary short scale to measure attitude towards plagiarism in undergraduate students at university is provided. Finally, a set of recommendations is given based on this experience to prevent and reduce the level of plagiarism in the university context.
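
    For readers who want to reproduce the kind of comparison reported above, the following sketch computes a paired t-test and Cohen's d with SciPy on made-up plagiarism scores for 17 paired observations (matching the reported degrees of freedom); the numbers are illustrative and are not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # hypothetical plagiarism scores (e.g., % overlapping text) for the same 17 groups
    # at the first and second practical assessment -- illustrative data only
    first  = np.array([35, 42, 28, 50, 31, 25, 44, 38, 29, 33, 47, 36, 27, 41, 30, 39, 34], float)
    second = np.array([20, 30, 25, 35, 22, 24, 30, 28, 21, 26, 33, 27, 23, 29, 25, 28, 26], float)

    t, p = stats.ttest_rel(first, second)          # paired t-test, df = n - 1 = 16
    diff = first - second
    d = diff.mean() / diff.std(ddof=1)             # Cohen's d for paired samples
    print(f"t({len(diff) - 1}) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
    ```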

  5. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    The paper uses the results of the Eurobarometer sample survey, which is commissioned by the European Commission. The social climate index is used to measure the perceptions of the population, taking into account their personal situation and their outlook at the national level. The paper analyses the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the population of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, over a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  6. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Rojas S, A.S.; Carrillo M, R.A.; Balderas, E.G.

    1999-01-01

    Based on the study of real neutron flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where the core oscillation phenomena of the reactor lie in the 0 to 2.5 Hz range, the possibility has been identified of developing surveillance and diagnostic equipment capable of analyzing the core behavior in this frequency range in real time. An important method for monitoring the stability of the reactor core is the power spectral density, which allows the frequencies and amplitudes contained in the signals to be determined. The instrument is implemented in LabVIEW graphical programming with a 16-channel data acquisition card and runs in a Windows 95/98 environment. (Author)
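
    A minimal sketch of the core computation such an instrument performs: estimating the power spectral density of a flux signal and locating the dominant oscillation in the 0-2.5 Hz band. It uses Welch's method from SciPy on a synthetic signal; the sampling rate, oscillation frequency, and noise level are assumptions, and the real instrument is implemented in LabVIEW rather than Python.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 10.0                                   # assumed sampling rate of the flux signal, Hz
    t = np.arange(0, 600, 1 / fs)
    rng = np.random.default_rng(1)

    # synthetic neutron-flux signal: a weak 0.5 Hz core oscillation buried in noise
    flux = np.sin(2 * np.pi * 0.5 * t) + 2.0 * rng.standard_normal(t.size)

    f, psd = welch(flux, fs=fs, nperseg=1024)
    band = (f >= 0.0) & (f <= 2.5)              # the 0-2.5 Hz band of interest
    peak = f[band][np.argmax(psd[band])]
    print(f"dominant core-oscillation frequency: {peak:.2f} Hz")
    ```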

  7. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    The Performance Evaluation Process Algebra, PEPA, was introduced by Jane Hillston as a stochastic process algebra for modelling distributed systems and is especially suitable for performance evaluation. We present a static analysis that very precisely approximates the control structure of processes expressed in PEPA. The analysis technique we adopted is Data Flow Analysis. We begin the analysis by defining an appropriate transfer function; then, with the classical worklist algorithm, we construct a finite automaton that captures all possible interactions among processes. By annotating labels and layers to PEPA programs, the approximating result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure of more than one thousand states.
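
    The classical worklist algorithm mentioned above can be sketched generically as follows; this is not the PEPA-specific analysis with its transfer function and label/layer annotations, just a small illustration of the fixed-point iteration it relies on, with a toy control-flow graph as input.

    ```python
    def worklist_analysis(nodes, successors, transfer, init):
        """Generic forward dataflow analysis using the classical worklist algorithm.

        nodes:      iterable of program points
        successors: dict mapping a node to its successor nodes
        transfer:   function (node, fact_set) -> fact_set
        init:       initial fact set for every node
        """
        facts = {n: set(init) for n in nodes}
        worklist = list(nodes)
        while worklist:
            n = worklist.pop()
            out = transfer(n, facts[n])
            for s in successors.get(n, ()):
                if not out <= facts[s]:          # new information reaches s
                    facts[s] |= out
                    worklist.append(s)
        return facts

    # toy example: propagate the set of "seen" labels along a tiny control-flow graph
    succ = {"a": ["b"], "b": ["c", "a"], "c": []}
    result = worklist_analysis(succ, succ, lambda n, f: f | {n}, init=set())
    print(result)
    ```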

  8. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and an Aggregatibacter actinomycetemcomitans strain clinically isolated from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 regions were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of the strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to traditional methods, of fastidious buccal microorganisms associated with the etiology of periodontitis.

  9. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of host analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays.

  10. Using wavelet features for analyzing gamma lines

    International Nuclear Information System (INIS)

    Medhat, M.E.; Abdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Uzhinskii, V.V.

    2004-01-01

    Data processing methods for analyzing gamma-ray spectra with symmetric, bell-shaped peaks are considered. In many cases the peak form is a symmetric bell shape; in particular, the Gaussian form is most often used, for many physical reasons. The problem is how to evaluate the parameters of such peaks, i.e. their positions, amplitudes and half-widths, both for single peaks and for overlapping peaks. Using wavelet features, with the Marr (Mexican hat) wavelet as a correlation kernel, it is possible to estimate the optimal wavelet parameters and to locate peaks in the spectrum. Comparison of the proposed method with others shows the better quality of the wavelet transform method.
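
    A small sketch of the idea: correlating the spectrum with a Marr (Mexican hat) kernel suppresses a smooth background and highlights bell-shaped peaks, whose channels can then be picked as local maxima of the response. The wavelet width, threshold, and synthetic spectrum below are assumptions for illustration, not parameters from the paper.

    ```python
    import numpy as np

    def ricker(points, width):
        """Marr ('Mexican hat') wavelet sampled on `points` points."""
        t = np.arange(points) - (points - 1) / 2.0
        a = 1.0 / (width ** 2)
        return (1 - a * t ** 2) * np.exp(-0.5 * a * t ** 2)

    def locate_peaks(spectrum, width=4.0, threshold=5.0):
        """Correlate with a Mexican-hat kernel and return candidate peak channels."""
        kernel = ricker(int(10 * width), width)
        response = np.convolve(spectrum, kernel, mode="same")
        return [i for i in range(1, len(response) - 1)
                if response[i] > threshold
                and response[i] >= response[i - 1]
                and response[i] >= response[i + 1]]

    # synthetic gamma spectrum: two Gaussian peaks on a flat background plus noise
    rng = np.random.default_rng(2)
    ch = np.arange(1024)
    spectrum = (50 * np.exp(-0.5 * ((ch - 300) / 5.0) ** 2)
                + 30 * np.exp(-0.5 * ((ch - 700) / 8.0) ** 2)
                + rng.poisson(5, ch.size))
    print(locate_peaks(spectrum, width=5.0, threshold=40.0))   # channels near 300 and 700
    ```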

  11. Coke from small-diameter tubes analyzed

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    The mechanism for coke deposit formation and the nature of the coke itself can vary with the design of the ethylene furnace tube bank. In this article, coke deposits from furnaces with small-diameter pyrolysis tubes are examined. The samples were taken from four furnaces of identical design (Plant B). As in both the first and second installments of the series, the coke deposits were examined using a scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX). The deposits from the small-diameter tubes are compared with the coke deposits from the furnace discussed in earlier articles. Analysis of the coke in both sets of samples is then used to offer recommendations for improved decoking procedures, operating procedures, better feed selection, and better selection of the metallurgy used in furnace tubes, to extend the operating time of the furnace tubes by reducing the amount and type of coke buildup.

  12. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.

  13. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  14. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    ... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) a robust business ... This thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. Furthermore, it serves as an empirical example of designing a software ecosystem.

  15. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping to find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  16. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  17. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    ... as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) a robust business ...

  18. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn [Oak Ridge, TN]; Chen, Da-Ren [Creve Coeur, MO]

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at each of a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and the most distal of the sampling outlets, forms a classifying region, the first and second electrodes being charged to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into the upstream end of the classifying region. Each sampling outlet functions as an independent DMA stage and classifies a different size range of charged particles, based on electrical mobility, simultaneously.
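
    For orientation, each DMA stage selects particles around a centroid electrical mobility; a commonly quoted textbook relation for a cylindrical geometry is Z* = Q_sh ln(r2/r1) / (2 pi L V). The sketch below evaluates this relation for made-up dimensions and flows; it is not taken from the patent and ignores the transfer-function width and multiple charging.

    ```python
    import math

    def selected_mobility(q_sheath_lpm, voltage, length_m, r_inner_m, r_outer_m):
        """Centroid electrical mobility Z* selected by a cylindrical DMA stage.

        Z* = Q_sh * ln(r_outer / r_inner) / (2 * pi * L * V)
        (standard Knutson-Whitby relation; the dimensions used below are
        illustrative and not taken from the patent).
        """
        q_sheath = q_sheath_lpm / 1000.0 / 60.0          # L/min -> m^3/s
        return q_sheath * math.log(r_outer_m / r_inner_m) / (2.0 * math.pi * length_m * voltage)

    # example: 10 L/min sheath flow, 2 kV, 0.2 m long stage, 9.4/19.6 mm radii
    z_star = selected_mobility(10.0, 2000.0, 0.2, 0.0094, 0.0196)
    print(f"selected mobility: {z_star:.3e} m^2 V^-1 s^-1")
    ```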

  19. Analyzing use cases for knowledge acquisition

    Science.gov (United States)

    Kelsey, Robert L.; Webster, Robert B.

    2000-03-01

    The analysis of use cases describing the construction of simulation configuration files in a data/information management system can lead to the acquisition of new information and knowledge. In this application, a user creates a use case with an eXtensible Markup Language (XML) description representing a configuration file for simulation of a physical system. Intelligent agents analyze separate versions of a user's XML descriptions and additionally make comparisons of the descriptions with examples from a library of use cases. The agents can then make recommendations to the user on how to proceed or on whether tutoring is necessary. In a proof-of-concept test, new information is acquired and a user learns from the agent-facilitated tutoring.

  20. Spectroscopic methods to analyze drug metabolites.

    Science.gov (United States)

    Yi, Jong-Jae; Park, Kyeongsoon; Kim, Won-Je; Rhee, Jin-Kyu; Son, Woo Sung

    2018-03-09

    Drug metabolites have been monitored with various types of newly developed techniques and/or combinations of common analytical methods, which can provide a great deal of information on metabolite profiling. Because it is not easy to analyze whole drug metabolites qualitatively and quantitatively with any single technique, analytical techniques are combined in a multilateral manner to cover the widest possible range of drug metabolites. Mass-based spectroscopic analysis of drug metabolites has been expanded with the help of other parameter-based methods. The current development of metabolism studies in contemporary pharmaceutical research is reviewed, with an overview of conventionally used spectroscopic methods. Several technical approaches for conducting drug metabolic profiling through spectroscopic methods are discussed in depth.

  1. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...
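
    As a flavor of the programming model the talk covers, here is the canonical Hadoop Streaming word count with a single Python script acting as mapper or reducer; file names, paths, and the submission command below are placeholders, not material from the talk.

    ```python
    #!/usr/bin/env python
    """Word count for Hadoop Streaming: run as `wordcount.py map` or `wordcount.py reduce`."""
    import sys

    def mapper():
        # emit "word<TAB>1" for every word read from stdin
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        # sum counts per word; Hadoop sorts mapper output by key before the reducer runs
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rsplit("\t", 1)
            if word != current:
                if current is not None:
                    print(f"{current}\t{total}")
                current, total = word, 0
            total += int(count)
        if current is not None:
            print(f"{current}\t{total}")

    if __name__ == "__main__":
        mapper() if sys.argv[1:] == ["map"] else reducer()
    ```

    A job using it would be submitted along the lines of `hadoop jar hadoop-streaming.jar -input books -output counts -mapper 'python wordcount.py map' -reducer 'python wordcount.py reduce' -file wordcount.py`, with the exact streaming jar path depending on the installation.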

  2. Orthopedic surgical analyzer for percutaneous vertebroplasty

    Science.gov (United States)

    Tack, Gye Rae; Choi, Hyung Guen; Lim, Do H.; Lee, Sung J.

    2001-05-01

    Since the spine is one of the most complex joint structures in the human body, its surgical treatment requires careful planning and a high degree of precision to avoid any unwanted neurological compromise. In addition, comprehensive biomechanical analysis can be very helpful because the spine is subject to a variety of loads. In the case of the osteoporotic spine, in which structural integrity has been compromised, the surgeon faces a double challenge, both clinical and biomechanical. Thus, we have been developing an integrated medical image system that is capable of addressing both. This system is called the orthopedic surgical analyzer, and it combines the clinical results from image-guided examination with the biomechanical data from finite element analysis. In order to demonstrate its feasibility, the system was applied to percutaneous vertebroplasty. Percutaneous vertebroplasty is a surgical procedure that has recently been introduced for the treatment of compression fractures of osteoporotic vertebrae. It involves puncturing the vertebra and filling it with polymethylmethacrylate (PMMA). Recent studies have shown that the procedure can provide structural reinforcement for osteoporotic vertebrae while being minimally invasive and safe, with immediate pain relief. However, treatment failures due to excessive PMMA injection volume have been reported as one of the complications. It is believed that control of the PMMA volume is one of the most critical factors for reducing the incidence of complications. Since the degree of osteoporosis can influence the porosity of the cancellous bone in the vertebral body, the injection volume can differ from patient to patient. In this study, the optimal volume of PMMA injection for vertebroplasty was predicted based on image analysis of a given patient. In addition, biomechanical effects due to changes in PMMA volume and bone mineral density (BMD) level were investigated by constructing clinically

  3. Experiment E89-044 on quasi-elastic 3He(e,e'p) scattering at Jefferson Laboratory: analysis of two-body breakup cross sections in parallel kinematics; Experience E89-044 de diffusion quasi-elastique 3he(e,e'p) au Jefferson Laboratory : analyse des sections efficaces de desintegration a deux corps en cinematique parallele

    Energy Technology Data Exchange (ETDEWEB)

    Penel-Nottaris, Emilie [Univ. Joseph Fourier Grenoble (France)

    2004-07-01

    The Jefferson Lab Hall A experiment has measured the 3He(e,e'p) reaction cross sections. The separation of the longitudinal and transverse response functions for the two-body breakup reaction in parallel kinematics allows one to study the electromagnetic properties of the bound proton in the 3He nucleus and the nuclear mechanisms involved beyond the impulse approximation. Preliminary cross sections show some disagreement with theoretical predictions for the forward-angle kinematics around 0 MeV/c missing momentum, and sensitivity to final-state interactions and the 3He wave function for missing momenta of 300 MeV/c.

  4. Real-Time IPMI Protocol Analyzer

    CERN Document Server

    Kozak, T; Makowski, D

    2011-01-01

    The Advanced Telecommunications Computing Architecture (ATCA) is a modern platform which is gaining popularity not only in telecommunications but also in other fields such as High Energy Physics (HEP) experiments. Computing systems based on ATCA provide high performance and efficiency and are characterized by significant reliability, availability and serviceability. ATCA offers these features because of an integrated management system realized by the Intelligent Platform Management Interface (IPMI), implemented on a dedicated Intelligent Platform Management Controller (IPMC). An IPMC is required on each ATCA board to fulfill the ATCA standard and is responsible for many vital procedures performed to support proper operation of an ATCA system. It covers, among others, activation and deactivation of modules, monitoring of actual parameters and controlling fans. The commercially available IPMI implementations are expensive and often not suited to the demands of specific ATCA applications and available hardware. Thus, many r...

  5. A framework to analyze emissions implications of ...

    Science.gov (United States)

    Future-year emissions depend highly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes, while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA's energy systems model with an economic Input-Output (I/O) Life Cycle Assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize the exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the US economy is constructed to link up with MARKAL. The I/O model enables the user to change input requirements (e.g. energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end-users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
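
    At the core of such an I/O model is the Leontief calculation: given a technical-coefficient matrix A and a final-demand vector y, total output is x = (I - A)^{-1} y, and energy or emissions follow by applying sectoral intensity coefficients. The three-sector numbers in the sketch below are invented for illustration and have nothing to do with the EPAUS9r database.

    ```python
    import numpy as np

    # technical-coefficient matrix A[i, j]: input from sector i per unit output of sector j
    A = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.10, 0.10],
                  [0.05, 0.30, 0.05]])
    final_demand = np.array([100.0, 50.0, 80.0])          # e.g. household plus export demand
    energy_intensity = np.array([0.8, 2.5, 1.2])          # energy use per unit output (assumed)

    # Leontief inverse: total (direct + indirect) output required to meet final demand
    x = np.linalg.solve(np.eye(3) - A, final_demand)
    total_energy = energy_intensity @ x

    print("sector outputs:", np.round(x, 1))
    print("total energy requirement:", round(total_energy, 1))

    # changing an input requirement (e.g. a lower energy intensity for sector 2) is a matter
    # of editing the coefficients and re-solving, which is how scenarios are explored
    ```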

  6. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our initial results, after content analysis, were poor owing to reliance on spontaneous recall. Therefore, we repeated the study in a more focused way. The information gathered this time allowed us not only to better understand how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  7. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Background: PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results: In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion: In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.

  8. Analyzing Architecture of Mithraism Rock Temples

    Directory of Open Access Journals (Sweden)

    Zohre AliJabbari

    2017-06-01

    This paper analyzes the architecture of the rock temples of western and northwestern Iran, as well as the factors influencing their formation. The creation of rock architecture in this area of Iran was influenced by the religious, geographical and political atmosphere of its time. Most of these structures were formed by the dominant empires of the first millennium BC, and in some works we can observe their continuity into later periods along with changes in their function. One of the reasons that attracted people to mountain and rock in different schools of belief was the religious structure of the community. Owing to the sanctity of mountains and rocks in the ancient religions, especially in Mithraism, valuable temples and places of worship emerged in the mountains. Their distinguishing characteristics are circular dome-shaped spaces, simplicity, the arrangement of spaces, and the way light is brought in, all of which correspond with the tradition of Mithraism in Iran. The Mehr Temple in Maragheh, Dashkasan in Zanjan, and the Qadamgah Temple in Azarshahr are rock temples in northwest Iran whose signs and symbols indicate that Mithraic rites were performed in them. In western Iran, the Cogan cave in Lorestan, given its characteristics of Mithraic temples, functioned for a period as a temple for the worship of Mithra. This research investigates the architectural features of these temples.

  9. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

    Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula so as to better prepare engineers to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students were surveyed in two undergraduate engineering programs to assess their leadership skills. The results in both programs revealed that undergraduate engineering students lag behind in visionary leadership skills compared to the directing, including and cultivating leadership styles. Recommendation: A practical framework is proposed to enhance the lacking leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and design a virtual simulation environment targeted at the lacking skills, in this case visionary leadership. The virtual simulation will then be used to provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  10. Analyzing human errors in flight mission operations

    Science.gov (United States)

    Bruno, Kristin J.; Welz, Linda L.; Barnes, G. Michael; Sherif, Josef

    1993-01-01

    A long-term program is in progress at JPL to reduce the cost and risk of flight mission operations through a defect prevention/error management program. The main thrust of this program is to create an environment in which the performance of the total system, both the human operator and the computer system, is optimized. To this end, 1580 Incident Surprise Anomaly reports (ISAs) from 1977-1991 were analyzed from the Voyager and Magellan projects. A Pareto analysis revealed that 38 percent of the errors were classified as human errors. A preliminary cluster analysis based on the Magellan human errors (204 ISAs) is presented here. The resulting clusters describe the underlying relationships among the ISAs. Initial models of human error in flight mission operations are presented. Next, the Voyager ISAs will be scored and included in the analysis. Eventually, these relationships will be used to derive a theoretically motivated and empirically validated model of human error in flight mission operations. Ultimately, this analysis will be used to make continuous process improvements to end-user applications and training requirements. This Total Quality Management approach will enable the management and prevention of errors in the future.

  11. Complete denture analyzed by optical coherence tomography

    Science.gov (United States)

    Negrutiu, Meda L.; Sinescu, Cosmin; Todea, Carmen; Podoleanu, Adrian G.

    2008-02-01

    Complete dentures are currently made using different technologies. In order to avoid deficiencies of prostheses made using the classical technique, several alternative systems and procedures have been devised, directly related to the material used and also to the manufacturing technology. Thus, at the present time, there are several injection systems and technologies on the market that use chemoplastic materials, which are heat cured (90-100°C) in a dry or wet environment, or cold cured (below 60°C). There are also technologies that plasticize a hard cured material by thermoplastic processing (without any chemical changes) and then inject it into a mold. The purpose of this study was to analyze the existence of possible defects in several dental prostheses using a non-invasive method, before their insertion in the mouth. Different dental prostheses, fabricated from various materials, were investigated using en-face optical coherence tomography. In order to discover the defects, the scanning was made in three planes, obtaining images at different depths, from 0.01 μm to 2 mm. In several of the investigated prostheses we found defects which may cause their fracture. These defects are entirely contained within the prosthesis material and cannot be visualized with other imaging methods. In conclusion, en-face OCT is an important investigative tool for dental practice.

  12. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    As we move into the big data era, data grow not just in size but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. Motivated by this rising challenge, we built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it on publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
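
    One simple way to flag spatiotemporal anomalies of the kind visualized by anomaly grids is to score each grid cell against its own history; the sketch below does this with a per-cell z-score on synthetic event counts. This is only an illustrative baseline, not the mining algorithms used in the tool.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # counts[t, y, x]: e.g. number of events per day in each grid cell
    counts = rng.poisson(10, size=(60, 8, 8)).astype(float)
    counts[45, 2, 3] += 40          # inject an anomalous burst in one cell on day 45

    mean = counts.mean(axis=0)
    std = counts.std(axis=0) + 1e-9
    z = (counts - mean) / std       # z-score of each cell relative to its own history

    t, y, x = np.unravel_index(np.argmax(z), z.shape)
    print(f"most anomalous cell: day {t}, cell ({y}, {x}), z = {z[t, y, x]:.1f}")
    ```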

  13. NRC plant-analyzer development at BNL

    International Nuclear Information System (INIS)

    Wulff, W.

    1983-01-01

    The objective of this program is to develop an LWR engineering plant analyzer capable of performing realistic and accurate simulations of plant transients and Small-Break Loss of Coolant Accidents at real-time and faster than real-time computing speeds and at low costs for preparing, executing and evaluating such simulations. The program is directed toward facilitating reactor safety analyses, on-line plant monitoring, on-line accident diagnosis and mitigation and toward improving reactor operator training. The AD10 of Applied Dynamics International, Ann Arbor, MI, a special-purpose peripheral processor for high-speed systems simulation, is programmed through a PDP-11/34 minicomputer and carries out digital simulations with analog hardware in the input/output loop (up to 256 channels). Analog signals from a control panel are being used now to activate or to disable valves and to trip pump drive motors or regulators without interrupting the simulation. An IBM personal computer with multicolor graphics capabilities and a CRT monitor are used to produce on-line labelled diagrams of selected plant parameters as functions of time

  14. Analyzing dialect variation in historical speech corpora.

    Science.gov (United States)

    Renwick, Margaret E L; Olsen, Rachel M

    2017-07-01

    The Linguistic Atlas of the Gulf States is an extensive audio corpus of sociolinguistic interviews with 1121 speakers from eight southeastern U.S. states. Complete interviews have never been fully transcribed, leaving a wealth of phonetic information unexplored. This paper details methods for large-scale acoustic analysis of this historical speech corpus, providing a fuller picture of Southern speech than offered by previous impressionistic analyses. Interviews from 10 speakers (∼36 h) in southeast Georgia were transcribed and analyzed for dialectal features associated with the Southern Vowel Shift and African American Vowel Shift, also considering the effects of age, gender, and race. Multiple tokens of common words were annotated (N = 6085), and formant values of their stressed vowels were extracted. The effects of shifting on relative vowel placement were evaluated via Pillai scores, and vowel dynamics were estimated via functional data analysis and modeled with linear mixed-effects regression. Results indicate that European American speakers show features of the Southern Vowel Shift, though certain speakers shift in more ways than others, and African American speakers' productions are consistent with the African American Vowel Shift. Wide variation is apparent, even within this small geographic region, contributing evidence of the complexity of Southern speech.

  15. Practical Limitations of Aerosol Separation by a Tandem Differential Mobility Analyzer-Aerosol Particle Mass Analyzer

    OpenAIRE

    Radney, James G.; Zangmeister, Christopher D.

    2016-01-01

    A cavity ring-down spectrometer and condensation particle counter were used to investigate the limitations in the separation of singly and multiply charged aerosol particles by a tandem differential mobility analyzer (DMA) and aerosol particle mass analyzer (APM). The impact of particle polydispersity and morphology was investigated using three materials: nearly-monodisperse polystyrene latex nanospheres (PSL); polydisperse, nearly-spherical ammonium sulfate (AS) and polydisperse lacey fracta...

  16. College Student Environmental Activism: How Experiences and Identities Influence Environmental Activism Approaches

    Science.gov (United States)

    King, Laura A. H.

    2016-01-01

    College student environmental activism is one way students civically engage in addressing social issues. This study explores the environmental activism of twelve college students and how their experiences outside of college and in college influenced their activism. In addition, how students' identities influenced their approach to activism was…

  17. Service-learning abroad: a life-changing experience for nursing students.

    Science.gov (United States)

    Hawkins, Janice Evans; Vialet, Channel L

    2012-01-01

    Incorporating service-learning experiences into nursing education is one way to help prepare students for practice in a global, culturally diverse society. Partnering with a church with a long-term mission program in El Salvador offers the nursing school at Old Dominion University opportunity to develop a service-learning program and support healthcare missions.

  18. Analisa Meal Experience Dan Pengaruhnya Terhadap Minat Beli Ulang Di Restoran Jepang Di Surabaya

    OpenAIRE

    Nathania, Clairine; Tjandra, Catherine; Kristanti, Monika

    2015-01-01

    This study was conducted to analyze differences in the meal experience of consumers of different genders and age groups, and to examine whether the meal experience influences repurchase intention at Japanese restaurants in Surabaya. The meal experience comprises the aspects of food, beverages, service, cleanliness and hygiene, atmosphere, and price and value for money. The analysis techniques used were descriptive quantitative analysis, One Way ANOVA, the Independent Sample t-Test, and linear regression ...

  19. Analyzing microarray data using quantitative association rules.

    Science.gov (United States)

    Georgii, Elisabeth; Richter, Lothar; Rückert, Ulrich; Kramer, Stefan

    2005-09-01

    We tackle the problem of finding regularities in microarray data. Various data mining tools, such as clustering, classification, Bayesian networks and association rules, have been applied so far to gain insight into gene-expression data. Association rule mining techniques used so far work on discretizations of the data and cannot account for cumulative effects. In this paper, we investigate the use of quantitative association rules that can operate directly on numeric data and represent cumulative effects of variables. Technically speaking, this type of quantitative association rules based on half-spaces can find non-axis-parallel regularities. We performed a variety of experiments testing the utility of quantitative association rules for microarray data. First of all, the results should be statistically significant and robust against fluctuations in the data. Next, the approach should be scalable in the number of variables, which is important for such high-dimensional data. Finally, the rules should make sense biologically and be sufficiently different from rules found in regular association rule mining working with discretizations. In all of these dimensions, the proposed approach performed satisfactorily. Therefore, quantitative association rules based on half-spaces should be considered as a tool for the analysis of microarray gene-expression data. The code is available from the authors on request.

  20. Analyzing critical material demand: A revised approach.

    Science.gov (United States)

    Nguyen, Ruby Thuy; Fishman, Tomer; Zhao, Fu; Imholte, D D; Graedel, T E

    2018-03-02

    Apparent consumption has been widely used as a metric to estimate material demand. However, with technology advancement and complexity of material use, this metric has become less useful in tracking material flows, estimating recycling feedstocks, and conducting life cycle assessment of critical materials. We call for future research efforts to focus on building a multi-tiered consumption database for the global trade network of critical materials. This approach will help track how raw materials are processed into major components (e.g., motor assemblies) and eventually incorporated into complete pieces of equipment (e.g., wind turbines). Foreseeable challenges would involve: 1) difficulty in obtaining a comprehensive picture of trade partners due to business sensitive information, 2) complexity of materials going into components of a machine, and 3) difficulty maintaining such a database. We propose ways to address these challenges such as making use of digital design, learning from the experience of building similar databases, and developing a strategy for financial sustainability. We recommend that, with the advancement of information technology, small steps toward building such a database will contribute significantly to our understanding of material flows in society and the associated human impacts on the environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. ANALYZING MANAGERS’ PERCEPTION OF CREATIVITY IN TOURISM

    Directory of Open Access Journals (Sweden)

    Anamaria Sidonia RĂVAR

    2014-12-01

    The past decades have brought new meanings to creativity, as the decline of mass tourism created impetus for the emergence of creative behavior as a major source of competitive advantage in the tourism industry. This led, in turn, to the development of a new type of tourism – creative tourism – which translates into new products and services, new collaboration and partnership structures, new forms of organization and, ultimately, new experiences for consumers of tourism services. However, there is still no consensus on how creativity manifests itself in tourism and how it can be encouraged in order to generate added value for customers. To this end, qualitative research was carried out, based on a structured interview applied to managers of tourism operators from various segments of the tourism value chain. The results reveal differences in how operators approach encouraging creativity among employees, bringing added value to customers through creative services, and building a culture based on creative behavior and practices.

  2. Analyzing Evolutionary Optimization in Noisy Environments.

    Science.gov (United States)

    Qian, Chao; Yu, Yang; Zhou, Zhi-Hua

    2018-01-01

    Many optimization tasks must be handled in noisy environments, where the exact evaluation of a solution cannot be obtained, only a noisy one. For optimization of noisy tasks, evolutionary algorithms (EAs), a type of stochastic metaheuristic search algorithm, have been widely and successfully applied. Previous work mainly focuses on the empirical study and design of EAs for optimization under noisy conditions, while the theoretical understandings are largely insufficient. In this study, we first investigate how noisy fitness can affect the running time of EAs. Two kinds of noise-helpful problems are identified, on which the EAs will run faster with the presence of noise, and thus the noise should not be handled. Second, on a representative noise-harmful problem in which the noise has a strong negative effect, we examine two commonly employed mechanisms dealing with noise in EAs: reevaluation and threshold selection. The analysis discloses that using these two strategies simultaneously is effective for the one-bit noise but ineffective for the asymmetric one-bit noise. Smooth threshold selection is then proposed, which can be proved to be an effective strategy to further improve the noise tolerance ability in the problem. We then complement the theoretical analysis by experiments on both synthetic problems as well as two combinatorial problems, the minimum spanning tree and the maximum matching. The experimental results agree with the theoretical findings and also show that the proposed smooth threshold selection can deal with the noise better.
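
    A minimal sketch of the setting studied above: a (1+1) EA on OneMax under one-bit noise, using the two mechanisms analyzed in the paper, parent reevaluation and threshold selection. The noise probability, threshold value, and problem size are illustrative choices, and the true fitness is used only as a stopping check for the simulation.

    ```python
    import random

    def noisy_onemax(x, p_noise=0.3):
        """OneMax fitness, but with probability p_noise one uniformly chosen bit is
        flipped before evaluation (the 'one-bit noise' model)."""
        y = list(x)
        if random.random() < p_noise:
            i = random.randrange(len(y))
            y[i] = 1 - y[i]
        return sum(y)

    def one_plus_one_ea(n=30, threshold=1, max_evals=100_000):
        """(1+1) EA with parent reevaluation and threshold selection."""
        parent = [random.randint(0, 1) for _ in range(n)]
        evals = 0
        # the true fitness sum(parent) is used only to stop the simulation
        while sum(parent) < n and evals < max_evals:
            offspring = [1 - b if random.random() < 1.0 / n else b for b in parent]
            f_parent = noisy_onemax(parent)          # reevaluation: parent is re-sampled too
            f_offspring = noisy_onemax(offspring)
            evals += 2
            if f_offspring >= f_parent + threshold:  # threshold selection
                parent = offspring
        return evals, sum(parent)

    random.seed(0)
    print(one_plus_one_ea())
    ```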

  3. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
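
    One of the listed diagnostics, the time-lagged correlation map, can be illustrated with a short sketch; the two series below are synthetic stand-ins, not CMDA data or CMDA code.

      import numpy as np

      def lagged_correlation(x, y, max_lag):
          # Pearson correlation for each lag in [-max_lag, max_lag];
          # a positive lag pairs x[t] with y[t + lag] (y responding after x).
          corrs = {}
          for lag in range(-max_lag, max_lag + 1):
              if lag > 0:
                  a, b = x[:-lag], y[lag:]
              elif lag < 0:
                  a, b = x[-lag:], y[:lag]
              else:
                  a, b = x, y
              corrs[lag] = float(np.corrcoef(a, b)[0, 1])
          return corrs

      if __name__ == "__main__":
          t = np.arange(500)
          driver = np.sin(2 * np.pi * t / 50) + 0.1 * np.random.randn(t.size)
          response = np.roll(driver, 5) + 0.1 * np.random.randn(t.size)  # lags the driver by 5 steps
          best_lag, best_r = max(lagged_correlation(driver, response, 10).items(),
                                 key=lambda kv: kv[1])
          print(f"strongest correlation r = {best_r:.2f} at lag {best_lag}")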

  4. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods does not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
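
    As a rough illustration of the idea behind label-aware entropic measures (not the specific indices derived in the paper), the sketch below computes the Shannon entropy of the vertex- and edge-label distributions of a small, made-up molecular graph.

      import math
      from collections import Counter

      def label_entropy(labels):
          # Shannon entropy (bits) of a vertex- or edge-label multiset.
          counts = Counter(labels)
          total = sum(counts.values())
          return -sum((c / total) * math.log2(c / total) for c in counts.values())

      if __name__ == "__main__":
          # Hypothetical atom labels (vertices) and bond types (edges) of a small molecule.
          atom_labels = ["C", "C", "C", "O", "N", "H", "H", "H"]
          bond_labels = ["single", "single", "double", "single", "single", "single", "single"]
          print("vertex-label entropy:", round(label_entropy(atom_labels), 3))
          print("edge-label entropy:  ", round(label_entropy(bond_labels), 3))

    An unlabeled graph has zero entropy under this simple measure, which loosely reflects why taking vertex- and edge-labels into account increases the uniqueness (discriminating power) of such indices.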

  5. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer(NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user defined, digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed

  6. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
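
    A minimal sketch of the likelihood-ratio decision rule described above, assuming, purely for illustration, independent Gaussian score models under the genuine and imposter hypotheses; the score statistics and threshold are invented and are not taken from the Indian program data.

      from scipy.stats import norm

      # Hypothetical per-modality score statistics (mean, std) under each hypothesis.
      GENUINE = {"finger": (0.80, 0.10), "iris": (0.90, 0.05)}
      IMPOSTER = {"finger": (0.30, 0.15), "iris": (0.20, 0.10)}

      def log_likelihood_ratio(scores):
          # Sum of per-score log likelihood ratios, assuming independence across biometrics.
          llr = 0.0
          for modality, s in scores:
              mu_g, sd_g = GENUINE[modality]
              mu_i, sd_i = IMPOSTER[modality]
              llr += norm.logpdf(s, mu_g, sd_g) - norm.logpdf(s, mu_i, sd_i)
          return llr

      def verify(scores, threshold=0.0):
          # Accept as genuine when the aggregated log likelihood ratio exceeds the threshold.
          return log_likelihood_ratio(scores) > threshold

      if __name__ == "__main__":
          acquired = [("finger", 0.76), ("iris", 0.88)]  # scores from the acquired image subset
          print("genuine" if verify(acquired) else "imposter")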

  7. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.

  8. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers, whose interests are tangled and vary widely across academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc.) or non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also holds a different academic stance in his/her preferences regarding academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will help to nurture young researchers, to improve the standard of research and education, and to boost academia-industry collaboration. In particular, as open innovation increasingly needs the involvement of university researchers, companies must comprehend the behavior of university researchers, who have multiple complex motivations, in order to establish a successful approach to enticing researchers into enterprise research. The paper explores academic researchers' behavior by optimizing their utility functions, i.e. the satisfaction obtained from their research outputs. The paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most previous research used empirical methods to study researchers' motivation; without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researchers' behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researchers' behavior. Keywords: academia-industry, researcher behavior, Ulrich model's 3C.

  9. Analysis of Variance in the Modern Design of Experiments

    Science.gov (United States)

    Deloach, Richard

    2010-01-01

    This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
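
    For readers who want to try the one-way fixed-effects case on their own data, a minimal SciPy sketch is shown below; the three groups of measurements are invented and do not come from the tutorial.

      import numpy as np
      from scipy.stats import f_oneway

      # Hypothetical drag-coefficient measurements for three wind-tunnel configurations.
      config_a = np.array([0.312, 0.305, 0.309, 0.315, 0.308])
      config_b = np.array([0.298, 0.301, 0.296, 0.303, 0.299])
      config_c = np.array([0.321, 0.318, 0.325, 0.319, 0.322])

      # One-way fixed-effects ANOVA: does at least one configuration mean differ?
      f_stat, p_value = f_oneway(config_a, config_b, config_c)
      print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
      print("reject H0 at the 5% level" if p_value < 0.05 else "fail to reject H0 at the 5% level")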

  10. Analyzing Non Stationary Processes in Radiometers

    Science.gov (United States)

    Racette, Paul

    2010-01-01

    The lack of well-developed techniques for modeling changing statistical moments in our observations has stymied the application of stochastic process theory for many scientific and engineering applications. Non linear effects of the observation methodology is one of the most perplexing aspects to modeling non stationary processes. This perplexing problem was encountered when modeling the effect of non stationary receiver fluctuations on the performance of radiometer calibration architectures. Existing modeling approaches were found not applicable; particularly problematic is modeling processes across scales over which they begin to exhibit non stationary behavior within the time interval of the calibration algorithm. Alternatively, the radiometer output is modeled as samples from a sequence random variables; the random variables are treated using a conditional probability distribution function conditioned on the use of the variable in the calibration algorithm. This approach of treating a process as a sequence of random variables with non stationary stochastic moments produce sensible predictions of temporal effects of calibration algorithms. To test these model predictions, an experiment using the Millimeter wave Imaging Radiometer (MIR) was conducted. The MIR with its two black body calibration references was configured in a laboratory setting to observe a third ultra-stable reference (CryoTarget). The MIR was programmed to sequentially sample each of the three references in approximately a 1 second cycle. Data were collected over a six-hour interval. The sequence of reference measurements form an ensemble sample set comprised of a series of three reference measurements. Two references are required to estimate the receiver response. A third reference is used to estimate the uncertainty in the estimate. Typically, calibration algorithms are designed to suppress the non stationary effects of receiver fluctuations. By treating the data sequence as an ensemble
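
    The two-reference calibration step that underlies the experiment can be sketched as follows; the linear-receiver assumption, counts, and reference temperatures are illustrative and do not describe the actual MIR processing chain.

      def two_point_calibration(counts_hot, counts_cold, t_hot, t_cold):
          # Gain and offset of an assumed linear response: counts = gain * T + offset.
          gain = (counts_hot - counts_cold) / (t_hot - t_cold)
          offset = counts_cold - gain * t_cold
          return gain, offset

      def brightness_temperature(counts, gain, offset):
          # Invert the linear response to estimate the observed brightness temperature.
          return (counts - offset) / gain

      if __name__ == "__main__":
          # Hypothetical single calibration cycle using two black-body references.
          gain, offset = two_point_calibration(counts_hot=1520.0, counts_cold=1180.0,
                                               t_hot=330.0, t_cold=290.0)
          # The third, ultra-stable reference is then used to check the calibrated estimate.
          print("estimated third-reference temperature:",
                round(brightness_temperature(1195.0, gain, offset), 2), "K")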

  11. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers’ instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI [17-34 kg/m2 (mean 26.6 ± 5)]. A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16) acceptable correlations were obtained in TBW (r 0.99; p<0.01), ICW (0.92; p<0.01), BCM (0.68; p<0.01), and ECW (0.96; p<0.05), but those with BMI ≥25 (n=23) showed a discrepancy (lower correlations) in TBW (r 0.82; p<0.05), ICW (0.78; p<0.05), BCM (0.52; p<0.05), and ECW (0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.
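
    The comparison reported above reduces to computing device-to-device correlations within BMI strata; a minimal sketch with invented paired measurements (not the study data) is:

      import numpy as np
      from scipy.stats import pearsonr

      # Hypothetical paired total-body-water estimates (litres) from the MF and SF analyzers,
      # together with each patient's BMI.
      tbw_mf = np.array([38.2, 41.5, 35.7, 44.0, 39.3, 47.1, 42.8, 36.9])
      tbw_sf = np.array([38.0, 41.9, 35.4, 45.2, 39.0, 49.0, 44.1, 36.6])
      bmi = np.array([22.1, 24.3, 21.0, 27.5, 23.8, 31.2, 29.0, 22.9])

      for label, mask in [("BMI < 25", bmi < 25), ("BMI >= 25", bmi >= 25)]:
          r, p = pearsonr(tbw_mf[mask], tbw_sf[mask])
          print(f"{label}: r = {r:.2f}, p = {p:.3f}, n = {mask.sum()}")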

  12. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data, to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run in a scheduled task or on demand by way of a config file. It will compute metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. As Seedscan is scheduled to run every night, data quality analysts are able to then use the
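
    A toy sketch of the metric-aggregation idea follows; the metric names, 0-100 scaling, and weights are placeholders rather than the DQA's actual schema.

      def station_grade(metrics, weights):
          # Weighted average of per-metric scores, each assumed already scaled to 0-100.
          total_weight = sum(weights[name] for name in metrics)
          return sum(score * weights[name] for name, score in metrics.items()) / total_weight

      if __name__ == "__main__":
          # Hypothetical daily metric scores for one station.
          metrics = {"availability": 99.2, "timing_quality": 96.5,
                     "gap_count": 92.0, "noise_model_deviation": 88.4}
          # User-adjustable weights, mirroring the website's re-weighting feature.
          weights = {"availability": 2.0, "timing_quality": 1.0,
                     "gap_count": 1.0, "noise_model_deviation": 1.5}
          print(f"station grade: {station_grade(metrics, weights):.1f}")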

  13. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    To summarize the information, a data-grouping mechanism provides, for different groups of records, the number of entries and the maximum, minimum, and average values. Results. This technology has been tested in monitoring the requirements for supplementary professional education services and in determining the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours; the survey itself was conducted over one month. Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without the need for programming, with flexible assignment of the operating logic for each form.
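
    The grouping mechanism described above corresponds to a standard relational aggregation; a sketch over invented survey records (the column names are hypothetical) is:

      import pandas as pd

      # Hypothetical survey responses: municipality, respondent role, and a 1-5 interest score.
      records = pd.DataFrame({
          "municipality": ["Irkutsk", "Irkutsk", "Bratsk", "Bratsk", "Angarsk"],
          "role": ["teacher", "executive", "teacher", "teacher", "executive"],
          "interest": [4, 5, 3, 4, 2],
      })

      # Number of entries plus maximum, minimum, and average per group.
      summary = records.groupby("municipality")["interest"].agg(["count", "max", "min", "mean"])
      print(summary)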

  14. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  15. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same

  16. Science Opportunity Analyzer (SOA) Version 8

    Science.gov (United States)

    Witoff, Robert J.; Polanskey, Carol A.; Aguinaldo, Anna Marie A.; Liu, Ning; Hofstadter, Mark D.

    2013-01-01

    both JPL and the sponsoring spacecraft. SOA is able to ingest JPL SPICE Kernels that are used to drive the tool and its computations. A Percy search engine is then included that identifies interesting time periods for the user to build observations. When observations are then built, flight-like orientation algorithms replicate spacecraft dynamics to closely simulate the flight spacecraft s dynamics. SOA v8 represents large steps forward from SOA v7 in terms of quality, reliability, maintainability, efficiency, and user experience. A tailored agile development environment has been built around SOA that provides automated unit testing, continuous build and integration, a consolidated Web-based code and documentation storage environment, modern Java enhancements, and a focus on usability

  17. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory s sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth s crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. Basically all organisms on our planet generate energy through the Krebs Cycle, explains Mike Flynn, research scientist at NASA s Ames Research Center. This metabolic process breaks down sugars for energy

  18. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    (https://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...

  19. Coaches’ perception of professional competence as a function of education and experience

    Directory of Open Access Journals (Sweden)

    Isabel Maria Ribeiro Mesquita

    2010-06-01

    Full Text Available The aim of this study was to analyze coaches’ perception of professional competence as a function of experience and academic education. The sample consisted of 343 Portuguese coaches from different sport disciplines. A validated questionnaire was used to collect demographic data and data regarding the perception of professional competence. Exploratory factorial analysis and comparative inferential analysis by one-way ANOVA were used as statistical procedures. The results highlighted the wide range of competences necessary for an efficient coach. The answers provided by the coaches revealed 5 factors: planning; leadership and coach education; competition planning and orientation; personal competence, and training orientation. Although all these factors were at least classified as reasonably important, factors related to competition were considered to be the most important. Comparative analysis showed that professional experience and academic education had a different impact on the perceived importance of competence. More experienced coaches perceived competences related to planning, training orientation, and leadership and coach education to be more important than less experienced coaches. Coaches with higher education considered competences related to planning, leadership and coach education, training orientation and personal competence to be more important than undergraduate coaches.

  20. Solid State Neutral Particle Analyzer Array on NSTX

    International Nuclear Information System (INIS)

    Shinohara, K.; Darrow, D.S.; Roquemore, A.L.; Medley, S.S.; Cecil, F.E.

    2004-01-01

    A Solid State Neutral Particle Analyzer (SSNPA) array has been installed on the National Spherical Torus Experiment (NSTX). The array consists of four chords viewing through a common vacuum flange. The tangency radii of the viewing chords are 60, 90, 100, and 120 cm. They view across the three co-injection neutral beam lines (deuterium, 80 keV (typ.) with tangency radii 48.7, 59.2, and 69.4 cm) on NSTX and detect co-going energetic ions. A silicon photodiode used was calibrated by using a mono-energetic deuteron beam source. Deuterons with energy above 40 keV can be detected with the present setup. The degradation of the performance was also investigated. Lead shots and epoxy are used for neutron shielding to reduce handling any hazardous heavy metal. This method also enables us to make an arbitrary shape to be fit into the complex flight tube

  1. Assessing High Impact Practices Using NVivo: An Automated Approach to Analyzing Student Reflections for Program Improvement

    Science.gov (United States)

    Blaney, Jennifer; Filer, Kimberly; Lyon, Julie

    2014-01-01

    Critical reflection allows students to synthesize their learning and deepen their understanding of an experience (Ash & Clayton, 2009). A recommended reflection method is for students to write essays about their experiences. However, on a large scale, such reflection essays become difficult to analyze in a meaningful way. At Roanoke College,…

  2. Chromatin immunoprecipitation to analyze DNA binding sites of HMGA2.

    Directory of Open Access Journals (Sweden)

    Nina Winter

    Full Text Available BACKGROUND: HMGA2 is an architectonic transcription factor abundantly expressed during embryonic and fetal development and it is associated with the progression of malignant tumors. The protein harbours three basically charged DNA binding domains and an acidic protein binding C-terminal domain. DNA binding induces changes of DNA conformation and hence results in global overall change of gene expression patterns. Recently, using a PCR-based SELEX (Systematic Evolution of Ligands by Exponential Enrichment) procedure, two consensus sequences for HMGA2 binding have been identified. METHODOLOGY/PRINCIPAL FINDINGS: In this investigation chromatin immunoprecipitation (ChIP) experiments and bioinformatic methods were used to analyze if these binding sequences can be verified on chromatin of living cells as well. CONCLUSION: After quantification of HMGA2 protein in different cell lines the colon cancer derived cell line HCT116 was chosen for further ChIP experiments because of its 3.4-fold higher HMGA2 protein level. 49 DNA fragments were obtained by ChIP. These fragments containing HMGA2 binding sites have been analyzed for their AT-content, location in the human genome and similarities to sequences generated by a SELEX study. The sequences show a significantly higher AT-content than the average of the human genome. The artificially generated SELEX sequences and short BLAST alignments (11 and 12 bp) of the ChIP fragments from living cells show similarities in their organization. The flanking regions are AT-rich, whereas a lower conservation is present in the center of the sequences.
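
    The AT-content comparison at the core of the analysis reduces to a per-fragment calculation like the one sketched below; the fragment sequences are made up.

      def at_content(seq):
          # Fraction of A/T bases in a DNA sequence (case-insensitive).
          seq = seq.upper()
          return (seq.count("A") + seq.count("T")) / len(seq)

      if __name__ == "__main__":
          # Hypothetical ChIP fragments; the human genome averages roughly 0.59 AT.
          fragments = ["ATTTAAATGCGATATTTTAACG", "GGCGCCATTGCAGTA", "AATATTTTAGCTAATTTA"]
          for frag in fragments:
              print(f"{frag}: AT = {at_content(frag):.2f}")
          mean_at = sum(at_content(f) for f in fragments) / len(fragments)
          print(f"mean AT content = {mean_at:.2f}")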

  3. Organization of multidimensional measurements on the UNO-4096-90 analyzer

    International Nuclear Information System (INIS)

    Zajtsev, M.Yu.; Regushevskij, V.I.

    1984-01-01

    The organization of multidimensional analysis on the UNO-4096-90 analyzer and VECTOR system units is described. For the placement of spectrometric units, the analyzer possesses a special frame of input units in which each unit is assigned its own place. The development of the UNO-4096-90 analyzer and VECTOR system converters made it possible to organize multidimensional analysis in one of the following regimes: amplitude-amplitude, time-amplitude, time-time, amplitude-time with the use of a time expander, or multi-transducer amplitude or time. These regimes were tested in operation, and the experience gained shows that the organized multidimensional regimes of analysis satisfy the experiment requirements.

  4. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e. navigational structure as utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies –which constitute the web atmosphere or webmosphere of a website– on shopping human behaviour (i.e. users’ internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e. involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) x 2 (on versus off music) x 2 (moving versus static images) between-subjects computer experimental design is used to test empirically this research. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animate gifts and music background. The effect caused by mediator variables modifies relatively the final shopping human behavior.

  5. Analyzing Multimode Wireless Sensor Networks Using the Network Calculus

    Directory of Open Access Journals (Sweden)

    Xi Jin

    2015-01-01

    Full Text Available The network calculus is a powerful tool to analyze the performance of wireless sensor networks. However, the original network calculus can only model the single-mode wireless sensor network. In this paper, we combine the original network calculus with the multimode model to analyze the maximum delay bound of the flow of interest in the multimode wireless sensor network. There are two combined methods, A-MM and N-MM. The method A-MM models the whole network as a multimode component, and the method N-MM models each node as a multimode component. We prove that the maximum delay bound computed by the method A-MM is tighter than or equal to that computed by the method N-MM. Experiments show that our proposed methods can significantly decrease the analytical delay bound compared with the separate flow analysis method. For a large-scale wireless sensor network with 32 thousand sensor nodes, our proposed methods can decrease the analytical delay bound by about 70%.
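
    For orientation only, the classic single-node, single-mode network-calculus bound that such analyses build on is sketched below (token-bucket arrival curve, rate-latency service curve); the numbers are invented and this is not the A-MM or N-MM method of the paper.

      def single_node_delay_bound(burst, rate, service_rate, latency):
          # Delay bound for arrival curve alpha(t) = burst + rate * t crossing a server with
          # service curve beta(t) = service_rate * max(t - latency, 0); requires rate <= service_rate.
          if rate > service_rate:
              raise ValueError("flow rate exceeds service rate: delay is unbounded")
          return latency + burst / service_rate

      if __name__ == "__main__":
          # Hypothetical sensor flow: 2 kbit burst, 10 kbit/s sustained rate,
          # crossing a node that offers 50 kbit/s after a 5 ms scheduling latency.
          d = single_node_delay_bound(burst=2_000, rate=10_000, service_rate=50_000, latency=0.005)
          print(f"worst-case delay bound: {d * 1000:.1f} ms")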

  6. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on shopping human behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test empirically this research. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animate gifts and music background. The effect caused by mediator variables modifies relatively the final shopping human behavior.
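
    A hedged sketch of how a 2 x 2 x 2 between-subjects design of this kind could be analyzed with a three-way ANOVA in Python (statsmodels); the simulated responses, effect sizes, and column names are invented and do not reproduce the study's analysis.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(0)
      n_per_cell = 20

      # Simulated between-subjects data: one row per participant.
      rows = []
      for structure in ("free", "hierarchical"):
          for music in ("on", "off"):
              for images in ("moving", "static"):
                  effect = 0.5 * (structure == "free") + 0.3 * (music == "on") + 0.2 * (images == "moving")
                  for _ in range(n_per_cell):
                      rows.append({"structure": structure, "music": music, "images": images,
                                   "approach": 3.0 + effect + rng.normal(scale=0.8)})
      data = pd.DataFrame(rows)

      # Full-factorial three-way ANOVA on the (invented) approach-response score.
      model = smf.ols("approach ~ C(structure) * C(music) * C(images)", data=data).fit()
      print(anova_lm(model, typ=2))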

  7. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
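
    The role of the CRLB can be illustrated with a generic numerical sketch for an unbiased estimator under i.i.d. Gaussian measurement noise; the Gaussian rocking-curve model, parameter values, and the eleven angular positions below are placeholders rather than the ABI forward model used in the paper.

      import numpy as np

      def numerical_jacobian(model, theta, params, eps=1e-6):
          # Finite-difference Jacobian of the model with respect to its parameters.
          base = model(theta, params)
          jac = np.zeros((theta.size, params.size))
          for j in range(params.size):
              p = params.copy()
              p[j] += eps
              jac[:, j] = (model(theta, p) - base) / eps
          return jac

      def crlb(model, theta, params, noise_sigma):
          # CRLB covariance: inverse Fisher information for i.i.d. Gaussian noise.
          jac = numerical_jacobian(model, theta, params)
          fisher = jac.T @ jac / noise_sigma**2
          return np.linalg.inv(fisher)

      def rocking_curve(theta, params):
          # Placeholder analyzer response: amplitude, refraction shift, width.
          amplitude, shift, width = params
          return amplitude * np.exp(-0.5 * ((theta - shift) / width) ** 2)

      if __name__ == "__main__":
          angles = np.linspace(-4.0, 4.0, 11)      # e.g. eleven analyzer-crystal positions
          true_params = np.array([1.0, 0.3, 1.5])  # hypothetical amplitude, shift, width
          cov = crlb(rocking_curve, angles, true_params, noise_sigma=0.02)
          for name, var in zip(("amplitude", "refraction shift", "width"), np.diag(cov)):
              print(f"CRLB standard deviation of {name}: {np.sqrt(var):.4f}")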

  8. Lazer - um caminho para aliviar as tensões no ambiente de trabalho em UTI: uma concepção da equipe de enfermagem Recreación - un camino para aliviar las tensiones el ambiente de trabajo en UTI: una concepción del equipo de enfermería Leisure - one way to alleviate tension in the ICU work environment: one conception of the nursing team

    Directory of Open Access Journals (Sweden)

    Maria Elizabeth Roza Pereira

    1997-10-01

    Full Text Available In this study we set out to verify, with an ICU nursing team, how they represent their work and the meaning of this unit, also exploring the sense of leisure and its implications for the professional environment. We worked with DUMAZEDIER's theoretical framework, focusing on leisure in health promotion. We then investigated 10 members of this team at a hospital in Minas Gerais. The group studied consisted mostly of single women between 28 and 45 years of age, predominantly with more than 10 years of experience in the service. Of these, 90% reported liking the work, while acknowledging that it is exhausting and stressful (50%). They identify the work above all as care provision (80%), requiring vocation/self-giving (50%). They question their hard working hours (60%), suggesting a reduction of hours, and they report relationship/communication problems in the service. They conceptualize leisure as fun/unwinding (80%) or relaxation (20%), among others, stressing the need to discuss the subject. They acknowledge tension/stress in the workplace and point to the importance of leisure at work to favor communication and relieve tension. We suggest that nurses invest in this area.

  9. Framework for Analyzing Android I/O Stack Behavior: From Generating theWorkload to Analyzing the Trace

    Directory of Open Access Journals (Sweden)

    Sooman Jeong

    2013-12-01

    Full Text Available Abstract: The existing I/O workload generators and trace capturing tools are not suited to generating or capturing the I/O requests of Android apps. Proper tools are needed to capture and replay real-world workloads on the Android platform in order to verify the results of benchmark tools. This paper introduces the Android Storage Performance Analysis Tool, AndroStep, which is specifically designed for characterizing and analyzing the behavior of the I/O subsystem in Android based devices. AndroStep consists of Mobibench (workload generator), MOST (Mobile Storage Analyzer), and Mobigen (workload replayer). Mobibench is an Android app that generates filesystem as well as SQLite database operations. Mobibench can also vary the number of concurrent threads to examine filesystem scalability in supporting concurrency, e.g., metadata updates and journal file creation/deletion. MOST captures the trace and extracts key filesystem access characteristics such as access pattern with respect to file types, ratio between random vs. sequential access, ratio between buffered vs. synchronous I/O, fraction of metadata accesses, etc. MOST implements a reverse mapping feature (finding an inode for a given block) and retrospective reverse mapping (finding an inode for a deleted file). Mobigen is a trace capturing and replaying tool that is specifically designed to perform the user experiment without actual human intervention. Mobigen records the system calls generated from the user behavior and sanitizes the trace into a replayable form. Mobigen can replay this trace on different Android platforms or with different I/O stack configurations. As an example of using AndroStep, we analyzed the performance of twelve Android smartphones and SQLite performance on five different filesystems. AndroStep makes otherwise time consuming I/O stack analysis extremely versatile. AndroStep makes a significant contribution in terms of shedding light on internal behavior of

  10. Independent multi-channel analyzer with the color display and microcomputer in the CAMAC standard

    International Nuclear Information System (INIS)

    Gyunter, Z.; Elizarov, O.I.; Zhukov, G.P.; Mikhaehlis, B.; Shul'tts, K.Kh.

    1983-01-01

    An independent multi-channel time-of-flight analyzer developed for a spectrometer of polarised neutrons and used in experiments with the IBR-2 pulse reactor is described. Different cyclic modes of measuring are realized in the analyzer, 4 kwords of analyzer momory being assigned for each mode. The spectra are displayed on a colour screen during measurements. Simultaneous displaying of up to 8 spectra is possible. A microcomputer transfers the spectra from the buffer analyzer memory to the microcomputer memory. The accumulated information is transferred to the PDP-11/70 central computer

  11. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    case studies regarding the analysis of clinical datasets produced in the University Hospital of Catanzaro, Italy. Conclusion: DMET Analyzer is a novel tool able to automatically analyse data produced by the DMET-platform in case-control association studies. Using such a tool, users can avoid wasting time on the manual execution of multiple statistical tests, avoiding possible errors and reducing the amount of time needed for a whole experiment. Moreover, annotations and the direct link to external databases may increase the biological knowledge extracted. The system is freely available for academic purposes at: https://sourceforge.net/projects/dmetanalyzer/files/

  12. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    clinical datasets produced in the University Hospital of Catanzaro, Italy. DMET Analyzer is a novel tool able to automatically analyse data produced by the DMET-platform in case-control association studies. Using such a tool, users can avoid wasting time on the manual execution of multiple statistical tests, avoiding possible errors and reducing the amount of time needed for a whole experiment. Moreover, annotations and the direct link to external databases may increase the biological knowledge extracted. The system is freely available for academic purposes at: https://sourceforge.net/projects/dmetanalyzer/files/
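
    The kind of per-probe testing such a tool automates can be sketched as follows, assuming one 2x2 case-control allele-count table per SNP and a Benjamini-Hochberg correction; the counts are invented and this is not the DMET Analyzer code.

      from scipy.stats import fisher_exact
      from statsmodels.stats.multitest import multipletests

      # Hypothetical 2x2 tables per probe: [[cases_variant, cases_reference],
      #                                     [controls_variant, controls_reference]]
      tables = {
          "probe_001": [[18, 22], [7, 33]],
          "probe_002": [[12, 28], [11, 29]],
          "probe_003": [[25, 15], [10, 30]],
      }

      names, pvals = [], []
      for probe, table in tables.items():
          _, p = fisher_exact(table)
          names.append(probe)
          pvals.append(p)

      # Correct for the multiple tests run across probes (Benjamini-Hochberg FDR).
      reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      for name, raw, adj, sig in zip(names, pvals, p_adj, reject):
          print(f"{name}: p = {raw:.4f}, adjusted p = {adj:.4f}, significant = {sig}")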

  13. How should we analyze and present mortality in our patients?: a multicentre GCDP experience

    Directory of Open Access Journals (Sweden)

    Darío Janeiro

    2016-03-01

    Conclusions: Although each method may be correct in itself and express a different approach, the final impression left on the reader is a number that under- or overestimates mortality. The CR model better expresses the reality of PD, where the number of patients lost to follow-up (transplant, transfer to HD) is four times the number of deceased patients and only a quarter remain on PD at the end of follow-up.

  14. Analyzing the Language of Citation across Discipline and Experience Levels: An Automated Dictionary Approach

    Directory of Open Access Journals (Sweden)

    David Kaufer

    2016-02-01

    Full Text Available Citation practices have been and continue to be a concentrated area of research activity among writing researchers, spanning many disciplines. This research presents a re-analysis of a common data set contributed by Karatsolis (this issue), which focused on the citation practices of 8 PhD advisors and 8 PhD advisees across four disciplines. Our purpose in this paper is to show what automated dictionary methods can uncover on the same data, using a text analysis and visualization environment we have been developing over many years. The results of our analysis suggest that, although automatic dictionary methods cannot reproduce the fine granularity of interpretative coding schemes designed for human coders, they can find significant non-adjacent patterns distributed across a text or corpus that will likely elude the analyst relying solely on serial reading. We report on the discovery of several of these patterns that we believe complement Karatsolis’ original analysis and extend the citation literature at large. We conclude the paper by reviewing some of the advantages and limits of dictionary approaches to textual analysis, as well as debunking some common misconceptions about them.
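
    A toy sketch of the general dictionary-tagging approach follows; the categories and patterns are invented and are not the authors' dictionary.

      import re
      from collections import Counter

      # Hypothetical dictionary: rhetorical category -> list of surface patterns.
      DICTIONARY = {
          "hedging": [r"\bmay\b", r"\bsuggests?\b", r"\blikely\b"],
          "attribution": [r"\baccording to\b", r"\bargues?\b"],
          "contrast": [r"\bhowever\b", r"\bin contrast\b", r"\balthough\b"],
      }

      def tag_text(text):
          # Count dictionary-category matches in a text (case-insensitive).
          counts = Counter()
          for category, patterns in DICTIONARY.items():
              for pattern in patterns:
                  counts[category] += len(re.findall(pattern, text, flags=re.IGNORECASE))
          return counts

      if __name__ == "__main__":
          sample = ("According to Smith (2010), the effect is likely robust; "
                    "however, later replications suggest a smaller effect size.")
          print(tag_text(sample))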

  15. Analyzing e-bike commuters’ motives, travel behaviour and experiences using GPS-tracking and interviews

    NARCIS (Netherlands)

    Plazier, Paul

    2017-01-01

    Introduction: A major development in transportation in the past years has been the growth of electrically assisted cycling, or e-biking. E-bikes (pedal-assisted or bicycle-style electric bicycles) make it possible to cover longer distances at higher speeds with reduced physical effort. Coupled with

  16. SIMTBED: A Graphical Test Bed for Analyzing and Reporting the Results of a Statistical Simulation Experiment.

    Science.gov (United States)

    1983-09-01


  17. schoRsch: An R package for analyzing and reporting factorial experiments

    Directory of Open Access Journals (Sweden)

    Pfister, Roland

    2016-09-01

    Full Text Available The schoRsch package aims at improving the usability of the open-source software R for researchers in psychology and related disciplines. It provides easy-to-use functions to format the results of typical tests used in psychology and related fields according to the current style guidelines of the American Psychological Association (APA). These and several other convenience functions allow for efficient data analysis and aim at expediting the workflow when reporting test results in scientific publications.

  18. Experience in Analyzing Safety of SNF Management Backend in the Russian Federation

    International Nuclear Information System (INIS)

    Bolshov, L.A.; Linge, I.I.; Kondratenko, P.S.; Kapyrin, I.V.; Kovalchuk, V.D.

    2015-01-01

    The safety of SNF management back end regardless the strategy (processing, direct disposal) requires reliable data on the behavior of high-level long-lived radionuclides in geological media. Safety justification for radionuclide disposal facilities implies the use of robust models describing a) radionuclide propagation in complex media with special properties defined by the geological environment as well as b) all the processes which are significant in terms of possible mechanisms that determine the characteristics of radionuclide migration. This paper provides a brief summary of the known research in Russia on groundwater radionuclide migration models and an overview of methodological tool used by IBRAE RAS to address the forecast of radionuclide migration in the geological environment, based on different types of conducting medium representation and methods for water-rock interaction. Structural peculiarities, which may lead to non-classical radionuclide transport modes in geological media, are identified. A number of physical models reflecting these peculiarities and demonstrating anomalous transport behavior are discussed. Special attention is paid to asymptotical behavior of the contaminant concentration at large distances from the source. (author)

  19. Technology and Indigenous Language Revitalization: Analyzing the Experience of Hawai'i.

    Science.gov (United States)

    Warschauer, Mark

    1998-01-01

    Reports two years of ethnographic research on efforts to use online technologies in Hawaiian language revitalization programs. Issues discussed include the Internet's role in promoting or hindering language diversity, relationship of multimedia computing to non-Western patterns of civilization, Internet use for exploring cultural and social…

  20. Genotyping strategy matters when analyzing hypervariable major histocompatibility complex-Experience from a passerine bird.

    Science.gov (United States)

    Rekdal, Silje L; Anmarkrud, Jarl Andreas; Johnsen, Arild; Lifjeld, Jan T

    2018-02-01

    Genotyping of classical major histocompatibility complex (MHC) genes is challenging when they are hypervariable and occur in multiple copies. In this study, we used several different approaches to genotype the moderately variable MHC class I exon 3 (MHCIe3) and the highly polymorphic MHC class II exon 2 (MHCIIβe2) in the bluethroat ( Luscinia svecica ). Two family groups (eight individuals) were sequenced in replicates at both markers using Ion Torrent technology with both a single- and a dual-indexed primer structure. Additionally, MHCIIβe2 was sequenced on Illumina MiSeq. Allele calling was conducted by modifications of the pipeline developed by Sommer et al. (BMC Genomics, 14, 2013, 542) and the software AmpliSAS. While the different genotyping strategies gave largely consistent results for MHCIe3, with a maximum of eight alleles per individual, MHCIIβe2 was remarkably complex with a maximum of 56 MHCIIβe2 alleles called for one individual. Each genotyping strategy detected on average 50%-82% of all MHCIIβe2 alleles per individual, but dropouts were largely allele-specific and consistent within families for each strategy. The discrepancies among approaches indicate PCR biases caused by the platform-specific primer tails. Further, AmpliSAS called fewer alleles than the modified Sommer pipeline. Our results demonstrate that allelic dropout is a significant problem when genotyping the hypervariable MHCIIβe2. As these genotyping errors are largely nonrandom and method-specific, we caution against comparing genotypes across different genotyping strategies. Nevertheless, we conclude that high-throughput approaches provide a major advance in the challenging task of genotyping hypervariable MHC loci, even though they may not reveal the complete allelic repertoire.

  1. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using visual basic. NET and the ADO. NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.  

  2. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  3. Practical Limitations of Aerosol Separation by a Tandem Differential Mobility Analyzer-Aerosol Particle Mass Analyzer.

    Science.gov (United States)

    Radney, James G; Zangmeister, Christopher D

    2016-01-01

    A cavity ring-down spectrometer and condensation particle counter were used to investigate the limitations in the separation of singly and multiply charged aerosol particles by a tandem differential mobility analyzer (DMA) and aerosol particle mass analyzer (APM). The impact of particle polydispersity and morphology was investigated using three materials: nearly-monodisperse polystyrene latex nanospheres (PSL); polydisperse, nearly-spherical ammonium sulfate (AS) and polydisperse lacey fractal soot agglomerates. PSL and AS particles were easily resolved as a function of charge. For fresh soot, the presence of multiply charged particles severely affects the isolation of the singly charged particles. In cases where the DMA-APM was unable to fully resolve the singly charged particles of interest, the peak mass deviated by up to 13 % leading to errors in the mass specific extinction cross section of over 100 %. For measurements of non-spherical particles, non-symmetrical distributions of concentration as a function of mass were a sign of the presence of multiply charged particles. Under these conditions, the effects of multiply charged particles can be reduced by using a second charge neutralizer after the DMA and prior to the APM. Dilution of the aerosol stream serves to decrease the total number concentration of particles and does not remove the contributions of multiply charged particles.

  4. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  5. Incorporating field wind data into FIRETEC simulations of the International Crown Fire Modeling Experiment (ICFME): preliminary lessons learned

    Science.gov (United States)

    Rodman Linn; Kerry Anderson; Judith Winterkamp; Alyssa Broos; Michael Wotton; Jean-Luc Dupuy; Francois Pimont; Carleton Edminster

    2012-01-01

    Field experiments are one way to develop or validate wildland fire-behavior models. It is important to consider the implications of assumptions relating to the locality of measurements with respect to the fire, the temporal frequency of the measured data, and the changes to local winds that might be caused by the experimental configuration. Twenty FIRETEC simulations...

  6. Understanding patient experience of out-of-hours general practitioner services in South Wales: a qualitative study.

    NARCIS (Netherlands)

    Egbunike, J.N.; Shaw, C.; Bale, S.; Elwyn, G.; Edwards, A.

    2008-01-01

    BACKGROUND: In the light of recent changes in the structure and provision of out-of-hours service in the UK, there is a need to re-assess the quality of care. One way to assess the quality of care is through patient experience. OBJECTIVES: This study aimed to explore patient expectations and

  7. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kind of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase, that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error prone way using different software tools. Thus novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  8. Analyzing the User Behavior toward Electronic Commerce Stimuli

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies, which constitute the web atmosphere or webmosphere of a website, on online shopping behavior (i.e., users' internal states, namely affective, cognitive, and satisfaction responses, and behavioral responses, namely approach responses and real shopping outcomes) within a computer-generated retail online store, taking into account several mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test this empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to navigate the online store freely and their experience is enriched by animated gifs and background music. The effect of the mediator variables partly modifies the final shopping behavior. PMID:27965549
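
    As a rough illustration of how such a 2 × 2 × 2 between-subjects design is typically analyzed, the sketch below fits a full-factorial ANOVA in Python with statsmodels. It is not taken from the study; the file name and the column names ("navigation", "music", "images", "approach") are assumptions standing in for the real variables.

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Hypothetical export of the experiment: one row per participant with the three
    # manipulated factors and one response measure (names are assumptions).
    df = pd.read_csv("webmosphere_experiment.csv")

    # Full-factorial model: main effects plus all two- and three-way interactions.
    model = ols("approach ~ C(navigation) * C(music) * C(images)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))   # Type II sums of squares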

  9. Using Fourier transform IR spectroscopy to analyze biological materials.

    Science.gov (United States)

    Baker, Matthew J; Trevisan, Júlio; Bassan, Paul; Bhargava, Rohit; Butler, Holly J; Dorling, Konrad M; Fielden, Peter R; Fogarty, Simon W; Fullwood, Nigel J; Heys, Kelly A; Hughes, Caryn; Lasch, Peter; Martin-Hirsch, Pierre L; Obinaju, Blessing; Sockalingum, Ganesh D; Sulé-Suso, Josep; Strong, Rebecca J; Walsh, Michael J; Wood, Bayden R; Gardner, Peter; Martin, Francis L

    2014-08-01

    IR spectroscopy is an excellent method for biological analyses. It enables the nonperturbative, label-free extraction of biochemical information and images toward diagnosis and the assessment of cell functionality. Although not strictly microscopy in the conventional sense, it allows the construction of images of tissue or cell architecture by the passing of spectral data through a variety of computational algorithms. Because such images are constructed from fingerprint spectra, the notion is that they can be an objective reflection of the underlying health status of the analyzed sample. One of the major difficulties in the field has been determining a consensus on spectral pre-processing and data analysis. This manuscript brings together as coauthors some of the leaders in this field to allow the standardization of methods and procedures for adapting a multistage approach to a methodology that can be applied to a variety of cell biological questions or used within a clinical setting for disease screening or diagnosis. We describe a protocol for collecting IR spectra and images from biological samples (e.g., fixed cytology and tissue sections, live cells or biofluids) that assesses the instrumental options available, appropriate sample preparation, different sampling modes as well as important advances in spectral data acquisition. After acquisition, data processing consists of a sequence of steps including quality control, spectral pre-processing, feature extraction and classification of the supervised or unsupervised type. A typical experiment can be completed and analyzed within hours. Example results are presented on the use of IR spectra combined with multivariate data processing.
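
    For readers who want a concrete feel for the multistage approach (pre-processing, feature extraction, classification), the following is a minimal scikit-learn sketch, not the protocol's own code; the spectra file, its "label" column and the choice of PCA plus linear discriminant analysis are illustrative assumptions.

    import pandas as pd
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Hypothetical table: one IR spectrum per row (absorbance at each wavenumber)
    # plus a class label for each sample.
    data = pd.read_csv("ir_spectra.csv")
    X = data.drop(columns="label").to_numpy()
    y = data["label"].to_numpy()

    pipe = Pipeline([
        ("scale", StandardScaler()),            # simple stand-in for spectral normalization
        ("pca", PCA(n_components=20)),          # feature extraction
        ("clf", LinearDiscriminantAnalysis()),  # supervised classification
    ])
    print("mean CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())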

  10. Growing tumor vessels: more than one way to skin a cat - implications for angiogenesis targeted cancer therapies.

    Science.gov (United States)

    Leite de Oliveira, Rodrigo; Hamm, Alexander; Mazzone, Massimiliano

    2011-04-01

    The establishment of a functional, integrated vascular system is instrumental for tissue growth and homeostasis. Without blood vessels no adequate nutrition and oxygen would be provided to cells, nor could the undesired waste products be efficiently removed. Blood vessels constitute therefore one of the largest and most complex body network whose assembly depends on the precise balance of growth factors acting in a complementary and coordinated manner with cells of several identities. However, the vessels that are crucial for life can also foster death, given their involvement in cancer progression towards malignancy and metastasis. Targeting tumor vasculature has thus arisen as an appealing anti-cancer therapeutic approach. Since the milestone achievements that vascular endothelial growth factor (VEGF) blockade suppressed angiogenesis and tumor growth in mice and prolonged the survival of cancer patients when administered in combination with chemotherapy, the clinical development of anti-VEGF(R) drugs has accelerated remarkably. FDA has approved the use of bevacizumab - a humanized monoclonal antibody against VEGF - in colorectal, lung and metastatic breast cancers in combination with standard chemotherapy. Additional broad-spectrum VEGF receptor tyrosine kinase inhibitors, such as sunitinib and sorafenib, are used in monotherapy for metastatic renal carcinoma, while sunitinib is also approved for imatinib resistant gastrointestinal stromal tumors and sorafenib for advanced stage hepatocellular carcinoma. Nevertheless, the survival benefit offered by VEGF(R) blockers, either as single agents or in combination with chemotherapy, is calculated merely in the order of months. Posterior studies in preclinical models have reported that despite reducing primary tumor growth, the inhibition of VEGF increased tumor invasiveness and metastasis. The clinical implications of these findings urge the need to reconcile these conflicting results. Anti-angiogenic therapy represents a significant step forth in cancer therapy and in our understanding of cancer biology, but it is also clear that we need to learn how to use it. What is the biological consequence of VEGF-blockade? Does VEGF inhibition starve the tumor to death - as initially postulated - or does it rather foster malignancy? Can anti-VEGF(R) therapy favor tumor vessel formation by VEGF-independent means? Tumors are very diverse and plastic entities, able to adapt to the harshest conditions; this is also reflected by the tumor vasculature. Lessons from the bench to the bedside and vice versa have taught us that the diversity of signals underlying tumor vessel growth will likely be responsive (or resistant) to distinct therapeutic approaches. In this review, we propose a reflection of the different strategies tumors use to grow blood vessels and how these can have impact on the (un)success of current anti-angiogenic therapies. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Go West: A One Way Stepping-Stone Dispersion Model for the Cavefish Lucifuga dentata in Western Cuba.

    Science.gov (United States)

    Hernández, Damir; Casane, Didier; Chevalier-Monteagudo, Pedro; Bernatchez, Louis; García-Machado, Erik

    2016-01-01

    Consistent with the limited dispersal capacity of most troglobitic animals, almost all Lucifuga cavefish species have very narrow geographic distributions in Cuba. However, one species, L. dentata, has a wide but disjointed distribution over 300 km in the west of the island. In order to estimate the relative role of vicariance and dispersal in the unexpected L. dentata distribution, we obtained partial sequences of the mitochondrial DNA (mtDNA) cytochrome b (cytb) gene and control region (CR), and then applied Approximate Bayesian Computation (ABC), based on the identification of five genetic and geographic congruent groups of populations. The process that best explains the distribution of genetic diversity in this species is sequential range expansion from east Matanzas to the western Pinar del Río provinces, followed by isolation of groups of populations. We found relatively high haplotype diversity and low nucleotide diversity in all but the Havana group, which has high values for both diversity parameters, suggesting that this group has been demographically stable over time. For two groups of populations (Cayuco and Bolondrón), the mismatch distribution analyses suggest past demographic expansion. In the case of the Cayuco region, the star-like relationships of haplotypes in the network suggest a recent founding event, congruent with other evidence indicating that this is the most recently colonized region. Overall, the results suggest that a combination of habitat availability, temporal interconnections, and possibly the biological properties of this species, may have enabled its dispersal and range expansion compared to other species of the genus, which are more geographically restricted.

  12. One-Way Flow of a Rarefied Gas Induced in a Circular Pipe with a Periodic Temperature Distribution

    National Research Council Canada - National Science Library

    Aoki, K

    2000-01-01

    The steady behavior of a rarefied gas in a circular pipe with a saw-like temperature distribution increasing and decreasing periodically in the direction of the pipe axis is investigated numerically...

  13. Keeping health facilities safe: one way of strengthening the interaction between disease-specific programmes and health systems.

    Science.gov (United States)

    Harries, Anthony D; Zachariah, Rony; Tayler-Smith, Katie; Schouten, Erik J; Chimbwandira, Frank; Van Damme, Wim; El-Sadr, Wafaa M

    2010-12-01

    The debate on the interaction between disease-specific programmes and health system strengthening in the last few years has intensified as experts seek to tease out common ground and find solutions and synergies to bridge the divide. Unfortunately, the debate continues to be largely academic and devoid of specificity, resulting in the issues being irrelevant to health care workers on the ground. Taking the theme 'What would entice HIV- and tuberculosis (TB)-programme managers to sit around the table on a Monday morning with health system experts', this viewpoint focuses on infection control and health facility safety as an important and highly relevant practical topic for both disease-specific programmes and health system strengthening. Our attention, and the examples and lessons we draw on, are largely aimed at sub-Saharan Africa where the great burden of TB and HIV/AIDS resides, although the principles we outline would apply to other parts of the world as well. Health care infections, caused for example by poor hand hygiene, inadequate testing of donated blood, unsafe disposal of needles and syringes, poorly sterilized medical and surgical equipment and lack of adequate airborne infection control procedures, are responsible for a considerable burden of illness amongst patients and health care personnel, especially in resource-poor countries. Effective infection control in a district hospital requires that all the components of a health system function well: governance and stewardship, financing, infrastructure, procurement and supply chain management, human resources, health information systems, service delivery and finally supervision. We argue in this article that proper attention to infection control and an emphasis on safe health facilities is a concrete first step towards strengthening the interaction between disease-specific programmes and health systems where it really matters: for patients who are sick and for the health care workforce who provide the care and treatment.

  14. Experimental Testing of Monopiles in Sand Subjected to One-Way Long-Term Cyclic Lateral Loading

    DEFF Research Database (Denmark)

    Roesen, Hanne Ravn; Ibsen, Lars Bo; Andersen, Lars Vabbersgaard

    2013-01-01

    cycles. The tests are conducted in dense saturated sand. The maximum moment applied in the cyclic tests is varied from 18% to 36% of the ultimate lateral resistance found in a static loading test. The tests reveal that the accumulated rotation can be expressed by use of a power function. Further, static...... tests conducted post cyclic loading indicate that the static ultimate capacity increases with the magnitude of cyclic loading....

  15. A new supply chain management method with one-way time window: A hybrid PSO-SA approach

    Directory of Open Access Journals (Sweden)

    Reza Tavakkoli-Moghaddam

    2012-01-01

    Full Text Available In this paper, we study a supply chain problem where a wholesaler/producer distributes goods among different retailers. The proposed model is formulated as a more general and realistic form of the traditional vehicle routing problem (VRP). The main advantages of the new model are twofold. First, the time window does not consider any lower bound, and second, it treats setup time as a separate cost component. The resulting problem is solved using a hybrid of particle swarm optimization and simulated annealing (PSO-SA). The results are compared with another hybrid method, a combination of ant colony optimization and tabu search. We use some well-known benchmark problems to compare the results of our proposed model with this other method. The preliminary results indicate that the proposed model performs reasonably well.
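
    To make the PSO-SA idea concrete, here is a hedged, generic sketch in Python: particles follow the usual PSO velocity update, and a simulated-annealing acceptance test occasionally lets a worse position replace a particle's personal best. The quadratic objective is a toy stand-in for the paper's routing and setup costs, not its actual formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    def cost(x):                      # toy quadratic objective; stands in for the routing cost
        return float(np.sum(x ** 2))

    n_particles, dim, iters = 20, 5, 200
    w, c1, c2, temp, cooling = 0.7, 1.5, 1.5, 1.0, 0.95

    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    gbest_cost = pbest_cost.min()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # PSO velocity update
        x = x + v
        for i in range(n_particles):
            c = cost(x[i])
            delta = c - pbest_cost[i]
            # SA acceptance: always keep improvements, occasionally accept worse moves
            if delta < 0 or rng.random() < np.exp(-delta / temp):
                pbest[i], pbest_cost[i] = x[i].copy(), c
                if c < gbest_cost:                                   # track best solution found
                    gbest, gbest_cost = x[i].copy(), c
        temp *= cooling                                              # geometric cooling schedule

    print("best cost found:", gbest_cost)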

  16. Evaluation of Ensiled Brewer's Grain in the Diet of Piglets by One Way Multiple Analysis of Variance, MANOVA

    Directory of Open Access Journals (Sweden)

    Amang A Mbang, J.

    2007-01-01

    Full Text Available The basic purpose of feeding trials is to find the optimum level of feed ingredients which gives the highest economic returns to the farmers. This can be achieved through estimation and comparison of the means of different rations. The example here is a study of the incorporation of different levels of ensiled brewer's grains in the diet of 24 hybrid weaned piglets from Landrace x Duroc x Berkshire x Large White. They were randomly divided into four groups with three replicates of two piglets per pen. They were fed 0, 10, 20 and 30% incorporation of ensiled brewer's grains on a dry matter basis during the post-weaning period, followed by 0, 30, 40 and 50% during the growing period and 0, 50, 60 and 70% during the finishing period. There is one explanatory variable, initial weight, and four post-treatment outcome variables recorded per piglet: final weight, dry matter consumption, weight gain and index of consumption. A design for comparing several multivariate treatment means is adopted. We obtain the MANOVA (Multivariate Analysis of Variance) table for each phase, test whether treatment differences exist by using Wilks' lambda distribution, and assess the treatment effect by using a confidence interval method of MANOVA. This model has the advantage of computing the responses of all variables in the matrix of sums of squares and, more precisely, of separating the different mean percentages of ensiled brewer's grain.
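
    A minimal sketch of a one-way MANOVA with Wilks' lambda, assuming the trial data were exported to a table with a treatment column and the four outcome variables (all file and column names are hypothetical); it uses statsmodels rather than whatever software the authors employed.

    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    # Hypothetical data set: one row per piglet, a "treatment" factor and four outcomes.
    df = pd.read_csv("piglet_trial.csv")

    maov = MANOVA.from_formula(
        "final_weight + dm_intake + weight_gain + feed_index ~ treatment", data=df
    )
    print(maov.mv_test())   # reports Wilks' lambda among other multivariate test statistics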

  17. Repetitive Behaviours and Restricted Interests in Individuals with Down Syndrome—One Way of Managing Their World?

    Directory of Open Access Journals (Sweden)

    Sheila Glenn

    2017-06-01

    Full Text Available This paper argues that the repetitive behaviour and restrictive interests (RBRI) displayed by individuals with Down syndrome have mostly positive functions. However, as research has developed from interests in Obsessional Compulsive Disorder or Autistic Spectrum Disorder, unfortunately a view has arisen that RBRI in individuals with Down syndrome are also likely to be pathological. This is particularly the case in adults. The paper reviews: (a) measures employed and the perspectives that have been used; (b) the development in typically developing individuals, those with Down syndrome, and those with other conditions associated with intellectual disability; (c) positive and possible negative effects of RBRI; and (d) the need for more research. The conclusion is that, for their level of development, RBRI are helpful for most individuals with Down syndrome.

  18. Common mechanisms of DNA translocation motors in bacteria and viruses using one-way revolution mechanism without rotation.

    Science.gov (United States)

    Guo, Peixuan; Zhao, Zhengyi; Haak, Jeannie; Wang, Shaoying; Wu, Dong; Meng, Bing; Weitao, Tao

    2014-01-01

    Biomotors were once described into two categories: linear motor and rotation motor. Recently, a third type of biomotor with revolution mechanism without rotation has been discovered. By analogy, rotation resembles the Earth rotating on its axis in a complete cycle every 24h, while revolution resembles the Earth revolving around the Sun one circle per 365 days (see animations http://nanobio.uky.edu/movie.html). The action of revolution that enables a motor free of coiling and torque has solved many puzzles and debates that have occurred throughout the history of viral DNA packaging motor studies. It also settles the discrepancies concerning the structure, stoichiometry, and functioning of DNA translocation motors. This review uses bacteriophages Phi29, HK97, SPP1, P22, T4, and T7 as well as bacterial DNA translocase FtsK and SpoIIIE or the large eukaryotic dsDNA viruses such as mimivirus and vaccinia virus as examples to elucidate the puzzles. These motors use ATPase, some of which have been confirmed to be a hexamer, to revolve around the dsDNA sequentially. ATP binding induces conformational change and possibly an entropy alteration in ATPase to a high affinity toward dsDNA; but ATP hydrolysis triggers another entropic and conformational change in ATPase to a low affinity for DNA, by which dsDNA is pushed toward an adjacent ATPase subunit. The rotation and revolution mechanisms can be distinguished by the size of channel: the channels of rotation motors are equal to or smaller than 2 nm, that is the size of dsDNA, whereas channels of revolution motors are larger than 3 nm. Rotation motors use parallel threads to operate with a right-handed channel, while revolution motors use a left-handed channel to drive the right-handed DNA in an anti-chiral arrangement. Coordination of several vector factors in the same direction makes viral DNA-packaging motors unusually powerful and effective. Revolution mechanism that avoids DNA coiling in translocating the lengthy genomic dsDNA helix could be advantageous for cell replication such as bacterial binary fission and cell mitosis without the need for topoisomerase or helicase to consume additional energy. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. More than one way to be an herbivore: convergent evolution of herbivory using different digestive strategies in prickleback fishes (Stichaeidae).

    Science.gov (United States)

    German, Donovan P; Sung, Aaron; Jhaveri, Parth; Agnihotri, Ritika

    2015-06-01

    In fishes, the evolution of herbivory has occurred within a spectrum of digestive strategies, with two extremes on opposite ends: (i) a rate-maximization strategy characterized by high intake, rapid throughput of food through the gut, and little reliance on microbial digestion, or (ii) a yield-maximization strategy characterized by measured intake, slower transit of food through the gut, and more of a reliance on microbial digestion in the hindgut. One of these strategies tends to be favored within a given clade of fishes. Here, we tested the hypothesis that rate or yield digestive strategies can arise in convergently evolved herbivores within a given lineage. In the family Stichaeidae, convergent evolution of herbivory occurred in Cebidichthys violaceus and Xiphister mucosus, and despite nearly identical diets, these two species have different digestive physiologies. We found that C. violaceus has more digesta in its distal intestine than other gut regions, has comparatively high concentrations (>11 mM) of short-chain fatty acids (SCFA, the endpoints of microbial fermentation) in its distal intestine, and a spike in β-glucosidase activity in this gut region, findings that, when coupled to long retention times (>20 h) of food in the guts of C. violaceus, suggest a yield-maximizing strategy in this species. X. mucosus showed none of these features and was more similar to its sister taxon, the omnivorous Xiphister atropurpureus, in terms of digestive enzyme activities, gut content partitioning, and concentrations of SCFA in their distal intestines. We also contrasted these herbivores and omnivores with other sympatric stichaeid fishes, Phytichthys chirus (omnivore) and Anoplarchus purpurescens (carnivore), each of which had digestive physiologies consistent with the consumption of animal material. This study shows that rate- and yield-maximizing strategies can evolve in closely related fishes and suggests that resource partitioning can play out on the level of digestive physiology in sympatric, closely related herbivores. Copyright © 2015 Elsevier GmbH. All rights reserved.

  20. One-way traffic: The inferior frontal gyrus controls brain activation in the middle temporal gyrus during divergent thinking

    DEFF Research Database (Denmark)

    Vartanian, Oshin; Beatty, Erin L.; Smith, Ingrid

    2018-01-01

    the hypothesis that the PFC exerts unidirectional control over the middle temporal gyrus (MTG) and the inferior parietal lobule (IPL), vs. the hypothesis that these two sets of regions exert bidirectional control over each other (in the form of feedback loops). The data were consistent with the former model...... by demonstrating that the right inferior frontal gyrus (IFG) exerts unidirectional control over MTG and IPL, although the evidence was somewhat stronger in the case of the MTG than the IPL. Our findings highlight potential causal pathways that could underlie the neural bases of divergent thinking....

  1. TNF-α may mediate inflammasome activation in the absence of bacterial infection in more than one way.

    Directory of Open Access Journals (Sweden)

    Susana Álvarez

    Full Text Available Members of the mammalian nucleotide binding domain, leucine-rich repeat (LRR)-containing receptor family of proteins are key modulators of innate immunity regulating inflammation. To date, microbial pathogen-associated molecules and toxins have been identified as key triggers of activation of inflammasomes. However, environmental and neurodegenerative stimuli have recently been identified that lead to IL-1β release by means of inflammasomes. IL-1β plays a crucial role during brain inflammation, and caspase-1 appears to be a key modulator of IL-1β bioactivity and the consequent transcriptional regulation of gene expression within the brain during inflammation. We show here that exposure of a human neuroblastoma cell line (SK-N-MC cells) to TNF-α promotes ROS-mediated caspase-1 activation and IL-1β secretion. The involvement of NF-κB in the regulation of IL-1β synthesis is investigated through specific inhibition of this transcription factor. The effect of TNF-α was abolished in the presence of ROS inhibitors such as NAC or DPI. Remarkably, SK-N-MC cells do not respond to ATP stimulation in spite of P2X7R expression. These results provide a mechanism by which danger signals and particulate matter mediate inflammation via the inflammasome in the absence of microbial infection.

  2. There's more than one way to climb a tree: Limb length and microhabitat use in lizards with toe pads.

    Directory of Open Access Journals (Sweden)

    Travis J Hagey

    Full Text Available Ecomorphology links microhabitat and morphology. By comparing ecomorphological associations across clades, we can investigate the extent to which evolution can produce similar solutions in response to similar challenges. While Anolis lizards represent a well-studied example of repeated convergent evolution, very few studies have investigated the ecomorphology of geckos. Similar to anoles, gekkonid lizards have independently evolved adhesive toe pads and many species are scansorial. We quantified gecko and anole limb length and microhabitat use, finding that geckos tend to have shorter limbs than anoles. Combining these measurements with microhabitat observations of geckos in Queensland, Australia, we observed geckos using similar microhabitats as reported for anoles, but geckos with relatively longer limbs were using narrower perches, differing from patterns observed in anoles and other lizards. We also observed arboreal geckos with relatively shorter proximal limb segments as compared to rock-dwelling and terrestrial geckos, similar to patterns observed for other lizards. We conclude that although both geckos and anoles have adhesive pads and use similar microhabitats, their locomotor systems likely complement their adhesive pads in unique ways and result in different ecomorphological patterns, reinforcing the idea that species with convergent morphologies still have idiosyncratic characteristics due to their own separate evolutionary histories.

  3. Structural and functional responses of the oligochaete and aeolosomatid assemblage in lowland streams: a one-way-pollution-modelled ecosystem

    Directory of Open Access Journals (Sweden)

    Maria V. López van Oosterom

    2015-03-01

    Full Text Available We investigated the responses of the assemblage of Oligochaeta and Aeolosomatidae to organic pollution, comparing taxonomic richness, diversity, abundance, and diet of the individuals inhabiting two lowland streams with different degrees of anthropic impact (the Rodríguez and the Carnaval, belonging to the Río de la Plata basin, Argentina). The physicochemical parameters in the Rodríguez Stream indicated a strong deterioration of the water quality compared to that of the Carnaval. A canonical-correlation analysis indicated that the Tubificinae, Megadrili, Enchytraeidae, and Rhyacodrilinae were more closely associated with the Rodríguez Stream; whereas the Naidinae, Pristininae, and Opystocystidae were more highly represented in the Carnaval. The diversity and taxonomic richness in the Rodríguez Stream exhibited significant differences from those of the Carnaval (P<0.001), but the abundance was not different between the two sites. Schoener's index revealed a high degree of dietary overlap between the two streams because all the species analysed consumed a high proportion of detritus, especially the organisms in the Rodríguez. In the Carnaval Stream a higher number of alimentary items were consumed, mainly by the Naidinae. This difference, probably reflecting the greater availability of this resource at sites impacted by organic pollution, underscores the fundamental role of oligochaetes in the food webs of aquatic ecosystems. The combined use of structural and functional parameters enables a more comprehensive view of how these lotic systems function and as such provides information that will serve to design tools for the management of such temperate environments.

  4. There's more than one way to climb a tree: Limb length and microhabitat use in lizards with toe pads.

    Science.gov (United States)

    Hagey, Travis J; Harte, Scott; Vickers, Mathew; Harmon, Luke J; Schwarzkopf, Lin

    2017-01-01

    Ecomorphology links microhabitat and morphology. By comparing ecomorphological associations across clades, we can investigate the extent to which evolution can produce similar solutions in response to similar challenges. While Anolis lizards represent a well-studied example of repeated convergent evolution, very few studies have investigated the ecomorphology of geckos. Similar to anoles, gekkonid lizards have independently evolved adhesive toe pads and many species are scansorial. We quantified gecko and anole limb length and microhabitat use, finding that geckos tend to have shorter limbs than anoles. Combining these measurements with microhabitat observations of geckos in Queensland, Australia, we observed geckos using similar microhabitats as reported for anoles, but geckos with relatively longer limbs were using narrower perches, differing from patterns observed in anoles and other lizards. We also observed arboreal geckos with relatively shorter proximal limb segments as compared to rock-dwelling and terrestrial geckos, similar to patterns observed for other lizards. We conclude that although both geckos and anoles have adhesive pads and use similar microhabitats, their locomotor systems likely complement their adhesive pads in unique ways and result in different ecomorphological patterns, reinforcing the idea that species with convergent morphologies still have idiosyncratic characteristics due to their own separate evolutionary histories.

  5. Bifurcated overtones of one-way localized Fabry-Pérot resonances in parity-time symmetric optical lattices

    Science.gov (United States)

    Nafaa Gaafer, Fatma; Shen, Yaxi; Peng, Yugui; Wu, Aimin; Zhang, Peng; Zhu, Xuefeng

    2017-06-01

    Not Available Project supported by the National Natural Science Foundation of China (Grant Nos. 11674119, 11404125, and 11574389). X. F. Z acknowledges the financial support from the Bird Nest Plan of HUST, China. P. Z. is supported by One Hundred-Talent Plan of Chinese Academy of Sciences.

  6. One Way or Another: Evidence for Perceptual Asymmetry in Pre-attentive Learning of Non-native Contrasts

    Directory of Open Access Journals (Sweden)

    Liquan Liu

    2018-03-01

    Full Text Available Research investigating listeners' neural sensitivity to speech sounds has largely focused on segmental features. We examined Australian English listeners' perception and learning of a supra-segmental feature, pitch direction in a non-native tonal contrast, using a passive oddball paradigm and electroencephalography. The stimuli were two contours generated from naturally produced high-level and high-falling tones in Mandarin Chinese, differing only in pitch direction (Liu and Kager, 2014). While both contours had similar pitch onsets, the pitch offset of the falling contour was lower than that of the level one. The contrast was presented in two orientations (standard and deviant reversed) and tested in two blocks with the order of block presentation counterbalanced. Mismatch negativity (MMN) responses showed that listeners discriminated the non-native tonal contrast only in the second block, reflecting indications of learning through exposure during the first block. In addition, listeners showed a later MMN peak for their second block of test relative to listeners who did the same block first, suggesting linguistic (as opposed to acoustic) processing or a misapplication of perceptual strategies from the first to the second block. The results also showed a perceptual asymmetry for change in pitch direction: listeners who encountered a falling tone deviant in the first block had larger frontal MMN amplitudes than listeners who encountered a level tone deviant in the first block. The implications of our findings for second language speech and the developmental trajectory for tone perception are discussed.

  7. “I Will Take Your Answer One Way or Another”: The Oscar Wilde Trial Transcripts as Literary Artefacts

    OpenAIRE

    Merkler, Benjamin

    2014-01-01

    Oscar Wilde was not only an eccentric artist and perhaps one of the most popular British dramatists since Shakespeare, but also one of the first – if not the first – modern stars, who understood how to present himself in public and use publicity as a means of self-realisation. After discussing the theoretical problems facing this paper, such as the reliability of transcripts, as well as the theoretical background of the following analysis, there will be an introductory part that first of all provides ...

  8. Two Versions of the Projection Postulate: From EPR Argument to One-Way Quantum Computing and Teleportation

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2010-01-01

    of the projection postulate (due to von Neumann and Lüders) should be taken into account seriously in the analysis of the basic constructions of quantum information theory. This paper is a review devoted to such an analysis.

  9. Development of a test facility for analyzing supercritical fluid blowdown

    Energy Technology Data Exchange (ETDEWEB)

    Roberto, Thiago D.; Alvim, Antonio C.M., E-mail: thiagodbtr@gmail.com [Coordenacao dos Programas de Pos-Graduacao em Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Silva, Mario A.B. da, E-mail: mabs500@gmail.com [Universidade Federal de Pernambuco (CTG/UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear; Lapa, Celso M.F., E-mail: lapa@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The generation IV nuclear reactors under development mostly use supercritical fluids as the working fluid because higher temperatures improve the thermal efficiency. Supercritical fluids are used by modern nuclear power plants to achieve thermal efficiencies of around 45%. With water as the supercritical working fluid, these plants operate at a high temperature and pressure. However, experiments on supercritical water are limited by technical and financial difficulties. These difficulties can be overcome by using model fluids, which have more feasible supercritical conditions and exhibit a lower critical pressure and temperature. Experimental research is normally used to determine the conditions under which model fluids represent supercritical fluids under steady-state conditions. A fluid-to-fluid scaling approach has been proposed to determine model fluids that can represent supercritical fluids in a transient state. This paper presents an application of fractional scale analysis to determine the simulation parameters for a depressurization test facility. Carbon dioxide (CO{sub 2}) and R134a gas were considered as the model fluids because their critical point conditions are more feasible than those of water. The similarities of water (prototype), CO{sub 2} (model) and R134a (model) for depressurization in a pressure vessel were analyzed. (author)

  10. Initial Neutral Particle Analyzer Measurements of Ion Temperature in NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; R.E. Bell; M.P. Petrov; A.L. Roquemore; and E.V. Suvorkin

    2002-07-08

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer which measures the energy spectra of minority H and bulk D species simultaneously with 39 energy channels per mass species and a time resolution of 1 msec. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector. The NPA measures thermal Maxwellian ion spectra to obtain line integrated ion temperatures, T{sub i}. For line integral electron densities below neL {approx} 3.5 x 10{sup 19} m{sup -2}, good agreement is observed between the line integrated NPA T{sub i} and the central T{sub i}(0) measured by the spatially localized CHarge Exchange Recombination Spectroscopy (CHERS) diagnostic. However, with increasingly higher n{sub eL} the NPA T{sub i} falls below the central T{sub i}(0) measured by CHERS because the charge exchange neutral emissivity weights the line integrated NPA measurement outboard of the plasma core. An analytic neutral analysis code, DOUBLE, has been applied to the NPA data to correct for this effect and restore agreement with T{sub i}(0) measured by CHERS. A description of the NPA diagnostic on NSTX and initial ion temperature measurements along with an illustration of application of the DOUBLE code are presented.

  11. CSNI specialist meeting on simulators and plant analyzers

    International Nuclear Information System (INIS)

    Miettinen, J.; Holmstroem, H.

    1994-01-01

    The Specialist Meeting on Simulators and Plant Analyzers, held on June 9-12, 1992, in Lappeenranta, Finland, was sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organized in collaboration with the Technical Research Centre of Finland (VTT) and the Lappeenranta University of Technology (LTKK). All the presented papers were invited and divided into four sessions. In the first session the objectives, requirements and concepts of simulators were discussed against present standards and guidelines. The second session focused on the capabilities of current analytical models. The third session focused on the experiences gained so far from the applications. The final, fourth session concentrated on simulators currently under development and on future plans with regard to both development and utilization. At the end of the meeting, its topics were discussed in a panel discussion. Summaries of the sessions and a shortened version of the panel discussion are included in the proceedings. (orig.)

  12. Development of a test facility for analyzing supercritical fluid blowdown

    International Nuclear Information System (INIS)

    Roberto, Thiago D.; Alvim, Antonio C.M.

    2015-01-01

    The generation IV nuclear reactors under development mostly use supercritical fluids as the working fluid because higher temperatures improve the thermal efficiency. Supercritical fluids are used by modern nuclear power plants to achieve thermal efficiencies of around 45%. With water as the supercritical working fluid, these plants operate at a high temperature and pressure. However, experiments on supercritical water are limited by technical and financial difficulties. These difficulties can be overcome by using model fluids, which have more feasible supercritical conditions and exhibit a lower critical pressure and temperature. Experimental research is normally used to determine the conditions under which model fluids represent supercritical fluids under steady-state conditions. A fluid-to-fluid scaling approach has been proposed to determine model fluids that can represent supercritical fluids in a transient state. This paper presents an application of fractional scale analysis to determine the simulation parameters for a depressurization test facility. Carbon dioxide (CO 2 ) and R134a gas were considered as the model fluids because their critical point conditions are more feasible than those of water. The similarities of water (prototype), CO 2 (model) and R134a (model) for depressurization in a pressure vessel were analyzed. (author)

  13. Millennial Filipino Student Engagement Analyzer Using Facial Feature Classification

    Science.gov (United States)

    Manseras, R.; Eugenio, F.; Palaoag, T.

    2018-03-01

    Millennials are on everyone's lips and are a target market for many companies nowadays. In the Philippines, they comprise one third of the total population, and most of them are still in school. A good education system is important to prepare this generation for better careers, and a good education system means having quality instruction as one of its input component indicators. In a classroom environment, teachers use facial features to gauge the affect state of the class. Affective computing is one of today's emerging trends for improving the delivery of quality instruction. Combined with computer vision, it can be used to analyze the affect states of students. This paper proposes a system for classifying student engagement using facial features. Identifying affect state, specifically Millennial Filipino student engagement, is one of the main priorities of every educator, and this led the authors to develop a tool that assesses the engagement percentage. A multiple face detection framework using the Face API was employed to detect as many student faces as possible and gauge the current engagement percentage of the whole class. A binary classifier based on a Support Vector Machine (SVM) was specified in the conceptual framework of this study. To assess the accuracy of this model, the SVM was compared with two of the most widely used binary classifiers. Results show that the SVM bested the random forest and naive Bayes algorithms in most of the experiments across the different test datasets.
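
    The comparison described above can be reproduced in outline with scikit-learn; the sketch below is illustrative only, and the feature file and its "engaged" label column are assumptions rather than the authors' actual data pipeline.

    import pandas as pd
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    # Hypothetical export of per-face feature vectors with an engagement label.
    data = pd.read_csv("face_features.csv")
    X = data.drop(columns="engaged")
    y = data["engaged"]                               # 1 = engaged, 0 = not engaged

    for name, clf in [("SVM", SVC(kernel="rbf")),
                      ("Random Forest", RandomForestClassifier(n_estimators=200)),
                      ("Naive Bayes", GaussianNB())]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: mean CV accuracy = {acc:.3f}")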

  14. A discrete choice experiment studying students' preferences for scholarships to private medical schools in Japan.

    Science.gov (United States)

    Goto, Rei; Kakihara, Hiroaki

    2016-02-09

    The shortage of physicians in rural areas and in some specialties is a societal problem in Japan. Expensive tuition in private medical schools limits access to them particularly for students from middle- and low-income families. One way to reduce this barrier and lessen maldistribution is to offer conditional scholarships to private medical schools. A discrete choice experiment is carried out on a total of 374 students considering application to medical schools. The willingness to receive a conditional scholarship program to private medical schools is analyzed. The probability of attending private medical schools significantly decreased because of high tuition, a postgraduate obligation to provide a service in specific specialty areas, and the length of time of this obligation. An obligation to provide a service in rural regions had no significant effect on this probability. To motivate non-applicants to private medical schools to enroll in such schools, a decrease in tuition to around 1.2 million yen (US$ 12,000) or less, which is twice that of public schools, was found to be necessary. Further, it was found that non-applicants to private medical schools choose to apply to such schools even with restrictions if they have tuition support at the public school level. Conditional scholarships for private medical schools may widen access to medical education and simultaneously provide incentives to work in insufficiently served areas.
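
    As a simplified stand-in for the discrete choice analysis (a full study would typically fit a conditional logit over choice sets), the sketch below estimates a binary logit on whether a hypothetical scholarship scenario is chosen, with tuition and obligation attributes as regressors; all file and column names are assumptions.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical long-format choice data: one row per scenario shown to a respondent.
    df = pd.read_csv("dce_choices.csv")
    X = sm.add_constant(df[["tuition_million_yen", "specialty_obligation",
                            "rural_obligation", "obligation_years"]])
    model = sm.Logit(df["chosen"], X).fit()
    print(model.summary())
    # A rough willingness-to-pay for removing an attribute is -(coef_attribute / coef_tuition).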

  15. The Instrument Implementation of Two-tier Multiple Choice to Analyze Students’ Science Process Skill Profile

    Directory of Open Access Journals (Sweden)

    Sukarmin Sukarmin

    2018-01-01

    Full Text Available This research aims to analyze the profile of students' science process skills (SPS) by using a two-tier multiple choice instrument. It is a descriptive study that describes the profile of students' SPS. The subjects were 10th-grade students from schools categorized as high, medium and low. The two-tier multiple choice instrument consists of 30 questions, each containing an SPS indicator. The SPS indicators are: formulating a hypothesis, designing an experiment, analyzing data, applying the concept, communicating, and making a conclusion. The results of the analysis show that: (1) the average indicator achievement at the high-category school is 74.55% for formulating a hypothesis, 74.89% for designing an experiment, 67.89% for analyzing data, 52.89% for applying the concept, 80.22% for communicating, and 76% for making a conclusion; (2) at the medium-category school it is 53.47% for formulating a hypothesis, 59.86% for designing an experiment, 42.22% for analyzing data, 33.19% for applying the concept, 76.25% for communicating, and 61.53% for making a conclusion; (3) at the low-category school it is 51% for formulating a hypothesis, 55.17% for designing an experiment, 39.17% for analyzing data, 35.83% for applying the concept, 58.83% for communicating, and 58% for making a conclusion.
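
    The reported indicator percentages amount to averaging correct responses over the items mapped to each indicator; the short sketch below shows that tabulation under an assumed scoring file and an assumed item-to-indicator mapping (five items per indicator), neither of which comes from the paper.

    import pandas as pd

    # Hypothetical scoring file: one row per student, columns q1..q30 scored 0/1.
    scores = pd.read_csv("two_tier_scores.csv")

    # Assumed mapping of the 30 items to the six SPS indicators (five items each).
    indicators = {
        "formulating a hypothesis": ["q1", "q2", "q3", "q4", "q5"],
        "designing an experiment":  ["q6", "q7", "q8", "q9", "q10"],
        "analyzing data":           ["q11", "q12", "q13", "q14", "q15"],
        "applying the concept":     ["q16", "q17", "q18", "q19", "q20"],
        "communicating":            ["q21", "q22", "q23", "q24", "q25"],
        "making a conclusion":      ["q26", "q27", "q28", "q29", "q30"],
    }

    for name, items in indicators.items():
        pct = 100 * scores[items].to_numpy().mean()   # percentage of correct responses
        print(f"{name}: {pct:.2f}%")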

  16. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  17. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  18. ARC Code TI: Inference Kernel for Open Static Analyzers (IKOS)

    Data.gov (United States)

    National Aeronautics and Space Administration — IKOS is a C++ library designed to facilitate the development of sound static analyzers based on Abstract Interpretation. Specialization of a static analyzer for an...

  19. Lab-on-a-chip Astrobiology Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer will...

  20. Portable Programmable Multifunction Body Fluids Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both simple...

  1. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  2. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of...

  3. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of... introduction into service, and monthly thereafter, check the chemiluminescent oxides of nitrogen analyzer for...

  4. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent oxides of nitrogen analyzer shall receive the initial and periodic calibration described in this section...

  5. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of nitrogen analyzer as described in this section. (b) Initial and Periodic Interference...

  6. [Communication interface and software design for laboratory analyzer].

    Science.gov (United States)

    Wang, Dong; Wang, Yun-Guang; Wang, Chun-Hua; Song, Cheng-Jun; Hu, Wen-Zhong; Chen, Hang-Hua

    2009-07-01

    In this paper, the hardware and software interface for a laboratory analyzer is analyzed. By adopting VC++ and a multi-thread technique, error-free real-time communication between the LIS and the laboratory analyzer is realized. Practice shows that a system based on this technique runs stably and that all data are received accurately and in a timely manner.

  7. 40 CFR 89.322 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Test Equipment Provisions § 89.322 Carbon dioxide analyzer calibration. (a) Prior to its introduction... carbon dioxide analyzer shall be calibrated on all normally used instrument ranges. New calibration...

  8. 40 CFR 86.1324-84 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Exhaust Test Procedures § 86.1324-84 Carbon dioxide analyzer calibration. Prior to its introduction into service and monthly thereafter, the NDIR carbon dioxide analyzer shall be calibrated as follows: (a...

  9. 40 CFR 86.124-78 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Complete Heavy-Duty Vehicles; Test Procedures § 86.124-78 Carbon dioxide analyzer calibration. Prior to its introduction into service and monthly thereafter the NDIR carbon dioxide analyzer shall be calibrated: (a...

  10. 40 CFR 90.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Emission Test Equipment Provisions § 90.320 Carbon dioxide analyzer calibration. (a) Prior to its initial... carbon dioxide analyzer as follows: (1) Follow good engineering practices for instrument start-up and...

  11. 40 CFR 1065.250 - Nondispersive infra-red analyzer.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Nondispersive infra-red analyzer. 1065... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Measurement Instruments Co and Co2 Measurements § 1065.250 Nondispersive infra-red analyzer. (a) Application. Use a nondispersive infra-red (NDIR) analyzer to measure CO...

  12. 21 CFR 868.1975 - Water vapor analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Water vapor analyzer. 868.1975 Section 868.1975...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1975 Water vapor analyzer. (a) Identification. A water vapor analyzer is a device intended to measure the concentration of water vapor in a...

  13. The design of virtual signal analyzer with high cost performance

    Science.gov (United States)

    Wang, Ya-nan; Pei, Gui-ling; Xu, Lei

    2013-03-01

    Based on a 16-bit stereo audio CODEC and C#, this paper introduces a virtual signal analyzer. It mainly describes the system's overall structure, hardware design, and PC software framework. While dramatically reducing cost, the system also serves as a signal generator, oscilloscope, recorder, spectrum analyzer, time-frequency analyzer, and so on.
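
    The abstract gives no implementation detail; as a rough sketch of the spectrum-analyzer role such a codec-based instrument plays, the following Python snippet windows one block of samples and computes its magnitude spectrum. The sample rate, block size and test tone are assumptions (the published system itself is written in C#).

        # Spectrum-analyzer sketch: window one block of samples and compute its
        # magnitude spectrum. Sample rate, block size and the test tone are assumptions.
        import numpy as np

        FS = 48_000                  # assumed codec sample rate (Hz)
        N = 4096                     # analysis block size

        t = np.arange(N) / FS
        samples = 0.5 * np.sin(2 * np.pi * 1_000 * t)   # stand-in for codec input

        window = np.hanning(N)                          # reduce spectral leakage
        spectrum = np.fft.rfft(samples * window)
        freqs = np.fft.rfftfreq(N, d=1 / FS)
        magnitude_db = 20 * np.log10(np.abs(spectrum) + 1e-12)

        peak = freqs[np.argmax(magnitude_db)]
        print(f"dominant component near {peak:.1f} Hz")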

  14. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indirect pacemaker generator function analyzer... Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or...

  15. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  16. BMDExpress Data Viewer - a visualization tool to analyze BMDExpress datasets.

    Science.gov (United States)

    Kuo, Byron; Francina Webster, A; Thomas, Russell S; Yauk, Carole L

    2016-08-01

    Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure for risk assessment. BMDExpress applies BMD modeling to transcriptomic datasets to identify transcriptional BMDs. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer (http://apps.sciome.com:8082/BMDX_Viewer/), for visualizing and graphing BMDExpress output files. The application consists of "Summary Visualization" and "Dataset Exploratory" tools. Through analysis of transcriptomic datasets of the toxicants furan and 4,4'-methylenebis(N,N-dimethyl)benzenamine, we demonstrate that the "Summary Visualization Tools" can be used to examine distributions of gene and pathway BMD values, and to derive a potential point of departure value based on summary statistics. By applying filters on enrichment P-values and minimum number of significant genes, the "Functional Enrichment Analysis" tool enables the user to select biological processes or pathways that are selectively perturbed by chemical exposure and identify the related BMD. The "Multiple Dataset Comparison" tool enables comparison of gene and pathway BMD values across multiple experiments (e.g., across timepoints or tissues). The "BMDL-BMD Range Plotter" tool facilitates the observation of BMD trends across biological processes or pathways. Through our case studies, we demonstrate that BMDExpress Data Viewer is a useful tool to visualize, explore and analyze BMDExpress output files. Visualizing the data in this manner enables rapid assessment of data quality, model fit, doses of peak activity, most sensitive pathway perturbations and other metrics that will be useful in applying toxicogenomics in risk assessment. © 2015 Her Majesty the Queen in Right of Canada. Journal of Applied Toxicology published by John Wiley & Sons, Ltd.
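
    As an illustration of the kind of filtering the "Functional Enrichment Analysis" tool applies (an enrichment P-value cutoff plus a minimum number of significant genes, followed by a summary of the surviving pathway BMDs), here is a minimal Python sketch. The column names, thresholds and values are hypothetical and do not reflect the actual BMDExpress export schema.

        # Hypothetical pathway table and filter; column names, thresholds and values
        # are illustrative, not the actual BMDExpress export schema.
        import pandas as pd

        pathways = pd.DataFrame({
            "pathway":      ["oxidative stress", "apoptosis", "cell cycle"],
            "enrichment_p": [0.001, 0.20, 0.004],
            "n_sig_genes":  [12, 3, 8],
            "median_bmd":   [5.2, 9.8, 7.1],   # mg/kg-day, made-up values
        })

        selected = pathways[(pathways["enrichment_p"] < 0.05) &
                            (pathways["n_sig_genes"] >= 5)]

        # One possible summary: the lowest median pathway BMD among the selected
        # pathways as a candidate transcriptional point of departure.
        print(selected)
        print("candidate point of departure:", selected["median_bmd"].min(), "mg/kg-day")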

  17. MESAFace, a graphical interface to analyze the MESA output

    Science.gov (United States)

    Giannotti, M.; Wise, M.; Mohammed, A.

    2013-04-01

    MESA (Modules for Experiments in Stellar Astrophysics) has become very popular among astrophysicists as a powerful and reliable code to simulate stellar evolution. Analyzing the output data thoroughly may, however, present some challenges and be rather time-consuming. Here we describe MESAFace, a graphical and dynamical interface which provides an intuitive, efficient and quick way to analyze the MESA output.
    Catalogue identifier: AEOQ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOQ_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 19165
    No. of bytes in distributed program, including test data, etc.: 6300592
    Distribution format: tar.gz
    Programming language: Mathematica.
    Computer: Any computer capable of running Mathematica.
    Operating system: Any capable of running Mathematica. Tested on Linux, Mac, Windows XP, Windows 7.
    RAM: Recommended 2 Gigabytes or more.
    Supplementary material: Additional test data files are available.
    Classification: 1.7, 14.
    Nature of problem: Find a way to quickly and thoroughly analyze the output of a MESA run, including all the profiles, and have an efficient method to produce graphical representations of the data.
    Solution method: We created two scripts (to be run consecutively). The first one downloads all the data from a MESA run and organizes the profiles in order of age. All the files are saved as tables or arrays of tables which can then be accessed very quickly by Mathematica. The second script uses the Manipulate function to create a graphical interface which allows the user to choose what to plot from a set of menus and buttons. The information shown is updated in real time. The user can access very quickly all the data from the run under examination and visualize it with plots and tables.
    Unusual features: Moving the
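
    MESAFace itself is written in Mathematica; as a rough, language-agnostic illustration of the first script's job (collecting all profiles from a run and ordering them by stellar age), here is a minimal Python sketch. It assumes the standard MESA text layout, in which line 2 holds the header names, line 3 the header values, and line 6 the column names; the LOGS directory path is also an assumption.

        # Sketch: collect MESA profiles from a run and order them by stellar age.
        # Assumes the standard MESA text layout (line 2 = header names, line 3 =
        # header values, line 6 = column names) and a LOGS/ output directory.
        import glob
        import numpy as np

        def read_profile(path):
            with open(path) as fh:
                lines = fh.readlines()
            # simplified header parse; quoted values containing spaces would need more care
            header = dict(zip(lines[1].split(), lines[2].split()))
            data = np.genfromtxt(path, skip_header=5, names=True)
            return float(header["star_age"]), data

        profiles = sorted(
            (read_profile(p) for p in glob.glob("LOGS/profile*.data")),
            key=lambda item: item[0],          # order the profiles by star_age
        )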

  18. EDAS-manual. SATAN - system to analyze tremendous amounts of nuclear data. Vol. 2

    International Nuclear Information System (INIS)

    Goeringer, H.; Gralla, S.; Malzacher, P.; Richter, M.; Schall, D.; Winkelmann, K.

    1988-09-01

    The System to Analyze Tremendous Amounts of Nuclear Data (SATAN) is presented through the different steps of a specific experiment data evaluation called 'Linearisation'. The report contains the EDAS manual, covering EDAS commands, TSO commands, macros and procedures. The syntax and usage of the EDAS macros are explained. (DG)

  19. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    -stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  20. Observing and Analyzing Children's Mathematical Development, Based on Action Theory

    Science.gov (United States)

    Bunck, M. J. A.; Terlien, E.; van Groenestijn, M.; Toll, S. W. M.; Van Luit, J. E. H.

    2017-01-01

    Children who experience difficulties with learning mathematics should be taught by teachers who focus on the child's best way of learning. Analyses of the mathematical difficulties are necessary for fine-tuning mathematics education to the needs of these children. For this reason, an instrument for Observing and Analyzing children's Mathematical…