WorldWideScience

Sample records for laue case analyzer

  1. X-ray dark-field imaging and its application. Laue case analyzer

    CERN Document Server

    Ando, M

    2003-01-01

    A system for X-ray dark-field imaging under development and its applications are reported. The system comprises an asymmetric monochromator, a Laue-case analyzer whose thickness is specified for a given X-ray photon energy or wavelength, and a sample located between the two. Both X-ray optical elements use Si 4,4,0 diffraction in a parallel arrangement. To achieve the dark-field imaging condition, the Si Laue analyzer should be 1.075 mm thick for an X-ray energy of 35 keV. Since the system is very simple, a variety of applications can be expected, including materials science, biology, palaeontology and clinical medicine, where a large viewing area of 100 mm x 100 mm is needed. (author)
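
    As a quick plausibility check of the working point quoted above, Bragg's law fixes the diffraction angle of the Si 4,4,0 reflection at 35 keV. The sketch below is ours, not the paper's; the lattice constant and keV-to-angstrom conversion are standard handbook values.

```python
import math

A_SI = 5.4309       # Si lattice constant, angstroms (handbook value)
HC_KEV_A = 12.3984  # h*c, keV * angstrom

def bragg_angle_deg(h, k, l, energy_kev, a=A_SI):
    """Bragg angle (degrees) of reflection (h,k,l) for a cubic crystal."""
    d = a / math.sqrt(h * h + k * k + l * l)  # interplanar spacing, angstroms
    wavelength = HC_KEV_A / energy_kev        # photon wavelength, angstroms
    return math.degrees(math.asin(wavelength / (2.0 * d)))

# Si 4,4,0 at 35 keV, the geometry used by the system described above:
print(bragg_angle_deg(4, 4, 0, 35.0))  # ~10.6 degrees
```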

  2. High resolution short focal distance Bent Crystal Laue Analyzer for copper K edge x-ray absorption spectroscopy.

    Science.gov (United States)

    Kujala, N G; Karanfil, C; Barrea, R A

    2011-06-01

    We have developed a compact short focal distance Bent Crystal Laue Analyzer (BCLA) for Cu speciation studies of biological systems, with specific applications to cancer biology. The system provides high energy resolution and high background rejection. The system is composed of an aluminum block serving as a log-spiral bender for a 15 µm thick silicon 111 crystal, and a set of soller slits. The energy resolution of the BCLA (about 14 eV at the Cu Kα line) allows resolution of the Cu Kα(1) and Cu Kα(2) lines. The system is easily aligned using a set of motorized XYZ linear stages. Two operation modes are available: incident energy scans (IES) and emission energy scans (EES). IES scans the incident energy while the BCLA system is maintained at a preselected fixed position, typically the Cu Kα(1) line. EES is used when the incident energy is fixed and the analyzer is scanned to provide the peak profile of the emission lines of Cu.

  3. Hemispherical Laue camera

    Science.gov (United States)

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation-sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation-sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided by conventional Debye-Scherrer cameras.

  4. Laue 2007: international workshop on advanced Laue diffraction in frontier science

    Energy Technology Data Exchange (ETDEWEB)

    Adachi, S.I.; Akgul, G.; Aksoy, F.; Andersen, K.; Andersson, M.; Anfinrud, Ph.; Baruchel, J.; Bastie, P.; Bau, R.; Blakeley, M.; Bourgeois, D.; Brau, D.; Bravin, A.; Cammarata, M.; Christensen, M.; Cole, J.; Courtois, P.; Cousson, A.; Eggonopoulos-Papadopoulos, B.; Daoud-Aladine, M.A.; Dera, P.; Feng, R.; Fiedler, St.; Fischer, H.; Fisher, St.; Folami, S.; Fosu, M.A.; Fuente, F.; Fullagar, W.; Fulla Marsa, D.; Ghosh, R.; Giles, C.; Goossens, D.; Goujon, A.; Gutmann, M.; Heger, G.; Henry, E.; Hewat, A.; Hossmann, Ch.; Ivanov, A.; Jauch, W.; Jorgensen, R.; Katona, G.; Keen, D.; Kong, Q.; Koshihara, Sh.Y.; Lauss, B.; Laue, M.V.; Lecomte, C.; Legrand, V.; Lemee-Cailleau, M.H.; Marmeggi, J.C.; Martinez-Criado, G.; Mason, S.; McIntyre, G.; Mailleur, F.; Micha, J.S.; Moffat, K.; Mohammed Mustapha, A.; Ouladdiaf, B.; Pahl, R.; Parise, J.; Pearson, A.; Pecaut, J.; Popov, A.; Prokleska, J.; Raitman, E.; Ren, Z.; Rodriguez-Carvajal, J.; Sasaki, J.; Schmidt, M.; Schotte, F.; Stirling, W.; Suominen Fuller, M.; Tanaka, I.; Timmins, P.; Tomking, P.; Turner, M.; Van Thor, J.; Vettier, Ch.; Wildes, A.; Wilson, Ch.; Wohri, A.; Wulf, M.; Zhao, Y

    2007-07-01

    Laue diffraction is currently undergoing a lively renaissance due to new instrumental developments at both synchrotron X-ray and neutron sources. The aim of the workshop Laue-2007 is to offer state-of-the-art experimental methods and hands-on experience of data analysis for exploration, using single-crystal Laue diffraction, of the crystalline structure of complex materials in extreme cases. The oral sessions cover the following topics: -) history and renaissance of Laue diffraction, -) modern X-ray techniques, -) modern neutron techniques, -) applications, -) analysis and software demonstrations including hands-on experience, and -) future directions. This document gathers the abstracts of the presentations and of the posters.

  5. Von Laue's theorem and its applications

    CERN Document Server

    Wang, Changbiao

    2012-01-01

    Von Laue's theorem is strictly proved in detail to clarify confusions in textbooks and the literature. The theorem is then used to analyze the classical electron and the static electric field confined in a finite region of space.

  6. X-ray micro Laue diffraction tomography analysis of a solid oxide fuel cell

    Science.gov (United States)

    Ferreira Sanchez, Dario; Villanova, Julie; Laurencin, Jérôme; Micha, Jean-Sébastien; Montani, Alexandre; Gergaud, Patrice; Bleuet, Pierre

    2015-01-01

    The relevance of micro Laue diffraction tomography (µ-LT) to investigate heterogeneous polycrystalline materials has been studied. For this purpose, a multiphase solid oxide fuel cell (SOFC) electrode composite made of yttria-stabilized zirconia and nickel oxide phases, with grains of about a few micrometres in size, has been analyzed. In order to calibrate the Laue data and to test the technique’s sensitivity limits, a monocrystalline germanium sample of about 8 × 4 µm in cross-section size has also been studied through µ-LT. The SOFC and germanium Laue diffraction pattern analyses are compared and discussed. The indexing procedure has been successfully applied for the analysis of the germanium Laue data, and the depth-resolved two-dimensional cartographies of the full deviatoric strain tensor components were obtained. The development and application of an original geometrical approach to analyze the SOFC Laue data allowed the authors to resolve grains with sizes of about 3 µm and to identify their individual Laue patterns; by indexing those Laue patterns, the crystalline phases and orientations of most of the grains identified through the geometrical approach could be resolved. PMID:25844076

  7. Automated indexing of Laue images from polycrystalline materials

    Energy Technology Data Exchange (ETDEWEB)

    Chung, J.S.; Ice, G.E. [Oak Ridge National Lab., TN (United States). Metals and Ceramics Div.

    1998-12-31

    Third-generation hard x-ray synchrotron sources and new x-ray optics have revolutionized x-ray microbeams. Intense sub-micron x-ray beams are now routinely available for x-ray diffraction measurements. An important application of sub-micron x-ray beams is analyzing polycrystalline material by measuring the diffraction of individual grains. For these measurements, conventional analysis methods will not work. The most suitable method for microdiffraction on polycrystalline samples is taking broad-bandpass or white-beam Laue images. With this method, the crystal orientation and non-isostatic strain can be measured rapidly without rotating the sample or detector. The essential step is indexing the reflections from more than one grain. An algorithm has recently been developed to index broad-bandpass Laue images from multi-grain samples. For a single grain, a unique set of indices is found by comparing measured angles between Laue reflections with angles between possible indices derived from the x-ray energy bandpass and the scattering angle 2θ, as sketched below. This method has been extended to multigrain diffraction by successively indexing points not recognized in preceding indexing iterations. This automated indexing method can be used in a wide range of applications.
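
    The angle-matching step described in this abstract can be illustrated compactly. The sketch below assumes a cubic crystal (so a reflection's direction is just its (h,k,l) triple); the candidate list, tolerance and function names are ours, and the production algorithm further restricts candidates using the energy bandpass and 2θ.

```python
import itertools
import numpy as np

def angle_deg(g1, g2):
    """Angle between two (h,k,l) directions of a cubic crystal, in degrees."""
    g1, g2 = np.asarray(g1, float), np.asarray(g2, float)
    c = np.dot(g1, g2) / (np.linalg.norm(g1) * np.linalg.norm(g2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def match_pairs(measured_angle, candidates, tol=0.2):
    """Candidate (hkl, hkl) pairs whose interplanar angle matches a measured one."""
    return [(g1, g2) for g1, g2 in itertools.combinations(candidates, 2)
            if abs(angle_deg(g1, g2) - measured_angle) < tol]

candidates = [(1, 1, 1), (2, 0, 0), (2, 2, 0), (3, 1, 1)]
# 54.74 deg is the (111)/(200) interplanar angle, acos(1/sqrt(3)):
print(match_pairs(54.74, candidates))  # -> [((1, 1, 1), (2, 0, 0))]
```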

  8. Automation Enhancement of Multilayer Laue Lenses

    Energy Technology Data Exchange (ETDEWEB)

    Lauer K. R.; Conley R.

    2010-12-01

    X-ray optics fabrication at Brookhaven National Laboratory has been facilitated by a new, state-of-the-art magnetron sputtering physical deposition system. With its nine magnetron sputtering cathodes and a substrate carrier that moves on a linear rail via a UHV brushless linear servo motor, the system is capable of accurately depositing the many thousands of layers necessary for multilayer Laue lenses. I have engineered a versatile and automated control program from scratch for the base system and many subsystems. Its main features include a custom scripting language, a fully customizable graphical user interface, wireless and remote control, and a terminal-based interface. This control system has already been successfully used in the creation of many types of x-ray optics, including multilayer Laue lenses of several thousand layers. Before reaching the point at which a deposition can be run, stencil-like masks for the sputtering cathodes must be created to ensure the proper distribution of sputtered atoms. The quality of multilayer Laue lenses can also be difficult to measure, given the size of the thin film layers. I employ my knowledge of software and algorithms to further ease these previously painstaking processes with custom programs. Additionally, I will give an overview of an x-ray optic simulator package I helped develop during the summer of 2010. In the interest of keeping my software free and open, I have worked mostly with the multiplatform Python and the PyQt application framework, utilizing C and C++ where necessary.

  9. A bent Laue-Laue monochromator for a synchrotron-based computed tomography system

    CERN Document Server

    Ren, B; Chapman, L D; Ivanov, I; Wu, X Y; Zhong, Z; Huang, X

    1999-01-01

    We designed and tested a two-crystal bent Laue-Laue monochromator for wide, fan-shaped synchrotron X-ray beams for the program multiple energy computed tomography (MECT) at the National Synchrotron Light Source (NSLS). MECT employs monochromatic X-ray beams from the NSLS's X17B superconducting wiggler beamline for computed tomography (CT) with an improved image quality. MECT uses a fixed horizontal fan-shaped beam with the subject's apparatus rotating around a vertical axis. The new monochromator uses two Czochralski-grown Si crystals, 0.7 and 1.4 mm thick, respectively, and with thick ribs on their upper and lower ends. The crystals are bent cylindrically, with the axis of the cylinder parallel to the fan beam, using 4-rod benders with two fixed rods and two movable ones. The bent-crystal feature of the monochromator resolved the difficulties we had had with the flat Laue-Laue design previously used in MECT, which included (a) inadequate beam intensity, (b) excessive fluctuations in beam intensity, and (c) i...

  10. Extraction of accurate structure-factor amplitudes from Laue data: wavelength normalization with wiggler and undulator X-ray sources.

    Science.gov (United States)

    Srajer, V; Crosson, S; Schmidt, M; Key, J; Schotte, F; Anderson, S; Perman, B; Ren, Z; Teng, T Y; Bourgeois, D; Wulff, M; Moffat, K

    2000-07-01

    Wavelength normalization is an essential part of processing of Laue X-ray diffraction data and is critically important for deriving accurate structure-factor amplitudes. The results of wavelength normalization for Laue data obtained in nanosecond time-resolved experiments at the ID09 beamline at the European Synchrotron Radiation Facility, Grenoble, France, are presented. Several wiggler and undulator insertion devices with complex spectra were used. The results show that even in the most challenging cases, such as wiggler/undulator tandems or single-line undulators, accurate wavelength normalization does not require unusually redundant Laue data and can be accomplished using typical Laue data sets. Single-line undulator spectra derived from Laue data compare well with the measured incident X-ray spectra. Successful wavelength normalization of the undulator data was also confirmed by the observed signal in nanosecond time-resolved experiments. Single-line undulators, which are attractive for time-resolved experiments due to their high peak intensity and low polychromatic background, are compared with wigglers, based on data obtained on the same crystal.
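
    To make the normalization step concrete: each recorded Laue intensity is divided by the value of the λ-curve (source spectrum times wavelength-dependent response) at the wavelength that stimulated the reflection. The toy sketch below uses invented numbers and a Gaussian stand-in for a single-line undulator spectrum; in practice the curve itself is derived by requiring that symmetry-equivalent reflections agree after normalization.

```python
import numpy as np

def lam_curve(lam, peak=1.0, width=0.05):
    """Stand-in lambda-curve: a single undulator line plus a small background."""
    return np.exp(-0.5 * ((lam - peak) / width) ** 2) + 0.01

# Three symmetry-equivalent reflections stimulated at different wavelengths (A):
lam_obs = np.array([0.95, 1.00, 1.05])
I_obs = np.array([61.0, 100.0, 62.0])   # recorded integrated intensities

I_norm = I_obs / lam_curve(lam_obs)     # wavelength-normalized intensities
print(I_norm)  # roughly equal, as equivalent reflections must be
```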

  11. Simulation of a Laue lens with bent Ge(111) crystals

    CERN Document Server

    Valsan, Vineeth; Frontera, Filippo; Liccardo, Vincenzo; Caroli, Ezio; Stephen, John B

    2015-01-01

    In the context of the Laue project for focusing hard X-/soft gamma-rays, an entire Laue lens using bent Ge(111) crystal tiles with a 40 m curvature radius is simulated, with a focal length of 20 m. The focusing energy band is between 80 keV and 600 keV. The distortion of the output image of the lens on the focal plane due to the effect of crystal tile misalignment, as well as the radial distortion arising from the curvature of the crystals, is discussed in detail. The expected detection efficiency and instrument background are also estimated. Finally, the sensitivity of the Laue lens is calculated. A quantitative analysis of the results of these simulations is also presented.
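
    The lens geometry in such a simulation follows from Bragg's law: a crystal ring at radius r of a lens with focal length f deflects photons by 2θ = atan(r/f), which fixes the diffracted energy. A minimal sketch assuming the standard Ge lattice constant and the 20 m focal length quoted above (the ring radii are illustrative):

```python
import math

HC_KEV_A = 12.3984               # h*c, keV * angstrom
D_GE111 = 5.6575 / math.sqrt(3)  # Ge(111) spacing, angstroms (a = 5.6575 A)

def diffracted_energy_kev(radius_m, focal_m=20.0, d=D_GE111):
    """Photon energy diffracted by a crystal ring at a given lens radius."""
    theta = 0.5 * math.atan2(radius_m, focal_m)  # Bragg angle, radians
    return HC_KEV_A / (2.0 * d * math.sin(theta))

# The 80-600 keV band maps onto ring radii of roughly 0.13-0.95 m:
for r in (0.13, 0.25, 0.50, 0.95):
    print(f"r = {r:4.2f} m -> E = {diffracted_energy_kev(r):5.0f} keV")
```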

  12. Optical Laue diffraction on photonic structures designed by laser lithography

    Science.gov (United States)

    Samusev, K. B.; Rybin, M. V.; Lukashenko, S. Yu.; Limonov, M. F.

    2016-06-01

    Two-dimensional photonic crystals with square symmetry C4v were obtained using the laser lithography method. The structure of these samples was studied by scanning electron microscopy. Optical Laue diffraction of monochromatic light was studied experimentally as a function of the incidence angle of the laser beam and the lattice constant. The observed diffraction patterns are interpreted in the framework of the Laue diffraction mechanism for a one-dimensional chain of scattering elements, written out below. Red thresholds for different diffraction orders were determined experimentally and theoretically. The results of the calculations are in excellent agreement with experiment.
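
    For reference, the one-dimensional Laue condition invoked above can be written compactly. This is the standard textbook form, not copied from the paper:

```latex
% Diffraction from a 1D chain of period a, illuminated at angle \alpha_0:
% order m leaves at angle \alpha_m satisfying
\[
  a\,(\cos\alpha_m - \cos\alpha_0) = m\lambda .
\]
% The order propagates only while |\cos\alpha_m| \le 1, which gives the
% "red threshold": the longest wavelength at which order m is observable,
\[
  \lambda_{\max}(m) = \frac{a\,\bigl(1 - \operatorname{sgn}(m)\cos\alpha_0\bigr)}{\lvert m \rvert}.
\]
```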

  13. Dynamical focusing by bent, asymmetrically cut perfect crystals in Laue geometry.

    Science.gov (United States)

    Guigay, J P; Ferrero, C

    2016-07-01

    A semi-analytical approach based on the influence functions of a point source located on the crystal surface has been adopted to show that the focusing ability of cylindrically bent Laue crystals may be strongly enhanced by replacing symmetrically cut crystals with asymmetrically cut crystals. This approach is generally applicable to any distance between the X-ray source and the focusing bent crystal. A mathematically straightforward method to simplify the derivation of the already known expression of the influence functions in the case of deformed crystals with a constant strain gradient (e.g. cylindrically bent crystals) is also presented.

  14. Using the Case Survey Method To Analyze Policy Studies

    Science.gov (United States)

    Yin, Robert K.; Heald, Karen A.

    1975-01-01

    Describes a case study survey method that allows an analyst to aggregate (by means of a closed-ended questionnaire) the case study experiences and to assess the quality of each case study in a reliable and replicable manner. (Author/IRT)

  15. Improvement of Students’ Ability to Analyzing Cases on Case Studies Through Journal and Learning Log

    Directory of Open Access Journals (Sweden)

    Riska Ahmad

    2017-08-01

    The purpose of this research is to improve the ability of students in guidance and counseling to analyze cases through journals and learning logs. This classroom action research consists of two cycles, each comprising planning, implementation, observation and reflection. The research subjects are 20 sixth-semester students in guidance and counseling who were taking the Case Study course. The research instruments are observation guidelines, assessment rubrics, documentation of case studies in the form of journals and learning logs, and case study reports. The study was conducted collaboratively with students of the master's program in guidance and counseling. The results showed that in cycle 1 students were able to identify cases, develop ideas about the case, and select and use instruments to analyze the cause of the problem. In cycle 2, 17 of the 20 students were able to analyze the cause of the problem, select the type of service and provide assistance appropriate to the problem case. The overall grades obtained by the students in the Case Study course also increased. In terms of explanation of concepts, correctness of concepts and creativity, the average student ability was in the good category based on ratings given by fellow students, although some were rated lower, in association with how actively opinions were offered and the quality of the opinions expressed.

  16. Analyzing discussions on twitter: Case study on HPV vaccinations

    NARCIS (Netherlands)

    Kaptein, R.; Boertjes, E.; Langley, D.

    2014-01-01

    In this work we analyze the discussions on Twitter around the Human papillomavirus (HPV) vaccinations. We collect a dataset consisting of tweets related to the HPV vaccinations by searching for relevant keywords, by retrieving the conversations on Twitter, and by retrieving tweets from our user group.

  17. Time of flight Laue fiber diffraction studies of perdeuterated DNA

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, V.T.; Whalley, M.A.; Mahendrasingam, A.; Fuller, W. [Keele Univ. (United Kingdom); and others]

    1994-12-31

    The diffractometer SXD at the Rutherford Appleton Laboratory ISIS pulsed neutron source has been used to record high resolution time-of-flight Laue fiber diffraction data from DNA. These experiments, which are the first of their kind, were undertaken using fibers of DNA in the A conformation, prepared from deuterated DNA in order to minimise incoherent background scattering. These studies complement previous experiments on instrument D19 at the Institut Laue-Langevin using monochromatic neutrons. Sample preparation involved drawing large numbers of these deuterated DNA fibers and mounting them in a parallel array. The strategy of data collection is discussed in terms of camera design, sample environment and data collection. The methods used to correct the recorded time-of-flight data and map it into the final reciprocal-space fiber diffraction dataset are also discussed. Difference Fourier maps showing the distribution of water around A-DNA calculated on the basis of these data are compared with results obtained using data recorded from hydrogenated A-DNA on D19. Since the methods used for sample preparation, data collection and data processing are fundamentally different for the monochromatic and Laue techniques, the results of these experiments also afford a valuable opportunity to independently test the data reduction and analysis techniques used in the two methods.

  18. Rapid and Accurate Assembly Method for a New Laue Lens Prototype

    DEFF Research Database (Denmark)

    Wade, Colin; Barriere, Nicolas; Hanlon, Lorraine

    2015-01-01

    The Laue lens is a technology for gamma-ray astrophysics whereby gamma-rays of particular energies can be focused by a suitable arrangement of crystals. The Laue lens assembly station at UC Berkeley was used to build a technological demonstrator addressing the key issues of crystal mounting speed...

  19. Bragg-von Laue diffraction generalized to twisted X-rays.

    Science.gov (United States)

    Jüstel, Dominik; Friesecke, Gero; James, Richard D

    2016-03-01

    A pervasive limitation of nearly all practical X-ray methods for the determination of the atomic scale structure of matter is the need to crystallize the molecule, compound or alloy in a sufficiently large (∼ 10 × 10 × 10 µm) periodic array. In this paper an X-ray method applicable to structure determination of some important noncrystalline structures is proposed. It is designed according to a strict mathematical analog of von Laue's method, but replacing the translation group by another symmetry group, and simultaneously replacing plane waves by different exact closed-form solutions of Maxwell's equations. Details are presented for helical structures like carbon nanotubes or filamentous viruses. In computer simulations the accuracy of the determination of structure is shown to be comparable to the periodic case.

  1. Hard X / soft gamma ray polarimetry using a Laue lens

    CERN Document Server

    Barrière, Nicolas M; Ubertini, Pietro

    2011-01-01

    Hard X-/soft gamma-ray polarimetric analysis can be performed efficiently by studying the Compton scattering anisotropy in a detector composed of fine pixels. But in the energy range above 100 keV, where source fluxes are extremely weak and the instrumental background very strong, such a delicate measurement is very difficult to perform. The Laue lens is an emerging technology based on diffraction in crystals allowing the concentration of soft gamma rays. This kind of optics can be applied to realize an efficient high-sensitivity and high-angular-resolution telescope, at the cost of a field of view reduced to a few arcmin. A 20 m focal length telescope concept focusing in the 100 keV - 600 keV energy range is taken as an example here to show that recent progress in the domain of high-reflectivity crystals can lead to very appealing performance. The Laue lens being fully transparent to polarization, this kind of telescope would be well suited to perform polarimetric studies, since the ideal focal plane is a ...

  2. One dimensional focusing with high numerical aperture multilayer Laue lens

    Energy Technology Data Exchange (ETDEWEB)

    Bajt, Saša, E-mail: sasa.bajt@desy.de; Prasciolu, Mauro [Photon Science, DESY, Notkestrasse 85, 22607 Hamburg (Germany); Morgan, Andrew J. [Center for Free-Electron Laser Science, DESY, Notkestrasse 85, 22607 Hamburg (Germany); Chapman, Henry N. [Center for Free-Electron Laser Science, DESY, Notkestrasse 85, 22607 Hamburg (Germany); Dept. of Physics, University of Hamburg, Luruper Chaussee 149, 22607 Hamburg (Germany); Centre for Ultrafast Imaging, Luruper Chaussee 149, 22607 Hamburg (Germany); Krzywinski, Jacek [SLAC, 2575 Sand Hill Rd., Menlo Park, CA 94025 (United States); Andrejczuk, Andrzej [Faculty of Physics, University of Bialystok, K. Ciolkowskiego 1L, 15-245, Bialystok (Poland)

    2016-01-28

    Multilayer Laue lenses (MLLs) capitalize on the developments in multilayer deposition technologies for fabricating reflective coatings, specifically undertaken for EUV lithography, where layer thicknesses of several nanometers can be achieved. MLLs are deposited layer by layer, with their thicknesses following the zone plate law, and then pieces are sliced and extracted for use in focusing. Rays are reflected in the Laue geometry. The efficiency of an MLL can be very high, and is maximized by making the slice equal to about half a Pendellösung period, so that most energy is transferred from the undiffracted to the diffracted beam, and by ensuring that the Bragg condition is met at each point in the zone plate. This latter condition requires that the layers are tilted to the beam by an amount that varies with layer position; e.g. for focusing a collimated beam, the layers should be normal to a cylinder of radius twice the focal length. We have fabricated such tilted-zone MLLs and find that they exhibit improved efficiency across their entire pupil as compared with parallel-zone MLLs. This leads to a higher effective NA of the optic and hence higher resolution.
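
    The zone plate law mentioned above fixes where each interface must sit. A hedged sketch with illustrative parameters (the wavelength, focal length and zone counts are ours, not the paper's):

```python
import math

WAVELENGTH = 1.0e-10  # 1 A (~12.4 keV), illustrative
FOCAL = 2.0e-3        # 2 mm focal length, illustrative

def r_zone(n):
    """Radius of the n-th zone boundary (zone plate law with aberration term)."""
    return math.sqrt(n * WAVELENGTH * FOCAL + (n * WAVELENGTH / 2.0) ** 2)

# Layer n has thickness r_n - r_(n-1), shrinking toward the outer zones:
for n in (1, 100, 1000, 10000):
    d_n = r_zone(n) - r_zone(n - 1)
    print(f"n = {n:5d}:  r = {r_zone(n) * 1e6:7.3f} um,  d = {d_n * 1e9:8.3f} nm")
```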

  3. Laue-DIC: a new method for improved stress field measurements at the micrometer scale

    Energy Technology Data Exchange (ETDEWEB)

    Petit, J., E-mail: johannpetit@u-paris10.fr [LEME, Université Paris Ouest, 50 rue de Sèvres, F-92410 Ville d’Avray (France); Castelnau, O. [PIMM, CNRS, Arts and Métiers ParisTech, 151 Bd de l’Hopital, F-75013 Paris (France); Bornert, M. [Laboratoire Navier, Université Paris-Est, École des Ponts ParisTech, F-77455 Marne-la-Vallée (France); Zhang, F. G. [PIMM, CNRS, Arts and Métiers ParisTech, 151 Bd de l’Hopital, F-75013 Paris (France); Hofmann, F.; Korsunsky, A. M. [Department of Engineering Science, University of Oxford, Parks Road, Oxford OX1 3PJ (United Kingdom); Faurie, D. [LSPM, CNRS, Université Paris 13, 93430 Villetaneuse (France); Le Bourlot, C. [INSA-Lyon, MATEIS CNRS UMR5510, F-69621 Villeurbanne (France); Micha, J. S. [Université Grenoble Alpes, INAC-SPrAM, F-38000 Grenoble (France); CNRS, SPrAM, F-38000 Grenoble (France); CRG-IF BM32 at ESRF, 71 Avenue des Martyrs, F-38000 Grenoble (France); Robach, O.; Ulrich, O. [Université Grenoble Alpes, INAC-SPrAM, F-38000 Grenoble (France); CRG-IF BM32 at ESRF, 71 Avenue des Martyrs, F-38000 Grenoble (France); CEA, INAC-SP2M, F-38000 Grenoble (France)

    2015-05-09

    The increment of elastic strain distribution, with a micrometer spatial resolution, is obtained by the correlation of successive Laue images. Application to a bent Si crystal allows evaluation of the accuracy of this new Laue-DIC method, which is about 10⁻⁵. A better understanding of the effective mechanical behavior of polycrystalline materials requires an accurate knowledge of the behavior at a scale smaller than the grain size. The X-ray Laue microdiffraction technique available at beamline BM32 at the European Synchrotron Radiation Facility is ideally suited for probing elastic strains (and associated stresses) in deformed polycrystalline materials with a spatial resolution smaller than a micrometer. However, the standard technique used to evaluate local stresses from the distortion of Laue patterns lacks accuracy for many micromechanical applications, mostly due to (i) the fitting of Laue spots by analytical functions, and (ii) the necessary comparison of the measured pattern with the theoretical one from an unstrained reference specimen. In the present paper, a new method for the analysis of Laue images is presented. A Digital Image Correlation (DIC) technique, which is essentially insensitive to the shape of Laue spots, is applied to measure the relative distortion of Laue patterns acquired at two different positions on the specimen. The new method is tested on an in situ deformed Si single-crystal, for which the prescribed stress distribution has been calculated by finite-element analysis. It is shown that the new Laue-DIC method allows determination of local stresses with a strain resolution of the order of 10⁻⁵.
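
    The core DIC step, locating the displacement of a Laue spot between two images from the peak of their cross-correlation independently of the spot's shape, can be sketched in a few lines. The synthetic Gaussian spots and the pixel-level FFT correlation below are ours for illustration; the actual Laue-DIC pipeline works to sub-pixel accuracy on measured patterns.

```python
import numpy as np

def spot(shape, x0, y0, sigma=3.0):
    """Synthetic Gaussian 'Laue spot' centered at (x0, y0)."""
    y, x = np.indices(shape)
    return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

ref = spot((64, 64), 30.0, 30.0)
cur = spot((64, 64), 33.0, 28.0)  # same spot shifted by (+3, -2) pixels

# Cross-correlation via FFT; its peak sits at the shift of cur relative to ref.
corr = np.fft.ifft2(np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
dy, dx = [s - n if s > n // 2 else s for s, n in zip((dy, dx), corr.shape)]
print(dx, dy)  # -> 3 -2 (pixel level; Laue-DIC refines this to sub-pixel)
```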

  4. The clinical diagnosis and treatment about 22 cases of limbic encephalitis were retrospectively analyzed.

    Science.gov (United States)

    Zang, Weiping; Zhang, Zhijun; Feng, Laihui; Zhang, Ailing

    2016-03-01

    To summarize and analyze the clinical characteristics and treatment of limbic encephalitis, in order to provide a basis for clinical work, we retrospectively analyzed the clinical characteristics, magnetic resonance imaging (MRI), cerebrospinal fluid (CSF) and autoimmune antibody results of 22 patients with limbic encephalitis at Zhengzhou People's Hospital from March 2013 to May 2014. Psychiatric disturbances such as hallucinations were the typical clinical manifestation in the 22 cases; memory decline occurred in 18 cases, seizures in 13, altered level of consciousness in 10, movement disorders in 7, and fever in 9. Fourteen cases improved after treatment with antiviral and immunosuppressive therapy; 5 were left with memory decline, 2 with marked agitation, and 1 with seizures. The clinical symptoms of limbic encephalitis are complicated, variable and nonspecific, so early diagnosis and treatment are very important for the prognosis of patients.

  5. Multilayer Laue Lens Growth at NSLS-II

    Energy Technology Data Exchange (ETDEWEB)

    Conley R.; Bouet, N.; Lauer, K.; Carlucci-Dayton, M.; Biancarosa, J.; Boas, L.; Drannbauer, J.; Feraca, J.; Rosenbaum, L.

    2012-08-15

    The new NSLS-II deposition laboratory has been commissioned to include a variety of thin-film characterization equipment and a next-generation deposition system. The primary goal for this effort is R&D on the multilayer Laue lens (MLL), which is a new type of x-ray optic with the potential for an unprecedented level of x-ray nano-focusing. This unique deposition system contains many design features in order to facilitate growth of combined depth-graded and laterally graded multilayers with precise thickness control over many thousands of layers, providing total film growth in one run of up to 100 µm thick or greater. A precision in-vacuum linear motor servo system raster-scans a substrate over an array of magnetrons with shaped apertures at well-defined velocities to effect a multilayer coating. The design, commissioning, and performance metrics of the NSLS-II deposition system will be discussed. Latest growth results of both MLL and reflective multilayers in this machine will be presented.

  6. Effects of Professional Experience and Group Interaction on Information Requested in Analyzing IT Cases

    Science.gov (United States)

    Lehmann, Constance M.; Heagy, Cynthia D.

    2008-01-01

    The authors investigated the effects of professional experience and group interaction on the information that information technology professionals and graduate accounting information system (AIS) students request when analyzing business cases related to information systems design and implementation. Understanding these effects can contribute to…

  7. Soft gamma-ray optics: new Laue lens design and performance estimates

    CERN Document Server

    Barriere, N; Abrosimov, N; Von Ballmoos, P; Bastie, P; Courtois, P; Jentschel, M; Knödlseder, J; Rousselle, J; Ubertini, P

    2009-01-01

    Laue lenses are an emerging technology based on diffraction in crystals that allows the concentration of soft gamma rays. This kind of optics, which works in the 100 keV - 1.5 MeV band, can be used to realize a high-sensitivity and high-angular-resolution telescope (in a narrow field of view). This paper reviews the recent progress that has been made in the development of efficient crystals, in design studies and in modelling the response of Laue lenses. Through the example of a new concept of a 20 m focal length lens focusing in the 100 keV - 600 keV band, the performance of a telescope based on a Laue lens is presented. This lens uses the most efficient mosaic crystals in each sub-energy range in order to yield the maximum reflectivity. Imaging capabilities are investigated and show promising results.

  8. Verification of the weak equivalence principle with Laue diffracting neutrons: Test experiment

    Science.gov (United States)

    Vezhlev, E. O.; Voronin, V. V.; Kuznetsov, I. A.; Semenikhin, S. Yu.; Fedorov, V. V.

    2013-07-01

    We propose a novel experiment to test the weak equivalence principle (WEP) for the Laue-diffracting neutron. Our experiment is based on the essential magnification of an external effect on a neutron diffracting in the Laue geometry at Bragg angles close to the right angle, coupled with an additional enhancement factor that arises from the delay of the Laue-diffracting neutron at such Bragg angles. This enhancement phenomenon is proposed to be utilized for measuring the force, which deviates from zero if the WEP is violated. The accuracy of measuring the ratio of the inertial to the gravitational neutron mass with the introduced setup can reach ~10⁻⁵, more than one order of magnitude better than the best present-day result.

  9. Laue crystal structure of Shewanella oneidensis cytochrome c nitrite reductase from a high-yield expression system

    Energy Technology Data Exchange (ETDEWEB)

    Youngblut, Matthew; Judd, Evan T.; Srajer, Vukica; Sayyed, Bilal; Goelzer, Tyler; Elliott, Sean J.; Schmidt, Marius; Pacheco, A. Andrew (UW); (UC); (BU)

    2012-09-11

    The high-yield expression and purification of Shewanella oneidensis cytochrome c nitrite reductase (ccNiR) and its characterization by a variety of methods, notably Laue crystallography, are reported. A key component of the expression system is an artificial ccNiR gene in which the N-terminal signal peptide from the highly expressed S. oneidensis protein 'small tetraheme c' replaces the wild-type signal peptide. This gene, inserted into the plasmid pHSG298 and expressed in S. oneidensis TSP-1 strain, generated approximately 20 mg crude ccNiR per liter of culture, compared with 0.5-1 mg/L for untransformed cells. Purified ccNiR has nitrite and hydroxylamine reductase activities comparable to those previously reported for Escherichia coli ccNiR, and is stable for over 2 weeks in pH 7 solution at 4 °C. UV/vis spectropotentiometric titrations and protein film voltammetry identified five independent one-electron reduction processes. Global analysis of the spectropotentiometric data also allowed determination of the extinction coefficient spectra for the five reduced ccNiR species. The characteristics of the individual extinction coefficient spectra suggest that, within each reduced species, the electrons are distributed among the various hemes, rather than being localized on specific heme centers. The purified ccNiR yielded good-quality crystals, with which the 2.59 Å resolution structure was solved at room temperature using the Laue diffraction method. The structure is similar to that of E. coli ccNiR, except in the region where the enzyme interacts with its physiological electron donor (CymA in the case of S. oneidensis ccNiR, NrfB in the case of the E. coli protein).

  10. R&D progress on second-generation crystals for Laue lens applications

    CERN Document Server

    Barrière, N; Bastie, P; Courtois, P; Abrosimov, N V; Andersen, K; Buslaps, T; Camus, T; Halloin, H; Jentschel, M; Knödlseder, J; Roudil, G; Serre, D; Skinner, G

    2007-01-01

    The concept of a gamma-ray telescope based on a Laue lens offers the possibility to increase the sensitivity by more than an order of magnitude with respect to existing instruments. Laue lenses have been developed by our collaboration for several years: the main achievement of this R&D program was the CLAIRE lens prototype. Since then, the endeavour has been oriented towards the development of efficient diffracting elements (crystal slabs), the aim being to step from a technological Laue lens to a scientifically exploitable lens. The latest mission concept featuring a gamma-ray lens is the European Gamma-Ray Imager (GRI), which intends to make use of the Laue lens to cover energies from 200 keV to 1300 keV. Investigations of two promising materials, low-mosaicity copper and gradient-concentration silicon-germanium, are presented in this paper. The measurements have been performed during three runs on beamline ID15A of the European Synchrotron Radiation Facility, and on the GAMS 4 instrument of the Institut...

  11. A focal plane detector design for a wide-band Laue-lens telescope

    DEFF Research Database (Denmark)

    Caroli, E.; Auricchio, N.; Amati, L.

    2005-01-01

    ... and the detection of nuclear and annihilation lines. Recently the development of high-energy Laue lenses with broad energy bandpasses from 60 to 600 keV has been proposed for a Hard X-ray focusing Telescope (HAXTEL) in order to study the X-ray continuum of celestial sources. The required focal plane detector...

  12. ANALYZING THE PROCESS OF PRODUCTION IN LOGISTICS SUGARCANE MILL: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Alexandre Tognoli

    2011-06-01

    The objective was to present and analyze the physical arrangement and production logistics of a sugarcane mill, in order to expose the processes involved, analyze them more deeply and thus contribute to more efficient production. The relevance of this presentation is linked to the benefits that the plant and its professionals can obtain through this work, enabling the development of new methods and production alternatives. The research method used was a case study based on interviews, on-site observation and document analysis, which was very appropriate as it allowed examination and cross-checking. This work allows a better understanding of the production-process logistics of a sugarcane mill, together with suggestions and methods for more efficient production.

  13. A case-control study and analyze the epidemiological importance risk of family history of psoriasis

    Directory of Open Access Journals (Sweden)

    Anca Chiriac

    2014-01-01

    We have conducted a case-control study to analyze the epidemiological importance of family history as a risk factor for psoriasis. The retrospective study covered 1236 patients diagnosed with psoriasis on clinical and histopathological grounds between 2004 and 2011 in an out-patient clinic in the north-eastern part of Romania. The sex ratio was 1.18:1 (male patients 54.13%, female patients 45.87%); the median age at diagnosis was 29.34±15.24 (SD); a family history of psoriasis (by declaration) was present in 29.53% of cases.

  14. Analyzing privacy requirements: A case study of healthcare in Saudi Arabia.

    Science.gov (United States)

    Ebad, Shouki A; Jaha, Emad S; Al-Qadhi, Mohammed A

    2016-01-01

    Developing legally compliant systems is a challenging software engineering problem, especially in systems that are governed by law, such as healthcare information systems. This challenge comes from the ambiguities and domain-specific definitions that are found in governmental rules. Therefore, there is a significant business need to automatically analyze privacy texts, extract rules and subsequently enforce them throughout the supply chain. The existing works that analyze health regulations use the U.S. Health Insurance Portability and Accountability Act as a case study. In this article, we applied the Breaux and Antón approach to the text of the Saudi Arabian healthcare privacy regulations; in Saudi Arabia, privacy is among the top dilemmas for public and private healthcare practitioners. As a result, we extracted and analyzed 2 rights, 4 obligations, 22 constraints, and 6 rules. Our analysis can assist requirements engineers, standards organizations, compliance officers and stakeholders by ensuring that their systems conform to Saudi policy. In addition, this article discusses the threats to the study validity and suggests open problems for future research.

  15. An X-ray Raman spectrometer for EXAFS studies on minerals: bent Laue spectrometer with 20 keV X-rays.

    Science.gov (United States)

    Hiraoka, N; Fukui, H; Tanida, H; Toyokawa, H; Cai, Y Q; Tsuei, K D

    2013-03-01

    An X-ray Raman spectrometer for studies of local structures in minerals is discussed. Contrary to widely adopted back-scattering spectrometers using ≤10 keV X-rays, a spectrometer utilizing ~20 keV X-rays and a bent Laue analyzer is proposed. The 20 keV photons penetrate mineral samples much more deeply than 10 keV photons, so that high intensity is obtained owing to an enhancement of the scattering volume. Furthermore, a bent Laue analyzer provides a wide band-pass and a high reflectivity, leading to a much enhanced integrated intensity. A prototype spectrometer has been constructed and performance tests carried out. The oxygen K-edge in SiO2 glass and crystal (α-quartz) has been measured with energy resolutions of 4 eV (EXAFS mode) and 1.3 eV (XANES mode). Unlike methods previously adopted, it is proposed to determine the pre-edge curve based on a theoretical Compton profile and a Monte Carlo multiple-scattering simulation before extracting EXAFS features. It is shown that the obtained EXAFS features are reproduced fairly well by a cluster model with a minimal set of fitting parameters. The spectrometer and the data processing proposed here are readily applicable to high-pressure studies.

  16. Analyzing and Comparing Biomass Feedstock Supply Systems in China: Corn Stover and Sweet Sorghum Case Studies

    Directory of Open Access Journals (Sweden)

    Lantian Ren

    2015-06-01

    This paper analyzes the rural Chinese biomass supply system and models supply chain operations according to U.S. concepts of logistical unit operations: harvest and collection, storage, transportation, preprocessing, and handling and queuing. In this paper, we quantify the logistics costs of corn stover and sweet sorghum in China under different scenarios. We analyze three scenarios of corn stover logistics from northeast China and three scenarios of sweet sorghum stalk logistics from Inner Mongolia in China. The case study estimates the logistics cost of corn stover and sweet sorghum stalk to be $52.95/dry metric ton and $52.64/dry metric ton, respectively, for the current labor-based biomass logistics system. However, if the feedstock logistics operation is mechanized, the cost of corn stover and sweet sorghum stalk decreases to $36.01/dry metric ton and $35.76/dry metric ton, respectively. The study also includes a sensitivity analysis to identify the cost factors that cause logistics cost variation. Results of the sensitivity analysis show that the labor price has the most influence on the logistics cost of corn stover and sweet sorghum stalk, with a variation of $6 to $12/dry metric ton.

  17. Analyzing data from a fuzzy rating scale-based questionnaire. A case study.

    Science.gov (United States)

    Gil, María Ángeles; Lubiano, María Asunción; de la Rosa de Sáa, Sara; Sinova, Beatriz

    2015-01-01

    The fuzzy rating scale was introduced to cope with the imprecision of human thought and experience in measuring attitudes in many fields of Psychology. The flexibility and expressiveness of this scale allow us to properly describe the answers to many questions involving psychological measurement. Analyzing the responses to a fuzzy rating scale-based questionnaire is indeed a critical problem. Nevertheless, over the last few years a methodology has been developed to statistically analyze fuzzy data in such a way that the information it contains is fully exploited. In this paper, a summary review of the main procedures is given. The methods are illustrated by their application to the dataset obtained from a case study with nine-year-old children. In this study, children replied to some questions from the well-known TIMSS/PIRLS questionnaire by using a fuzzy rating scale. The form could be filled in either on the computer or by hand. The study indicates that the requirements of background and training underlying the fuzzy rating scale are not too demanding. Moreover, it is clearly shown that statistical conclusions quite often differ substantially depending on whether the responses are given in accordance with a Likert scale or a fuzzy rating scale.

  18. Analyzing the performance of the planning system by use of AAPM TG 119 test cases.

    Science.gov (United States)

    Nithya, L; Raj, N Arunai Nambi; Rathinamuthu, Sasikumar; Pandey, Manish Bhushan

    2016-01-01

    Our objective in this study was to create AAPM TG 119 test plans for intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) in the Monaco planning system. The results were compared with the published studies, and the performance of the Monaco planning system was analyzed. AAPM TG 119 proposed a set of test cases called multi-target, mock prostate, mock head and neck and C-shape to ascertain the overall accuracy of IMRT planning, measurement, and analysis. We used these test cases to investigate the performance of the Monaco planning system for the complex plans. For these test cases, we created IMRT plans with static multi-leaf collimator (MLC) and dynamic MLC by using 7-9 static beams as explained in TG-119. VMAT plans were also created with a 320° arc length and a single or double arc. The planning objectives and dose were set as described in TG 119. The dose prescriptions for multi-target, mock prostate, mock head and neck, and C-shape were taken as 50, 75.6, 50 and 50 Gy, respectively. All plans were compared with the results of TG 119 and the study done by Mynampati et al. Point dose and fluence measurements were done with a CC13 chamber and ArcCHECK phantom, respectively. Gamma analysis was done for the calculated and measured dose. Using the Monaco planning system, we achieved the goals mentioned in AAPM TG-119, and the plans were comparable to those of other studies. A comparison of point dose and fluence showed good results. From these results, we conclude that the performance of the Monaco planning system is good for complex plans.
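
    For context, the gamma analysis mentioned above scores agreement between measured and calculated dose with a combined dose-difference/distance-to-agreement criterion (the Low et al. formulation). A minimal 1D sketch with synthetic profiles; the 3%/3 mm criteria are illustrative, not values prescribed by TG 119:

```python
import numpy as np

def gamma_1d(x, dose_eval, dose_ref, dd=0.03, dta=3.0):
    """Per-point gamma index for two dose profiles on the same grid x (mm)."""
    norm = np.max(dose_ref)                  # global dose normalization
    g = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist = (x - xi) / dta                        # distance term
        diff = (dose_eval - di) / (dd * norm)        # dose-difference term
        g[i] = np.sqrt(dist ** 2 + diff ** 2).min()  # minimize over positions
    return g

x = np.linspace(-50.0, 50.0, 201)
ref = np.exp(-x ** 2 / 800.0)                # synthetic reference profile
ev = 1.01 * np.exp(-(x - 1.0) ** 2 / 800.0)  # shifted, rescaled "measurement"
g = gamma_1d(x, ev, ref)
print(f"pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f}%")
```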

  19. Moiré pattern from a multiple Bragg–Laue interferometer

    Energy Technology Data Exchange (ETDEWEB)

    Hirano, Kenji, E-mail: hirano-k@sit.jp; Fukamachi, Tomoe; Kanematsu, Yoshinobu; Jongsukswat, Sukswat; Negishi, Riichirou; Ju, Dongying [Saitama Institute of Technology, 1690 Fusaiji, Fukaya, Saitama 369-0293 (Japan); Hirano, Keiichi [Institute of Material Structure Science, KEK-PF, High Energy Accelerator Research Organization, Oho, Tsukuba, Ibaraki 305-0801 (Japan); Kawamura, Takaaki [Department of Mathematics and Physics, University of Yamanashi, Kofu, Yamanashi 400-8510 (Japan)

    2012-01-01

    A moiré pattern is observed by dividing the incident beam into two and formed by multiple Bragg–Laue interference fringes corresponding to the two incident beams. The coherency of X-rays from a bending-magnet beamline is evaluated using the moiré pattern. In X-ray section topography of Si 220 diffraction in a multiple Bragg–Laue mode, a moiré pattern is observed when the incident beam is divided into two parts by inserting a platinum wire in the middle of the beam. The moiré pattern can be explained by the summation of two interference fringes corresponding to the two incident beams. The coherency of the X-rays from the bending-magnet beamline is estimated using the moiré pattern.

  20. A neutron image plate quasi-Laue diffractometer for protein crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Cipriani, F.; Castagna, J.C.; Wilkinson, C. [European Molecular Biology Laboratory, Grenoble (France); and others]

    1994-12-31

    An instrument which is based on image plate technology has been constructed to perform cold neutron Laue crystallography on protein structures. The crystal is mounted at the center of a cylindrical detector which is 400 mm long and has a circumference of 1000 mm, with gadolinium oxide-containing image plates mounted on its exterior surface. Laue images registered on the plate are read out by rotating the drum and translating a laser read head parallel to the cylinder axis, giving a pixel size of 200 µm x 200 µm and a total read time of 5 minutes. Preliminary results indicate that it should be possible to obtain a complete data set from a protein crystal to atomic resolution in about two weeks.

  1. Submicron focusing of XUV radiation from a laser plasma source using a multilayer Laue lens

    Science.gov (United States)

    Reese, M.; Schäfer, B.; Großmann, P.; Bayer, A.; Mann, K.; Liese, T.; Krebs, H. U.

    2011-01-01

    The focusing properties of a one-dimensional multilayer Laue lens (MLL) were investigated using monochromatic soft X-ray radiation from a table-top, laser-produced plasma source. The MLL was fabricated by focused ion beam (FIB) structuring of pulsed-laser-deposited ZrO2/Ti multilayers. This novel method offers the potential to overcome limitations encountered in electron lithographic processes. Utilizing this multilayer Laue lens, a line focus of XUV radiation from a laser-induced plasma in a nitrogen gas puff target could be generated. The evaluated focal length is close to the designed value of 220 μm for the measurement wavelength of 2.88 nm. The divergence angle and beam waist diameter were measured by a moving knife edge and a far-field experiment, determining all relevant beam parameters based on second-order moments. The waist diameter was found to be approximately 370 nm (FWHM).

  2. Development of multilayer Laue Lenses for soft X-ray radiation

    Energy Technology Data Exchange (ETDEWEB)

    Liese, Tobias; Krebs, Hans-Ulrich [Institut fuer Materialphysik, Universitaet Goettingen (Germany); Reese, Michael; Grossmann, Peter; Mann, Klaus [Laser-Laboratorium Goettingen e.V. (Germany)

    2010-07-01

    Despite improvements in fabrication techniques for Fresnel zone plates as diffractive optics for X-ray microscopy, the spatial resolution achievable with high diffraction efficiency has largely reached a limit. A novel approach to focusing soft X-rays in the water-window regime (2.3-4.4 nm) is to prepare non-periodic multilayer structures used as one-dimensional zone plates in the Laue diffraction geometry. For this purpose, ZrO2/Ti multilayers were deposited by pulsed laser deposition (PLD) on Si(111) substrates in ultrahigh vacuum. The interfaces within the multilayer are positioned according to the Fresnel zone plate law. In this contribution, results of the Laue lens fabrication by focused ion beam (FIB) milling and lens characteristics measured with a table-top X-ray source are presented.

  3. First results of the EXILL and FATIMA campaign at the Institut Laue Langevin

    Energy Technology Data Exchange (ETDEWEB)

    Jolie, Jan; Regis, Jean-Marc; Saed-Samii, Nima; Warr, Nigel [IKP, Universitaet zu Koeln, Zuelpicher Str. 77, 50937 Koeln (Germany); Wilmsen, Dennis [IKP, Universitaet zu Koeln, Zuelpicher Str. 77, 50937 Koeln (Germany); GANIL, BP 55027 (France); France, Gilles de; Clement, Emmanuel [GANIL, BP 55027 (France); Blanc, Aurelien; Jentschel, Michael; Koester, Uli; Mutti, Paolo; Soldner, Thorsten [ILL, 71 Av. des Martyrs, 38042 Grenoble (France); Simpson, Gary [University of Western Scotland, Paisley, PA1 2BE (United Kingdom); Urban, Waldek [Faculty of Physics, University of Warsaw, 02-093 Warsaw (Poland); Bruce, Alison; Lalkovski, Stefan [SCEM, University of Brighton, Brighton BN2 4GJ (United Kingdom); Fraile, Luis [Grupo de Fisica Nuclear, Universidad Complutense, 28040 Madrid (Spain); Kroell, Thorsten [Institut fuer Kernphysik, TU Darmstadt, Darmstadt (Germany); Podolyak, Zsolt; Regan, Patrick [Dept. of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Korten, Wolfram [CEA, Centre de Saclay, IRFU, 91191 Gif-sur-Yvette (France); Ur, Calin; Marginean, Nicu [Horia Hulubei NIPNE, 77125 Bucharest (Romania)

    2015-07-01

    At the PF1B cold neutron beam line at the Institut Laue-Langevin the EXILL and FATIMA array consisting of 8 EXOGAM clover Ge detectors and 16 LaBr3(Ce) scintillators was used for the measurement of lifetimes using the generalised centroid difference method. The studied nuclei were formed by the (n,γ) and (n,fission) reactions. We report on the set-up and present first results on ⁹⁰Zr and ¹⁹⁶Pt.

  4. A Theoretical Study of the Two-Dimensional Point Focusing by Two Multilayer Laue Lenses.

    Energy Technology Data Exchange (ETDEWEB)

    Yan, H.; Maser, J.; Kang, H.C.; Macrander, A.; Stephenson, B.

    2008-08-10

    Hard x-ray point focusing by two crossed multilayer Laue lenses is studied using a full-wave modeling approach. This study shows that for a small numerical aperture, the two consecutive diffraction processes can be decoupled into two independent ones in respective directions. Using this theoretical tool, we investigated adverse effects of various misalignments on the 2D focus profile and discussed the tolerance to them. We also derived simple expressions that described the required alignment accuracy.

  5. Analyzing the Sensitivity of EGFR-L861Q Mutation to TKIs and A Case Report

    Directory of Open Access Journals (Sweden)

    Xingxing WANG

    2015-09-01

    Background and objective: The significant efficacy of tyrosine kinase inhibitors (TKIs) has been demonstrated for advanced non-small cell lung cancer (NSCLC) patients with activating epidermal growth factor receptor (EGFR) mutations. No clear evidence exists that EGFR-L861Q is sensitive to TKIs, and the best treatment for NSCLC patients with the EGFR-L861Q mutation is undetermined. This study aims to identify the best treatment for advanced NSCLC patients with the EGFR-L861Q mutation by analyzing the differences among the structures of wild-type EGFR, the activating mutant EGFR-L858R, and the EGFR-L861Q mutation. Method: The protein structures of wild-type EGFR, the activating mutant EGFR-L858R and the EGFR-L861Q mutation were reconstructed, and the differences among the three protein conformations were analyzed using a homology modeling technique. Results: The structures of EGFR-L858R and wild-type EGFR exhibited notable distinctions. The structure of the EGFR-L861Q mutation differed from both the wild-type EGFR and the activating mutant EGFR-L858R conformations. An NSCLC patient with the EGFR-L861Q mutation was given chemotherapy as first-line therapy, and TKIs were applied as maintenance treatment while the tumor was unchanged. The evaluated response had improved when the lung computed tomography lesions were reviewed. Conclusion: Analyzing the protein conformation of the EGFR-L861Q mutation together with the curative effect of chemotherapy followed by TKIs can help predict the sensitivity of EGFR-L861Q to TKIs. Combining this analysis with the clinical case, maintenance treatment with TKIs may achieve a satisfactory curative effect in advanced NSCLC patients who have achieved disease control after first-line chemotherapy.

  6. A simulation study on the focal plane detector of the LAUE project

    Energy Technology Data Exchange (ETDEWEB)

    Khalil, M., E-mail: mkhalil@in2p3.fr [APC Laboratory, 10 rue Alice Domon et Léonie Duquet, 75205 Paris Cedex 13 (France); Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy); Frontera, F. [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy); INAF/IASF-Bologna, Via P. Gobetti 101, Bologna (Italy); Caroli, E. [INAF/IASF-Bologna, Via P. Gobetti 101, Bologna (Italy); Virgilli, E.; Valsan, V. [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy)

    2015-06-21

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma-ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot, drastically improving the signal-to-noise ratio as well as significantly reducing the required detector size. In this paper we present a Monte Carlo simulation study with MEGAlib to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. We then show simulations on the optimized detector, using the SILVACO semiconductor simulation toolkit, to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared: a low-density material (silicon) and a high-density material (germanium).
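
    The quantities targeted by the SILVACO part of the study can be estimated to first order from the textbook planar-detector relations C = εA/d and V_dep = e·N_eff·d²/(2ε). The sketch below uses geometry and doping values of our choosing, purely for illustration:

```python
EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # elementary charge, C

def planar(eps_r, n_eff_cm3, thick_mm, strip_area_mm2):
    """Channel capacitance (pF) and depletion voltage (V) of a planar detector."""
    eps = eps_r * EPS0
    d = thick_mm * 1e-3
    area = strip_area_mm2 * 1e-6
    c_pf = eps * area / d * 1e12                               # C = eps * A / d
    v_dep = E_CHARGE * n_eff_cm3 * 1e6 * d ** 2 / (2.0 * eps)  # full depletion
    return c_pf, v_dep

# Illustrative values: 2 mm thick, 1 x 10 mm strips, N_eff ~ 1e10 cm^-3.
print("Si:", planar(11.7, 1e10, 2.0, 10.0))
print("Ge:", planar(16.0, 1e10, 2.0, 10.0))
```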

  7. Developing a method for soft gamma-ray Laue lens assembly and calibration

    CERN Document Server

    Barrière, Nicolas M; Boggs, Steven E; Lowell, Alexander; Wade, Colin; Baugh, Max; von Ballmoos, Peter; Abrosimov, Nikolay V; Hanlon, Lorraine

    2013-01-01

    Laue lenses constitute a promising option for concentrating soft gamma rays with a large collection area and reasonable focal lengths. In astronomy they could lead to increased telescope sensitivity by one to two orders of magnitude, in particular for faint nuclear gamma-ray lines, but also for continua like hard X-ray tails from a variety of compact objects. Other fields like Homeland security and nuclear medicine share the same need for more sensitive gamma-ray detection systems and could find applications for gamma-ray focusing optics. There are two primary challenges for developing Laue lenses: the search for high-reflectivity and reproducible crystals, and the development of a method to accurately orient and fix the thousands of crystals constituting a lens. In this paper we focus on the second topic. We used our dedicated X-ray beamline and Laue lens assembly station to build a breadboard lens made of 15 crystals. This allowed us to test our tools and methods, as well as our simulation code and calibrat...

  8. Fair shares: a preliminary framework and case analyzing the ethics of offshoring.

    Science.gov (United States)

    Gordon, Cameron; Zimmerman, Alan

    2010-06-01

    Much has been written about the offshoring phenomenon from an economic efficiency perspective. Most authors have attempted to measure the net economic effects of the strategy, and many purport to show that "in the long run" benefits will outweigh the costs. There is also a relatively large literature on implementation which describes the best way to manage the offshoring process. But what is the morality of offshoring? What is its "rightness" or "wrongness?" Little analysis of the ethics of offshoring has been completed thus far. This paper develops a preliminary framework for analyzing the ethics of offshoring and then applies this framework to a basic case study of offshoring in the U.S. The paper first discusses the definition of offshoring; shifts to the basic philosophical grounding of the ethical concepts; develops a template for conducting an ethics analysis of offshoring; applies this template using basic data for offshoring in the United States; and conducts a preliminary ethical analysis of the phenomenon in that country, using a form of utilitarianism as an analytical baseline. The paper concludes with suggestions for further research.

  9. Hard x-ray broad band Laue lenses (80 - 600 keV): building methods and performances

    CERN Document Server

    Virgilli, E; Rosati, P; Liccardo, V; Squerzanti, S; Carassiti, V; Caroli, E; Auricchio, N; Stephen, J B

    2015-01-01

    We present the status of the LAUE project, devoted to developing a technology for building a Laue lens with a 20 m focal length for hard X-/soft gamma-ray astronomy (80 - 600 keV). The Laue lens is composed of bent crystals of Gallium Arsenide (GaAs, 220) and Germanium (Ge, 111); for the first time, the focusing property of bent crystals has been exploited for this field of applications. We show the preliminary results concerning the adhesive employed to fix the crystal tiles to the lens support, the positioning accuracy obtained, and possible further improvements. The Laue lens petal that will be completed in a few months has a pass band of 80 - 300 keV and is a fraction of an entire Laue lens capable of focusing X-rays up to 600 keV, possibly extendable down to 20 - 30 keV with suitable low absorption crystal materials and focal length. The final goal is to develop a focusing optics that can improve the sensitivity over current telescopes in this energy band by 2 orders of magnitude.

  10. Use of a miniature diamond-anvil cell in high-pressure single-crystal neutron Laue diffraction

    Directory of Open Access Journals (Sweden)

    Jack Binns

    2016-05-01

    Full Text Available The first high-pressure neutron diffraction study in a miniature diamond-anvil cell of a single crystal of size typical for X-ray diffraction is reported. This is made possible by modern Laue diffraction using a large solid-angle image-plate detector. An unexpected finding is that even reflections whose diffracted beams pass through the cell body are reliably observed, albeit with some attenuation. The cell body does limit the range of usable incident angles, but the crystallographic completeness for a high-symmetry unit cell is only slightly less than for a data collection without the cell. Data collections for two sizes of hexamine single crystals, with and without the pressure cell, and at 300 and 150 K, show that sample size and temperature are the most important factors that influence data quality. Despite the smaller crystal size and dominant parasitic scattering from the diamond-anvil cell, the data collected allow a full anisotropic refinement of hexamine with bond lengths and angles that agree with literature data within experimental error. This technique is shown to be suitable for low-symmetry crystals, and in these cases the transmission of diffracted beams through the cell body results in much higher completeness values than are possible with X-rays. The way is now open for joint X-ray and neutron studies on the same sample under identical conditions.

  11. Polarisation measurements with a CdTe pixel array detector for Laue hard X-ray focusing telescopes

    CERN Document Server

    Caroli, E; Pisa, A; Stephen, J B; Frontera, F; Castanheira, M T D; Sordo, S; Caroli, Ezio; Silva, Rui M. Curado da; Pisa, Alessandro; Stephen, John B.; Frontera, Filippo; Castanheira, Matilde T. D.; Sordo, Stefano del

    2006-01-01

    Polarimetry is an area of high energy astrophysics which is still relatively unexplored, even though it is recognized that this type of measurement could drastically increase our knowledge of the physics and geometry of high energy sources. For this reason, in the context of the design of a Gamma-Ray Imager based on new hard-X and soft gamma ray focusing optics for the next ESA Cosmic Vision call for proposals (Cosmic Vision 2015-2025), it is important that this capability should be implemented in the principal on-board instrumentation. For the particular case of wide band-pass Laue optics we propose a focal plane based on a thick pixelated CdTe detector operating with high efficiency between 60 and 600 keV. The high segmentation of this type of detector (1-2 mm pixel size) and the good energy resolution (a few keV FWHM at 500 keV) will allow high sensitivity polarisation measurements (a few % for a 10 mCrab source in 10^6 s) to be performed. We have evaluated the modulation Q factors and minimum detectable polaris...
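
    The quoted "a few % for a 10 mCrab source in 10^6 s" can be reproduced in order of magnitude with the standard minimum-detectable-polarisation estimate. The sketch below uses the common 99%-confidence MDP formula; the modulation factor and count rates are placeholder assumptions, not values from the paper.

```python
import math

def mdp_99(q_factor, src_rate, bkg_rate, exposure_s):
    """Minimum detectable polarisation (99% confidence), rates in counts/s."""
    return (4.29 / (q_factor * src_rate)) * math.sqrt((src_rate + bkg_rate) / exposure_s)

# Placeholder numbers: Q = 0.3, focused-source rate 0.5 c/s, background 1.0 c/s,
# and the 10^6 s exposure quoted in the abstract.
print(f"MDP ~ {100.0 * mdp_99(0.3, 0.5, 1.0, 1e6):.1f} %")  # -> a few percent
```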

  12. Analyzing the Roles, Activities, and Skills of Learning Technologists: A Case Study from City University London

    Science.gov (United States)

    Fox, Olivia; Sumner, Neal

    2014-01-01

    This article reports on a case study carried out at City University London into the role of learning technologists. The article examines how the role developed by providing points of comparison with a report on the career development of learning technology staff in UK universities in 2001. This case study identified that learning technologists…

  14. Analyzing inconsistent cases in management fsQCA studies: A methodological manifesto

    NARCIS (Netherlands)

    Balachandran Nair, L.; Gibbert, Michael

    2016-01-01

    Cases inconsistent with theoretical expectations are by default indicators of a lack of theory-data fit, and as such are prime candidates for theory building. However, the conventional tendency is to ignore inconsistent cases in Management research. The current article focuses on the theory-building…

  15. Dance Pedagogy Case Studies: A Grounded Theory Approach to Analyzing Qualitative Data

    Science.gov (United States)

    Wilson, Margaret

    2009-01-01

    Combining traditional forms of research to fit unique populations contributes to understanding broad phenomena within the discipline of dance. This paper describes a methodological approach for understanding separate, but interrelated, case studies which illuminated a particular approach to teaching and learning about the body. In each case study…

  16. Analyzing and Comparing Biomass Feedstock Supply Systems in China: Corn Stover and Sweet Sorghum Case Studies

    OpenAIRE

    Lantian Ren; Kara Cafferty; Mohammad Roni; Jacob Jacobson; Guanghui Xie; Leslie Ovard; Christopher Wright

    2015-01-01

    This paper analyzes the rural Chinese biomass supply system and models supply chain operations according to U.S. concepts of logistical unit operations: harvest and collection, storage, transportation, preprocessing, and handling and queuing. In this paper, we quantify the logistics cost of corn stover and sweet sorghum in China under different scenarios. We analyze three scenarios of corn stover logistics from northeast China and three scenarios of sweet sorghum stalk logistics from Inner Mongolia…

  17. Focusing effect of bent GaAs crystals for gamma-ray Laue lenses: Monte Carlo and experimental results

    CERN Document Server

    Virgilli, E; Rosati, P; Bonnini, E; Buffagni, E; Ferrari, C; Stephen, J B; Caroli, E; Auricchio, N; Basili, A; Silvestri, S

    2015-01-01

    We report on the observation of the focusing effect of the (220) planes of Gallium Arsenide (GaAs) crystals. We have compared the experimental results with simulations of the focusing capability of GaAs tiles through a purpose-built Monte Carlo code. The GaAs tiles were bent using a lapping process developed at CNR/IMEM - Parma (Italy) in the framework of the LAUE project, funded by ASI and dedicated to building a broad band Laue lens prototype for astrophysical applications in the hard X-/soft gamma-ray energy range (80-600 keV). We present and discuss the results obtained from their characterization, mainly in terms of focusing capability. Bent crystals will significantly increase the signal to noise ratio of a telescope based on a Laue lens, consequently leading to an unprecedented enhancement of sensitivity with respect to present non-focusing instrumentation.

  18. Unveiling Physical Processes in Type Ia Supernovae with a Laue Lens Telescope

    Science.gov (United States)

    Barriere, Nicolas; Boggs, S. E.; Tomsick, J. A.

    2010-03-01

    Despite their use as standard candles in cosmological studies, many fundamental aspects of Type Ia supernovae (SNIa) remain uncertain, including the progenitor systems, the explosion trigger and the detailed nuclear burning physics. The most popular model involves an accreting CO white dwarf undergoing a thermonuclear runaway, converting a substantial fraction of the stellar mass to 56Ni. The radioactive decay chain 56Ni -> 56Co -> 56Fe powers both the SNIa optical light curve and produces several gamma-ray lines, including bright lines at 158 keV and 847 keV. Observations of the spectrum and light curve of any of these lines would be extremely valuable in constraining and discriminating between the currently competing models of SNIa. However, these lines are weak in flux and evolve relatively quickly by gamma-ray standards: to be able to study a handful SNIa per year, the required sensitivity is about 10-6 ph/cm2/s at 847 keV and 10-7 ph/s/cm2 at 158 keV for 3% broadened lines, and these levels must be achieved in 105 s. A Laue lens telescope offers a novel and powerful method of achieving these extremely challenging requirements. In this paper, we briefly introduce the Laue lens principle and state-of-the-art technologies, and we demonstrate how a space-borne telescope based on a Laue lens focusing on a Compton camera could bring about the long-awaited observational clues leading to a better understanding of SNIa physics.
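
    The time scales behind "evolve relatively quickly" follow from the two-step decay chain itself. The sketch below evaluates the Bateman solution for 56Ni -> 56Co with textbook half-lives (6.08 and 77.2 days); the unit initial abundance is an illustrative assumption.

```python
import math

T_HALF_NI, T_HALF_CO = 6.08, 77.2            # days (56Ni, 56Co half-lives)
LAM_NI = math.log(2.0) / T_HALF_NI
LAM_CO = math.log(2.0) / T_HALF_CO

def n_ni(t, n0=1.0):
    """Surviving 56Ni nuclei at time t (days)."""
    return n0 * math.exp(-LAM_NI * t)

def n_co(t, n0=1.0):
    """56Co population fed by 56Ni decay (Bateman solution)."""
    return n0 * LAM_NI / (LAM_CO - LAM_NI) * (math.exp(-LAM_NI * t) - math.exp(-LAM_CO * t))

for t in (10, 25, 50, 100):
    # Relative emission rates of the 158 keV (56Ni) and 847 keV (56Co) lines:
    print(f"t = {t:3d} d: 158 keV ~ {LAM_NI * n_ni(t):.4f}, 847 keV ~ {LAM_CO * n_co(t):.4f}")
```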

  19. New source for ultracold neutrons at the Institut Laue-Langevin

    Science.gov (United States)

    Piegsa, F. M.; Fertl, M.; Ivanov, S. N.; Kreuz, M.; Leung, K. K. H.; Schmidt-Wellenburg, P.; Soldner, T.; Zimmer, O.

    2014-07-01

    A new intense superthermal source for ultracold neutrons (UCN) was installed at a dedicated beam line at the Institut Laue-Langevin. Incident neutrons with a wavelength of 0.89 nm are converted to UCN in a 5-liter volume filled with superfluid 4He at a temperature of about 0.7 K. The UCN can be extracted to room temperature experiments. We present the cryogenic setup of the source, a characterization of the cold neutron beam, and UCN production measurements, where a UCN density in the production volume of at least 55 per cm^3 was determined.

  20. New source for ultracold neutrons at the Institut Laue-Langevin

    CERN Document Server

    Piegsa, F M; Ivanov, S N; Kreuz, M; Leung, K K H; Schmidt-Wellenburg, P; Soldner, T; Zimmer, O

    2014-01-01

    A new intense superthermal source for ultracold neutrons (UCN) has been installed at a dedicated beam line at the Institut Laue-Langevin. Incident neutrons with a wavelength of 0.89 nm are converted to UCN in a five liter volume filled with superfluid $^4$He at a temperature of about 0.7 K. The UCN can be extracted to room temperature experiments. We present the cryogenic setup of the source, a characterization of the cold neutron beam, and UCN production measurements, where a UCN density in the production volume of at least 55 per cm$^3$ was determined.

  1. Nanosecond x-ray Laue diffraction apparatus suitable for laser shock compression experiments.

    Science.gov (United States)

    Suggit, Matthew; Kimminau, Giles; Hawreliak, James; Remington, Bruce; Park, Nigel; Wark, Justin

    2010-08-01

    We have used nanosecond bursts of x-rays emitted from a laser-produced plasma, composed of a mixture of mid-Z elements, to produce a quasi-white-light spectrum suitable for performing Laue diffraction from single crystals. The laser-produced plasma emits x-rays ranging in energy from 3 keV to in excess of 10 keV, and is sufficiently bright for single-shot nanosecond diffraction patterns to be recorded. The geometry is suitable for the study of laser-shocked crystals, and single-shot diffraction patterns from both unshocked and shocked silicon crystals are presented.

  2. Development of a 3D CZT detector prototype for Laue Lens telescope

    OpenAIRE

    Caroli, Ezio; Auricchio, Natalia; Del Sordo, Stefano; Abbene, Leonardo; Budtz-JøRgensen, Carl; Casini, Fabio; Curado da Silva, Rui M.; Kuvvetli, Irfan; Milano, Luciano; Natalucci, Lorenzo; Quadrini, Egidio M.; Stephen, John B.; Ubertini, Pietro; Zanichelli, Massimiliano; Zappettini, Andrea

    2010-01-01

    We report on the development of a 3D position sensitive prototype suitable as a focal plane detector for a Laue lens telescope. The basic sensitive unit is a drift strip detector based on a CZT crystal (~19×8 mm^2 area, 2.4 mm thick), irradiated transversally to the electric field direction. The anode side is segmented into 64 strips, which divide the crystal into 8 independent sensors (pixels), each composed of one collecting strip and 7 adjacent drift strips (one shared). The drift strips are biased…

  3. Analyzing Activities in the Course of Science Education, According to Activity Theory: The Case of Sound

    Science.gov (United States)

    Theodoraki, Xarikleia; Plakitsi, Katerina

    2013-01-01

    In the present study, we analyze activities on the topic of sound, which are performed in the science education laboratory lessons by third-year students of the Department of Early Childhood Education at the University of Ioannina. The analysis of the activities is based on one of the most modern learning theories, CHAT (Cultural Historical…

  4. Analyzing Mathematics Textbooks through a Constructive-Empirical Perspective on Abstraction: The Case of Pythagoras' Theorem

    Science.gov (United States)

    Yang, Kai-Lin

    2016-01-01

    This study aims at analyzing how Pythagoras' theorem is handled in three versions of Taiwanese textbooks using a conceptual framework of a constructive-empirical perspective on abstraction, which comprises three key attributes: the generality of the object, the connectivity of the subject and the functionality of diagrams as the focused semiotic…

  5. Using attack-defense trees to analyze threats and countermeasures in an ATM: a case study

    NARCIS (Netherlands)

    Fraile, Marlon; Ford, Margaret; Gadyatskaya, Olga; Kumar, Rajesh; Stoelinga, Mariëlle; Trujillo-Rasua, Rolando

    2016-01-01

    Securing automated teller machines (ATMs), as critical and complex infrastructure, requires a precise understanding of the associated threats. This paper reports on the application of attack-defense trees to model and analyze the security of ATMs. We capture the most dangerous multi-stage attack scenarios…

  6. Analyzing the contributions of a government-commissioned research project: A case study

    NARCIS (Netherlands)

    Hegger, I.; Janssen, S.W.J.; Keijsers, J.F.E.M.; Schuit, A.J.; Oers, H.A.M. van

    2014-01-01

    Background: It often remains unclear to investigators how their research contributes to the work of the commissioner. We initiated the 'Risk Model' case study to gain insight into how a Dutch National Institute for Public Health and the Environment (RIVM) project and its knowledge products contribute…

  7. A Comparison of Mean Phase Difference and Generalized Least Squares for Analyzing Single-Case Data

    Science.gov (United States)

    Manolov, Rumen; Solanas, Antonio

    2013-01-01

    The present study focuses on single-case data analysis, specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique, which yields similar information. The…
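
    To make the comparison concrete, the sketch below contrasts a raw mean phase difference with a GLS level-change estimate on simulated single-case data; it assumes AR(1) errors with a known rho, which is a simplification of the procedures actually tested in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.normal(2.0, 1.0, size=8)
treatment = rng.normal(4.0, 1.0, size=8)           # true level change = 2
y = np.concatenate([baseline, treatment])
phase = np.concatenate([np.zeros(8), np.ones(8)])  # 0 = baseline, 1 = treatment

# (a) raw mean phase difference
print("mean phase difference:", treatment.mean() - baseline.mean())

# (b) GLS: whiten with an assumed AR(1) correlation, then ordinary least squares
rho, n = 0.3, y.size
idx = np.arange(n)
sigma = rho ** np.abs(idx[:, None] - idx[None, :])  # AR(1) correlation matrix
W = np.linalg.cholesky(np.linalg.inv(sigma)).T      # whitening: W.T @ W = inv(sigma)
X = np.column_stack([np.ones(n), phase])
beta, *_ = np.linalg.lstsq(W @ X, W @ y, rcond=None)
print("GLS level change:", beta[1])
```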

  9. An Analyzing of Creativity on Innovation and Business Performance: Case at Furniture Industry in Makassar

    OpenAIRE

    Munizu, Musran

    2016-01-01

    Creativity and innovation are important variables in improving business performance. The purpose of this research is to analyze the effect of creativity on innovation, of creativity on business performance, and of innovation on business performance. Respondents were the owners or managers of furniture firms. The unit of analysis was the furniture SMEs in Makassar. The number of respondents was 85 persons, consisting of business owners or managers. The data is carr...

  10. Logarithmic Spiral Bent Laue Crystals for X-Ray Monochromatic Imaging Applications

    Institute of Scientific and Technical Information of China (English)

    毋玉芬; 肖沙里; 鲁建; 钱家渝; 刘利锋; 黄显宾

    2013-01-01

    Taking advantage of the monochromatic X-ray diffraction property of Laue crystals, an innovative use of logarithmic spiral bent Laue crystals for X-ray monochromatic imaging is investigated. Based on the ray tracing method and the surface equation of the logarithmic spiral, the imaging principles and characteristics of logarithmic spiral bent Laue crystals are studied, including the condition under which the diffracted beam can be separated from the transmitted beam, the magnifications, and the field of view (FOV). A logarithmic spiral bent quartz (1010) Laue crystal analyzer was developed. With the proposed crystal analyzer, a monochromatic backlight imaging experiment on a mesh grid with a wire diameter of 50 μm was carried out using a Cu-target X-ray source as the backlighter. The experimental results show that the spatial resolution of the analyzer is approximately 11.9 μm for a source diameter of 110 μm. Furthermore, the FOVs of the crystal analyzer are 22.3557 mm and 8.2602 mm in the horizontal and vertical directions, respectively.
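
    The reason a logarithmic spiral suits a bent Laue analyzer is that the angle between the radius vector (a ray from the source) and the curve is constant along the spiral, so the Bragg condition is met everywhere on the crystal. A quick numerical check of this property, with an illustrative scale and pitch angle rather than the paper's values:

```python
import numpy as np

r0, alpha = 100.0, np.deg2rad(75.0)   # illustrative scale (mm) and pitch angle
phi = np.linspace(0.0, 0.5, 11)
r = r0 * np.exp(phi / np.tan(alpha))  # log spiral: constant angle to the radius

x, y = r * np.cos(phi), r * np.sin(phi)
dx, dy = np.gradient(x, phi), np.gradient(y, phi)  # numerical tangent direction
cos_a = (x * dx + y * dy) / (np.hypot(x, y) * np.hypot(dx, dy))
print(np.rad2deg(np.arccos(cos_a)))   # ~75 deg everywhere (edges less accurate)
```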

  11. Patient-clinician mobile communication: analyzing text messaging between adolescents with asthma and nurse case managers.

    Science.gov (United States)

    Yoo, Woohyun; Kim, Soo Yun; Hong, Yangsun; Chih, Ming-Yuan; Shah, Dhavan V; Gustafson, David H

    2015-01-01

    With the increasing penetration of digital mobile devices among adolescents, mobile text messaging is emerging as a new channel for patient-clinician communication for this population. In particular, it can promote active communication between healthcare clinicians and adolescents with asthma. However, little is known about the content of the messages exchanged in medical encounters via mobile text messaging. Therefore, this study explored the content of text messaging between clinicians and adolescents with asthma. We collected a total of 2,953 text messages exchanged between 5 nurse case managers and 131 adolescents with asthma through a personal digital assistant. The text messages were coded using a scheme developed by adapting categories from the Roter Interaction Analysis System. Nurse case managers sent more text messages (n=2,639) than adolescents with asthma. Most messages sent by nurse case managers were targeted messages (n=2,475) directed at all adolescents with asthma, whereas there were relatively few tailored messages (n=164) that were created personally for an individual adolescent. In addition, both targeted and tailored messages emphasized task-focused behaviors over socioemotional behaviors. Likewise, text messages (n=314) sent by adolescents also emphasized task-focused over socioemotional behaviors. Mobile text messaging has the potential to play an important role in patient-clinician communication. It promotes not only active interaction, but also patient-centered communication with clinicians. In order to achieve this potential, healthcare clinicians may need to focus on socioemotional communication as well as task-oriented communication.

  12. Improved multislice calculations for including higher-order Laue zones effects

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, I., E-mail: Ivan.Lobato@ua.ac.be [University of Antwerp, Department of Physics, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van Dyck, D. [University of Antwerp, Department of Physics, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2012-08-15

    A new method for including higher-order Laue zone (HOLZ) effects in an efficient way in electron scattering simulations has been developed and tested by detailed calculations. The results calculated by the conventional multislice (CMS) method and the improved conventional multislice (ICMS) method, using a large dynamical aperture to avoid numerical errors, are compared with accurate results. We have found that the zero-order Laue zone (ZOLZ) reflections cannot be properly described using only the potential projected over the whole unit cell; in general, we need to subslice the electrostatic potential inside the unit cell. It is shown that the ICMS method has higher accuracy than the CMS method for the calculation of the ZOLZ, HOLZ and pseudo-HOLZ reflections. Hence, the ICMS method allows a larger slice thickness than the CMS method and reduces the calculation time. -- Highlights: • We have developed and tested a new method for including HOLZ effects in an efficient way in electron scattering simulations. • The ICMS method has higher accuracy than the CMS method for the calculation of the ZOLZ, HOLZ and pseudo-HOLZ reflections. • The ICMS method allows a larger slice thickness than the CMS method and reduces the calculation time.
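
    For readers unfamiliar with the CMS algorithm being improved upon, one iteration is simply "transmit through a slice, then Fresnel-propagate". The sketch below is a bare-bones NumPy version with a toy random phase grating; all parameters are illustrative, and the paper's ICMS refinement essentially amounts to doing the subslicing (smaller dz) efficiently.

```python
import numpy as np

def multislice_step(psi, phase_grating, wavelength, dz, dx):
    """One CMS iteration: transmit through a slice, then Fresnel-propagate by dz."""
    n = psi.shape[0]
    k = np.fft.fftfreq(n, d=dx)                  # spatial frequencies (1/Angstrom)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    propagator = np.exp(-1j * np.pi * wavelength * dz * (kx**2 + ky**2))
    transmitted = psi * np.exp(1j * phase_grating)
    return np.fft.ifft2(np.fft.fft2(transmitted) * propagator)

# Toy run: 256x256 grid, 0.1 A sampling, ~300 kV electrons (lambda ~ 0.0197 A),
# a random phase grating standing in for one 2 A slice of projected potential.
rng = np.random.default_rng(1)
psi = np.ones((256, 256), dtype=complex)
phase = 0.05 * rng.standard_normal((256, 256))
psi = multislice_step(psi, phase, wavelength=0.0197, dz=2.0, dx=0.1)
print("mean intensity (should stay ~1):", (np.abs(psi) ** 2).mean())
```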

  13. Laue optics for nuclear astrophysics: New detector requirements for focused gamma-ray beams

    Energy Technology Data Exchange (ETDEWEB)

    Barriere, N. [INAF - IASF Roma, via Fosso del Cavaliere 100, 00133 Roma (Italy)], E-mail: nicolas.barriere@iasf-roma.inaf.it; Ballmoos, P. von [CESR - UMR 5187, 9 Av. du Colonel Roche, 31028 Toulouse (France); Abrosimov, N.V. [IKZ, Max Born-Str. 2, D-12489 Berlin (Germany); Bastie, P. [LSP UMR 5588, 140 Av. de la physique, 38402 Saint Martin d' Heres (France); Camus, T. [CESR - UMR 5187, 9 Av. du Colonel Roche, 31028 Toulouse (France); Courtois, P.; Jentschel, M. [ILL, 6 rue Jules Horowitz, 38042 Grenoble (France); Knoedlseder, J. [CESR - UMR 5187, 9 Av. du Colonel Roche, 31028 Toulouse (France); Natalucci, L. [INAF - IASF Roma, via Fosso del Cavaliere 100, 00133 Roma (Italy); Roudil, G.; Rousselle, J. [CESR - UMR 5187, 9 Av. du Colonel Roche, 31028 Toulouse (France); Wunderer, C.B. [SSL, University of California at Berkeley, CA 94708 (United States); Kurlov, V.N. [Institute of Solid State Physics of Russian Academy of Sciences, 142432 Chernogolovka (Russian Federation)

    2009-10-21

    Nuclear astrophysics presents an extraordinary scientific potential for the study of the most powerful sources and the most violent events in the Universe. But in order to take full advantage of this potential, telescopes should be at least an order of magnitude more sensitive than present technologies. Today, Laue lenses have demonstrated their capability of focusing gamma-rays in the 100 keV - 1 MeV domain, making it possible to build a new generation of instruments for which the sensitive area is decoupled from the collecting area. Thus we now have the opportunity to dramatically increase the signal/background ratio and hence significantly improve the sensitivity. With a lens, the best detector is no longer the largest possible within a mission envelope. The point spread function of a Laue lens measures a few centimeters in diameter, but the field of view is limited by the detector size. Requirements for a focal plane instrument are presented in the context of the Gamma-Ray Imager mission (proposed to the European Space Agency, ESA, in the framework of the first Cosmic Vision AO): a finely pixellated detector 15-20 cm on a side, capable of Compton event reconstruction, seems optimal, providing polarization and background rejection capabilities and 30 arcsec of angular resolution within a field of view of 5 arcmin.

  14. Laue diffraction: The key to neutron crystallography from submillimetric-volume single crystals

    Science.gov (United States)

    Lemée-Cailleau, M.-H.; McIntyre, G. J.; Wilkinson, C.

    2005-12-01

    For several decades, chemists and physicists have been fascinated by molecular compounds rich in delocalized electrons. In the solid state these compounds may offer a very rich range of properties: optical, conduction and dielectric, magnetic… Each state is the result of a delicate balance amongst intra- and/or intermolecular interactions which can be controlled, not just by direct chemical substitution, but also by external parameters such as temperature, pressure, continuous electric or magnetic fields, or by light. The recent evolution of this field of science towards more and more sophisticated materials also makes their crystal growth more and more difficult. While neutron scattering is an extremely powerful technique for obtaining precise structural information, it is often disregarded in this field because large single crystals are usually required. With the recent renaissance of Laue techniques using the very intense flux provided by the reactor of the Institut Laue-Langevin (ILL), accurate structural and/or magnetic information can now be extracted routinely from molecular crystals of volume 0.1 mm3 or smaller, with easy possibilities of high pressure (up to 3 GPa) down to 0.2 K. A general survey of these new possibilities is illustrated by an example taken from the field of low-dimensional organic complexes.

  15. A network centrality measure framework for analyzing urban traffic flow: A case study of Wuhan, China

    Science.gov (United States)

    Zhao, Shuangming; Zhao, Pengxiang; Cui, Yunfan

    2017-07-01

    In this paper, we propose an improved network centrality measure framework that takes into account both the topological characteristics and the geometric properties of a road network in order to analyze urban traffic flow in relation to different modes: intersection, road, and community, which correspond to point mode, line mode, and area mode respectively. Degree, betweenness, and PageRank centralities are selected as the analysis measures, and GPS-enabled taxi trajectory data is used to evaluate urban traffic flow. The results show that the mean value of the correlation coefficients between the modified degree, betweenness, and PageRank centralities and the traffic flow is, in all periods, higher than the mean value of the correlation coefficients between the conventional centralities and the traffic flow for the different modes; this indicates that the modified measurements are superior to conventional centrality measurements for analyzing traffic flow. This study helps shed light on the understanding of urban traffic flow in relation to different modes from the perspective of complex networks.
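
    The general recipe (compute centralities, correlate with observed flows) can be sketched in a few lines with networkx. The toy graph and the taxi-derived flow numbers below are invented for illustration and do not reproduce the paper's weighting scheme.

```python
import networkx as nx
from scipy.stats import pearsonr

G = nx.Graph()  # toy intersection graph; weights stand in for segment lengths
G.add_weighted_edges_from([
    ("A", "B", 1.0), ("B", "C", 1.5), ("C", "D", 1.0),
    ("B", "D", 2.0), ("D", "E", 1.0), ("A", "E", 3.0),
])

centralities = {
    "degree": dict(G.degree()),
    "betweenness": nx.betweenness_centrality(G, weight="weight"),
    "pagerank": nx.pagerank(G, weight="weight"),
}
flow = {"A": 120, "B": 340, "C": 180, "D": 400, "E": 150}  # hypothetical taxi flows

nodes = sorted(G.nodes())
for name, cent in centralities.items():
    r, _ = pearsonr([cent[n] for n in nodes], [flow[n] for n in nodes])
    print(f"{name:12s} vs flow: r = {r:+.2f}")
```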

  16. Analyzing farming systems diversity: a case study in south-western France

    Energy Technology Data Exchange (ETDEWEB)

    Choisis, J. P.; Thevenet, C.; Girbon, A.

    2012-11-01

    The huge changes in agricultural activities, which may be amplified by the forthcoming Common Agricultural Policy reform, call into question the future of crop-livestock systems and hence the impact of these changes on landscapes and biodiversity. We analyzed relationships between agriculture, landscape and biodiversity in south-western France. The study area covered about 4,000 ha and included four villages. We conducted a survey of 56 farms. Multivariate analyses (multiple factor analysis and cluster analysis) were used to analyze relationships between 25 variables and to build a typology. The type of farming (beef and/or dairy cattle, cash crops), size (area and workforce) and cultivation practices, among others, were revealed as differentiating factors of farms. Six farming types were identified: (1) hillside mixed crop-livestock farms, (2) large corporate farms, (3) extensive cattle farms, (4) large intensive farms on the valley sides, (5) small multiple-job holdings, and (6) hobby farms. The diversity of farming systems revealed the variable impact of the main drivers of change affecting agricultural development, particularly the enlargement and modernization of farms along with the demography of agricultural holdings. (Author) 41 refs.
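
    A simplified stand-in for the multivariate workflow might look like the following, where PCA is used as a proxy for multiple factor analysis and Ward clustering cuts the farms into six types; the 56 x 25 data matrix is synthetic, not the survey data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
X = rng.normal(size=(56, 25))        # 56 farms x 25 survey variables (synthetic)

scores = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(X))
Z = linkage(scores, method="ward")   # hierarchical clustering on factor scores
types = fcluster(Z, t=6, criterion="maxclust")  # cut into six farming types
print("farms per type:", np.bincount(types)[1:])
```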

  17. Patient–Clinician Mobile Communication: Analyzing Text Messaging Between Adolescents with Asthma and Nurse Case Managers

    Science.gov (United States)

    Kim, Soo Yun; Hong, Yangsun; Chih, Ming-Yuan; Shah, Dhavan V.; Gustafson, David H.

    2015-01-01

    Abstract Background: With the increasing penetration of digital mobile devices among adolescents, mobile text messaging is emerging as a new channel for patient–clinician communication for this population. In particular, it can promote active communication between healthcare clinicians and adolescents with asthma. However, little is known about the content of the messages exchanged in medical encounters via mobile text messaging. Therefore, this study explored the content of text messaging between clinicians and adolescents with asthma. Materials and Methods: We collected a total of 2,953 text messages exchanged between 5 nurse case managers and 131 adolescents with asthma through a personal digital assistant. The text messages were coded using a scheme developed by adapting categories from the Roter Interaction Analysis System. Results: Nurse case managers sent more text messages (n=2,639) than adolescents with asthma. Most messages sent by nurse case managers were targeted messages (n=2,475) directed at all adolescents with asthma, whereas there were relatively few tailored messages (n=164) that were created personally for an individual adolescent. In addition, both targeted and tailored messages emphasized task-focused behaviors over socioemotional behaviors. Likewise, text messages (n=314) sent by adolescents also emphasized task-focused over socioemotional behaviors. Conclusions: Mobile text messaging has the potential to play an important role in patient–clinician communication. It promotes not only active interaction, but also patient-centered communication with clinicians. In order to achieve this potential, healthcare clinicians may need to focus on socioemotional communication as well as task-oriented communication. PMID:25401324

  18. Specification Representation and Test Case Reduction by Analyzing the Interaction Patterns in System Model

    Directory of Open Access Journals (Sweden)

    Ashish Kumari

    2012-01-01

    Full Text Available Extended Finite State Machine uses a formal description language to model the requirement specification of a system. The system models are frequently changed because of specification changes, and we can reflect those changes by modifying the model represented as a finite state machine. To test the modified parts of the model, selective test generation techniques are used. However, the regression test suites may still be very large. In this paper, we discuss a method for test suite reduction and for handling the requirement specification used to test the main system after modifications to the requirements and implementation. An extended finite state machine uses a state transition diagram to represent the requirement specification, showing how the system changes state and which actions and variables are used during each transition. Data dependencies and control dependencies are then identified among the transitions of the state transition diagram. From these dependencies we can find the affecting and affected portions of the system introduced by the modification. The main condition is: "If two test cases generate the same affecting and affected pattern, it is enough to implement only one test case rather than two." Using this approach we can substantially reduce the size of the original test suite.
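
    The reduction rule quoted above maps directly onto a dictionary keyed by the (affecting, affected) pattern. A minimal sketch, with hypothetical test-case and transition names:

```python
def reduce_test_suite(test_cases):
    """Keep one test case per distinct (affecting, affected) pattern.

    test_cases: dict mapping name -> (frozenset of affecting transitions,
    frozenset of affected transitions)."""
    kept = {}
    for name, pattern in test_cases.items():
        kept.setdefault(pattern, name)   # first test case wins per pattern
    return sorted(kept.values())

suite = {
    "tc1": (frozenset({"t1", "t3"}), frozenset({"t5"})),
    "tc2": (frozenset({"t1", "t3"}), frozenset({"t5"})),  # same pattern as tc1
    "tc3": (frozenset({"t2"}), frozenset({"t4", "t6"})),
}
print(reduce_test_suite(suite))  # -> ['tc1', 'tc3']
```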

  19. Analyzing patient's waiting time in emergency & trauma department in public hospital - A case study

    Science.gov (United States)

    Roslan, Shazwa; Tahir, Herniza Md; Nordin, Noraimi Azlin Mohd; Zaharudin, Zati Aqmar

    2014-09-01

    Emergency and Trauma Department (ETD) is an important element of a hospital. It provides medical services and operates 24 hours a day in most hospitals. However, ETD is not exempt from overcrowding. Overcrowding occurs because public hospitals, funded by the government, provide affordable services. It is reported that a patient attending ETD must be treated within 90 minutes to achieve the Key Performance Indicator (KPI). However, due to overcrowding, most patients have to wait longer than the KPI standard. In this paper, patients' average waiting time is analyzed. Using the chi-square goodness-of-fit test, patients' inter-arrivals per hour are also investigated. In conclusion, Monday to Wednesday were identified as the days that exceed the KPI standard, while the chi-square goodness-of-fit test showed that patients' inter-arrivals are independent and random.
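
    The goodness-of-fit step can be illustrated with SciPy: test hourly arrival counts against a Poisson model whose rate is the sample mean. The counts below are made up, and ddof=1 accounts for the one estimated parameter.

```python
import numpy as np
from scipy import stats

arrivals = np.array([4, 7, 5, 6, 3, 8, 5, 6, 4, 7, 6, 5])  # patients/hour (made up)
lam = arrivals.mean()

values, observed = np.unique(arrivals, return_counts=True)
expected = stats.poisson.pmf(values, lam) * arrivals.size
expected *= observed.sum() / expected.sum()            # rescale so totals match
chi2, p = stats.chisquare(observed, expected, ddof=1)  # 1 estimated parameter
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # large p: no evidence against Poisson
```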

  20. Analyzing Student Perceptions on Translanguaging: A Case Study of a Puerto Rican University Classroom

    Directory of Open Access Journals (Sweden)

    Adrian J. Rivera

    2017-02-01

    Full Text Available Translanguaging in the classroom is gaining traction as a viable pedagogical choice. Often overlooked, though, are the students’ attitudes in response to strategic classroom translanguaging. This study seeks to determine whether students’ language attitudes influence their perceptions of an instructor’s translingual pedagogy. The study took place in an undergraduate psychology classroom at the University of Puerto Rico, Mayagüez, and involved a case-study approach and analysis of survey results. The results show this particular group of students has a neutral to positive outlook on classroom translanguaging. The high number of neutral responses may mean students are indifferent to translingual pedagogy or that these students are conditioned to work within a context where code switching and translanguaging happen frequently.

  1. Real-time microstructure imaging by Laue microdiffraction: A sample application in laser 3D printed Ni-based superalloys

    Science.gov (United States)

    Zhou, Guangni; Zhu, Wenxin; Shen, Hao; Li, Yao; Zhang, Anfeng; Tamura, Nobumichi; Chen, Kai

    2016-06-01

    Synchrotron-based Laue microdiffraction has been widely applied to characterize the local crystal structure, orientation, and defects of inhomogeneous polycrystalline solids by raster scanning them under a micro/nano focused polychromatic X-ray probe. In a typical experiment, a large number of Laue diffraction patterns are collected, requiring novel data reduction and analysis approaches, especially for researchers who do not have access to fast parallel computing capabilities. In this article, a novel approach is developed by plotting the distributions of the average recorded intensity and the average filtered intensity of the Laue patterns. Visualization of the characteristic microstructural features is realized in real time during data collection. As an example, this method is applied to image key features such as microcracks, carbides, the heat affected zone, and dendrites in a laser-assisted 3D printed Ni-based superalloy, at a speed much faster than data collection. Such an analytical approach remains valid for a wide range of crystalline solids, and therefore extends the application range of the Laue microdiffraction technique to problems where real-time decision-making during the experiment is crucial (for instance time-resolved non-reversible experiments).
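
    A minimal version of the described imaging idea (map the average recorded intensity and the average of a background-filtered copy across the raster scan) could look like this; the synthetic Poisson patterns stand in for real detector frames.

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(7)
ny, nx = 20, 30                                  # raster-scan grid
patterns = rng.poisson(5.0, size=(ny, nx, 64, 64)).astype(float)

avg_map = patterns.mean(axis=(2, 3))             # average recorded intensity
filt_map = np.empty((ny, nx))                    # average filtered intensity
for i in range(ny):
    for j in range(nx):
        p = patterns[i, j]
        # subtract a smooth median background so Bragg peaks dominate
        filt_map[i, j] = (p - median_filter(p, size=9)).clip(min=0).mean()

print("map shapes:", avg_map.shape, filt_map.shape)
```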

  2. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    Science.gov (United States)

    Owens, John

    2009-01-01

    Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, or interactive HTML pages, or are exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
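
    The approach advocated here is easy to prototype with Python's built-in sqlite3; the table layout, column names, and sequences below are invented for illustration, not taken from the paper.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE antibody (
    id INTEGER PRIMARY KEY,
    clone_name TEXT NOT NULL,
    chain TEXT CHECK (chain IN ('heavy', 'light')),
    sequence TEXT NOT NULL,
    annotation TEXT)""")
con.executemany(
    "INSERT INTO antibody (clone_name, chain, sequence, annotation) VALUES (?, ?, ?, ?)",
    [("mAb-01", "heavy", "EVQLVESGGGLVQPGG", "anti-X, IGHV3 germline"),
     ("mAb-01", "light", "DIQMTQSPSSLSASVG", "kappa"),
     ("mAb-02", "heavy", "QVQLQESGPGLVKPSE", "anti-Y")])

# Relational queries then unify projects, e.g. all heavy chains with annotation:
for row in con.execute(
        "SELECT clone_name, sequence, annotation FROM antibody WHERE chain = 'heavy'"):
    print(row)
```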

  3. [Attention deficit hyperactivity disorder analyzed with array comparative genome hybridization method. Case report].

    Science.gov (United States)

    Duga, Balázs; Czakó, Márta; Komlósi, Katalin; Hadzsiev, Kinga; Sümegi, Katalin; Kisfali, Péter; Melegh, Márton; Melegh, Béla

    2014-10-05

    One of the most common psychiatric disorders during childhood is attention deficit hyperactivity disorder, which affects 5-6% of children worldwide. Symptoms include attention deficit, hyperactivity, forgetfulness and weak impulse control. The exact mechanism behind the development of the disease is unknown. However, current data suggest that a strong genetic background is responsible, which explains the frequent occurrence within a family. Literature data show that copy number variations are very common in patients with attention deficit hyperactivity disorder. The authors present a patient with attention deficit hyperactivity disorder who proved to have two approximately 400 kb heterozygous microduplications at 6p25.2 and 15q13.3 chromosomal regions detected by comparative genomic hybridization methods. Both duplications affect genes (6p25.2: SLC22A23; 15q13.3: CHRNA7) which may play a role in the development of attention deficit hyperactivity disorder. This case serves as an example of the wide spectrum of indication of the array comparative genome hybridization method.

  4. Analyzing the Performance of an Institutional Scientific Repository – A Case Study

    Directory of Open Access Journals (Sweden)

    María Eduarda Rodrigues

    2012-09-01

    Full Text Available Scientific knowledge evolution is mainly based on the effective dissemination of research results. The concept of Open Access gives us the theoretical foundation of a model for accessing scientific knowledge, free from the constraints of traditional publishing and technologically supported by the Internet. Institutional repositories are information systems that allow preserving, storing and disseminating the scientific knowledge produced in higher education and scientific research institutions. They increase the visibility and the citation level of the documents. They also contribute to minimizing negative aspects like plagiarism of content, because documents are exposed to peers in real time. As an alternative way to the traditional system of publishing scientific research content, repositories are developed in a cultural climate of great visibility, leading to immediate critical evaluation by peers. The Scientific Repository of the Polytechnic Institute of Castelo Branco – Portugal (RCIPCB) was created in 2009, but its official presentation took place in January 2010. Its main purposes are promoting Open Access (OA) and preserving and disseminating the scientific knowledge produced at the Polytechnic Institute of Castelo Branco (IPCB). Using DSpace as a technological platform, RCIPCB is an institutional project supported by the president of the IPCB. Therefore, the present study was developed with the aim of analyzing the performance of RCIPCB, considering the evolution and growth in terms of users, archiving and self-archiving, the number of published documents (scientific versus deposited documents in 2010), and the heterogeneity among communities/collections and its causes. Data were collected in RCIPCB, in the 2010 scientific publication list of the institute, and through a questionnaire survey distributed among the members of the community with the most documents deposited and those of the community with the fewest documents. For data collected in RCIPCB…

  5. A social network approach to analyzing water governance: The case of the Mkindo catchment, Tanzania

    Science.gov (United States)

    Stein, C.; Ernstson, H.; Barron, J.

    The governance dimension of water resources management is just as complex and interconnected as the hydrological processes it aims to influence. There is an increasing need (i) to understand the multi-stakeholder governance arrangements that emerge from the cross-scale nature and multifunctional role of water; and (ii) to develop appropriate research tools to analyze them. In this study we demonstrate how social network analysis (SNA), a well-established technique from sociology and organizational research, can be used to empirically map collaborative social networks between actors that either directly or indirectly influence water flows in the Mkindo catchment in Tanzania. We assess how these collaborative social networks affect the capacity to govern water in this particular catchment and explore how knowledge about such networks can be used to facilitate more effective or adaptive water resources management. The study is novel as it applies social network analysis not only to organizations influencing blue water (the liquid water in rivers, lakes and aquifers) but also to those influencing green water (the soil moisture used by plants). Using a questionnaire and semi-structured interviews, we generated social network data on 70 organizations, ranging from local resource users and village leaders to higher-level governmental agencies, universities and NGOs. Results show that there is no organization that coordinates the various land and water related activities at the catchment scale. Furthermore, an important result is that village leaders play a crucial role in linking otherwise disconnected actors, but that they are not adequately integrated into the formal water governance system. Water user associations (WUAs) are in the process of establishment and could bring together actors currently not part of the formal governance system. However, the establishment of WUAs seems to follow a top-down approach that does not consider the existing informal organization of water users that is revealed…

  6. Using climate response functions in analyzing electricity production variables. A case study from Norway.

    Science.gov (United States)

    Tøfte, Lena S.; Martino, Sara; Mo, Birger

    2016-04-01

    representation of hydropower is included and total hydropower production for each area is calculated, with the production distributed among all available plants within each area. During simulation, the demand is affected by prices and temperatures. Six different infrastructure scenarios of wind and power line development are analyzed. The analyses are done by running EMPS, calibrated for today's situation, for 11 × 11 × 8 different combinations of altered weather variables (temperature, precipitation and wind) describing different climate change scenarios, finding the climate response function for every EMPS variable related to electricity production, such as prices and income, energy balances (supply, consumption and trade), overflow losses, probability of curtailment, etc.

  7. Analyzing Agricultural Sustainability Indicators,Under Energy Subsidy Reduction Policy(Case Study of Qorveh Plain

    Directory of Open Access Journals (Sweden)

    H. Balali

    2016-03-01

    region, situated in the west of Iran. Materials and Methods: The statistical sample of this research includes all irrigated land of Qorveh as the studied area. A partial equilibrium model was applied via a mathematical programming approach in order to analyze the economic and environmental effects of the reduction of energy subsidies for the agricultural sector in the studied area. For this purpose, questionnaires were used in a survey to identify the production coefficients of agricultural products and farmers' behavior during 2012-2013. The relevant equations were then used in a mathematical programming framework, with the aim of maximizing the gross margin of agricultural activities over the planning horizon, using GAMS 22.9. Results and Discussion: The results showed that by increasing energy prices in policy scenarios ES1 to ES7, the gross margin of agricultural activities decreases. Also, the results indicate that by implementation of scenarios ES1 and ES2, most economic and environmental indicators of agricultural sustainability will be improved; increasing energy prices according to the mentioned policy scenarios has the greatest effect on the GM_ELEC, GM_GAS, and NIT_H indicators, reducing them by 10.7%, 0.97% and 1.48%, respectively. In scenarios ES3 to ES5, with respect to scenarios ES1 and ES2, there is only a 7% decrease in the NIT_H index. In scenario ES6, which raises the electricity cost by 2.25 times and the diesel fuel cost by 1.98 times, GM_ELEC and GM_WA show the maximum decrease, namely 12.66% and 14.47% respectively, and WA_H reaches 9010, an increase of 6.47%. In scenario ES7, with the exception of WA_H, GM_ELEC and GM_GAS, the other indicators decreased, which shows that the closer we keep to real energy prices, the more improvement we observe in the environmental indicators. Conclusions: Consequently, the results showed that the reduction of energy subsidies leads to reductions in the economic indicators of the study area, such as total gross margins. Also, the results showed…

  8. Test experiment to search for a neutron EDM by the Laue diffraction method

    CERN Document Server

    Fedorov, V V; Lelievre-Berna, E; Nesvizhevsky, V V; Petoukhov, A; Semenikhin, S Y; Soldner, T; Tasset, F; Voronin, V V

    2005-01-01

    A prototype experiment to measure the neutron electric dipole moment (nEDM) by spin-rotation in a non-centrosymmetric crystal in Laue geometry was carried out in order to investigate the statistical sensitivity and systematic effects of the method. The statistical sensitivity to the nEDM was about $6\cdot 10^{-24}$ e$\cdot $cm per day and can be improved by one order of magnitude for the full scale setup. Systematics were limited by the homogeneity of the magnetic field in the crystal region and by a new kind of spin rotation effect. We attribute this effect to a difference in the amplitudes of the two Bloch waves in the crystal, which is caused by a small crystal deformation due to a temperature gradient. In a revised scheme of the experiment, this effect could be exploited for purposeful manipulation of the Bloch waves.

  9. The new powder diffractometer D1B of the Institut Laue Langevin

    Science.gov (United States)

    Puente Orench, I.; Clergeau, J. F.; Martínez, S.; Olmos, M.; Fabelo, O.; Campo, J.

    2014-11-01

    D1B is a medium resolution, high flux powder diffractometer located at the Institut Laue-Langevin (ILL). D1B is a suitable instrument for studying a large variety of polycrystalline materials. D1B has run since 1998 as a CRG (collaborating research group) instrument, being exploited by the CNRS (Centre National de la Recherche Scientifique, France) and CSIC (Consejo Superior de Investigaciones Cientificas, Spain). In 2008 the Spanish CRG started an upgrade program which included a new detector and a radial oscillating collimator (ROC). The detector, which has a sensitive height of 100 mm, covers an angular range of 128°. Its 1280 gold wires provide a neutron detection point every 0.1°. The ROC is made of 198 gadolinium-based absorbing collimation blades, regularly placed every 0.67°. Here the present characteristics of D1B are reviewed and its experimental performance is presented.

  10. INTERFACE RESIDUAL STRESSES IN DENTAL ZIRCONIA USING LAUE MICRO-DIFFRACTION

    Energy Technology Data Exchange (ETDEWEB)

    Bale, H. A.; Tamura, N.; Coelho, P.G.; Hanan, J. C.

    2009-01-01

    Because of their aesthetic value and high compressive strength, ceramics have recently been employed by dentists as restoration materials. Among ceramic materials, zirconia provides high toughness and crack-resistant characteristics. Residual stresses develop during processing due to factors including grain anisotropy and thermal coefficient mismatch. In the present study, polychromatic X-ray (Laue) micro-diffraction provided grain orientations and residual stresses on a clinically relevant zirconia model ceramic disk. A 0.5 mm x 0.024 mm region on the zirconia was examined on a 500 nm scale for residual stresses using a focused polychromatic synchrotron X-ray beam. Large stresses ranging from -1 to +1 GPa were observed at some grains. On average, the method suggests a relatively small compressive stress at the surface, between 47 and 75 MPa depending on direction.

  11. Study of imperfect natural diamonds with the application of the X-ray synchrotron radiation (the 'Laue-SR' method)

    CERN Document Server

    Rylov, G M; Sobolev, N V; Kulipanov, G N; Kondratyev, V I; Tolochko, B P; Sharafutdinov, M R

    2001-01-01

    The 'Laue-SR' method has been realised for the fast gathering of experimental data in the study of imperfect natural and synthesized diamonds, which are hard to investigate with conventional X-ray methods. The time to obtain a diffraction pattern with the use of polychromatic SR is shorter by several orders of magnitude; the resolution of the image of substructure defects of the crystal lattice (as compared to the conventional Laue method) is improved by an order of magnitude and does not vanish even at large disorientations or other non-coherent disturbances of the crystal lattice. The 'Laue-SR' method is especially appropriate for the study of intact, sufficiently large diamond crystals (up to 5 mm), since diamond has a small X-ray absorption coefficient and is practically transparent in the operational range of the SR wavelengths, lambda = 0.5-1.5 A. This method was shown to be applied successfully for an accelerated study of a large bulk of imperfect natural diamond crystals without any preliminary preparation and without their destruction…

  12. Dislocation-based plasticity model and micro-beam Laue diffraction analysis of polycrystalline Ni foil: A forward prediction

    Science.gov (United States)

    Song, Xu; Hofmann, Felix; Korsunsky, Alexander M.

    2010-10-01

    A physically-based, rate and length-scale dependent strain gradient crystal plasticity framework was employed to simulate the polycrystalline plastic deformation at the microscopic level in a large-grained, commercially pure Ni sample. The latter was characterised in terms of the grain morphology and orientation (in the bulk) by micro-beam Laue diffraction experiments carried out on beamline B16 at Diamond Light Source. The corresponding finite element model was developed using a grain-based mesh with the specific grain orientation assignment appropriate for the sample considered. Sample stretching to 2% plastic strain was simulated, and a post-processor was developed to extract the information about the local lattice misorientation (curvature), enabling forward-prediction of the Laue diffraction patterns. The 'streaking' phenomenon of the Laue spots (anisotropic broadening of two-dimensional (2D) diffraction peaks observed on the 2D detector) was correctly captured by the simulation, as constructed by direct superposition of reflections from different integration points within the diffraction gauge volume. Good agreement was found between the images collected from experiments and simulation patterns at various positions in the sample.

  13. Laue lens for radiotherapy applications through a focused hard x-ray beam: a feasibility study on requirements and tolerances

    Science.gov (United States)

    Camattari, Riccardo

    2017-09-01

    Focusing a hard x-ray beam would represent an innovative technique for tumour treatment, since such a beam may deliver a dose to a tumour located at a given depth under the skin while sparing the surrounding healthy cells. A detailed study of a focusing system for hard x-rays aimed at radiotherapy is presented here. Such a focusing system, named a Laue lens, exploits x-ray diffraction and consists of a series of crystals disposed as concentric rings capable of concentrating a flux of x-rays towards a focal point. A feasibility study regarding the positioning tolerances of the crystalline optical elements has been carried out. It is shown that a Laue lens can effectively be used in the context of radiotherapy for tumour treatment provided that the mounting errors are below certain values, which are attainable with modern micromechanics. An extended survey based on an analytical approach and on simulations is presented for precisely estimating the contribution of each mounting error, analysing its effect on the focal spot of the Laue lens. Finally, a simulation evaluating the dose released in a water phantom is shown.
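
    The core of the tolerance argument can be put in one line: a tile misoriented by delta deflects its diffracted beam by 2*delta, displacing its contribution at the focal plane by roughly 2*f*delta. A back-of-envelope sketch, with an illustrative focal length rather than the paper's design value:

```python
import math

def focal_shift_mm(focal_length_m, misalignment_arcsec):
    """Focal-plane displacement caused by a crystal tilt error."""
    delta = math.radians(misalignment_arcsec / 3600.0)
    return 2.0 * focal_length_m * delta * 1000.0  # deflection error is 2*delta

for arcsec in (5, 10, 30, 60):
    print(f"{arcsec:3d} arcsec tilt -> spot shift ~ {focal_shift_mm(20.0, arcsec):.2f} mm")
```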

  14. A positron annihilation radiation telescope using Laue diffraction in a crystal lens

    Energy Technology Data Exchange (ETDEWEB)

    Smither, R.K. (Argonne National Lab., IL (United States)); von Ballmoos, P. (Toulouse-3 Univ., 31 (France). Centre d'Etude Spatiale des Rayonnements)

    1993-03-01

    We present a new type of gamma-ray telescope featuring a Laue diffraction lens, a detector module with a 3-by-3 germanium array, and a balloon gondola stabilized to 5 arc sec pointing accuracy. The instrument's lens is designed to collect 511 keV photons on its 150 cm² effective area and focus them onto a small detector having only ~14 cm³ of equivalent volume for background noise. As a result, this telescope overcomes the mass-sensitivity impasse of present detectors, in which the collection area is identical to the detection area. The sensitivity of our instrument is anticipated to be 3 × 10⁻⁵ ph cm⁻² s⁻¹ at 511 keV with an angular resolution of 15 arc sec and an energy resolution of 2 keV. These features will allow a possible energetically narrow 511 keV positron annihilation line to be resolved both in energy and spatially within a Galactic Center 'microquasar' such as 1E1740.7-2942 or GRS1758-258. In addition to the galactic microquasars, other prime objectives include Cyg X-1, X-ray binaries, pulsars, and AGNs.

  16. A laboratory based system for Laue micro x-ray diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Advanced Light Source; Tamura, Nobumichi; Lynch, P.A.; Stevenson, A.W.; Liang, D.; Parry, D.; Wilkins, S.; Tamura, N.

    2007-02-28

    A laboratory diffraction system capable of illuminating individual grains in a polycrystalline matrix is described. Using a microfocus x-ray source equipped with a tungsten anode and a prefigured monocapillary optic, a micro-x-ray-diffraction system with a 10 µm beam was developed. The beam profile generated by the ellipsoidal capillary was determined using the "knife edge" approach. Measurement of the capillary performance indicated a beam divergence of 14 mrad and a usable energy bandpass from 5.5 to 19 keV. Utilizing the polychromatic nature of the incident x-ray beam and the Laue indexing software package X-Ray Micro-Diffraction Analysis Software, the orientation and deviatoric strain of single grains in a polycrystalline material can be studied. To highlight the system's potential, the grain orientation and strain distribution of individual grains in a polycrystalline magnesium alloy (Mg-0.2 wt% Nd) were mapped before and after tensile loading. A basal (0002) orientation was identified in the as-rolled annealed alloy; after tensile loading some grains were observed to undergo an orientation change of 30 degrees with respect to (0002). The applied uniaxial load was measured as an increase in the deviatoric tensile strain parallel to the load axis.
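
    A "knife edge" scan of this kind is commonly reduced by fitting an error function, the integral of a Gaussian beam profile, to the transmitted intensity as the edge cuts the beam. A self-contained sketch with synthetic data (a hypothetical 10 µm FWHM beam, not the measured profile):

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erf

        def knife_edge(x, x0, sigma, amplitude, offset):
            """Transmitted intensity as an edge at x cuts a Gaussian beam."""
            return offset + 0.5 * amplitude * (1.0 + erf((x - x0) / (sigma * np.sqrt(2.0))))

        x = np.linspace(-20.0, 20.0, 81)  # edge positions, um
        rng = np.random.default_rng(0)
        data = knife_edge(x, 0.0, 4.25, 1.0, 0.02) + rng.normal(0.0, 0.01, x.size)

        popt, _ = curve_fit(knife_edge, x, data, p0=[0.0, 3.0, 1.0, 0.0])
        print(f"fitted FWHM = {2.355 * abs(popt[1]):.2f} um")  # ~10 um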

  17. Manufacturing of advanced bent crystals for Laue Optics for Gamma ObservationS (LOGOS)

    Energy Technology Data Exchange (ETDEWEB)

    Mazzolari, Andrea, E-mail: mazzolari@fe.infn.it [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat 1/c, 44122 Ferrara (Italy); INFN, Section of Ferrara (Italy); Camattari, Riccardo; Bellucci, Valerio; Paternò, Gianfranco [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat 1/c, 44122 Ferrara (Italy); INFN, Section of Ferrara (Italy); Scian, Carlo; Mattei, Giovanni [University of Padova, Department of Physics and Astronomy Galileo Galilei (Italy); Guidi, Vincenzo [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat 1/c, 44122 Ferrara (Italy); INFN, Section of Ferrara (Italy)

    2015-07-15

    X- and γ-ray detection is currently a hot topic for a wide scientific community, spanning from astrophysics to nuclear medicine. However, the lack of optics capable of focusing photons in the energy range 0.1-1 MeV leaves photon detection to a direct-view approach, resulting in limited efficiency and resolution. The main scope of the INFN-LOGOS project is the development of technologies that enable the manufacture of high-performance optical elements to be employed in the realization of hard X-ray lenses. Such lenses, typically named Laue lenses, consist of an ensemble of crystals arranged in concentric rings in order to diffract the incident radiation towards the focus of the lens, where a detector is placed. In particular, the INFN-LOGOS project aims at the realization of intrinsically bent silicon and germanium crystals exploiting the quasi-mosaic effect for focusing hard X-rays. Crystal manufacturing relies on an adaptation of techniques typically employed in silicon micromachining, such as thin-film deposition and patterning or ion implantation.

  18. The Laue diffraction method to search for a neutron EDM. Experimental test of the sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.V. E-mail: vfedorov@mail.pnpi.spb.ru; Lapin, E.G.; Lelievre-Berna, E.; Nesvizhevsky, V.V.; Petoukhov, A.K.; Semenikhin, S.Yu.; Soldner, T.; Tasset, F.; Voronin, V.V

    2005-01-01

    The feasibility of an experiment to search for the neutron electric dipole moment (EDM) by Laue diffraction in crystals without a centre of symmetry was tested. At the PF1A beam of the ILL reactor, a record time delay of τ ≈ 2 ms for the passage of neutrons through a quartz crystal was reached for the (1 1 0) plane and a diffraction angle of 88.5°. That corresponds to an effective neutron velocity in the crystal of 20 m/s, while the velocity of the incident neutrons was 800 m/s. It was shown experimentally that the value τ√N, which determines the method's sensitivity, has a maximum for a Bragg angle of 86°. The results allow us to estimate the statistical sensitivity of the method for the neutron EDM. For the PF1B beam of the ILL reactor the sensitivity can reach ≈6 × 10⁻²⁵ e·cm per day for the available quartz crystal.
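
    The quoted numbers are mutually consistent: near back-reflection, the neutron's velocity component along the crystal is v*cos(theta_B), and 800 m/s at 88.5° gives roughly the reported 20 m/s effective velocity. A quick check (the path length below is inferred from the abstract's numbers, not stated in it):

        import math

        v = 800.0  # incident neutron velocity, m/s (from the abstract)
        theta_b = math.radians(88.5)

        v_eff = v * math.cos(theta_b)
        print(f"effective velocity ~ {v_eff:.1f} m/s")  # ~20.9 m/s

        tau = 2e-3  # record time delay, s (from the abstract)
        print(f"implied path length ~ {v_eff * tau * 100:.1f} cm")  # a few cm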

  19. A focal plane detector design for a wide-band Laue-lens telescope

    CERN Document Server

    Caroli, Ezio; Auricchio, Natalia; Amati, Lorenzo; Bezsmolnyy, Yuriy; Budtz-Jorgensen, Carl; Curado da Silva, Rui M.; Frontera, Filippo; Pisa, Alessandro; Del Sordo, Stefano; Stephen, John B.; Ventura, Giulio

    2006-01-01

    The energy range above 60 keV is important for the study of many open problems in high-energy astrophysics, such as the role of inverse Compton scattering with respect to synchrotron or thermal processes in GRBs, non-thermal mechanisms in SNRs, the study of high-energy cut-offs in AGN spectra, and the detection of nuclear and annihilation lines. Recently the development of high-energy Laue lenses with a broad energy bandpass from 60 to 600 keV has been proposed for a Hard X-ray focusing Telescope (HAXTEL) in order to study the X-ray continuum of celestial sources. The required focal plane detector should have high detection efficiency over the entire operative range, a spatial resolution of about 1 mm, an energy resolution of a few keV at 500 keV and sensitivity to linear polarization. We describe a possible configuration of the focal plane detector based on several CdTe/CZT pixelated layers stacked together to achieve the required detection efficiency at high energy. Each layer can operate both as a separate posit...
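
    The motivation for stacking follows from simple attenuation: the interaction probability of N stacked layers of thickness t is 1 - exp(-mu*N*t). A toy estimate (the attenuation value below is an order-of-magnitude assumption for CZT near 500 keV, not a figure from the paper):

        import math

        def stack_efficiency(n_layers, layer_cm, mu_per_cm):
            """Total interaction probability after n stacked layers."""
            return 1.0 - math.exp(-mu_per_cm * n_layers * layer_cm)

        MU_CZT = 0.5  # assumed linear attenuation near 500 keV, 1/cm
        for n in (1, 4, 8):
            print(n, f"{stack_efficiency(n, 0.5, MU_CZT):.2f}")  # 0.22, 0.63, 0.86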

  20. Development of a 3D CZT detector prototype for Laue Lens telescope

    Science.gov (United States)

    Caroli, Ezio; Auricchio, Natalia; Del Sordo, Stefano; Abbene, Leonardo; Budtz-Jørgensen, Carl; Casini, Fabio; Curado da Silva, Rui M.; Kuvvetlli, Irfan; Milano, Luciano; Natalucci, Lorenzo; Quadrini, Egidio M.; Stephen, John B.; Ubertini, Pietro; Zanichelli, Massimiliano; Zappettini, Andrea

    2010-07-01

    We report on the development of a 3D position-sensitive prototype suitable as a focal plane detector for a Laue lens telescope. The basic sensitive unit is a drift strip detector based on a CZT crystal (~19 × 8 mm² area, 2.4 mm thick), irradiated transversally to the electric field direction. The anode side is segmented into 64 strips that divide the crystal into 8 independent sensors (pixels), each composed of one collecting strip and 7 adjacent drift strips (one shared). The drift strips are biased by a voltage divider, whereas the anode strips are held at ground. Furthermore, the cathode is divided into 4 horizontal strips for the reconstruction of the third interaction-position coordinate. The 3D prototype will be made by packing 8 linear modules, each composed of one basic sensitive unit bonded on a ceramic layer. Readout of the linear modules is provided by custom front-end electronics implementing a set of three RENA-3 chips for a total of 128 channels. The front-end electronics and the operating logic (in particular the coincidence logic for polarisation measurements) are handled by versatile and modular multi-parametric back-end electronics developed using FPGA technology.
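
    In planar CZT sensors, the depth of interaction is commonly estimated from the cathode-to-anode signal ratio, which grows roughly linearly with the electron drift distance. A schematic sketch of that textbook estimate (not the prototype's actual reconstruction, which here uses the segmented cathode strips):

        def interaction_depth_mm(cathode_signal, anode_signal, thickness_mm=2.4):
            """Crude depth estimate: C/A ~ (distance from the anode) / thickness."""
            ratio = max(0.0, min(1.0, cathode_signal / anode_signal))
            return ratio * thickness_mm

        print(interaction_depth_mm(0.4, 1.0))  # ~0.96 mm from the anode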

  1. Fabrication of multilayer Laue lenses by a combination of pulsed laser deposition and focused ion beam.

    Science.gov (United States)

    Liese, Tobias; Radisch, Volker; Krebs, Hans-Ulrich

    2010-07-01

    X-ray diffractive optics using Fresnel zone-plate lenses of various forms are of great technical interest because of their ability to form images at very high spatial resolution, but such zone plates are unfortunately very hard to produce by lithography. Alternatively, multilayer Laue lenses (MLLs) and multilayer zone plates are used because of the higher and easily adjustable aspect ratios necessary for different wavelengths. In this paper, the fabrication of an MLL by a combination of pulsed laser deposition and focused ion beam machining is described. All steps of the production of a Ti/ZrO2 microlens test structure with a focal length of 220 µm (for a wavelength of 2.88 nm in the "water window" regime) are explained in detail. It is shown that this combination of two powerful techniques is very effective for the fabrication of MLLs. All steps can be done in a very precise and controlled way without introducing damage to the grown multilayer structures.
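
    An MLL is essentially a one-dimensional sliced zone plate, so its layer positions follow the zone-plate law r_n = sqrt(n*lambda*f + (n*lambda/2)^2); with the quoted lambda = 2.88 nm and f = 220 µm one can sketch the required layer widths (the zone count below is hypothetical, not from the paper):

        import math

        wavelength = 2.88e-9  # m (from the abstract)
        focal_len = 220e-6    # m (from the abstract)

        def zone_radius(n):
            """Fresnel zone-plate law for the n-th zone boundary."""
            return math.sqrt(n * wavelength * focal_len + (n * wavelength / 2.0) ** 2)

        n_zones = 100  # hypothetical
        r_n = zone_radius(n_zones)
        dr_n = r_n - zone_radius(n_zones - 1)
        print(f"r_{n_zones} ~ {r_n * 1e6:.1f} um, outermost width ~ {dr_n * 1e9:.0f} nm")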

  2. Double-slit dynamical diffraction of X-rays in ideal crystals (Laue case).

    Science.gov (United States)

    Balyan, Minas K

    2010-11-01

    The theoretical investigation of double-slit dynamical X-ray diffraction in ideal crystals shows that, on the exit surface of the crystal, interference fringes similar to Young's fringes are formed. An expression for the period of the fringes is obtained. The visibility of the fringes is studied as a function of the temporal and spatial coherence properties of the incident beam. The polarization state of the incident beam also affects the visibility of the fringes, which in turn depends on the size of the slits. Deviation from the exact Bragg angle causes a shift of the fringes and can also affect the amplitude of the intensity. One of the parameters on which the visibility of the fringes depends is the source-crystal distance. The proposed scheme can be used as a Rayleigh X-ray interferometer. Use of the scheme as a Michelson X-ray stellar interferometer is also possible.
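
    For orientation, the familiar free-space Young analogue gives a fringe period of lambda*L/d; the paper derives the dynamical-diffraction counterpart inside the crystal, which this simple limit does not capture (all numbers below are assumed for illustration):

        wavelength = 0.7e-10  # m, a typical hard-X-ray wavelength
        L = 0.5               # m, slit-to-observation distance
        d = 20e-6             # m, slit separation
        print(f"free-space fringe period ~ {wavelength * L / d * 1e6:.2f} um")  # ~1.75 um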

  3. Analyzing and modeling CRE in a changing climate and energy system - a case study from Mid-Norway

    Science.gov (United States)

    Tøfte, Lena S.; Sauterleute, Julian F.; Kolberg, Sjur A.; Warland, Geir

    2014-05-01

    Climate related energy (CRE) is influenced by weather, the system for energy transport and market mechanisms. In the COMPLEX project, Mid-Norway is a case study where we analyze co-fluctuations between wind and hydropower resources; how these co-fluctuations may change in the long term; which effects this has on power generation; and how the hydropower system can be operated optimally in this context. In the region Mid-Norway, nearly all power demand is met by hydro-electric facilities, and the region experiences a deficit of electricity, due both to energy deficiency and to limitations in the power grid. In periods of low inflow and situations with high electricity demand (i.e. winter), power must be imported from neighboring regions. In the future, this situation might change with the development of renewable energy sources: the region is likely to see considerable investments in wind power and small-scale hydropower. In relation to this deployment and to security of supply, the transmission grid within and out of the region is being extended. With increasing production from intermittent energy sources such as wind and small-scale hydro, dependencies and co-fluctuations between rain and wind need to be analyzed across spatial and temporal scales, in the present and in a future climate. Climate change scenarios agree on higher temperatures, more precipitation in total and a larger portion of the precipitation coming as rain in this region, and the average wind speed as well as the frequency of storms along the coast is expected to increase slightly during winter. Changing temperatures will also change electricity needs, as electricity is the main source of heating in Norway. It is important to study whether, and to what extent, today's hydropower system and reservoirs are able to balance new intermittent energy sources in the region, in both today's and tomorrow's climate. The case study includes down-scaling of climate

  4. Development of an instrument to analyze organizational characteristics in multidisciplinary care pathways; the case of colorectal cancer.

    Science.gov (United States)

    Pluimers, Dorine J; van Vliet, Ellen J; Niezink, Anne Gh; van Mourik, Martijn S; Eddes, Eric H; Wouters, Michel W; Tollenaar, Rob A E M; van Harten, Wim H

    2015-04-09

    To analyze the organization of multidisciplinary care pathways such as colorectal cancer care, an instrument was developed based on a recently published framework that was earlier used in analyzing (monodisciplinary) specialist cataract care from a lean perspective. The instrument was constructed using semi-structured interviews and direct observation of the colorectal care process based on a Rapid Plant Assessment. Six lean aspects that were earlier established to highly impact process design were investigated: operational focus, autonomous work cell, physical lay-out of resources, multi-skilled team, pull planning and non-value-adding activities. To test the reliability, clarity and face validity of the instrument, a pilot study was performed in eight Dutch hospitals. In the pilot it proved feasible to apply the instrument and generate the intended information. The instrument consisted of 83 quantitative and 24 qualitative items. Examples of results show differences in operational focus, the number of patient visits needed for diagnosis, the number of staff involved with treatment, the implementation of protocols and the utilization of one-stop shops. Identification of waste and non-value-adding activities may need further attention. Based on feedback from the clinicians involved, the face validity was acceptable and the results provided useful feedback and benchmark data. The instrument proved to be reliable and valid for broader implementation in Dutch health care. The limited number of cases made statistical analysis impossible, and further validation studies may shed better light on variation. This paper demonstrates the use of an instrument to analyze organizational characteristics in colorectal cancer care from a lean perspective. Wider use might help to identify best organizational practices for colorectal surgery. In larger series the instrument might be used for in-depth research into the relation between organization and patient outcomes. Although we found no reason

  5. The Geometric Factor of Electrostatic Plasma Analyzers: A Case Study from the Fast Plasma Investigation for the Magnetospheric Multiscale mission

    Science.gov (United States)

    Collinson, Glyn A.; Dorelli, John Charles; Avanov, Leon A.; Lewis, Gethyn R.; Moore, Thomas E.; Pollock, Craig; Kataria, Dhiren O.; Bedington, Robert; Arridge, Chris S.; Chornay, Dennis J.; Gliese, Ulrik; Mariano, Al.; Barrie, Alexander C.; Tucker, Corey; Owen, Christopher J.; Walsh, Andrew P.; Shappirio, Mark D.; Adrian, Mark L.

    2012-01-01

    We report our findings comparing the geometric factor (GF) as determined from simulations and laboratory measurements of the new Dual Electron Spectrometer (DES) being developed at NASA Goddard Space Flight Center as part of the Fast Plasma Investigation on NASA's Magnetospheric Multiscale mission. Particle simulations are increasingly playing an essential role in the design and calibration of electrostatic analyzers, facilitating the identification and mitigation of the many sources of systematic error present in laboratory calibration. While equations for laboratory measurement of the GF have been described in the literature, these are not directly applicable to simulation since the two are carried out under substantially different assumptions and conditions, making direct comparison very challenging. Starting from first principles, we derive generalized expressions for the determination of the GF in simulation and laboratory, and discuss how we have estimated errors in both cases. Finally, we apply these equations to the new DES instrument and show that the results agree within errors. Thus we show that the techniques presented here will produce consistent results between laboratory and simulation, and present the first description of the performance of the new DES instrument in the literature.

  6. The geometric factor of electrostatic plasma analyzers: A case study from the Fast Plasma Investigation for the Magnetospheric Multiscale mission

    Energy Technology Data Exchange (ETDEWEB)

    Collinson, Glyn A. [Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States); Mullard Space Science Laboratory, University College London, Holmbury St. Mary, Surrey (United Kingdom); Dorelli, John C.; Moore, Thomas E.; Pollock, Craig; Mariano, Al; Shappirio, Mark D.; Adrian, Mark L. [Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States); Avanov, Levon A. [Innovim, 7501 Greenway Center Drive, Maryland Trade Center III, Greenbelt, Maryland 20770 (United States); Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States); Lewis, Gethyn R.; Kataria, Dhiren O.; Bedington, Robert; Owen, Christopher J.; Walsh, Andrew P. [Mullard Space Science Laboratory, University College London, Holmbury St. Mary, Surrey (United Kingdom); Arridge, Chris S. [Mullard Space Science Laboratory, University College London, Holmbury St. Mary, Surrey (United Kingdom); The Centre for Planetary Sciences, UCL/Birkbeck (United Kingdom); Chornay, Dennis J. [University of Maryland, 7403 Hopkins Avenue, College Park, Maryland 20740 (United States); Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States); Gliese, Ulrik [SGT, Inc., 7515 Mission Drive, Suite 30, Lanham, Maryland 20706 (United States); Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States); Barrie, Alexander C. [Millennium Engineering and Integration, 2231 Crystal Dr., Arlington, Virginia 22202 (United States); Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States); Tucker, Corey [Global Science and Technology Inc., 7855 Walker Drive, Greenbelt, Maryland 20770 (United States); Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Maryland 20071 (United States)

    2012-03-15

    We report our findings comparing the geometric factor (GF) as determined from simulations and laboratory measurements of the new Dual Electron Spectrometer (DES) being developed at NASA Goddard Space Flight Center as part of the Fast Plasma Investigation on NASA's Magnetospheric Multiscale mission. Particle simulations are increasingly playing an essential role in the design and calibration of electrostatic analyzers, facilitating the identification and mitigation of the many sources of systematic error present in laboratory calibration. While equations for laboratory measurement of the GF have been described in the literature, these are not directly applicable to simulation since the two are carried out under substantially different assumptions and conditions, making direct comparison very challenging. Starting from first principles, we derive generalized expressions for the determination of the GF in simulation and laboratory, and discuss how we have estimated errors in both cases. Finally, we apply these equations to the new DES instrument and show that the results agree within errors. Thus we show that the techniques presented here will produce consistent results between laboratory and simulation, and present the first description of the performance of the new DES instrument in the literature.
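
    In simulation, a GF of this kind is typically estimated by Monte Carlo: launch particles over a phase space that over-fills the aperture, and multiply the accepted fraction by the launched area-angle product. A toy sketch with a stand-in acceptance function (purely illustrative; the paper derives the properly generalized expressions):

        import numpy as np

        rng = np.random.default_rng(1)

        def toy_accept(y, theta):
            """Stand-in for a ray-traced analyzer response (narrow y/theta band)."""
            return (np.abs(y) < 0.002) & (np.abs(theta) < np.radians(2.0))

        n = 1_000_000
        area = 0.01 * 0.01                      # 1 cm x 1 cm launch plane, m^2
        y = rng.uniform(-0.005, 0.005, n)       # launch positions, m
        theta = rng.uniform(-np.radians(5.0), np.radians(5.0), n)
        angle_range = 2.0 * np.radians(5.0)     # launched angle range (planar toy), rad

        gf = area * angle_range * toy_accept(y, theta).mean()
        print(f"toy geometric factor ~ {gf:.2e} m^2 rad")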

  7. Analyzing the relationship between perceived justice and satisfaction after service failures: a case study in a telecom company

    Directory of Open Access Journals (Sweden)

    Marcus Augusto Vasconcelos Araújo

    2016-03-01

    Full Text Available Zero defects are virtually unachievable in service operations, which makes the ability of companies to effectively manage customer complaints arising from service failures an important condition for their long-term success. This article evaluates the influence of justice perceptions on customer satisfaction after a complaints-management process. The case of a cell phone company is analyzed, in which a survey of 496 customers who had complained to the firm's call center was conducted. Customers' justice perceptions and their relationship with satisfaction after the procedure were evaluated. The results of a logistic regression analysis indicated that perceptions of procedural and distributive justice significantly influenced satisfaction, while interpersonal justice, despite having the highest individual mean, did not show any significant relationship with this construct. Some important reflections are made based on these results, considering that poorly evaluated elements of justice had great influence on final satisfaction, which indicates the need for the company to review its priorities in terms of process evaluation and employee training.

  8. Development of a bent Laue beam-expanding double-crystal monochromator for biomedical X-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Martinson, Mercedes, E-mail: mercedes.m@usask.ca [University of Saskatchewan, 116 Science Place, Room 163, Saskatoon, Saskatchewan (Canada); Samadi, Nazanin [University of Saskatchewan, 107 Wiggins Road, Saskatoon, Saskatchewan (Canada); Belev, George [Canadian Light Source, 44 Innovation Boulevard, Saskatoon, Saskatchewan (Canada); Bassey, Bassey [University of Saskatchewan, 116 Science Place, Room 163, Saskatoon, Saskatchewan (Canada); Lewis, Rob [University of Saskatchewan, 107 Wiggins Road, Saskatoon, Saskatchewan (Canada); Monash University, Clayton, Victoria 3800 (Australia); Aulakh, Gurpreet [University of Saskatchewan, 107 Wiggins Road, Saskatoon, Saskatchewan (Canada); Chapman, Dean [University of Saskatchewan, 116 Science Place, Room 163, Saskatoon, Saskatchewan (Canada); University of Saskatchewan, 107 Wiggins Road, Saskatoon, Saskatchewan (Canada)

    2014-03-13

    A bent Laue beam-expanding double-crystal monochromator was developed and tested at the Biomedical Imaging and Therapy (BMIT) beamline at the Canadian Light Source. The expander will reduce scanning time for micro-computed tomography and allow dynamic imaging that has not previously been possible at this beamline. The BMIT beamline has produced some excellent biological imaging data; however, its small vertical beam limits its usability in some applications. Micro-computed tomography (micro-CT) imaging requires multiple scans to produce a full projection, and certain dynamic imaging experiments are not possible. A larger vertical beam is desirable, but it was cost-prohibitive to build a longer beamline that would have produced one. Instead, it was proposed to develop a beam expander that would create a beam appearing to originate at a source much farther away. This was accomplished using a bent Laue double-crystal monochromator in a non-dispersive divergent geometry. The design and implementation of this beam expander are presented along with results from the micro-CT and dynamic imaging tests conducted with this beam. Flux (photons per unit area per unit time) has been measured and found to be comparable with that of the existing flat Bragg double-crystal monochromator in use at BMIT; the increase in overall photon count is due to the enhanced bandwidth of the bent Laue configuration. Whilst the expanded beam quality is suitable for dynamic imaging and micro-CT, further work is required to improve its phase and coherence properties.
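
    The "source much farther away" can be quantified with straight-line geometry: if the expanded beam of height h grows by dh over a distance z, the virtual source sits h/(dh/z) upstream. A small sketch with hypothetical numbers (not measurements from the beamline):

        def virtual_source_distance_m(h1_mm, h2_mm, separation_m):
            """Apparent source distance inferred from beam heights at two points."""
            growth_per_m = (h2_mm - h1_mm) / separation_m
            return h1_mm / growth_per_m

        # e.g. an 8 mm beam growing to 8.4 mm over 10 m appears to come from ~200 m.
        print(f"{virtual_source_distance_m(8.0, 8.4, 10.0):.0f} m upstream")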

  9. A Study of Pre-Service Information and Communication Teachers’ Efficacy Levels for Analyzing and Responding to Cyberbullying Cases

    Directory of Open Access Journals (Sweden)

    Melike Kavuk

    2016-07-01

    Full Text Available This case study was conducted to investigate preservice Information and Communication teachers' efficacy levels in identifying, preventing and intervening in cyberbullying cases. Fifty participants were interviewed and 56 cyberbullying cases, which the participants had experienced or witnessed, were collected to evaluate their cyberbullying readiness. Based on the content analysis and the expert ratings, the preservice teachers were found to have problems in identifying cyberbullying cases, suggesting appropriate prevention strategies, judging intervention strategies, and suggesting appropriate intervention methods.

  10. Experimental and theoretical study of the diffraction properties of various crystals for the realization of a soft gamma-ray Laue lens

    DEFF Research Database (Denmark)

    Barriere, N.; Rousselle, J.; von Ballmoos, P.

    2009-01-01

    Crystals are the elementary constituents of Laue lenses, an emerging technology which could allow the realization of a space-borne telescope 10-100 times more sensitive than existing ones, in the 100 keV-1.5 MeV energy range. This paper addresses the development of efficient crystals for the real...

  11. A Study of Pre-Service Information and Communication Teachers' Efficacy Levels for Analyzing and Responding to Cyberbullying Cases

    Science.gov (United States)

    Kavuk, Melike; Bulu, Sanser; Keser, Hafize

    2016-01-01

    This case study was conducted to investigate preservice Information and Communication teachers' efficacy levels in identifying, preventing and intervening in cyberbullying cases. Fifty participants were interviewed and 56 cyberbullying cases, which the participants had experienced or witnessed, were collected to evaluate their cyberbullying readiness.…

  12. Development of an instrument to analyze organizational characteristics in multidisciplinary care pathways; the case of colorectal cancer

    NARCIS (Netherlands)

    Pluimers, Dorine J.; Vliet, van Ellen J.; Niezink, Anne G.H.; Mourik, van Martijn S.; Eddes, Eric H.; Wouters, Michel W.; Tollenaar, Rob A.E.M.; Harten, van Wim H.

    2015-01-01

    Background: To analyze the organization of multidisciplinary care pathways such as colorectal cancer care, an instrument was developed based on a recently published framework that was earlier used in analyzing (monodisciplinary) specialist cataract care from a lean perspective. Methods: The instrumen...

  13. Analyzing the Classroom Teachers' Levels of Creating a Constructivist Learning Environments in Terms of Various Variables: A Mersin Case

    Science.gov (United States)

    Üredi, Lütfi

    2014-01-01

    This research aimed to analyze classroom teachers' levels of creating a constructivist learning environment in terms of various variables. For that purpose, the relational screening model was used. Classroom teachers' level of creating a constructivist learning environment was determined using the "constructivist…

  14. Analyzing and Interpreting Lime Burials from the Spanish Civil War (1936-1939): A Case Study from La Carcavilla Cemetery.

    Science.gov (United States)

    Schotsmans, Eline M J; García-Rubio, Almudena; Edwards, Howell G M; Munshi, Tasnim; Wilson, Andrew S; Ríos, Luis

    2017-03-01

    Over 500 victims of the Spanish Civil War (1936-1939) were buried in the cemetery of La Carcavilla (Palencia, Spain). White material observed in several burials was analyzed with Raman spectroscopy and powder XRD and confirmed to be lime. Archaeological findings at La Carcavilla's cemetery show that lime was applied in an organized way, mostly associated with coffinless interments of victims of Francoist repression. In burials with a lime cast, observations made it possible to draw conclusions regarding the presence of soft tissue at the moment of deposition, the sequence of events, and the presence of clothing and other evidence. This study illustrates the importance of analyzing a burial within its depositional environment and taphonomic context.

  15. Preliminary 3D In-situ measurements of the texture evolution of strained H2O ice during annealing using neutron Laue diffractometry

    Science.gov (United States)

    Journaux, Baptiste; Montagnat, Maurine; Chauve, Thomas; Ouladdiaf, Bachir; Allibon, John

    2015-04-01

    Dynamic recrystallization (DRX) strongly affects the evolution of microstructure (grain size and shape) and texture (crystal preferred orientation) in materials during deformation at high temperature. Since texturing leads to anisotropic physical properties, predicting the effect of DRX is essential for industrial applications, for interpreting geophysical data and modeling geodynamic flows, and for predicting ice sheet flow and climate evolution. A large amount of literature is available in metallurgy, geology and glaciology, but fundamental questions remain about the relationship between nucleation, grain boundary migration and texture development at the microscopic scale. Previous measurements of DRX in ice were conducted either with 2D ex-situ techniques such as AITA [1,2] or Electron Backscattering Diffraction (EBSD) [3], or with 3D statistical ex-situ [4] or in-situ [5] techniques. Nevertheless, all these techniques fail to observe nucleation processes during DRX in full 3D. Here we present a new approach using neutron Laue diffraction, which enables 3D in-situ measurements of the texture evolution of strained polycrystalline H2O ice (>2% at 266 K) during annealing at the microscopic scale. Thanks to the CYCLOPS instrument [6] (Institut Laue-Langevin, Grenoble, France) and the intrinsic low background of this setup, preliminary observations enabled us to follow, in H2O ice, the evolution of serrated grain boundaries and kink bands during annealing. Our observations show a significant evolution of the texture and internal misorientation over the course of a few hours at an annealing temperature of 268.5 K. On the contrary, ice kink-band structures seem to be very stable over time at near-melting temperatures. The same samples have been analyzed ex-situ using EBSD for comparison. These results represent a first step toward in-situ microscopic measurements of dynamic recrystallization processes in ice during strain. This

  16. Analyzing The Relationship Among The GDP - Current Account Deficit and Short Term Capital Flows: The Case of Emerging Markets

    Directory of Open Access Journals (Sweden)

    Yusuf Ekrem AKBAŞ

    2014-12-01

    Full Text Available In this study, we analyzed whether there are causal relationships among the current account deficit, short-term capital flows and economic growth in emerging markets. Before the causality tests, CDLM tests were performed to determine whether there is cross-section dependence among the countries forming the panel. The CDLM tests indicated cross-section dependence in the emerging markets forming the panel. A panel causality test was then performed. According to its results, there is bidirectional causality between the current account deficit and GDP, and unidirectional causality running from short-term capital flows to both the current account deficit and GDP.

  17. Analyzing Factor of Time of Scoring Goal in Success of Football (Case Study: South Africa World Cup 2010)

    Directory of Open Access Journals (Sweden)

    Shafiee Shahram

    2014-10-01

    Full Text Available The goal of this research is to study the timing of scored goals in the South Africa World Cup 2010 and its relationship with successful results for the teams participating in this tournament. The research method is descriptive-analytical; information was collected observationally with a VCD, a 32-inch television set and datasheets. The statistical population and sample were equal (N=n) and comprised the 64 matches held during the tournament. Data were analyzed with SPSS 18 using descriptive statistics and inferential tests such as the binomial and chi-square tests. Results showed a significant difference between the two halves of matches in terms of the number of scored goals: increased attacking in the second 45 min led to more goals. No significant difference was found in scored goals across 15-minute intervals, and teams competed with their opponents until the final minutes. Teams which scored the first goal obtained significantly more positive results and had a greater chance of victory. Moreover, all teams which were ahead by one goal in the final 30 min went on to win (P≤0.05).

  18. A SWOT Analysis for Organizing a Summer School: Case Study for Advanced Summer School in Analyzing Market Data 2013

    Directory of Open Access Journals (Sweden)

    Radu Herman

    2013-05-01

    Full Text Available Economics scholars agree that investment in education is a competitive advantage. After participating in and graduating from the "Advanced Summer School in Analyzing Market Data 2013", students will gain formal competences in applied statistical knowledge with the IBM SPSS Statistics software. Studies show that employers seek practical competences in undergraduate students along with theoretical knowledge. The article focuses on a SWOT analysis for organizing a summer school, composing lists of strengths, weaknesses, opportunities and threats. The purpose of the "Advanced Summer School in Analyzing Market Data 2013" is to train undergraduate students from the social and human sciences to gain competences which are valued in the market, together with a certificate of attendance; to develop an appropriate training program which combines applied knowledge, statistics and the IBM SPSS software; and to create a "Summer School quality brand" with high-quality training programs for the Faculty of Administration and Business.

  19. Adulterations in first century BC: the case of Greek silver drachmae analyzed by X-ray methods

    Science.gov (United States)

    Constantinescu, Bogdan; Săşianu, Alexandru; Bugoi, Roxana

    2003-04-01

    For coins, chemical differences that occur during preparation affect the elemental composition and can be used to identify the producing technologies and workshops and to distinguish between originals and counterfeits. In this study, we focus our attention on some Thassos silver tetradrachmae and on a substantial number of Apollonia-Dyrrhachium silver drachmae minted by these Greek cities under Pompejus' authority during the First Roman Civil War. All the analyzed coins were found on the territory of present-day Romania (ancient Dacia). The substantial presence of Apollonia-Dyrrhachium drachmae here can be explained by the hypothesis that these coins were probably used by Pompejus as payment for Dacian mercenaries. To analyze the elemental composition we used 241Am gamma-source-based X-ray fluorescence and in-vacuum 3 MeV proton-induced X-ray emission (PIXE). The following main categories were found: original coins, local (Barbarian) imitations, debased coins with silver content down to 70%, official (original-die) counterfeits of bronze, official counterfeits of tin, and plated coins (a bronze core covered by a 0.2-0.5 mm silver layer).

  20. "Balancing the Partner's Contribution" - Analyzing the Risks and Benifits in the case of an Indian Joint Venture.

    OpenAIRE

    Gautam, Mukim

    2006-01-01

    A joint venture is used as it has the tendency to evolve where there is a need to implant hierarchy to enable future strategic decision-making, in cases where single-firm alternatives are inferior or prohibited (Kay, N.M. 1997). A joint venture can be defined as an agreement between two or more firms leading to the establishment of a separate legal entity. The firms may be held responsible for their own individual tasks, for which they have special expertise, knowledge or past achievements. For ...

  1. "Balancing the Partner's Contribution" - Analyzing the Risks and Benifits in the case of an Indian Joint Venture.

    OpenAIRE

    gautam, mukim

    2006-01-01

    Joint Venture is used as it has the tendency to evolve where there is a need to implant hierarchy to enable future strategic decision-making in cases where single firm alternatives are inferior or prohibited. (Kay,N.M. 1997) A Joint Venture can be defined an agreement between two or more firms, leading to the establishment of a separate legal entity. The firms may be held responsible for their own individual tasks for which they have special expertise, knowledge or past achievements. For ...

  2. A Case Study: Analyzing City Vitality with Four Pillars of Activity-Live, Work, Shop, and Play.

    Science.gov (United States)

    Griffin, Matt; Nordstrom, Blake W; Scholes, Jon; Joncas, Kate; Gordon, Patrick; Krivenko, Elliott; Haynes, Winston; Higdon, Roger; Stewart, Elizabeth; Kolker, Natali; Montague, Elizabeth; Kolker, Eugene

    2016-03-01

    This case study evaluates and tracks the vitality of a city (Seattle), based on a data-driven approach, using strategic, robust, and sustainable metrics. The case study was conducted collaboratively by the Downtown Seattle Association (DSA) and CDO Analytics teams. The DSA is a nonprofit organization focused on making the city of Seattle and its Downtown a healthy and vibrant place to Live, Work, Shop, and Play. DSA primarily operates through public policy advocacy, community and business development, and marketing. In 2010, the organization turned to CDO Analytics (cdoanalytics.org) to develop a process that could guide and strategically focus DSA efforts and resources for maximal benefit to the city of Seattle and its Downtown. CDO Analytics was asked to develop clear, easily understood, and robust metrics for a baseline evaluation of the health of the city, as well as for ongoing monitoring and comparisons of vitality, sustainability, and growth. The DSA and CDO Analytics teams strategized on how to effectively assess and track the vitality of Seattle and its Downtown. The two teams filtered a variety of data sources and evaluated the veracity of multiple diverse metrics. This iterative process resulted in a small number of strategic, simple, reliable, and sustainable metrics across four pillars of activity: Live, Work, Shop, and Play. Data from the 5 years before 2010 were used for the development and training of the metrics and model, and data from 2010 onward were used for testing and validation. This work enabled DSA to routinely track these strategic metrics, use them to monitor the vitality of Downtown Seattle, prioritize improvements, and identify new value-added programs. As a result, the four-pillar approach became an integral part of the data-driven decision-making and execution of the Seattle community's improvement activities. The approach described in this case study is actionable, robust, inexpensive

  3. Comparison between Windowed FFT and Hilbert-Huang Transform for Analyzing Time Series with Poissonian Fluctuations: A Case Study

    Institute of Scientific and Technical Information of China (English)

    Dong Han; Shuang-Nan Zhang

    2006-01-01

    Hilbert-Huang Transform (HHT) is a novel data analysis technique for nonlinear and non-stationary data. We present a time-frequency analysis of both simulated light curves and an X-ray burst from the X-ray burster 4U 1702-429 with both the HHT and the Windowed Fast Fourier Transform (WFFT) methods. Our results show that the HHT method fails in all cases for light curves with Poissonian fluctuations, which are typical for all photon-counting instruments used in astronomy, whereas the WFFT method can sensitively detect periodic signals in the presence of Poissonian fluctuations; the only drawback of the WFFT method is that it cannot accurately detect sharp frequency variations.
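
    The WFFT approach can be sketched directly: segment the Poisson-count light curve, window each segment, and average the periodograms, so that a weak coherent pulsation stands out above the flat Poisson noise floor. A self-contained simulation (all parameters invented for illustration; not the paper's data):

        import numpy as np

        rng = np.random.default_rng(42)

        # Poisson light curve with a 5 Hz, 20% pulsed component.
        dt, total = 0.01, 100.0
        t = np.arange(0.0, total, dt)
        rate = 100.0 * (1.0 + 0.2 * np.sin(2.0 * np.pi * 5.0 * t))  # counts/s
        counts = rng.poisson(rate * dt)

        seg = 1024
        window = np.hanning(seg)
        power = np.zeros(seg // 2 + 1)
        for i in range(counts.size // seg):
            chunk = counts[i * seg:(i + 1) * seg]
            power += np.abs(np.fft.rfft(window * (chunk - chunk.mean()))) ** 2
        freqs = np.fft.rfftfreq(seg, dt)
        print(f"peak near {freqs[1:][np.argmax(power[1:])]:.2f} Hz")  # ~5 Hz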

  4. Using the social structure of markets as a framework for analyzing vaccination debates: The case of emergency polio vaccination.

    Science.gov (United States)

    Connelly, Yaron; Ziv, Arnona; Goren, Uri; Tal, Orna; Kaplan, Giora; Velan, Baruch

    2016-07-02

    The framework of the social structure of markets was used to analyze an online debate revolving around an emergency poliovirus vaccination campaign in Israel. Examination of a representative sample of 200 discussions revealed the activity of three parties: authoritative agents promoting vaccination and alternative agents promoting anti-vaccination, both representing sellers, and the impartial agents, representing the customers: the general public deliberating whether or not to comply with vaccination. Both sellers interacted with consumers using mechanisms of luring and convincing. The authoritative agents conveyed their message by exhibiting professionalism, building trust and offering to share information. The alternative agents spread doubts and evoked negative emotions of distrust and fear. Among themselves, the alternative agents strived to discredit the authoritative agents, while the latter preferred to ignore the former. Content analysis of discussions conducted by the general public reveals reiteration of the messages conveyed by the sellers, implying that the transaction of pro- and anti-vaccination ideas indeed took place. We suggest that the framework of the market as a social structure can be applied to the analysis of other vaccination debates, and thereby provide additional insights into vaccination polemics.

  5. Analyzing small data sets using Bayesian estimation: the case of posttraumatic stress symptoms following mechanical ventilation in burn survivors

    Directory of Open Access Journals (Sweden)

    Rens van de Schoot

    2015-03-01

    Full Text Available Background: The analysis of small data sets in longitudinal studies can lead to power issues and often suffers from biased parameter values. These issues can be solved by using Bayesian estimation in conjunction with informative prior distributions. By means of a simulation study and an empirical example concerning posttraumatic stress symptoms (PTSS) following mechanical ventilation in burn survivors, we demonstrate the advantages and potential pitfalls of using Bayesian estimation. Methods: First, we show how to specify prior distributions and, by means of a sensitivity analysis, how to check the exact influence of the prior (mis)specification. Thereafter, we show by means of a simulation in which situations the Bayesian approach outperforms the default maximum likelihood approach. Finally, we re-analyze empirical data on burn survivors which provided preliminary evidence of an aversive influence of a period of mechanical ventilation on the course of PTSS following burns. Results: Not surprisingly, maximum likelihood estimation showed insufficient coverage as well as power with very small samples. Only when Bayesian analysis was used in conjunction with informative priors did power increase to acceptable levels. As expected, we showed that the smaller the sample size, the more the results rely on the prior specification. Conclusion: We show that two issues often encountered during the analysis of small samples, power and biased parameters, can be solved by including prior information in a Bayesian analysis. We argue that the use of informative priors should always be reported together with a sensitivity analysis.
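
    The mechanics of how an informative prior rescues a small sample can be seen in the simplest conjugate case: for a normal mean with known noise SD, the posterior precision is the sum of the prior and data precisions. A minimal sketch (toy numbers, not the PTSS data or the paper's model):

        import numpy as np

        def posterior_normal(data, prior_mean, prior_sd, noise_sd):
            """Conjugate normal-normal update for a mean with known noise SD."""
            prior_prec = 1.0 / prior_sd ** 2
            data_prec = len(data) / noise_sd ** 2
            post_var = 1.0 / (prior_prec + data_prec)
            post_mean = post_var * (prior_prec * prior_mean + data_prec * np.mean(data))
            return post_mean, np.sqrt(post_var)

        rng = np.random.default_rng(3)
        sample = rng.normal(1.0, 2.0, size=8)  # a very small sample

        for label, m0, s0 in [("informative", 0.8, 0.5), ("diffuse", 0.0, 100.0)]:
            mean, sd = posterior_normal(sample, m0, s0, noise_sd=2.0)
            print(f"{label:11s}: {mean:.2f} +/- {sd:.2f}")  # informative prior narrows the posterior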

  6. Analyzing and Confirming the Model of innovation in food products (Case Study: Jaam Food Industry Group and Golden zar)

    Directory of Open Access Journals (Sweden)

    Ramzan Gholami

    2016-09-01

    Full Text Available In the era of the knowledge-based economy, a rapidly changing and uncertain business environment means that companies' biggest challenge is how to exploit the current market and achieve a competitive advantage. Some research suggests that innovation is the most important tool for companies to maintain a competitive advantage; this research therefore seeks to measure and confirm a model of innovation. The research population comprised the 2250 employees of the Jaam Food Industry Group and Golden Zar; using Cochran's formula, a sample of 250 individuals was determined. Composite reliability (0.92) was used to establish reliability, and factor analysis was used to test the validity of the questionnaire. Structural equation modeling with the LISREL and PLS software was used to analyze the data. The results showed that the innovation evaluation model fitted well, and that the indicators needed to measure innovation span research and development, investment in knowledge, human resources, innovation policies, innovation performance, technology and the associated information flows in the global economy, and productivity and trade classifications; the effect of each of these indicators was found to be positive.

  7. Analyzing Orientations

    Science.gov (United States)

    Ruggles, Clive L. N.

    Archaeoastronomical field survey typically involves the measurement of structural orientations (i.e., orientations along and between built structures) in relation to the visible landscape and particularly the surrounding horizon. This chapter focuses on the process of analyzing the astronomical potential of oriented structures, whether in the field or as a desktop appraisal, with the aim of establishing the archaeoastronomical "facts". It does not address questions of data selection (see instead Chap. 25, "Best Practice for Evaluating the Astronomical Significance of Archaeological Sites", 10.1007/978-1-4614-6141-8_25) or interpretation (see Chap. 24, "Nature and Analysis of Material Evidence Relevant to Archaeoastronomy", 10.1007/978-1-4614-6141-8_22). The main necessity is to determine the azimuth, horizon altitude, and declination in the direction "indicated" by any structural orientation. Normally, there are a range of possibilities, reflecting the various errors and uncertainties in estimating the intended (or, at least, the constructed) orientation, and in more formal approaches an attempt is made to assign a probability distribution extending over a spread of declinations. These probability distributions can then be cumulated in order to visualize and analyze the combined data from several orientations, so as to identify any consistent astronomical associations that can then be correlated with the declinations of particular astronomical objects or phenomena at any era in the past. The whole process raises various procedural and methodological issues and does not proceed in isolation from the consideration of corroborative data, which is essential in order to develop viable cultural interpretations.
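
    The azimuth/altitude-to-declination step is a single spherical-astronomy identity, sin(dec) = sin(lat)sin(alt) + cos(lat)cos(alt)cos(az); a minimal sketch with a hypothetical survey reading (ignoring refraction and measurement error, which the formal analysis propagates):

        import math

        def declination_deg(azimuth_deg, altitude_deg, latitude_deg):
            """Declination indicated by an orientation (azimuth from true north)."""
            az, alt, lat = map(math.radians, (azimuth_deg, altitude_deg, latitude_deg))
            s = math.sin(lat) * math.sin(alt) + math.cos(lat) * math.cos(alt) * math.cos(az)
            return math.degrees(math.asin(s))

        # e.g. azimuth 120 deg, horizon altitude 2 deg, site latitude 52 N
        print(f"{declination_deg(120.0, 2.0, 52.0):+.1f} deg")  # ~ -16 deg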

  8. Using Logistic Regression to Analyze the Balance of a Game: The Case of StarCraft II

    CERN Document Server

    Yun, Hyokun

    2011-01-01

    Recently, the market for online games has been growing astonishingly fast, and so has the importance of good game design. In online games a human user usually competes with others, so the fairness of the game system to all users is of great importance if the game is not to lose their interest. Furthermore, the emergence and success of electronic sports (e-sports) and professional gaming, in which specially talented gamers compete with one another, draws more attention to whether they are competing in a fair environment. No matter how fierce the debates are in the game-design community, it is rarely the case that one employs statistical analysis to answer this question seriously. But considering the fact that we can easily gather large amounts of user behavior data on games, it seems potentially beneficial to make use of these data to aid decisions on game-design problems. Actually, modern games do not aim to perfectly design the game at once: rather, they first release the game, and then monitor use...
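
    The balance question maps naturally onto a logistic regression of match outcome on a matchup indicator plus controls such as rating difference: a matchup coefficient near zero indicates a balanced pairing. A synthetic sketch (invented data, not actual StarCraft II records or the paper's model specification):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000
        matchup = rng.integers(0, 2, n)        # 1 = the matchup of interest
        skill_diff = rng.normal(0.0, 1.0, n)   # standardised rating difference
        logit = 0.15 * matchup + 0.9 * skill_diff  # planted 0.15 log-odds edge
        win = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = np.column_stack([matchup, skill_diff])
        model = LogisticRegression().fit(X, win)
        print("matchup log-odds:", round(model.coef_[0][0], 3))  # ~0.15 -> imbalance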

  9. Analyzing historical land use changes using a Historical Land Use Reconstruction Model: a case study in Zhenlai County, northeastern China.

    Science.gov (United States)

    Yang, Yuanyuan; Zhang, Shuwen; Liu, Yansui; Xing, Xiaoshi; de Sherbinin, Alex

    2017-01-30

    Historical land use information is essential to understanding the impact of anthropogenic modification of land use/cover on the temporal dynamics of environmental and ecological issues. However, due to a lack of spatial explicitness, complete thematic detail and conversion types for historical land use changes, the majority of historical land use reconstructions do not sufficiently meet the requirements of an adequate model. Considering these shortcomings, we explored the possibility of constructing a spatially explicit modeling framework (HLURM: Historical Land Use Reconstruction Model). A three-map comparison method was then adopted to validate the projected reconstruction map. The reconstruction suggested that the HLURM model performed well in the spatial reconstruction of various land-use categories, and had a higher figure of merit (48.19%) than models used in other case studies. The largest land use/cover type in the study area was determined to be grassland, followed by arable land and wetland. Using the three-map comparison, we noticed that the major discrepancies in land use changes among the three maps were a result of inconsistencies in the classification of land-use categories during the study period, rather than of the simulation model.
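
    A figure of merit of this kind is the ratio of correctly simulated change to the union of observed and simulated change. A small sketch, with toy cell counts contrived only to reproduce the quoted 48.19% (the study's actual counts are not given in the abstract):

        def figure_of_merit(hits, misses, false_alarms, wrong_hits=0):
            """Hits / (hits + misses + false alarms + wrong-category hits)."""
            return hits / (hits + misses + false_alarms + wrong_hits)

        print(f"{figure_of_merit(4800, 2600, 2400, 160):.2%}")  # 48.19%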

  10. A Study to Identify, Assess & Analyze the Incidence of Poisoning Cases in a Tertiary Care Teaching Hospital at Davangere, Karnataka

    Directory of Open Access Journals (Sweden)

    Baishnab S

    2016-06-01

    Full Text Available A poison is any substance that causes a harmful effect when administered either accidentally or intentionally. In India, as agriculture is the main occupation, pesticides are used to a great extent and poisoning with such products is far more common. The objective was to identify and assess the incidence of accidental or intentional poisoning and also to assess the relation between socio-economic factors and poisoning. This prospective cohort study was conducted in the departments of medicine, paediatrics, emergency and ICU of a tertiary care teaching hospital for a period of 6 months. A total of 150 cases were collected and categorized into different classes based on the type of poisoning agent. Organophosphates accounted for the most cases, 31.3% (n=47), followed by snake bite, 20% (n=30). Male predominance was seen, 58.7% (n=88) compared with 41.3% (n=62) female. People of low socio-economic status were more prone to poisoning, 54.7% (n=82), and rural people accounted for more cases, 54.7% (n=82), than urban and sub-urban people. Literacy status showed that 78.7% (n=118) were literate. The incidence of poisoning was higher in married subjects, 50.7% (n=76). With regard to occupation, farmers were the most affected, 30.7% (n=46). The study highlighted the lacunae in poisoning information services in hospitals. Clinical pharmacists' involvement can improve the identification of poisons and toxicity rating.

  11. Reflections of ions in electrostatic analyzers: a case study with New Horizons/Solar Wind Around Pluto.

    Science.gov (United States)

    Randol, B M; Ebert, R W; Allegrini, F; McComas, D J; Schwadron, N A

    2010-11-01

    Electrostatic analyzers (ESAs), in various forms, are used to measure plasma in a range of applications. In this article, we describe how ions reflect from the interior surfaces of an ESA, the detection of which constitutes a fundamentally nonideal response of ESAs. We demonstrate this effect by comparing laboratory data from a real ESA-based space instrument, the Solar Wind Around Pluto (SWAP) instrument, aboard the NASA New Horizons spacecraft, to results from a model based on quantum mechanical simulations of particles reflected from the instrument's surfaces combined with simulations of particle trajectories through the instrument's applied electrostatic fields. Thus, we show, for the first time, how reflected ions in ESAs lead to nonideal effects that have important implications for understanding the data returned by these instruments, as well as for designing new low-background ESA-based instruments. Specifically, we show that the response of SWAP widens considerably below a level of 10(-3) of the peak response. Thus, a direct measurement of a plasma distribution with SWAP will have an energy-dependent background on the order of ≤10(-3) of the peak of the signal due to that distribution. We predict that this order of magnitude estimate for the background applies to a large number of ESA-based instruments because ESAs operate using a common principle. However, the exact shape of the energy-dependent response will be different for different instruments. The principle of operation is that ions outside the ideal range of energy-per-charge are deflected into the walls of the ESA. Therefore, we propose that a new design paradigm is necessary to mitigate the effect of ion reflections and thus accurately and directly measure the energy spectrum of a plasma using ESAs. In this article, we build a framework for minimizing the effect of ion reflections in the design of new ESAs. Through the use of existing computer simulation software, a design team can use our method

  12. Reflections of ions in electrostatic analyzers: A case study with New Horizons/Solar Wind Around Pluto

    Energy Technology Data Exchange (ETDEWEB)

    Randol, B. M.; Ebert, R. W. [Department of Physics and Astronomy, University of Texas at San Antonio, San Antonio, Texas 78229 (United States); Space Science and Engineering Division, Southwest Research Institute, San Antonio, Texas 78228 (United States); Allegrini, F.; McComas, D. J. [Space Science and Engineering Division, Southwest Research Institute, San Antonio, Texas 78228 (United States); Department of Physics and Astronomy, University of Texas at San Antonio, San Antonio, Texas 78229 (United States); Schwadron, N. A. [Department of Astronomy, Boston University, Boston, Massachusetts 02215 (United States)

    2010-11-15

    Electrostatic analyzers (ESAs), in various forms, are used to measure plasma in a range of applications. In this article, we describe how ions reflect from the interior surfaces of an ESA, the detection of which constitutes a fundamentally nonideal response of ESAs. We demonstrate this effect by comparing laboratory data from a real ESA-based space instrument, the Solar Wind Around Pluto (SWAP) instrument, aboard the NASA New Horizons spacecraft, to results from a model based on quantum mechanical simulations of particles reflected from the instrument's surfaces combined with simulations of particle trajectories through the instrument's applied electrostatic fields. Thus, we show, for the first time, how reflected ions in ESAs lead to nonideal effects that have important implications for understanding the data returned by these instruments, as well as for designing new low-background ESA-based instruments. Specifically, we show that the response of SWAP widens considerably below a level of 10^-3 of the peak response. Thus, a direct measurement of a plasma distribution with SWAP will have an energy-dependent background on the order of ≤10^-3 of the peak of the signal due to that distribution. We predict that this order of magnitude estimate for the background applies to a large number of ESA-based instruments because ESAs operate using a common principle. However, the exact shape of the energy-dependent response will be different for different instruments. The principle of operation is that ions outside the ideal range of energy-per-charge are deflected into the walls of the ESA. Therefore, we propose that a new design paradigm is necessary to mitigate the effect of ion reflections and thus accurately and directly measure the energy spectrum of a plasma using ESAs. In this article, we build a framework for minimizing the effect of ion reflections in the design of new ESAs. Through the use of existing computer simulation software, a design team
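
    The energy-per-charge selection principle described above is easy to make concrete. Below is a deliberately idealized pass/fail sketch of an ESA passband in Python; the analyzer constant, plate voltage and fractional bandwidth are invented for illustration and are not SWAP's actual parameters.

      # Idealized ESA transmission: an ion passes when its energy-per-charge
      # lies in a narrow band around k*V; otherwise it is deflected into the
      # walls, where (as the article shows) it may reflect rather than be lost.
      K_ANALYZER = 10.0   # analyzer constant set by plate geometry (assumed)
      V_PLATE = 200.0     # applied plate voltage in volts (assumed)
      BANDWIDTH = 0.08    # fractional E/q passband dE/E (assumed)

      def transmitted(e_per_q_ev):
          """True if an ion with this energy-per-charge (eV per charge) passes."""
          center = K_ANALYZER * V_PLATE
          return abs(e_per_q_ev - center) <= 0.5 * BANDWIDTH * center

      for e_per_q in (1850.0, 2000.0, 2150.0):
          print(e_per_q, transmitted(e_per_q))   # only the on-band ion passes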

  13. Analyzing the biophysical inputs and outputs embodied in global commodity chains - the case of Israeli meat consumption

    Directory of Open Access Journals (Sweden)

    Shira Dickler

    2014-11-01

    Full Text Available The prevailing global livestock industry relies heavily on natural capital and is responsible for high emissions of greenhouse gases (GHG. In recent years, nations have begun to take more of an active role in measuring their resource inputs and GHG outputs for various products. However, up until now, most nations have been recording data for production, focusing on processes within their geographical boundaries. Some recent studies have suggested the need to also embrace a consumption-based approach. It follows that in an increasingly globalized interconnected world, to be able to generate a sustainable food policy, a full systems approach should be embraced. The case of Israeli meat consumption presents an interesting opportunity for analysis, as the country does not have sufficient resources or the climatic conditions needed to produce enough food to support its population. Therefore, Israel, like a growing number of other countries that are dependent on external resources, relies on imports to meet demand, displacing the environmental impact of meat consumption to other countries. This research utilizes a multi-regional consumption perspective, aiming to measure the carbon and land footprints demanded by Israeli cattle and chicken meat consumption, following both domestic production and imports of inputs and products. The results of this research show that the “virtual land” required for producing meat for consumption in Israel is equivalent to 62% of the geographical area of the country. Moreover, almost 80% of meat consumption is provided by locally produced chicken products but the ecological impact of this source is inconsequential compared to the beef supply chain; beef imports comprise only 13% of meat consumption in Israel but are responsible for 71% of the carbon footprint and 83% of the land footprint. The sources of Israel’s meat supply are currently excluded from environmental impact assessments of Israeli processes. However
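
    As a minimal sketch of the consumption-based accounting used in such studies, the Python snippet below multiplies demand by a per-product emission intensity; the quantities and intensities are invented, and only the structure of the result (a small but carbon-intensive beef share dominating the footprint) mirrors the finding reported above.

      # Consumption-based footprint: footprint_i = demand_i * intensity_i.
      # All numbers are illustrative, not the study's data.
      demand_kt = {"chicken (domestic)": 400.0, "beef (imported)": 60.0}  # kt/yr
      intensity = {"chicken (domestic)": 6.0, "beef (imported)": 70.0}    # kt CO2e/kt

      footprint = {k: demand_kt[k] * intensity[k] for k in demand_kt}
      total_demand = sum(demand_kt.values())
      total_fp = sum(footprint.values())
      for k in demand_kt:
          print(f"{k}: {100 * demand_kt[k] / total_demand:.0f}% of demand, "
                f"{100 * footprint[k] / total_fp:.0f}% of carbon footprint")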

  14. Using MOD16 products for analyzing evapotranspiration and evaporation on the surface of lakes. Case studies in Romania

    Science.gov (United States)

    Stan, Florentina; Madelin, Malika; Zaharia, Liliana

    2017-04-01

    analyze the spatial distribution of PET on the surface of the studied lakes, based on the data derived from satellite images, using GIS analysis functions. The results indicate a strong correlation (R² up to 0.85) between E measured on the surface of the lakes and PET, and a lower correlation between E and ET (R² up to 0.31), based on eight-day cumulative data. Results improve further when aggregated to the annual time scale (e.g., for 2010 only): for the relation between E and PET, R² reaches 0.90, and for E and ET, R² reaches 0.92. Concerning the spatial distribution of PET on the lake surfaces, maximum values occur where the lakes are widest. Given the strong correlation between the satellite PET product and E, this relationship could be used in the future to estimate evaporation for unmonitored lakes. Considering the low spatial resolution of the MOD16 products, possible errors related to land cover around a lake should be taken into account, especially if the lakes are small. Keywords: evapotranspiration, evaporation, lakes, MODIS, MOD16 products, Romania.
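
    A hedged sketch of the correlation analysis reported above: the eight-day cumulative E and PET series are invented, and scipy's linregress stands in for whatever regression tooling the authors used.

      import numpy as np
      from scipy.stats import linregress

      # Hypothetical eight-day cumulative series over one lake: measured
      # evaporation E (mm) and the MOD16 PET product for the same composites.
      E = np.array([18.2, 25.4, 31.0, 40.3, 38.9, 29.5, 21.1, 15.8])
      PET = np.array([20.1, 27.9, 35.2, 44.0, 42.5, 31.8, 24.0, 17.3])

      res = linregress(PET, E)
      print(f"R^2 = {res.rvalue**2:.2f}")                        # fit strength
      print(f"E ≈ {res.slope:.2f} * PET + {res.intercept:.2f}")  # transfer function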

  15. Analyzing ground ozone formation regimes using a principal axis factoring method: A case study of Kladno (Czech Republic) industrial area

    Energy Technology Data Exchange (ETDEWEB)

    Malec, L.; Skacel, F. [Department of Gas, Coke and Air Protection, Institute of Chemical Technology in Prague, (Czech Republic)]. E-mail: Lukas.Malec@vscht.cz; Fousek, T. [Institute of Public Health, District of Central Czech Republic, Kladno (Czech Republic); Tekac, V. [Department of Gas, Coke and Air Protection, Institute of Chemical Technology in Prague, (Czech Republic); Kral, P. [Institute of Public Health, District of Central Czech Republic, Kladno (Czech Republic)

    2008-07-15

    Tropospheric ozone is a secondary air pollutant, changes in the ambient content of which are affected by both the emission rates of primary pollutants and the variability of meteorological conditions. In this paper, we use two multivariate statistical methods to analyze the impact of the meteorological conditions associated with pollutant transformation processes. First, we evaluated the variability of the spatial and temporal distribution of ozone precursor parameters by using discriminant analysis (DA) in locations close to the industrial area of Kladno (a city in the Czech Republic). Second, we interpreted the data set by using factor analysis (FA) to examine the differences between ozone formation processes in summer and in winter. To avoid temperature dependency between the variables, as well as to describe tropospheric washout processes, we used water vapour content rather than the more commonly employed relative humidity parameter. In this way, we were able to determine and subsequently evaluate the various processes of ozone formation, together with the distribution of ozone precursors. High air temperature, radiation and low water content relate to summer pollution episodes, while radiation and wind speed prove to be the most important parameters during winter.
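
    To make the factor-analysis step concrete, here is a small synthetic sketch. The data and loadings are invented, and scikit-learn's maximum-likelihood FactorAnalysis is used as a stand-in for the principal axis factoring applied in the paper.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      # Synthetic "summer" data set: two latent factors generate six observed
      # variables, loosely mimicking a photochemical regime in which high
      # temperature/radiation and low water-vapour content accompany ozone.
      rng = np.random.default_rng(1)
      n = 200
      photochem = rng.normal(size=n)   # latent photochemical factor
      transport = rng.normal(size=n)   # latent transport factor
      noise = lambda: rng.normal(scale=0.5, size=n)
      X = np.column_stack([
          2.0 * photochem + noise(),    # air temperature
          1.8 * photochem + noise(),    # radiation
          -1.2 * photochem + noise(),   # water vapour content
          1.5 * transport + noise(),    # wind speed
          -1.0 * transport + noise(),   # NOx
          1.6 * photochem + noise(),    # ozone
      ])

      fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
      print(np.round(fa.components_, 2))  # loadings: rows=factors, cols=variables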

  16. Single-shot full strain tensor determination with microbeam X-ray Laue diffraction and a two-dimensional energy-dispersive detector.

    Science.gov (United States)

    Abboud, A; Kirchlechner, C; Keckes, J; Conka Nurdan, T; Send, S; Micha, J S; Ulrich, O; Hartmann, R; Strüder, L; Pietsch, U

    2017-06-01

    The full strain and stress tensor determination in a triaxially stressed single crystal using X-ray diffraction requires a series of lattice spacing measurements at different crystal orientations. This can be achieved using a tunable X-ray source. This article reports on a novel experimental procedure for single-shot full strain tensor determination using polychromatic synchrotron radiation with an energy range from 5 to 23 keV. Microbeam X-ray Laue diffraction patterns were collected from a copper micro-bending beam along the central axis (centroid of the cross section). Taking advantage of a two-dimensional energy-dispersive X-ray detector (pnCCD), the position and energy of the collected Laue spots were measured for multiple positions on the sample, allowing the measurement of variations in the local microstructure. At the same time, both the deviatoric and hydrostatic components of the elastic strain and stress tensors were calculated.
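
    For a flavour of the underlying arithmetic: once a Laue spot's scattering angle (from its detector position) and photon energy (from the pnCCD) are known, Bragg's law gives the lattice spacing, and comparison with the unstrained spacing yields one strain component. The sketch below uses invented spot values for a Cu (220)-type reflection; assembling the full tensor requires combining several such spots, which the snippet does not attempt.

      import numpy as np

      HC_KEV_A = 12.39842   # h*c in keV·Å

      def d_spacing(energy_kev, two_theta_deg):
          """Lattice spacing from Bragg's law, lambda = 2*d*sin(theta),
          with lambda = hc/E for a photon of energy E."""
          theta = np.radians(two_theta_deg) / 2.0
          return HC_KEV_A / (2.0 * energy_kev * np.sin(theta))

      d0 = 3.6149 / np.sqrt(8)   # unstrained Cu (220) spacing, a = 3.6149 Å
      d = d_spacing(energy_kev=7.180, two_theta_deg=85.0)   # invented spot
      strain = (d - d0) / d0     # strain along this reflection's normal
      print(f"d = {d:.5f} Å, strain = {strain:.1e}")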

  17. A Study on Effective Evaluation Methods for Analyzing Effects of Nature Experiential Study : Through Case Studies on Evaluations of Geological Field Study

    OpenAIRE

    宮下, 治||ミヤシタ, オサム||Miyashita, Osamu

    2008-01-01

    In this paper, I examined evaluation methods through case studies in order to find effective methods for analyzing the effects of nature experiential study on students. I practiced four evaluation methods, which were often used in previous studies for the evaluation of nature experiential study: (1) evaluation by the contents of reports written by students, (2) evaluation by students' learning activity records, (3) evaluation by students' essays after classes, (4) evaluation by results of pr...

  18. A new method for polychromatic X-ray μLaue diffraction on a Cu pillar using an energy-dispersive pn-junction charge-coupled device.

    Science.gov (United States)

    Abboud, A; Kirchlechner, C; Send, S; Micha, J S; Ulrich, O; Pashniak, N; Strüder, L; Keckes, J; Pietsch, U

    2014-11-01

    μLaue diffraction with a polychromatic X-ray beam can be used to measure strain fields and crystal orientations of microcrystals. The hydrostatic strain tensor can be obtained once the energy profile of the reflections is measured; however, this remains challenging in terms of both measurement time and the reproducibility of the beam position on the sample. In this article, we present a new approach to obtain the spatial and energy profiles of Laue spots by using a pn-junction charge-coupled device, an energy-dispersive area detector providing 3D resolution of incident X-rays. The morphology and energetic structure of various Bragg peaks from a single-crystalline Cu micro-cantilever used as a test system were simultaneously acquired. The method facilitates the determination of the Laue spots' energy spectra without filtering the white X-ray beam. The synchrotron experiment was performed at the BM32 beamline of the ESRF using polychromatic X-rays in the energy range between 5 and 25 keV and a beam size of 0.5 μm × 0.5 μm. The feasibility test on this well known system demonstrates the capabilities of the approach and introduces the "3D detector method" as a promising tool for material investigations to separate bending and strain in technical materials.

  19. Wake monochromator in asymmetric and symmetric Bragg and Laue geometry for self-seeding the European X-ray FEL

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni; Serkez, Svitozar; Tolkiehn, Martin

    2013-01-01

    We discuss the use of self-seeding schemes with wake monochromators to produce TW-power, fully coherent pulses for applications at the dedicated bio-imaging beamline at the European X-ray FEL, a concept for an upgrade of the facility beyond the baseline previously proposed by the authors. We exploit the asymmetric and symmetric Bragg and Laue reflections (sigma polarization) in a diamond crystal. Optimization of the bio-imaging beamline is performed with extensive start-to-end simulations, which also take into account effects such as the spatio-temporal coupling caused by the wake monochromator. The spatial shift is largest for small Bragg angles. A geometry with Bragg angles close to π/2 would be a more advantageous option from this viewpoint, albeit with a decrease of the spectral tunability. We show that it will be possible to cover the photon energy range from 3 keV to 13 keV by using four different planes of the same crystal with one rotational degree of freedom.

  20. Ewald: an extended wide-angle Laue diffractometer for the second target station of the Spallation Neutron Source.

    Science.gov (United States)

    Coates, Leighton; Robertson, Lee

    2017-08-01

    Visualizing hydrogen atoms in biological materials is one of the biggest remaining challenges in biophysical analysis. While X-ray techniques have unrivaled capacity for high-throughput structure determination, neutron diffraction is uniquely sensitive to hydrogen atom positions in crystals of biological materials and can provide a more complete picture of the atomic and electronic structures of biological macromolecules. This information can be essential in providing predictive understanding and engineering control of key biological processes, for example, in catalysis, ligand binding and light harvesting, and to guide bioengineering of enzymes and drug design. One very common and large capability gap for all neutron atomic resolution single-crystal diffractometers is the weak flux of available neutron beams, which results in limited signal-to-noise ratios giving a requirement for sample volumes of at least 0.1 mm(3). The ability to operate on crystals an order of magnitude smaller (0.01 mm(3)) will open up new and more complex systems to studies with neutrons which will help in our understanding of enzyme mechanisms and enable us to improve drugs against multi resistant bacteria. With this is mind, an extended wide-angle Laue diffractometer, 'Ewald', has been designed, which can collect data using crystal volumes below 0.01 mm(3).

  1. Wake monochromator in asymmetric and symmetric Bragg and Laue geometry for self-seeding the European X-ray FEL

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni; Serkez, Svitozar; Tolkiehn, Martin [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-01-15

    We discuss the use of self-seeding schemes with wake monochromators to produce TW-power, fully coherent pulses for applications at the dedicated bio-imaging beamline at the European X-ray FEL, a concept for an upgrade of the facility beyond the baseline previously proposed by the authors. We exploit the asymmetric and symmetric Bragg and Laue reflections (sigma polarization) in a diamond crystal. Optimization of the bio-imaging beamline is performed with extensive start-to-end simulations, which also take into account effects such as the spatio-temporal coupling caused by the wake monochromator. The spatial shift is largest for small Bragg angles. A geometry with Bragg angles close to π/2 would be a more advantageous option from this viewpoint, albeit with a decrease of the spectral tunability. We show that it will be possible to cover the photon energy range from 3 keV to 13 keV by using four different planes of the same crystal with one rotational degree of freedom.
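
    The quoted 3 keV lower bound follows from Bragg's law at backscattering: E = hc/(2 d sin θ) is minimized at θ = 90°, and diamond (111) with d ≈ 2.06 Å gives E ≈ 3 keV. The sketch below evaluates this for four low-index diamond plane families; the abstract does not name the four planes, so this particular set is an assumption chosen for illustration.

      import math

      HC_KEV_A = 12.39842   # h*c in keV·Å
      A_DIAMOND = 3.5668    # diamond lattice constant in Å

      def bragg_energy(h, k, l, theta_deg):
          """Photon energy selected by the diamond (hkl) planes at angle theta."""
          d = A_DIAMOND / math.sqrt(h * h + k * k + l * l)
          return HC_KEV_A / (2.0 * d * math.sin(math.radians(theta_deg)))

      # The minimum reachable energy for each plane family is at backscattering.
      for hkl in [(1, 1, 1), (2, 2, 0), (3, 1, 1), (4, 0, 0)]:
          print(hkl, f"E(theta=90°) = {bragg_energy(*hkl, 90.0):.2f} keV")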

  2. Crystallography using synchrotron radiation X-ray. Application of Weissenberg and time resolved Laue methods to macromolecular structure analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sakabe, Noriyoshi [National Lab. for High Energy Physics, Tsukuba, Ibaraki (Japan)

    1994-12-31

    The three-dimensional structures of macromolecules under static and dynamic conditions are very important for the study of molecular biology. X-ray crystallography is the most powerful tool for obtaining the three-dimensional structures of macromolecules, especially large ones, for which synchrotron radiation X-rays are used. The collection of diffraction data is the first and most important step in crystal structure analysis. Efforts have been made to establish a data collection system using monochromatic synchrotron radiation X-rays (SRX). A diffraction intensity data collection system combining a newly developed Weissenberg camera for macromolecules with an image plate (IP), using SRX, has been established at the Photon Factory. Many important biological structures have already been determined at high resolution with this data collection system, which is also used for the study of enzymatic reaction mechanisms. A time-resolved Laue camera has been designed, and a preliminary experiment has been carried out at the Photon Factory. These systems are reported. (K.I.).

  3. Analyzing Network Bank Fraud Cases and Their Prevention

    Institute of Scientific and Technical Information of China (English)

    纪芳; 薛亮

    2011-01-01

    This paper describes several typical Network Bank (online banking) fraud cases, analyzes the process of Network Bank fraud, and proposes some preventive measures and recommendations to raise alertness and protect personal property.

  4. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    Science.gov (United States)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty of applying existing statistical procedures. In this article, we describe a data-analytic method for analyzing univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher investigate whether an intervention works as compared with a baseline period or another intervention type, and determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
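
    The article implements its method in SPSS; as a loose analogue, the Python sketch below fits a common piecewise (level-change and slope-change) regression to invented baseline and intervention scores. It omits the autocorrelation handling that a rigorous single-case analysis requires, so treat it only as an illustration of the phase-comparison idea.

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical weekly symptom scores: 8 baseline, 10 intervention points.
      baseline = np.array([7, 8, 7, 9, 8, 8, 7, 8], dtype=float)
      treatment = np.array([7, 6, 6, 5, 5, 4, 4, 3, 3, 2], dtype=float)

      y = np.concatenate([baseline, treatment])
      phase = np.concatenate([np.zeros(baseline.size), np.ones(treatment.size)])
      time = np.arange(y.size, dtype=float)

      # Piecewise model testing a level change (phase) and a trend change
      # (phase*time) between baseline and intervention.
      X = sm.add_constant(np.column_stack([time, phase, phase * time]))
      fit = sm.OLS(y, X).fit()
      print(fit.params)    # intercept, baseline trend, level change, slope change
      print(fit.pvalues)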

  5. An upgraded experiment of X-ray photon-photon elastic scattering with a Laue-case beam collider

    CERN Document Server

    Yamaji, T; Yamazaki, T; Namba, T; Asai, S; Kobayashi, T; Tamasaku, K; Tanaka, Y; Inubushi, Y; Sawada, K; Yabashi, M; Ishikawa, T

    2016-01-01

    The new result of a photon-photon scattering experiment in the X-ray region is reported. An X-ray beam collider is used to divide and collide X-ray beams from an X-ray Free Electron Laser, SACLA. The sensitivity of the experiment is enhanced by an upgraded X-ray beam collider and improvement of the SACLA beam quality. The intensity of the colliding photon beams increased significantly, giving an integrated luminosity of (1.24 ± 0.08) × 10^28 m^-2. No signal of scattered X rays was observed. The obtained 95% C.L. limit on the QED cross section is 1.9 × 10^-27 m^2 at ω_cms = 6.5 keV, which is more stringent by around three orders of magnitude than our previous result.

  6. An experiment of X-ray photon–photon elastic scattering with a Laue-case beam collider

    Directory of Open Access Journals (Sweden)

    T. Yamaji

    2016-12-01

    Full Text Available We report a search for photon–photon elastic scattering in vacuum in the X-ray region at an energy in the center of mass system of ω_cms = 6.5 keV, for which the QED cross section is σ_QED = 2.5 × 10^-47 m^2. An X-ray beam provided by the SACLA X-ray Free Electron Laser is split and the two beamlets are made to collide at right angle, with a total integrated luminosity of (1.24 ± 0.08) × 10^28 m^-2. No signal X rays from the elastic scattering that satisfy the correlation between energy and scattering angle were detected. We obtain a 95% C.L. upper limit for the scattering cross section of 1.9 × 10^-27 m^2 at ω_cms = 6.5 keV. The upper limit is the lowest upper limit obtained so far by keV experiments.
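
    For intuition about how the quoted limit relates to the luminosity: with zero observed events, the 95% C.L. Poisson upper limit on the expected signal count is −ln(0.05) ≈ 3.0 events, and dividing by the integrated luminosity (and by the signal acceptance and efficiency, here a made-up value) gives a cross-section limit. The sketch reproduces the right order of magnitude; the published 1.9 × 10^-27 m^2 additionally folds in the experiment's actual acceptance and systematics.

      import math

      lum = 1.24e28            # integrated luminosity from the abstract, m^-2
      n_up = -math.log(0.05)   # ≈ 3.0 events: 95% C.L. Poisson upper limit
                               # when zero events are observed
      for eff in (1.0, 0.13):  # 1.0 = idealized; 0.13 is an invented
          sigma_ul = n_up / (lum * eff)   # acceptance-times-efficiency value
          print(f"eff = {eff:.2f}: sigma_UL ≈ {sigma_ul:.2e} m^2")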

  7. High-energy transmission Laue micro-beam X-ray diffraction: a probe for intra-granular lattice orientation and elastic strain in thicker samples.

    Science.gov (United States)

    Hofmann, Felix; Song, Xu; Abbey, Brian; Jun, Tea-Sung; Korsunsky, Alexander M

    2012-05-01

    An understanding of the mechanical response of modern engineering alloys to complex loading conditions is essential for the design of load-bearing components in high-performance safety-critical aerospace applications. A detailed knowledge of how material behaviour is modified by fatigue and the ability to predict failure reliably are vital for enhanced component performance. Unlike macroscopic bulk properties (e.g. stiffness, yield stress, etc.) that depend on the average behaviour of many grains, material failure is governed by `weakest link'-type mechanisms. It is strongly dependent on the anisotropic single-crystal elastic-plastic behaviour, local morphology and microstructure, and grain-to-grain interactions. For the development and validation of models that capture these complex phenomena, the ability to probe deformation behaviour at the micro-scale is key. The diffraction of highly penetrating synchrotron X-rays is well suited to this purpose and micro-beam Laue diffraction is a particularly powerful tool that has emerged in recent years. Typically it uses photon energies of 5-25 keV, limiting penetration into the material, so that only thin samples or near-surface regions can be studied. In this paper the development of high-energy transmission Laue (HETL) micro-beam X-ray diffraction is described, extending the micro-beam Laue technique to significantly higher photon energies (50-150 keV). It allows the probing of thicker sample sections, with the potential for grain-level characterization of real engineering components. The new HETL technique is used to study the deformation behaviour of individual grains in a large-grained polycrystalline nickel sample during in situ tensile loading. Refinement of the Laue diffraction patterns yields lattice orientations and qualitative information about elastic strains. After deformation, bands of high lattice misorientation can be identified in the sample. Orientation spread within individual scattering volumes is

  8. A narrative method for analyzing transitions in urban water management: The case of the Miami-Dade Water and Sewer Department

    Science.gov (United States)

    Treuer, Galen; Koebele, Elizabeth; Deslatte, Aaron; Ernst, Kathleen; Garcia, Margaret; Manago, Kim

    2017-01-01

    Although the water management sector is often characterized as resistant to risk and change, urban areas across the United States are increasingly interested in creating opportunities to transition toward more sustainable water management practices. These transitions are complex and difficult to predict - the product of water managers acting in response to numerous biophysical, regulatory, political, and financial factors within institutional constraints. Gaining a better understanding of how these transitions occur is crucial for continuing to improve water management. This paper presents a replicable methodology for analyzing how urban water utilities transition toward sustainability. The method combines standardized quantitative measures of variables that influence transitions with contextual qualitative information about a utility's unique decision making context to produce structured, data-driven narratives. Data-narratives document the broader context, the utility's pretransition history, key events during an accelerated period of change, and the consequences of transition. Eventually, these narratives should be compared across cases to develop empirically-testable hypotheses about the drivers of and barriers to utility-level urban water management transition. The methodology is illustrated through the case of the Miami-Dade Water and Sewer Department (WASD) in Miami-Dade County, Florida, and its transition toward more sustainable water management in the 2000s, during which per capita water use declined, conservation measures were enacted, water rates increased, and climate adaptive planning became the new norm.

  9. A GPS-Based Methodology to Analyze Environment-Health Associations at the Trip Level: Case-Crossover Analyses of Built Environments and Walking.

    Science.gov (United States)

    Chaix, Basile; Kestens, Yan; Duncan, Dustin T; Brondeel, Ruben; Méline, Julie; El Aarbaoui, Tarik; Pannier, Bruno; Merlo, Juan

    2016-10-15

    Environmental health studies have examined associations between context and health with individuals as statistical units. However, investigators have been unable to investigate momentary exposures, and such studies are often vulnerable to confounding from, for example, individual-level preferences. We present a Global Positioning System (GPS)-based methodology for segmenting individuals' observation periods into visits to places and trips, enabling novel life-segment investigations and case-crossover analysis for improved inferences. We analyzed relationships between built environments and walking in trips. Participants were tracked for 7 days with GPS receivers and accelerometers and surveyed with a Web-based mapping application about their transport modes during each trip (Residential Environment and Coronary Heart Disease (RECORD) GPS Study, France, 2012-2013; 6,313 trips made by 227 participants). Contextual factors were assessed around residences and the trips' origins and destinations. Conditional logistic regression modeling was used to estimate associations between environmental factors and walking or accelerometry-assessed steps taken in trips. In case-crossover analysis, the probability of walking during a trip was 1.37 (95% confidence interval: 1.23, 1.61) times higher when trip origin was in the fourth (vs. first) quartile of service density and 1.47 (95% confidence interval: 1.23, 1.68) times higher when trip destination was in the fourth (vs. first) quartile of service density. Green spaces at the origin and destination of trips were also associated with within-individual, trip-to-trip variations in walking. Our proposed approach using GPS and Web-based surveys enables novel life-segment epidemiologic investigations.
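
    A hedged sketch of the within-person comparison underlying such a case-crossover analysis: participants are strata, and a conditional (fixed-effects) logit relates walking to an exposure at the trip origin. The data are simulated with an arbitrary effect size, and statsmodels' ConditionalLogit (statsmodels ≥ 0.10) stands in for the authors' software.

      import numpy as np
      from statsmodels.discrete.conditional_models import ConditionalLogit

      # Simulated trip-level data: 100 people x 8 trips; outcome = walked (0/1),
      # exposure = service-density quartile (1-4) at the trip origin.
      rng = np.random.default_rng(0)
      n_person, n_trip = 100, 8
      person = np.repeat(np.arange(n_person), n_trip)
      quartile = rng.integers(1, 5, size=n_person * n_trip).astype(float)
      person_effect = np.repeat(rng.normal(0.0, 1.0, n_person), n_trip)
      logit = -1.0 + 0.3 * quartile + person_effect   # true OR = e^0.3 per quartile
      walked = (rng.random(person.size) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      # Conditioning on person removes stable individual preferences, the
      # confounding that the case-crossover design is meant to address.
      res = ConditionalLogit(walked, quartile.reshape(-1, 1), groups=person).fit()
      print(np.exp(res.params))   # estimated odds ratio per quartile increase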

  10. Experimental and theoretical study of diffraction properties of various crystals for the realization of a soft gamma-ray Laue lens

    CERN Document Server

    Barriere, Nicolas; von Ballmoos, Peter; Abrosimov, Nikolai V; Courtois, Pierre; Bastie, Pierre; Camus, Thierry; Jentschel, Michael; Kurlov, Vladimir N; Natalucci, Lorenzo; Roudil, Gilles; Brejnholt, Nicolai Frisch; Serre, Denis

    2009-01-01

    Crystals are the elementary constituents of Laue lenses, an emerging technology which could allow the realization of a space-borne telescope 10 to 100 times more sensitive than existing ones in the 100 keV - 1.5 MeV energy range. This study addresses the ongoing effort to develop efficient crystals for the realization of a Laue lens. In the theoretical part, 35 candidate crystals, both single-element and two-component, are considered. Their peak reflectivity at 100 keV, 500 keV and 1 MeV is calculated assuming they are mosaic crystals. The results show that a careful selection of crystals can provide a reflectivity above 30% over the whole energy range, even reaching 40% in its lower part. Experimentally, we concentrated on three different materials (Si_{1-x}Ge_x with gradient of composition, mosaic Cu and Au) that were measured at both ESRF and ILL using highly monochromatic beams ranging from 300 keV up to 816 keV. The aim was to check their homogeneity, quality and angular spread (mosaicity). These cryst...

  11. "Publish or Perish" as citation metrics used to analyze scientific output in the humanities: International case studies in economics, geography, social sciences, philosophy, and history.

    Science.gov (United States)

    Baneyx, Audrey

    2008-01-01

    Traditionally, the most commonly used source of bibliometric data is the Thomson ISI Web of Knowledge, in particular the (Social) Science Citation Index and the Journal Citation Reports, which provide the yearly Journal Impact Factors. This database, used for the evaluation of researchers, is not advantageous in the humanities, mainly because books, conference papers, and non-English journals, which are an important part of scientific activity, are not (well) covered. This paper presents the use of an alternative source of data, Google Scholar, and its benefits in calculating citation metrics in the humanities. Because of its broader range of data sources, the use of Google Scholar generally results in more comprehensive citation coverage in the humanities. This presentation compares and analyzes international case studies drawn from ISI Web of Knowledge and Google Scholar, focusing on the fields of economics, geography, social sciences, philosophy, and history to illustrate the differences in results between the two databases. To search for relevant publications in the Google Scholar database, the use of "Publish or Perish" and of CleanPoP, which the author developed to clean the results, are compared.
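
    Tools such as Publish or Perish derive indicators like the h-index from per-publication citation counts; a minimal sketch of that computation, on an invented citation list:

      def h_index(citations):
          """Largest h such that h publications have at least h citations each."""
          h = 0
          for rank, c in enumerate(sorted(citations, reverse=True), start=1):
              if c >= rank:
                  h = rank
              else:
                  break
          return h

      # Hypothetical counts for one author, e.g. as exported from Google Scholar.
      print(h_index([42, 18, 12, 9, 7, 4, 3, 1, 0]))   # -> 5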

  12. Analyzing an Integrated Planning Approach Among Planning Scale and Sector A Case Study of Malang City’s Vision as The City of Education

    Directory of Open Access Journals (Sweden)

    Akhmad Amirudin

    2014-04-01

    Full Text Available Integrated planning is increasingly needed by governments today because of the complexity of problems and limited resources. Integrated planning can address these problems by providing comprehensive solutions and by specifying how many resources are needed to reach the goal. An integrated planning approach is meant to provide better tools to guide actions towards the development of cities, the improvement of human conditions, and ultimately better urbanism. This research therefore focused on integrated planning in Malang City, covering the city's vision, strategic planning, operational planning and budget planning, towards achieving Malang City's vision as a city of education. The researcher used a qualitative, descriptive method, a research process that aims to describe exactly what happened during the research. The purpose was to identify, describe and analyze how the Malang City Planning Agency integrates other planning scales and sectors when developing plans, and how it integrates all stakeholders in the integrated planning process. A descriptive method was chosen because the principal objective was to describe, systematically, factually and accurately, the facts and the relationships between phenomena. The qualitative method was directed at individuals' backgrounds and treated them holistically: individuals or organizations were not isolated into variables or hypotheses but viewed as part of a whole. The case study of Malang City showed that various sectors recognized, but did not pay much attention to, Malang City's vision as a City of Education in their plans; however, the Regional Mid-term Development

  13. Mass spectrometry based lipid(ome) analyzer and molecular platform: a new software to interpret and analyze electrospray and/or matrix-assisted laser desorption/ionization mass spectrometric data of lipids: a case study from Mycobacterium tuberculosis.

    Science.gov (United States)

    Sabareesh, Varatharajan; Singh, Gurpreet

    2013-04-01

    Mass Spectrometry based Lipid(ome) Analyzer and Molecular Platform (MS-LAMP) is new software capable of aiding the interpretation of electrospray ionization (ESI) and/or matrix-assisted laser desorption/ionization (MALDI) mass spectrometric data of lipids. The graphical user interface (GUI) of this standalone programme is built using Perl::Tk. Two databases have been developed and incorporated within MS-LAMP, based on the Mycobacterium tuberculosis (M. tb) lipid database (www.mrl.colostate.edu) and that of the Lipid Metabolites and Pathways Strategy Consortium (LIPID MAPS; www.lipidmaps.org). Different types of queries entered through the GUI are run against a chosen database. Queries can be molecular mass(es) or mass-to-charge (m/z) value(s) and molecular formulae. A LIPID MAPS identifier can also be used to search, but not for M. tb lipids. Multiple options are provided to select diverse ion types and lipids. For entries satisfying the input parameters, the output presents the various lipid categories and their population distribution. Additionally, molecular structures of lipids in the output can be viewed using ChemSketch (www.acdlabs.com), which is linked to the programme. Furthermore, a version of MS-LAMP for the Linux operating system is separately available, in which PyMOL can be used to view molecular structures output by General Lipidome MS-LAMP. The utility of this software is demonstrated using ESI mass spectrometric data of lipid extracts of M. tb grown under two different pH (5.5 and 7.0) conditions.
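
    At its core, an m/z query of this kind matches measured values against adduct-shifted database masses within a tolerance. A minimal sketch, with an invented two-entry "database" (the adduct mass shifts are standard values):

      # Common adduct mass shifts in Da (standard values); the tiny lipid
      # "database" below is invented purely for illustration.
      ADDUCTS = {"[M+H]+": 1.00728, "[M+Na]+": 22.98922, "[M-H]-": -1.00728}
      LIPIDS = {"example lipid A": 1413.850, "example lipid B": 1415.400}

      def match_mz(mz, adducts, tol_ppm=10.0):
          """Return (lipid, adduct, predicted m/z) tuples within tol_ppm of mz."""
          hits = []
          for name, neutral_mass in LIPIDS.items():
              for adduct, shift in adducts.items():
                  pred = neutral_mass + shift
                  if abs(pred - mz) / pred * 1e6 <= tol_ppm:
                      hits.append((name, adduct, round(pred, 5)))
          return hits

      print(match_mz(1414.857, {"[M+H]+": ADDUCTS["[M+H]+"]}))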

  14. Advantages of 3D FEM numerical modeling over 2D, analyzed in a case study of transient thermal-hydraulic groundwater utilization

    Science.gov (United States)

    Fuchsluger, Martin; Götzl, Gregor

    2014-05-01

    In general, most aquifers have a much larger lateral extent than vertical extent. This fact allows the Dupuit-Forchheimer assumptions to be applied to many groundwater problems, for which a two-dimensional simulation is considered sufficient. When transient fluid flow modeling is coupled with heat transport, however, the 2D aquifer approximation is in many cases insufficient, as it considers neither the effects of the subjacent and overlying aquitards on heat propagation nor the impact of surface climatic effects on shallow aquifers. A shallow Holocene aquifer in Vienna served as a case study to compare different modeling approaches in two and three dimensions, in order to predict the performance and impact of a thermal aquifer utilization for heating (1.3 GWh) and cooling (1.4 GWh) of a communal building. Assuming a well field of six doublets, the comparison was realized in three steps. First, a two-dimensional model for unconfined flow was set up, assuming a varying hydraulic conductivity as well as varying top and bottom elevations of the aquifer (gross thickness). The model area was chosen along constant hydraulic head at steady-state conditions. A second model mapped solely the aquifer in three dimensions, using the same subdomain and boundary conditions as defined in step one. The third model consists of a complete three-dimensional geological build-up including the aquifer as well as the overlying and subjacent layers, and additionally an annually varying climatic boundary condition at the surface. The latter was calibrated against water temperatures measured at a nearby water gauge. For all three models the same annual operating mode of the six hydraulic doublets was assumed. Furthermore, a maximum groundwater temperature in the range between 8 and 18 °C as well as a constrained well flow rate was imposed. Finally, a descriptive comparison of the three models concerning the extracted thermal power, drawdown, temperature distribution and Darcy

  15. Analyzing a Medication Error Adverse Event Case Based on the Reason Model

    Institute of Scientific and Technical Information of China (English)

    阚庭; 李娟; 师文文; 储静

    2015-01-01

    Nursing students lack experience when they start clinical practice, which leads to the frequent occurrence of nursing adverse events involving students. Reason's Organizational Accident Model points out that although part of an adverse event stems from personal negligence or lack of skill, a larger part arises from latent failures in the system, procedures and working environment. This article uses the Reason model to analyze a case of medication administration error involving a nursing student during clinical practice, in order to identify the underlying environmental and organizational defects and to propose measures to prevent similar events.

  16. Feasibility analyses for HEU to LEU fuel conversion of the Laue Langevin Institute (ILL) High Flux Reactor (RHF).

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, J.; Tentner, A.; Bergeron, A.; Nuclear Engineering Division

    2010-08-19

    The High Flux Reactor (RHF) of the Laue Langevin Institute (ILL), based in Grenoble, France, is a research reactor designed primarily for neutron beam experiments for fundamental science. It delivers one of the most intense neutron fluxes worldwide, with an unperturbed thermal neutron flux of 1.5 × 10^15 n/cm^2/s in its reflector. The reactor was conceived to operate at a nuclear power of 57 MW but currently operates at 52 MW. The reactor currently uses Highly Enriched Uranium (HEU) fuel. In the framework of its non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context, most worldwide research and test reactors have already started a program of conversion to the use of Low Enriched Uranium (LEU) fuel. A new type of LEU fuel based on a mixture of uranium and molybdenum (UMo) is expected to allow the conversion of compact high performance reactors like the RHF. This report presents the results of reactor design, performance and steady state safety analyses for conversion of the RHF from the use of HEU fuel to the use of UMo LEU fuel. The objective of this work was to show that it is feasible, under a set of manufacturing assumptions, to design a new RHF fuel element that could safely replace the HEU element currently used. The new proposed design has been developed to maximize performance, minimize changes and preserve strong safety margins. Neutronics and thermal-hydraulics models of the RHF have been developed and qualified by benchmarking against experiments and/or against other codes and models. The models developed were then used to evaluate the RHF performance if LEU UMo were to replace the current HEU fuel 'meat' without any geometric change to the fuel plates. Results of these direct replacement analyses have shown a significant degradation of the RHF performance, in terms of both neutron flux and cycle

  17. Literature analysis of 198 cases of ectopic thyroid gland reported in China over the past 30 years

    Institute of Scientific and Technical Information of China (English)

    王宏; 孙莹; 贾静; 杨松青

    2012-01-01

    Summary: Data on reported cases of ectopic thyroid gland were analyzed to gain a deeper understanding of this kind of congenital lesion. Among all the cases, parathyroid gland types outnumbered aberrant thyroid types; the majority of cases involved a single ectopic thyroid; most ectopic thyroids were located in the neck and mouth; a local mass was the most common symptom; adenoma was often associated with ectopic thyroid, and the proportion with associated malignancy was high; most cases could not be correctly diagnosed.

  18. The evidence of progressive pressurization of volcanic conduits as a driving force of unrest phenomena, analyzed via modelling of multiplatform geodetic measurements: Fernandina (Galápagos) and Mauna Loa (Hawaii) case studies

    Science.gov (United States)

    Pepe, Susi; Castaldo, Raffaele; Casu, Francesco; D'Auria, Luca; De Luca, Claudio; De Novellis, Vincenzo; Solaro, Giuseppe; Tizzani, Pietro

    2017-04-01

    collection, we determined the source responsible for the observed deformation; in particular, the results of our inversions show that the pipe source contributes substantially to both the ground deformation pattern and the cost function. In the case of Fernandina Volcano (Galápagos) we exploited advanced Differential SAR Interferometry (DInSAR) techniques to analyze the 2012-2013 uplift episode by using X-band data from the COSMO-SkyMed (CSK) satellite constellation. This volcano is among those not well monitored; therefore, the availability of CSK data, acquired with a repeat time ranging from 4 to 12 days and with a ground resolution of 3 meters, represents a unique opportunity to perform a detailed study of the spatial and temporal changes of the ground deformation field (Sansosti et al., 2014). In addition, in this case study we computed the ground deformation time series by applying the Small BAseline Subset (SBAS)-DInSAR approach (Berardino et al., 2002) to CSK data acquired from both ascending and descending orbits. The results of their combination (vertical and horizontal E-W components) are used to evaluate, through a cross-correlation analysis (Tizzani et al., 2009; 2015), the volcanic areas characterized by similar temporal uplift behavior. Subsequently, we determine the geometry, location and temporal evolution of the geodetic source responsible for the 2012-2013 uplift event by applying an inverse method to the DInSAR measurements. We search for the geometrical parameters and volume variation that minimize the difference between the observed data and the modelled ground deformation field. We tested various analytical models and finally, using the Akaike Information Criterion (Akaike, 1965) among the tested analytical sources, we selected the tilted pipe. The pipe model is similar to the prolate ellipsoid, but the size of the smaller axis is kept fixed at a very small value (i.e., 10 m). Despite having a similar fit with the prolate ellipsoid

  19. Real-time tracking of CO migration and binding in the α and β subunits of human hemoglobin via 150-ps time-resolved Laue crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Schotte, Friedrich; Cho, Hyun Sun [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892-0520 (United States); Soman, Jayashree [Department of Biochemistry and Cell Biology, and W.M. Keck Center for Computational Biology, Rice University, Houston, TX 77251-1892 (United States); Wulff, Michael [European Synchrotron Radiation Facility, BP 220, 38043 Grenoble Cedex (France); Olson, John S. [Department of Biochemistry and Cell Biology, and W.M. Keck Center for Computational Biology, Rice University, Houston, TX 77251-1892 (United States); Anfinrud, Philip A., E-mail: anfinrud@nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892-0520 (United States)

    2013-08-30

    Highlights: ► 150-ps time-resolved Laue crystallography has unveiled ligand dynamics in hemoglobin. ► Significant kinetic differences are observed in the α and β subunits of hemoglobin. ► The B-site lifetime is ∼1.6 ns in β, and ∼18 ns in α. ► The B-site location in β is ∼0.25 Å closer to the binding site than in α. ► The correlation between CO position and rebinding rate suggests distal control. - Abstract: We have developed the method of picosecond Laue crystallography and used this capability to probe ligand dynamics in tetrameric R-state hemoglobin (Hb). Time-resolved, 2 Å-resolution electron density maps of photolyzed HbCO reveal the time-dependent population of CO in the binding (A) and primary docking (B) sites of both α and β subunits from 100 ps to 10 μs. The B site in the β subunit is about 0.25 Å closer to its A binding site, and its k_BA rebinding rate (∼300 μs^-1) is six times faster, suggesting distal control of the rebinding dynamics. Geminate rebinding in the β subunit exhibits both prompt and delayed geminate phases. We developed a microscopic model to quantitatively explain the observed kinetics, with three states for the α subunit and four states for the β subunit. This model provides a consistent framework for interpreting rebinding kinetics reported in prior studies of both HbCO and HbO2.
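
    To see what the quoted lifetimes imply, the sketch below evaluates a single-exponential survival of CO in the B docking site for each subunit. This is a deliberate simplification: the authors' actual kinetic model has three states for α and four for β.

      import numpy as np

      # Single-exponential B-site survival using the lifetimes quoted above
      # (~18 ns for the alpha subunit, ~1.6 ns for beta).
      t_ns = np.logspace(-1, 2, 7)   # 0.1 ns .. 100 ns
      for subunit, tau_ns in (("alpha", 18.0), ("beta", 1.6)):
          survival = np.exp(-t_ns / tau_ns)   # fraction of CO still in B site
          print(subunit, np.round(survival, 3))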

  20. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: An SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  1. A Method for Using Adjacency Matrices to Analyze the Connections Students Make within and between Concepts: The Case of Linear Algebra

    Science.gov (United States)

    Selinski, Natalie E.; Rasmussen, Chris; Wawro, Megan; Zandieh, Michelle

    2014-01-01

    The central goals of most introductory linear algebra courses are to develop students' proficiency with matrix techniques, to promote their understanding of key concepts, and to increase their ability to make connections between concepts. In this article, we present an innovative method using adjacency matrices to analyze students' interpretation…

  4. Analysis of Casing Strength in Sand-Producing Intervals of Sandstone Reservoirs

    Institute of Scientific and Technical Information of China (English)

    吴怀志; 翟晓鹏; 姬辉; 管虹翔

    2013-01-01

    During production from sandstone reservoirs, formation sand flows toward the wellbore together with the reservoir fluids and readily forms sand-production cavities in the pay zone near the wellbore, so that the casing across the sand-producing interval loses the protection of the formation and may fail. The limitations of the cylindrical shell buckling model commonly used to check casing strength in sand-producing intervals are analyzed, and a mechanical model of the load on the casing after sand production from loose sandstone is established on the basis of Protodyakonov's arch theory; the casing strength in the sand-producing interval is then determined numerically. The research shows that the cavity height in the sand-producing interval is linearly proportional to the casing axial force, while the cavity size and the sand production volume follow a power relationship. With increasing overburden pressure, a higher steel grade should be used to maintain casing reliability. The theory can be used for strength verification and design of casing in sand-producing intervals.

  5. Analyzing geographic clustered response

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and to refine the algorithm. 21 refs., 15 figs., 2 tabs.
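
    The core idea is easiest to see in one dimension, where the density-equalizing transform is simply the normalized cumulative population: equal intervals in the transformed coordinate hold equal population. A toy sketch (the report's 2D polygon problem is far harder):

      import numpy as np

      # Toy 1D "map" on [0, 1] with population density 3x higher on the left.
      x = np.linspace(0.0, 1.0, 501)
      density = np.where(x < 0.5, 3.0, 1.0)

      # 1D density-equalizing transform: the normalized cumulative population.
      cum = np.cumsum(density) / np.sum(density)

      cases = np.array([0.1, 0.4, 0.9])      # case locations on the original map
      print(np.interp(cases, x, cum))        # locations on the equalized map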

  6. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: An SPSS method to analyze univariate data

    NARCIS (Netherlands)

    M. Maric; M. de Haan; S.M. Hogendoorn; L.H. Wolters; H.M. Huizenga

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-

  7. Advantages of analyzing postmortem brain samples in routine forensic drug screening-Case series of three non-natural deaths tested positive for lysergic acid diethylamide (LSD).

    Science.gov (United States)

    Mardal, Marie; Johansen, Sys Stybe; Thomsen, Ragnar; Linnet, Kristian

    2017-09-01

    Three case reports are presented, including autopsy findings and toxicological screening results, which tested positive for the potent hallucinogenic drug lysergic acid diethylamide (LSD). LSD and its main metabolites were quantified in brain tissue and femoral blood, and furthermore in hematoma and urine when available. LSD, its main metabolite 2-oxo-3-hydroxy-LSD (oxo-HO-LSD), and iso-LSD were quantified in biological samples according to a previously published procedure involving liquid-liquid extraction and ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). LSD was measured in the brain tissue of all presented cases at concentrations from 0.34 to 10.8 μg/kg. The concentration level in the target organ was higher than in peripheral blood. Additional psychoactive compounds were quantified in blood and brain tissue, though all below toxic concentration levels. The cause of death in case 1 was collision-induced brain injury, while it was drowning in cases 2 and 3, and thus not drug intoxication. However, the toxicological findings could help explain the decedents' inability to cope with the brain injury or drowning incidents. The presented findings could help establish reference concentrations in brain samples and assist in the interpretation of results from forensic drug screening in brain tissue. This is, to the authors' knowledge, the first report of LSD, iso-LSD, and oxo-HO-LSD measured in brain tissue samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. A Case Study Analyzing the Reading Levels of Print and Electronic Health Education Material for Health Consumers with Low Levels of Literacy

    Science.gov (United States)

    Gerstle, Alan John

    2010-01-01

    The purpose of this case study was to determine if two samples of health education literature (one in print media; the other in electronic media), and published by the same health education organization, provided the requisite reading level for their intended audiences: immigrants and native speakers with a fourth-grade level of literacy. A…

  9. Analyzing the Effectiveness of Reward Management System on Employee Performance through the Mediating Role of Employee Motivation Case Study: Isfahan Regional Electric Company

    OpenAIRE

    Amin Karami; Hossein Rezaei Dolatabadi; Saeed Rajaeepour

    2013-01-01

    Purpose: The purpose of the present survey was to analyze the effectiveness of the reward management system on employee performance through the mediating role of employee motivation. Method and tools: The staff department of Isfahan Regional Electric Company was the statistical population under study, and simple random sampling was used. The sample size was determined by means of the Cochran formula (140 persons). Historical study and field study methods were the most important methods of ...

  10. Advantages of analyzing postmortem brain samples in routine forensic drug screening—case series of three non-natural deaths tested positive for lysergic acid diethylamide (LSD)

    DEFF Research Database (Denmark)

    Mardal, Marie; Johansen, Sys Stybe; Thomsen, Ragnar

    2017-01-01

    Three case reports are presented, including autopsy findings and toxicological screening results, which tested positive for the potent hallucinogenic drug lysergic acid diethylamide (LSD). LSD and its main metabolites were quantified in brain tissue and femoral blood, and furthermore in hematoma and urine when available. LSD, its main metabolite 2-oxo-3-hydroxy-LSD (oxo-HO-LSD), and iso-LSD were quantified in biological samples according to a previously published procedure involving liquid-liquid extraction and ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). LSD was measured in the brain tissue of all presented cases at concentrations from 0.34 to 10.8 μg/kg. The concentration level in the target organ was higher than in peripheral blood. Additional psychoactive compounds were quantified in blood and brain tissue, though all below toxic concentration levels.

  11. Development of a time-trend model for analyzing and predicting case-pattern of Lassa fever epidemics in Liberia, 2013-2017.

    Science.gov (United States)

    Olugasa, Babasola O; Odigie, Eugene A; Lawani, Mike; Ojo, Johnson F

    2015-01-01

    The objective was to develop a case-pattern model for Lassa fever (LF) among humans and derive predictors of the time-trend point distribution of LF cases in Liberia, in view of the prevailing under-reporting and the public health challenge posed by the disease in the country. Retrospective data covering 5 years of countrywide LF distribution among humans were used to train a time-trend model of the disease in Liberia. A time-trend quadratic model was selected due to its goodness-of-fit (R² = 0.89) and used to predict the trend within the northern county of Nimba. A case-specific exponential increase was predicted for the first 2 years (2013-2014), with a geometric increase over the next 3 years (2015-2017) in Nimba County. This paper describes a translational application of the space-time distribution pattern of LF epidemics reported in Liberia in 2008-2012, on which a predictive model was developed. We proposed a computationally feasible two-stage space-time permutation approach to estimate the time-trend parameters and conduct predictive inference on LF in Liberia.
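
    Fitting and projecting a quadratic time trend of this kind is straightforward; the sketch below does so on invented annual case counts (the abstract does not reproduce the paper's data) and projects five years ahead.

      import numpy as np

      # Hypothetical annual LF case counts for the 2008-2012 training window.
      years = np.arange(2008, 2013, dtype=float)
      cases = np.array([25.0, 31.0, 44.0, 60.0, 83.0])

      t = years - years[0]                  # shift time origin for stability
      coef = np.polyfit(t, cases, deg=2)    # quadratic trend a*t^2 + b*t + c

      fitted = np.polyval(coef, t)
      r2 = 1 - np.sum((cases - fitted) ** 2) / np.sum((cases - cases.mean()) ** 2)
      print("R^2 =", round(r2, 3))
      print("2013-2017 projection:", np.round(np.polyval(coef, np.arange(5, 10)), 1))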

  12. Analyzing the Effectiveness of Policy Implementation at the Local Level: A Case Study of Management of the 2009–2010 Drought in Yunnan Province, China

    Institute of Scientific and Technical Information of China (English)

    Neera Shrestha Pradhan; Yufang Su; Yao Fu; Liyun Zhang; Yongping Yang

    2017-01-01

    Several research efforts have focused primarily on policy implementation and improving innovative actions to address disaster risks. Discussions are ongoing on how to measure the effectiveness of policy implementation at the local level. But there is no definitive theory of effective policy implementation, and very few frameworks have been found acceptable as the basis of an analysis of the effectiveness of policy implementation, especially on droughts. Based on the 2009–2010 extreme drought in Yunnan, China, this article presents a modified framework to assess the effectiveness of policy implementation by defining policy, practice, and performance, as well as a feedback loop by which to share the lessons learned. Water conservancy projects in Luliang County and the agricultural diversity program in Longyang County in Yunnan Province were analyzed from a farmers' perspective. It was found that farmers are highly dependent on government policies and projects, and the effectiveness of policies is measured by short-term, immediate, and tangible benefits rather than long-term adaptation strategies. The results highlight the urgent need to reduce risks by developing better awareness about climate change and drought and its impacts, increased understanding of drought hazards, and implementation of appropriate measures for long-term adaptation.

  13. Investigating and analyzing prospective teacher's reflective thinking in solving mathematical problem: A case study of female-field dependent (FD) prospective teacher

    Science.gov (United States)

    Agustan, S.; Juniati, Dwi; Siswono, Tatag Yuli Eko

    2017-05-01

    In the last few years, reflective thinking has become a very popular term in the world of education, especially in the professional education of teachers. One of the goals of educational personnel and teacher institutions is to create responsible prospective teachers who are able to think reflectively. Reflective thinking is a future competence that should be taught to students to face the challenges and respond to the demands of the 21st century. Reflective thinking can be applied in mathematics because, by thinking reflectively, students can improve their curiosity to solve mathematical problems. In solving mathematical problems, it is assumed that cognitive style has an impact on a prospective teacher's mental activity. As a consequence, reflective thinking and cognitive style are important in solving mathematical problems. The subject of this research paper is a female prospective teacher who has a field-dependent cognitive style. The purpose of this research paper is to investigate the ability of prospective teachers' reflective thinking in solving mathematical problems. This research paper is descriptive, using a qualitative approach. To analyze the data related to the prospective teacher's reflective thinking in solving a contextual mathematical problem, the researchers focus on four main categories which describe the prospective teacher's activities in using reflective thinking, namely: (a) formulation and synthesis of experience, (b) orderliness of experience, (c) evaluating the experience, and (d) testing the selected solution based on the experience.

  14. Analyzing Teachers' Stories

    Directory of Open Access Journals (Sweden)

    Anat Kainan

    2002-09-01

    This article presents an integrated socio-literal approach as a way to analyze work stories. It uses a case of teachers' stories about the administration as an example. The stories focus on grumbles about various activities of members of the management of a school in a small town. The complaints appear in descriptions of the action, the characters, and, in particular, in the way the story is presented to the audience. The stories present a situation of two opposing groups: the administration and the teachers. The presentation of the stories creates a sense of togetherness among the veterans and new teachers in the staff room, and helps the integration of the new teachers into the staff. The veterans use the stories as an opportunity to express their anger at not having been assigned responsibilities on the one hand and their hopes of such promotion on the other. The stories act as a convenient medium to express criticism without entering into open hostilities. Behind them, a common principle can be discerned: the good of the school. The stories describe the infringement of various aspects of the school's social order, and it is possible to elicit from them what general pattern the teachers want to preserve in the school.

  15. Analyzing the causes of chronic pelvic pain in 138 cases

    Institute of Scientific and Technical Information of China (English)

    谢希强; 陈伟

    2001-01-01

    Objective: To explore and analyze the causes of chronic pelvic pain in order to aid diagnosis. Method: We retrospectively reviewed 138 cases of chronic pelvic pain treated with drugs or surgery and compared them in tabular form. Results and Conclusion: The leading cause of chronic pelvic pain is chronic pelvic inflammatory disease, followed by endometriosis and pelvic venous congestion.

  16. Between Critical and Uncritical Understandings: A Case Study Analyzing the Claims of Islamophobia Made in the Context of the Proposed ‘Super-Mosque’ in Dudley, England

    Directory of Open Access Journals (Sweden)

    Chris Allen

    2013-04-01

    Research highlights how usage and claims of Islamophobia tend to be simplistic and without nuance. Using a case study approach, this article considers the claims of Islamophobia made in relation to the proposed Dudley ‘super-mosque’. Setting out a narrative of the ‘super-mosque’, this article draws upon primary and secondary research to consider the claims and discourses of the major actors in the Dudley setting: the Dudley Muslim Association, Dudley Metropolitan Borough Council, the far right (especially the British National Party and the English Defence League), as well as individual political figures. Considering each in detail, this article seeks to evaluate the extent to which the claims of Islamophobia made against each of the actors might be valid. As well as exploring claims of Islamophobia within a ‘real’ environment, this article critically engages the opposition shown towards the mosque, the way in which the opposition campaigns were mobilized and engineered, and how the ideological meaning of Islamophobia could readily be utilized to validate and justify such opposition. In doing so, this article concludes that the claims and usage of Islamophobia were weak and that a more critical and nuanced usage of the term is urgently required.

  17. Analyzing Two-Phase Single-Case Data with Non-overlap and Mean Difference Indices: Illustration, Software Tools, and Alternatives.

    Science.gov (United States)

    Manolov, Rumen; Losada, José L; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2016-01-01

    Two-phase single-case designs, including baseline evaluation followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow demonstrating a causal relation between the intervention and the behavior. Although the statistical options reviewed here cannot help overcome this methodological limitation, we aim to make practitioners and applied researchers aware of the appropriate options available for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make the use of visual and quantitative analysis feasible, open source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on, pointing out the situations (aims, data patterns) for which these are potentially useful.
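    Of the two quantitative indices mentioned, the non-overlap of all pairs (NAP) is simple enough to compute directly: it is the share of all baseline-intervention data pairs in which the intervention value improves on the baseline value, with ties counting one half. A minimal sketch, assuming higher scores mean improvement; the data are invented:

        def nap(baseline, intervention):
            """Non-overlap of All Pairs: share of improving pairs (ties count 0.5)."""
            pairs = [(a, b) for a in baseline for b in intervention]
            score = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
            return score / len(pairs)

        # Hypothetical two-phase (baseline/intervention) single-case data
        print(nap(baseline=[3, 4, 4, 5], intervention=[6, 7, 7, 8, 9]))  # -> 1.0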

  18. Employing observational method for prospective data collection: A case study for analyzing diagnostic process and evaluating efficacy of TCM treatments for diabetes mellitus.

    Science.gov (United States)

    Gu, Xuelan; Huang, Nan; Gu, Jie; Joshi, Manoj Kumar; Wang, Hongqiang

    2016-11-04

    With the mounting pandemic of glucose metabolism dysregulation and type 2 Diabetes Mellitus (T2DM), traditional medicine such as traditional Chinese medicine (TCM) recipes has been widely adopted as a part of the therapeutic approach, especially in Asian countries. A novel approach, adapted from epidemiological cohort studies, has been applied to explore the clinical efficacy, as well as the herbal component selection, of a variety of TCM formulations against T2DM. In the current study, 98 newly diagnosed T2DM patients were recruited in two hospitals. Over a span of 4 weeks, the patients were treated by prescriptions of their individual TCM physicians. General TCM symptoms, blood glucose parameters, as well as general metabolic health biomarkers were evaluated over the therapy period. The pattern of which herbs were used, together with the association between blood glucose level change and the use of herbs, was analyzed. TCM diabetic syndrome diagnosis was made by physicians based on symptoms, who afterwards prescribed herbal TCM medication for individual subjects. The results showed significant reduction in fasting and post-meal glucose levels, as well as insulin, after the TCM treatment regimen as compared to baseline. As a secondary endpoint, the total triglyceride level decreased over the period of study as well. Kudzu vine root, Rehmannia root, Figwort root, and Mulberry leaf were the top herbs associated with pronounced glucose reduction. In conclusion, an observational study on a cohort of patients receiving TCM therapy has shown good clinical outcome for T2DM patients. Association analysis of herbal usage and clinical outcome suggests an opportunity to construct optimized formulations for superior efficacy in future studies at a larger scale.

  19. Method for Analyzing Trade-offs in Biomass Management in Smallholder Farming Systems Based on Mass Balance: A Case Study in Tajikistan's Foothills

    Directory of Open Access Journals (Sweden)

    Sebastian Ruppen

    2016-02-01

    In smallholder farming systems, especially in mountainous areas, households face trade-offs between meeting short-term human needs and ensuring long-term soil productivity. Improved biomass management can help break the downward spiral of overexploitation of natural resources, land degradation, and productivity decline, and support more sustainable use of marginal land. Mixed crop/livestock systems are often carefully balanced to minimize risks. Thus, when planning interventions, profound systems knowledge is crucial. However, the data required for system characterization are often scarce, and original field studies are thus necessary. The aim of this research, a case study in Tajikistan, was to improve systems understanding of the biomass cycle in crop/livestock systems in order to quantify the obstacles to the spread of sustainable land management technologies to farming households. It aimed to establish a database and methods of rapid data collection to quantify the stocks and flows of biomass, with a focus on mass balances, and to evaluate smallholders’ biomass management options and trade-offs. Data collection included household interviews, secondary literature, and reference data sets from global sources. Trade-off analysis focused on household-level self-supply of food, fodder, and fuel by farmers with different sizes of smallholdings, and their potential for on-farm recycling of organic matter. Results indicate that food self-supply by small and medium smallholders is insufficient and fodder sources are scarce. Fodder scarcity means that application of crop byproducts to soils is unlikely. Animal dung is largely used as fuel. Firewood needs exceed on-farm wood production, leading to deforestation. The approach presented facilitates an understanding of current and potential agricultural land interventions in the crop/livestock farming systems prevailing in mountainous areas.

  20. STUDY AND ANALYZE THE SATISFACTION OF CLIENTS BY C.S.M MODEL - CASE STUDY: ISFAHAN DEPARTMENT OF PHYSICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    morteza rezaee

    2013-08-01

    One of the most important purposes of any organization is to attain customer and client satisfaction by offering high-quality services. Organizations obtain reasonable demands and legal requirements in different ways and then respond appropriately to these service demands (Ziviar Farzad, 1390). Factors such as accelerating office work, combined with suitable behavior and accuracy in handling customer demands, are of great importance in making clients satisfied. In the Iranian administrative system, religious matters are included in all organizations and offices according to the Islamic leader's command; these overall policies have been communicated to the administrative system and upstream documentation such as the development plan regulation and the civil service management code. Thus, the main principle is that employees treat clients and customers as they would themselves, that is, act and behave as they would expect for themselves. This survey is a field study. The research method is descriptive-analytic as well as inferential. Samples were chosen from the clients attending the Isfahan department of physical education during the years 1384 to 1390. Data were collected through questionnaires filled out by the clients, so satisfaction was evaluated over six years. Sample selection was completely random from men and women attending the department. All the required data were analyzed and categorized with SPSS. This study is therefore descriptive, aiming to explain objectively and accurately the systematic facts observed in our sample. The findings of this survey show that client satisfaction with the department of physical education followed a decreasing trend during the years 1385 to 1390. The average satisfaction during these years was 76.25, which is less than the average satisfaction in Isfahan province in the same years. Statistics also show that the top rated ( or high score

  1. Supervised Classification of Satellite Images to Analyze Multi-Temporal Land Use and Coverage: A Case Study for the Town of Marabá, State of Pará, Brazil

    Directory of Open Access Journals (Sweden)

    Priscila Siqueira Aranha

    2015-03-01

    Amazon has one of the most diversified biomes on the planet, and its environmental preservation has an impact on the global scenario. However, besides the environmental features, the complexity of the region involves other aspects such as social, economic and cultural ones. In fact, these aspects are intrinsically interrelated; for example, cultural aspects may affect land use/land cover characteristics. This paper proposes an innovative methodology to investigate changes of critical factors in the environment, based on a case study in the 26 de Março Settlement, in the city of Marabá, in the Brazilian Amazon. The proposed methodology demonstrated, from the obtained results, an improvement of the efficiency of the classification technique to determine different thematic classes as well as a substantial enhancement in the precision of classified images. Another important aspect is the automation in the process...

  2. Cryogenically cooled bent double-Laue monochromator for high-energy undulator X-rays (50-200 keV).

    Science.gov (United States)

    Shastri, S D; Fezzaa, K; Mashayekhi, A; Lee, W K; Fernandez, P B; Lee, P L

    2002-09-01

    A liquid-nitrogen-cooled monochromator for high-energy X-rays consisting of two bent Si(111) Laue crystals adjusted to sequential Rowland conditions has been in operation for over two years at the SRI-CAT sector 1 undulator beamline of the Advanced Photon Source (APS). It delivers over ten times more flux than a flat-crystal monochromator does at high energies, without any increase in energy width (ΔE/E ≈ 10⁻³). Cryogenic cooling permits optimal flux, avoiding a sacrifice from the often-employed alternative technique of filtration, a technique less effective at sources like the 7 GeV APS, where considerable heat loads can be deposited by high-energy photons, especially at closed undulator gaps. The fixed-offset geometry provides a fully tunable in-line monochromatic beam. In addition to presenting the optics performance, unique crystal design and stable bending mechanism for a cryogenically cooled crystal under high heat load, the bending radii adjustment procedures are described.
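    For context on the quoted energy range, the Bragg angle at which a Si(111) crystal must be set follows from λ = hc/E and λ = 2d sin θ. A minimal sketch (illustrative only, not the beamline's control software), assuming the standard Si(111) lattice-plane spacing of 3.1356 Å:

        import math

        HC_KEV_ANGSTROM = 12.39842   # h*c in keV·Å
        D_SI_111 = 3.1356            # Si(111) lattice-plane spacing in Å

        def bragg_angle_deg(energy_kev, d=D_SI_111):
            """Bragg angle (degrees) for a given photon energy."""
            wavelength = HC_KEV_ANGSTROM / energy_kev
            return math.degrees(math.asin(wavelength / (2 * d)))

        for e in (50, 100, 200):   # the 50-200 keV range served by the monochromator
            print(f"{e} keV -> theta_B = {bragg_angle_deg(e):.3f} deg")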

  3. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  4. Crew Activity Analyzer

    Science.gov (United States)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon

  5. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  6. The Intermodulation Lockin Analyzer

    CERN Document Server

    Tholen, Erik A; Forchheimer, Daniel; Schuler, Vivien; Tholen, Mats O; Hutter, Carsten; Haviland, David B

    2011-01-01

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lock-in analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback and stability in operation. The use of the analyzer is demonstrated for Intermodulation Atomic Force Microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.
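    The measurement principle (drive with two pure tones, read out the intermodulation products in the response) can be reproduced numerically. A minimal sketch, assuming a toy cubic nonlinearity as the device under test; the tone frequencies, amplitudes and sampling parameters are arbitrary choices:

        import numpy as np

        fs, n = 100_000, 100_000                 # sample rate (Hz) and record length
        t = np.arange(n) / fs
        f1, f2 = 1_000.0, 1_100.0                # two pure drive tones (Hz)
        drive = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
        response = drive + 0.1 * drive ** 3      # toy cubic nonlinearity

        spectrum = np.abs(np.fft.rfft(response)) / n
        # Third-order intermodulation products appear at 2*f1 - f2 and 2*f2 - f1
        for f in (2 * f1 - f2, 2 * f2 - f1):
            k = int(round(f * n / fs))           # FFT bin of the product tone
            print(f"{f:.0f} Hz amplitude: {spectrum[k]:.4f}")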

  7. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts … the interdependency between researcher and researched. On this basis, we advocate an explicit “open-state-of-mind” listening as a key aspect of analyzing qualitative material, often described only as a matter of reading transcribed empirical materials, reading theory, and writing. The article contributes...

  8. Analyzing binding data.

    Science.gov (United States)

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites and on the affinity and accessibility of these binding sites for various drugs. This unit explains how to design and analyze such experiments.

  9. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...
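    As a digital counterpart of the counting measurement described, the rate of one such coincidence event (upward crossings of a given threshold) can be estimated as follows; the signal, threshold and sample rate are arbitrary:

        import numpy as np

        def upward_crossing_rate(signal, threshold, fs):
            """Rate (events per second) of upward threshold crossings."""
            above = np.asarray(signal) >= threshold
            n_crossings = np.count_nonzero(~above[:-1] & above[1:])
            return n_crossings * fs / len(signal)

        # Hypothetical noisy signal sampled at 1 kHz
        rng = np.random.default_rng(0)
        print(upward_crossing_rate(rng.normal(size=10_000), threshold=1.0, fs=1_000.0))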

  10. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but we must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  11. Analyzing Microarray Data.

    Science.gov (United States)

    Hung, Jui-Hung; Weng, Zhiping

    2017-03-01

    Because there is no widely used software for analyzing RNA-seq data that has a graphical user interface, this protocol provides an example of analyzing microarray data using Babelomics. This analysis entails performing quantile normalization and then detecting differentially expressed genes associated with the transgenesis of a human oncogene c-Myc in mice. Finally, hierarchical clustering is performed on the differentially expressed genes using the Cluster program, and the results are visualized using TreeView.
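    Quantile normalization, the first step mentioned, forces every array to share one intensity distribution: rank each column, average the sorted columns rank by rank, and write those averages back in each column's original order. A minimal numpy sketch (illustrative only; Babelomics itself is a web service):

        import numpy as np

        def quantile_normalize(x):
            """Quantile-normalize the columns of a (genes x arrays) matrix."""
            ranks = np.argsort(np.argsort(x, axis=0), axis=0)  # per-column ranks
            reference = np.sort(x, axis=0).mean(axis=1)        # mean sorted profile
            return reference[ranks]

        arrays = np.array([[5.0, 4.0, 3.0],
                           [2.0, 1.0, 4.0],
                           [3.0, 4.0, 6.0],
                           [4.0, 2.0, 8.0]])
        print(quantile_normalize(arrays))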

  12. Analysis of clinical observation and nursing of 30 cases of neonatal asphyxia

    Institute of Scientific and Technical Information of China (English)

    张昱君

    2013-01-01

    Objective: To investigate and analyze the clinical observation and nursing experiences of 30 cases of neonatal asphyxia and to summarize their clinical value. Methods: From May 2007 to January 2012, 30 cases admitted to Liulin District Hospital for neonatal asphyxia were chosen for the study of clinical observation and nursing experience. They were rescued and nursed according to resuscitation guidelines. Results: After comprehensive care and resuscitation, 28 cases of neonatal asphyxia were successfully cured (93.33%), 1 case died, and in 1 case the family gave up treatment. Conclusion: Correct and reasonable rescue and treatment, together with careful post-resuscitation nursing, can improve the resuscitation success rate and prognosis of asphyxiated newborns; the clinical effect is clear and worthy of reference and promotion.

  13. Total organic carbon analyzer

    Science.gov (United States)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  14. Analyzing radioligand binding data.

    Science.gov (United States)

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.
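    The nonlinear regression referred to is typically a fit of the one-site saturation equation Y = Bmax·X/(Kd + X). A minimal sketch with scipy; the concentrations and binding values are invented:

        import numpy as np
        from scipy.optimize import curve_fit

        def one_site(x, bmax, kd):
            """Specific binding for a single class of sites."""
            return bmax * x / (kd + x)

        # Hypothetical radioligand concentrations (nM) and specific binding (fmol/mg)
        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        bound = np.array([9.0, 23.0, 50.0, 75.0, 91.0, 97.0])

        (bmax, kd), _ = curve_fit(one_site, conc, bound, p0=(100.0, 1.0))
        print(f"Bmax = {bmax:.1f} fmol/mg, Kd = {kd:.2f} nM")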

  15. Advances in hematology analyzers.

    Science.gov (United States)

    DeNicola, Dennis B

    2011-05-01

    The complete blood count is one of the basic building blocks of the minimum database in veterinary medicine. Over the past 20 years, there has been a tremendous advancement in the technology of hematology analyzers and their availability to the general practitioner. There are 4 basic methodologies that can be used to generate data for a complete blood count: manual methods, quantitative buffy coat analysis, automated impedance analysis, and flow cytometric analysis. This article will review the principles of these methodologies, discuss some of their advantages and disadvantages, and describe some of the hematology analyzers that are available for the in-house veterinary laboratory.

  16. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  17. Analyzing Workforce Education. Monograph.

    Science.gov (United States)

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  18. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    … an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and applies hereafter advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility … and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ...

  19. Magnetoresistive emulsion analyzer.

    Science.gov (United States)

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening.

  20. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the emergence of the next-generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing an adequate address space, the new 128-bit IP includes other new features such as address autoconfiguration, quality of service, simple routing capability, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module which is able to decode an IPv6 packet and provide a detailed breakdown of the construction of the packet. It has to understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. Thus it increases network administrators' understanding of a network protocol and helps them solve protocol-related problems in an IPv6 network environment.
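    Decoding the fixed 40-byte IPv6 header amounts to unpacking the version/traffic-class/flow-label word, the payload length, next header and hop limit, and the two 128-bit addresses. A minimal sketch using only the standard library (an illustration, not the analyzer module described in the paper):

        import struct
        import ipaddress

        def decode_ipv6_header(packet: bytes) -> dict:
            """Decode the 40-byte fixed header of a raw IPv6 packet."""
            assert len(packet) >= 40, "truncated IPv6 packet"
            vtf, payload_len, next_hdr, hop_limit = struct.unpack("!IHBB", packet[:8])
            return {
                "version": vtf >> 28,                 # top 4 bits
                "traffic_class": (vtf >> 20) & 0xFF,  # next 8 bits
                "flow_label": vtf & 0xFFFFF,          # bottom 20 bits
                "payload_length": payload_len,
                "next_header": next_hdr,
                "hop_limit": hop_limit,
                "src": str(ipaddress.IPv6Address(packet[8:24])),
                "dst": str(ipaddress.IPv6Address(packet[24:40])),
            }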

  1. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  2. Mineral/Water Analyzer

    Science.gov (United States)

    1983-01-01

    An x-ray fluorescence spectrometer developed for the Viking Landers by Martin Marietta was modified for geological exploration, water quality monitoring, and aircraft engine maintenance. The aerospace system was highly miniaturized and used very little power. It irradiates the sample causing it to emit x-rays at various energies, then measures the energy levels for sample composition analysis. It was used in oceanographic applications and modified to identify element concentrations in ore samples, on site. The instrument can also analyze the chemical content of water, and detect the sudden development of excessive engine wear.

  3. Analyzing Aeroelasticity in Turbomachines

    Science.gov (United States)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

    ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2- LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  4. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step to make an autonomous deployable instrument. We perform sample clean up and concentration in a flow through packed bed. For small initial samples, whole genome amplification is performed in the packed bed resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we determined to learn if the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof of principle assay.

  5. Analyzing the platelet proteome.

    Science.gov (United States)

    García, Angel; Zitzmann, Nicole; Watson, Steve P

    2004-08-01

    During the last 10 years, mass spectrometry (MS) has become a key tool for protein analysis and has underpinned the emerging field of proteomics. Using high-throughput tandem MS/MS following protein separation, it is potentially possible to analyze hundreds to thousands of proteins in a sample at a time. This technology can be used to analyze the protein content (i.e., the proteome) of any cell or tissue and complements the powerful field of genomics. The technology is particularly suitable for platelets because of the absence of a nucleus. Cellular proteins can be separated by either gel-based methods such as two-dimensional gel electrophoresis or one-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis followed by liquid chromatography (LC) -MS/MS or by multidimensional LC-MS/MS. Prefractionation techniques, such as subcellular fractionations or immunoprecipitations, can be used to improve the analysis. Each method has particular advantages and disadvantages. Proteomics can be used to compare the proteome of basal and diseased platelets, helping to reveal information on the molecular basis of the disease.

  6. Analysis of common complications in 50 patients fitted with an intrauterine device (IUD)

    Institute of Scientific and Technical Information of China (English)

    李晓英

    2014-01-01

    Fifty cases of complications after intrauterine device (IUD) insertion at our service center were analyzed. The common complications were pain, bleeding, uterine perforation, IUD displacement and incarceration, and infection. To reduce the occurrence of complications, women fitted with an IUD should be examined regularly.

  7. Analyzing Flowgraphs with ATL

    Directory of Open Access Journals (Sweden)

    Valerio Cosentino

    2013-11-01

    This paper presents a solution to the Flowgraphs case study for the Transformation Tool Contest 2013 (TTC 2013). Starting from Java source code, we execute a chain of model transformations to derive a simplified model of the program, its control flow graph and its data flow graph. Finally we develop a model transformation that validates the program flow by comparing it with a set of flow specifications written in a domain specific language. The proposed solution has been implemented using ATL.

  8. Analysis of the clinical treatment of 31 cases of Kawasaki disease complicated with aseptic meningitis

    Institute of Scientific and Technical Information of China (English)

    罗云娇; 杜曾庆; 杨小涛

    2014-01-01

    Objective: To retrospectively analyze the clinical characteristics, diagnosis and treatment of the 31 cases of aseptic meningitis among the 112 cases of Kawasaki disease (KD) treated in our department from January to December 2013. Methods: The patients were treated for Kawasaki disease and aseptic meningitis with aspirin 50 mg/kg/d, divided into 2-3 oral doses, and given mannitol to lower intracranial pressure. All patients were given intravenous immunoglobulin therapy. Before discharge, all cases in the group underwent two-dimensional echocardiography (2DE). Results: The clinical symptoms of all patients disappeared, their condition improved, and they were discharged. Cardiac 2DE was normal in 5 cases, while 26 cases (83.87%) had coronary artery lesions of varying degrees. Conclusion: KD has replaced rheumatic fever as the most important cause of acquired heart disease in children. In recent years the number of KD cases with aseptic meningitis has been increasing; KD complicated by aseptic meningitis represents a severe form of the disease, and in this group the cases were complicated with multiple organ injury, which deserves clinical attention. Pediatricians should raise their awareness of KD, especially KD with aseptic meningitis. Early diagnosis and early treatment are the key to reducing the coronary artery disease caused by KD.

  9. Implementation and application of FDA′s drug risk management:by analyzing the case of rosiglitazone%从罗格列酮案例看FDA药品风险管理的实施与应用

    Institute of Scientific and Technical Information of China (English)

    赵东升; 王强; 杨凌

    2013-01-01

    Objective: To analyze the implementation and application of FDA's drug risk management through the rosiglitazone case. Methods: Information on rosiglitazone risk management was retrieved from the FDA website in December 2012 and studied using literature analysis. Results: In the process of drug risk management, FDA paid close attention to risk communication, the evidence-based concept, and the application of REMS. Conclusion: Learning and applying risk management and evidence-based concepts is a top priority for drug regulation in our country.

  10. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    …, because the costs of processing and analyzing it exceed the benefits, indicating bounded rationality. Hutton (2002) concludes that the analyst community’s inability to raise important questions on quality of management and the viability of its business model inevitably led to the Enron debacle. There seems … financial statement. Plumlee (2003) finds for instance that such information imposes significant costs on even expert users such as analysts and fund managers and reduces their use of it. Analysts’ ability to incorporate complex information in their analyses is a decreasing function of its complexity … to be evidence of the fact that all types of corporate stakeholders from management to employees, owners, the media and politicians have grave difficulties in interpreting new forms of reporting. One hypothesis could be that if managements’ own understanding of value creation is disclosed to the other...

  11. Analyzing architecture articles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In the present study, we describe the quality, function, and characteristics of architecture to help people comprehensively understand what architecture is. We also reveal the problems and conflicts found in population, land, water resources, pollution, energy, and the organizational systems of construction. China’s economy is transforming. We should focus on cities, the architectural environment, energy conservation, emission reduction, and low-carbon output that will result in successful green development. We should macroscopically and microscopically analyze the development, from the natural environment to the artificial environment, and from the relationship between human beings and nature to the combination of social ecology in cities and farmlands. We must learn to develop and control them harmoniously and scientifically to provide a foundation for the methods used in architecture research.

  12. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost-prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results: We developed the software PDA for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require users to have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
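    The coefficient of preferential amplification mentioned in the Results corrects the raw peak-height ratio measured on a pool. A hedged sketch of the general idea (not PDA's exact implementation): estimate the coefficient k from individual heterozygotes, then compute the pooled allele-A frequency as hA / (hA + k·hB). All values below are hypothetical:

        def allele_freq_pooled(h_a, h_b, het_ratios):
            """Estimate the allele-A frequency in a DNA pool.

            h_a, h_b    -- peak heights of alleles A and B measured on the pool
            het_ratios  -- h_A/h_B ratios from individual heterozygotes, used to
                           estimate the coefficient of preferential amplification k
            """
            k = sum(het_ratios) / len(het_ratios)
            return h_a / (h_a + k * h_b)

        # Hypothetical pool peak heights and heterozygote calibration ratios
        print(allele_freq_pooled(h_a=620.0, h_b=410.0, het_ratios=[1.18, 1.22, 1.25]))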

  13. Retrospective analysis of the clinical and CT imaging findings of 12 cases of lung abscess misdiagnosed as lung cancer

    Institute of Scientific and Technical Information of China (English)

    王世友

    2014-01-01

    Objective: To explore the causes of misdiagnosis of lung abscess as lung cancer and to improve the diagnostic accuracy for atypical pulmonary abscess. Methods: The clinical data of 12 patients whose lung abscess could not be clearly diagnosed preoperatively but was confirmed by surgery and pathology between May 2008 and December 2013 were retrospectively analyzed. Results: The lesion was located in the posterior segment of the right upper lobe in 4 cases, the anterior segment of the left upper lobe in 4 cases, and the dorsal segment of the right lower lobe in 5 cases. CT features: 8 cases presented as a round solitary mass and 3 cases as an irregular mass. Conclusion: When a large soft-tissue-density mass is found in the chest, surrounded by slender, cord-like, soft spiculation, when on follow-up the shape or internal appearance of the mass changes relatively quickly or remains unchanged for a long time, when it shrinks under anti-inflammatory therapy, and when it adheres extensively to the pleura, the possibility of a chronic abscess should be considered.

  14. Bios data analyzer.

    Science.gov (United States)

    Sabelli, H; Sugerman, A; Kovacevic, L; Kauffman, L; Carlson-Sabelli, L; Patel, M; Konecki, J

    2005-10-01

    The Bios Data Analyzer (BDA) is a set of computer programs (CD-ROM, in Sabelli et al., Bios. A Study of Creation, 2005) for new time series analyses that detect and measure creative phenomena, namely diversification, novelty, complexes, and nonrandom complexity. We define a process as creative when its time series displays these properties. They are found in heartbeat interval series (the exemplar of bios, just as turbulence is the exemplar of chaos), in many other empirical series (galactic distributions and meteorological, economic and physiological series), in biotic series generated mathematically by bipolar feedback, and in stochastic noise, but not in chaotic attractors. Differencing, consecutive recurrence and partial autocorrelation indicate nonrandom causation, thereby distinguishing chaos and bios from random and random walk. Embedding plots distinguish causal creative processes (e.g. bios) that include both simple and complex components of variation from stochastic processes (e.g. Brownian noise) that include only complex components, and from chaotic processes that decay from order to randomness as the number of dimensions is increased. Varying bin size and dimensionality shows that entropy measures symmetry and variety, and that complexity is associated with asymmetry. Trigonometric transformations measure coexisting opposites in time series and demonstrate bipolar, partial, and uncorrelated opposites in empirical processes and bios, supporting the hypothesis that bios is generated by bipolar feedback, a concept which is at variance with standard concepts of polar and complementary opposites.

  15. TEAMS Model Analyzer

    Science.gov (United States)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in the preparation of reporting for model reviews, a new tool has been developed as an aid to models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  16. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  17. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.

  18. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  19. Study of optical Laue diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Chakravarthy, Giridhar (E-mail: cgiridhar84@gmail.com); Allam, Srinivasa Rao; Satyanarayana, S. V. M.; Sharan, Alok (E-mail: aloksharan@email.com) [Department of Physics, Pondicherry University, Puducherry-605014 (India)]

    2014-10-15

    We present a study of the optical diffraction pattern of one- and two-dimensional gratings with defects, designed on a desktop PC and printed on an OHP sheet using a laser printer. Gratings prepared in this way, using a novel low-cost technique, provide a good visual aid in teaching. The diffraction pattern of monochromatic light (632.8 nm) from such a grating is similar to the X-ray diffraction pattern of a crystal lattice with point defects in one and two dimensions; here both the optical and the X-ray diffraction are of the Fraunhofer type. Information about the crystal lattice structure and the defect size can be extracted from the pattern.
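
    The analogy drawn here can be reproduced numerically: in the Fraunhofer regime the far-field intensity is the squared modulus of the Fourier transform of the grating's transmission function. A minimal Python sketch (array size, period and defect count are arbitrary choices, not the paper's parameters):

      # Far field of a 2-D point lattice with a few missing "atoms".
      import numpy as np

      N, period = 512, 16
      grating = np.zeros((N, N))
      grating[::period, ::period] = 1.0            # ideal 2-D lattice
      rng = np.random.default_rng(6)
      holes = rng.integers(0, N // period, size=(10, 2)) * period
      grating[holes[:, 0], holes[:, 1]] = 0.0      # knock out a few points

      far_field = np.abs(np.fft.fftshift(np.fft.fft2(grating)))**2
      # Sharp diffraction orders remain; the defects add weak diffuse
      # scattering between them, as in X-ray patterns of imperfect crystals.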

  20. Analyzing storage media of digital camera

    OpenAIRE

    Chow, KP; Tse, KWH; Law, FYW; Ieong, RSC; Kwan, MYK; Tse, H.; Lai, PKY

    2009-01-01

    Digital photography has become popular in recent years, and photographs have become common tools for people to record every tiny part of their daily lives. By analyzing the storage media of a digital camera, crime investigators may extract a lot of useful information with which to reconstruct events. In this work, we discuss a few approaches to analyzing these kinds of storage media of digital cameras. A hypothetical crime case is used as a case study to demonstrate the concepts.

  1. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions; its analysis provides in-depth insight into a communication system receiver's performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and give the communication engineer evaluating test results increased confidence in, and understanding of, receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes, and is capable of performing analysis even when the
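
    As a rough illustration of the first (data-content) technique, the sketch below searches a small offset window for the alignment that maximizes the correlation between reference and received soft-decision streams; a plain normalized correlation stands in for the SDA's power and modified Massey correlations, whose details this record does not give.

      import numpy as np

      def detect_slip(reference, received, max_slip=4, depth=256):
          """Return the offset in [-max_slip, max_slip] that best aligns
          `received` with `reference` over `depth` samples."""
          best_offset, best_score = 0, -np.inf
          for off in range(-max_slip, max_slip + 1):
              ref = reference[max_slip: max_slip + depth]
              rec = received[max_slip + off: max_slip + off + depth]
              score = np.dot(ref, rec) / (np.linalg.norm(ref) * np.linalg.norm(rec))
              if score > best_score:
                  best_offset, best_score = off, score
          return best_offset

      rng = np.random.default_rng(0)
      tx = rng.choice([-1.0, 1.0], size=1024)            # reference symbols
      rx = np.roll(tx, 2) + 0.5 * rng.normal(size=1024)  # 2-bit slip plus noise
      print("estimated slip:", detect_slip(tx, rx))      # prints 2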

  2. Molecular epidemiological analysis of a fatal case of hemorrhagic fever with renal syndrome

    Institute of Scientific and Technical Information of China (English)

    刘远; 蒋力云; 丁鹏; 王大虎; 肖新才

    2012-01-01

    Objective: To determine the cause of a severe, fatal case of hemorrhagic fever with renal syndrome (HFRS) reported in Guangzhou in 2011, and to perform a molecular epidemiological analysis. Methods: Serum from the patient, serum samples from people living around the patient's residence, and rat specimens were subjected to antibody testing and PCR detection, and the sequences obtained were compared with sequences in GenBank (NCBI). Results: The IgG antibody positive rate was 33.33% in rat blood and 5.66% in human sera, both higher than the positive rates observed in routine surveillance. No major variation was present in the hantavirus genes from the patient's serum. Conclusions: Strengthening routine HFRS surveillance, expanding its scope and sample numbers, carrying out rodent control conscientiously, reducing sources of infection, and interrupting transmission routes will all benefit HFRS prevention and control.

  3. The Importance of Nursing Risk Management Education: An Analysis of One Case of Nursing Error

    Institute of Scientific and Technical Information of China (English)

    徐玲丽

    2015-01-01

    The safety of nursing care bears on patients' lives and prognoses, and nursing risk runs through the implementation of all nursing activities. Clinical practice is the path every nursing student must take to a nursing post, yet on average 40% to 50% of nursing errors during internships involve nursing students, so nursing risk management education for them is extremely important. This paper analyzes the main causes of one nursing error that occurred during a nursing student's clinical practice, and argues that giving nursing students risk management education before their internships, covering theoretical knowledge, the hospital's core checking systems, nursing service and communication skills, and the analysis and summary of practical cases, would improve their awareness of nursing risk and reduce the nursing errors that arise from factors of their own.

  4. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The organizational risk analyzer (ORA) tool system is selected to study the network of East Turkistan terrorists. The relationships among the organization's personnel, knowledge, resources and task entities are represented by the meta-matrix in ORA, with which the risks and vulnerabilities of the organizational structure are analyzed quantitatively and the organization's ultimate vulnerabilities and risks are obtained. A case study in this system shows that it offers a shortcut to destroying the network effectively...

  5. Analyzing machine noise for real time maintenance

    Science.gov (United States)

    Yamato, Yoji; Fukumoto, Yoshifumi; Kumazaki, Hiroki

    2017-02-01

    Recently, IoT technologies have progressed and applications in the maintenance area are expected. However, IoT maintenance applications have not yet spread in Japan, because sensing and analysis are built as one-off solutions for each case, collecting sensing data is costly, and maintenance automation is insufficient. This paper proposes a maintenance platform that analyzes sound data at the edge, analyzes only anomaly data in the cloud, and orders maintenance automatically, to resolve these problems with existing technology. We also implement a sample application and compare it with related work.
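
    The platform's internals are not given in this record, so the following is only a sketch of the edge-side idea: score short sound frames against a baseline and forward nothing but the anomalous frames to the cloud. Frame RMS and a fixed multiple of the median stand in for whatever features and thresholds the authors actually use.

      import numpy as np

      def frame_rms(signal, frame=1024):
          n = len(signal) // frame
          frames = signal[: n * frame].reshape(n, frame)
          return np.sqrt(np.mean(frames**2, axis=1))

      rng = np.random.default_rng(7)
      sound = rng.normal(0.0, 0.1, 48_000)                # normal machine noise
      sound[30_000:32_000] += rng.normal(0.0, 1.0, 2_000) # simulated fault burst

      rms = frame_rms(sound)
      anomalies = np.where(rms > 3.0 * np.median(rms))[0]
      print("frames to upload to the cloud:", anomalies)  # only the burst frames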

  6. Soft Decision Analyzer and Method

    Science.gov (United States)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2016-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze its operation to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  7. Analyzing the Turns of David Harvey's Scholarship: A Case Study on the Innovation of Geographical Thought

    Institute of Scientific and Technical Information of China (English)

    叶超; 蔡运龙

    2012-01-01

    David Harvey transformed from a positivist to a radical and, in the end, to a Marxist. The turns of his scholarship are an important phenomenon in western geography and a typical case of innovation in geographical thought and epistemology. By analyzing some of Harvey's representative works, this paper focuses on epistemological problems, particularly the transformation of Harvey's view of space, since Harvey himself treats space as a keyword of geography. It is argued that his scholarly career passed through at least three main stages: a relative but multidimensional view of space; a view uniting 'social process' and 'spatial form'; and a historical-geographical materialist view of the spatial system. The three stages correspond to Harvey's career as a positivist, a radical, and finally a Marxist. On the basis of Harvey's own statements, this article sums up the reasons why he made these shifts: the drastically changing social and political conditions, his individual interests and gifts, his working circumstances, the needs of geography at the time, and other occasional factors. Harvey's academic transformation carries significance and inspiration for Chinese geographers: they should treat logical positivist geography correctly, confront key practical social problems, inquire into their deep causes and build theories. Following the principles of 'independent spirit, free thinking', Chinese geographers can contribute more to scholarship and society.

  8. Analyzing the Efficacy of Free Nerve Graft Repair of Peripheral Nerve Defects in 50 Cases

    Institute of Scientific and Technical Information of China (English)

    王杰; 王浩; 黄飞; 黄荣华; 范亚生

    2015-01-01

    Objective: To explore the clinical efficacy of free nerve graft repair in patients with peripheral nerve injury. Methods: Data on 50 patients with peripheral nerve injury diagnosed and treated at the author's hospital were analyzed; according to the treatment given, the patients were divided into two groups of 25 cases each. The control group received simple free nerve grafts, while the experimental group received vascularized free nerve grafts, and the outcomes of the two groups were compared. Results: In the experimental group, 92.0% of patients gave the repair plan a positive evaluation, versus 76.0% in the control group, and 92.0% were satisfied with the repair plan, versus 72.0% in the control group. The experimental group's ADL score was 16.2±3.7, physical function score 59.6±7.5, psychological function score 65.8±9.2 and social function score 57.2±6.5, all higher than in the control group (P<0.05). The complication rate was 8.0% in the experimental group, lower than the control group's 28.0%. Conclusion: Peripheral nerve injury has a high incidence, and repair with vascularized free nerve grafts gives ideal clinical results and is worth promoting.

  9. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  10. Decision no. 2011-DC-0216 of the French nuclear safety authority from May 5, 2011, ordering the Laue Langevin Institute to proceed to a complementary safety evaluation of its basic nuclear facility (high flux reactor - INB no. 67) in the light of the Fukushima Daiichi nuclear power plant accident; Decision no. 2011-DC-0216 de l'Autorite de surete nucleaire du 5 mai 2011 prescrivant a l'Institut Laue Langevin (ILL) de proceder a une evaluation complementaire de la surete de son installation nucleaire de base (Reacteur a Haut Flux - INB n.67) au regard de l'accident survenu a la centrale nucleaire de Fukushima Daiichi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    As a consequence of the accident of the Fukushima Daiichi nuclear power plant (Japan), the French Prime Minister entrusted the French nuclear safety authority (ASN) with the mission to carry out a safety analysis re-evaluation of the French nuclear facilities, and in particular the nuclear power plants. A decision has been addressed by the ASN to each nuclear operator with the specifications of this safety re-evaluation analysis and the list of facilities in concern. This document is the decision addressed to the Laue Langevin Institute, operator of the high flux research reactor (RHF) of Grenoble (France). (J.S.)

  11. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. This project aims to provide another way of analyzing data files: a cloud-based approach. It aims to offer a productive and interactive environment through the combination of FCC and SWAN software.

  12. Analyzing Valuation Practices through Contracts

    DEFF Research Database (Denmark)

    Tesnière, Germain; Labatut, Julie; Boxenbaum, Eva

    This paper seeks to analyze the most recent changes in how societies value animals. We analyze this topic through the prism of contracts between breeding companies and farmers. Focusing on new valuation practices and qualification of breeding animals, we question the evaluation of difficult...

  13. The discovery of X-rays diffraction: From crystals to DNA. A case study to promote understanding of the nature of science and of its interdisciplinary character

    Science.gov (United States)

    Guerra, Francesco; Leone, Matteo; Robotti, Nadia

    2016-05-01

    The advantages of introducing history-of-science topics into the teaching of science have been advocated by a large number of scholars within the science education community. One of the main reasons given for using the history of science in teaching is its power to promote understanding of the nature of science (NOS). In this respect, the historical case of X-ray diffraction, from Max von Laue's discovery (1912) to the first X-ray diffraction photographs of DNA (1953), is a case in point, showing that a correct experimental strategy and a favourable theoretical context are not enough to make a scientific discovery.

  14. Detecting influenza outbreaks by analyzing Twitter messages

    CERN Document Server

    Culotta, Aron

    2010-01-01

    We analyze over 500 million Twitter messages from an eight month period and find that tracking a small number of flu-related keywords allows us to forecast future influenza rates with high accuracy, obtaining a 95% correlation with national health statistics. We then analyze the robustness of this approach to spurious keyword matches, and we propose a document classification component to filter these misleading messages. We find that this document classifier can reduce error rates by over half in simulated false alarm experiments, though more research is needed to develop methods that are robust in cases of extremely high noise.
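
    The core of the approach is compact enough to sketch: form a weekly keyword-match rate and correlate (or regress) it against the official influenza-like-illness rate. The data below are synthetic placeholders, not the paper's Twitter corpus.

      import numpy as np

      rng = np.random.default_rng(1)
      weeks = 35
      ili_rate = np.abs(np.sin(np.linspace(0, 3, weeks))) * 4 + 1     # official rate
      keyword_rate = 0.002 * ili_rate + rng.normal(0, 0.0005, weeks)  # matches/total

      r = np.corrcoef(keyword_rate, ili_rate)[0, 1]
      slope, intercept = np.polyfit(keyword_rate, ili_rate, 1)
      forecast = slope * keyword_rate + intercept      # fitted ILI estimate
      print(f"correlation r = {r:.2f}")                # high, by construction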

  15. ANALYZE Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, S.

    1982-10-01

    This report is a reproduction of the visuals that were used in the ANALYZE Users' Guide lectures of the videotaped LLNL Continuing Education Course CE2018-H, State Space Lectures. The course was given in Spring 1982 through the EE Department Education Office. Since ANALYZE is menu-driven, interactive, and has self-explanatory questions (sort of), these visuals and the two 50-minute videotapes are the only documentation which comes with the code. More information about the algorithms contained in ANALYZE can be obtained from the IEEE book on Programs for Digital Signal Processing.

  16. Improved cylindrical mirror energy analyzer

    Science.gov (United States)

    Baranova, L. A.

    2017-03-01

    A study has been carried out of the electron-optical properties of an improved design of the cylindrical mirror energy analyzer. Both the external and internal electrodes of the analyzer are divided into three isolated parts, so that the potentials on the individual parts can be regulated independently of each other. In the symmetric operating mode, with identical potentials on the side parts of the electrodes, a significant increase in resolving power and light-gathering power has been obtained compared to the standard design of the cylindrical mirror. In the asymmetric operating mode, which is implemented with a linear potential distribution on the external electrode, conditions have been found under which the linear dispersion of the analyzer increases several times.

  17. Market study: Whole blood analyzer

    Science.gov (United States)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  18. C2Analyzer: Co-target-Co-function Analyzer

    Institute of Scientific and Technical Information of China (English)

    Md Aftabuddin; Chittabrata Mal; Arindam Deb; Sudip Kundu

    2014-01-01

    MicroRNAs (miRNAs) interact with their target mRNAs and regulate biological processes at the post-transcriptional level. While one miRNA can target many mRNAs, a single mRNA can also be targeted by a set of miRNAs. The targeted mRNAs may be involved in different biological processes that are described by gene ontology (GO) terms. The major challenges in analyzing this multitude of regulations include identifying the combinatorial regulation of miRNAs and determining the co-functionally enriched miRNA pairs. C2Analyzer (Co-target-Co-function Analyzer) is a Perl-based, versatile and user-friendly web tool with online instructions. Based on hypergeometric analysis, this novel tool can determine whether given pairs of miRNAs are co-functionally enriched. For a given set of GO term(s), it can also identify the set of miRNAs whose targets are enriched in the given GO term(s). Moreover, C2Analyzer can identify the co-targeting miRNA pairs, their targets and the GO processes in which they are involved. The miRNA-miRNA co-functional relationship can also be saved as a .txt file, which can be used to further visualize the co-functional network with other software such as Cytoscape. C2Analyzer is freely available at www.bioinformatics.org/c2analyzer.
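
    The hypergeometric enrichment test behind this kind of analysis is easy to state. A sketch with made-up counts (the variable names and numbers are illustrative assumptions; C2Analyzer itself is Perl-based):

      from scipy.stats import hypergeom

      N_genes   = 20000   # background gene universe
      K_go      = 300     # genes annotated with the GO term
      n_targets = 150     # targets of the miRNA (or miRNA pair)
      k_overlap = 12      # targets carrying the GO term

      # P(X >= k) under the null of drawing targets at random.
      p_value = hypergeom.sf(k_overlap - 1, N_genes, K_go, n_targets)
      print(f"enrichment p-value = {p_value:.3g}")   # tiny: expected overlap ~2.25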

  19. LEGAL-EASE: Analyzing Chinese Financial Statements

    Institute of Scientific and Technical Information of China (English)

    Edward Ma

    2008-01-01

    In this article, we will focus on understanding and analyzing the typical accounts of Chinese financial statements, including the balance sheet and income statement. Accounts are frequently prepared incorrectly, which can be due to several factors, from incompetence to more serious cases of deliberate attempts to deceive. Regardless, accounts can be understood, and errors or specific acts of misrepresentation can be uncovered. We will conduct some simple analysis to demonstrate how these can be spotted.

  20. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to the development of effective exercise programs and drug regimens that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  1. An update on chemistry analyzers.

    Science.gov (United States)

    Vap, L M; Mitzner, B

    1996-09-01

    This update of six chemistry analyzers available to the clinician discusses several points that should be considered prior to the purchase of equipment. General topics include how to best match an instrument to clinic needs and the indirect costs associated with instrument operation. Quality assurance recommendations are discussed and common terms are defined. Specific instrument features, principles of operation, performance, and costs are presented. The information provided offers potential purchasers an objective approach to the evaluation of a chemistry analyzer for the veterinary clinic.

  2. Analyzing the Grammar of English

    CERN Document Server

    Teschner, Richard V

    2007-01-01

    Analyzing the Grammar of English offers a descriptive analysis of the indispensable elements of English grammar. Designed to be covered in one semester, this textbook starts from scratch and takes nothing for granted beyond a reading and speaking knowledge of English. Extensively revised to function better in skills-building classes, it includes more interspersed exercises that promptly test what is taught, simplified and clarified explanations, greatly expanded and more diverse activities, and a new glossary of over 200 technical terms.Analyzing the Grammar of English is the only English gram

  3. Methods of analyzing crude oil

    Energy Technology Data Exchange (ETDEWEB)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  4. Analyzing Classroom Instruction in Reading.

    Science.gov (United States)

    Rutherford, William L.

    A method for analyzing instructional techniques employed during reading group instruction is reported, and the characteristics of the effective reading teacher are discussed. Teaching effectiveness is divided into two categories: (1) how the teacher acts and interacts with children on a personal level and (2) how the teacher performs his…

  5. Strategies for Analyzing Tone Languages

    Science.gov (United States)

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  6. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  7. Analysis of Breteau Index thresholds for sporadic and outbreak cases of dengue fever in Guangzhou

    Institute of Scientific and Technical Information of China (English)

    李晓宁; 罗雷; 肖新才; 景钦隆; 魏跃红; 李意兰; 曹庆; 杨智聪; 许雅

    2014-01-01

    Objective: To determine the thresholds of the Breteau Index (BI), a mosquito vector density indicator, for sporadic cases and outbreaks of dengue fever in Guangzhou. Methods: Taking Guangzhou as the study area, dengue cases reported in the national infectious disease surveillance and management system from 2006 to 2012 were analyzed descriptively, and receiver operating characteristic (ROC) curves were used to determine the BI thresholds for sporadic occurrence and outbreaks of dengue within a street district. Results: A total of 1,038 local cases were reported in Guangzhou from 2006 to 2012, with 71 dengue outbreaks and 259 sporadic episodes. ROC analysis showed that the Youden index for predicting sporadic cases was highest (1.469) at BI = 6.4, with a sensitivity of 67.8% and a specificity of 79.1%, and that the Youden index for predicting outbreaks was highest (1.726) at BI = 9.5, with a sensitivity of 81.7% and a specificity of 90.9%. Conclusion: The BI thresholds for predicting sporadic cases and outbreaks of dengue can be set at 5.0 and 9.5, respectively; in practice, the thresholds should be adjusted according to the purpose of surveillance and the available manpower and resources, in order to obtain more reasonable sensitivity and specificity.
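
    For illustration, choosing a BI cut-off by maximizing Youden's J = sensitivity + specificity - 1 (ranking thresholds by J is equivalent to ranking by sensitivity plus specificity, the index quoted above) can be sketched as follows; the BI values are synthetic stand-ins for the Guangzhou surveillance data.

      import numpy as np

      rng = np.random.default_rng(2)
      bi_outbreak = rng.gamma(6.0, 2.5, size=71)    # streets with outbreaks
      bi_sporadic = rng.gamma(2.0, 2.0, size=259)   # streets with sporadic cases

      best_t, best_j = None, -1.0
      for t in np.linspace(0.0, 30.0, 301):
          sens = np.mean(bi_outbreak >= t)          # outbreaks flagged
          spec = np.mean(bi_sporadic < t)           # sporadic streets not flagged
          if sens + spec - 1.0 > best_j:
              best_t, best_j = t, sens + spec - 1.0
      print(f"best BI cut-off ~ {best_t:.1f} (Youden J = {best_j:.2f})")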

  8. THE DEMOGRAPHIC BEHAVIOUR OF SMALL TOWNS IN ROMANIA IN THE POST-COMMUNIST PERIOD ANALYZED THROUGH THE DYNAMICS OF THE POPULATION. CASE STUDY: THE SMALL TOWNS IN THE BIHOR COUNTY, ROMANIA

    Directory of Open Access Journals (Sweden)

    Marcu STASAC

    2016-11-01

    The political, economic and social changes that Romania faced after December 1989 have strongly influenced the evolution of demographic phenomena. The decrease of the birth rate, the increase of the death rate and pronounced negative external migration have led to important changes in the demographic structure of the towns. Although these changes are characteristic of all towns, the demographic decline registers its most significant values in small towns, particularly the mono-industrial ones set up during the communist period in Romania, with severe economic and social consequences. In this context, working out a strategy in the field of population from the perspective of the long-term development of small towns proves to be compulsory. Without such a strategy, the imbalances shall deepen, and demographic aging shall trigger a stronger and stronger pressure upon the active population, with negative effects upon the social systems. The purpose of the study is to outline the dimension of the demographic decline and of the demographic hazards with severe negative consequences, both economic and social, for the small towns.

  9. Introduction: why analyze single cells?

    Science.gov (United States)

    Di Carlo, Dino; Tse, Henry Tat Kwong; Gossett, Daniel R

    2012-01-01

    Powerful methods in molecular biology are abundant; however, in many fields including hematology, stem cell biology, tissue engineering, and cancer biology, data from tools and assays that analyze the average signals from many cells may not yield the desired result, because the cells of interest may be in the minority, their behavior masked by the majority, or because the dynamics of the populations of interest are offset in time. Accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. In this chapter, we discuss the rationale for performing analyses on individual cells in more depth, cover the fields of study in which single-cell behavior is yielding new insights into biological and clinical questions, and speculate on how single-cell analysis will be critical in the future.

  10. The Statistical Loop Analyzer (SLA)

    Science.gov (United States)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements, the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop-out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.
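
    One statistic named above, the cycle slip count, can be illustrated with a toy computation that treats a slip as a full-cycle jump in the loop's phase error; the trace is synthetic and the rule is a simplification, not the SLA's actual algorithm.

      import numpy as np

      rng = np.random.default_rng(3)
      phase = rng.normal(0.0, 0.3, size=10_000)   # rad, tracking jitter
      phase[4_000:] += 2 * np.pi                  # inject a positive cycle slip
      phase[8_000:] -= 2 * np.pi                  # and a negative one

      cycles = np.round(phase / (2 * np.pi))      # nearest whole cycle per sample
      slips = np.count_nonzero(np.diff(cycles))
      print("cycle slips detected:", slips)       # prints 2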

  11. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa; [Ukendt], editors

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions, such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  13. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSN) have in recent years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to building a solid substrate of connections and relationships among people using the Web. In this preliminary work, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  14. Analyzing viewpoint diversity in twitter

    OpenAIRE

    2013-01-01

    Information diversity has a long tradition in human history. Recently there have been claims that diversity is diminishing in the information available in social networks. On the other hand, some studies suggest that diversity is actually quite high in social networks such as Twitter. However, these studies focus only on the concept of source diversity, and only on American users. In this paper we analyze different dimensions of diversity. We also provide an experimental design in which ...

  15. Remote Laser Diffraction PSD Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  16. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R [Brookhaven National Lab. (BNL), Upton, NY (United States)]

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on the absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength: SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabVIEW that both controls the Model 43i-TLE analyzer and queries the unit for all measurement and housekeeping data. The LabVIEW vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.

  17. Analyzing ion distributions around DNA.

    Science.gov (United States)

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions, and it also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring reference to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation.
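
    A simplified version of the density bookkeeping (using a plain cylindrical radius around a straight z-aligned axis, and so ignoring the curvilinear helicoidal coordinates that are the paper's actual contribution) shows how pooled MD ion counts become a density in molarity; every number below is invented.

      import numpy as np

      AVOGADRO = 6.02214076e23
      A3_TO_L = 1.0e-27                     # cubic angstroms to liters

      def radial_molarity(xy, z_len, n_frames, r_max=15.0, n_bins=30):
          """xy: (n, 2) ion positions pooled over frames, in angstroms."""
          r = np.hypot(xy[:, 0], xy[:, 1])
          edges = np.linspace(0.0, r_max, n_bins + 1)
          counts, _ = np.histogram(r, bins=edges)
          shell_vol = np.pi * (edges[1:]**2 - edges[:-1]**2) * z_len  # A^3
          return edges, counts / n_frames / (AVOGADRO * shell_vol * A3_TO_L)

      rng = np.random.default_rng(4)
      ions = rng.uniform(-15.0, 15.0, size=(5000, 2))   # fake pooled positions
      edges, molarity = radial_molarity(ions, z_len=34.0, n_frames=1000)
      print(molarity[:5])                   # radial ion density, mol/L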

  18. A method to analyze benefits in supply chains: a case study in the Brazilian dairy sector

    Directory of Open Access Journals (Sweden)

    Fernando Cezar Leandro Scramim

    2004-12-01

    This paper proposes an analytical method, based on a framework that integrates the concepts of supply chain management and cost management systems, to study and restructure Brazilian agricultural supply chains. Using systemic reasoning and the system dynamics (SD) approach, the proposed method was applied in a case study of a dairy cooperative in the interior of the State of São Paulo, to quantitatively analyze the performance of organizational and technical configurations of the sector's economic agents. A network of companies in Brazil's dairy supply chain was defined in terms of an SD model, which indicated that the impact of actions on the agents' production costs can be forecast before such actions are taken, conferring greater consistency on the study. By constructing alternative scenarios and systematically controlling the variables under study, the causal relationships in the system could be examined, yielding a deeper understanding of the system. The network consisted of four representative groups of rural producers and a dairy company. Input data were based on a network

  19. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  20. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
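
    The procedure itself is easy to mimic (the authors demonstrate it in R; the sketch below is an illustrative Python translation using a correlation as the per-split statistic and fixed-effect pooling of Fisher-z values).

      import numpy as np

      rng = np.random.default_rng(5)
      n = 1_000_000
      x = rng.normal(size=n)
      y = 0.3 * x + rng.normal(size=n)              # true r is about 0.29

      zs, ws = [], []
      for idx in np.array_split(np.arange(n), 100): # split
          r = np.corrcoef(x[idx], y[idx])[0, 1]     # analyze
          zs.append(np.arctanh(r))                  # Fisher z transform
          ws.append(len(idx) - 3)                   # inverse-variance weight
      z_pooled = np.average(zs, weights=ws)         # meta-analyze (fixed effect)
      print(f"pooled r = {np.tanh(z_pooled):.3f}")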

  2. Analysis of Ventilator-Associated Pneumonia

    Directory of Open Access Journals (Sweden)

    Aysel Sunnetcioglu

    2014-03-01

    Aim: Ventilator-associated pneumonia (VAP) is an infection that is an important cause of morbidity and mortality in patients under invasive mechanical ventilation (MV) in intensive care units (ICU). This study aimed to identify the causative agents of VAP developing in mechanically ventilated ICU patients, their antibiotic susceptibility, and the associated risk factors. Material and Method: Between January 2009 and March 2013, 79 endotracheally intubated cases followed under mechanical ventilation for at least 48 hours who developed VAP were retrospectively reviewed at the Anesthesiology and Reanimation Intensive Care Unit of the Faculty of Medicine at Yuzuncu Yil University. The cases were evaluated in terms of microorganisms, antibiotic susceptibility, and risk factors. Results: Our VAP rate was calculated as 19.68 per 1000 ventilator days. A single microorganism was isolated in 81.1% of the 74 VAP cases in which the causative pathogen could be isolated, while two or more microorganisms were isolated in 18.9% of them. Of the strains, 83 (90.2%) were gram-negative bacteria and 7 (7.6%) were gram-positive bacteria. Acinetobacter spp. (40.2%) was the most commonly isolated gram-negative agent, and methicillin-resistant S. aureus (4.3%) the most common gram-positive agent. The isolated agents were significantly resistant to broad-spectrum antibiotics. Discussion: In patients with high risk factors for the development of VAP, early and appropriate empirical antibiotic treatment should be started, guided by the unit's susceptibility results, given the prevalence and high mortality of multi-drug-resistant microorganisms.

  3. Analysis of fatal accidents in mining: the case of Peruvian mining

    Directory of Open Access Journals (Sweden)

    Renan Collantes Candia

    2009-12-01

    contributed to this industry, showing its competitive position in worldwide mining. This article analyzes fatal accidents in Peruvian mining from the year 2000 up to May 2008. The primary information source was the fatal accident registry available at the Peruvian State Department of Energy and Mines. Various types of accidents were identified, with emphasis on those provoked by rock falls in underground mines. The majority of the victims belonged to contractor companies rendering special services. The results show that underground mining carries larger risks than surface mining.

  4. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$ 3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$ 35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  5. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)]

    2008-07-01

    The current 'COMBUSTIMETRO' technology aims to examine fuel through the performance of an engine, since the role of the fuel is to produce energy for the combustion engine in an amount directly proportional to the quality and type of the fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel intake and fixed ignition point. Its operation is monitored by sensors (a lambda probe, an RPM sensor and a gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  6. Method for analyzing microbial communities

    Science.gov (United States)

    Zhou, Jizhong [Oak Ridge, TN; Wu, Liyou [Oak Ridge, TN

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.

  7. Raman Gas Analyzer (RGA): Natural Gas Measurements.

    Science.gov (United States)

    Petrov, Dmitry V; Matrosov, Ivan I

    2016-06-08

    In the present work, an improved model of our Raman gas analyzer (RGA) for natural gas (NG) is described together with its operating principle. The sensitivity has been improved and the number of measurable gases has been expanded. Results of its trials on a real NG sample are presented for different measurement times, and a comparison of the data obtained with the results of chromatographic analysis demonstrates their good agreement. The time stability of the results obtained using this model is analyzed. It is experimentally established that this RGA can reliably determine, within 100 s, the content of all molecular NG components whose concentration exceeds 0.005%; moreover, the limiting sensitivity for some NG components is 0.002%.

  8. Operating System Performance Analyzer for Embedded Systems

    Directory of Open Access Journals (Sweden)

    Shahzada Khayyam Nisar

    2011-11-01

    An RTOS provides a number of services, such as task management, memory management and resource management, with which an embedded system design builds a program. Choosing the best OS for an embedded system is usually based on the OSs available to the system designers and on their previous knowledge and experience, which can cause a mismatch between the OS and the embedded system. RTOS performance analysis is critical in the design and integration of embedded software to ensure that the application's timing limits are met at runtime. To select an appropriate operating system for an embedded system for a particular application, the OS services must be analyzed. These OS services are characterized by parameters that establish performance metrics; the performance metrics selected here include context switch time, preemption time and interrupt latency. The performance metrics are analyzed to choose the right OS for an embedded system for a particular application.
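
    One of the metrics listed, context-switch overhead, is often estimated with a two-thread ping-pong. The sketch below measures average round-trip hand-off time on a desktop OS; it is a generic illustration, not the paper's measurement method, and it will not reflect RTOS-grade latencies.

      import threading, time

      N = 10_000
      a, b = threading.Event(), threading.Event()

      def pong():
          # Wait for the main thread's signal, then hand control back.
          for _ in range(N):
              a.wait(); a.clear(); b.set()

      t = threading.Thread(target=pong)
      t.start()
      start = time.perf_counter()
      for _ in range(N):
          a.set(); b.wait(); b.clear()
      elapsed = time.perf_counter() - start
      t.join()
      print(f"round trip ~ {elapsed / N * 1e6:.1f} us (two hand-offs each)")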

  9. VOSA: A VO SED Analyzer

    Science.gov (United States)

    Rodrigo, C.; Bayo, A.; Solano, E.

    2017-03-01

    VOSA (VO SED Analyzer, http://svo2.cab.inta-csic.es/theory/vosa) is a public web tool developed by the Spanish Virtual Observatory (http://svo.cab.inta-csic.es/) and designed to help users to (1) build spectral energy distributions (SEDs) combining private photometric measurements with data available in VO services, (2) obtain relevant properties of these objects (distance, extinction, etc.) from VO catalogs, (3) analyze them by comparing the observed photometry with synthetic photometry from different collections of theoretical models or observational templates, using different techniques (chi-square minimization, Bayesian analysis) to estimate physical parameters of the observed objects (teff, logg, metallicity, stellar radius/distance ratio, infrared excess, etc.), and (4) use these results to estimate masses and ages via interpolation of collections of isochrones and evolutionary tracks from the VO. In particular, VOSA offers the advantage of deriving physical parameters using all the available photometric information instead of a restricted subset of colors. The results can be downloaded in different formats or sent to other VO tools using SAMP. We have upgraded VOSA to provide access to Gaia photometry and give a homogeneous estimation of the physical parameters of thousands of objects at a time. This upgrade has required the implementation of a new computation paradigm, including a distributed environment, the capability of submitting and processing jobs in an asynchronous way, the use of parallelized computing to speed up processes (about ten times faster) and a new design of the web interface.
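
    The chi-square minimization step can be shown compactly: for each model in a grid, fit the multiplicative dilution factor (proportional to the squared radius/distance ratio) analytically and keep the model with the lowest chi-square. The photometry and the tiny grid below are invented placeholders, not VOSA's VO-served model collections.

      import numpy as np

      obs_flux = np.array([1.2, 2.0, 2.6, 2.1, 1.4])   # observed photometry
      obs_err  = np.array([0.1, 0.1, 0.2, 0.2, 0.1])
      model_grid = {                                    # Teff -> synthetic fluxes
          4500: np.array([0.8, 1.6, 2.4, 2.3, 1.6]),
          5000: np.array([1.1, 2.0, 2.7, 2.0, 1.3]),
          5500: np.array([1.5, 2.4, 2.8, 1.8, 1.0]),
      }

      def chi2(model, flux, err):
          # Scale minimizing chi-square, from setting d(chi2)/d(scale) = 0.
          scale = np.sum(flux * model / err**2) / np.sum(model**2 / err**2)
          return np.sum(((flux - scale * model) / err) ** 2)

      best = min(model_grid, key=lambda teff: chi2(model_grid[teff], obs_flux, obs_err))
      print("best-fit Teff:", best)                     # 5000 for these numbers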

  10. Thermal and evolved gas analyzer

    Science.gov (United States)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

    The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona, and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January 1999. The TEGA project started in February 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and a flight model, and tested. The instrument performs laboratory-quality differential scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and the carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor, together with their isotopic variations, are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.
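
    As a toy example of the calorimetric reduction this kind of instrument supports (not TEGA's flight algorithm): integrating the baseline-subtracted differential heat flow over a transition gives the transition enthalpy.

      import numpy as np

      t = np.linspace(0.0, 600.0, 6001)                       # s
      heat_flow = 0.002 * np.ones_like(t)                     # W, instrument baseline
      heat_flow += 0.05 * np.exp(-((t - 300.0) / 20.0) ** 2)  # endothermic peak

      baseline = np.median(heat_flow)                  # crude baseline estimate
      enthalpy_J = np.trapz(heat_flow - baseline, t)   # ~1.77 J absorbed in the peak
      print(f"transition enthalpy ~ {enthalpy_J:.2f} J")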

  11. Analyzing PICL trace data with MEDEA

    Energy Technology Data Exchange (ETDEWEB)

    Merlo, A.P. [Pavia Univ. (Italy). Dipt di Informatica e Sistemistica]; Worley, P.H. [Oak Ridge National Lab., TN (United States)]

    1993-11-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic characteristics of the performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  12. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    Full Text Available This paper examines the importance of secure structures in the process of analyzing and distributing information with aid of Grid-based technologies. The advent of distributed network has provided many practical opportunities for detecting and recording the time of events, and made efforts to identify the events and solve problems of storing information such as being up-to-date and documented. In this regard, the data distribution systems in a network environment should be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer to use data and resource of organizations by common processing.

  13. Three Practical Methods for Analyzing Slope Stability

    Institute of Scientific and Technical Information of China (English)

    XU Shiguang; ZHANG Shitao; ZHU Chuanbing; YIN Ying

    2008-01-01

    Since environmental capacity and the supply of arable and habitable land have essentially reached a full balance, slopes are becoming more and more important options for various engineering constructions. Because of the geological complexity of slopes, the design and decision-making of a slope-based engineering project still cannot rely solely on theoretical analysis and numerical calculation, but mainly on the experience of experts. Therefore, it has important practical significance to turn successful experience into mathematical equations. Based upon the abundant typical slope engineering construction cases in Yunnan, Southwestern China, 3 methods for analyzing slope stability have been developed in this paper. First of all, a corresponding analogous mathematical equation for analyzing slope stability has been established through case studies. Then, artificial neural network and multivariate regression analysis models have also been set up, in which 7 main influencing factors are adopted.

  14. Coaxial charged particle energy analyzer

    Science.gov (United States)

    Kelly, Michael A. (Inventor); Bryson, III, Charles E. (Inventor); Wu, Warren (Inventor)

    2011-01-01

    A non-dispersive electrostatic energy analyzer for electrons and other charged particles having a generally coaxial structure of sequentially arranged sections: an electrostatic lens to focus the beam through an iris, preferably preceded by an ellipsoidally shaped input grid for collimating a wide acceptance beam from a charged-particle source; an electrostatic high-pass filter including a planar exit grid; and an electrostatic low-pass filter. The low-pass filter is configured to reflect low-energy particles back towards a charged particle detector located within the low-pass filter. Each section comprises multiple tubular or conical electrodes arranged about the central axis. The voltages on the lens are scanned to place a selected energy band of the accepted beam at a selected energy at the iris. Voltages on the high-pass and low-pass filters remain substantially fixed during the scan.

  15. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and resolution beyond what is achievable by the current state of the art in space instruments, without the need for cryogenics. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  16. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

    The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation when underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit behavior similar to that of underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a world-wide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local
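    As a rough illustration of what a frequency-domain ion spectrum analyzer measures, the sketch below estimates the power spectral density of a synthetic probe-current signal over CARLO's stated 1 Hz to 10 kHz band, using SciPy's Welch estimator. The sampling rate and the signal itself are invented for the example; the flight instrument performs this measurement in hardware.

```python
import numpy as np
from scipy.signal import welch

fs = 32_000                              # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic ion-current signal: broadband noise plus a 400 Hz component
x = rng.standard_normal(t.size) + 0.5 * np.sin(2 * np.pi * 400 * t)

f, psd = welch(x, fs=fs, nperseg=4096)   # averaged periodogram (Welch PSD)
band = (f >= 1.0) & (f <= 10_000.0)      # CARLO's stated frequency range
print(f[band][np.argmax(psd[band])])     # dominant frequency, ~400 Hz
```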

  17. Analyzing Agricultural Agglomeration in China

    Directory of Open Access Journals (Sweden)

    Erling Li

    2017-02-01

    Full Text Available There has been little scholarly research on Chinese agriculture’s geographic pattern of agglomeration and its evolutionary mechanisms, which are essential to sustainable development in China. By calculating the barycenter coordinates, the Gini coefficient, spatial autocorrelation and specialization indices for 11 crops during 1981–2012, we analyze the evolutionary pattern and mechanisms of agricultural agglomeration. We argue that the degree of spatial concentration of Chinese planting has been gradually increasing and that regional specialization and diversification have progressively been strengthened. Furthermore, Chinese crop production is moving from the eastern provinces to the central and western provinces. This is in contrast to Chinese manufacturing growth, which has continued to be concentrated in the coastal and southeastern regions. In Northeast China, the Sanjiang and Songnen plains have become agricultural clustering regions, and the earlier domination of aquaculture and rice production in Southeast China has gradually decreased. In summary, this paper provides a political economy framework for understanding the regionalization of Chinese agriculture, focusing on the interaction among objectives, decision-making behavior, path dependencies and spatial effects.
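    Of the measures listed, the Gini coefficient is the most compact to state; here is a minimal sketch for one crop's output across provinces (toy numbers, not the paper's data):

```python
import numpy as np

def gini(x):
    """Gini coefficient of non-negative values, e.g. one crop's
    output per province: 0 = perfectly even, 1 = fully concentrated."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    # G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n  (sorted ascending)
    return 2 * np.sum(i * x) / (n * x.sum()) - (n + 1) / n

print(round(gini([10, 12, 15, 40, 90]), 3))   # -> 0.45
```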

  18. Thomson parabola ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cobble, James A [Los Alamos National Laboratory; Flippo, Kirk A [Los Alamos National Laboratory; Letzring, Samuel A [Los Alamos National Laboratory; Lopez, Frank E [Los Alamos National Laboratory; Offermann, Dustin T [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Mastrosimone, Dino [UNIV OF ROCHESTER

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a (5- or 8-kG) magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C6+ and C5+ may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004-inch or 0.010-inch width. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.
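    The parabolic traces that give the instrument its name follow from the small-angle deflections: the electric field displaces an ion in proportion to q/(m v^2) and the magnetic field in proportion to q/(m v), so ions of one charge-to-mass ratio land on one parabola regardless of energy. The sketch below uses a non-relativistic thin-field approximation with invented geometry (the abstract's design code is fully relativistic); only the field strengths echo the quoted 5 kG and 30 kV/cm.

```python
import numpy as np

Q_E = 1.602e-19   # elementary charge, C
AMU = 1.661e-27   # atomic mass unit, kg

def deflections(energy_J, q, m, E=3.0e6, B=0.5, L=0.1, D=0.5):
    """Small-angle deflections (m) from parallel E (V/m) and B (T)
    fields of length L followed by a drift D (all lengths in m):
    x ~ q E L D / (m v^2)  (energy axis),  y ~ q B L D / (m v)."""
    v = np.sqrt(2.0 * energy_J / m)          # non-relativistic speed
    x = q * E * L * D / (m * v ** 2)
    y = q * B * L * D / (m * v)
    return x, y

m, q = 12 * AMU, 6 * Q_E                     # C6+ ions
for MeV in (5.0, 20.0):                      # two energies, same parabola
    print(MeV, deflections(MeV * 1e6 * Q_E, q, m))
```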

  19. Analyzing and modeling heterogeneous behavior

    Science.gov (United States)

    Lin, Zhiting; Wu, Xiaoqing; He, Dongyue; Zhu, Qiang; Ni, Jixiang

    2016-05-01

    Recently, it was pointed out that non-Poisson statistics with heavy tails exist in many scenarios of human behavior, and most of these studies claimed that power laws characterize diverse aspects of human mobility patterns. In this paper, we suggest that human behavior may not be driven by identical mechanisms and can be modeled as a Semi-Markov Modulated Process. To verify our suggestion and model, we analyzed a total of 1,619,934 records of library visitations (including undergraduate and graduate students). It is found that the distribution of visitation intervals is well fitted by three line sections instead of the traditional power-law distribution in log-log scale. The results confirm that some human behaviors cannot be simply expressed as a power law or any other simple function. At the same time, we divided the data into groups and extracted periodic bursty events. Through careful analysis of the different groups, we drew the conclusion that aggregate behavior may be composed of heterogeneous behaviors, and that even behaviors of the same type tend to differ across periods. The aggregate behavior is supposed to be formed by "heterogeneous groups". We performed a series of experiments, and the simulation results showed that a two-state Semi-Markov Modulated Process suffices to construct a proper representation of heterogeneous behavior.
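    A two-state semi-Markov modulated process of the kind proposed can be sketched directly: a hidden state ("active" or "quiet") sets the event rate, and the dwell time in each state has its own distribution. All parameters below are invented for illustration.

```python
import random

def semi_markov_intervals(n, seed=1):
    """Inter-event intervals from a 2-state semi-Markov modulated
    process: state-dependent event rates, random state dwell times."""
    rng = random.Random(seed)
    state = "active"
    out = []
    while len(out) < n:
        dwell = rng.expovariate(1.0 if state == "active" else 0.2)
        rate = 5.0 if state == "active" else 0.3   # events per unit time
        t = 0.0
        while t < dwell and len(out) < n:
            gap = rng.expovariate(rate)
            t += gap
            out.append(gap)
        state = "quiet" if state == "active" else "active"
    return out

intervals = semi_markov_intervals(10_000)
print(min(intervals), max(intervals))   # spread across several scales
```

    Mixing two or more interval scales in this way is what produces a multi-section log-log plot of the kind the authors observe, without any single power law.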

  20. Analyzing Cases of Resilience Success and Failure - A Research Study

    Science.gov (United States)

    2012-12-01


  1. Analyzing Technology Adoption - The Case of Kerala Home Gardens

    Directory of Open Access Journals (Sweden)

    Reeba Jacob

    2016-05-01

    Full Text Available Homegardens are traditional agroforestry systems with a unique structure and function, and they are the predominant farming system in Kerala. The study was undertaken in Thiruvananthapuram district, covering a sample of 100 homegarden farmers from all five agro-ecological units, with the aim of assessing the level of adoption of selected Kerala Agricultural University (KAU) production practices in homegardens. The results identified that the majority of farmers (63%) showed a medium level of adoption. The adoption quotient was worked out and compared with the standard Rogers curve. Correlation analysis of the independent variables with the dependent variable, viz., level of adoption, indicated that age, farming experience, knowledge, evaluative perception, mass media contribution, livestock possession and extension contribution had a direct significant effect on the level of adoption of KAU production practices by homegarden farmers.

  2. Case studies on analyzing software architectures for usability

    NARCIS (Netherlands)

    Folmer, E; Bosch, J

    2005-01-01

    Studies of software engineering projects reveal that a large number of usability-related change requests are made after deployment. Fixing certain usability problems during the later stages of development has proven to be costly, since some of these changes require changes to the software architecture.

  3. Using Toyota's A3 Thinking for Analyzing MBA Business Cases

    Science.gov (United States)

    Anderson, Joe S.; Morgan, James N.; Williams, Susan K.

    2011-01-01

    A3 Thinking is fundamental to Toyota's benchmark management philosophy and to their lean production system. It is used to solve problems, gain agreement, mentor team members, and lead organizational improvements. A structured problem-solving approach, A3 Thinking builds improvement opportunities through experience. We used "The Toyota…

  4. Modular Construction of Shape-Numeric Analyzers

    Directory of Open Access Journals (Sweden)

    Bor-Yuh Evan Chang

    2013-09-01

    Full Text Available The aim of static analysis is to infer invariants about programs that are precise enough to establish semantic properties, such as the absence of run-time errors. Broadly speaking, there are two major branches of static analysis for imperative programs. Pointer and shape analyses focus on inferring properties of pointers, dynamically-allocated memory, and recursive data structures, while numeric analyses seek to derive invariants on numeric values. Although simultaneous inference of shape-numeric invariants is often needed, this case is especially challenging and is not particularly well explored. Notably, simultaneous shape-numeric inference raises complex issues in the design of the static analyzer itself. In this paper, we study the construction of such shape-numeric static analyzers. We set up an abstract interpretation framework that allows us to reason about simultaneous shape-numeric properties by combining shape and numeric abstractions into a modular, expressive abstract domain. Such a modular structure is highly desirable to make its formalization and implementation easier to do and get correct. To achieve this, we choose a concrete semantics that can be abstracted step-by-step, while preserving a high level of expressiveness. The structure of abstract operations (i.e., transfer, join, and comparison) follows the structure of this semantics. The advantage of this construction is to divide the analyzer into modules and functors that implement abstractions of distinct features.
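    The modular construction can be caricatured in a few lines: a product domain whose lattice operations delegate componentwise to a shape part and a numeric part. The sketch below pairs an interval domain with a toy "may-be-null" shape fact; the real analyzer's domains and their interaction are far richer, and all names here are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:                      # toy numeric abstraction
    lo: float
    hi: float
    def join(self, other):           # least upper bound
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

@dataclass(frozen=True)
class Shape:                         # toy pointer/shape abstraction
    maybe_null: bool
    def join(self, other):
        return Shape(self.maybe_null or other.maybe_null)

@dataclass(frozen=True)
class ShapeNumeric:
    """Product domain: operations delegate to the components,
    mirroring the functor-style combination described in the paper."""
    shape: Shape
    num: Interval
    def join(self, other):
        return ShapeNumeric(self.shape.join(other.shape),
                            self.num.join(other.num))

a = ShapeNumeric(Shape(False), Interval(0, 3))
b = ShapeNumeric(Shape(True), Interval(2, 9))
print(a.join(b))   # null-ness and bounds are joined independently
```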

  5. Objects in Films: analyzing signs

    Directory of Open Access Journals (Sweden)

    GAMBARATO, Renira Rampazzo

    2009-12-01

    Full Text Available The focus of this essay is the analysis of daily objects as signs in films. Objects from everyday life acquire several functions in films: they can be used solely as scene objects or to support a particular film style. Other objects are specially chosen to translate a character's interior state of mind or the filmmaker's aesthetic or ethical commitment to narrative concepts. In order to understand such functions and commitments, we developed a methodology for film analysis which focuses on objects. Object interpretation, as the starting point of film analysis, is not a new approach: the French film critic André Bazin proposed such use of object interpretation in the 1950s, and the German film theorist Siegfried Kracauer stated it in the 1960s. However, there is currently no analytical model for engaging in object interpretation in film. Our methodology searches for the most representative objects in films, which involves both quantitative and qualitative analysis; we consider the number of times each object appears in a film (quantitative analysis) as well as the context of its appearance, i.e. the type of shot used and how that creates greater or lesser relevance and/or expressiveness (qualitative analysis). In addition to the criteria of relevance and expressiveness, we also analyze the functionality of an object by exploring details and specifying the roles various objects play in films. This research was developed at Concordia University, Montreal, Canada, and was supported by Foreign Affairs and International Trade Canada (DFAIT).

  6. Eastern Mediterranean Natural Gas: Analyzing Turkey's Stance

    Directory of Open Access Journals (Sweden)

    Abdullah Tanriverdi

    2016-02-01

    Full Text Available Recent large-scale natural gas discoveries in the East Mediterranean have drawn attention to the region. The discoveries have caused both hope and tension in the region. The new resources may serve as a new hope for all relevant parties, as well as for the region, if managed in a collaborative and conciliatory way. Energy may be a remedy to Cyprus' financial predicament, initiate a process for resolving differences between Turkey and Cyprus, normalize Israel-Turkey relations, and so on. On the contrary, adopting a unilateral and uncooperative approach may aggravate the tension and undermine regional stability and security. In this sense, the role of energy in generating hope or tension depends on the approaches of the related parties. The article analyzes Turkey's attitude in the East Mediterranean case in terms of possible negative and positive implications for Turkey in the energy field, and examines Turkey's position and the reasons behind its stance. Considering Turkey's energy profile and energy policy goals, the article argues that the newly found hydrocarbons may offer Turkey higher stakes if it adopts a cooperative approach in this case.

  7. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  8. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans-JP2, and Aggregatibacter actinomycetemcomitans clinically isolated from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to traditional methods, of fastidious buccal microorganisms associated with the etiology of periodontitis.

  9. Computer model for analyzing sodium cold traps

    Energy Technology Data Exchange (ETDEWEB)

    McPheeters, C C; Raue, D J

    1983-05-01

    A computer model was developed to simulate the processes that occur in sodium cold traps. The Model for Analyzing Sodium Cold Traps (MASCOT) simulates any desired configuration of mesh arrangements and dimensions and calculates pressure drops and flow distributions, temperature profiles, impurity concentration profiles, and impurity mass distributions. The calculated pressure drop as a function of impurity mass content determines the capacity of the cold trap. The accuracy of the model was checked by comparing calculated mass distributions with experimentally determined mass distributions from literature publications and with results from our own cold trap experiments. The comparisons were excellent in all cases. A parametric study was performed to determine which design variables are most important in maximizing cold trap capacity.

  10. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. The Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.

  11. Analyzing delay causes in Egyptian construction projects.

    Science.gov (United States)

    Marzouk, Mohamed M; El-Rasas, Tarek I

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. The Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
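    The index computation used in both records above can be sketched directly. A common convention (an assumption here; the study may normalize slightly differently) scales the averaged frequency and severity ratings to [0, 1] and takes their product as the importance index; the ratings below are invented.

```python
import numpy as np

def delay_indices(freq_ratings, sev_ratings, scale=4):
    """Frequency, Severity and Importance indices for one delay cause,
    from per-respondent ratings on a 1..scale scale."""
    fi = np.mean(freq_ratings) / scale
    si = np.mean(sev_ratings) / scale
    return fi, si, fi * si          # Importance Index = FI * SI

# Hypothetical ratings from 33 experts for one delay cause
rng = np.random.default_rng(7)
freq = rng.integers(1, 5, size=33)   # integers 1..4
sev = rng.integers(1, 5, size=33)
print(delay_indices(freq, sev))
```

    Ranking all causes by the importance index and reading off the top ten reproduces the kind of table the study reports.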

  12. Clinical value of systematic ultrasound screening for fetal abnormality in the second trimester and a preliminary analysis of cases missed by prenatal ultrasound screening

    Institute of Scientific and Technical Information of China (English)

    荆春丽; 孙寒冰; 沙恩波; 王彦; 郭邑

    2011-01-01

    Objective To evaluate the clinical value of systematic ultrasound screening for fetal abnormality in the second trimester, and to analyze the causes of missed diagnoses and medical malpractice. Methods 12 682 pregnant women at 20-24 weeks of gestation were examined by systematic ultrasound screening; the information was stored in the workstation and then evaluated, followed up and analyzed statistically. Results 616 cases were diagnosed with structural or fetal abnormality, a detection rate of 4.86% (616/12 682), and 18 cases with structural abnormality were missed, a miss rate of 0.14% (18/12 682), as confirmed after birth. The missed cases included 2 cases of hand abnormality, 1 case of first-degree cleft lip, 6 cases of ventricular septal defect (≤4 mm), 1 case of tetralogy of Fallot, 1 case of congenital defect of the external ear, 1 case of genital abnormality, 2 cases of isolated cleft palate, 1 case of strephenopodia, 1 case of caudal regression syndrome, 1 case of sacral spina bifida with tethered cord fat, and 1 case of sacrococcygeal cystic spina bifida. Three of these cases (the caudal regression syndrome, the sacral spina bifida with tethered cord fat, and the sacrococcygeal cystic spina bifida) led to medical malpractice claims after birth. Of the 616 cases with fetal abnormality, 239 underwent abortion in our hospital: (1) necropsy in 23 of these cases confirmed the ultrasound diagnosis; (2) the external appearance of the abnormality in 142 cases (43 with cleft lip) confirmed the ultrasound diagnosis; (3) 61 cases had been diagnosed with fetal visceral abnormality without necropsy, 13 of which showed polyhydramnios or oligohydramnios. The remaining 377 cases underwent abortion elsewhere; telephone follow-up by the hospital likewise confirmed our diagnoses. Conclusion Systematic ultrasound screening can improve the detection rate

  13. An "ESA-affordable" Laue-lens

    DEFF Research Database (Denmark)

    Lund, Niels

    2005-01-01

    With ESA's INTEGRAL mission gamma-ray astronomy has advanced to the point where major scientific advances must be expected from detailed studies of the many new point sources. The interest in developing focusing telescopes operating in the soft gamma-ray regime up to 1 MeV is therefore mounting...... constraints of a specific medium size launch vehicle. The introduction of the lens mass as a primary design driver has some surprising effects for the choice of material for the crystals and new tradeoff considerations are introduced....

  14. Stochastic Particle Real Time Analyzer (SPARTA) Validation and Verification Suite

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michael A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Fluid Science and Engineering Dept.; Koehler, Timothy P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Fluid Science and Engineering Dept.; Plimpton, Steven J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Multi Scale Science Dept.

    2014-10-01

    This report presents the test cases used to verify, validate and demonstrate the features and capabilities of the first release of the 3D Direct Simulation Monte Carlo (DSMC) code SPARTA (Stochastic Real Time Particle Analyzer). The test cases included in this report exercise the most critical capabilities of the code like the accurate representation of physical phenomena (molecular advection and collisions, energy conservation, etc.) and implementation of numerical methods (grid adaptation, load balancing, etc.). Several test cases of simple flow examples are shown to demonstrate that the code can reproduce phenomena predicted by analytical solutions and theory. A number of additional test cases are presented to illustrate the ability of SPARTA to model flow around complicated shapes. In these cases, the results are compared to other well-established codes or theoretical predictions. This compilation of test cases is not exhaustive, and it is anticipated that more cases will be added in the future.

  16. Analysis of risk factors in 639 cases of drunk-driving traffic accidents in Sanmenxia City, 2012-2013

    Institute of Scientific and Technical Information of China (English)

    李向阳; 梁丽娜; 范洪庚; 王秀娟

    2015-01-01

    Objective: To analyze the risk factors related to drunk-driving traffic accidents in Sanmenxia City, in order to provide a scientific basis for the prevention and control of drunk driving. Methods: For 639 drivers involved in drunk-driving traffic accidents, gender, age, time of accident, type of vehicle driven, accident form and blood alcohol content were analyzed statistically. Results: Of the 639 cases, 627 (98.12%) involved male drivers, and 508 (79.50%) involved drivers aged 20 to 49. A total of 304 accidents (47.57%) occurred between 18:00 and 0:00. Motorcycles were driven in 314 cases (49.14%) and small cars in 243 cases (38.03%). Conclusion: Drunk driving mainly involves young and middle-aged men and is concentrated after dinner time; motorcycle and small-car drivers are the main drunk-driving groups, and the prevention and control of drunk driving should be strengthened.

  17. Analyzing modified unimodular gravity via Lagrange multipliers

    Science.gov (United States)

    Sáez-Gómez, Diego

    2016-06-01

    The so-called unimodular version of general relativity is revisited. Unimodular gravity is constructed by fixing the determinant of the metric, which leads to the trace-free part of the equations instead of the usual Einstein field equations. Then a cosmological constant naturally arises as an integration constant. While unimodular gravity turns out to be equivalent to general relativity (GR) at the classical level, it provides important differences at the quantum level. Here we extend the unimodular constraint to some extensions of general relativity that have drawn a lot of attention over the last years: f(R) gravity (or its scalar-tensor picture) and Gauss-Bonnet gravity. The corresponding unimodular version of such theories is constructed, as well as the conformal transformation that relates the Einstein and Jordan frames for these nonminimally coupled theories. From the classical point of view, the unimodular versions of such extensions are completely equivalent to their originals, but an effective cosmological constant arises naturally, which may provide a richer description of the evolution of the Universe. Here we analyze the case of Starobinsky inflation and compare it with the original one.

  18. Analysis of the false positive rate of serum anti-Treponema pallidum specific antibody in 26 707 cases detected by a chemiluminescence method

    Institute of Scientific and Technical Information of China (English)

    张保平; 刘珊; 韩艳秋

    2015-01-01

    Objective To detect serum anti-Treponema pallidum specific antibody in 26 707 cases with the Abbott I2000SR automatic chemiluminescent microparticle immunoassay analyzer, using the Treponema pallidum particle agglutination assay (TPPA) as the standard reference method, and to analyze the false positive rate of the Abbott I2000SR against TPPA. Methods 26 707 serum samples were collected from inpatients and outpatients of the hospital from September 1, 2013 to March 5, 2014. Subjects had 3 ml of venous blood taken under fasting conditions; after centrifugation at 3 000 r/min for 10 minutes, the separated serum was tested for anti-TP by CMIA (Abbott I2000SR) and by TPPA, and the results were analyzed statistically. Results Among the 26 707 cases, 52 cases detected by the I2000SR had S/CO values of 1 to 2, of which 9 were verified positive by TPPA, a positive rate of 17.31%; 26 cases had S/CO values of 2 to 3, of which 9 were verified positive by TPPA, a positive rate of 34.62%; 26 cases had S/CO values of 3 to 5, of which 9 were verified positive by TPPA, a positive rate of 34.62%; 25 cases had S/CO values of 5 to 7, of which 11 were verified positive by TPPA, a positive rate of 44%; 25 cases had S/CO values of 7 to 10, of which 17 were verified positive by TPPA, a positive rate of 68%; 28 cases had S/CO values of 10 to 13, of which 24 were verified positive by TPPA, a positive rate of 85.71%. There were 23 cases detected

  19. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be: capable of both...

  20. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.

  1. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
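    The decision rule at the heart of both records above is compact: sum the per-score log likelihood ratios and compare against a threshold tuned to the FAR constraint. The Gaussian, independent-score model below is a simplifying assumption for illustration; the paper fits a joint distribution over all 12 scores.

```python
import math

def log_likelihood_ratio(scores, genuine, imposter):
    """Sum of per-score log LRs under independent Gaussian models.
    genuine / imposter: per-score (mean, std) pairs."""
    def logpdf(x, mu, sd):
        return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
    return sum(logpdf(s, *g) - logpdf(s, *i)
               for s, g, i in zip(scores, genuine, imposter))

genuine = [(0.8, 0.10)] * 3      # hypothetical per-score models
imposter = [(0.3, 0.15)] * 3
threshold = 0.0                  # would be tuned to meet the FAR target
print(log_likelihood_ratio([0.75, 0.70, 0.90], genuine, imposter) > threshold)
```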

  2. A Framework for Analyzing Reader-Text Interactions.

    Science.gov (United States)

    MacLean, Margaret

    1986-01-01

    This article presents a framework for analyzing verbal report data, attempting to integrate Fillmore's levels of envisionment and Galda's differentiation between reader and text-centered responses. It examines the extent to which readers rely on text based, reader-based, or interactive processes to understand text. Case study data is presented.…

  3. Analyzing Literacy Practice: Grounded Theory to Model

    Science.gov (United States)

    Purcell-Gates, Victoria; Perry, Kristen H.; Briseno, Adriana

    2011-01-01

    In this methodological and theoretical article, we address the need for more cross-case work on studies of literacy in use within different social and cultural contexts. The Cultural Practices of Literacy Study (CPLS) project has been working on a methodology for cross-case analyses that are principled in that the qualitative nature of each case,…

  4. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents the fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key features: numerous practical examples, including actual spectrum analyzer circuits; instruction on how to use

  5. Properties of grain boundary networks in the NEEM ice core analyzed by combined transmission and reflection optical microscopy

    Science.gov (United States)

    Binder, Tobias; Weikusat, Ilka; Garbe, Christoph; Svensson, Anders; Kipfstuhl, Sepp

    2014-05-01

    Microstructure analysis of ice cores is vital to understand the processes controlling the flow of ice on the microscale. To quantify the microstructural variability (and thus occurring processes) on centimeter, meter and kilometer scale along deep polar ice cores, a large number of sections has to be analyzed. In the last decade, two different methods have been applied: On the one hand, transmission optical microscopy of thin sections between crossed polarizers yields information on the distribution of crystal c-axes. On the other hand, reflection optical microscopy of polished and controlled sublimated section surfaces allows to characterize the high resolution properties of a single grain boundary, e.g. its length, shape or curvature (further developed by [1]). Along the entire NEEM ice core (North-West Greenland, 2537 m length) drilled in 2008-2011 we applied both methods to the same set of vertical sections. The data set comprises series of six consecutive 6 x 9 cm2 sections in steps of 20 m - in total about 800 images. A dedicated method for automatic processing and matching both image types has recently been developed [2]. The high resolution properties of the grain boundary network are analyzed. Furthermore, the automatic assignment of c-axis misorientations to visible sublimation grooves enables us to quantify the degree of similarity between the microstructure revealed by both analysis techniques. The reliability to extract grain boundaries from both image types as well as the appearance of sublimation groove patterns exhibiting low misorientations is investigated. X-ray Laue diffraction measurements (yielding full crystallographic orientation) have validated the sensitivity of the surface sublimation method for sub-grain boundaries [3]. We introduce an approach for automatic extraction of sub-grain structures from sublimation grooves. A systematic analysis of sub-grain boundary densities indicates a possible influence of high impurity contents (amongst

  6. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA) has many resources, Arabic dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology). ADAM is a poor man's solution for quickly developing morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  7. Designing of Acousto-optic Spectrum Analyzer

    Institute of Scientific and Technical Information of China (English)

    WANG Dan-zhi; SHAO Ding-rong; LI Shu-jian

    2004-01-01

    The structure of the acousto-optic spectrum analyzer, including the RF amplifying circuit, the optical structures and the post-processing circuit, was investigated, and a modular design approach was applied to the spectrum analyzer. The modular spectrum analyzer offers stable performance and higher reliability, and different modules can be used according to different demands. The spectrum analyzer achieves a detection frequency error of 0.58 MHz, a detection responsivity of 90 dBm, and a bandwidth of 50 MHz.

  8. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  9. Analyzing metabolomics-based challenge tests

    NARCIS (Netherlands)

    Vis, D.J.; Westerhuis, J.A.; Jacobs, D.M.; Duynhoven, van J.P.M.; Wopereis, S.; Ommen, van B.; Hendriks, M.M.W.B.; Smilde, A.K.

    2015-01-01

    Challenge tests are used to assess the resilience of human beings to perturbations by analyzing responses to detect functional abnormalities. Well-known examples are allergy tests and glucose tolerance tests. Increasingly, metabolomics analysis of blood or serum samples is used to analyze the biological response.

  10. Performance evaluation of PL-11 platelet analyzer

    Institute of Scientific and Technical Information of China (English)

    张有涛

    2013-01-01

    Objective To evaluate and report the performance of the PL-11 platelet analyzer. Methods Intravenous blood samples anticoagulated with EDTA-K2 and sodium citrate were tested by the PL-11 platelet analyzer to evaluate the intra-assay and inter-assay coefficient of variation (CV),

  11. Harmonic analysis utilizing a Phonodeik and an Henrici analyzer

    Science.gov (United States)

    Fickinger, William J.; Hanson, Roger J.; Hoekje, Peter L.

    2004-05-01

    Dayton C. Miller of the Case School of Applied Science assembled a series of instruments for accurate analysis of sound [D. C. Miller, J. Franklin Inst. 182, 285-322 (1916)]. He created the Phonodeik to display and record sound waveforms of musical instruments, voices, fog horns, and so on. Waveforms were analyzed with the Henrici harmonic analyzer, built in Switzerland by G. Coradi. In this device, the motion of a stylus along the curve to be analyzed causes a series of spheres to rotate; two moveable rollers in contact with the nth sphere record the contributions of the sine(nx) and cosine(nx) components of the wave. Corrections for the measured spectra are calculated from analysis of the response of the Phonodeik. Finally, the original waveform could be reconstructed from the corrected spectral amplitudes and phases by a waveform synthesizer, also built at Case. Videos will be presented that show the motion of the gears, spheres, and dials of a working Henrici analyzer, housed at the Department of Speech Pathology and Audiology at the University of Iowa. Operation of the Henrici analyzer and the waveform synthesizer will be explained.
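    What the Henrici device accumulated mechanically for each harmonic is just the pair of Fourier coefficients; a numeric sketch on one uniformly sampled period of a waveform:

```python
import numpy as np

def harmonic_coefficients(samples, n_harmonics):
    """a_n, b_n of one uniformly sampled period: the cosine and sine
    contributions the analyzer's rollers recorded for each sphere."""
    N = len(samples)
    t = 2 * np.pi * np.arange(N) / N
    return [(2 / N * np.sum(samples * np.cos(n * t)),
             2 / N * np.sum(samples * np.sin(n * t)))
            for n in range(1, n_harmonics + 1)]

t = 2 * np.pi * np.arange(1024) / 1024
wave = np.sin(t) + 0.3 * np.cos(3 * t)           # known 1st and 3rd harmonics
for n, (a, b) in enumerate(harmonic_coefficients(wave, 4), start=1):
    print(n, round(a, 3), round(b, 3))           # recovers 0.3 at n = 3
```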

  12. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  13. ANALYZING OF MULTICOMPONENT UNDERSAMPLED SIGNALS BY HAF

    Institute of Scientific and Technical Information of China (English)

    Tao Ran; Shan Tao; Zhou Siyong; Wang Yue

    2001-01-01

    The phenomenon of frequency ambiguity may appear in radar or communication systems. S. Barbarossa (1991) unwrapped the frequency ambiguity of single-component undersampled signals by the Wigner-Ville distribution (WVD), but until now there has been no effective algorithm for analyzing multicomponent undersampled signals. A new algorithm to analyze multicomponent undersampled signals by the high-order ambiguity function (HAF) is proposed here. HAF analyzes polynomial phase signals by the method of phase rank reduction; its advantage is that it has no boundary effect and is not sensitive to the cross-terms of multicomponent signals. The simulation results prove the effectiveness of the HAF algorithm.
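    The core HAF operation is easy to state: each conjugate-lag product s(t)·s*(t−τ) lowers the polynomial phase order by one, so after enough products a single tone remains whose frequency encodes the highest-order phase coefficient. A sketch for an order-2 (linear FM) signal, with invented parameters:

```python
import numpy as np

def haf(signal, lag, order):
    """High-order ambiguity function via repeated conjugate-lag
    products, followed by an FFT of the demodulated result."""
    s = signal
    for _ in range(order - 1):
        s = s[lag:] * np.conj(s[:-lag])
    return np.fft.fftshift(np.fft.fft(s, 4096))

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
s = np.exp(1j * 2 * np.pi * (50 * t + 100 * t**2))   # a1 = 50, a2 = 100
spec = np.abs(haf(s, lag=64, order=2))
freqs = np.fft.fftshift(np.fft.fftfreq(4096, 1 / fs))
# Expected peak at 2 * a2 * (lag / fs) = 2 * 100 * 0.064 = 12.8 Hz
print(freqs[np.argmax(spec)])
```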

  14. On-Demand Urine Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  15. A method for analyzing strategic product launch

    OpenAIRE

    XIAO Junji

    2007-01-01

    This paper proposes a method to analyze how the manufacturers make product launch decisions in a multi-product oligopoly market, and how the heterogeneity in their products affects the manufacturers' decisions on model launch and withdrawal.

  16. Analyzing and Interpreting Research in Health Education ...

    African Journals Online (AJOL)

    While qualitative research is used when little or nothing is known about the subject, ... and/or grounded theoretical approaches that are analyzable by comparison, ... While qualitative research is interpreted by inductive reasoning, quantitative ...

  17. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  18. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  19. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  20. The Information Flow Analyzing Based on CPC

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhang; LI Hui

    2005-01-01

    An information flow chart covering the product life cycle is given based on collaborative production commerce (CPC) concepts. In this chart, the separate information systems are integrated by means of enterprise knowledge assets that CPC promotes from production knowledge. The information flow in the R&D process is analyzed in the environment of a virtual R&D group and distributed PDM. In addition, the information flow throughout the manufacturing and marketing process is analyzed in the CPC environment.

  1. QUBIT DATA STRUCTURES FOR ANALYZING COMPUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Vladimir Hahanov

    2014-11-01

    Full Text Available Qubit models and methods for improving the performance of software and hardware for analyzing digital devices, through increasing the dimension of the data structures and memory, are proposed. The basic concepts, terminology and definitions necessary for implementing quantum computing when analyzing virtual computers are introduced. Investigation results concerning the design and modeling of computer systems in cyberspace based on the use of a two-component structure are presented.

  2. Analyte comparisons between 2 clinical chemistry analyzers.

    OpenAIRE

    Sutton, A; Dawson, H; Hoff, B; Grift, E; Shoukri, M

    1999-01-01

    The purpose of this study was to assess agreement between a wet reagent and a dry reagent analyzer. Thirteen analytes (albumin, globulin, alkaline phosphatase, alanine aminotransferase, amylase, urea nitrogen, calcium, cholesterol, creatinine, glucose, potassium, total bilirubin, and total protein) for both canine and feline serum were evaluated. Concordance correlations, linear regression, and plots of difference against mean were used to analyze the data. Concordance correlations were excel...

  3. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including...... behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite....

  4. Analyzing IT Service Delivery in an ISP from Nicaragua

    Science.gov (United States)

    Flores, Johnny; Rusu, Lazar; Johanneson, Paul

    This paper presents a method for analyzing IT service delivery and its application in an Internet Service Provider (ISP). The proposed method is based on ITIL processes and the case study technique; it includes questionnaires, semi-structured interviews, focus groups and documents as sources of factual information. Applying the method allows the ISP to determine its practices and the limitations of its IT service delivery.

  5. Analyzing visual signals as visual scenes.

    Science.gov (United States)

    Allen, William L; Higham, James P

    2013-07-01

    The study of visual signal design is gaining momentum as techniques for studying signals become more sophisticated and more freely available. In this paper we discuss methods for analyzing the color and form of visual signals, for integrating signal components into visual scenes, and for producing visual signal stimuli for use in psychophysical experiments. Our recommended methods aim to be rigorous, detailed, quantitative, objective, and where possible based on the perceptual representation of the intended signal receiver(s). As methods for analyzing signal color and luminance have been outlined in previous publications we focus on analyzing form information by discussing how statistical shape analysis (SSA) methods can be used to analyze signal shape, and spatial filtering to analyze repetitive patterns. We also suggest the use of vector-based approaches for integrating multiple signal components. In our opinion elliptical Fourier analysis (EFA) is the most promising technique for shape quantification but we await the results of empirical comparison of techniques and the development of new shape analysis methods based on the cognitive and perceptual representations of receivers. Our manuscript should serve as an introductory guide to those interested in measuring visual signals, and while our examples focus on primate signals, the methods are applicable to quantifying visual signals in most taxa.

  6. A resource-efficient adaptive Fourier analyzer

    Science.gov (United States)

    Hajdu, C. F.; Zamantzas, C.; Dabóczi, T.

    2016-10-01

    We present a resource-efficient frequency adaptation method to complement the Fourier analyzer proposed by Péceli. The novel frequency adaptation scheme is based on the adaptive Fourier analyzer suggested by Nagy. The frequency adaptation method was elaborated with a view to realizing a detector connectivity check on an FPGA in a new beam loss monitoring (BLM) system, currently being developed for beam setup and machine protection of the particle accelerators at the European Organisation for Nuclear Research (CERN). The paper summarizes the Fourier analyzer to the extent relevant to this work and the basic principle of the related frequency adaptation methods. It then outlines the suggested new scheme, presents practical considerations for implementing it and underpins it with an example and the corresponding operational experience.
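
    The sketch below is only loosely inspired by such schemes and is not the Peceli/Nagy algorithm itself: a single-bin Fourier analyzer re-estimates its analysis frequency from the residual phase advance between successive windows, PLL-style. The window length and loop gain are arbitrary choices.

        import numpy as np

        def adaptive_bin(x, fs, f0, n=128, gain=0.3):
            T = n / fs
            tau = np.arange(n) / fs              # local time base per window
            f, pred = f0, None
            for k in range(len(x) // n):
                seg = x[k * n:(k + 1) * n]
                # One Fourier coefficient at the current frequency estimate.
                c = np.mean(seg * np.exp(-2j * np.pi * f * tau))
                theta = np.angle(c)
                if pred is not None:
                    # Residual phase advance reveals the frequency error.
                    err = (theta - pred + np.pi) % (2 * np.pi) - np.pi
                    f += gain * err / (2 * np.pi * T)
                pred = theta + 2 * np.pi * f * T  # expected phase next window
            return f

        fs = 1000.0
        x = np.sin(2 * np.pi * 52.3 * np.arange(int(5 * fs)) / fs)
        print(adaptive_bin(x, fs, f0=50.0))       # settles near 52.3 Hz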

  7. Beam profile analyzer for CO2 lasers

    Directory of Open Access Journals (Sweden)

    Rubén López

    2015-12-01

    Full Text Available The development of an optoelectronic system to analyze the beam intensity profile of CO2 lasers is presented herein. The device collects the beam profile with a LiTaO3 pyroelectric detector and uses a sampling technique based on the acquisition of horizontal sections at different levels. The digital signal processing includes subroutines that produce two-dimensional and three-dimensional beam profile displays and determine the laser beam parameters of optical power, peak pixel location, centroid location, and beam width, with algorithms based on the ISO 11146 standard. Systematic calibration of the analyzer yielded an error under 5% in power measurements over a 20–200 W range and an error under 1.6% in spatial measurements of a TEM00 laser. By design, the analyzer can be used during the laser process.
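
    For the centroid and width parameters mentioned above, a minimal sketch of the first- and second-moment computation on which ISO 11146 beam widths (D4-sigma) are based might look as follows; the Gaussian test image is synthetic.

        import numpy as np

        def beam_moments(img):
            y, x = np.indices(img.shape)
            p = img / img.sum()
            cx, cy = (p * x).sum(), (p * y).sum()      # centroid (pixels)
            sx2 = (p * (x - cx) ** 2).sum()            # second central moments
            sy2 = (p * (y - cy) ** 2).sum()
            return (cx, cy), (4 * np.sqrt(sx2), 4 * np.sqrt(sy2))  # D4-sigma

        yy, xx = np.mgrid[0:200, 0:200]                # synthetic Gaussian spot
        img = np.exp(-(((xx - 90) / 20) ** 2 + ((yy - 110) / 30) ** 2))
        print(beam_moments(img))   # centroid ~(90, 110); widths ~(56.6, 84.9)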

  8. 40 CFR 90.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature... drying. Chemical dryers are not an acceptable method of removing water from the sample. Water removal...

  9. Imaging thermal plasma mass and velocity analyzer

    Science.gov (United States)

    Yau, Andrew W.; Howarth, Andrew

    2016-07-01

    We present the design and principle of operation of the imaging ion mass and velocity analyzer on the Enhanced Polar Outflow Probe (e-POP), which measures low-energy (1-90 eV/e) ion mass composition (1-40 AMU/e) and velocity distributions using a hemispherical electrostatic analyzer (HEA), a time-of-flight (TOF) gate, and a pair of toroidal electrostatic deflectors (TED). The HEA and TOF gate measure the energy-per-charge and azimuth of each detected ion and the ion transit time inside the analyzer, respectively, providing the 2-D velocity distribution of each major ionospheric ion species and resolving the minor ion species under favorable conditions. The TED are in front of the TOF gate and optionally sample ions at different elevation angles up to ±60°, for measurement of 3-D velocity distribution. We present examples of observation data to illustrate the measurement capability of the analyzer, and show the occurrence of enhanced densities of heavy "minor" O++, N+, and molecular ions and intermittent, high-velocity (a few km/s) upward and downward flowing H+ ions in localized regions of the quiet time topside high-latitude ionosphere.
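
    The mass-per-charge determination described above reduces to a time-of-flight relation: with the energy-per-charge selected by the electrostatic analyzer and the transit time over a known path, E = (1/2)mv^2 gives m/q. The toy numbers below (drift length, timing) are invented and ignore the instrument's real geometry.

        E_CHARGE = 1.602176634e-19   # elementary charge, C
        AMU = 1.66053907e-27         # atomic mass unit, kg

        def mass_per_charge_amu(u_volts, t_s, drift_m):
            v = drift_m / t_s                  # ion speed from transit time
            m_over_q = 2.0 * u_volts / v ** 2  # kg/C, since volts are J/C
            return m_over_q * E_CHARGE / AMU   # AMU per elementary charge

        # A 10 eV/e ion crossing a hypothetical 10 cm drift in 2.285 us:
        print(mass_per_charge_amu(10.0, 2.285e-6, 0.10))   # ~1.0, i.e. H+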

  10. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us enormous static sequence information, understanding of the sophisticated biology continues to require integrating the computational modeling, system analysis, technology development for experiments, and quantitative experiments all together to analyze the biology architecture on various levels, which is just the origin of systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology,...

  11. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime sa

  12. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C. Stuart; Hawk, James A.

    1995-01-01

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the overall bed pressure drop, or over some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence.
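
    A minimal sketch of the signal-processing idea, under the assumption that a normalized standard deviation of the detrended pressure-drop signal serves as the turbulence output; the cut-off frequency and test signal are invented.

        import numpy as np

        def fluidization_index(dp, fs, f_cut=0.5):
            # dp: pressure-drop samples; remove the slow trend below f_cut Hz.
            spec = np.fft.rfft(dp)
            freqs = np.fft.rfftfreq(len(dp), 1 / fs)
            spec[freqs < f_cut] = 0            # crude high-pass detrending
            fluct = np.fft.irfft(spec, len(dp))
            return fluct.std() / dp.mean()     # fluctuation relative to mean

        fs = 100.0
        rng = np.random.default_rng(3)
        t = np.arange(0, 60, 1 / fs)           # one minute of fake signal
        dp = 5.0 + 0.3 * np.sin(2 * np.pi * 3 * t) + 0.05 * rng.standard_normal(len(t))
        print(f"fluidization index: {fluidization_index(dp, fs):.3f}")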

  13. Studying Reliability Using Identical Handheld Lactate Analyzers

    Science.gov (United States)

    Stewart, Mark T.; Stavrianeas, Stasinos

    2008-01-01

    Accusport analyzers were used to generate lactate performance curves in an investigative laboratory activity emphasizing the importance of reliable instrumentation. Both the calibration and testing phases of the exercise provided students with a hands-on opportunity to use laboratory-grade instrumentation while allowing for meaningful connections…

  14. Analyzing volatile compounds in dairy products

    Science.gov (United States)

    Volatile compounds give the first indication of the flavor in a dairy product. Volatiles are isolated from the sample matrix and then analyzed by chromatography, sensory methods, or an electronic nose. Isolation may be performed by solvent extraction or headspace analysis, and gas chromatography i...

  15. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. This paper presents a new way to analyze company structure by utilizing the natural social network that exists within the company, together with an example of its application to the Enron company.

  16. Analyzing Languages for Specific Purposes Discourse

    Science.gov (United States)

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  17. Thermal and Evolved-Gas Analyzer Illustration

    Science.gov (United States)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  18. Graphic method for analyzing common path interferometers

    DEFF Research Database (Denmark)

    Glückstad, J.

    1998-01-01

    Common path interferometers are widely used for visualizing phase disturbances and fluid flows. They are attractive because of the inherent simplicity and robustness in the setup. A graphic method will be presented for analyzing and optimizing filter parameters in common path interferometers....

  19. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    to PEPA programs, the approximating result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure of more than one thousand states....

  1. GSM Trace Quality Analyzer (TQA) software

    OpenAIRE

    Blanchart Forne, Marc

    2016-01-01

    Connectivity is now the must-have service for enhancing passenger experience. To prove, and to show customers, the quality of the connectivity system, a user-friendly mock-up has to be designed. This work presents a packet analyzer software tool designed to validate an existing SATCOM simulator and to improve future airline network architectures.

  2. Analyzing efficiency of vegetable production in Benin

    NARCIS (Netherlands)

    Singbo, A.G.

    2012-01-01

    The objective of this research is to investigate the production technology and efficiency of vegetable production and marketing at the farm level in Benin. Using recent advances in cross sectional efficiency analysis, we analyze two samples of vegetable producers following different perspectives.

  3. Analyzing the Information Economy: Tools and Techniques.

    Science.gov (United States)

    Robinson, Sherman

    1986-01-01

    Examines methodologies underlying studies which measure the information economy and considers their applicability and limitations for analyzing policy issues concerning libraries and library networks. Two studies provide major focus for discussion: Porat's "The Information Economy: Definition and Measurement" and Machlup's "Production and…

  4. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  5. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs...... has much to offer in analyzing the policy process....

  6. Consideration Regarding Diagnosis Analyze of Corporate Management

    Directory of Open Access Journals (Sweden)

    Mihaela Ciopi OPREA

    2009-01-01

    Full Text Available Diagnosis management aims to identify critical situations and positive aspects of corporate management. An effective diagnosis made by a team that is independent of the organization's management provides managers useful feedback for improving performance. The work presented focuses on the methodology for achieving an effective diagnosis, considering the multitude of criteria and variables to be analyzed.

  7. Analyzing Ethnographic Data--Strategies and Results.

    Science.gov (United States)

    Porter-Gehrie, Cynthia; Crowson, Robert L.

    Using ethnographic data, this study explores the behavior of urban principals at work. The event analysis summary (appended) was based on Mintzberg's classification of on-the-job characteristics and role behavior and then modified to reflect the data obtained. "Key incidents" rather than case studies serve as the basis for organizing descriptive…

  8. Real time speech formant analyzer and display

    Energy Technology Data Exchange (ETDEWEB)

    Holland, George E. (Ames, IA); Struve, Walter S. (Ames, IA); Homer, John F. (Ames, IA)

    1987-01-01

    A speech analyzer for interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and then are prepared for visual display in continuous real time. Parameters from the inputted sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for displaying of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.
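
    A software analogue of the filter-bank stage might look like the following sketch: parallel band-pass filters followed by short-time RMS levels standing in for the frequency-to-voltage outputs. The band edges are illustrative guesses, not the patent's values.

        import numpy as np
        from scipy.signal import butter, sosfilt

        def band_levels(x, fs, bands=((200, 900), (900, 2500), (2500, 3500))):
            levels = []
            for lo, hi in bands:
                sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
                y = sosfilt(sos, x)
                # Short-time RMS plays the role of the voltage output.
                frame = int(0.02 * fs)
                rms = np.sqrt(np.convolve(y ** 2, np.ones(frame) / frame, "same"))
                levels.append(rms)
            return np.array(levels)

        fs = 8000
        t = np.arange(fs) / fs                       # 1 s synthetic "vowel"
        x = np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * 1500 * t)
        print(band_levels(x, fs).mean(axis=1))       # mean level per band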

  9. An improved prism energy analyzer for neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, J., E-mail: jennifer.schulz@helmholtz-berlin.de [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Ott, F. [Laboratoire Leon Brillouin, Bât 563 CEA Saclay, 91191 Gif sur Yvette Cedex (France); Krist, Th. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany)

    2014-04-21

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First, we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second, the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending the energy-analyzing device to a radius of 7.9 m shifts the transmitted wavelength band from 0–9 Å to 2–16 Å.

  10. The EPOS Automated Selective Chemistry Analyzer evaluated.

    Science.gov (United States)

    Moses, G C; Lightle, G O; Tuckerman, J F; Henderson, A R

    1986-01-01

    We evaluated the analytical performance of the EPOS (Eppendorf Patient Oriented System) Automated Selective Chemistry Analyzer, using the following tests for serum analytes: alanine and aspartate aminotransferases, lactate dehydrogenase, creatine kinase, gamma-glutamyltransferase, alkaline phosphatase, and glucose. Results from the EPOS correlated well with those from comparison instruments (r greater than or equal to 0.990). Precision and linearity limits were excellent for all tests; linearity of the optical and pipetting systems was satisfactory. Reagent carryover was negligible. Sample-to-sample carryover was less than 1% for all tests, but only lactate dehydrogenase was less than the manufacturer's specified 0.5%. Volumes aspirated and dispensed by the sample and reagent II pipetting systems differed significantly from preset values, especially at lower settings; the reagent I system was satisfactory at all volumes tested. Minimal daily maintenance and an external data-reduction system make the EPOS a practical alternative to other bench-top chemistry analyzers.

  11. Simulation of a Hyperbolic Field Energy Analyzer

    CERN Document Server

    Gonzalez-Lizardo, Angel

    2016-01-01

    Energy analyzers are important plasma diagnostic tools with applications in a broad range of disciplines including molecular spectroscopy, electron microscopy, basic plasma physics, plasma etching, plasma processing, and ion sputtering technology. The Hyperbolic Field Energy Analyzer (HFEA) is a novel device able to determine ion and electron energy spectra and temperatures. The HFEA is well suited for ion temperature and density diagnostics at those situations where ions are scarce. A simulation of the capacities of the HFEA to discriminate particles of a particular energy level, as well as to determine temperature and density is performed in this work. The electric field due the combination of the conical elements, collimator lens, and Faraday cup applied voltage was computed in a well suited three-dimensional grid. The field is later used to compute the trajectory of a set of particles with a predetermined energy distribution. The results include the observation of the particle trajectories inside the sens...
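
    The core of such a simulation is integrating charged-particle motion through a computed field. The toy sketch below substitutes a uniform retarding field for the HFEA's gridded 3-D field and simply asks which ion energies pass; all dimensions and voltages are invented.

        E_CHARGE = 1.602176634e-19    # C
        M_PROTON = 1.67262192e-27     # kg

        def passes(energy_ev, v_retard, gap=0.01, dt=1e-10, steps=50000):
            # Step a proton through a uniform retarding gap (1-D Euler).
            v = (2 * energy_ev * E_CHARGE / M_PROTON) ** 0.5
            a = -(E_CHARGE / M_PROTON) * (v_retard / gap)
            z = 0.0
            for _ in range(steps):
                v += a * dt
                z += v * dt
                if z >= gap:
                    return True     # reached the collector
                if v <= 0:
                    return False    # turned around in the gap
            return False

        for e_ev in (8, 9, 10, 11, 12):
            print(e_ev, "eV:", passes(e_ev, v_retard=10.0))  # cutoff near 10 V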

  12. Development of pulse neutron coal analyzer

    Science.gov (United States)

    Jing, Shi-wei; Gu, De-shan; Qiao, Shuang; Liu, Yu-ren; Liu, Lin-mao

    2005-04-01

    This article introduces the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel analyzer were applied in this system. The multiple linear regression method employed to process the data solved the interference problem of multiple elements. The prototype (model MZ-MKFY) had been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as low calorific power, total water, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented.
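
    The regression step can be pictured with a short sketch: several gamma-peak count rates are regressed onto a coal parameter by least squares, which is how overlapping elemental responses can be untangled. Counts and coefficients below are entirely invented.

        import numpy as np

        rng = np.random.default_rng(1)
        n_samples, n_peaks = 40, 5
        counts = rng.uniform(100, 1000, (n_samples, n_peaks))  # peak count rates
        true_w = np.array([0.02, -0.01, 0.005, 0.03, -0.002])
        ash = counts @ true_w + 3.0 + rng.normal(0, 0.2, n_samples)  # ash, %

        # Least-squares fit with an intercept column.
        X = np.column_stack([counts, np.ones(n_samples)])
        coef, *_ = np.linalg.lstsq(X, ash, rcond=None)
        print("weights:", np.round(coef[:-1], 4), "intercept:", round(coef[-1], 2))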

  13. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  14. Methods of analyzing composition of aerosol particles

    Science.gov (United States)

    Reilly, Peter T.A.

    2013-02-12

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas filled-conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  15. CRIE: An automated analyzer for Chinese texts.

    Science.gov (United States)

    Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En

    2016-12-01

    Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.

  16. Analyzing Evolving Social Network 2 (EVOLVE2)

    Science.gov (United States)

    2015-04-01

    Report period JUN 2012 – OCT 2014; contract FA8750-12-2-0186. [Only a table fragment survives from the abstract, listing benchmark networks with four columns each (apparently node count, edge count, an unlabeled count, and density): jazz (198, 2742, 274, 0.14); connect (1095, 7825, 783, 0.014); hep-th (8710, 14254, 1425, 0.0003); netscience (1461, 2742, 274, 0.0013); imdb (6260, 98235, 9824, 0.005).]

  17. A Conceptual Framework for Analyzing Terrorist Groups,

    Science.gov (United States)

    1985-06-01

    liberation, and American corporations are considered an element oppressing the working class. ...personnel and installations in Guatemala, Iran, and Spain to protest American support of Israel's invasion of Lebanon. Other factors are also at work ...inferred? D7. Are the goals realistically obtainable? D8. Do the members envisage a long struggle? Are they millennialists (a new world after chaos?)

  18. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement, and friendship, and thus create a network of well-connected users, whereas negative relations result from opposition, distrust, and avoidance, creating disconnected networks. Owing to the increase in illegal activities such as masquerading, conspiring, and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various network analysis techniques used for examining ties, such as status, centrality, and power measures. Because flow behaves differently in positive-tie and negative-tie networks, some of these measures are not applicable to negative ties. The paper also discusses methods developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different social network analysis approaches are reviewed and compared to determine which can best identify negative ties in online networks. The analysis shows that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability to online networks, the performance of the PN measure needs to be verified, and new measures should be developed based on the negative-clique concept.
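
    For concreteness, the sketch below computes positive/negative degree and a PN-style centrality on a toy signed network. The PN formula is given as commonly stated for the Everett-Borgatti measure, PN = (I − A/(2n−2))⁻¹·1 with A = P − 2N, and should be checked against the original source; the adjacency matrices are invented.

        import numpy as np

        P = np.array([[0, 1, 1, 0],        # positive ties (e.g., friendship)
                      [1, 0, 1, 0],
                      [1, 1, 0, 0],
                      [0, 0, 0, 0]])
        N = np.array([[0, 0, 0, 1],        # negative ties (e.g., distrust)
                      [0, 0, 0, 1],
                      [0, 0, 0, 0],
                      [1, 1, 0, 0]])

        pos_deg, neg_deg = P.sum(1), N.sum(1)
        n = len(P)
        A = P - 2 * N                      # as quoted for the PN measure
        pn = np.linalg.solve(np.eye(n) - A / (2 * n - 2), np.ones(n))
        print("degree +:", pos_deg, " degree -:", neg_deg)
        print("PN centrality:", np.round(pn, 3))   # node 3 scores lowest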

  19. Organization theory. Analyzing health care organizations.

    Science.gov (United States)

    Cors, W K

    1997-02-01

    Organization theory (OT) is a tool that can be applied to analyze and understand health care organizations. Transaction cost theory is used to explain, in a unifying fashion, the myriad changes being undertaken by different groups of constituencies in health care. Agency theory is applied to aligning economic incentives needed to ensure Integrated Delivery System (IDS) success. By using tools such as OT, a clearer understanding of organizational changes is possible.

  20. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction. In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. It entered China in 1989 and became China's leading brand of chocolate in

  1. Coordinating, Scheduling, Processing and Analyzing IYA09

    Science.gov (United States)

    Gipson, John; Behrend, Dirk; Gordon, David; Himwich, Ed; MacMillan, Dan; Titus, Mike; Corey, Brian

    2010-01-01

    The IVS scheduled a special astrometric VLBI session for the International Year of Astronomy 2009 (IYA09) commemorating 400 years of optical astronomy and 40 years of VLBI. The IYA09 session is the most ambitious geodetic session to date in terms of network size, number of sources, and number of observations. We describe the process of designing, coordinating, scheduling, pre-session station checkout, correlating, and analyzing this session.

  2. MORPHOLOGICAL ANALYZER MYSTEM 3.0

    Directory of Open Access Journals (Sweden)

    A. I. Zobnin

    2015-01-01

    Full Text Available The large part of the Russian National Corpus has automatic morphological markup. It is based on the morphological analyzer Mystem developed in Yandex with some postprocessing of the results (for example, all indeclinable nouns acquire the tag '0', verbs are divided into separate paradigms by aspect, etc.). Recently a new (third) version of Mystem has been released (see https://tech.yandex.ru/mystem/). In this article we give an overview of its capabilities.

  3. Miles Technicon H.2 automated hematology analyzer.

    Science.gov (United States)

    1992-11-01

    Automated hematology analyzers are used in all large hospitals and most commercial laboratories, as well as in most smaller hospitals and laboratories, to perform complete blood counts (including white blood cell, red blood cell, and platelet counts; hemoglobin concentration; and RBC indices) and white blood cell differential counts. Our objectives in this study are to provide user guidance for selecting, purchasing, and using an automated hematology analyzer, as well as to present an overview of the technology used in an automated five-part differential unit. Specifications for additional automated units are available in ECRI's Clinical Laboratory Product Comparison System. We evaluated the Miles Technicon H.2 unit and rated it Acceptable. The information in this Single Product Evaluation is also useful for purchasing other models; our criteria will guide users in assessing components, and our findings and discussions on some aspects of automated hematology testing are common to many available systems. We caution readers not to base purchasing decisions on our rating of the Miles unit alone, but on a thorough understanding of the issues surrounding automated hematology analyzers, which can be gained only by reading this report in its entirety. The willingness of manufacturers to cooperate in our studies and the knowledge they gain through participating lead to the development of better products. Readers should refer to the Guidance Section, "Selecting and Purchasing an Automated Hematology Analyzer," where we discuss factors such as standardization, training, human factors, manufacturer support, patient population, and special features that the laboratory must consider before obtaining any automated unit; we also provide an in-depth review of cost issues, including life-cycle cost analyses, acquisition methods and costs of hardware and supplies, and we describe the Hemacost and Hemexmpt cost worksheets for use with our PresValu and PSV Manager CAHDModel software

  4. Analyzing Malware Based on Volatile Memory

    Directory of Open Access Journals (Sweden)

    Liang Hu

    2013-11-01

    Full Text Available To explain the necessity of a comprehensive and automated analysis process for volatile memory, this paper summarizes the ordinary analysis methods and their common points, especially with respect to the data sources concerned. The memory analysis framework Volatility 2.2 is then recommended, along with statistics on output file sizes. In addition, to address the limitations of plug-in classification in the analysis procedure, a classification from the user's perspective is proposed. Furthermore, a guideline procedure for comprehensive analysis is introduced, based on differences among target data sources in result-set volume and the relational methods employed. Finally, a test demo is presented that analyzes DLL loading order, in which the DLL load list is regarded as a characteristic data source and converted into a process-behavior fingerprint. The fingerprints are clustered with a string-similarity algorithm, which has a wide range of applications in traditional malware behavior analysis, and it is proposed that these methods can also be applied to volatile memory.

  5. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  6. Experienced physicians benefit from analyzing initial diagnostic hypotheses.

    Science.gov (United States)

    Bass, Adam; Geddes, Colin; Wright, Bruce; Coderre, Sylvain; Rikers, Remy; McLaughlin, Kevin

    2013-01-01

    Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on diagnostic performance of nephrologists and nephrology residents. We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of final diagnoses (31.1% vs. 65.6%, p vs. 70.0%, p inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.07), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.20). Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or the changes in knowledge structure with experience.

  7. The DOE Automated Radioxenon Sampler-Analyzer (ARSA) Beta-Gamma Coincidence Spectrometer Data Analyzer

    Science.gov (United States)

    2000-09-01

    detected using the counting system given the daily fluctuations in radon gas interference, the background counts, the memory effect of previous... The Automated Radioxenon Sampler/Analyzer (ARSA) developed at the Pacific Northwest National Laboratory for the Comprehensive

  8. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant) which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable—making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  9. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-04-01

    Full Text Available New particle formation is a frequent phenomenon in the atmosphere and of major significance for the earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of known masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  10. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-09-01

    Full Text Available New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of defined masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  11. Blood Gas Analyzer Accuracy of Glucose Measurements.

    Science.gov (United States)

    Liang, Yafen; Wanderer, Jonathan; Nichols, James H; Klonoff, David; Rice, Mark J

    2017-07-01

    To investigate the comparability of glucose levels measured with blood gas analyzers (BGAs) and by central laboratories (CLs). Glucose measurements obtained between June 1, 2007, and March 1, 2016, at the Vanderbilt University Medical Center were reviewed. The agreement between CL and BGA results were assessed using Bland-Altman, consensus error grid (CEG), and surveillance error grid (SEG) analyses. We further analyzed the BGAs' performance against the US Food and Drug Administration (FDA) 2014 draft guidance and 2016 final guidance for blood glucose monitoring and the International Organization for Standardization (ISO) 15197:2013 standard. We analyzed 2671 paired glucose measurements, including 50 pairs of hypoglycemic values (1.9%). Bland-Altman analysis yielded a mean bias of -3.1 mg/dL, with 98.1% of paired values meeting the 95% limits of agreement. In the hypoglycemic range, the mean bias was -0.8 mg/dL, with 100% of paired values meeting the 95% limits of agreement. When using CEG analysis, 99.9% of the paired values fell within the no risk zone. Similar results were found using SEG analysis. For the FDA 2014 draft guidance, our data did not meet the target compliance rate. For the FDA 2016 final guidance, our data partially met the target compliance rate. For the ISO standard, our data met the target compliance rate. In this study, the agreement for glucose measurement between common BGAs and CL instruments met the ISO 2013 standard. However, BGA accuracy did not meet the stricter requirements of the FDA 2014 draft guidance or 2016 final guidance. Fortunately, plotting these results on either the CEG or the SEG revealed no results in either the great or extreme clinical risk zones. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  12. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  13. Spectrum Analyzers Incorporating Tunable WGM Resonators

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry; Maleki, Lute

    2009-01-01

    A photonic instrument is proposed to boost the resolution for ultraviolet/ optical/infrared spectral analysis and spectral imaging allowing the detection of narrow (0.00007-to-0.07-picometer wavelength resolution range) optical spectral signatures of chemical elements in space and planetary atmospheres. The idea underlying the proposal is to exploit the advantageous spectral characteristics of whispering-gallery-mode (WGM) resonators to obtain spectral resolutions at least three orders of magnitude greater than those of optical spectrum analyzers now in use. Such high resolutions would enable measurement of spectral features that could not be resolved by prior instruments.

  14. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  15. Analyzing the Biology on the System Level

    Institute of Scientific and Technical Information of China (English)

    Wei Tong

    2004-01-01

    Although various genome projects have provided us enormous static sequence information, understanding of the sophisticated biology continues to require integrating the computational modeling, system analysis, technology development for experiments, and quantitative experiments all together to analyze the biology architecture on various levels, which is just the origin of systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology, and summarizes the analysis methods, experimental technologies, research developments, and so on in the four key fields of systems biology: systemic structures, dynamics, control methods, and design principles.

  16. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    using redox activity measurements. With a new setup adapted to miniaturization, stable pH was achieved, platinum was found to be more suitable than gold for open circuit potential-time measurements, miniaturized platinum working electrodes and quasi silver/silver chloride reference electrodes were...... of design rules for the responsivity of the string-based photothermal spectrometer. Responsivity is maximized for a thin, narrow and long string irradiated by high power radiation. Various types of nanoparticles and binary mixtures of them were successfully detected and analyzed. Detection of copper...

  17. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    Full Text Available A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable for differentiating laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a major public health concern, especially for people living in high-burden, resource-limited settings.

  18. Using SCR methods to analyze requirements documentation

    Science.gov (United States)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the Operations Concept (OC) documentation. Thus, by modeling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  19. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  20. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
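
    The first quantifier is easy to sketch: the square root of the Jensen-Shannon divergence between two probability distributions (for a network, e.g., two degree distributions). The distributions below are invented.

        import numpy as np

        def jsd_sqrt(p, q):
            p, q = np.asarray(p, float), np.asarray(q, float)
            p, q = p / p.sum(), q / q.sum()
            m = 0.5 * (p + q)
            def kl(a, b):
                nz = a > 0
                return np.sum(a[nz] * np.log2(a[nz] / b[nz]))
            return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

        # Degree histograms of two hypothetical network snapshots:
        print(jsd_sqrt([0.1, 0.4, 0.3, 0.2], [0.25, 0.25, 0.25, 0.25]))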

  1. Analyzing the BBOB results by means of benchmarking concepts.

    Science.gov (United States)

    Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C

    2015-01-01

    We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first one is: which algorithm is the "best" one? and the second one is: which algorithm should I use for my real-world problem? Both are connected and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step in answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.

  2. Analyzing Network Coding Gossip Made Easy

    CERN Document Server

    Haeupler, Bernhard

    2010-01-01

    We give a new technique to analyze the stopping time of gossip protocols that are based on random linear network coding (RLNC). Our analysis drastically simplifies, extends and strengthens previous results. We analyze RLNC gossip in a general framework for network and communication models that encompasses and unifies the models used previously in this context. We show, in most settings for the first time, that it converges with high probability in the information-theoretically optimal time. Most stopping times are of the form O(k + T) where k is the number of messages to be distributed and T is the time it takes to disseminate one message. This means RLNC gossip achieves "perfect pipelining". Our analysis directly extends to highly dynamic networks in which the topology can change completely at any time. This remains true even if the network dynamics are controlled by a fully adaptive adversary that knows the complete network state. Virtually nothing besides simple O(kT) sequential flooding protocols was prev...
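
    A toy simulation conveys the mechanism (though none of the paper's analysis): nodes repeatedly push random XOR-combinations of the coded packets they hold, and we count rounds until every node can decode. Coefficients are over GF(2); the graph, parameters, and stopping criterion are arbitrary choices for illustration.

        import random

        def insert(basis, v):
            # Insert v into a GF(2) row-echelon basis (dict: lead bit -> row).
            while v:
                lead = v.bit_length() - 1
                if lead not in basis:
                    basis[lead] = v
                    return True
                v ^= basis[lead]
            return False

        def rlnc_gossip(n_nodes=20, k=8, seed=0):
            rng = random.Random(seed)
            # Track only coefficient vectors (bit i = message i); rank k
            # means decodable. Node 0 starts with all k messages.
            bases = [dict() for _ in range(n_nodes)]
            for i in range(k):
                insert(bases[0], 1 << i)
            rounds = 0
            while any(len(b) < k for b in bases):
                rounds += 1
                for u in range(n_nodes):
                    if not bases[u]:
                        continue
                    peer = rng.randrange(n_nodes - 1)
                    peer += peer >= u                 # random peer != u
                    pkt = 0
                    for row in bases[u].values():     # random XOR combination
                        if rng.random() < 0.5:
                            pkt ^= row
                    if pkt:
                        insert(bases[peer], pkt)
            return rounds

        print(rlnc_gossip())   # on the order of k plus log(n) rounds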

  3. Analyzing Interoperability of Protocols Using Model Checking

    Institute of Scientific and Technical Information of China (English)

    WU Peng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious and error-prone with little effect, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. The previous work has not provided a coherent way to analyze why the interoperability was broken among protocol implementations under test. In this paper, an alternative approach is presented to analyzing these problems from a viewpoint of implementation structures. Sequential and concurrent structures are both representative implementation structures, especially in event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures carry weight on interoperability, which may not gain much attention before. To some extent, they are decisive on the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Herein model checking technique is introduced into interoperability analysis for the first time. As the paper shows, it is an effective way to validate developers' selections on implementation structures or strategies.

  4. Atmospheric Aerosol Chemistry Analyzer: Demonstration of feasibility

    Energy Technology Data Exchange (ETDEWEB)

    Mroz, E.J.; Olivares, J.; Kok, G.

    1996-04-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project objective was to demonstrate the technical feasibility of an Atmospheric Aerosol Chemistry Analyzer (AACA) that will provide a continuous, real-time analysis of the elemental (major, minor and trace) composition of atmospheric aerosols. The AACA concept is based on sampling the atmospheric aerosol through a wet cyclone scrubber that produces an aqueous suspension of the particles. This suspension can then be analyzed for elemental composition by ICP/MS or collected for subsequent analysis by other methods. The key technical challenge was to develop a wet cyclone aerosol sampler suitable for respirable particles found in ambient aerosols. We adapted an ultrasonic nebulizer to a conventional, commercially available, cyclone aerosol sampler and completed collection efficiency tests for the unit, which was shown to efficiently collect particles as small as 0.2 microns. We have completed the necessary basic research and have demonstrated the feasibility of the AACA concept.

  5. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often reach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar amount of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results showed a clear effect of the position of stomata on ΔCO2. It can therefore be concluded that leaf position is important: orienting the leaves to increase ΔCO2 improves respiration measurements without affecting the respiration results expressed per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.

  6. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    Energy Technology Data Exchange (ETDEWEB)

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the americium and cesium counts to the ash content. A total of 104 samples were collected from the mine, 47 from screened coal and the rest from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick-stop training procedure; therefore, the samples were split into training, calibration, and prediction subsets. Special techniques, using genetic algorithms, were developed to split the samples representatively into the three subsets. Two separate approaches were tried: in one, the screened and unscreened coal were modeled separately; in the other, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e., though each individual prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minute intervals (averages of 6-9 samples), but not at 20 seconds (single predictions).
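
    A minimal sketch of that train/calibrate/predict workflow, with synthetic data standing in for the Am/Cs count rates and a plain random split in place of the genetic-algorithm split (all numbers are made up):

    ```python
    # "Quick stop" training = early stopping against a held-out calibration
    # subset; here scikit-learn's MLPRegressor handles that internally.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n = 104                                         # sample count in the study
    counts = rng.uniform(0.5, 2.0, size=(n, 2))     # fake Am and Cs count rates
    ash = 3.0 * counts[:, 0] - 1.5 * counts[:, 1] + rng.normal(0, 0.2, n)

    X_train, X_pred, y_train, y_true = train_test_split(
        counts, ash, test_size=0.25, random_state=0)

    model = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                         validation_fraction=0.2, max_iter=5000, random_state=0)
    model.fit(X_train, y_train)
    pred = model.predict(X_pred)

    print("per-sample RMSE:", np.sqrt(np.mean((pred - y_true) ** 2)))
    # the study's key observation: averages of several 20-second predictions
    # are far more reliable than any single one
    print("error of a 6-sample average:", abs(pred[:6].mean() - y_true[:6].mean()))
    ```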

  7. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Breath analysis therefore makes it possible to recognize an infectious disease in an organ, or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after administration of an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures the relative CO2 isotopologue concentrations in both samples simultaneously by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of 1% in the 13CO2 concentration of the exhaled breath have to be detected at a concentration level of this isotope of about 500 ppm.
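
    A quick back-of-envelope check of that stated requirement, assuming a typical ~4.5% CO2 content in exhaled breath and the ~1.1% natural abundance of 13C (both standard values, not taken from the abstract):

    ```python
    breath_co2 = 0.045        # ~4.5% CO2 in exhaled breath (typical value)
    c13_fraction = 0.011      # natural 13C abundance, ~1.1%

    c13_ppm = breath_co2 * c13_fraction * 1e6
    print(f"13CO2 level in breath: {c13_ppm:.0f} ppm")        # ~495 ppm

    required = 0.01 * c13_ppm                                 # the 1% change
    print(f"analyzer must resolve a change of ~{required:.0f} ppm")  # ~5 ppm
    ```

    This reproduces the ~500 ppm level quoted above and shows why the instrument must resolve concentration changes of only a few ppm.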

  8. Analyzing Mode Confusion via Model Checking

    Science.gov (United States)

    Luettgen, Gerald; Carreno, Victor

    1999-01-01

    Mode confusion is one of the most serious problems in aviation safety. Today's complex digital flight decks make it difficult for pilots to maintain awareness of the actual states, or modes, of the flight deck automation. NASA Langley leads an initiative to explore how formal techniques can be used to discover possible sources of mode confusion. As part of this initiative, a flight guidance system was previously specified as a finite Mealy automaton, and the theorem prover PVS was used to reason about it. The objective of the present paper is to investigate whether state-exploration techniques, especially model checking, are better able to achieve this task than theorem proving and also to compare several verification tools for the specific application. The flight guidance system is modeled and analyzed in Murphi, SMV, and Spin. The tools are compared regarding their system description language, their practicality for analyzing mode confusion, and their capabilities for error tracing and for animating diagnostic information. It turns out that their strengths are complementary.

  9. Sentiment Analyzer for Arabic Comments System

    Directory of Open Access Journals (Sweden)

    Alaa El-Dine Ali Hamouda

    2013-04-01

    Full Text Available Today, the number of users of social networks is increasing. Millions of users share opinions on different aspects of life every day. Social networks are therefore rich sources of data for opinion mining and sentiment analysis. Users have also become more interested in following news pages on Facebook. Several posts, political ones for example, have thousands of user comments that agree or disagree with the post content. Such comments can be a good indicator of the community's opinion about the post content. For politicians, marketers, decision makers, and others, sentiment analysis is needed to determine the percentage of users who agree, disagree, or are neutral with respect to a post. This raised the need to analyze users' comments on Facebook. We focused on Arabic Facebook news pages for the task of sentiment analysis. We developed a corpus for sentiment analysis and opinion mining purposes. Then, we used different machine learning algorithms (decision tree, support vector machines, and naive Bayes) to develop the sentiment analyzer. The performance of the system using each technique was evaluated and compared with the others.
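
    A minimal sketch of the classifier comparison described above, with a tiny hypothetical English corpus standing in for the Arabic Facebook data:

    ```python
    # Compare the three learners named in the abstract on a toy corpus.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC
    from sklearn.tree import DecisionTreeClassifier

    comments = ["great decision", "terrible policy", "not sure about this",
                "I fully agree", "strongly disagree", "no opinion either way"]
    labels = ["agree", "disagree", "neutral",
              "agree", "disagree", "neutral"]

    for clf in (DecisionTreeClassifier(), LinearSVC(), MultinomialNB()):
        model = make_pipeline(CountVectorizer(), clf)
        model.fit(comments, labels)
        print(type(clf).__name__, model.predict(["I agree with the post"]))
    ```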

  10. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution.

  11. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including lack of access to correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge, and developing tools for extracting and sharing that knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders, and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, we analyzed the overlap among sources; at the second level, the presence of rare disease terms in target sources included in the UMLS, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.

  12. Analyzing polarization swings in 3C 279

    CERN Document Server

    Kiehlmann, S; Jorstad, S G; Sokolovsky, K V; Schinzel, F K; Agudo, I; Arkharov, A A; Benitez, E; Berdyugin, A; Blinov, D A; Bochkarev, N G; Borman, G A; Burenkov, A N; Casadio, C; Doroshenko, V T; Efimova, N V; Fukazawa, Y; Gomez, J L; Hagen-Thorn, V A; Heidt, J; Hiriart, D; Itoh, R; Joshi, M; Kimeridze, G N; Konstantinova, T S; Kopatskaya, E N; Korobtsev, I V; Kovalev, Y Y; Krajci, T; Kurtanidze, O; Kurtanidze, S O; Larionov, V M; Larionova, E G; Larionova, L V; Lindfors, E; Lopez, J M; Marscher, A P; McHardy, I M; Molina, S N; Morozova, D A; Nazarov, S V; Nikolashvili, M G; Nilsson, K; Pulatova, N G; Reinthal, R; Sadun, A; Sergeev, S G; Sigua, L A; Sorcia, M; Spiridonova, O I; Takalo, L O; Taylor, B; Troitsky, I S; Ugolkova, L S; Zensus, J A; Zhdanova, V E

    2013-01-01

    Quasar 3C 279 is known to exhibit episodes of optical polarization angle rotation. We present new, well-sampled optical polarization data for 3C 279 and introduce a method to distinguish between random and deterministic electric vector position angle (EVPA) variations. We observe EVPA rotations in both directions with different amplitudes and find that the EVPA variation shows characteristics of both random and deterministic cases. Our analysis indicates that the EVPA variation is likely dominated by a random process in the low brightness state of the jet and by a deterministic process in the flaring state.

  13. Analyzing polarization swings in 3C 279

    Directory of Open Access Journals (Sweden)

    Kiehlmann S.

    2013-12-01

    Full Text Available Quasar 3C 279 is known to exhibit episodes of optical polarization angle rotation. We present new, well-sampled optical polarization data for 3C 279 and introduce a method to distinguish between random and deterministic electric vector position angle (EVPA) variations. We observe EVPA rotations in both directions with different amplitudes and find that the EVPA variation shows characteristics of both random and deterministic cases. Our analysis indicates that the EVPA variation is likely dominated by a random process in the low brightness state of the jet and by a deterministic process in the flaring state.

  14. SRGM Analyzers Tool of SDLC for Software Improving Quality

    Directory of Open Access Journals (Sweden)

    Mr. Girish Nille

    2014-11-01

    Full Text Available Software Reliability Growth Models (SRGM) have been developed to estimate software reliability measures such as the software failure rate, the number of remaining faults, and software reliability. In this paper, a software analyzer tool is proposed for deriving several software reliability growth models based on the Enhanced Non-homogeneous Poisson Process (ENHPP) in the presence of imperfect debugging and error generation. The proposed models are initially formulated for the case when there is no differentiation between the failure observation and fault removal testing processes, and are then extended to the case when there is a clear differentiation between the two. Many Software Reliability Growth Models (SRGM) have been developed to describe software failures as a random process and can be used to measure the development status during testing. With SRGM, software consultants can easily measure (or evaluate) the software reliability (or quality) and plot software reliability growth charts.
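
    The ENHPP formulations themselves are not reproduced here; as an illustration of the basic NHPP fitting step that underlies SRGM work, this sketch fits the classic Goel-Okumoto mean-value function m(t) = a(1 - exp(-bt)) to made-up cumulative failure counts:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def mean_value(t, a, b):
        """Goel-Okumoto NHPP: expected cumulative failures by time t."""
        return a * (1.0 - np.exp(-b * t))

    weeks = np.arange(1, 11)
    failures = np.array([12, 21, 28, 34, 38, 41, 44, 46, 47, 48])  # fabricated

    (a, b), _ = curve_fit(mean_value, weeks, failures, p0=(50, 0.1))
    print(f"expected total faults a = {a:.1f}, detection rate b = {b:.3f}")
    print(f"estimated remaining faults ~ {a - failures[-1]:.1f}")
    ```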

  15. Governance of Aquatic Agricultural Systems: Analyzing Representation, Power, and Accountability

    Directory of Open Access Journals (Sweden)

    Blake D. Ratner

    2013-12-01

    Full Text Available Aquatic agricultural systems in developing countries face increasing competition from multiple stakeholders over rights to access and use natural resources: land, water, wetlands, and fisheries, all essential to rural livelihoods. A key implication is the need to strengthen governance to enable equitable decision making amidst competition that spans sectors and scales, to build capacities for resilience, and to enable transformations in institutions that perpetuate poverty. In this paper we provide a simple framework to analyze the governance context for aquatic agricultural system development, focused on three dimensions: stakeholder representation, distribution of power, and mechanisms of accountability. Case studies from Cambodia, Bangladesh, Malawi/Mozambique, and Solomon Islands illustrate the application of these concepts to fisheries and aquaculture livelihoods in the broader context of intersectoral and cross-scale governance interactions. Comparing these cases, we demonstrate how assessing governance dimensions yields practical insights into opportunities for transforming the institutions that constrain resilience in local livelihoods.

  16. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept: analyzing basic infrastructure requirements; identifying related infrastructure issues, concerns, and vulnerabilities; and offering recommended solutions.

  17. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn (Oak Ridge, TN); Chen, Da-Ren (Creve Coeur, MO)

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has sampling outlets disposed at a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and the most distal of the sampling outlets, forms a classifying region; the first and second electrodes are charged to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into the upstream end of the classifying region. Each sampling outlet functions as an independent DMA stage and simultaneously classifies a different size range of charged particles based on their electrical mobility.
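
    The classification principle can be made concrete with the standard electrical-mobility relation from textbook aerosol physics (not specific to this patent): Z_p = n e Cc / (3 pi mu d), with the Cunningham slip correction Cc. The constants below are standard values for air; the diameters are arbitrary:

    ```python
    import math

    E_CHARGE = 1.602e-19       # elementary charge, C
    MU_AIR = 1.81e-5           # viscosity of air, Pa*s (~20 C)
    MFP_AIR = 66e-9            # mean free path of air, m

    def slip_correction(d):
        kn = 2 * MFP_AIR / d                         # Knudsen number
        return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

    def electrical_mobility(d, n_charges=1):
        return (n_charges * E_CHARGE * slip_correction(d)
                / (3 * math.pi * MU_AIR * d))

    for d_nm in (10, 50, 100, 500):
        z = electrical_mobility(d_nm * 1e-9)
        print(f"{d_nm:4d} nm -> Z_p = {z:.3e} m^2/(V*s)")
    ```

    Each sampling outlet along the column then selects a different band of Z_p for the same field and sheath flow, which is what lets the stages classify several size ranges simultaneously.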

  18. Analyzing BSE transmission to quantify regional risk.

    Science.gov (United States)

    de Koeijer, Aline A

    2007-10-01

    As a result of consumer fears and political concerns related to BSE as a risk to human health, a need has arisen recently for more sensitive methods to detect BSE and more accurate methods to determine BSE incidence. As a part of the development of such methods, it is important to be able to identify groups of animals with above-average BSE risk. One of the well-known risk factors for BSE is age, as very young animals do not develop the disease, and very old animals are less likely to develop the disease. Here, we analyze which factors have a strong influence on the age distribution of BSE in a population. Building on that, we develop a simple set of calculation rules for classifying the BSE risk in a given cattle population. Required inputs are data on imports and on the BSE control measures in place over the last 10 or 20 years.

  19. Identifying and Analyzing Web Server Attacks

    Energy Technology Data Exchange (ETDEWEB)

    Seifert, Christian; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.; Komisarczuk, Peter; Muschevici, Radu; Welch, Ian D.

    2008-08-29

    Abstract: Client honeypots can be used to identify malicious web servers that attack web browsers and push malware to client machines. Merely recording network traffic is insufficient to perform comprehensive forensic analyses of such attacks. Custom tools are required to access and analyze network protocol data. Moreover, specialized methods are required to perform a behavioral analysis of an attack, which helps determine exactly what transpired on the attacked system. This paper proposes a record/replay mechanism that enables forensic investigators to extract application data from recorded network streams and allows applications to interact with this data in order to conduct behavioral analyses. Implementations for the HTTP and DNS protocols are presented and their utility in network forensic investigations is demonstrated.

  20. Drug stability analyzer for long duration spaceflights

    Science.gov (United States)

    Shende, Chetan; Smith, Wayne; Brouillette, Carl; Farquharson, Stuart

    2014-06-01

    Crewmembers of current and future long duration spaceflights require drugs to overcome the deleterious effects of weightlessness, sickness and injuries. Unfortunately, recent studies have shown that some of the drugs currently used may degrade more rapidly in space, losing their potency well before their expiration dates. To complicate matters, the degradation products of some drugs can be toxic. Consequently there is a need for an analyzer that can determine if a drug is safe at the time of use, as well as to monitor and understand space-induced degradation, so that drug types, formulations, and packaging can be improved. Towards this goal we have been investigating the ability of Raman spectroscopy to monitor and quantify drug degradation. Here we present preliminary data by measuring acetaminophen, and its degradation product, p-aminophenol, as pure samples, and during forced degradation reactions.

  1. Basis-neutral Hilbert-space analyzers

    CERN Document Server

    Martin, Lane; Kondakci, H Esat; Larson, Walker D; Shabahang, Soroush; Jahromi, Ali K; Malhotra, Tanya; Vamivakas, A Nick; Atia, George K; Abouraddy, Ayman F

    2016-01-01

    Interferometry is one of the central organizing principles of optics. Key to interferometry is the concept of optical delay, which facilitates spectral analysis in terms of time-harmonics. In contrast, when analyzing a beam in a Hilbert space spanned by spatial modes -- a critical task for spatial-mode multiplexing and quantum communication -- basis-specific principles are invoked that are altogether distinct from that of 'delay.' Here, we extend the traditional concept of temporal delay to the spatial domain, thereby enabling the analysis of a beam in an arbitrary spatial-mode basis -- exemplified using Hermite-Gaussian and radial Laguerre-Gaussian modes. Such generalized delays correspond to optical implementations of fractional transforms; for example, the fractional Hankel transform is the generalized delay associated with the space of Laguerre-Gaussian modes, and an interferometer incorporating such a 'delay' obtains modal weights in the associated Hilbert space. By implementing an inherently stable, rec...
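
    The modal weights such an interferometer obtains can be illustrated numerically: the sketch below computes the Hermite-Gaussian weights of a synthetic one-dimensional beam by direct overlap integrals (the optical 'generalized delay' measures the analogous quantities physically; the beam and mode width here are invented):

    ```python
    import numpy as np
    from numpy.polynomial.hermite import hermval
    from math import factorial, pi, sqrt

    x = np.linspace(-6, 6, 2001)
    w0 = 1.0                                   # mode width (arbitrary units)

    def hg_mode(n):
        """Normalized 1D Hermite-Gaussian mode u_n(x)."""
        coeffs = [0] * n + [1]                 # select H_n
        norm = 1.0 / sqrt(2**n * factorial(n) * w0 * sqrt(pi))
        return norm * hermval(x / w0, coeffs) * np.exp(-x**2 / (2 * w0**2))

    beam = 0.8 * hg_mode(0) + 0.6 * hg_mode(2)     # synthetic input beam
    dx = x[1] - x[0]
    for n in range(4):
        weight = np.sum(beam * hg_mode(n)) * dx    # overlap integral
        print(f"mode {n}: weight = {weight:+.3f}")
    # recovers ~+0.800 for n=0 and ~+0.600 for n=2, ~0 elsewhere
    ```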

  2. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the Eurobarometer sample survey, commissioned by the European Commission. The social climate index is used to measure the level of the population's perceptions, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, on a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  3. Analyzing petabytes of data with Hadoop

    CERN Document Server

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  4. Fully Analyzing an Algebraic Polya Urn Model

    CERN Document Server

    Morcrette, Basile

    2012-01-01

    This paper introduces and analyzes a particular class of Polya urns: balls are of two colors, can only be added (the urns are said to be additive), and at every step the same constant number of balls is added, so that only the color composition varies (the urns are said to be balanced). These properties make this class of urns ideally suited for analysis from an "analytic combinatorics" point of view, following in the footsteps of Flajolet-Dumas-Puyhaubert, 2006. Through an algebraic generating function to which we apply a multiple coalescing saddle-point method, we are able to give precise asymptotic results for the probability distribution of the composition of the urn, as well as a local limit law and large deviation bounds.
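
    A quick simulation sketch of such an urn, using a hypothetical replacement matrix [[2, 1], [0, 3]] (each row adds the same total of three balls, so the urn is balanced; all entries are nonnegative, so it is additive):

    ```python
    import random

    def simulate_urn(steps, black=1, white=1, rule=((2, 1), (0, 3))):
        """Draw a ball, then add the drawn color's row of the replacement rule."""
        for _ in range(steps):
            drew_black = random.random() < black / (black + white)
            add_b, add_w = rule[0] if drew_black else rule[1]
            black, white = black + add_b, white + add_w
        return black, white

    random.seed(0)
    fractions = [b / (b + w)
                 for b, w in (simulate_urn(1000) for _ in range(2000))]
    print("mean black fraction:", sum(fractions) / len(fractions))
    ```

    The paper derives the asymptotic distribution of exactly this composition analytically, via an algebraic generating function, rather than by simulation.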

  5. Analyzing and mining automated imaging experiments.

    Science.gov (United States)

    Berlage, Thomas

    2007-04-01

    Image mining is the application of computer-based techniques that extract and exploit information from large image sets to support human users in generating knowledge from these sources. This review focuses on biomedical applications of this technique, in particular automated imaging at the cellular level. Due to increasing automation and the availability of integrated instruments, biomedical users are becoming increasingly confronted with the problem of analyzing such data. Image database applications need to combine data management, image analysis and visual data mining. The main point of such a system is a software layer that represents objects within an image and the ability to use a large spectrum of quantitative and symbolic object features. Image analysis needs to be adapted to each particular experiment; therefore, 'end user programming' will be desired to make the technology more widely applicable.

  6. Analyzing Hydrological Sustainability Through Water Balance

    Science.gov (United States)

    Menció, Anna; Folch, Albert; Mas-Pla, Josep

    2010-05-01

    The objective of the Water Framework Directive (2000/60/EC) is to assist in the development of management plans that will lead to the sustainable use of water resources in all EU member states. However, defining the degree of sustainability aimed at is not a straightforward task. It requires detailed knowledge of the hydrogeological characteristics of the basin in question, its environmental needs, the amount of human water demand, and the opportunity to construct a proper water balance that describes the behavior of the hydrological system and estimates available water resources. An analysis of the water balance in the Selva basin (Girona, NE Spain) points to the importance of regional groundwater fluxes in satisfying current exploitation rates, and shows that regional scale approaches are often necessary to evaluate water availability. In addition, we discuss the pressures on water resources, and analyze potential actions, based on the water balance results, directed towards achieving sustainable water management in the basin.

  7. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    ...as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  8. Complex networks theory for analyzing metabolic networks

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; YU Hong; LUO Jianhua; CAO Z.W.; LI Yixue

    2006-01-01

    One of the main tasks of post-genomic informatics is to systematically investigate all molecules and their interactions within a living cell so as to understand how these molecules and the interactions between them relate to the function of the organism, networks being an appropriate abstract description of all kinds of interactions. In the past few years, great progress has been made in developing the theory of complex networks for revealing the organizing principles that govern the formation and evolution of various complex biological, technological and social networks. This paper reviews the accomplishments in constructing genome-based metabolic networks and describes how the theory of complex networks is applied to analyze metabolic networks.

  9. Modeling and analyzing architectural change with alloy

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ingstrup, Mads

    2010-01-01

    Although adaptivity based on reconfiguration has the potential to improve the dependability of systems, the cost of a failed attempt at reconfiguration is prohibitive in precisely the applications where high dependability is required. Existing work on formal modeling and verification of architectural reconfigurations partly achieves the goal of ensuring correctness; however, the formalisms used often lack tool support, and the ensuing models have an uncertain relation to a concrete implementation. Thus a practical way to ensure with formal certainty that specific architectural changes are correct remains a barrier to the uptake of reconfiguration techniques in industry. Using the Alloy language and associated tool, we propose a practical way to formally model and analyze runtime architectural change expressed as architectural scripts. Our evaluation shows the performance to be acceptable; our experience...

  10. Analyzing Trust Perceptions in System Implementations

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Rose, Jeremy

    2009-01-01

    Implementations of large scale information systems are complex and problematic, with a reputation for being delayed and going over budget. A critical factor in the success of these implementations is trust in the system, in the project and between the various stakeholders. As problems and delays mount, trust relations become strained, leading to a circle of suspicion and disbelief which is both destructive and hard to break out of. This case study analyses trust relations during a problematic period of time in the implementation of the Faroese integrated healthcare information system, using... A major contribution is that if an implementation project interacts with many or complex abstract systems, the managers must focus on continuous embedding and re-embedding by interacting directly with representatives of the abstract systems in question to maintain trust. Also we observe that actors...

  11. Analyzing longitudinal data with missing values.

    Science.gov (United States)

    Enders, Craig K

    2011-11-01

    Missing data methodology has improved dramatically in recent years, and popular computer programs now offer a variety of sophisticated options. Despite the widespread availability of theoretically justified methods, researchers in many disciplines still rely on subpar strategies that either eliminate incomplete cases or impute the missing scores with a single set of replacement values. This article provides readers with a nontechnical overview of some key issues from the missing data literature and demonstrates several of the techniques that methodologists currently recommend. This article begins by describing Rubin's missing data mechanisms. After a brief discussion of popular ad hoc approaches, the article provides a more detailed description of five analytic approaches that have received considerable attention in the missing data literature: maximum likelihood estimation, multiple imputation, the selection model, the shared parameter model, and the pattern mixture model. Finally, a series of data analysis examples illustrate the application of these methods.
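
    Of the approaches listed, multiple imputation is the easiest to sketch; below, scikit-learn's IterativeImputer with posterior sampling stands in for a full MI procedure on synthetic data (a complete analysis would also pool variances across imputations, which is omitted here):

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    X[:, 2] += X[:, 0]                        # make the variables related
    X[rng.random(200) < 0.2, 2] = np.nan      # 20% missing on one variable

    # m = 5 imputations, each drawing from the predictive posterior
    imputations = [IterativeImputer(sample_posterior=True, random_state=i)
                   .fit_transform(X) for i in range(5)]
    estimates = [imp[:, 2].mean() for imp in imputations]
    print("pooled estimate of the mean:", np.mean(estimates))
    ```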

  12. Interactive word cloud for analyzing reviews

    Science.gov (United States)

    Jung, HyunRyong

    2013-12-01

    A five-star quality rating is one of the most widely used systems for evaluating items. However, it has two fundamental limitations: 1) the rating for one item cannot describe crucial information in detail; 2) the rating is not on an absolute scale that can be used to compare items. Because of these limitations, users cannot make an optimal decision. In this paper, we introduce our sophisticated approach to extract useful information from user reviews using collapsed dependencies and sentiment analysis. We propose an interactive word cloud that can show grammatical relationships among words, explore reviews efficiently, and display positivity or negativity on a sentence. In addition, we introduce visualization for comparing multiple word clouds and illustrate the usage through test cases.

  13. Multi-Pass Quadrupole Mass Analyzer

    Science.gov (United States)

    Prestage, John D.

    2013-01-01

    Analysis of the composition of planetary atmospheres is one of the most important and fundamental measurements in planetary robotic exploration. Quadrupole mass analyzers (QMAs) are the primary tool used to execute these investigations, but reductions in the size of these instruments have sacrificed mass resolving power, so that the best present-day QMA devices are still large, expensive, and do not deliver the performance of laboratory instruments. An ultra-high-resolution QMA was developed to resolve N2+/CO+ by trapping ions in a linear trap quadrupole filter. Because N2 and CO are resolved, gas chromatography columns used to separate species before analysis are eliminated, greatly simplifying gas analysis instrumentation. For highest performance, the ion trap mode is used. High-resolution (or narrow-band) mass selection is carried out in the central region, but near the DC electrodes at each end, RF/DC field settings are adjusted to allow broadband ion passage. This prevents ion loss during ion reflection at each end. Ions are created inside the trap so that low-energy particles are selected by low-voltage settings on the end electrodes. This is beneficial to good mass resolution, since low-energy particles traverse many cycles of the RF filtering fields. Through Monte Carlo simulations, it is shown that ions are reflected at each end many tens of times, each time being sent back through the central section of the quadrupole, where ultrahigh mass filtering is carried out. An analyzer was produced with an electrical length orders of magnitude longer than its physical length. Since the selector fields are sized as in conventional devices, the loss of sensitivity inherent in miniaturizing quadrupole instruments is avoided. The no-loss, multi-pass QMA architecture will improve the mass resolution of planetary QMA instruments while reducing demands on the RF electronics for high-voltage/high-frequency production, since ion transit time is no longer limited to a single pass...

  14. Alzheimer's disease: analyzing the missing heritability.

    Directory of Open Access Journals (Sweden)

    Perry G Ridge

    Full Text Available Alzheimer's disease (AD) is a complex disorder influenced by environmental and genetic factors. Recent work has identified 11 AD markers in 10 loci. We used Genome-wide Complex Trait Analysis to analyze >2 million SNPs for 10,922 individuals from the Alzheimer's Disease Genetics Consortium to assess the phenotypic variance explained first by known late-onset AD loci, and then by all SNPs in the Alzheimer's Disease Genetics Consortium dataset. In all, 33% of total phenotypic variance is explained by all common SNPs. APOE alone explained 6% and other known markers 2%, meaning more than 25% of phenotypic variance remains unexplained by known markers, but is tagged by common SNPs included on genotyping arrays or imputed with HapMap genotypes. Novel AD markers that explain large amounts of phenotypic variance are likely to be rare and unidentifiable using genome-wide association studies. Based on our findings and the current direction of human genetics research, we suggest specific study designs for future studies to identify the remaining heritability of Alzheimer's disease.

  15. Signal processing and analyzing works of art

    Science.gov (United States)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
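
    A toy version of the spectral step: estimate thread density in one patch by Fourier analysis, with a synthetic two-sinusoid "weave" standing in for a real x-ray scan (resolution and densities are invented; the authors' short-space analysis repeats this over many patches across the whole x-ray):

    ```python
    import numpy as np

    dpi = 600                                  # hypothetical scan resolution
    y, x = np.mgrid[0:256, 0:256] / dpi        # pixel coordinates in inches
    patch = (np.sin(2 * np.pi * 14.2 * x)      # ~14.2 vertical threads/inch
             + np.sin(2 * np.pi * 11.7 * y))   # ~11.7 horizontal threads/inch

    profile = patch.mean(axis=0)               # average rows -> vertical threads
    spectrum = np.abs(np.fft.rfft(profile))
    freqs = np.fft.rfftfreq(profile.size, d=1 / dpi)   # cycles per inch
    peak = freqs[np.argmax(spectrum[1:]) + 1]          # skip the DC bin
    print(f"estimated vertical thread density: {peak:.1f} threads/inch")
    ```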

  16. Numerical methods for analyzing electromagnetic scattering

    Science.gov (United States)

    Lee, S. W.; Lo, Y. T.; Chuang, S. L.; Lee, C. S.

    1985-01-01

    Attenuation properties of the normal modes in an overmoded waveguide coated with a lossy material were analyzed. It is found that the low-order modes can be significantly attenuated even with a thin layer of coating if the coating material is not too lossy. A thinner layer of coating is required for large attenuation of the low-order modes if the coating material is magnetic rather than dielectric. The radar cross section (RCS) of an uncoated circular guide terminated by a perfect electric conductor (PEC) was calculated and compared with available experimental data. It is confirmed that the interior irradiation contributes to the RCS. The equivalent-current method based on the geometrical theory of diffraction (GTD) was chosen for calculating the contribution from rim diffraction. Schemes for experiments on RCS reduction from a coated circular guide terminated by a PEC are planned and included. The waveguide coated with a lossy magnetic material is suggested as a substitute for the corrugated waveguide.

  17. Complete denture analyzed by optical coherence tomography

    Science.gov (United States)

    Negrutiu, Meda L.; Sinescu, Cosmin; Todea, Carmen; Podoleanu, Adrian G.

    2008-02-01

    Complete dentures are currently made using different technologies. In order to avoid deficiencies of prostheses made using the classical technique, several alternative systems and procedures have been devised, directly related to the material used and also to the manufacturing technology. Thus, at the present time, there are several injection systems and technologies on the market that use chemoplastic materials, which are heat cured (90-100°C) in a dry or wet environment, or cold cured (below 60°C). There are also technologies that plasticize a hard cured material by thermoplastic processing (without any chemical changes) and then inject it into a mold. The purpose of this study was to analyze the existence of possible defects in several dental prostheses using a non-invasive method, before their insertion in the mouth. Different dental prostheses, fabricated from various materials, were investigated using en-face optical coherence tomography. In order to discover the defects, the scanning was made in three planes, obtaining images at different depths, from 0.01 μm to 2 mm. In several of the investigated prostheses we found defects which may cause their fracture. These defects are totally included in the prosthesis material and cannot be visualized with other imaging methods. In conclusion, en-face OCT is an important investigative tool for dental practice.

  18. Analyzing Consumer Behavior Towards Contemporary Food Retailers

    Directory of Open Access Journals (Sweden)

    E.Dursun

    2008-01-01

    Full Text Available The objective of this research is to analyze consumer behavior toward contemporary food retailers. Food retailing has been changing in Turkey during recent years. Foreign investors have been captivated by the market potential of food retailing. The retail format has changed, and large-scale retailers featuring extended product variety and full service are spreading rapidly nationwide. Consumers tend to shop for their household needs at contemporary retailers, due mainly to urbanization, the increasing female workforce, and income growth. In this research, original data were collected through face-to-face interviews with 385 respondents located in Istanbul. The ratios of the different socio-economic status (SES) groups in Istanbul formed the sampling distribution. Consumers prefer the closest food retailers mainly when purchasing food products. Consumers purchase more than they planned for their needs; the C SES group ranks first in average spending on unplanned shopping. Chain stores and hypermarkets are the most preferred retailers for food purchasing. Moreover, consumer responses to judgments related to retailing were investigated with factor analysis.

  19. Analyzing planar cell polarity during zebrafish gastrulation.

    Science.gov (United States)

    Jessen, Jason R

    2012-01-01

    Planar cell polarity was first described in invertebrates over 20 years ago and is defined as the polarity of cells (and cell structures) within the plane of a tissue, such as an epithelium. Studies in the last 10 years have identified critical roles for vertebrate homologs of these planar cell polarity proteins during gastrulation cell movements. In zebrafish, the terms convergence and extension are used to describe the collection of morphogenetic movements and cell behaviors that contribute to narrowing and elongation of the embryonic body plan. Disruption of planar cell polarity gene function causes profound defects in convergence and extension creating an embryo that has a shortened anterior-posterior axis and is broadened mediolaterally. The zebrafish gastrula-stage embryo is transparent and amenable to live imaging using both Nomarski/differential interference contrast and fluorescence microscopy. This chapter describes methods to analyze convergence and extension movements at the cellular level and thereby connect embryonic phenotypes with underlying planar cell polarity defects in migrating cells.

  20. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background: PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results: In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues, and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion: In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.

  1. Artificial patinas analyzed with PIXE method

    Energy Technology Data Exchange (ETDEWEB)

    Campos, P.H.O.V. de; Rizzutto, M.A. [Universidade de Sao Paulo (USP), SP (Brazil). Inst. de Fisica. Dept. de Fisica Nuclear]. E-mail: rizzutto@if.usp.br; Neiva, A.C.; Bendezu H, R. del P. [Universidade de Sao Paulo (USP), SP (Brazil). Escola Politecnica. Dept. de Engenharia Quimica]. E-mail: acneiva@usp.br

    2007-07-01

    Aiming at the restoration and conservation of archaeological metallic objects, artificial patinas can be used to simulate natural patinas (corrosion products on metals and their alloys), permitting characterization and corrosion-mechanism studies. Natural patina formation is difficult to study because the corrosion process takes years. Artificial patinas, on the other hand, can easily be produced in a shorter time; moreover, they can be used to simulate the corrosion process and to substitute for monuments and old art objects damaged for some reason. In our study, artificial patinas were produced over pellets of copper and bronze with sulfate, chloride and nitrate solutions and were analyzed with the PIXE (Proton Induced X-Ray Emission) technique to supply qualitative and quantitative information on the corrosion elements. The quantitative PIXE analysis takes into account the absorption of the incident ion beam and of the emergent X-rays in the sample, as well as the patina layer and the backing. The PIXE results have shown the presence of S, Cl and Fe and some other elements already known from the backings, such as Cu, Sn, etc. PIXE measurements were also performed on reference metallic materials. (author)

  2. On geometric factors for neutral particle analyzers.

    Science.gov (United States)

    Stagner, L; Heidbrink, W W

    2014-11-01

    Neutral particle analyzers (NPA) detect neutralized energetic particles that escape from plasmas. Geometric factors relate the counting rate of the detectors to the intensity of the particle source. Accurate geometric factors enable quick simulation of geometric effects without the need to resort to slower Monte Carlo methods. Previously derived expressions [G. R. Thomas and D. M. Willis, "Analytical derivation of the geometric factor of a particle detector having circular or rectangular geometry," J. Phys. E: Sci. Instrum. 5(3), 260 (1972); J. D. Sullivan, "Geometric factor and directional response of single and multi-element particle telescopes," Nucl. Instrum. Methods 95(1), 5-11 (1971)] for the geometric factor implicitly assume that the particle source is very far away from the detector (far-field); this excludes applications close to the detector (near-field). The far-field assumption does not hold in most fusion applications of NPA detectors. We derive, from probability theory, a generalized framework for deriving geometric factors that are valid for both near and far-field applications as well as for non-isotropic sources and nonlinear particle trajectories.
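
    For two coaxial circular apertures the geometric factor has a classic closed form (Sullivan, 1971, cited above); the sketch below estimates the same quantity by Monte Carlo, the style of calculation that generalizes to the near-field and non-isotropic cases the paper addresses. Dimensions are arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    R1, R2, L = 0.01, 0.01, 0.10    # aperture radii and separation, m
    N = 1_000_000

    # uniform start points on the front aperture
    r = R1 * np.sqrt(rng.random(N))
    a = 2 * np.pi * rng.random(N)
    x1, y1 = r * np.cos(a), r * np.sin(a)

    # isotropic flux through a surface: p(theta) ~ cos(theta) sin(theta)
    theta = np.arcsin(np.sqrt(rng.random(N)))
    psi = 2 * np.pi * rng.random(N)
    x2 = x1 + L * np.tan(theta) * np.cos(psi)
    y2 = y1 + L * np.tan(theta) * np.sin(psi)
    hit = x2**2 + y2**2 <= R2**2

    A1 = np.pi * R1**2
    print("Monte Carlo G:", np.pi * A1 * hit.mean(), "m^2 sr")

    s = R1**2 + R2**2 + L**2         # closed-form two-disc result
    print("analytic   G:", 0.5 * np.pi**2 * (s - np.sqrt(s**2 - 4 * R1**2 * R2**2)))
    ```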

  3. A Method for Analyzing Volunteered Geographic Information ...

    Science.gov (United States)

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions. These dimensions include the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and finally, the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially-explicit data to be used in multiple settings, including the City of Duluth's Comprehensive Planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA's Office of Research and Development.

  4. Methodological considerations in analyzing Twitter data.

    Science.gov (United States)

    Kim, Annice E; Hansen, Heather M; Murphy, Joe; Richards, Ashley K; Duke, Jennifer; Allen, Jane A

    2013-12-01

    Twitter is an online microblogging tool that disseminates more than 400 million messages per day, including vast amounts of health information. Twitter represents an important data source for the cancer prevention and control community. This paper introduces investigators in cancer research to the logistics of Twitter analysis. It explores methodological challenges in extracting and analyzing Twitter data, including characteristics and representativeness of data; data sources, access, and cost; sampling approaches; data management and cleaning; standardizing metrics; and analysis. We briefly describe the key issues and provide examples from the literature and our studies using Twitter data to understand public health issues. For investigators considering Twitter-based cancer research, we recommend assessing whether research questions can be answered appropriately using Twitter, choosing search terms carefully to optimize precision and recall, using respected vendors that can provide access to the full Twitter data stream if possible, standardizing metrics to account for growth in the Twitter population over time, considering crowdsourcing for analysis of Twitter content, and documenting and publishing all methodological decisions to further the evidence base.

  6. Analyzing Complex Reaction Mechanisms Using Path Sampling.

    Science.gov (United States)

    van Erp, Titus S; Moqadam, Mahmoud; Riccardi, Enrico; Lervik, Anders

    2016-11-08

    We introduce an approach to analyze collective variables (CVs) regarding their predictive power for a reaction. The method is based on already available path sampling data produced by, for instance, transition interface sampling or forward flux sampling, which are path sampling methods used for efficient computation of reaction rates. By a search in CV space, a measure of predictiveness can be optimized and, in addition, the number of CVs can be reduced using projection operations which keep this measure invariant. The approach allows testing hypotheses on the reaction mechanism but could, in principle, also be used to construct the phase-space committor surfaces without the need of additional trajectory sampling. The procedure is illustrated for a one-dimensional double-well potential, a theoretical model for an ion-transfer reaction in which the solvent structure can lower the barrier, and an ab initio molecular dynamics study of water auto-ionization. The analysis technique enhances the quantitative interpretation of path sampling data which can provide clues on how chemical reactions can be steered in desired directions.

  7. USING NLP APPROACH FOR ANALYZING CUSTOMER REVIEWS

    Directory of Open Access Journals (Sweden)

    Saleem Abuleil

    2017-01-01

    Full Text Available The Web is considered one of the main sources of customer opinions and reviews, which are represented in two formats: structured data (numeric ratings) and unstructured data (textual comments). Millions of textual comments about goods and services are posted on the web by customers, and thousands are added every day, making it a big challenge to read and understand them and turn them into useful structured data for customers and decision makers. Sentiment analysis, or opinion mining, is a popular technique for summarizing and analyzing those opinions and reviews. In this paper, we use natural language processing techniques to generate rules that help us understand customer opinions and reviews (textual comments) written in the Arabic language, with the purpose of understanding each one of them and then converting them to structured data. We use adjectives as a key point to highlight important information in the text; we then work around them to tag attributes that describe the subject of the reviews, and we associate the attributes with their values (the adjectives).
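
    A rough sketch of the adjective-anchored idea, shown on English text with NLTK's part-of-speech tagger for convenience (the paper's rules target Arabic, and the nearest-preceding-noun pairing heuristic below is ours, not the paper's):

    ```python
    import nltk

    # tagger model for nltk.pos_tag (resource name may differ in newer NLTK)
    nltk.download("averaged_perceptron_tagger", quiet=True)

    review = "The battery life is excellent but the screen is dim"
    tagged = nltk.pos_tag(review.split())

    # pair each adjective (JJ*) with the nearest preceding noun (NN*)
    pairs = []
    for i, (word, tag) in enumerate(tagged):
        if tag.startswith("JJ"):
            nouns = [w for w, t in tagged[:i] if t.startswith("NN")]
            if nouns:
                pairs.append((nouns[-1], word))
    print(pairs)   # e.g. [('life', 'excellent'), ('screen', 'dim')]
    ```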

  8. Qualitative Methodology in Analyzing Educational Phenomena

    Directory of Open Access Journals (Sweden)

    Antonio SANDU

    2010-12-01

    Full Text Available Semiological analysis of educational phenomena allows researchers access to a multidimensional universe of meanings represented by the school, seen not so much as an institution but as a vector of social action through educational strategies. We consider education a multidimensional phenomenon, since its analysis allows the researcher to explore a variety of research hypotheses from different paradigmatic perspectives that converge on an educational finality. According to the author Simona Branc, one of the most appropriate methods used in qualitative data analysis is Grounded Theory; this approach assumes a systematic process of generating concepts and theories based on the data collected. Specialised literature defines Grounded Theory as an inductive approach that starts with general observations and, during the analytical process, creates conceptual categories that explain the theme explored. Researchers insist on the role of sociological theory in managing the research data and in providing ways of conceptualizing the descriptions and explanations. Qualitative content analysis is based on the constructivist paradigm (constructionist in the restricted sense that we used previously). It aims to create an "understanding of the latent meanings of the analyzed messages". Quantitative content analysis involves a process of encoding and statistical analysis of data extracted from the content of the paper, in the form of extractions like frequencies, contingency analysis, etc.

  9. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units—15 apartments and 25 single-family residences. Units range in size from 450 ft2 to 1,664 ft2 and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.

  10. Comparison of two dry chemistry analyzers and a wet chemistry analyzer using canine serum.

    Science.gov (United States)

    Lanevschi, Anne; Kramer, John W.

    1996-01-01

    Canine serum was used to compare seven chemistry analytes on two tabletop clinical dry chemistry analyzers, Boehringer's Reflotron and Kodak's Ektachem. Results were compared to those obtained on a wet chemistry reference analyzer, Roche Diagnostics' Cobas Mira. The analytes measured were urea nitrogen (BUN), creatinine, glucose, aspartate aminotransferase (AST), alanine aminotransferase (ALT), cholesterol and bilirubin. Nine to 12 canine sera with values in the low, normal, and high ranges were evaluated. The correlations were acceptable for all comparisons, with correlation coefficients greater than 0.98 for all analytes. Regression analysis revealed significant differences from the reference analyzer for both tabletop analyzers for cholesterol and bilirubin, and for glucose and AST on the Kodak Ektachem. The differences appeared to result from proportional systematic error occurring at high analyte concentrations.
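
    The kind of method comparison reported here can be reproduced in outline with scipy. The paired values below are invented for illustration; a slope different from 1 would mimic the proportional systematic error the authors describe at high analyte concentrations.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired glucose results from a tabletop analyzer and a
# wet-chemistry reference analyzer; the values are illustrative only.
reference = np.array([3.1, 4.2, 5.0, 5.6, 6.8, 8.9, 12.4, 15.0, 18.7])
tabletop  = np.array([3.0, 4.4, 5.1, 5.9, 7.1, 9.4, 13.2, 16.1, 20.2])

res = linregress(reference, tabletop)
print(f"r = {res.rvalue:.3f}")             # correlation between methods
print(f"slope = {res.slope:.3f}")          # proportional systematic error if != 1
print(f"intercept = {res.intercept:.3f}")  # constant systematic error if != 0
```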

  11. The Chemnitz LogAnalyzer: a tool for analyzing data from hypertext navigation research.

    Science.gov (United States)

    Brunstein, Angela; Naumann, Anja; Krems, Josef F

    2005-05-01

    Computer-based studies usually produce log files as raw data. These data cannot be analyzed adequately with conventional statistical software. The Chemnitz LogAnalyzer provides tools for quick and comfortable visualization and analyses of hypertext navigation behavior by individual users and for aggregated data. In addition, it supports analogous analyses of questionnaire data and reanalysis with respect to several predefined orders of nodes of the same hypertext. As an illustration of how to use the Chemnitz LogAnalyzer, we give an account of one study on learning with hypertext. Participants either searched for specific details or read a hypertext document to familiarize themselves with its content. The tool helped identify navigation strategies affected by these two processing goals and provided comparisons, for example, of processing times and visited sites. Altogether, the Chemnitz LogAnalyzer fills the gap between log files as raw data of Web-based studies and conventional statistical software.

  12. Analyzing the outcomes of health promotion practices.

    Science.gov (United States)

    Pereira Lima, Vera Lucia Góes; Arruda, José Maria; Barroso, Maria Auxiliadora Bessa; Lobato Tavares, Maria de Fátima; Ribeiro Campos, Nora Zamith; Zandonadil, Regina Celi Moreira Basílio; da Rocha, Rosa Maria; Parreira, Clélia Maria de Souza Ferreira; Cohen, Simone Cynamon; Kligerman, Débora Cynamon; Sperandio, Ana Maria Girotti; Correa, Carlos Roberto Silveira; Serrano, Miguel Malo

    2007-01-01

    This article focuses on health promotion (HP) outcomes, illustrated through the evaluation of case studies and the identification of strategies that have contributed to their success and sustainability. Evaluation research and practice in three distinct settings are discussed: (i) institutional and governmental agencies; (ii) communities in the Manguinhos complex and the Nova Iguaçu municipality; and (iii) the building of potentially healthy municipality networks. The effectiveness of a social program from a health promotion perspective was assessed through the "School for Parents" program, undertaken by the First Court of Childhood and Youth of Rio de Janeiro between 2001 and 2004. The analysis was grounded in the monitoring of 48 parents in charge of children under 18 who were, above all, victims of abuse, violence or negligence, and social exclusion. The study's objectives were: illustrating evidence of the effectiveness of health promotion, discussing the concept of HP effectiveness under unfavorable macro conditions, and identifying strategies that foster the sustainability of results. Institutional resources included a multi-professional staff, multidisciplinary approaches, participatory workshops, family case management, partnerships with public and private institutions, and volunteer and civil society sponsorship of the families. Evaluation was based on social impact indicators and on psychosocial and contextual determinants. Evaluation methods included program monitoring and quantitative-qualitative methods, through a longitudinal evaluation of 3 years, including one year post-program. The evaluation showed highly favorable results concerning "family integration", "quality of family relations" and "human rights mobilization". Unsatisfactory results, such as "lack of access to formal employment", are likely related to structural factors and the need for new public policies in areas such as education, professional training, housing, and access to formal employment. The training process

  13. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Directory of Open Access Journals (Sweden)

    Adam Bass

    2013-03-01

    Full Text Available Background: Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on the diagnostic performance of nephrologists and nephrology residents. Methods: We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results: After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of their final diagnoses (31.1% vs. 65.6%, p < 0.001, and 40.0% vs. 70.0%, p < 0.001, respectively). We found a significant interaction between experience and analytic processing strategy (p = 0.002): nephrology residents had significantly increased odds of diagnostic success when using scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.007), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.2). Discussion: Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation for the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy or to changes in knowledge structure with experience.
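
    For readers who want to reproduce statistics of the kind reported, a small sketch of computing an odds ratio with a Wald 95% confidence interval from a 2x2 table of diagnostic successes by reasoning strategy; the counts are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = successes with strategy 1, b = failures with strategy 1,
    c = successes with strategy 2, d = failures with strategy 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts: scheme-inductive vs. hypothetico-deductive reasoning.
print(odds_ratio_ci(24, 6, 14, 16))
```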

  14. ANALYZING THE SAVING AND TWIN DEFICITS CORRELATION: EVIDENCE FROM ROMANIA

    Directory of Open Access Journals (Sweden)

    2015-07-01

    Full Text Available The importance of saving, specifically in the particular case of Romania, is revealed by the fact that it supplies the financial resources necessary for a "healthy" economic recovery and growth. Accordingly, saving finances investment from domestic sources and does not expose the national economy to the risks and disturbances that arise when the economy is financed mainly with resources from foreign financial markets. The last financial crisis, which in many southern European countries is still ongoing, demonstrated again how "toxic" economic dependence on foreign capital inflows can be, the positive influence of national saving being more than relevant in this case. The purpose of this article is to determine whether there is a connection between saving and the so-called twin deficits, the public deficit and the current account deficit, as main macroeconomic indicators, in the economy of Romania during the years 2003-2013. Particular attention is also given to the national investment situation and to household saving behavior before and after the start of the financial crisis. The research method is based on a qualitative approach and mainly includes observation, value comparison, and critical analysis of theories and collected data. The first part of the research comprises a qualitative analysis of the theoretical relations and implications of the variables analyzed, while the second part presents an empirical study that includes tables and figures with the evolution of these three economic indicators in Romania. The conclusions of this article point to a correlation that is conditioned by economic cycles and their influence on the economy. The findings of this research should help economic policy makers improve their macroeconomic decisions.

  15. BECOMING A TOUR GUIDE: ANALYZING THE MOTIVATIONS

    Directory of Open Access Journals (Sweden)

    Monika PRAKASH

    2010-06-01

    Full Text Available Guides play a vital role in bringing satisfaction to tourists visiting a country, region, or state. The opportunity for direct interaction with tourists makes them all the more responsible for projecting the correct image of the country or region, giving factually correct information about the destination, ensuring the safety and well-being of the tourists, and making their stay pleasing and satisfying. Over the last few years there has been growing interest in the tour guide profession, especially in the northern region of India. The purpose of this study is to identify the motivations that lead to choosing tour guiding as a profession and career. There appears to be a significant difference in such motivations across regions of the country, so a comparison of motivations in two regions (north vs. east) was made. Based on primary data collection, the paper attempts to discuss what has motivated young people to take up the tour guiding profession and whether such motivation is positive or negative. In either case, policy makers may decide what type of support programs need to be introduced by the state and other agencies, such as education, training and counselling, financial support, social security, or other interventions.

  16. Mass Analyzers Facilitate Research on Addiction

    Science.gov (United States)

    2012-01-01

    The famous go/no go command for Space Shuttle launches comes from a place called the Firing Room. Located at Kennedy Space Center in the Launch Control Center (LCC), there are actually four Firing Rooms that take up most of the third floor of the LCC. These rooms comprise the nerve center for Space Shuttle launch and processing. Test engineers in the Firing Rooms operate the Launch Processing System (LPS), which is a highly automated, computer-controlled system for assembly, checkout, and launch of the Space Shuttle. LPS monitors thousands of measurements on the Space Shuttle and its ground support equipment, compares them to predefined tolerance levels, and then displays values that are out of tolerance. Firing Room operators view the data and send commands about everything from propellant levels inside the external tank to temperatures inside the crew compartment. In many cases, LPS will automatically react to abnormal conditions and perform related functions without test engineer intervention; however, firing room engineers continue to look at each and every happening to ensure a safe launch. Some of the systems monitored during launch operations include electrical, cooling, communications, and computers. One of the thousands of measurements derived from these systems is the amount of hydrogen and oxygen inside the shuttle during launch.

  17. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract. Background: Topological descriptors, other graph measures, and, in a broader sense, graph-theoretical methods have proven to be powerful tools for biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. This feature is important for characterizing biological networks more meaningfully instead of considering pure topological information only. Results: In this paper, we put the emphasis on analyzing a special type of biological network, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures, combined with other well-known descriptors, to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions: Our study demonstrates that applying entropic measures to molecules represented as graphs is useful for characterizing such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
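
    One simple instance of the entropic measures discussed is the Shannon entropy of a graph's vertex-label distribution. The sketch below (networkx; illustrative only, not the authors' exact indices) shows how a labeling adds information that an unlabeled graph lacks.

```python
import math
from collections import Counter
import networkx as nx

def vertex_label_entropy(graph, label_key="label"):
    """Shannon entropy of the vertex-label distribution, one simple way to
    quantify the information a labeling adds to a graph."""
    labels = [data[label_key] for _, data in graph.nodes(data=True)]
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy molecular graph: ethanol heavy atoms (C-C-O) with atom-type labels.
g = nx.Graph()
g.add_nodes_from([(0, {"label": "C"}), (1, {"label": "C"}), (2, {"label": "O"})])
g.add_edges_from([(0, 1), (1, 2)])
print(vertex_label_entropy(g))  # ~0.918 bits; a uniformly labeled graph gives 0
```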

  18. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely in academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc. or non-academic factors (career rewards, financial rewards, etc.. Furthermore, each researcher also has his/her different academic stances in their preferences about academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurture young researchers, to improve the standard of research and education as well as to boost collaboration in academia-industry. In particular, as open innovation is increasingly in need of the involvement of university researchers, to establish a successful approach to entice researchers into enterprises’ research, companies must comprehend the behavior of university researchers who have multiple complex motivations. The paper explores academic researchers' behaviors through optimizing their utility functions, i.e. the satisfaction obtained by their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research, Commitment (the effort to do the research, and Contribution (finding meaning in the research. Most of the previous research utilized the empirical methods to study researcher's motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researcher's behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researcher's behavior. Keywords: Academia-Industry, researcher behavior, ulrich model’s 3C.

  19. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models, with the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluation. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally and data intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), and is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) a diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps. We have implemented the
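
    As a hedged illustration of the first diagnostic named above, random forest feature importance ranking, here is a sketch on synthetic data; the variable names are invented and CMDA's actual implementation is not shown.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for a model-diagnosis task: rank which input variables
# best explain a target variable (e.g., cloud fraction vs. candidate drivers).
rng = np.random.default_rng(42)
n = 2000
X = rng.normal(size=(n, 3))  # hypothetical drivers: SST, humidity, omega500
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["SST", "humidity", "omega500"], forest.feature_importances_):
    print(f"{name}: {imp:.3f}")  # importance ranking; omega500 should rank last
```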

  20. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time (MTT) algorithm to analyze the wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island in the Mediterranean Basin and in the last fifty years has experienced large and dramatic wildfires, which caused losses, threatened urban interfaces, forests and natural areas, and damaged agricultural production. Historical fire and environmental data for the period 1995-2009 were used as input to estimate fine-scale burn probability, conditional flame length, and potential fire size in the study area. For this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distributions of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92×10⁻³, with a mean value of 6.48×10⁻⁵. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in the modeled outputs revealed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat terrain were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to informing wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean Basin.

  1. Analyzing cancer samples with SNP arrays.

    Science.gov (United States)

    Van Loo, Peter; Nilsen, Gro; Nordgard, Silje H; Vollan, Hans Kristian Moen; Børresen-Dale, Anne-Lise; Kristensen, Vessela N; Lingjærde, Ole Christian

    2012-01-01

    Single nucleotide polymorphism (SNP) arrays are powerful tools to delineate genomic aberrations in cancer genomes. However, the analysis of these SNP array data of cancer samples is complicated by three phenomena: (a) aneuploidy: due to massive aberrations, the total DNA content of a cancer cell can differ significantly from its normal two copies; (b) nonaberrant cell admixture: samples from solid tumors do not exclusively contain aberrant tumor cells, but always contain some portion of nonaberrant cells; (c) intratumor heterogeneity: different cells in the tumor sample may have different aberrations. We describe here how these phenomena impact the SNP array profile, and how these can be accounted for in the analysis. In an extended practical example, we apply our recently developed and further improved ASCAT (allele-specific copy number analysis of tumors) suite of tools to analyze SNP array data using data from a series of breast carcinomas as an example. We first describe the structure of the data, how it can be plotted and interpreted, and how it can be segmented. The core ASCAT algorithm next determines the fraction of nonaberrant cells and the tumor ploidy (the average number of DNA copies), and calculates an ASCAT profile. We describe how these ASCAT profiles visualize both copy number aberrations as well as copy-number-neutral events. Finally, we touch upon regions showing intratumor heterogeneity, and how they can be detected in ASCAT profiles. All source code and data described here can be found at our ASCAT Web site ( http://www.ifi.uio.no/forskning/grupper/bioinf/Projects/ASCAT/).

  2. Analyzing Sustainable Competitive Advantage: Strategically Managing Resource Allocations to Achieve Operational Competitiveness

    National Research Council Canada - National Science Library

    Nurul Aida Abdul Malek; Khuram Shahzad; Josu Takala; Stefan Bojnec; Drago Papler; Yang Liu

    2015-01-01

    .... This study demonstrates the competitive priorities of manufacturing strategy in hydro-power case company to evaluate the level of sustainable competitive advantage and also to further analyze how...

  3. Eastern Mediterranean Natural Gas: Analyzing Turkey’s Stance

    Directory of Open Access Journals (Sweden)

    Abdullah Tanrıverdi

    2013-10-01

    Full Text Available Recent large-scale natural gas discoveries in the Eastern Mediterranean have drawn attention to the region, causing both hope and tension. The new resources may serve as a source of hope for all relevant parties, and for the region, if managed in a collaborative and conciliatory way: energy may be a remedy for Cyprus' financial predicament, initiate a process for resolving the differences between Turkey and Cyprus, normalize Israel-Turkey relations, and so on. On the contrary, adopting a unilateral and uncooperative approach may aggravate the tension and undermine regional stability and security. In this sense, the role of energy in generating hope or tension depends on the approaches of the related parties. The article analyzes Turkey's attitude in the East Mediterranean case in terms of possible negative and positive implications for Turkey in the energy field, and examines Turkey's position and the reasons behind its stance. Considering Turkey's energy profile and energy policy goals, the article argues that the newly found hydrocarbons may bring greater stakes for Turkey if it adopts a cooperative approach in this case.

  4. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    summarize the information, there is a data-grouping mechanism that provides summary data on the number of entries and the maximum, minimum, and average values for different groups of records. Results. This technology has been tested in monitoring the requirements for additional professional education services and in identifying the educational needs of teachers and executives of educational organizations in the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours, and the survey was conducted over the course of a month. Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without the need for programming, with flexible assignment of the operating logic for forms.

  5. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
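
    The transformation of a science routine into a web service can be sketched in a few lines of Flask. The endpoint name, payload format, and seasonal_mean routine below are assumptions for illustration, not CMDA's actual interface.

```python
from flask import Flask, jsonify, request
import numpy as np

app = Flask(__name__)

def seasonal_mean(data, season_length=3):
    """Stand-in for an existing science routine: mean over one season."""
    arr = np.asarray(data, dtype=float)
    return float(arr[:season_length].mean())

@app.route("/seasonal_mean", methods=["POST"])
def seasonal_mean_service():
    payload = request.get_json()      # e.g. {"data": [1.2, 3.4, 2.2, ...]}
    result = seasonal_mean(payload["data"])
    return jsonify({"seasonal_mean": result})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # production would sit behind Gunicorn/Tornado
```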

  6. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
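
    The conditional sampling capability listed as item (5) can be illustrated with a short numpy sketch on synthetic data (not CMDA code): one variable is sampled where a second variable falls within a chosen percentile band.

```python
import numpy as np

def conditional_sample(x, y, lo_pct, hi_pct):
    """Return samples of x conditioned on y falling between two percentiles,
    e.g. cloud fraction sampled where vertical velocity is in its top decile."""
    lo, hi = np.percentile(y, [lo_pct, hi_pct])
    mask = (y >= lo) & (y <= hi)
    return x[mask]

rng = np.random.default_rng(1)
omega = rng.normal(size=10_000)                    # conditioning variable (toy)
cloud = 0.5 - 0.1 * omega + rng.normal(scale=0.05, size=10_000)

print(conditional_sample(cloud, omega, 90, 100).mean())  # x given y in top decile
print(conditional_sample(cloud, omega, 0, 10).mean())    # x given y in bottom decile
```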

  7. Evaluation and performance characteristics of the Q Hemostasis Analyzer, an automated coagulation analyzer.

    Science.gov (United States)

    Toulon, Pierre; Fischer, Florence; Appert-Flory, Anny; Jambou, Didier

    2014-05-01

    The Q Hemostasis Analyzer (Grifols, Barcelona, Spain) is a fully automated random-access multiparameter analyzer designed to perform coagulation, chromogenic and immunologic assays, and is equipped with a cap-piercing system. The instrument was evaluated in the hemostasis laboratory of a university hospital with respect to its technical features in the determination of coagulation assays, i.e., prothrombin time (PT), activated partial thromboplastin time (aPTT), thrombin time, fibrinogen and single coagulation factors V (FV) and VIII (FVIII); chromogenic assays [antithrombin (AT) and protein C activity]; and immunologic assays [von Willebrand factor antigen (vWF:Ag) concentration], using reagents from the analyzer manufacturer. Total precision (evaluated as the coefficient of variation) was below 6% for most parameters in both the normal and pathological ranges, the exceptions being FV, FVIII, AT and vWF:Ag in both normal and pathological samples. No carryover was detected in alternating aPTT measurements of a pool of normal plasma samples and the same pool spiked with unfractionated heparin (>1.5 IU/mL). The effective throughput was 154 PT, 66 PT/aPTT, 42 PT/aPTT/fibrinogen, and 38 PT/aPTT/AT panels per hour, corresponding to 154 down to 114 tests performed per hour, depending on the tested panel. Test results obtained on the Q Hemostasis Analyzer correlated well with those obtained on the ACL TOP analyzer (Instrumentation Laboratory), with r between 0.862 and 0.989. In conclusion, routine coagulation testing can be performed on the Q Hemostasis Analyzer with satisfactory precision, and the same applies to more specialized and specific tests.

  8. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory s sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth s crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. Basically all organisms on our planet generate energy through the Krebs Cycle, explains Mike Flynn, research scientist at NASA s Ames Research Center. This metabolic process breaks down sugars for energy

  9. Using Simulation to Analyze Acoustic Environments

    Science.gov (United States)

    Wood, Eric J.

    2016-01-01

    One of the main projects worked on this semester was creating an acoustic model of the Advanced Space Suit in Comsol Multiphysics. The geometry tools built into the software were used to create an accurate model of the helmet and upper torso of the suit. After running the simulation, plots of the sound pressure level within the suit were produced, as seen in Figure 1. These plots show significant nulls, which should be avoided when placing microphones inside the suit. In the future, this model can easily be adapted to changes in the suit design to determine optimal microphone placements and other acoustic properties. Another major project was creating an acoustic diverter that will potentially be used to route audio into the Space Station's Node 1. The concept was to create geometry to divert sound from a neighboring module, the US Lab, into Node 1; by doing this, no new audio equipment would need to be installed in Node 1. After creating an initial design for the diverter, analysis was performed in Comsol to determine how changes in geometry would affect acoustic performance, as shown in Figure 2. These results were used to produce a physical prototype diverter on a 3D printer. With the physical prototype, testing was conducted in an anechoic chamber to determine the true effectiveness of the design, as seen in Figure 3. The results from this testing have been compared to the Comsol simulation results to analyze how closely the simulations match real-world performance. While the Comsol results do not seem to closely match the real-world performance, this testing has provided valuable insight into how much trust can be placed in the results of Comsol simulations. A final project worked on during this tour was the Audio Interface Unit (AIU) design for the Orion program. The AIU is a small device that will be used as an audio communication device both during launch and on orbit. The unit will have functions

  10. Using Model Checking for Analyzing Distributed Power Control Problems

    Directory of Open Access Journals (Sweden)

    Thomas Brihaye

    2010-01-01

    Full Text Available Model checking (MC) is a formal verification technique which has enjoyed, and continues to enjoy, resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled as a timed game between a given transmitter and its environment, the authors wanted to know whether this approach could be applied to distributed PC. It turns out that it can be applied successfully, and it allows one to analyze realistic scenarios, including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state objectives that a transmitter-receiver pair would like to reach. The network is modeled by a game in which transmitters are treated as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to determine whether the desired properties are verified and to determine a winning strategy.

  11. THE THEORIES OF INCOMPLETE CONTRACTS IN ANALYZING THE COMPANY

    Directory of Open Access Journals (Sweden)

    Pacala Anca

    2012-07-01

    Full Text Available Incomplete contract theories have developed significantly in recent decades, although the insistence on rigorous models has left little room for empirical research. By formalizing and extending results from other theories, such as transaction cost theory, incomplete contract theory analyzes the caution displayed by the parties in anticipation of the opportunistic behavior that may follow the conclusion of a contract, especially in the case of specific investments, and how insufficient contractual protection measures can lead to inefficient levels of investment. Even the name - incomplete contract theory - suggests that the main concern is the limits of contracts: contracts fail to specify not only the investment ex ante but also many other unforeseen items that may appear ex post and that it would be desirable to include in such an arrangement. The explanation can be either bounded rationality or the excessive cost that writing such contracts would involve.

  12. Possibility of Earthquake-prediction by analyzing VLF signals

    Science.gov (United States)

    Ray, Suman; Chakrabarti, Sandip Kumar; Sasmal, Sudipta

    2016-07-01

    Prediction of seismic events is one of the most challenging tasks for the scientific community. The conventional way to predict earthquakes is to monitor crustal structure movements, though this method has not yet yielded satisfactory results and fails to give any short-term prediction. Recently, it has been noticed that prior to any seismic event a huge amount of energy is released, which may create disturbances in the lower part of the D-layer/E-layer of the ionosphere. This ionospheric disturbance may be used as a precursor of earthquakes. Since VLF radio waves propagate inside the waveguide formed by the lower ionosphere and the Earth's surface, the VLF signal may be used to identify ionospheric disturbances due to seismic activity. We have analyzed VLF signals to find correlations, if any, between VLF signal anomalies and seismic activities, both in case-by-case studies and in a statistical analysis using a whole year of data. In both approaches we found that the night-time amplitude of VLF signals fluctuated anomalously three days before the seismic events. We also found that the terminator time of the VLF signals shifted anomalously towards night time a few days before major seismic events. We calculated the D-layer preparation time and D-layer disappearance time from the VLF signals and observed that both become anomalously high 1-2 days before seismic events. We also found strong evidence that it may be possible to predict the location of the epicenters of future earthquakes by analyzing VLF signals along multiple propagation paths.
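
    A minimal sketch of the night-time amplitude screening described here might look as follows; the window length, threshold, and synthetic data are assumptions, not the authors' actual processing chain.

```python
import numpy as np

def flag_amplitude_anomalies(nightly_amplitude, window=15, k=3.0):
    """Flag nights whose mean VLF amplitude deviates more than k standard
    deviations from the mean of the preceding `window` nights."""
    x = np.asarray(nightly_amplitude, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        base = x[i - window:i]
        if abs(x[i] - base.mean()) > k * base.std():
            flags[i] = True
    return flags

rng = np.random.default_rng(7)
amp = rng.normal(50.0, 1.0, 120)  # synthetic nightly mean amplitudes (dB)
amp[100] -= 6.0                   # injected anomaly, e.g. 3 days pre-event
print(np.where(flag_amplitude_anomalies(amp))[0])  # index 100 should be flagged
```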

  13. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair-related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enabling other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smartphones are used to record wheelchair maneuvering data in real time. The recorded data are then periodically transmitted to the cloud for storage and analysis, and the analyzed results are made available to various types of users, such as mobile phone users and traditional desktop users. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smartphone's computing and data storage capabilities via the Internet. We performed a case study implementing the mobile cloud computing framework using Android smartphones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed framework.
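
    A generic sketch of the record-buffer-upload loop described above, in plain Python with a hypothetical endpoint standing in for the authors' Android and Google App Engine implementation:

```python
import json
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.com/api/maneuvers"  # hypothetical endpoint

def read_accelerometer():
    """Placeholder for a real sensor read (ax, ay, az in m/s^2)."""
    return {"t": time.time(), "ax": 0.1, "ay": 0.0, "az": 9.8}

def upload(batch):
    """POST one batch of buffered samples to the cloud for storage/analysis."""
    body = json.dumps({"samples": batch}).encode("utf-8")
    req = urllib.request.Request(CLOUD_ENDPOINT, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

buffer = []
for _ in range(50):               # record at ~10 Hz, upload every 50 samples
    buffer.append(read_accelerometer())
    time.sleep(0.1)

try:
    print("upload status:", upload(buffer))
except Exception as exc:          # the placeholder endpoint will reject this
    print("upload failed (expected with the stub endpoint):", exc)
```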

  14. Impact of instrumental constraints and imperfections on the dislocation structure in micron-sized Cu compression pillars

    Energy Technology Data Exchange (ETDEWEB)

    Kirchlechner, C., E-mail: christoph.kirchlechner@unileoben.ac.at [Erich Schmid Institute of Materials Science, Austrian Academy of Sciences (Austria); Department of Materials Physics, Montanuniversitaet Leoben (Austria); Keckes, J. [Department of Materials Physics, Montanuniversitaet Leoben (Austria); Motz, C.; Grosinger, W.; Kapp, M.W. [Erich Schmid Institute of Materials Science, Austrian Academy of Sciences (Austria); Micha, J.S.; Ulrich, O. [CEA-Grenoble/Institut Nanosciences et Cryogenie (France); CRG-IF BM32 at ESRF, European Synchrotron Radiation Facility, Grenoble (France); Dehm, G. [Erich Schmid Institute of Materials Science, Austrian Academy of Sciences (Austria); Department of Materials Physics, Montanuniversitaet Leoben (Austria)

    2011-08-15

    Highlights: → In situ µLaue compression tests on three 7 µm sized copper pillars were performed. → The evolution of dislocation structures is interlinked with the mechanical response. → Well-aligned samples do not store GNDs up to a strain of approximately 0.18. → Poorly aligned samples immediately store GNDs and form dislocation boundaries. - Abstract: In situ micro-Laue diffraction was used to study plasticity in three 7 µm, initially identical, single-crystalline Cu pillars during compression. Movements of the Laue spots as well as Laue spot streaking were analyzed to obtain real-time insights into the storage of excess dislocations and the possible formation of dislocation cell structures. The results reveal that instrumental constraints lead to dislocation storage at the sample base and top, but do not affect the storage of excess dislocations in the sample center in the case of ideal alignment. In contrast, misaligned samples show early yielding due to the activation of an unpredicted slip system, storage of excess dislocations also in the sample center and, at a later stage, the formation of a complex dislocation substructure.

  15. Studies of Job Analysis in Public Institutions: A Case Study of Muğla University (Kamu Kurumlarında İş Analizi Çalışmaları: Muğla Üniversitesi Örneği)

    Directory of Open Access Journals (Sweden)

    Edip ÖRÜCÜ

    2005-01-01

    Full Text Available In the context of the effectiveness of organizational activities, determining the limits and characteristics of jobs, departments, and units, and defining duties, are of crucial importance. Job analyses are carried out for these purposes. This study aims to determine the reactions of employees to job analyses conducted in their organizations. In this context, the study includes a literature survey on job analysis, and the findings obtained from a field study are analyzed. A questionnaire was administered to employees of the Faculty of Economic and Administrative Sciences and the Faculty of Arts and Sciences at Muğla University. We find that the reaction of employees in the Faculty of Economic and Administrative Sciences is weaker than that of the others, owing to their greater familiarity with job analysis.

  16. Decomposition methods for analyzing changes of industrial water use

    Science.gov (United States)

    Shang, Yizi; Lu, Shibao; Shang, Ling; Li, Xiaofei; Wei, Yongping; Lei, Xiaohui; Wang, Chao; Wang, Hao

    2016-12-01

    Changes in industrial water use are of the utmost significance in rapidly developing countries. Such countries are experiencing rapid industrialization, which may stimulate substantial increases in their future industrial water use, and local governments face challenges in formulating industrial policies for sustainable development, particularly in areas with severe water shortages. This study addresses the factors driving increased industrial water use and the degree to which each factor contributes, and asks whether the trend will change in the future. We explore options for quantitatively analyzing changes in industrial water use, adopting both the refined Laspeyres and the Logarithmic Mean Divisia Index (LMDI) models to decompose the driving forces of industrial water use, and we validate the decomposition results through a comparative empirical study. Using Tianjin, a national water-saving city in China, as a case study, we compare the performance of the two models. The driving forces of changes in industrial water use are summarized as output, technological, and structural forces. The comparative results indicate that the refined Laspeyres model may be preferable for this case, and further reveal that output and technology have long-term, stable effects on industrial water use, whereas structure may have an uncertain influence. The reduced water use may be a consequence of Tianjin's attempts to target water savings in other areas. We therefore advise the Tianjin local government to restructure local industries towards water-saving targets.
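
    The LMDI method rests on the logarithmic-mean weight L(a,b) = (a-b)/(ln a - ln b): the change in an aggregate W decomposes additively into one effect per driver, effect_x = L(W_T, W_0) ln(x_T/x_0). The sketch below applies this to a toy identity W = output × structure × intensity; all figures are invented for illustration.

```python
import math

def logmean(a, b):
    """Logarithmic mean, the weighting function used by the LMDI method."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_effects(factors_0, factors_T):
    """Additively decompose the change in W = product(factors) into one
    effect per factor: effect_x = L(W_T, W_0) * ln(x_T / x_0)."""
    w0 = math.prod(factors_0.values())
    wT = math.prod(factors_T.values())
    L = logmean(wT, w0)
    return {k: L * math.log(factors_T[k] / factors_0[k]) for k in factors_0}

# Illustrative data: output Q, industry structure share S, and water-use
# intensity I (water per unit output), so that W = Q * S * I.
base = {"output": 100.0, "structure": 0.40, "intensity": 2.5}   # W0 = 100
final = {"output": 160.0, "structure": 0.35, "intensity": 2.0}  # WT = 112

effects = lmdi_effects(base, final)
print(effects)                # per-driver contributions to the change in W
print(sum(effects.values()))  # equals WT - W0 = 12.0 (perfect additivity)
```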

  17. An extended Cellular Potts Model analyzing a wound healing assay.

    Science.gov (United States)

    Scianna, Marco

    2015-07-01

    A suitable Cellular Potts Model (CPM) is developed to reproduce and analyze an in vitro wound-healing assay. The proposed approach is able both to quantify the invasive capacity of the overall cell population and to evaluate selected determinants of single-cell movement (velocity, directional movement, and final displacement). In this respect, the present CPM allows us to capture differences and correlations in the migratory behavior of cells initially located at different distances from the wound edge. In the case of an undifferentiated extracellular matrix, the model predicts that maximal healing can be obtained by a chemically induced increment of cell elasticity, and not by a chemically induced downregulation of intercellular adhesive contacts. Moreover, in the case of two-component substrates (formed by a mesh of collagenous-like threads and by a homogeneous medium), CPM simulations show that both fiber number and cell-fiber adhesiveness influence cell speed and wound closure rate in a biphasic fashion. By contrast, the topology of the fibrous network affects the healing process by mediating productive directional cell movement. The paper, also equipped with comments on the computational cost of the CPM algorithm, ends with a thorough discussion of the pertinent experimental and theoretical literature. Copyright © 2015 Elsevier Ltd. All rights reserved.
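
    The core CPM update can be sketched compactly: a Hamiltonian with adhesion and volume-constraint terms, and copy attempts accepted with Metropolis probability. The lattice size, energies, and temperature below are illustrative choices, not the paper's parameters.

```python
import math
import random

import numpy as np

# Minimal 2D Cellular Potts sketch: one cell (id 1) in medium (id 0).
L = 30
lattice = np.zeros((L, L), dtype=int)
lattice[12:18, 12:18] = 1                 # initial square cell

J = {(0, 1): 4.0}                         # cell-medium adhesion energy
LAM, V_TARGET, TEMP = 1.0, 36, 3.0        # volume constraint and temperature

def adhesion(site, spin):
    """Adhesion energy of `site` holding `spin`, summed over 4 neighbors."""
    i, j = site
    e = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = lattice[(i + di) % L, (j + dj) % L]
        if nb != spin:
            e += J[tuple(sorted((spin, nb)))]
    return e

def volume_energy(vol):
    return LAM * (vol - V_TARGET) ** 2

def metropolis_step():
    """One copy attempt: a random site tries to adopt a neighbor's spin."""
    i, j = random.randrange(L), random.randrange(L)
    di, dj = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
    source = lattice[(i + di) % L, (j + dj) % L]
    target = lattice[i, j]
    if source == target:
        return
    vol = int((lattice == 1).sum())
    d_vol = 1 if source == 1 else -1      # cell grows or shrinks by one site
    dE = (adhesion((i, j), source) - adhesion((i, j), target)
          + volume_energy(vol + d_vol) - volume_energy(vol))
    if dE <= 0 or random.random() < math.exp(-dE / TEMP):
        lattice[i, j] = source

for _ in range(20000):
    metropolis_step()
print("final cell volume:", int((lattice == 1).sum()))  # should stay near 36
```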

  18. Clinical Characteristics and Prognostic Analysis of Acute Myeloid Leukemia Cases with MLL Gene Rearrangement (混合谱系白血病基因重排阳性急性髓系白血病的临床特点与预后分析)

    Institute of Scientific and Technical Information of China (English)

    季国; 王祥民; 石培民; 岑建农; 孙爱宁

    2013-01-01

    Objective: To investigate the clinical characteristics and prognosis of acute myeloid leukemia (AML) patients with mixed-lineage leukemia (MLL) gene rearrangement, in comparison with MLL rearrangement-negative AML patients from the same period. Methods: 51 de novo MLL rearrangement-positive AML (non-M3) cases were followed and retrospectively reviewed. Clinical features, cytomorphology, immunophenotype, cytogenetics, early death (ED), complete remission (CR) rate, recurrence rate, overall survival (OS) and response to transplantation were compared with those of 51 randomly selected MLL rearrangement-negative AML (non-M3) cases from the same period. Results: (1) Compared with the control group, MLL rearrangement-positive patients had significantly higher WBC counts, LDH levels and peripheral blast percentages (P<0.05), and a significantly higher proportion of FAB M4/M5 subtypes (P<0.05). (2) Expression of the monocytic surface markers CD14, CD64, CD15 and CD11b was significantly higher in the MLL rearrangement-positive group than in the controls (P<0.05). (3) The overall remission rate of MLL rearrangement-positive patients was 51.0% with a relapse rate of 42.3%, versus 72.5% and 18.9% in the control group; MLL rearrangement-positive patients thus had a lower remission rate and relapsed more readily (P<0.05). At the end of follow-up, the OS of MLL rearrangement-positive patients was 32.3%, significantly lower than that of the controls (P<0.05). Among MLL rearrangement-positive patients, the 3-year OS rate was 26.7% with chemotherapy alone and 60.0% with transplantation, showing that allogeneic peripheral blood hematopoietic stem cell transplantation markedly improved OS (P<0.05). Conclusion: MLL rearrangement-positive AML occurs frequently in AML-M4/M5, responds poorly to chemotherapy, relapses easily and has a poor prognosis; allogeneic peripheral blood hematopoietic stem cell transplantation can significantly improve survival.

  19. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    Directory of Open Access Journals (Sweden)

    Sayalee Narkhede

    2013-07-01

    Full Text Available In today’s Internet world, log file analysis is becoming a necessary task for analyzing the customer’sbehavior in order to improve advertising and sales as well as for datasets like environment, medical,banking system it is important to analyze the log data to get required knowledge from it. Web mining is theprocess of discovering the knowledge from the web data. Log files are getting generated very fast at therate of 1-10 Mb/s per machine, a single data center can generate tens of terabytes of log data in a day.These datasets are huge. In order to analyze such large datasets we need parallel processing system andreliable data storage mechanism. Virtual database system is an effective solution for integrating the databut it becomes inefficient for large datasets. The Hadoop framework provides reliable data storage byHadoop Distributed File System and MapReduce programming model which is a parallel processingsystem for large datasets. Hadoop distributed file system breaks up input data and sends fractions of theoriginal data to several machines in hadoop cluster to hold blocks of data. This mechanism helps toprocess log data in parallel using all the machines in the hadoop cluster and computes result efficiently.The dominant approach provided by hadoop to “Store first query later”, loads the data to the HadoopDistributed File System and then executes queries written in Pig Latin. This approach reduces the responsetime as well as the load on to the end system. This paper proposes a log analysis system using HadoopMapReduce which will provide accurate results in minimum response time.

  20. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships is of prime importance in orthodontic diagnosis and treatment planning. Manual quantitative analysis of facial parameters on photographs taken during smiling and speech is a difficult and time-consuming job. Since no comprehensive and user-friendly software package was available, we developed a program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 common variables are predefined as a default list. Once all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.

  1. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships is of prime importance in orthodontic diagnosis and treatment planning. Manual quantitative analysis of facial parameters on photographs taken during smiling and speech is a difficult and time-consuming job. Since no comprehensive and user-friendly software package was available, we developed a program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 common variables are predefined as a default list. Once all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.

  2. Analyzing the Causes of Regional Economic Disparities Based on Spatial Econometric Models: A Case Study of the Lan-Xin Railway Radiation Belt (基于空间计量经济模型的区域经济差异成因分析——以兰新铁路辐射带为例)

    Institute of Scientific and Technical Information of China (English)

    李建豹; 白永平; 李建虎; 侯成成

    2012-01-01

    利用空间分析法确定兰新铁路辐射带的辐射范围,以县级行政单元为基本研究单元,综合运用SPSS、GeoDA和ARCGIS分析区域经济差异后发现:在兰新铁路辐射带内,兰州市市辖区、乌鲁木齐市市辖区、嘉峪关市、哈密市、阿拉善左旗的经济发展水平明显比其它地区高,甘肃段内区域经济差异较大,新疆段内区域经济差异较小;区域经济空间集聚特征明显;利用空间计量经济模型分析区域经济差异成因可知,财政收入对经济发展具有明显的负面影响,市场规模、经济结构、工业化、虚拟变量对经济发展具有明显的促进作用,其中虚拟变量与经济发展水平的回归系数最大,说明农村城市化对经济发展的促进作用最大。区域投资水平对经济发展影响不明显。%In this paper we discussed economic development disparities and the causes by taking administrative county units in the radiation range through spatial analysis of GIS.Based on the ten relative economic indices in Lan-Xin railway radiation belt,the general score of regional economic level was calculated by SPSS,and the disparities of regional economic level in Lan-Xin railway radiation belt were analyzed by means of GIS spatial analysis provided by ARCGIS and GeoDA.Based on studying on the economic development disparities,some conclusions were drawn as follows.In Lan-Xin railway radiation belt,the regional economic level is significantly higher in the municipal of Lanzhou city,the municipal of Urumqi city,Jiayuguan city,Hami city,Alxa Left Banner than others.The economic disparities is larger in Gansu section,it is comparatively smaller in Xinjiang section.Regional economy takes on significantly spatial agglomeration.Based on the spatial econometric models,we analyzed the causes of regional economic disparities,and got some conclusions as follows.Financial receipt has obvious negative impact on economic development.Market scale

  3. Analysis of the Effect of Acupuncture Combined with the Xingshen Tongluo Decoction in Treating 47 Cases of Post-Stroke Dysphagia (针刺联合醒神通络方治疗中风后吞咽困难47例效果分析)

    Institute of Scientific and Technical Information of China (English)

    胡海

    2014-01-01

    Objective: To explore the value of acupuncture combined with Chinese herbal medicine in the treatment of post-stroke dysphagia. Methods: 94 patients with post-stroke dysphagia were randomly divided into an observation group and a control group of 47 cases each. The control group received routine acupuncture treatment, while the observation group received oral Chinese herbal medicine in addition to acupuncture. After one month of continuous treatment, the clinical therapeutic effects in the two groups were compared. Results: The clinical effective rate was 87.2% (41/47) in the observation group and 68.1% (32/47) in the control group, a statistically significant difference (P<0.05). Conclusion: Acupuncture combined with Chinese herbal medicine has a definite clinical effect in treating post-stroke dysphagia and can clearly improve patient prognosis; it is worth promoting.

  4. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
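
    For contrast with AnalyzeHOLE's model-based inversion, the conventional interpretation criticized above can be written in a few lines: apportion a known transmissivity across depth intervals in proportion to the flow change over each interval. A sketch with illustrative numbers, not values from the report:

        import numpy as np

        def conventional_k_profile(depths, flows, total_T):
            """Conventional flow-log interpretation: hydraulic conductivity
            of each interval is proportional to the flow change across it.
            depths: interval boundaries (m); flows: upward flow at each
            boundary while pumping (m^3/d); total_T: transmissivity (m^2/d)."""
            dq = -np.diff(flows)              # flow gained in each interval
            dz = np.diff(depths)              # interval thicknesses
            frac = dq / dq.sum()              # fraction of total inflow
            return frac * total_T / dz        # K = (share of T) / thickness

        depths = np.array([10, 20, 30, 40.0])    # m below datum
        flows  = np.array([100, 80, 30, 0.0])    # m^3/d at each boundary
        print(conventional_k_profile(depths, flows, total_T=50.0))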

  5. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  6. 40 CFR 92.119 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 92... Hydrocarbon analyzer calibration. The HFID hydrocarbon analyzer shall receive the following initial and... into service and at least annually thereafter, the HFID hydrocarbon analyzer shall be adjusted...

  7. 40 CFR 86.1321-94 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Procedures § 86.1321-94 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the... into service and at least annually thereafter, the FID hydrocarbon analyzer shall be adjusted...

  8. 40 CFR 91.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 91....316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon analyzer as described... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified...

  9. 40 CFR 89.319 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 89... Equipment Provisions § 89.319 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall... and at least annually thereafter, adjust the FID hydrocarbon analyzer for optimum hydrocarbon...

  10. A semiconductor parameter analyzer for ionizing radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Luiz A.P., E-mail: lasantos@cnen.gov.b [Centro Regional de Ciencias Nucleares (CRCN-NE/CNEN-PE), Recife, PE (Brazil)

    2009-07-01

    Electrometers and ion chambers are normally used to make several types of measurements in a radiation field, and a single voltage is applied to each detector type. Electronic devices built from semiconductor materials such as silicon crystal can also be used for the same purpose. In this case, a characteristic curve of the device must be acquired in order to choose an operating point, which consists of an electrical current or voltage to be applied to the device. Unlike ion chambers, such an electronic device can have different operating points depending on its current-versus-voltage (I x V) curve. The best operating point of the device is also a function of the radiation type, energy, dose rate and fluence. The purpose of this work is to present a semiconductor parameter analyzer built to acquire I x V curves in the usual way; the innovation is that it can also obtain such a parametric curve while a quad-polar device is under irradiation. The results demonstrate that the system is a very important tool for scientists who wish to evaluate a semiconductor detector before, during and after irradiation. A collection of results for devices under an X-ray beam and a neutron fluence is presented: photodiode, phototransistors, bipolar transistor and MOSFET. (author)
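
    As an example of the parametric I x V characterization described here, the sketch below fits the textbook Shockley diode equation to a synthetic photodiode sweep with scipy.optimize.curve_fit; the data, constants and model choice are illustrative assumptions, not the author's instrument:

        import numpy as np
        from scipy.optimize import curve_fit

        VT = 0.02585  # thermal voltage kT/q at ~300 K, in volts

        def diode(v, i_s, n):
            """Shockley diode equation: I = Is * (exp(V / (n*Vt)) - 1)."""
            return i_s * np.expm1(v / (n * VT))

        # Synthetic I-V data standing in for a measured photodiode sweep.
        np.random.seed(0)
        v = np.linspace(0.1, 0.6, 20)
        i = diode(v, 1e-12, 1.8) * (1 + 0.02 * np.random.randn(v.size))

        (i_s, n), _ = curve_fit(diode, v, i, p0=(1e-12, 2.0))
        print(f"saturation current = {i_s:.2e} A, ideality factor = {n:.2f}")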

  11. The minimal requirements to use calcium imaging to analyze ICRAC.

    Science.gov (United States)

    Alansary, Dalia; Kilch, Tatiana; Holzmann, Christian; Peinelt, Christine; Hoth, Markus; Lis, Annette

    2014-06-02

    Endogenous calcium release-activated channel (CRAC) currents are usually quite small and not always easy to measure using the patch-clamp technique. While we have, for instance, successfully recorded very small CRAC currents in primary human effector T cells, we have not yet managed to record CRAC in naïve primary human T cells. Many groups, including ours, therefore use Ca(2+) imaging technologies to analyze CRAC-dependent Ca(2+) influx. However, Ca(2+) signals are quite complex and depend on many different transporter activities; thus, it is not trivial to make quantitative statements about one single transporter, in this case CRAC channels. Therefore, a detailed patch-clamp analysis of ICRAC is always preferred. Since many laboratories use Ca(2+) imaging for ICRAC analysis, we detail here the minimal requirements for reliable measurements. Ca(2+) signals not only depend on the net Ca(2+) influx through CRAC channels but also depend on other Ca(2+) influx mechanisms, K(+) channels or Cl(-) channels (which determine the membrane potential), Ca(2+) export mechanisms like plasma membrane Ca(2+) ATPase (PMCA), sarco/endoplasmic reticulum Ca(2+) ATPase (SERCA) or Na(+)-Ca(2+) exchangers, and (local) Ca(2+) buffering often by mitochondria. In this protocol, we summarize a set of experiments that allow (quantitative) statements about CRAC channel activity using Ca(2+) imaging experiments, including the ability to rule out Ca(2+) signals from other sources.
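
    The caveat above, that a Ca(2+) trace reflects a balance of influx and export rather than CRAC activity alone, can be made concrete with a toy flux-balance model; the rate law and all constants below are illustrative assumptions, not the authors' protocol:

        import numpy as np
        from scipy.integrate import odeint

        def dca_dt(ca, t, j_crac, v_pmca, k_pmca):
            """Toy cytosolic Ca2+ balance (concentrations in uM, time in s):
            constant CRAC influx minus Michaelis-Menten PMCA export.
            SERCA, NCX and mitochondrial buffering are deliberately omitted."""
            return j_crac - v_pmca * ca / (k_pmca + ca)

        t = np.linspace(0, 120, 200)
        ca = odeint(dca_dt, y0=0.05, t=t, args=(0.02, 0.05, 0.3))
        # The plateau satisfies influx == export, so the same Ca2+ trace can
        # arise from different CRAC activities if export activity also differs.
        print(ca[-1, 0])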

  12. Analyzing learning during Peer Instruction dialogues: A resource activation framework

    Science.gov (United States)

    Wood, Anna K.; Galloway, Ross K.; Hardy, Judy; Sinclair, Christine M.

    2014-12-01

    Peer Instruction (PI) is an evidence based pedagogy commonly used in undergraduate physics instruction. When asked questions designed to test conceptual understanding, it has been observed that the proportion of students choosing the correct answer increases following peer discussion; however, relatively little is known about what takes place during these discussions or how they are beneficial to the processes of learning physics [M. C. James and S. Willoughby, Am. J. Phys. 79, 123 (2011)]. In this paper a framework for analyzing PI discussions developed through the lens of the "resources model" [D. Hammer, Am. J. Phys. 64, 1316 (1996); D. Hammer et al., Information Age Publishing (2005)] is proposed. A central hypothesis for this framework is that the dialogue with peers plays a crucial role in activating appropriate cognitive resources, enabling the students to see the problem differently, and therefore to answer the questions correctly. This framework is used to gain greater insights into the PI discussions of first year undergraduate physics students at the University of Edinburgh, UK, which were recorded using Livescribe Smartpens. Analysis of the dialogues revealed three different types of resource activation corresponding to increasing cognitive grain size. These were activation of knowledge elements, activation of linkages between knowledge elements, and activation of control structures (epistemic games and epistemological frames). Three case studies are examined to illustrate the role that peer dialogue plays in the activation of these cognitive resources in a PI session. The implications for pedagogical practice are discussed.

  13. Analyzing Nonblocking Switching Networks using Linear Programming (Duality)

    CERN Document Server

    Ngo, Hung Q; Le, Anh N; Nguyen, Thanh-Nhan

    2012-01-01

    The main task in analyzing a switching network design (including circuit-, multirate-, and photonic-switching) is to determine the minimum number of some switching components so that the design is non-blocking in some sense (e.g., strict- or wide-sense). We show that, in many cases, this task can be accomplished with a simple two-step strategy: (1) formulate a linear program whose optimum value is a bound for the minimum number we are seeking, and (2) specify a solution to the dual program, whose objective value by weak duality immediately yields a sufficient condition for the design to be non-blocking. We illustrate this technique through a variety of examples, ranging from circuit to multirate to photonic switching, from unicast to $f$-cast and multicast, and from strict- to wide-sense non-blocking. The switching architectures in the examples are of Clos-type and Banyan-type, which are the two most popular architectural choices for designing non-blocking switching networks. To prove the result in the multir...
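
    The two-step strategy can be shown in miniature on a toy LP (not one of the paper's network formulations): solve the primal for the bound, then exhibit a dual-feasible point whose objective certifies that bound by weak duality.

        from scipy.optimize import linprog

        # Primal (toy): maximize 3x + 2y s.t. x + y <= 4, x <= 2, x, y >= 0.
        # linprog minimizes, so negate the objective.
        res = linprog(c=[-3, -2], A_ub=[[1, 1], [1, 0]], b_ub=[4, 2],
                      bounds=[(0, None), (0, None)])
        print(-res.fun)          # optimum = 10 at (x, y) = (2, 2)

        # Step 2: dual feasibility certifies the bound by weak duality.
        # Dual: minimize 4u + 2v s.t. u + v >= 3, u >= 2, u, v >= 0.
        u, v = 2, 1              # dual-feasible by inspection
        print(4 * u + 2 * v)     # = 10 >= every primal objective value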

  14. Second Generation Integrated Composite Analyzer (ICAN) Computer Code

    Science.gov (United States)

    Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.

    1993-01-01

    This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
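
    As a flavor of the ply-level micromechanics such codes implement, the sketch below evaluates the textbook rule of mixtures for the longitudinal modulus and its inverse for the transverse modulus; these are the standard Voigt/Reuss bounds, not ICAN's exact equations:

        def rule_of_mixtures(e_fiber, e_matrix, v_fiber):
            """Textbook ply micromechanics:
            E1 = Vf*Ef + (1-Vf)*Em            (longitudinal, Voigt bound)
            E2 = 1 / (Vf/Ef + (1-Vf)/Em)      (transverse, Reuss bound)."""
            v_matrix = 1.0 - v_fiber
            e1 = v_fiber * e_fiber + v_matrix * e_matrix
            e2 = 1.0 / (v_fiber / e_fiber + v_matrix / e_matrix)
            return e1, e2

        # Carbon/epoxy-like constituents, moduli in GPa, 60% fiber volume.
        e1, e2 = rule_of_mixtures(e_fiber=230.0, e_matrix=3.5, v_fiber=0.6)
        print(f"E1 = {e1:.1f} GPa, E2 = {e2:.1f} GPa")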

  15. Modeling and Analyzing Taxi Congestion Premium in Congested Cities

    Directory of Open Access Journals (Sweden)

    Changwei Yuan

    2017-01-01

    Traffic congestion is a significant problem in many major cities. When a taxicab is stuck in traffic, the mileage it covers per unit time declines significantly, and the congestion premium (the so-called low-speed fare) has become an increasingly important income source for taxi drivers. However, the impact of the congestion premium on the taxicab market is not yet widely understood; in particular, modeling and analysis of taxi fare structures with a congestion premium are extremely limited. In this paper, we developed a taxi price equilibrium model in which the adjustment mechanism of the congestion premium, for optimizing the taxi driver's income, balancing supply and demand, and eventually improving the level of service of the whole taxicab market, was investigated. Finally, we provide a case study demonstrating the feasibility of the proposed model. The results indicate that the current taxi fare scheme in Beijing is suboptimal, since the gain from raising the congestion premium cannot compensate for the loss from the demand reduction. Conversely, the optimal fare scheme suggested by our model can effectively reduce excessive demand and reach the supply-demand equilibrium while keeping the driver's income stable to the maximum extent.
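
    The equilibrium idea can be illustrated with a toy market-clearing computation: linear demand falling in the effective fare, supply rising in it, solved for the congestion premium that balances the two. All functional forms and constants below are invented for illustration and are not the paper's model:

        from scipy.optimize import brentq

        def excess_demand(premium, base_fare=13.0):
            """Toy taxicab market: demand falls with the effective fare
            (base fare + congestion premium); supply of trips rises with it."""
            fare = base_fare + premium
            demand = 1000.0 - 25.0 * fare       # trips/hour demanded
            supply = 200.0 + 18.0 * fare        # trips/hour supplied
            return demand - supply

        # Root of excess demand = market-clearing congestion premium.
        p_star = brentq(excess_demand, 0.0, 20.0)
        print(f"equilibrium congestion premium = {p_star:.2f} yuan")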

  16. Analyzing spatial data from mouse tracker methodology: An entropic approach.

    Science.gov (United States)

    Calcagnì, Antonio; Lombardi, Luigi; Sulpizio, Simone

    2017-01-11

    Mouse tracker methodology has recently been advocated to explore the motor components of the cognitive dynamics involved in experimental tasks like categorization, decision-making, and language comprehension. This methodology relies on the analysis of computer-mouse trajectories, by evaluating whether they significantly differ in terms of direction, amplitude, and location when a given experimental factor is manipulated. In this kind of study, a descriptive geometric approach is usually adopted in the analysis of raw trajectories, where they are summarized with several measures, such as maximum-deviation and area under the curve. However, using raw trajectories to extract spatial descriptors of the movements is problematic due to the noisy and irregular nature of empirical movement paths. Moreover, other significant components of the movement, such as motor pauses, are disregarded. To overcome these drawbacks, we present a novel approach (EMOT) to analyze computer-mouse trajectories that quantifies movement features in terms of entropy while modeling trajectories as composed by fast movements and motor pauses. A dedicated entropy decomposition analysis is additionally developed for the model parameters estimation. Two real case studies from categorization tasks are finally used to test and evaluate the characteristics of the new approach.
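
    In the spirit of the approach, though not the authors' estimator, the sketch below separates motor pauses from fast movements with a velocity threshold and computes the Shannon entropy of the remaining movement headings:

        import numpy as np

        def movement_entropy(xy, dt=0.01, v_pause=50.0, n_bins=12):
            """Entropy of movement headings. xy: (n, 2) cursor samples in
            pixels at interval dt seconds; samples slower than v_pause px/s
            are treated as motor pauses and excluded."""
            steps = np.diff(np.asarray(xy, dtype=float), axis=0)
            speed = np.linalg.norm(steps, axis=1) / dt
            moving = steps[speed > v_pause]
            angles = np.arctan2(moving[:, 1], moving[:, 0])
            counts, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
            p = counts[counts > 0] / counts.sum()
            return -(p * np.log2(p)).sum()   # bits; higher = more irregular

        # A straight drag toward a target has low heading entropy (0 bits).
        traj = np.column_stack([np.linspace(0, 300, 60), np.linspace(0, 80, 60)])
        print(movement_entropy(traj))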

  17. Scientific horizons at the Institut Laue-Langevin

    Indian Academy of Sciences (India)

    Richard Wagner

    2008-11-01

    In order to maintain the scientific value of the ILL and to respond to the changing needs of its broad user community throughout Europe and beyond, the Millennium Programme (Phase M-0, 2001–2008) was launched in 2001 to boost the quality of ILL's experimental facilities. The ongoing renewal programme has been directed towards renewing some of the neutron delivery systems – neutron guides and beam tubes – towards improvement of the sample environment and installation of user-friendly data acquisition/handling systems, and in particular towards upgrading neutron instrumentation. Already at this stage, the upgrading of eleven instruments and some neutron delivery systems has increased the overall efficiency (expressed in terms of count rates) of ILL's instrument suite by more than a factor of 14. This not only makes a neutron scattering experiment much shorter but, in particular, opens access to new scientific challenges, such as in nanoscience and biology, where often only small samples are available. In the recently launched Phase M-1 (2007–2013), five new instruments as well as new neutron guide systems will be built, with the emphasis on neutron diffraction and spectroscopy with cold neutrons. This will further boost the application of neutron scattering techniques to materials science problems and condensed matter research, and will keep the institute at the forefront of neutron science well beyond 2020.

  18. Lab-on-a-chip Astrobiology Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer will...

  19. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  20. Analysis of adverse treatment reactions in 138 cases of human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) complicated with non-tuberculous mycobacterial (NTM) lung disease

    Institute of Scientific and Technical Information of China (English)

    银春莲; 谢周华; 裴洁; 阮光靖

    2015-01-01

    Objective: To investigate the incidence and management of adverse reactions during treatment of HIV/AIDS complicated with NTM lung disease. Methods: The adverse reactions and their management in 138 patients with HIV/AIDS complicated with NTM lung disease, treated in our hospital from 2010 to 2013, were retrospectively analyzed, covering the age and sex of the patients and the time of occurrence, types, clinical manifestations and management of adverse drug reactions during treatment. Results: 70% of adverse drug reactions occurred within 1-2 months of starting treatment. The three leading types were drug-induced liver injury (56 cases), drug rash (34 cases) and hyperuricemia (28 cases). 76% of patients with drug-induced liver injury or hyperuricemia had no remarkable symptoms. Adverse drug reactions were more common in males and in patients aged 58-70 years. After symptomatic treatment and changing or stopping the suspected drugs, or stopping all drugs, the adverse reactions disappeared and the patients' condition gradually improved. Conclusion: During treatment of HIV/AIDS complicated with NTM lung disease, adverse drug reactions should be carefully monitored, especially in male and elderly patients; even patients without obvious clinical symptoms should be closely observed and periodically reviewed, so that reactions are detected early and treated properly and serious adverse drug reactions are avoided.

  1. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86...-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1221-90 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the following initial and periodic calibrations. (a) Initial and...

  2. 40 CFR 90.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 90... Equipment Provisions § 90.316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified...

  3. 40 CFR 86.331-79 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86....331-79 Hydrocarbon analyzer calibration. The following steps are followed in sequence to calibrate the hydrocarbon analyzer. It is suggested, but not required, that efforts be made to minimize relative...

  4. 40 CFR 86.121-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Complete Heavy-Duty Vehicles; Test Procedures § 86.121-90 Hydrocarbon analyzer calibration. The hydrocarbon... FID and HFID hydrocarbon analyzers shall be adjusted for optimum hydrocarbon response....

  5. 40 CFR 86.521-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Regulations for 1978 and Later New Motorcycles; Test Procedures § 86.521-90 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall receive the following initial and periodic calibration....

  6. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864....5680 Automated heparin analyzer. (a) Identification. An automated heparin analyzer is a device used to determine the heparin level in a blood sample by mixing the sample with protamine (a...

  7. 21 CFR 882.1420 - Electroencephalogram (EEG) signal spectrum analyzer.

    Science.gov (United States)

    2010-04-01

    ....1420 Electroencephalogram (EEG) signal spectrum analyzer. (a) Identification. An electroencephalogram (EEG) signal spectrum analyzer is a device used to display the frequency content or power spectral... analyzer. 882.1420 Section 882.1420 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH...

  8. 21 CFR 1230.32 - Analyzing of samples.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Analyzing of samples. 1230.32 Section 1230.32 Food... FEDERAL CAUSTIC POISON ACT Administrative Procedures § 1230.32 Analyzing of samples. Samples collected by an authorized agent shall be analyzed at the laboratory designated by the Food and...

  9. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indirect pacemaker generator function analyzer... Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or...

  10. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  11. 40 CFR 90.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Emission Test Equipment Provisions § 90.320 Carbon dioxide analyzer calibration. (a) Prior to its initial... carbon dioxide analyzer as follows: (1) Follow good engineering practices for instrument start-up...

  12. 21 CFR 868.1400 - Carbon dioxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Carbon dioxide gas analyzer. 868.1400 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1400 Carbon dioxide gas analyzer. (a) Identification. A carbon dioxide gas analyzer is a device intended to measure the concentration of carbon...

  13. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food... DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1670 Neon gas analyzer. (a) Identification. A neon gas analyzer is a device intended to measure the concentration of neon in a gas mixture exhaled by...

  14. 21 CFR 868.2380 - Nitric oxide analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitric oxide analyzer. 868.2380 Section 868.2380...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Monitoring Devices § 868.2380 Nitric oxide analyzer. (a) Identification. The nitric oxide analyzer is a device intended to measure the concentration of nitric oxide...

  15. Analyzing development of working models for disrupted attachments: the case of hidden family violence.

    Science.gov (United States)

    Ayoub, Catherine C; Fischer, Kurt W; O'Connor, Erin E

    2003-06-01

    This article offers a developmental model of attachment theory rooted in dynamic skill theory. Dynamic skill theory is based on the assumption that people do not have integrated, fundamentally logical minds, but instead develop along naturally fractionated strands of a web. Contrary to traditional interpretations of attachment theory, dynamic skill theory proposes that individuals continue to modify their working models of attachments throughout the lifespan. In particular, working models of close relationships develop systematically through a series of skill levels such that the skills vary across strands in the web and will not automatically form a unified whole. The continual modification of working models is particularly pertinent for the consequences of hidden family violence for individuals' development. Dynamic skill theory shows how trauma can produce not developmental delay or fixation, as has been proposed previously, but instead the construction of advanced, complex working models.

  16. Analyzing PSU’s Performance: A Case from Ministry of Petroleum and Natural Gas of India

    Directory of Open Access Journals (Sweden)

    Chia-Nan Wang

    2013-01-01

    High economic growth in the past few years and increasing industrialization, coupled with a burgeoning population, have created much concern about India's energy scenario. India's crude oil production has not grown significantly in the last 10 or more years, whereas its refining capacity has grown by more than 20% over the last 5 years. Oil consumption is growing at approximately 4.1% per year and natural gas consumption at 68% per year. Therefore, evaluating performance and pushing energy companies to improve have become important issues. The purpose of this research is to evaluate the performance of the Indian energy industry under multiple input and output criteria. Data envelopment analysis (DEA) and grey theory are used to conduct this study. There are a total of 14 public sector undertakings (PSUs) in this industry and no private companies; however, only 10 of them are mature enough to be listed on Indian stock markets, so the real data of these 10 companies are used for the evaluation. The results show that Gas Authority of India Limited (GAIL), Chennai Petroleum Corporation Limited (CPCL) and Oil India Limited (OIL) are the top 3 in the ranking. This integrated numerical study gives better "past-present-future" insights into performance evaluation in the Indian energy industry.
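
    As an illustration of the DEA component, the sketch below solves the standard input-oriented CCR efficiency LP with scipy.optimize.linprog; the model variant, and the single-input, single-output toy data, are assumptions rather than the paper's exact setup:

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, k):
            """Input-oriented CCR DEA efficiency of unit k.
            X: (n_units, n_inputs), Y: (n_units, n_outputs).
            min theta  s.t.  X' lam <= theta * x_k,  Y' lam >= y_k,  lam >= 0."""
            n, m = X.shape
            s = Y.shape[1]
            c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam]
            A_in = np.c_[-X[k].reshape(m, 1), X.T]      # inputs:  X'lam - theta*x_k <= 0
            A_out = np.c_[np.zeros((s, 1)), -Y.T]       # outputs: -Y'lam <= -y_k
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[k]],
                          bounds=[(0, None)] * (n + 1))
            return res.fun   # theta in (0, 1]; 1.0 means efficient

        X = np.array([[2.0], [4.0], [8.0]])   # input (e.g., capital employed)
        Y = np.array([[2.0], [4.0], [4.0]])   # output (e.g., refined product)
        print([round(ccr_efficiency(X, Y, k), 3) for k in range(3)])  # [1.0, 1.0, 0.5]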

  17. Analyzing the subsurface structure using seismic refraction method: Case study STMKG campus

    Energy Technology Data Exchange (ETDEWEB)

    Wibowo, Bagus Adi, E-mail: bagusadiwibowo1993@gmail.com [The State College of Meteorology, Climatology and Geophysics (STMKG), The Indonesian Meteorology, Climatology, and Geophysics Agency (BMKG), Perhubungan 1 Street, South Tangerang, 15221 (Indonesia); Ngadmanto, Drajat [The Center of Research and Development (PUSLITBANG), The Indonesian Meteorology, Climatology, and Geophysics Agency (BMKG), Angkasa I, Jakarta, 10620 (Indonesia); Daryono [The Mitigation of Earthquake and Tsunami, The Indonesian Meteorology, Climatology, and Geophysics Agency (BMKG), Angkasa I, Jakarta, 10620 (Indonesia)

    2015-04-24

    A geophysical survey was performed to detect the subsurface structure under the STMKG Campus in Pondok Betung, South Tangerang, Indonesia, using the seismic refraction method. The survey used a PASI 16S24-U24 instrument. Waveform data were acquired from 3 different tracks at the research location, each close to the others. On each track we deployed 24 geophones with a receiver spacing of 2 meters, giving a total track length of about 48 meters. The waveform data were analyzed in two ways: first with the seismic refraction application WINSISIM 12, and second with the Hagiwara method. From both analyses we obtained the P-wave velocity of the first and second layers and the thickness of the first layer, from which 2-D vertical subsurface profiles were constructed. In this survey only 2 layers were detected on each track. The P-wave velocity of the first layer is about 200-500 m/s and its thickness about 3-6 m; the P-wave velocity of the second layer is about 400-900 m/s. From the P-wave velocity data we interpret both layers as consisting of similar materials such as topsoil, soil, sand, unsaturated gravel, alluvium and clay; the velocity difference between the two layers is assumed to arise because the first layer is an embankment (fill) layer, younger than the layer below.
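
    A standard way to turn such first-arrival picks into a first-layer thickness is the two-layer intercept-time formula z = (t_i / 2) * v1 * v2 / sqrt(v2^2 - v1^2). A worked sketch with velocities inside the ranges reported above and a hypothetical intercept time:

        import math

        def layer_thickness(t_intercept, v1, v2):
            """Two-layer refraction intercept-time formula:
            z = (t_i / 2) * v1 * v2 / sqrt(v2**2 - v1**2)."""
            return 0.5 * t_intercept * v1 * v2 / math.sqrt(v2**2 - v1**2)

        v1, v2 = 400.0, 800.0   # m/s, within the reported layer velocities
        t_i = 0.020             # s, hypothetical intercept from a travel-time plot
        print(f"first-layer thickness = {layer_thickness(t_i, v1, v2):.1f} m")
        # gives about 4.6 m, consistent with the reported 3-6 m range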

  18. Case Studies in Critical Ecoliteracy: A Curriculum for Analyzing the Social Foundations of Environmental Problems

    Science.gov (United States)

    Turner, Rita; Donnelly, Ryan

    2013-01-01

    This article outlines the features and application of a set of model curriculum materials that utilize eco-democratic principles and humanities-based content to cultivate critical analysis of the cultural foundations of socio-environmental problems. We first describe the goals and components of the materials, then discuss results of their use in…

  19. A case study of analyzing student teachers' concept images of the ...

    African Journals Online (AJOL)

    Administrator

    Four categories of studies deal with the teaching and learning of specific key ... these researches focus on difficulties met by students and teachers while learning and teaching ..... Verbal (oral and ..... computer laboratory teaching environment.

  20. LLNL Center for Microtechnology: Capabilities, Customers, Case Study-HANAA (Handheld Nucleic Acid Analyzer)

    Energy Technology Data Exchange (ETDEWEB)

    Mariella, R

    2002-12-30

    The polymerase chain reaction (PCR) is an enzyme-based chemical reaction that manufactures copies of one or more identifying regions of double-stranded DNA sequences (target sequences). These copies of target DNA are known as "amplicons". By creating millions of copies of the identifying sequences (when they are present!), PCR allows researchers to detect them, and hence the presence of the relevant organism, with techniques such as electrophoresis, flow cytometry, or spectrometry. Although there are numerous commercial PCR instruments designed for bench-top use in a laboratory, building a battery-powered instrument that can perform such assays in the field poses significant challenges.