WorldWideScience

Sample records for humphrey field analyser

  1. Visual field examination method using virtual reality glasses compared with the Humphrey perimeter

    Directory of Open Access Journals (Sweden)

    Tsapakis S

    2017-08-01

Full Text Available Stylianos Tsapakis, Dimitrios Papaconstantinou, Andreas Diagourtas, Konstantinos Droutsas, Konstantinos Andreanos, Marilita M Moschos, Dimitrios Brouzas 1st Department of Ophthalmology, National and Kapodistrian University of Athens, Athens, Greece Purpose: To present a visual field examination method using virtual reality glasses and to evaluate its reliability by comparing the results with those of the Humphrey perimeter. Materials and methods: Virtual reality glasses, a smartphone with a 6-inch display, and software implementing a fast-threshold 3 dB step staircase algorithm for the central 24° of the visual field (52 points) were used to test 20 eyes of 10 patients, who were tested in a random and consecutive order as they appeared in our glaucoma department. The results were compared with those obtained from the same patients using the Humphrey perimeter. Results: A high correlation coefficient (r=0.808, P<0.0001) was found between the virtual reality visual field test and the Humphrey perimeter visual field. Conclusion: Visual field examination results using virtual reality glasses correlate highly with the Humphrey perimeter, making the method potentially suitable for clinical use. Keywords: visual fields, virtual reality glasses, perimetry, visual fields software, smartphone
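The fast-threshold staircase mentioned above can be illustrated with a toy implementation. This is a minimal sketch, not the authors' actual algorithm: the starting level, reversal rule, and noise-free observer are all simplifying assumptions.

```python
def staircase_threshold(sensitivity_db, start_db=30, step_db=3, floor_db=0, ceil_db=40):
    """Toy down-up staircase for one visual field test location.

    `level` is the stimulus attenuation in dB (higher = dimmer). An idealized,
    noise-free observer sees any stimulus at or below their sensitivity. After
    a "seen" response the next stimulus is dimmer; after a miss it is brighter.
    The estimate is the mean of the first two reversals. All parameters are
    illustrative assumptions, not values from the paper.
    """
    level = start_db
    last_seen = None
    reversals = []
    while len(reversals) < 2:
        seen = level <= sensitivity_db
        if last_seen is not None and seen != last_seen:
            reversals.append(level)  # response flipped: record a reversal
        level = level + step_db if seen else level - step_db
        level = max(floor_db, min(ceil_db, level))
        last_seen = seen
    return sum(reversals) / len(reversals)
```

For a simulated location with a true sensitivity of 25 dB, the staircase starting at 30 dB reverses at 24 and 27 dB and returns 25.5 dB.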

  2. Comparison of visual field test results obtained through Humphrey matrix frequency doubling technology perimetry versus standard automated perimetry in healthy children.

    Science.gov (United States)

    Kocabeyoglu, Sibel; Uzun, Salih; Mocan, Mehmet Cem; Bozkurt, Banu; Irkec, Murat; Orhan, Mehmet

    2013-10-01

The aim of this study was to compare the visual field test results in healthy children obtained via the Humphrey Matrix 24-2 threshold program and standard automated perimetry (SAP) using the Swedish interactive threshold algorithm (SITA)-Standard 24-2 test. This prospective study included 55 healthy children without ocular or systemic disorders who underwent both SAP and frequency doubling technology (FDT) perimetry visual field testing. Visual field test reliability indices, test duration, and global indices (mean deviation [MD] and pattern standard deviation [PSD]) were compared between the 2 tests using the Wilcoxon signed-rank test and paired t-test. The performance of the Humphrey field analyzer (HFA) 24-2 SITA-Standard and frequency-doubling technology Matrix 24-2 tests between genders was compared with the Mann-Whitney U-test. Fifty-five healthy children with a mean age of 12.2 ± 1.9 years (range, 8-16 years) were included in this prospective study. The test durations of SAP and FDT were similar (5.2 ± 0.5 and 5.1 ± 0.2 min, respectively, P = 0.651). MD and PSD values obtained via FDT Matrix were significantly higher than those obtained via SAP (P < 0.001), and fixation losses and false negative errors were significantly less with SAP (P < 0.05). A weak positive correlation between the 2 tests in terms of MD (r = 0.352, P = 0.008) and PSD (r = 0.329, P = 0.014) was observed. Children were able to complete both visual test algorithms successfully within 6 min. However, SAP testing appears to be associated with less depression of the visual field indices of healthy children. FDT Matrix and SAP should not be used interchangeably in the follow-up of children.

  3. Comparison of visual field test results obtained through Humphrey matrix frequency doubling technology perimetry versus standard automated perimetry in healthy children

    Directory of Open Access Journals (Sweden)

    Sibel Kocabeyoglu

    2013-01-01

Full Text Available Aims: The aim of this study was to compare the visual field test results in healthy children obtained via the Humphrey Matrix 24-2 threshold program and standard automated perimetry (SAP) using the Swedish interactive threshold algorithm (SITA)-Standard 24-2 test. Materials and Methods: This prospective study included 55 healthy children without ocular or systemic disorders who underwent both SAP and frequency doubling technology (FDT) perimetry visual field testing. Visual field test reliability indices, test duration, and global indices (mean deviation [MD] and pattern standard deviation [PSD]) were compared between the 2 tests using the Wilcoxon signed-rank test and paired t-test. The performance of the Humphrey field analyzer (HFA) 24-2 SITA-Standard and frequency-doubling technology Matrix 24-2 tests between genders was compared with the Mann-Whitney U-test. Results: Fifty-five healthy children with a mean age of 12.2 ± 1.9 years (range, 8-16 years) were included in this prospective study. The test durations of SAP and FDT were similar (5.2 ± 0.5 and 5.1 ± 0.2 min, respectively, P = 0.651). MD and PSD values obtained via FDT Matrix were significantly higher than those obtained via SAP (P < 0.001), and fixation losses and false negative errors were significantly less with SAP (P < 0.05). A weak positive correlation between the two tests in terms of MD (r = 0.352, P = 0.008) and PSD (r = 0.329, P = 0.014) was observed. Conclusion: Children were able to complete both visual test algorithms successfully within 6 min. However, SAP testing appears to be associated with less depression of the visual field indices of healthy children. FDT Matrix and SAP should not be used interchangeably in the follow-up of children.
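The paired comparisons described in this record (Wilcoxon signed-rank and paired t-test on per-child MD values, plus a between-test correlation) can be sketched with SciPy. The data below are synthetic stand-ins, not the study's measurements.

```python
import numpy as np
from scipy.stats import wilcoxon, ttest_rel, pearsonr

rng = np.random.default_rng(0)
n = 55                                       # same sample size as the study
md_sap = rng.normal(-1.0, 1.0, n)            # hypothetical SAP mean deviation (dB)
md_fdt = md_sap + rng.normal(0.8, 1.2, n)    # hypothetical FDT Matrix MD, shifted

# Paired nonparametric and parametric comparisons of the two tests
w_stat, w_p = wilcoxon(md_sap, md_fdt)
t_stat, t_p = ttest_rel(md_sap, md_fdt)

# Between-test agreement, as in the study's correlation analysis
r, r_p = pearsonr(md_sap, md_fdt)
```

A weak-to-moderate r with a small p-value, as the study reports, indicates the two tests rank children similarly even though their absolute values differ.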

  4. Comparison of 30-2 Standard and Fast programs of Swedish Interactive Threshold Algorithm of Humphrey Field Analyzer for perimetry in patients with intracranial tumors.

    Science.gov (United States)

    Singh, Manav Deep; Jain, Kanika

    2017-11-01

To find out whether the 30-2 Swedish Interactive Threshold Algorithm (SITA) Fast is comparable to 30-2 SITA Standard as a tool for perimetry among patients with intracranial tumors. This was a prospective cross-sectional study involving 80 patients aged ≥18 years with imaging-proven intracranial tumors and visual acuity better than 20/60. The patients underwent multiple visual field examinations using the two algorithms till consistent and repeatable results were obtained. A total of 140 eyes of 80 patients were analyzed. Almost 60% of patients undergoing perimetry with SITA Standard required two or more sessions to obtain consistent results, whereas the same could be obtained in 81.42% with SITA Fast in the first session itself. Of 140 eyes, 70 eyes had recordable field defects and the rest had no defects as detected by either of the two algorithms. Mean deviation (MD) (P = 0.56), pattern standard deviation (PSD) (P = 0.22), visual field index (P = 0.83), and the number of depressed points at P < 0.5% on MD and PSD probability plots showed no statistically significant difference between the two algorithms. The Bland-Altman test showed that considerable variability existed between the two algorithms. Perimetry performed with the SITA Standard and SITA Fast algorithms of the Humphrey Field Analyzer gives comparable results among patients with intracranial tumors. Being more time efficient and with a shorter learning curve, SITA Fast may be recommended as a standard test for the purpose of perimetry among these patients.
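The Bland-Altman analysis mentioned here reduces to the bias (mean difference) and the 95% limits of agreement (bias ± 1.96 SD of the paired differences). A minimal sketch, with an illustrative function name and no study data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for two paired measurement series.

    Returns (bias, lower_loa, upper_loa): the mean difference and the
    95% limits of agreement, bias ± 1.96 * SD of the differences.
    """
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Wide limits of agreement relative to the clinically meaningful range are what "considerable variability between the two algorithms" refers to, even when the mean values do not differ significantly.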

  5. Comparison of Peristat Online Perimetry with the Humphrey Perimetry in a Clinic-Based Setting

    OpenAIRE

    Lowry, Eugene A.; Hou, Jing; Hennein, Lauren; Chang, Robert T.; Lin, Shan; Keenan, Jeremy; Wang, Sean K.; Ianchulev, Sean; Pasquale, Louis R.; Han, Ying

    2016-01-01

Purpose: We determined the receiver operating characteristic (ROC) curves for Peristat online perimetry at detecting varying degrees of glaucoma and the correlation between Peristat online perimetry and Humphrey visual field. Methods: A prospective, comparative study of Peristat online perimetry (an achromatic static computer threshold testing program) and Humphrey visual field (HVF) 24-2 SITA standard testing was performed by 63 glaucoma patients and 30 healthy controls in random order. The n...

  6. Comparison of Macular Integrity Assessment (MAIA™), MP-3, and the Humphrey Field Analyzer in the Evaluation of the Relationship between the Structure and Function of the Macula.

    Directory of Open Access Journals (Sweden)

    Kazuyuki Hirooka

Full Text Available This study was conducted in order to compare the relationships between the macular visual field (VF) mean sensitivity measured by MAIA™ (Macular Integrity Assessment), MP-3, or the Humphrey field analyzer (HFA) and the ganglion cell and inner plexiform layer (GCA) thickness. This cross-sectional study examined 73 glaucoma patients and 19 normal subjects. All subjects underwent measurement of GCA thickness with Cirrus HD-OCT and static threshold perimetry using MAIA™, MP-3, or HFA. VF and OCT in the retinal view were used to examine the relationship between VF sensitivity and GCA thickness both globally and in the superior and inferior hemiretina. The relationship between GCA thickness and macular sensitivity was examined by Spearman correlation analysis. For each instrument, statistically significant macular VF sensitivity (dB) and GCA thickness relationships were observed using the decibel scale (R = 0.547-0.687, all P < 0.001). The highest correlations for the global (R = 0.682) and superior hemiretina (R = 0.594) GCA thickness-VF mean sensitivity relationships were observed with the HFA. The highest correlation for the inferior hemiretina (R = 0.687) was observed with the MP-3. Among the three VF measurement instruments, however, no significant differences were found in the structure-function relationships. All three VF measurement instruments found similar structure-function relationships in the central VF.
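The Spearman correlation used in this structure-function analysis is simply the Pearson correlation computed on ranks. A minimal NumPy sketch, assuming no tied values (argsort-based ranking does not handle ties):

```python
import numpy as np

def spearman_r(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.

    Assumes all values are distinct; ties would need average ranks,
    which this sketch omits for brevity.
    """
    rx = np.argsort(np.argsort(np.asarray(x, float))).astype(float)
    ry = np.argsort(np.argsort(np.asarray(y, float))).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx**2).sum() * (ry**2).sum()))
```

Because it works on ranks, any monotone relationship between GCA thickness and VF sensitivity yields |R| = 1, which is why it suits structure-function data that need not be linear on the decibel scale.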

  7. TFTR magnetic field design analyses

    International Nuclear Information System (INIS)

    Davies, K.; Iwinski, E.; McWhirter, J.M.

    1975-11-01

    The three main magnetic field windings for the TFTR are the toroidal field (TF) windings, the ohmic heating (OH) winding, and the equilibrium field (EF) winding. The following information is provided for these windings: (1) descriptions, (2) functions, (3) magnetic designs, e.g., number and location of turns, (4) design methods, and (5) descriptions of resulting magnetic fields. This report does not deal with the thermal, mechanical support, or construction details of the windings

  8. Visual discrimination training improves Humphrey perimetry in chronic cortically induced blindness.

    Science.gov (United States)

    Cavanaugh, Matthew R; Huxlin, Krystel R

    2017-05-09

To assess if visual discrimination training improves performance on visual perimetry tests in chronic stroke patients with visual cortex involvement. 24-2 and 10-2 Humphrey visual fields were analyzed for 17 chronic cortically blind stroke patients prior to and following visual discrimination training, as well as in 5 untrained, cortically blind controls. Trained patients practiced direction discrimination, orientation discrimination, or both, at nonoverlapping, blind field locations. All pretraining and posttraining discrimination performance and Humphrey fields were collected with online eye tracking, ensuring gaze-contingent stimulus presentation. Trained patients recovered ∼108 deg² of vision on average, while untrained patients spontaneously improved over an area of ∼16 deg². Improvement was not affected by patient age, time since lesion, size of initial deficit, or training type, but was proportional to the amount of training performed. Untrained patients counterbalanced their improvements with worsening of sensitivity over ∼9 deg² of their visual field. Worsening was minimal in trained patients. Finally, although discrimination performance improved at all trained locations, changes in Humphrey sensitivity occurred both within trained regions and beyond, extending over a larger area along the blind field border. In adults with chronic cortical visual impairment, the blind field border appears to have enhanced plastic potential, which can be recruited by gaze-controlled visual discrimination training to expand the visible field. Our findings underscore a critical need for future studies to measure the effects of vision restoration approaches on perimetry in larger cohorts of patients. Copyright © 2017 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the American Academy of Neurology.

  9. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

A group of undergraduate researchers in the SURE internship offered by the Southern California Earthquake Center (SCEC) have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed, chilled, and delivered to a water lab within 12 hours. Temperatures are continuously monitored with Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. We will soon collect gas samples at sites that show signs of gas, using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  10. Structural analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1992-04-01

ITER (International Thermonuclear Experimental Reactor) is intended to be an experimental thermonuclear tokamak reactor testing the basic physics performance and technologies essential to future fusion reactors. The magnet system of ITER consists essentially of 4 sub-systems, i.e. toroidal field coils (TFCs), poloidal field coils (PFCs), power supplies, and cryogenic supplies. These subsystems do not contain significant radioactivity inventories, but the large energy inventory is a potential accident initiator. The aim of the structural analyses is to prevent accidents from propagating into the vacuum vessel, tritium system, and cooling system, which all contain significant amounts of radioactivity. As part of the design process, 3 conditions are defined for the PF and TF coils at which the mechanical behaviour has to be analyzed in some detail, viz: normal operating conditions, upset conditions, and fault conditions. This paper describes the work carried out by ECN to create a detailed finite element model of 16 TFCs, as well as the results of some fault-condition analyses made with the model. Under fault conditions, either electrical or mechanical, the magnetic loading of the TFCs becomes abnormal, and mechanical failure of parts of the overall structure might follow (e.g. failure of a coil, the gravitational supports, or the intercoil structure). The analyses performed consist of linear elastic stress analyses and electro-magneto-structural analyses (coupled field analyses). 8 refs.; 5 figs.; 5 tabs

  11. Stress analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1990-02-01

    The International Thermonuclear Experimental Reactor (ITER) is intended as an experimental thermonuclear tokamak reactor for testing the basic physics, performance and technologies essential to future fusion reactors. The ITER design will be based on extensive new design work, supported by new physical and technological results, and on the great body of experience built up over several years from previous national and international reactor studies. Conversely, the ITER design process should provide the fusion community with valuable insights into what key areas need further development or clarification as we move forward towards practical fusion power. As part of the design process of the ITER toroidal field coils the mechanical behaviour of the magnetic system under fault conditions has to be analysed in more detail. This paper describes the work carried out to create a detailed finite element model of two toroidal field coils as well as some results of linear elastic analyses with fault conditions. The analyses have been performed with the finite element code ANSYS. (author). 5 refs.; 8 figs.; 2 tabs

  12. Procedures for field chemical analyses of water samples

    International Nuclear Information System (INIS)

    Korte, N.; Ealey, D.

    1983-12-01

    A successful water-quality monitoring program requires a clear understanding of appropriate measurement procedures in order to obtain reliable field data. It is imperative that the responsible personnel have a thorough knowledge of the limitations of the techniques being used. Unfortunately, there is a belief that field analyses are simple and straightforward. Yet, significant controversy as well as misuse of common measurement techniques abounds. This document describes procedures for field measurements of pH, carbonate and bicarbonate, specific conductance, dissolved oxygen, nitrate, Eh, and uranium. Each procedure section includes an extensive discussion regarding the limitations of the method as well as brief discussions of calibration procedures and available equipment. A key feature of these procedures is the consideration given to the ultimate use of the data. For example, if the data are to be used for geochemical modeling, more precautions are needed. In contrast, routine monitoring conducted merely to recognize gross changes can be accomplished with less effort. Finally, quality assurance documentation for each measurement is addressed in detail. Particular attention is given to recording sufficient information such that decisions concerning the quality of the data can be easily made. Application of the procedures and recommendations presented in this document should result in a uniform and credible water-quality monitoring program. 22 references, 4 figures, 3 tables

  13. Humphrey Davy and the Safety Lamp: The Use of Metal Gauze as a Flame Barrier

    Science.gov (United States)

    Mills, Allan

    2015-01-01

    The "safety lamp" invented by Humphrey Davy in 1815 utilised the cooling effect of metal gauze to prevent the flame of a candle or oil lamp (essential for illumination in mines) from passing through such a screen. It is therefore rendered unable to ignite any potentially explosive mixture of air and methane in the atmosphere surrounding…

  14. Photoelastic analyses of stresses in toroidal magnetic field coils

    International Nuclear Information System (INIS)

    Pih, H.

    1977-02-01

Several two-dimensional photoelastic stress analyses were made on models of circular and oval toroidal magnetic field coils for fusion reactors. The circumferential variation of each coil's in-plane magnetic force was simulated by applying different pressures to 16 segmented regions of the inner surface of the models. One special loading fixture was used for the model of each shape and size. Birefringence and isoclinic angles were measured in a transmission polariscope at selected points on the loaded model. Boundary stresses in the cases of known boundary conditions were determined directly from the isochromatics. Separate principal stresses were calculated using the combination of photoelastic information and isopachic data obtained by the electrical analogy method from the solution of Laplace's equation. Comparisons were made between experimental results and those computed using the finite element method. The theoretical and experimental stress distributions agree very well, although the finite element method yielded slightly higher stresses than the photoelastic method; further work is needed to resolve this difference. In this investigation several variations of coil geometry and methods of support were evaluated. Based on the experimental results, optimum structural designs of toroidal field coils were recommended

  15. Numerical analyses for efficient photoionization by nonmonochromatic fields

    International Nuclear Information System (INIS)

    Hasegawa, Shuichi; Suzuki, Atsuyuki

    2000-01-01

    Numerical analyses on excitation and ionization probabilities of atoms with hyperfine structures were performed in order to compare two different excitation methods, adiabatic excitation and broadband excitation. The lifetime of the intermediate states was considered in order to investigate the effect of the absorption line broadening. The dependences of the two excitation methods on the lifetime were found to be quite different. The ionization probability by the adiabatic excitation is higher than that by the broadband excitation for identical excitation laser intensity. (author)

  16. Absorbed dose distribution analyses in irradiation with adjacent fields

    International Nuclear Information System (INIS)

    Cudalbu, C.; Onuc, C.; Andrada, S.

    2002-01-01

Because the special irradiation technique with adjacent fields is the most used in the treatment of medulloblastoma, we consider it important to give some general information about medulloblastoma. This malignant disease has a high incidence in children aged 5-7 years. The tumor usually originates in the cerebellum and is referred to as a primitive undifferentiated tumor. It may spread contiguously to the cerebellar peduncle, the floor of the fourth ventricle, and into the cervical spine. In addition, it may spread via the cerebrospinal fluid intracranially and/or to the spinal cord. It is therefore necessary to perform a treatment technique with cranial tangential fields combined with adjacent fields covering the entire spinal cord, to achieve complete coverage of the zones with malignant cells. The treatment in this case is an association of surgery, radiotherapy, and chemotherapy, where radiotherapy has a very important role and a curative purpose, because the migration of malignant cells in the body cannot be controlled by surgery. Because of this special irradiation technique used in medulloblastoma treatment, we chose to describe in this paper this complex type of irradiation, where the implications of beam divergence for the dose distribution are essential

  17. Confusion-limited galaxy fields. II. Classical analyses

    International Nuclear Information System (INIS)

    Chokshi, A.; Wright, E.L.

    1989-01-01

Chokshi and Wright presented a detailed model for simulating the angular distribution of galaxy images in fields that extend to very high redshifts. Standard tools are used to analyze these simulated galaxy fields for the Omega(O) = 0 and Omega(O) = 1 cases in order to test the discriminatory power of these tools. Classical number-magnitude diagrams and surface brightness-color-color diagrams are employed to study crowded galaxy fields. An attempt is made to separate the effects due to stellar evolution in galaxies from those due to the space-time geometry. The results show that this discrimination is maximized at near-infrared wavelengths, where the stellar photospheres are still visible but stellar evolution effects are less severe than those observed at optical wavelengths. Rapid evolution of the stars on the asymptotic giant branch is easily recognized in the simulated data for both cosmologies and serves to discriminate between the two extreme values of Omega(O). Measurements of total magnitudes of individual galaxies are not essential for studying the light distribution in galaxies as a function of redshift. Calculations of the extragalactic background radiation are carried out using the simulated data and compared to integrals over the evolutionary models used. 29 refs

  18. Thermoluminescent analyses of mean photon energy of a field

    Energy Technology Data Exchange (ETDEWEB)

    Cavalieri, T. A.; De Paiva, F.; Fonseca, G.; Dalledone S, P. de T.; Yoriyaz, H., E-mail: tassio.cavalieri@usp.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

Thermoluminescent dosimetry (TLD) with LiF is a common dosimetry method: pure gamma fields are typically measured with LiF or CaF{sub 2} TLDs, while mixed neutron-gamma fields are measured with the TLD-600/TLD-700 pair. The difference between these LiF TLDs is the amount of the isotope {sup 6}Li in their composition. {sup 6}Li has a large cross section for thermal neutrons, making the TLD-600 sensitive to thermal neutrons in addition to gamma radiation, whereas the TLD-700 is considered sensitive only to gamma radiation. Some studies have shown an energy dependence of these TLDs for gamma rays. The goal of this work was therefore to study this energy dependence through the angular coefficient (slope) of the response-versus-dose calibration curves obtained when the TLDs were irradiated in four photon fields of different energies: 43 keV, 662 keV, 1.2 MeV, and 3 MeV. To build the calibration curves, three irradiations with distinct exposure times were performed for each photon energy. The curves showed a different angular coefficient for each energy, demonstrating the energy dependence of these TLDs. Simulations with the Monte Carlo based code MCNP-5 were used to observe the photon dose deposited at the different photon energies; they also showed a difference in the dose deposited in the TLDs when exposed to the same dose delivered by photons of different energies. This work presents a preliminary study of the photon energy dependence of LiF TLDs. (Author)
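The angular coefficient of a response-versus-dose calibration curve is the slope of a linear fit. A minimal sketch with invented readings (the units and values are illustrative, not from this work):

```python
import numpy as np

# Hypothetical TLD calibration points: reader response vs delivered dose.
# Real curves would come from the three irradiations per photon energy
# described in the abstract; these numbers are made up for illustration.
dose_mgy = np.array([0.0, 10.0, 20.0, 30.0])      # delivered dose (mGy)
response_nc = np.array([0.1, 5.1, 10.1, 15.1])    # TLD reader output (nC)

# Least-squares linear fit; the slope is the "angular coefficient".
slope, intercept = np.polyfit(dose_mgy, response_nc, 1)
```

Comparing slopes fitted at each photon energy (43 keV, 662 keV, 1.2 MeV, 3 MeV) is how the energy dependence of the TLD response is quantified.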

  19. Mapping the Structure-Function Relationship in Glaucoma and Healthy Patients Measured with Spectralis OCT and Humphrey Perimetry

    Science.gov (United States)

    Muñoz–Negrete, Francisco J.; Oblanca, Noelia; Rebolleda, Gema

    2018-01-01

    Purpose To study the structure-function relationship in glaucoma and healthy patients assessed with Spectralis OCT and Humphrey perimetry using new statistical approaches. Materials and Methods Eighty-five eyes were prospectively selected and divided into 2 groups: glaucoma (44) and healthy patients (41). Three different statistical approaches were carried out: (1) factor analysis of the threshold sensitivities (dB) (automated perimetry) and the macular thickness (μm) (Spectralis OCT), subsequently applying Pearson's correlation to the obtained regions, (2) nonparametric regression analysis relating the values in each pair of regions that showed significant correlation, and (3) nonparametric spatial regressions using three models designed for the purpose of this study. Results In the glaucoma group, a map that relates structural and functional damage was drawn. The strongest correlation with visual fields was observed in the peripheral nasal region of both superior and inferior hemigrids (r = 0.602 and r = 0.458, resp.). The estimated functions obtained with the nonparametric regressions provided the mean sensitivity that corresponds to each given macular thickness. These functions allowed for accurate characterization of the structure-function relationship. Conclusions Both maps and point-to-point functions obtained linking structure and function damage contribute to a better understanding of this relationship and may help in the future to improve glaucoma diagnosis. PMID:29850196

  20. 76 FR 2432 - Bureau of Educational and Cultural Affairs (ECA) Request for Grant Proposals: FY2012 Humphrey...

    Science.gov (United States)

    2011-01-13

    ... countries' development needs in key areas including public health, sustainable growth, and democratic... leadership skills for public service in their countries. Each year the Humphrey Program brings accomplished... combining non-degree graduate study, leadership training, and professional development. Candidates for the...

  1. Humphrey Ridley (1653-1708): 17th century evolution in neuroanatomy and selective cerebrovascular injections for cadaver dissection.

    Science.gov (United States)

    Thakur, Jai Deep; Sonig, Ashish; Chittiboina, Prashant; Khan, Imad Saeed; Wadhwa, Rishi; Nanda, Anil

    2012-08-01

    Humphrey Ridley, M.D. (1653-1708), is a relatively unknown historical figure, belonging to the postmedieval era of neuroanatomical discovery. He was born in the market town of Mansfield, 14 miles from the county of Nottinghamshire, England. After studying at Merton College, Oxford, he pursued medicine at Leiden University in the Netherlands. In 1688, he was incorporated as an M.D. at Cambridge. Ridley authored the first original treatise in English language on neuroanatomy, The Anatomy of the Brain Containing its Mechanisms and Physiology: Together with Some New Discoveries and Corrections of Ancient and Modern Authors upon that Subject. Ridley described the venous anatomy of the eponymous circular sinus in connection with the parasellar compartment. His methods were novel, unique, and effective. To appreciate the venous anatomy, he preferred to perform his anatomical dissections on recently executed criminals who had been hanged. These cadavers had considerable venous engorgement, which made the skull base venous anatomy clearer. To enhance the appearance of the cerebral vasculature further, he used tinged wax and quicksilver in the injections. He set up experimental models to answer questions definitively, in proving that the arachnoid mater is a separate meningeal layer. The first description of the subarachnoid cisterns, blood-brain barrier, and the fifth cranial nerve ganglion with its branches are also attributed to Ridley. This historical vignette revisits Ridley's life and academic work that influenced neuroscience and neurosurgical understanding in its infancy. It is unfortunate that most of his novel contributions have gone unnoticed and uncited. The authors hope that this article will inform the neurosurgical community of Ridley's contributions to the field of neurosurgery.

  2. Comparison between Humphrey Field Analyzer and Micro Perimeter 1 in normal and glaucoma subjects

    Directory of Open Access Journals (Sweden)

    Vineet Ratra

    2012-01-01

Results: The mean light thresholds of 21 matching points in the control group with MP1 and HFA were 14.97 ± 2.64 dB and 30.90 ± 2.08 dB, respectively. In subjects with glaucoma, the mean values were MP1: 11.73 ± 4.36 dB and HFA: 27.96 ± 5.41 dB. The mean difference of light thresholds between the two instruments was 15.86 ± 3.25 dB in normal subjects (P < 0.001) and 16.22 ± 2.77 dB in glaucoma subjects (P < 0.001). Pearson correlation analysis of the HFA and MP1 results for each test point location in both cases and control subjects showed significant positive correlation (controls, r = 0.439, P = 0.047; glaucoma subjects, r = 0.812, P < 0.001). There was no difference between nasal and temporal points, but a slight vertical asymmetry was observed with MP1. Conclusion: There are significant and reproducible differences in the differential light threshold in MP1 and HFA in both normal and glaucoma subjects. We found a correction factor of 17.271 for comparison of MP1 with HFA. MP1 appeared to be more sensitive in predicting loss in glaucoma.
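A correction factor of the kind reported here is essentially a mean per-point offset between the two instruments' decibel scales. A minimal sketch; the function name and input values are invented, not study data:

```python
import numpy as np

def sensitivity_offset(hfa_db, mp1_db):
    """Mean per-point difference between HFA and MP1 light thresholds (dB).

    A constant offset like this is what the abstract calls a "correction
    factor" for comparing MP1 readings against HFA readings.
    """
    diff = np.asarray(hfa_db, float) - np.asarray(mp1_db, float)
    return float(diff.mean())
```

For example, `sensitivity_offset([30, 31], [14, 15])` returns 16.0; adding such an offset to MP1 values puts them on a scale roughly comparable with HFA values.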

  3. Spanish-Language Adaptation of Morgeson and Humphrey's Work Design Questionnaire (WDQ).

    Science.gov (United States)

    Fernández Ríos, Manuel; Ramírez Vielma, Raúl G; Sánchez García, José Carlos; Bargsted Aravena, Mariana; Polo Vargas, Jean David; Ruiz Díaz, Miguel Ángel

    2017-06-09

    Since work organizations became the subject of scientific research, how to operationalize and measure dimensions of work design has been an issue, mainly due to concerns about internal consistency and factor structure. In response, Morgeson and Humphrey (2006) built the Work Design Questionnaire (WDQ), an instrument that identifies and measures these dimensions in different work and organizational contexts. This paper presents the instrument's adaptation into Spanish using reliability and validity analysis, drawing on a sample of 1035 Spanish workers who hold various jobs in an array of occupational categories. The total instrument's internal consistency was a Cronbach's alpha of .92, and the various scales' reliability ranged from .70 to .96, except for three dimensions. There was initially a difference in the comparative fit of the two versions' factor structures, but the model with 21 work characteristics (motivational -task and knowledge-, social, and work context) showed the highest goodness of fit of the various models tested, confirming previous results from the U.S. version as well as adaptations into other languages and contexts. CFA results indicated goodness of fit of factor configurations corresponding to each of the four major categories of work characteristics, with CFI and TLI around .90, as well as SRMR and RMSEA below .08. The WDQ thus offers a reliable, valid measure of work design with clear potential applications in research as well as professional practice, applications that could improve working conditions, boost productivity, and generate more personal and professional development opportunities for workers.
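The internal-consistency figures quoted above are Cronbach's alpha values. A minimal sketch of how alpha is computed from a respondents-by-items score matrix (toy data, not the 1035-worker WDQ sample):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha; items is 2-D: rows = respondents, columns = scale items."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# toy 4-item scale answered by 5 respondents (illustrative data only)
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 5, 5, 4],
          [1, 2, 1, 2],
          [3, 3, 4, 3]]
alpha = cronbach_alpha(scores)
```

Values near the reported .92 indicate that the items covary strongly relative to their individual variances.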

  4. Measurement of electromagnetic fields generated by air traffic control radar systems with spectrum analysers.

    Science.gov (United States)

    Barellini, A; Bogi, L; Licitra, G; Silvi, A M; Zari, A

    2009-12-01

    Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. Measurement of electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser are investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar.

  5. The luminosities of cool supergiants in the Magellanic Clouds, and the Humphreys-Davidson limit revisited

    Science.gov (United States)

    Davies, Ben; Crowther, Paul A.; Beasor, Emma R.

    2018-05-01

    The empirical upper luminosity boundary Lmax of cool supergiants, often referred to as the Humphreys-Davidson limit, is thought to encode information on the general mass-loss behaviour of massive stars. Further, it delineates the boundary at which single stars will end their lives stripped of their hydrogen-rich envelope, which in turn is a key factor in the relative rates of Type-II to Type-Ibc supernovae from single star channels. In this paper we have revisited the issue of Lmax by studying the luminosity distributions of cool supergiants (SGs) in the Large and Small Magellanic Clouds (LMC/SMC). We assemble samples of cool SGs in each galaxy which are highly complete above log L/L⊙=5.0, and determine their spectral energy distributions from the optical to the mid-infrared using modern multi-wavelength survey data. We show that in both cases Lmax appears to be lower than previously quoted, and is in the region of log L/L⊙=5.5. There is no evidence for Lmax being higher in the SMC than in the LMC, as would be expected if metallicity-dependent winds were the dominant factor in the stripping of stellar envelopes. We also show that Lmax aligns with the lowest luminosity of single nitrogen-rich Wolf-Rayet stars, indicative of a change in evolutionary sequence for stars above a critical mass. From population synthesis analysis we show that the Geneva evolutionary models greatly over-predict the numbers of cool SGs in the SMC. We also argue that the trend of earlier average spectral types of cool SGs in lower metallicity environments represents a genuine shift to hotter temperatures. Finally, we use our new bolometric luminosity measurements to provide updated bolometric corrections for cool supergiants.

  6. Numerical analyses of a Couette-Taylor flow in the presence of a magnetic field

    International Nuclear Information System (INIS)

    Tagawa, T; Kaneda, M

    2005-01-01

    An axisymmetric Couette-Taylor flow of liquid metal in the presence of a magnetic field has been numerically studied. An inner cylinder of a coaxial container is rotating at a constant angular velocity whereas the outer cylindrical wall is at rest. An axial or a toroidal magnetic field is applied to this configuration to investigate the influence of such magnetic fields on the liquid metal Couette-Taylor flow. The toroidal magnetic field can be produced with a straight wire along the central axis in which electric current passes. The governing equations of mass conservation, momentum, Ohm's law and conservation of electric charge for an axisymmetric cylindrical coordinate system have been numerically solved with a finite difference method using the HSMAC algorithm. In the numerical analyses, since the Joule heating and the induced magnetic field are neglected, the system parameters are the Hartmann number and the Reynolds number. The numerical results reveal significant differences in the Couette-Taylor flow depending on whether the applied magnetic field is axial or toroidal, as well as on the Hartmann and Reynolds numbers. The axial magnetic field damps out the secondary flow efficiently, and the velocity gradient in the direction of the magnetic field tends to diminish, while the toroidal magnetic field does not have such an efficient damping effect.

  7. A hand-held sensor for analyses of local distributions of magnetic fields and losses

    CERN Document Server

    Krismanic, G; Baumgartinger, N

    2000-01-01

    The paper describes a novel sensor for non-destructive analyses of local field and loss distributions in laminated soft magnetic cores, such as transformer cores. It was designed for rapid information on comparative local degrees of inhomogeneity, e.g., for the estimation of local building factors. Similar to a magnifying glass with handle, the compact hand-held sensor contains extremely sharp needle electrodes for the detection of the induction vector B as well as double-field coils for the vector H. Losses P are derived from the Poynting law. Applied to inner -- or also outer -- core regions, the sensor yields instantaneous computer displays of local H, B, and P.
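For a laminated core, the loss that such a sensor derives from the Poynting law reduces locally to the area of the B-H loop times the magnetizing frequency. A hedged numeric sketch with idealized sinusoidal waveforms (illustrative values, not the sensor's actual signal chain):

```python
import numpy as np

def specific_loss(h, b, f):
    """Loss per unit volume over one period of local H(t) and B(t):
    f times the closed-loop integral of H dB (the B-H loop area)."""
    h, b = np.asarray(h, float), np.asarray(b, float)
    db = np.diff(np.append(b, b[0]))     # close the loop over one period
    return f * np.sum(h * db)            # W/m^3

# idealized 50 Hz waveforms with a 30-degree lag between B and H
t = np.linspace(0.0, 1.0, 1000, endpoint=False)   # one normalized period
h = 40.0 * np.sin(2 * np.pi * t)                  # field strength, A/m
b = 1.5 * np.sin(2 * np.pi * t - np.pi / 6)       # flux density, T
p = specific_loss(h, b, 50.0)
```

For sinusoids the loop area is pi*H0*B0*sin(phi), so the sketch should land near 50*pi*40*1.5*0.5 ≈ 4712 W/m^3, within discretization error.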

  8. Measurement of electromagnetic fields generated by air traffic control radar systems with spectrum analysers

    International Nuclear Information System (INIS)

    Barellini, A.; Bogi, L.; Licitra, G.; Silvi, A. M.; Zari, A.

    2009-01-01

    Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. Measurement of electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser are investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar. (authors)

  9. The Equipment of Czech Firefighters for the Detection and Field Analyses of Chemical Warfare Agents

    Directory of Open Access Journals (Sweden)

    Jana Krykorkova

    2014-05-01

    This paper describes the requirements for the devices of detection, chemical reconnaissance and field analyses of chemical warfare agents (CWA) and divides them into simple devices of detection, universal detectors, selective analyzers, multi-component analyzers and mobile laboratories. It also describes the devices of detection available within the Fire and Rescue Service of the Czech Republic (FRS CR) and compares them with some prospective trends of further development.

  10. Analysing radio-frequency coil arrays in high-field magnetic resonance imaging by the combined field integral equation method

    Energy Technology Data Exchange (ETDEWEB)

    Wang Shumin; Duyn, Jeff H [Laboratory of Functional and Molecular Imaging, National Institute of Neurological Disorders and Stroke, National Institutes of Health, 10 Center Drive, 10/B1D728, Bethesda, MD 20892 (United States)

    2006-06-21

    We present the combined field integral equation (CFIE) method for analysing radio-frequency coil arrays in high-field magnetic resonance imaging (MRI). Three-dimensional models of coils and the human body were used to take into account the electromagnetic coupling. In the method of moments formulation, we applied triangular patches and the Rao-Wilton-Glisson basis functions to model arbitrarily shaped geometries. We first examined a rectangular loop coil to verify the CFIE method and also demonstrate its efficiency and accuracy. We then studied several eight-channel receive-only head coil arrays for 7.0 T SENSE functional MRI. Numerical results show that the signal dropout and the average SNR are two major concerns in SENSE coil array design. A good design should be a balance of these two factors.

  11. Compilation of gas geochemistry and isotopic analyses from The Geysers geothermal field: 1978-1991

    Science.gov (United States)

    Lowenstern, Jacob B.; Janik, Cathy; Fahlquist, Lynne; Johnson, Linda S.

    1999-01-01

    We present 45 chemical and isotopic analyses from well discharges at The Geysers geothermal field and summarize the most notable geochemical trends. H2 and H2S concentrations are highest in the Southeast Geysers, where steam samples have δD and δ18O values that reflect replenishment by meteoric water. In the Northwest Geysers, samples are enriched in gas/steam, CO2, CH4, and N2/Ar relative to the rest of the field, and contain steam that is elevated in δD and δ18O, most likely due to substantial contributions from Franciscan-derived fluids. The δ13C of CO2, trends in CH4 vs. N2, and abundance of NH3 indicate that the bulk of the non-condensable gases are derived from thermal breakdown of organic materials in Franciscan meta-sediments.

  12. An Image of Britain during the Second World War: The Films of Humphrey Jennings (1939-1945)

    Directory of Open Access Journals (Sweden)

    Elena Von Kassel

    2009-10-01

    This article seeks to understand how the poetic style reached its height in the English documentary with the films of Humphrey Jennings, and how these films grew out of a blend of two hitherto opposed traditions of British cinema. At Alberto Cavalcanti's initiative, in 1939, The First Days, co-directed by Humphrey Jennings, Harry Watt and Pat Jackson, was, at the very beginning of the war, the first film to offer a reflection on realities, whereas The Lion Has Wings, shot the same year, and also the first English propaganda film of the Second World War, whose production Alexander Korda had launched and supported, had tried to manipulate the public. When Cavalcanti left the direction of the GPO Film Unit for Ealing Studios, he was replaced by Ian Dalrymple, the producer of The Lion Has Wings. It was under Dalrymple's direction that a new kind of film emerged at the GPO, by then renamed the Crown Film Unit. In this democratic spirit, after The First Days, Humphrey Jennings and Harry Watt co-directed London Can Take It (1940), on the theme of how Londoners withstood the bombing. Jennings then directed Heart of Britain, this time on the resistance of the whole of England. Words for Battle, which followed, was a historical propaganda film. It was with Listen to Britain, in 1942, however, that Jennings truly reached the public. The war was not yet won, but propaganda in the English documentary was far more effective than the enemy's, and managed, at the same time, to touch every layer of the population.

  13. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    Directory of Open Access Journals (Sweden)

    Sung-Chien Lin

    2014-07-01

    In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over the years, and to compare the core journals. In order to infer the structure of the topics in the field, the data of the papers published in the Journal of Informetrics and Scientometrics during 2007 to 2013 were retrieved from the database of the Web of Science as input to the approach of topic modeling. The results of this study show that when the number of topics was set to 10, the topic model had the smallest perplexity. Although the data scopes and analysis methods differ from previous studies, the topics generated in this study are consistent with results produced by analyses of experts. Empirical case studies and measurements of bibliometric indicators were considered important in every year of the analytic period, and the field showed increasing stability. Both core journals broadly paid attention to all of the topics in the field of Informetrics. The Journal of Informetrics put particular emphasis on the construction and application of bibliometric indicators, and Scientometrics focused on the evaluation and the factors of productivity of countries, institutions, domains, and journals.
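Choosing the topic count by perplexity, as the study does, can be sketched with scikit-learn's LDA implementation. The toy corpus below stands in for the 2007-2013 Informetrics papers; the study's actual corpus and tooling are assumptions here:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# toy abstracts standing in for the real corpus (illustrative only)
docs = [
    "citation analysis of journal impact indicators",
    "h-index and citation impact of researchers",
    "topic model of research trends in scientometrics",
    "bibliometric indicators for university rankings",
    "co-citation networks and research topics",
    "productivity of countries measured by publications",
]

X = CountVectorizer(stop_words="english").fit_transform(docs)

def best_topic_count(X, candidates=(2, 3, 4), seed=0):
    """Fit one LDA model per candidate topic count; keep the lowest perplexity."""
    scores = {}
    for k in candidates:
        lda = LatentDirichletAllocation(n_components=k, random_state=seed).fit(X)
        scores[k] = lda.perplexity(X)    # lower perplexity = better fit
    return min(scores, key=scores.get), scores

k, scores = best_topic_count(X)
```

On the real corpus this sweep would run over a wider range of candidates, with the reported minimum at 10 topics.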

  14. Finite element analyses of a heater-interruption in the HAW test field

    International Nuclear Information System (INIS)

    Horn, B.A. van den.

    1991-09-01

    In this report the results of two finite element analyses of the HAW field are presented. The determination of the influence of a heater interruption on the tube load, as well as the differences in the evaluation of the tube load for the two types of boreholes (type A and type B), are the main objectives of this report. Axisymmetric models were made for both types of boreholes in order to simulate this heater interruption. It appeared that a heater interruption of 4 hours leads to a temperature drop of 17.2 °C at the borehole wall and to a maximum reduction of the tube load of 1.76 MPa. About 20 days after repair of the heaters, the evolution of the maximum temperature and the maximum tube load recovers; the difference from the corresponding evolutions under an uninterrupted heat production is negligible. (author). 9 refs.; 25 figs.; 5 tabs

  15. Design and testing of indigenous cost effective three dimensional radiation field analyser (3D RFA).

    Science.gov (United States)

    Ganesh, K M; Pichandi, A; Nehru, R M; Ravikumar, M

    2014-06-01

    The aim of the study is to design and validate an indigenous three dimensional Radiation Field Analyser (3D RFA). The feed system made for X, Y and Z axis movements is a stainless-steel lead screw with a deep-groove ball bearing mechanism, driven by stepper motors with an accuracy better than 0.5 mm. The telescopic column lifting unit was designed using linear actuation technology for lifting the water phantom. The acrylic phantom, with dimensions of 800 x 750 x 570 mm, was made with a wall thickness of 15 mm. The software was developed in the Visual Basic programming language and is classified into two types, viz. beam analyzer software and beam acquisition software. The pre-measurement checks were performed as per TG 106 recommendations. The physical parameters of photon PDDs, such as Dmax, D10, D20 and Quality Index (QI), and of electron PDDs, such as R50, Rp, E0, Ep0 and X-ray contamination values, can be obtained instantaneously by using the developed RFA system. Also, results for profile data such as field size, central axis deviation, penumbra, flatness and symmetry, calculated according to various protocols, can be obtained for both photon and electron beams. The PDD results for photon beams were compared with BJR Supplement 25 values and the profile data were compared with TG 40 recommendations. The results were in agreement with the standard protocols.
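The photon PDD parameters listed above can be extracted directly from a sampled depth-dose curve. A minimal sketch, assuming a measured PDD and using the widely quoted TPR20,10 ≈ 1.2661·(D20/D10) − 0.0595 approximation for the quality index (the RFA's own algorithm is not specified in the abstract):

```python
import numpy as np

def pdd_parameters(depth_mm, pdd):
    """Extract photon beam quality parameters from a percent-depth-dose curve."""
    depth_mm, pdd = np.asarray(depth_mm, float), np.asarray(pdd, float)
    dmax = depth_mm[np.argmax(pdd)]             # depth of dose maximum (mm)
    d10 = np.interp(100.0, depth_mm, pdd)       # %DD at 10 cm depth
    d20 = np.interp(200.0, depth_mm, pdd)       # %DD at 20 cm depth
    qi = 1.2661 * (d20 / d10) - 0.0595          # TPR20,10 approximation
    return dmax, d10, d20, qi

# crude, purely illustrative 6 MV-like PDD samples (depth in mm, dose in %)
depths = [0, 15, 50, 100, 150, 200, 250]
doses  = [50, 100, 87, 67, 52, 39, 30]
dmax, d10, d20, qi = pdd_parameters(depths, doses)
```

A production analyser would first smooth and normalize the scan; the interpolation-and-ratio step is the same.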

  16. Microsegregation in multicomponent alloy analysed by quantitative phase-field model

    International Nuclear Information System (INIS)

    Ohno, M; Takaki, T; Shibuta, Y

    2015-01-01

    Microsegregation behaviour in a ternary alloy system has been analysed by means of quantitative phase-field (Q-PF) simulations, with particular attention directed at the influence of the tie-line shift stemming from the different liquid diffusivities of the solute elements. The Q-PF model developed for non-isothermal solidification in multicomponent alloys with non-zero solid diffusivities was applied to the analysis of microsegregation in a ternary alloy consisting of fast and slow diffusing solute elements. The accuracy of the Q-PF simulation was first verified by performing a convergence test of the segregation ratio with respect to the interface thickness. From one-dimensional analysis, it was found that the microsegregation of the slow diffusing element is reduced due to the tie-line shift. In two-dimensional simulations, refinement of the microstructure, viz., a decrease of the secondary arm spacing, occurs at low cooling rates due to the formation of a diffusion layer of the slow diffusing element. This yields reductions in the degree of microsegregation for both the fast and slow diffusing elements. Importantly, in a wide range of cooling rates, the degree of microsegregation of the slow diffusing element is always lower than that of the fast diffusing element, which is entirely ascribable to the influence of the tie-line shift. (paper)

  17. Experimental Field Tests and Finite Element Analyses for Rock Cracking Using the Expansion of Vermiculite Materials

    Directory of Open Access Journals (Sweden)

    Chi-hyung Ahn

    2016-01-01

    In previous research, laboratory tests were performed in order to measure the expansion of vermiculite upon heating and to convert it into expansion pressure. Based on these test results, this study mainly focuses on experimental field tests conducted to verify that the expansion pressure obtained by heating vermiculite materials is enough to break massive, hard granite rock, with a view to excavating tunnels. Hexahedral granite specimens with a circular hole perforated in the center were constructed for the experimental tests. The circular holes were filled with vermiculite plus a thermal conduction material and then heated using a cartridge heater. As a result, all of the hexahedral granite specimens cracked at the surface after 700 seconds of thermal heating and were finally split into two pieces completely. Larger specimens only required more heating time and higher expansion pressure. The material properties of the granite rocks, obtained from the experimental tests, were used to produce finite element models for numerical analyses. The analysis results show good agreement with the experimental results in terms of initial cracking, propagation direction, and expansion pressure.

  18. Lung and heart dose volume analyses with CT simulator in tangential field irradiation of breast cancer

    International Nuclear Information System (INIS)

    Das, Indra J.; Cheng, Elizabeth C.; Fowble, Barbara

    1997-01-01

    breast are very different based on actual CT data. The slopes of the regression lines for the left and right lung are 0.64%/mm and 0.54%/mm, respectively, with a combined slope of 0.6%/mm. With the selection of proper beam parameters, the heart volume can be minimized. As expected, there is no correlation between heart PIV and the CLD. A maximum heart PIV of 5.6% is observed, with one fourth of patients having a PIV of 0%. The heart PIV is inversely correlated with gantry angle, as shown in Figure 2. Due to radiation scatter in the body, the geometrical volume may be of limited importance and, hence, dose volume histogram (DVH) analyses were performed. A representative DVH of a patient whose lung and heart PIV were 14.4% and 4.0%, respectively, is shown in Figure 3. Conclusions: The CT-simulator provides accurate volumetric information on the heart and lungs in the treatment fields. The lung PIV is directly correlated with the CLD. Left and right lungs have different volumes and, hence, different regression lines are recommended. Heart volume is not correlated with the CLD. The heart PIV is associated with the beam angle. The heart volume may not be accurately visualized in a tangential radiograph; however, it can be easily seen in a DRR with contour delineation and can be minimized with proper beam parameters. Lung and heart PIV along with DVH are essential in reducing pulmonary and cardiac complications.
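The reported PIV-CLD relationship is a simple least-squares regression. A minimal sketch; the data pairs below are hypothetical values consistent with the ~0.6%/mm combined slope, not the study's measurements:

```python
import numpy as np

# hypothetical (CLD in mm, lung PIV in %) pairs, illustrative only
cld = np.array([10.0, 15.0, 20.0, 25.0, 30.0])   # central lung distance
piv = np.array([6.1, 8.9, 12.2, 14.8, 18.1])     # percent irradiated volume

# linear regression PIV = slope * CLD + intercept
slope, intercept = np.polyfit(cld, piv, 1)
```

With separate left- and right-lung datasets, the same fit would yield the two distinct regression lines the abstract recommends.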

  19. Stratigraphy, palaeoenvironments and palaeoecology of the Loch Humphrey Burn lagerstätte and other Mississippian palaeobotanical localities of the Kilpatrick Hills, southwest Scotland

    OpenAIRE

    Bateman, Richard M.; Stevens, Liadan G.; Hilton, Jason

    2016-01-01

    Background and Aims. The largely Mississippian strata of the Kilpatrick Hills, located at the western end of the Scottish Midland Valley, enclose several macrofossil floras that together contain ca 21 organ-species of permineralised plants and ca 44 organ-species of compressed plants, here estimated to represent 25 whole-plant species (Glenarbuck = nine, Loch Humphrey Burn Lower = 11, Upper = seven). The most significant locality is the internationally important volcanigenic sequence that is ...

  20. Intercomparison of fast response commercial gas analysers for nitrous oxide flux measurements under field conditions

    Science.gov (United States)

    Rannik, Ü.; Haapanala, S.; Shurpali, N. J.; Mammarella, I.; Lind, S.; Hyvönen, N.; Peltola, O.; Zahniser, M.; Martikainen, P. J.; Vesala, T.

    2015-01-01

    Four gas analysers capable of measuring nitrous oxide (N2O) concentration at a response time necessary for eddy covariance flux measurements were operated from spring until winter 2011 over a field cultivated with reed canary grass (RCG, Phalaris arundinacea, L.), a perennial bioenergy crop in eastern Finland. The instruments were TGA100A (Campbell Scientific Inc.), CW-TILDAS-CS (Aerodyne Research Inc.), N2O / CO-23d (Los Gatos Research Inc.) and QC-TILDAS-76-CS (Aerodyne Research Inc.). The period with high emissions, lasting for about 2 weeks after fertilization in late May, was characterized by an up to 2 orders of magnitude higher emission, whereas during the rest of the campaign the N2O fluxes were small, from 0.01 to 1 nmol m-2 s-1. Two instruments, CW-TILDAS-CS and N2O / CO-23d, determined the N2O exchange with minor systematic difference throughout the campaign, when operated simultaneously. TGA100A produced the cumulatively highest N2O estimates (with 29% higher values during the period when all instruments were operational). QC-TILDAS-76-CS obtained 36% lower fluxes than CW-TILDAS-CS during the first period, including the emission episode, whereas the correspondence with other instruments during the rest of the campaign was good. The reasons for systematic differences were not identified, suggesting further need for detailed evaluation of instrument performance under field conditions with emphasis on stability, calibration and any other factors that can systematically affect the accuracy of flux measurements. The instrument CW-TILDAS-CS was characterized by the lowest noise level (with a standard deviation of around 0.12 ppb at 10 Hz sampling rate) as compared to N2O / CO-23d and QC-TILDAS-76-CS (around 0.50 ppb) and TGA100A (around 2 ppb). 
We identified that for all instruments except CW-TILDAS-CS the random error due to instrumental noise was an important source of uncertainty at the 30 min averaging level and the total stochastic error was frequently
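The fluxes compared above are eddy covariance estimates: the 30 min covariance of vertical wind and concentration fluctuations. A minimal sketch with synthetic 10 Hz series (the 2 nmol m^-3 per m s^-1 coupling and noise levels are illustrative assumptions, not instrument specifications):

```python
import numpy as np

def eddy_flux(w, c):
    """Eddy-covariance flux: covariance of vertical wind and scalar fluctuations
    over one averaging block (block mean removed from each series)."""
    w, c = np.asarray(w, float), np.asarray(c, float)
    return np.mean((w - w.mean()) * (c - c.mean()))   # w'c'

# synthetic 30 min block at 10 Hz; correlated fluctuations give an upward flux
rng = np.random.default_rng(1)
w = rng.normal(0.0, 0.3, 18000)                     # vertical wind, m/s
c = 330.0 + 2.0 * w + rng.normal(0.0, 0.5, 18000)   # N2O, nmol/m^3 (hypothetical)
flux = eddy_flux(w, c)                              # nmol m^-2 s^-1
```

The instrumental-noise term in the concentration series is uncorrelated with w and so averages out of the covariance, but it still inflates the random error of each 30 min flux, which is the uncertainty the campaign above quantifies.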

  1. Exploring the Adult Learning Research Field by Analysing Who Cites Whom

    Science.gov (United States)

    Nylander, Erik; Österlund, Lovisa; Fejes, Andreas

    2018-01-01

    In this article we report on findings from a large-scale bibliographic study conducted based on the citation practices within the field of research on adult learning. Our data consist of 151,261 citation links between more than 33,000 different authors whose papers were published in five leading international journals in the field of adult…

  2. Field and ray analyses of antenna excitations in ICRF heating of large Tokamaks

    International Nuclear Information System (INIS)

    Bers, A.; Lister, G.; Jacquinot, J.

    1980-09-01

    We present analytical and computational techniques for determining the electromagnetic fields and associated power flow excited by antenna systems external to large Tokamak plasmas. The finite poloidal and toroidal extension of the poloidal antenna current is modeled by a superposition of current sheets placed at a fixed radius outside the plasma. Antennae both with and without a screen between the current sheet and the plasma are considered. The plasma is modeled by its cold dielectric tensor and inhomogeneous density and applied magnetic field. For large Tokamak plasmas in which the plasma dimensions are large compared to the antenna, the field excitation problem can be treated approximately in slab geometry. The field solution of this problem, which we present, gives the electromagnetic fields excited in the edge plasma by the antennae and includes the effect of the cutoffs which may exist in this region. To proceed further into the plasma we consider a ray tracing analysis. Starting from an equiphase surface of the excited fields in the edge plasma, the group velocity rays can be followed in full toroidal geometry up to the cyclotron resonance region, where the power is deposited in the particles. Both the amplitude and phase of the fields can be established in the vicinity of the singular surface, so that the power deposition profile can eventually be calculated.

  3. A kinetic model of retarding field analyser measurements in strongly magnetized, flowing, collisional plasmas

    Czech Academy of Sciences Publication Activity Database

    Gunn, J. P.; Fuchs, Vladimír; Kočan, M.

    2013-01-01

    Roč. 55, č. 4 (2013), 045012-045012 ISSN 0741-3335 R&D Projects: GA MŠk 7G10072 Institutional support: RVO:61389021 Keywords : plasma * collisions * magnetic field * retarding field analyzer Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 2.386, year: 2013 http://iopscience.iop.org/0741-3335/55/4/045012/pdf/0741-3335_55_4_045012.pdf

  4. Rates of change of the earth's magnetic field measured by recent analyses

    Science.gov (United States)

    Harrison, C. G. A.; Huang, Qilin

    1990-01-01

    Typical rates of change of the earth's magnetic field are presented as a function of the earth's spherical harmonics. Harmonics up to the eighth degree are analyzed. With increasing harmonic degree, an increase in the relative rate of change can be observed. For higher degrees, the rate of change can be predicted. This enables a differentiation between harmonics originating in the core and harmonics caused by crustal magnetization. The westward drift of the magnetic field depends on the longitudinal gradient of the field. In order to determine the longitudinal motions, harmonics up to degree 20 can be utilized. The average rate of secular acceleration increases with harmonic degree, from 0.001 deg/sq yr for the dipole term to an average of 0.05 deg/sq yr for degree-eight harmonics.

  5. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    OpenAIRE

    Sung-Chien Lin

    2014-01-01

    In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over years, and to compare the core journals. In order to infer the structure of the topics in the field, the data of the papers published in the Journal of Informetricsand Scientometrics during 2007 to 2013 are retrieved from the database of the Web of Science as input of the approach of topic modeling. The results ...

  6. Field sampling, preparation procedure and plutonium analyses of large freshwater samples

    International Nuclear Information System (INIS)

    Straelberg, E.; Bjerk, T.O.; Oestmo, K.; Brittain, J.E.

    2002-01-01

    This work is part of an investigation of the mobility of plutonium in freshwater systems containing humic substances. A well-defined bog-stream system located in the catchment area of a subalpine lake, Oevre Heimdalsvatn, Norway, is being studied. During the summer of 1999, six water samples were collected from the tributary stream Lektorbekken and the lake itself. However, the analyses showed that the plutonium concentration was below the detection limit in all the samples. Therefore renewed sampling at the same sites was carried out in August 2000. The results so far are in agreement with previous analyses from the Heimdalen area. However, 100 times higher concentrations are found in the lowlands in the eastern part of Norway. The reason for this is not understood, but may be caused by differences in the concentrations of humic substances and/or the fact that the mountain areas are covered with snow for a longer period of time every year. (LN)

  7. Thermal and hydraulic analyses of TFTR cooling water system and magnetic field coils

    International Nuclear Information System (INIS)

    Lee, A.Y.

    1975-10-01

    The TFTR toroidal field coils, ohmic heating, hybrid and equilibrium field coils are cooled by water from the machine area cooling water system. The system has the following major equipment and capacities: flow rate of 3600 gpm; ballast tank volume of 5500 gal; pumps of 70.4 m head; chiller refrigeration rating of 3300 tons and connecting pipe of 45.7 cm I.D. The performance of the closed loop system was analyzed and found to be adequate for the thermal loads. The field coils were analyzed with detailed thermal and hydraulic models, including a simulation of the complete water cooling loop. Under the nominal operating mode of one second of toroidal field flat top time and 300 seconds of pulse cycle time, the maximum temperature for the TF coils is 53 °C; for the OH coils 46 °C and for the EF coils 39 °C, which are well below the coil design limit of 120 °C. The maximum TF coil coolant temperature is 33 °C, which is below the coolant design limit of 100 °C. The overall pressure loss of the system is below 6.89 x 10^5 Pa (100 psi). With the given chiller refrigeration capacity, the TF coils can be operated to yield up to 4 seconds of flat top time. The TF coils can be operated on a steady state basis at up to 20% of the pulsed duty design current rating of 7.32 kA/coil.

  8. Red cherries (Prunus avium var. Stella) processed by pulsed electric field - Physical, chemical and microbiological analyses.

    Science.gov (United States)

    Sotelo, Kristine A G; Hamid, Nazimah; Oey, Indrawati; Pook, Chris; Gutierrez-Maddox, Noemi; Ma, Qianli; Ying Leong, Sze; Lu, Jun

    2018-02-01

    This study examined, for the first time, the effect of mild or moderate intensity pulsed electric field (PEF) processing on cherries, in particular changes in physicochemical properties, release of anthocyanins and polyphenols, and the potential growth of lactic acid bacteria. Cherry samples were treated at a constant pulse frequency of 100 Hz and a constant pulse width of 20 μs with different electric field strengths between 0.3 and 2.5 kV/cm. Titratable acidity and total soluble solids values of most PEF-treated samples stored for 24 h decreased significantly compared to other samples. Stored samples also had increased cyanidin glucoside content. However, concentrations of rutin, 4-hydroxybenzoic acid and isorhamnetin rutinoside decreased significantly in samples stored for 24 h. In conclusion, sweet cherries were only influenced by storage after PEF processing. PEF processing did not affect the growth of probiotic bacteria. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Field-driven chiral bubble dynamics analysed by a semi-analytical approach

    Science.gov (United States)

    Vandermeulen, J.; Leliaert, J.; Dupré, L.; Van Waeyenberge, B.

    2017-12-01

    Nowadays, field-driven chiral bubble dynamics in the presence of the Dzyaloshinskii-Moriya interaction are a topic of thorough investigation. In this paper, a semi-analytical approach is used to derive equations of motion that express the bubble wall (BW) velocity and the change in in-plane magnetization angle as a function of the micromagnetic parameters of the involved interactions, thereby taking into account the two-dimensional nature of the bubble wall. It is demonstrated that the equations of motion enable an accurate description of the expanding and shrinking convex bubble dynamics, and an expression for the transition field between shrinkage and expansion is derived. In addition, these equations of motion show that the BW velocity depends not only on the driving force, but also on the BW curvature. The absolute BW velocity increases for both a shrinking and an expanding bubble, but for different reasons: for expanding bubbles, it is due to the increasing importance of the driving force, while for shrinking bubbles, it is due to the increasing importance of contributions related to the BW curvature. Finally, using this approach we show how the recently proposed magnetic bubblecade memory can operate in the flow regime in the presence of a tilted sinusoidal magnetic field and at greatly reduced bubble sizes compared to the original device prototype.

  10. Soil–structure interaction analyses to locate nuclear power plant free-field seismic instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, James J., E-mail: jasjjoh@aol.com [James J. Johnson and Associates, Alamo, CA (United States); Ake, Jon P. [US Nuclear Regulatory Commission, Washington, DC (United States); Maslenikov, Oleg R. [James J. Johnson and Associates, Alamo, CA (United States); Kenneally, Roger M. [Consultant, Seminole, FL (United States)

    2015-12-15

    Highlights: • Determine the location of seismic instrumentation so that recorded motion will be free-field motion. • Certified designs of the AP1000 and EPR nuclear islands and the ABWR reactor building were analyzed. • Three site conditions and multiple recorded time histories were considered. • Instrumentation located one diameter from the edge of the structure/foundation is adequate. • Acceptance criteria were probabilities of non-exceedance of response spectra values. - Abstract: The recorded earthquake ground motion at a nuclear power plant site is needed for several purposes. US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.12, Nuclear Power Plant Instrumentation for Earthquakes, NRC (1997a), describes instrumentation acceptable for meeting the requirements in the NRC's regulations pertaining to earthquake engineering criteria for nuclear power plants. The ground motion data recorded by the free-field seismic instrumentation are used to compare the actual earthquake motion at the site with the design input motion. The result of the comparison determines whether the Operating Basis Earthquake ground motion (OBE) has been exceeded and plant shutdown is required, per the guidance in NRC Regulatory Guide 1.166, Pre-Earthquake Planning and Immediate Nuclear Power Plant Operator Postearthquake Actions, NRC (1997b). The free-field is defined as a location on the ground surface or in the site soil column that is sufficiently distant from the site structures to be essentially unaffected by their vibration.

  11. Analyses of the radiation-caused characteristics change in SOI MOSFETs using field shield isolation

    International Nuclear Information System (INIS)

    Hirano, Yuuichi; Maeda, Shigeru; Fernandez, Warren; Iwamatsu, Toshiaki; Yamaguchi, Yasuo; Maegawa, Shigeto; Nishimura, Tadashi

    1999-01-01

    Reliability against radiation is an important issue in silicon-on-insulator metal oxide semiconductor field effect transistors (SOI MOSFETs) used in satellites, nuclear power plants and other environments severely exposed to radiation. Radiation-caused characteristic changes related to the isolation edge in an irradiated environment were analyzed in SOI MOSFETs. Moreover, short-channel effects in an irradiated environment were investigated by simulation. It was revealed that the leakage current observed in local oxidation of silicon (LOCOS) isolated SOI MOSFETs was successfully suppressed by using field shield isolation. The simulated potential indicated that the potential rise seen at the LOCOS edge does not occur at a field shield isolation edge, which has no physical isolation. It was also found that the radiation-induced threshold voltage shift is more severe in the short-channel regime than in the long-channel regime. In transistors with a channel length of 0.18 μm, a potential rise of the body region caused by radiation-induced trapped holes can be seen in comparison with a channel length of 1.0 μm. As a result, these effects must be considered when designing deep-submicron devices for use in an irradiated environment. (author)

  12. Risk analyses for disposing nonhazardous oil field wastes in salt caverns

    Energy Technology Data Exchange (ETDEWEB)

    Tomasko, D.; Elcock, D.; Veil, J.; Caudle, D.

    1997-12-01

    Salt caverns have been used for several decades to store various hydrocarbon products. In the past few years, four facilities in the US have been permitted to dispose of nonhazardous oil field wastes in salt caverns. Several other disposal caverns have been permitted in Canada and Europe. This report evaluates the possibility that adverse human health effects could result from exposure to contaminants released from caverns in domal salt formations used for nonhazardous oil field waste disposal. The evaluation assumes normal operations but considers the possibility of leaks in cavern seals and cavern walls during the post-closure phase of operation. In this assessment, several steps were followed to identify possible human health risks. At the broadest level, these steps include identifying a reasonable set of contaminants of possible concern, identifying how humans could be exposed to these contaminants, assessing the toxicities of these contaminants, estimating their intakes, and characterizing the associated human health risks. The contaminants of concern for the assessment are benzene, cadmium, arsenic, and chromium. These were selected as being components of oil field waste that are likely to remain in solution long enough to reach a human receptor.

  13. BASEMAP, HUMPHREYS COUNTY, MS

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...

  14. High fidelity phase locked PIV measurements analysing the flow fields surrounding an oscillating piezoelectric fan

    International Nuclear Information System (INIS)

    Jeffers, Nicholas; Nolan, Kevin; Stafford, Jason; Donnelly, Brian

    2014-01-01

    Piezoelectric fans have been studied extensively and are seen as a promising technology for thermal management due to their ability to provide quiet, reliable cooling with low power consumption. The fluid mechanics of an unconfined piezoelectric fan are complex, which is why the majority of the literature to date confines the fan in an attempt to simplify the flow field. This paper investigates the fluid mechanics of an unconfined fan operating in its first vibration frequency mode. The piezoelectric fan used in this study measures 12.7 mm × 70 mm and resonates at 92.5 Hz in air. A custom-built experimental facility was developed to capture the fan's flow field using phase-locked Particle Image Velocimetry (PIV). The phase-locked PIV results are presented in terms of vorticity and show the formation of a horseshoe vortex. A three-dimensional λ2 criterion constructed from interpolated PIV measurements was used to identify the vortex core in the vicinity of the fan. This analysis clearly identified the formation of a horseshoe vortex that turns into a hairpin vortex before breaking up due to a combination of vortex shedding and flow along the fan blade. The results presented in this paper contribute to both the fluid dynamics and heat transfer literature concerning first-mode fan oscillation.

  15. An evaluation of fresh gas flow rates for spontaneously breathing cats and small dogs on the Humphrey ADE semi-closed breathing system.

    Science.gov (United States)

    Gale, Elizabeth; Ticehurst, Kim E; Zaki, Sanaa

    2015-05-01

    To evaluate the fresh gas flow (FGF) rate requirements for the Humphrey ADE semi-closed breathing system in the Mapleson A mode; to determine the FGF at which rebreathing occurs, and to compare the efficiency of this system to the Bain (Mapleson D) system in spontaneously breathing cats and small dogs. Prospective clinical study. Twenty-five healthy (ASA score I or II) client-owned cats and dogs (mean ± SD age 4.7 ± 5.0 years, and body weight 5.64 ± 3.26 kg) undergoing elective surgery or minor procedures. Anaesthesia was maintained with isoflurane delivered via the Humphrey ADE system in the A mode using an oxygen FGF of 100 mL kg⁻¹ minute⁻¹. The FGF was then reduced incrementally by 5-10 mL kg⁻¹ minute⁻¹ at approximately five-minute intervals until rebreathing (inspired CO₂ >5 mmHg (0.7 kPa)) was observed, after which flow rates were increased. In six animals, once the minimum FGF at which rebreathing occurred was found, the breathing system was changed to the Bain, and the effects of this FGF delivery examined, before the FGF was increased. Rebreathing did not occur at the FGF recommended by the manufacturer for the ADE. The mean ± SD FGF that resulted in rebreathing was 60 ± 20 mL kg⁻¹ minute⁻¹. The mean minimum FGF at which rebreathing did not occur with the ADE was 87 ± 39 mL kg⁻¹ minute⁻¹. This FGF resulted in significant rebreathing (inspired CO₂ 8.8 ± 2.6 mmHg (1.2 ± 0.3 kPa)) on the Bain system. The FGF rates recommended for the Humphrey ADE are adequate to prevent rebreathing in spontaneously breathing cats and small dogs. © 2014 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesia and Analgesia.

  16. Statistical Analyses of Second Indoor Bio-Release Field Evaluation Study at Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-12-17

    In September 2008 a large-scale testing operation (referred to as the INL-2 test) was performed within a two-story building (PBF-632) at the Idaho National Laboratory (INL). The report “Operational Observations on the INL-2 Experiment” defines the seven objectives for this test and discusses the results and conclusions; this is further discussed in the introduction of this report. The INL-2 test consisted of five tests (events) in which a floor (level) of the building was contaminated with the harmless biological warfare agent simulant Bg and samples were taken in most, if not all, of the rooms on the contaminated floor. After the sampling, the building was decontaminated, and the next test was performed. Judgmental samples and probabilistic samples were determined and taken during each test. Vacuum, wipe, and swab samples were taken within each room. The purpose of this report is to study an additional four topics that were not within the scope of the original report. These topics are: 1) assess the quantitative assumptions about the data being normally or log-normally distributed; 2) evaluate differences and quantify the sample-to-sample variability within a room and across rooms; 3) perform geostatistical types of analyses to study spatial correlations; and 4) quantify the differences observed between surface types and sampling methods for each scenario and study the consistency across scenarios. The following four paragraphs summarize the results of each of the four additional analyses. All samples after decontamination came back negative. Because of this, it was not appropriate to determine if these clearance samples were normally distributed. As Table 1 shows, the characterization data consist of values between and inclusive of 0 and 100 CFU/cm² (100 was the value assigned when the number was too numerous to count). The 100 values are generally much bigger than the rest of the data, causing the data to be right skewed. There are also a significant

  17. Aesthetic appreciation: event-related field and time-frequency analyses.

    Science.gov (United States)

    Munar, Enric; Nadal, Marcos; Castellanos, Nazareth P; Flexas, Albert; Maestú, Fernando; Mirasso, Claudio; Cela-Conde, Camilo J

    2011-01-01

    Improvements in neuroimaging methods have afforded significant advances in our knowledge of the cognitive and neural foundations of aesthetic appreciation. We used magnetoencephalography (MEG) to register brain activity while participants decided about the beauty of visual stimuli. The data were analyzed with event-related field (ERF) and Time-Frequency (TF) procedures. ERFs revealed no significant differences between brain activity related with stimuli rated as "beautiful" and "not beautiful." TF analysis showed clear differences between both conditions 400 ms after stimulus onset. Oscillatory power was greater for stimuli rated as "beautiful" than those regarded as "not beautiful" in the four frequency bands (theta, alpha, beta, and gamma). These results are interpreted in the frame of synchronization studies.

  18. Determination of Watershed Infiltration and Erosion Parameters from Field Rainfall Simulation Analyses

    Directory of Open Access Journals (Sweden)

    Mark E. Grismer

    2016-06-01

    Full Text Available Realistic modeling of infiltration, runoff and erosion processes from watersheds requires estimation of the effective hydraulic conductivity (Km) of the hillslope soils and how it varies with soil tilth, depth and cover conditions. Field rainfall simulation (RS) plot studies provide an opportunity to assess the surface soil hydraulic and erodibility conditions, but a standardized interpretation and comparison of results of this kind from a wide variety of test conditions has been difficult. Here, we develop solutions to the combined set of time-to-ponding/runoff and Green–Ampt infiltration equations to determine Km values from RS test plot results and compare them to the simpler calculation of steady rain minus runoff rates. Relating soil detachment rates to stream power, we also examine the determination of “erodibility” as the ratio thereof. Using data from over 400 RS plot studies across the Lake Tahoe Basin area that employ a wide range of rain rates across a range of soil slopes and conditions, we find that Km values can be determined from the combined infiltration equation for ~80% of the plot data and that the laminar flow form of stream power best described a constant “erodibility” across a range of volcanic ski-run soil conditions. Moreover, definition of stream power based on laminar flows obviates the need to assume an arbitrary Manning's “n” value and the restriction to mild slopes (<10%). The infiltration-equation-based Km values, though more variable, were on average equivalent to those determined from the simpler calculation of steady rain minus steady runoff rates from the RS plots. However, these Km values were much smaller than those determined from other field test methods. Finally, we compare RS plot results from use of different rainfall simulators in the basin and demonstrate that despite the varying configurations and rain intensities, similar erodibilities were determined across a range of
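    The combined time-to-ponding/Green–Ampt approach described above can be illustrated with a minimal numerical sketch. This is not the authors' code: function names and parameter values are hypothetical, and the wetting-front term is lumped into a single suction-times-moisture-deficit parameter as in the standard Green–Ampt formulation.

```python
# Minimal Green-Ampt sketch for back-calculating an effective hydraulic
# conductivity Km from rainfall-simulator plot data. Illustrative only.
import math

def green_ampt_cumulative(K, psi_dtheta, t, tol=1e-10):
    """Cumulative infiltration F(t) from the implicit Green-Ampt relation
    F = K*t + psi_dtheta * ln(1 + F/psi_dtheta), by fixed-point iteration.
    K: hydraulic conductivity; psi_dtheta: suction head * moisture deficit."""
    F = max(K * t, 1e-9)
    for _ in range(200):
        F_new = K * t + psi_dtheta * math.log(1.0 + F / psi_dtheta)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

def infiltration_rate(K, psi_dtheta, F):
    """Instantaneous infiltration capacity f = K * (1 + psi_dtheta / F);
    decays toward K as the wetted depth F grows."""
    return K * (1.0 + psi_dtheta / F)

def km_from_steady_rates(rain_rate, runoff_rate):
    """The paper's simpler estimate: Km ~ steady rain minus steady runoff."""
    return rain_rate - runoff_rate
```

    At long times the infiltration capacity approaches K, which is why steady rain minus steady runoff serves as the simple field estimate the abstract compares against.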

  19. Thermo-mechanical analyses and model validation in the HAW test field. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Heijdra, J J; Broerse, J; Prij, J

    1995-01-01

    An overview is given of the thermo-mechanical analysis work done for the design of the High Active Waste experiment and for the purpose of validation of the used models through comparison with experiments. A brief treatise is given on the problems of validation of models used for the prediction of physical behaviour which cannot be determined with experiments. The analysis work encompasses investigations into the initial state of stress in the field, the constitutive relations, the temperature rise, and the pressure on the liner tubes inserted in the field to guarantee the retrievability of the radioactive sources used for the experiment. The measurements of temperatures, deformations, and stresses are described and an evaluation is given of the comparison of measured and calculated data. An attempt has been made to qualify or even quantify the discrepancies, if any, between measurements and calculations. It was found that the model for the temperature calculations performed adequately. For the stresses the general tendency was good, however, large discrepancies exist mainly due to inaccuracies in the measurements. For the deformations again the general tendency of the model predictions was in accordance with the measurements. However, from the evaluation it appears that in spite of the efforts to estimate the correct initial rock pressure at the location of the experiment, this pressure has been underestimated. The evaluation has contributed to a considerable increase in confidence in the models and gives no reason to question the constitutive model for rock salt. However, due to the quality of the measurements of the stress and the relatively short period of the experiments no quantitatively firm support for the constitutive model is acquired. Collections of graphs giving the measured and calculated data are attached as appendices. (orig.).

  1. Three-dimensional finite element model for flexible pavement analyses based on field modulus measurements

    International Nuclear Information System (INIS)

    Lacey, G.; Thenoux, G.; Rodriguez-Roa, F.

    2008-01-01

    In accordance with the present development of empirical-mechanistic tools, this paper presents an alternative to traditional analysis methods for flexible pavements using a three-dimensional finite element formulation based on a linear-elastic perfectly-plastic Drucker–Prager model for granular soil layers and a linear-elastic stress-strain law for the asphalt layer. From the sensitivity analysis performed, it was found that variations of ±4° in the internal friction angle of granular soil layers did not significantly affect the analyzed pavement response. On the other hand, a null dilation angle is conservatively proposed for design purposes. The use of a Light Falling Weight Deflectometer is also proposed as an effective and practical tool for on-site elastic modulus determination of granular soil layers. However, the stiffness value obtained from the tested layer should be corrected when the measured peak deflection and the peak force do not occur at the same time. In addition, some practical observations are given to achieve successful field measurements. The importance of using a 3D FE analysis to predict the maximum tensile strain at the bottom of the asphalt layer (related to pavement fatigue) and the maximum vertical compressive strain transmitted to the top of the granular soil layers (related to rutting) is also shown. (author)
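    As a rough illustration of how a Light Falling Weight Deflectometer drop yields an on-site elastic modulus, the classic Boussinesq surface-deflection relation can be applied. This is a generic sketch, not the time-lag correction the paper proposes; the function name and all numbers are illustrative.

```python
# Back-calculating a layer modulus from one LFWD drop via the Boussinesq
# relation E = A * (1 - nu^2) * sigma * a / d. The shape factor A depends on
# the assumed contact stress distribution (2 for a flexible plate with
# uniform stress, pi/2 for a rigid plate). Illustrative values only.
import math

def lfwd_modulus(peak_force_N, plate_radius_m, peak_deflection_m,
                 poisson_ratio=0.35, shape_factor=2.0):
    """Elastic modulus (Pa) of the tested layer from a single LFWD drop."""
    # Mean contact stress under the loading plate (Pa)
    contact_stress = peak_force_N / (math.pi * plate_radius_m ** 2)
    return (shape_factor * (1.0 - poisson_ratio ** 2)
            * contact_stress * plate_radius_m / peak_deflection_m)

# Example: 7 kN peak force on a 150 mm radius plate, 0.4 mm peak deflection
# gives a modulus of roughly 65 MPa.
E = lfwd_modulus(7000.0, 0.15, 0.4e-3)
```

    When peak force and peak deflection do not occur simultaneously, as the abstract notes, the modulus from this static relation needs correction before use.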

  2. Numerical analyses of magnetic field and force in toroidal superconducting magnetic energy storage using unit coils (abstract)

    International Nuclear Information System (INIS)

    Kanamaru, Y.; Nakayama, T.; Amemiya, Y.

    1997-01-01

    Superconducting magnetic energy storage (SMES) is more useful than other systems of electric energy storage because of its larger amounts of stored energy and its higher efficiency. There are two types of SMES: the solenoid type and the toroidal type. Some models of solenoid-type SMES have been designed in the US and in Japan. However, large-scale SMES causes a high magnetic field in the living environment and can cause the erroneous operation of electronic equipment. The authors studied suitable designs of magnetic shielding for solenoid-type SMES to reduce the magnetic field in the living environment. The toroidal-type SMES is studied in this article. The magnetic leakage flux of the toroidal-type SMES is generally lower than that of the solenoid-type SMES. The toroidal-type SMES is constructed of unit coils, which are convenient for construction. Magnetic leakage flux occurs between the unit coils, and the electromagnetic force on the coils is very strong. Therefore, analyses of the leakage flux and electromagnetic force are important to the design of SMES. The authors studied the number, radius, and length of unit coils. The storage energy is 5 GWh. The numerical analyses of magnetic fields in the toroidal-type SMES are obtained by analytical solutions. copyright 1997 American Institute of Physics
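    For orientation, the ideal-toroid textbook relations show how field and stored energy scale with the coil parameters being optimized (number, radius, length of unit coils). This sketch deliberately ignores the discrete unit-coil structure and the inter-coil leakage flux that the paper analyzes; all values are illustrative.

```python
# Ideal-toroid scaling estimates relevant to toroidal SMES sizing.
# Not the unit-coil analysis of the paper, which resolves leakage flux
# between discrete coils; illustrative only.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def toroid_field(N, I, r):
    """Azimuthal flux density B (T) at major radius r inside an ideal toroid
    with N total turns carrying current I: B = mu0*N*I / (2*pi*r)."""
    return MU0 * N * I / (2.0 * math.pi * r)

def toroid_energy(N, I, height, r_inner, r_outer):
    """Stored energy (J) of a rectangular-cross-section toroid, E = L*I^2/2
    with inductance L = mu0 * N^2 * h * ln(r_outer/r_inner) / (2*pi)."""
    L = MU0 * N ** 2 * height * math.log(r_outer / r_inner) / (2.0 * math.pi)
    return 0.5 * L * I ** 2
```

    The quadratic dependence of stored energy on current is why modest coil currents demand very large coil counts and radii to reach GWh-class storage.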

  3. Lava flow field emplacement studies of Mauna Ulu (Kilauea Volcano, Hawai'i, United States) and Venus, using field and remote sensing analyses

    Science.gov (United States)

    Byrnes, Jeffrey Myer

    2002-04-01

    This work examines lava emplacement processes by characterizing surface units using field and remote sensing analyses in order to understand the development of lava flow fields. Specific study areas are the 1969–1974 Mauna Ulu compound flow field (Kilauea Volcano, Hawai'i, USA) and five lava flow fields on Venus: Turgmam Fluctus, Zipaltonal Fluctus, the Tuli Mons/Uilata Fluctus flow complex, the Var Mons flow field, and Mylitta Fluctus. Lava surface units have been examined in the field and with visible-, thermal-, and radar-wavelength remote sensing datasets for Mauna Ulu, and with radar data for the Venusian study areas. For the Mauna Ulu flow field, visible characteristics are related to color, glass abundance, and dm- to m-scale surface irregularities, which reflect the lava flow regime, cooling, and modification due to processes such as coalescence and inflation. Thermal characteristics are primarily affected by the abundance of glass and small-scale roughness elements (such as vesicles), and reflect the history of cooling, vesiculation and degassing, and crystallization of the lava. Radar characteristics are primarily affected by unit topography and fracturing, which are related to flow inflation, remobilization, and collapse, and reflect the local supply of lava during and after unit emplacement. Mauna Ulu surface units are correlated with pre-eruption topography, lack a simple relationship to the main feeder lava tubes, and are distributed with respect to their position within compound flow lobes and with distance from the vent. The Venusian lava flow fields appear to have developed through emplacement of numerous, thin, simple and compound flows, presumably over extended periods of time, and show a wider range of radar roughness than is observed at Mauna Ulu. A potential correlation is suggested between flow rheology and surface roughness. Distributary flow morphologies may result from tube-fed flows, and flow inflation is consistent with observed

  4. Analyses of patterns-of-failure and prognostic factors according to radiation fields in early-stage Hodgkin lymphoma

    International Nuclear Information System (INIS)

    Krebs, Lorraine; Guillerm, Sophie; Menard, Jean; Hennequin, Christophe; Quero, Laurent; Amorin, Sandy; Brice, Pauline

    2017-01-01

    Doses and volumes of radiation therapy (RT) for early stages of Hodgkin lymphoma (HL) have been reduced over the last 30 years. Combined modality therapy (CMT) is currently the standard treatment for most patients with early-stage HL. The aim of this study was to analyze the site of relapse after RT according to the extent of the radiation fields. Between 1987 and 2011, 427 patients were treated at our institution with RT ± chemotherapy for stage-I/II HL. Among these, 65 patients who experienced a relapse were retrospectively analyzed. Most patients had nodular sclerosis histology (86 %) and stage-II disease (75.9 %). Bulky disease was present in 21 % of patients, and 56 % belonged to the unfavorable risk group according to European Organization for Research and Treatment of Cancer (EORTC)/Lymphoma Study Association (LYSA) definitions. CMT was delivered to 91 % of patients. All patients received RT with doses ranging from 20 to 45 Gy (mean = 34 ± 5.3 Gy). The involved-field RT technique was used in 59 % of patients. The mean time between diagnosis and relapse was 4.2 years (range 0.3-24.5). Out-of-field relapses occurred in 53 % of patients. Relapses occurred more frequently at out-of-field sites in patients with a favorable disease status, whereas in-field relapses were associated with bulky mediastinal disease. Relapses occurred later in the favorable than in the unfavorable risk group (3.5 vs. 2.9 years, p = 0.5). In multivariate analyses, neither RT dose nor RT field size was predictive of an in-field relapse (p = 0.25 and p = 0.8, respectively); only bulky disease was predictive (p = 0.018). In patients with bulky disease, RT dose and RT field size were not predictive of an in-field relapse. In this subgroup of patients, chemotherapy should be intensified. We confirmed the bad prognosis of early relapses. (orig.)

  5. Functional visual fields: relationship of visual field areas to self-reported function.

    Science.gov (United States)

    Subhi, Hikmat; Latham, Keziah; Myint, Joy; Crossland, Michael D

    2017-07-01

    The aim of this study is to relate areas of the visual field to functional difficulties to inform the development of a binocular visual field assessment that can reflect the functional consequences of visual field loss. Fifty-two participants with peripheral visual field loss undertook binocular assessment of visual fields using the 30-2 and 60-4 SITA Fast programs on the Humphrey Field Analyser, and mean thresholds were derived. Binocular visual acuity, contrast sensitivity and near reading performance were also determined. Self-reported overall and mobility function were assessed using the Dutch ICF Activity Inventory. Greater visual field loss (0-60°) was associated with worse self-reported function, both overall (R² = 0.50) and for mobility (R² = 0.61), in multiple regression analyses. Superior and inferior visual field areas related similarly to mobility function (R² = 0.56) in multiple regression analysis. Mean threshold of the binocular visual field to 60° eccentricity is a good predictor of self-reported function overall, and particularly of mobility function. Both the central (0-30°) and peripheral (30-60°) mean thresholds are good predictors of self-reported function, but the peripheral (30-60°) field is a slightly better predictor of mobility function, and should not be ignored when considering the functional consequences of field loss. The inferior visual field is a slightly stronger predictor of perceived overall and mobility function than the superior field. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  6. Biogenic volatile organic compound analyses by PTR-TOF-MS: Calibration, humidity effect and reduced electric field dependency.

    Science.gov (United States)

    Pang, Xiaobing

    2015-06-01

    Green leaf volatiles (GLVs) emitted by plants after stress or damage induction are a major part of biogenic volatile organic compounds (BVOCs). Proton transfer reaction time-of-flight mass spectrometry (PTR-TOF-MS) is a high-resolution and sensitive technique for in situ GLV analyses, although its performance is strongly influenced by humidity and the reduced electric field. In this study the influence of gas humidity and the effect of reduced field (E/N) were examined, in addition to measuring calibration curves for the GLVs. Calibration curves measured for seven of the GLVs in dry air were linear, with sensitivities ranging from 5 to 10 ncps/ppbv (normalized counts per second/parts per billion by volume). The sensitivities for most GLV analyses were found to increase by between 20% and 35% when the humidity of the sample gas was raised from 0% to 70% relative humidity (RH) at 21°C, with the exception of (E)-2-hexenol. Product ion branching ratios were also affected by humidity, with the relative abundance of the protonated molecular ions and higher-mass fragment ions increasing with humidity. The effect of reduced field (E/N) on the fragmentation of GLVs was examined in the drift tube of the PTR-TOF-MS. The structurally similar GLVs are acutely susceptible to fragmentation following ionization, and the fragmentation patterns are highly dependent on E/N. Overall, the measured fragmentation patterns contain sufficient information to permit at least partial separation and identification of the isomeric GLVs by looking at differences in their fragmentation patterns at high and low E/N. Copyright © 2015. Published by Elsevier B.V.
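    The reported humidity dependence implies that a single dry-air sensitivity is not sufficient for quantification. A minimal sketch of humidity-corrected quantification follows, assuming a simple linear interpolation between a dry and a humid calibration point; the function names and all numbers are hypothetical, not values from the study.

```python
# Converting a PTR-TOF-MS ion signal (ncps) to a mixing ratio (ppbv) with a
# humidity-dependent sensitivity, reflecting the ~20-35% sensitivity increase
# from 0% to 70% RH reported above. Illustrative values only.
def sensitivity_at_rh(s_dry, s_humid, rh, rh_humid=70.0):
    """Linearly interpolate sensitivity (ncps/ppbv) between a dry (0% RH)
    and a humid (rh_humid % RH) calibration point, clamped to that range."""
    frac = min(max(rh / rh_humid, 0.0), 1.0)
    return s_dry + frac * (s_humid - s_dry)

def mixing_ratio_ppbv(signal_ncps, s_dry, s_humid, rh):
    """GLV mixing ratio (ppbv) from the normalized count rate."""
    return signal_ncps / sensitivity_at_rh(s_dry, s_humid, rh)

# Example: a GLV channel with 8 ncps/ppbv dry sensitivity rising 25% at
# 70% RH, and a measured signal of 40 ncps at 35% RH.
c = mixing_ratio_ppbv(40.0, 8.0, 10.0, 35.0)
```

    Using the dry sensitivity alone in this example would overestimate the mixing ratio by about 12%, illustrating why humidity-resolved calibration matters.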

  7. Dynamic regulation of GDP binding to G proteins revealed by magnetic field-dependent NMR relaxation analyses.

    Science.gov (United States)

    Toyama, Yuki; Kano, Hanaho; Mase, Yoko; Yokogawa, Mariko; Osawa, Masanori; Shimada, Ichio

    2017-02-22

    Heterotrimeric guanine-nucleotide-binding proteins (G proteins) serve as molecular switches in signalling pathways, by coupling the activation of cell surface receptors to intracellular responses. Mutations in the G protein α-subunit (Gα) that accelerate guanosine diphosphate (GDP) dissociation cause hyperactivation of the downstream effector proteins, leading to oncogenesis. However, the structural mechanism of the accelerated GDP dissociation has remained unclear. Here, we use magnetic field-dependent nuclear magnetic resonance relaxation analyses to investigate the structural and dynamic properties of GDP-bound Gα on a microsecond timescale. We show that Gα rapidly exchanges between a ground-state conformation, which tightly binds to GDP, and an excited conformation with reduced GDP affinity. The oncogenic D150N mutation accelerates GDP dissociation by shifting the equilibrium towards the excited conformation.

  8. Stratigraphy, palaeoenvironments and palaeoecology of the Loch Humphrey Burn lagerstätte and other Mississippian palaeobotanical localities of the Kilpatrick Hills, southwest Scotland

    Directory of Open Access Journals (Sweden)

    Richard M. Bateman

    2016-02-01

    Background and Aims. The largely Mississippian strata of the Kilpatrick Hills, located at the western end of the Scottish Midland Valley, enclose several macrofossil floras that together contain ca 21 organ-species of permineralised plants and ca 44 organ-species of compressed plants, here estimated to represent 25 whole-plant species (Glenarbuck = nine, Loch Humphrey Burn Lower = 11, Upper = seven). The most significant locality is the internationally important volcanigenic sequence that is reputedly intercalated within the Clyde Plateau Lava Formation at Loch Humphrey Burn, where ca 30 m of reworked tuffs and other clastic sediments enclose one of the world’s most important terrestrial lagerstätten of this period. We here explore the palaeoecology and palaeoenvironments of the locality, and elucidate its controversial age. Methods. Repeated re-excavation of key exposures allowed recognition of five main depositional units, differing in thickness from 4 m to 12 m. It also permitted detailed sampling for plant macrofossils and microfossils throughout the succession. Several approaches are integrated to re-assess the taphonomy and preservation of these exceptional plant fossils. Key Results. The deposits are rich in taxonomically diverse miospores and in toto contain at least six well-developed compression floras, together with two beds yielding nodules that enclose well-researched anatomically preserved plants permineralised in calcite. Bulk geochemistry shows that the upper nodules formed by migration of Ca with subordinate Mn and Na. Some phylogenetically important plant fossils recovered in the early 20th century have been traced to their source horizons. Trends in relative proportions of macrofossil and microfossil taxa through the sequence are only moderately congruent, perhaps reflecting the likelihood that microfossils sample the regional rather than the local flora. Conclusions. The Loch Humphrey Burn sequence encompasses a wide range

  9. Stratigraphy, palaeoenvironments and palaeoecology of the Loch Humphrey Burn lagerstätte and other Mississippian palaeobotanical localities of the Kilpatrick Hills, southwest Scotland.

    Science.gov (United States)

    Bateman, Richard M; Stevens, Liadan G; Hilton, Jason

    2016-01-01

    Background and Aims. The largely Mississippian strata of the Kilpatrick Hills, located at the western end of the Scottish Midland Valley, enclose several macrofossil floras that together contain ca 21 organ-species of permineralised plants and ca 44 organ-species of compressed plants, here estimated to represent 25 whole-plant species (Glenarbuck = nine, Loch Humphrey Burn Lower = 11, Upper = seven). The most significant locality is the internationally important volcanigenic sequence that is reputedly intercalated within the Clyde Plateau Lava Formation at Loch Humphrey Burn, where ca 30 m of reworked tuffs and other clastic sediments enclose one of the world's most important terrestrial lagerstätten of this period. We here explore the palaeoecology and palaeoenvironments of the locality, and elucidate its controversial age. Methods. Repeated re-excavation of key exposures allowed recognition of five main depositional units, differing in thickness from 4 m to 12 m. It also permitted detailed sampling for plant macrofossils and microfossils throughout the succession. Several approaches are integrated to re-assess the taphonomy and preservation of these exceptional plant fossils. Key Results. The deposits are rich in taxonomically diverse miospores and in toto contain at least six well-developed compression floras, together with two beds yielding nodules that enclose well-researched anatomically preserved plants permineralised in calcite. Bulk geochemistry shows that the upper nodules formed by migration of Ca with subordinate Mn and Na. Some phylogenetically important plant fossils recovered in the early 20th century have been traced to their source horizons. Trends in relative proportions of macrofossil and microfossil taxa through the sequence are only moderately congruent, perhaps reflecting the likelihood that microfossils sample the regional rather than the local flora. Conclusions. The Loch Humphrey Burn sequence encompasses a wide range of depositional

  10. Combining Geoelectrical Measurements and CO2 Analyses to Monitor the Enhanced Bioremediation of Hydrocarbon-Contaminated Soils: A Field Implementation

    Directory of Open Access Journals (Sweden)

    Cécile Noel

    2016-01-01

    Hydrocarbon-contaminated aquifers can be successfully remediated through enhanced biodegradation. However, in situ monitoring of the treatment by piezometers is expensive and invasive, and may be insufficient because the information provided is restricted to vertical profiles at discrete locations. An alternative method was tested in order to improve the robustness of the monitoring. Geophysical methods, electrical resistivity (ER) and induced polarization (IP), were combined with gas analyses, CO2 concentration, and its carbon isotopic ratio, to develop a less invasive methodology for monitoring enhanced biodegradation of hydrocarbons. The field implementation of this monitoring methodology, which lasted from February 2014 until June 2015, was carried out at a BTEX-polluted site under aerobic biotreatment. Geophysical monitoring shows a more conductive and chargeable area which corresponds to the contaminated zone. In this area, high CO2 emissions have been measured, with an isotopic signature demonstrating that the main source of CO2 on this site is the biodegradation of hydrocarbon fuels. In addition, the evolution of the geochemical and geophysical data over a year appears to track the seasonal variation of bacterial activity. Combining geophysics with gas analyses is thus promising to provide a new methodology for in situ monitoring.

  11. Analyses of patterns-of-failure and prognostic factors according to radiation fields in early-stage Hodgkin lymphoma

    Energy Technology Data Exchange (ETDEWEB)

    Krebs, Lorraine; Guillerm, Sophie; Menard, Jean; Hennequin, Christophe; Quero, Laurent [Saint Louis Hospital, Radiation Oncology Department, Paris (France); Amorin, Sandy; Brice, Pauline [Saint Louis Hospital, AP-HP, Hematooncology Department, Paris (France)

    2017-02-15

    Doses and volumes of radiation therapy (RT) for early stages of Hodgkin lymphoma (HL) have been reduced over the last 30 years. Combined modality therapy (CMT) is currently the standard treatment for most patients with early-stage HL. The aim of this study was to analyze the site of relapse after RT according to the extent of the radiation fields. Between 1987 and 2011, 427 patients were treated at our institution with RT ± chemotherapy for stage-I/II HL. Among these, 65 patients who experienced a relapse were retrospectively analyzed. Most patients had nodular sclerosis histology (86%) and stage-II disease (75.9%). Bulky disease was present in 21%, and 56% of patients belonged to the unfavorable risk group according to European Organization for Research and Treatment of Cancer (EORTC)/Lymphoma Study Association (LYSA) definitions. CMT was delivered to 91% of patients. All patients received RT, with doses ranging from 20 to 45 Gy (mean = 34 ± 5.3 Gy). The involved-field RT technique was used in 59% of patients. The mean time between diagnosis and relapse was 4.2 years (range 0.3-24.5). Out-of-field relapses occurred in 53% of patients. Relapses occurred more frequently at out-of-field sites in patients with a favorable disease status, whereas in-field relapses were associated with bulky mediastinal disease. Relapses occurred later in the favorable than in the unfavorable risk group (3.5 vs. 2.9 years, p = 0.5). In multivariate analyses, neither RT dose nor RT field size was predictive of an in-field relapse (p = 0.25 and p = 0.8, respectively); only bulky disease was predictive (p = 0.018). In patients with bulky disease, RT dose and RT field size were not predictive of an in-field relapse; in this subgroup of patients, chemotherapy should be intensified. We confirmed the poor prognosis of early relapses. (orig.)

  12. Medieval land use management and geochemistry - spatial analyses on scales from households properties to whole fields systems

    Science.gov (United States)

    Horák, Jan; Janovský, Martin; Klír, Tomáš; Šmejda, Ladislav; Legut-Pintal, Maria

    2017-04-01

    We present the final or preliminary results of our research on five villages: Spindelbach (Ore Mountains, north-western Bohemia), Hol (near Prague, central Bohemia), Lovětín and Regenholz (near Třešť, Czech-Moravian Upland) and Goschwitz (near Wroclaw, Poland). Our research is methodologically based on broad spatial soil sampling and mapping of basic soil conditions. We use XRF spectrometry as the main tool for multi-elemental analyses and as a first-step screening tool for large areas. A crucial element of our method is a sampling design that respects historical land-use features, such as the parts of the village field system or the possessions of individual households; macroscopic visual survey of the site is equally important for data collection and site understanding. It was revealed that phosphorus, the generally used and acknowledged indicator of human activity, can be present at only very low concentrations, or be undetectable, even in the vicinity of households. Natural conditions cannot be the causing factor in all cases; this situation also reflects the intensity of the most recent human activity and its spatial manifestation. In such cases, multi-elemental analysis is very useful. Zinc is usually correlated with phosphorus, which is in turn connected to lead. Indicators of past human activity are usually spatially associated with indicators of modern pollution; these two inputs can sometimes be distinguished by statistical analyses and by spatial visualisation of the data, whereas working with concentrations alone can be misleading. Past land-use management and its strategies were important for the spatial distribution of soil geochemical indicators. We can therefore use these indicators not only to quantify human impact on nature, but also to detect differences in management, knowledge and experience, as revealed, for example, by analyses of differences between households' possessions. For example, the generally presumed decreasing gradient of management intensity (e.g. manuring) along the distance from
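
A minimal sketch of the kind of multi-element screening discussed above: a Pearson correlation between two elements across sampling points (the abstract notes that Zn is usually correlated with P). The concentrations are invented; a real survey would use many more points and elements, plus spatial visualisation.

```python
# Hedged sketch: Pearson correlation between two soil elements across
# sampling points (the abstract notes Zn usually correlates with P).
# All concentrations are invented, not survey data.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical concentrations (mg/kg) at five sampling points
phosphorus = [310, 540, 420, 800, 290]
zinc = [40, 60, 52, 95, 30]
print(round(pearson(phosphorus, zinc), 2))
```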

  13. The integrated analyses of digital field mapping techniques and traditional field methods: implications from the Burdur-Fethiye Shear Zone, SW Turkey as a case-study

    Science.gov (United States)

    Elitez, İrem; Yaltırak, Cenk; Zabcı, Cengiz; Şahin, Murat

    2015-04-01

    Precise geological mapping is one of the most important issues in geological studies. Documenting the spatial distribution of geological bodies and their contacts plays a crucial role in interpreting the tectonic evolution of any region. Although traditional field techniques are still accepted as the most fundamental tools in the construction of geological maps, we suggest that integrating digital technologies with the classical methods significantly increases the resolution and quality of such products. We integrate digital data with traditional field observations in the following steps. First, we create a digital elevation model (DEM) of the region of interest by interpolating the digital contours of 1:25000-scale topographic maps to a ground pixel resolution of 10 m. Second, non-commercial Google Earth satellite imagery and the geological maps of previous studies are draped over the interpolated DEMs. The integration of all spatial data is done using ESRI ArcGIS, the market-leading GIS software. We make a preliminary interpretation of major structures, such as tectonic lineaments and stratigraphic contacts. These preliminary maps are checked and precisely coordinated during field studies using mobile tablets and/or phablets with GPS receivers; the same devices are also used to measure and record the geological structures of the study region. Finally, all digitally collected measurements and observations are added to the GIS database, and the geological map is finalised with all available information. We applied this integrated method to map the Burdur-Fethiye Shear Zone (BFSZ) in southwest Turkey. The BFSZ is an active sinistral shear zone, 60-90 km wide, that extends on land for about 300 km between Suhut-Cay in the northeast and Köyceğiz Lake-Kalkan in the southwest. Numerous studies suggest contradictory models not only for the evolution but also for the fault geometry of this

  14. Analyses of crystal field and exchange interaction of Dy3Ga5O12 under extreme conditions

    International Nuclear Information System (INIS)

    Wang Wei; Qi Xin; Yue Yuan

    2011-01-01

    This paper theoretically investigates the effects of the crystal field and the exchange interaction field on the magnetic properties of dysprosium gallium garnet under extreme conditions (low temperatures and high magnetic fields), based on quantum theory. Five sets of crystal field parameters are discussed and compared. It is demonstrated that the experiments cannot be successfully explained by the crystal field effect alone. Thus, following molecular field theory, an effective exchange field associated with the Dy-Dy exchange interaction is further taken into account. When both the crystal field and the exchange interaction field are considered, excellent agreement between the theoretical results and the experiments is obtained, further confirming that the exchange interaction field between rare-earth ions is of great importance to the magnetic properties of paramagnetic rare-earth gallium garnets. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  15. Spectrum integrated (n,He) cross section comparisons and least squares analyses for 6Li and 10B in benchmark fields

    International Nuclear Information System (INIS)

    Schenter, R.E.; Oliver, B.M.; Farrar, H. IV.

    1986-06-01

    Spectrum-integrated cross sections for 6Li and 10B from five benchmark fast reactor neutron fields are compared with calculated values obtained using the ENDF/B-V Cross Section Files. The benchmark fields include the Coupled Fast Reactivity Measurements Facility (CFRMF) at the Idaho National Engineering Laboratory, the 10% Enriched U-235 Critical Assembly (BIG-10) at Los Alamos National Laboratory, the Sigma-Sigma and Fission Cavity fields of the BR-1 reactor at CEN/SCK, and the Intermediate Energy Standard Neutron Field (ISNF) at the National Bureau of Standards. Results from least-squares analyses using the FERRET computer code to obtain adjusted cross section values and their uncertainties are presented. Inputs to these calculations include the above five benchmark data sets. These analyses indicate a need for revision of the ENDF/B-V files for the 10B and 6Li cross sections at energies above 50 keV.
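
The FERRET adjustment itself works with full covariance matrices, but its core idea can be hedged down to a one-parameter sketch: an inverse-variance-weighted combination of a calculated cross section and a benchmark measurement. The numerical values below are invented for illustration.

```python
# Heavily simplified, one-parameter analogue of a least-squares adjustment:
# combine a calculated (prior) cross section with a benchmark measurement by
# inverse-variance weighting. FERRET uses full covariance matrices; all
# numbers here are invented.

def adjust(prior, prior_unc, measured, measured_unc):
    """Return the inverse-variance-weighted value and its uncertainty."""
    w_prior = 1.0 / prior_unc ** 2
    w_meas = 1.0 / measured_unc ** 2
    value = (w_prior * prior + w_meas * measured) / (w_prior + w_meas)
    uncertainty = (1.0 / (w_prior + w_meas)) ** 0.5
    return value, uncertainty

# Hypothetical spectrum-integrated cross section (barns) and uncertainties
value, unc = adjust(prior=1.20, prior_unc=0.06, measured=1.32, measured_unc=0.04)
print(round(value, 3), round(unc, 3))
```

Note how the adjusted value moves toward the better-constrained measurement and its uncertainty shrinks below either input, which is the qualitative behaviour of such adjustments.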

  16. Risk factors for predicting visual field progression in Chinese patients with primary open-angle glaucoma: A retrospective study.

    Science.gov (United States)

    Hung, Kuo-Hsuan; Cheng, Ching-Yu; Liu, Catherine Jui-Ling

    2015-07-01

    Glaucoma is a leading cause of irreversible blindness worldwide. It is characterized by progressive deterioration of the visual field (VF) that results in a complete loss of vision. This study aimed to determine the risk factors associated with VF progression in Chinese patients with primary open-angle glaucoma (POAG). We reviewed the charts of POAG patients who visited our clinic between July 2009 and June 2010. We included patients with five or more reliable VF tests using the Humphrey Field Analyzer (Humphrey Instruments, San Leandro, CA, USA) during a period of at least 2 years. The scoring system of the Collaborative Initial Glaucoma Treatment Study (CIGTS) was used to code the VF. Progression was defined as an increasing score ≥3, compared to the averaged baseline data. Univariate and multivariate logistic regression analyses were performed to identify the risk factors of VF progression. There were 92 patients (representing 92 eyes) with an average of 8.9 reliable VFs over a mean follow up of 5.4 years. Multivariate logistic regression showed that eyes with more VF tests [odds ratio (OR) = 1.500, p < 0.010] and either increased peak intraocular pressure (IOP) (OR = 1.235, p = 0.044) or a wide IOP range (OR = 1.165, p = 0.041) favored VF progression. High myopia (less than -6.0 D) was not a risk factor (OR = 1.289, p = 0.698) for VF progression in this study. In addition to a greater number of VF tests, Chinese patients with treated POAG who experienced a high peak IOP or a wide range of IOP during follow up were more likely to have VF deterioration. Copyright © 2015. Published by Elsevier Taiwan.
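
The odds ratios quoted above come from logistic regression, where the OR for a one-unit increase of a predictor is the exponential of its fitted coefficient. The sketch below illustrates only this read-off step; the coefficient is a hypothetical value chosen so the OR lands near the 1.235 reported for peak IOP, not a fit to the study's data.

```python
# Hedged sketch: reading an odds ratio off a logistic-regression coefficient.
# The coefficient below is hypothetical, chosen only so the OR lands near the
# 1.235 per mmHg reported for peak IOP; it is not fitted to the study's data.
import math

def odds_ratio(beta, delta=1.0):
    """Odds ratio for a delta-unit increase of a predictor with coefficient beta."""
    return math.exp(beta * delta)

beta_peak_iop = 0.211  # hypothetical log-odds per mmHg of peak IOP
print(round(odds_ratio(beta_peak_iop), 3))  # ~1.235
```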

  17. Recovery of visual-field defects after occipital lobe infarction: a perimetric study.

    Science.gov (United States)

    Çelebisoy, Mehmet; Çelebisoy, Neşe; Bayam, Ece; Köse, Timur

    2011-06-01

    To assess the temporal course of homonymous visual-field defects due to occipital lobe infarction, using automated perimetry. Thirty-two patients with ischaemic infarction of the occipital lobe were studied prospectively, using a Humphrey Visual Field Analyser II. The visual field of each eye was divided into central, paracentral and peripheral zones. The mean visual sensitivity of each zone was calculated and used for the statistical analysis. The results of the initial examination, performed within 2 weeks of stroke, were compared with the results of the six-month control. Lesions were assigned by MRI to the localisations optic radiation, striate cortex, occipital pole and occipital convexity. A statistically significant improvement was noted, especially for the lower quadrants. Lesions of the occipital pole and convexity were not significantly associated with visual-field recovery. However, involvement of the striate cortex and extensive lesions involving all the areas studied were significantly associated with poor prognosis. Homonymous visual-field defects in our patients improved within 6 months. Restoration of the lower quadrants and especially the peripheral zones was noted. Incomplete damage to the striate cortex, which has a varying pattern of vascular supply, could explain this finding. Magnification factor theory, whereby the receptive-field size of striate cortex cells increases with visual-field eccentricity, may explain the more significant improvement in the peripheral zones.
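
The zonal analysis described above (central, paracentral and peripheral zones, each with a mean sensitivity) can be sketched as follows. The eccentricity cut-offs (10° and 20°) and the test points are assumptions for illustration, not the study's actual zone definitions or data.

```python
# Hedged sketch of a zonal perimetry analysis: classify test points by
# eccentricity and average the sensitivity per zone. The 10 deg / 20 deg
# cut-offs and the points are assumptions, not the study's definitions.
import math

def zone(x_deg, y_deg):
    """Classify a test point as central, paracentral or peripheral."""
    ecc = math.hypot(x_deg, y_deg)
    if ecc <= 10:
        return "central"
    if ecc <= 20:
        return "paracentral"
    return "peripheral"

# Hypothetical points: (x deg, y deg, sensitivity dB)
points = [(3, 3, 30), (9, -3, 28), (15, 9, 22), (-21, 9, 15), (27, -3, 10)]
sums, counts = {}, {}
for px, py, db in points:
    z = zone(px, py)
    sums[z] = sums.get(z, 0) + db
    counts[z] = counts.get(z, 0) + 1
means = {z: round(sums[z] / counts[z], 1) for z in sums}
print(means)
```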

  18. Risk Factors for Visual Field Progression in the Groningen Longitudinal Glaucoma Study : A Comparison of Different Statistical Approaches

    NARCIS (Netherlands)

    Wesselink, Christiaan; Marcus, Michael W.; Jansonius, Nomdo M.

    2012-01-01

    Purpose: To identify risk factors for visual field progression in glaucoma and to compare different statistical approaches with this risk factor analysis. Patients and Methods: We included 221 eyes of 221 patients. Progression was analyzed using Nonparametric Progression Analysis applied to Humphrey

  19. The role of host genetic factors in respiratory tract infectious diseases: systematic review, meta-analyses and field synopsis

    NARCIS (Netherlands)

    Patarčić, Inga; Gelemanović, Andrea; Kirin, Mirna; Kolčić, Ivana; Theodoratou, Evropi; Baillie, Kenneth J.; de Jong, Menno D.; Rudan, Igor; Campbell, Harry; Polašek, Ozren

    2015-01-01

    Host genetic factors have frequently been implicated in respiratory infectious diseases, often with inconsistent results in replication studies. We identified 386 studies from the total of 24,823 studies identified in a systematic search of four bibliographic databases. We performed meta-analyses of

  20. Wide-field LOFAR-LBA power-spectra analyses: Impact of calibration, polarization leakage and ionosphere

    Science.gov (United States)

    Gehlot, Bharat K.; Koopmans, Léon V. E.

    2018-05-01

    Contamination due to foregrounds, calibration errors and ionospheric effects poses major challenges for the detection of the cosmic 21 cm signal in Epoch of Reionization (EoR) experiments. We present the results of a study of a field centered on 3C196 using LOFAR Low Band observations, in which we quantify various wide-field and calibration effects such as gain errors, polarized foregrounds, and ionospheric effects. We observe a `pitchfork' structure in the power spectrum of the polarized intensity in delay-baseline space, which leaks into the modes beyond the instrumental horizon. We show that this structure arises due to strong instrumental polarization leakage (~30%) towards Cas A, which is far away from the primary field of view. We measure a small ionospheric diffractive scale towards Cas A, resembling pure Kolmogorov turbulence. Our work provides insights into the nature of the aforementioned effects and into mitigating them in future Cosmic Dawn observations.

  1. Comparative energy input–output and financial analyses of greenhouse and open field vegetables production in West Java, Indonesia

    International Nuclear Information System (INIS)

    Kuswardhani, Nita; Soni, Peeyush; Shivakoti, Ganesh P.

    2013-01-01

    This paper estimates energy consumption per unit floor area of greenhouse and open field production of tomato, chili and lettuce. Primary data were collected from 530 vegetable farmers during Jan-Dec 2010 in West Java, Indonesia. Energy estimates were calculated from the actual amounts of inputs and outputs and the corresponding conversion factors. Results reveal that the total input energy used in greenhouse (GH) production of tomato, chili (medium and high land) and lettuce was 47.62, 41.55, 58.84, and 24.54 GJ/ha, respectively, whereas the total input energy required for open field (OF) production of tomato, chili (medium and high land) and lettuce was 49.01, 41.04, 57.94 and 23.87 GJ/ha, respectively. The ratio of output to input energy was higher in greenhouse production (0.85, 0.45 and 0.49) than in open field production (0.52, 0.175 and 0.186) for tomato, chili medium land and chili highland, respectively, but the output-input ratio of open field lettuce production was twice that of greenhouse production. Financial analysis revealed higher mean net returns from greenhouse vegetable production, at 7043 $/ha (922–15,299 $/ha), compared to 571 $/ha (44–1172 $/ha) from open field vegetable production. Among the greenhouse vegetables, tomato cultivation was the most profitable in terms of energy efficiency and financial productivity. - Highlights: ► Energy input–output analysis is carried out to compare vegetable production in greenhouses and open fields. ► Tomato, chili and lettuce production in West Java, Indonesia. ► Economic analysis is conducted to compare the two production systems.
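
The output-input energy ratio used throughout the comparison above is a simple quotient of output to input energy per hectare. The input energies below are taken from the abstract's tomato figures, while the output energies are hypothetical values back-calculated to reproduce the reported ratios.

```python
# The output-input energy ratio is a simple quotient (GJ/ha over GJ/ha).
# Input energies below are the abstract's tomato figures; the output energies
# are hypothetical values back-calculated to reproduce the reported ratios.

def energy_ratio(output_gj_per_ha, input_gj_per_ha):
    """Dimensionless output-to-input energy ratio."""
    return output_gj_per_ha / input_gj_per_ha

greenhouse_tomato_input = 47.62  # GJ/ha (from the abstract)
open_field_tomato_input = 49.01  # GJ/ha (from the abstract)

print(round(energy_ratio(40.48, greenhouse_tomato_input), 2))  # ~0.85
print(round(energy_ratio(25.49, open_field_tomato_input), 2))  # ~0.52
```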

  2. Efficient EBE treatment of the dynamic far-field in non-linear FE soil-structure interaction analyses

    NARCIS (Netherlands)

    Crouch, R.S.; Bennett, T.

    2000-01-01

    This paper presents results and observations from the use of a rigorous method of treating the dynamic far-field as part of a non-linear FE analysis. The technique developed by Wolf and Song (referred to as the Scaled Boundary Finite-Element Method) is incorporated into a 3-D time-domain analysis

  3. Entropic potential field formed for a linear-motor protein near a filament: Statistical-mechanical analyses using simple models.

    Science.gov (United States)

    Amano, Ken-Ichi; Yoshidome, Takashi; Iwaki, Mitsuhiro; Suzuki, Makoto; Kinoshita, Masahiro

    2010-07-28

    We report new progress in elucidating the mechanism of the unidirectional movement of a linear-motor protein (e.g., myosin) along a filament (e.g., F-actin). The basic concept emphasized here is that a potential field is entropically formed for the protein on the filament immersed in solvent due to the effect of the translational displacement of solvent molecules. The entropic potential field is strongly dependent on geometric features of the protein and the filament, their overall shapes as well as details of the polyatomic structures. The features and the corresponding field are judiciously adjusted by the binding of adenosine triphosphate (ATP) to the protein, hydrolysis of ATP into adenosine diphosphate (ADP)+Pi, and release of Pi and ADP. As the first step, we propose the following physical picture: The potential field formed along the filament for the protein without the binding of ATP or ADP+Pi to it is largely different from that for the protein with the binding, and the directed movement is realized by repeated switches from one of the fields to the other. To illustrate the picture, we analyze the spatial distribution of the entropic potential between a large solute and a large body using the three-dimensional integral equation theory. The solute is modeled as a large hard sphere. Two model filaments are considered as the body: model 1 is a set of one-dimensionally connected large hard spheres and model 2 is a double helical structure formed by two sets of connected large hard spheres. The solute and the filament are immersed in small hard spheres forming the solvent. The major findings are as follows. The solute is strongly confined within a narrow space in contact with the filament. Within the space there are locations with sharply deep local potential minima along the filament, and the distance between two adjacent locations is equal to the diameter of the large spheres constituting the filament. The potential minima form a ringlike domain in model 1

  4. Sequence and phylogenetic analyses of novel totivirus-like double-stranded RNAs from field-collected powdery mildew fungi.

    Science.gov (United States)

    Kondo, Hideki; Hisano, Sakae; Chiba, Sotaro; Maruyama, Kazuyuki; Andika, Ida Bagus; Toyoda, Kazuhiro; Fujimori, Fumihiro; Suzuki, Nobuhiro

    2016-02-02

    The identification of mycoviruses contributes greatly to understanding of the diversity and evolutionary aspects of viruses. Powdery mildew fungi are important and widely studied obligate phytopathogenic agents, but there has been no report on mycoviruses infecting these fungi. In this study, we used a deep sequencing approach to analyze the double-stranded RNA (dsRNA) segments isolated from field-collected samples of powdery mildew fungus-infected red clover plants in Japan. Database searches identified the presence of at least ten totivirus (genus Totivirus)-like sequences, termed red clover powdery mildew-associated totiviruses (RPaTVs). The majority of these sequences shared moderate amino acid sequence identity with each other, suggesting a diversity of totivirus-like dsRNAs in powdery mildew fungus populations infecting red clover plants in the field. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. El profesor George H. Humphreys

    Directory of Open Access Journals (Sweden)

    Manuel José Luque

    1948-11-01

    And it is a magnificent spiritual joy, for the scale and nobility of its purpose, the very aim of its activities, and its moral stature; for the duel waged with death, face to face; and, finally, for the sacred satisfaction of conquering disease with the power of a brain served by ten fingers.

  6. Development of the temperature field at the WWER-440 core outlet monitoring system and application of the data analyses methods

    International Nuclear Information System (INIS)

    Spasova, V.; Georgieva, N.; Haralampieva, Tz.

    2001-01-01

    On-line internal reactor monitoring by 216 thermocouples, located at the reactor core outlet, is carried out during power operation of WWER-440 Units 1 and 2 at Kozloduy NPP. Automatic monitoring of the technological process is performed by the IB-500MA system, which collects and performs initial data processing (discretization and conversion of analogue signals into digital form). The paper also presents the results and analyses of power distribution monitoring during the past 21st and current 22nd fuel cycles at Kozloduy NPP Unit 1, using the archiving system capacity and related software. The possibility to perform operational assessment and analysis of the power distribution in the reactor core at each point of the fuel cycle is checked by comparing the neutron-physical calculation results with reactor coolant system parameters. The paper shows that processing and analysis of the significant amount of data accumulated in the archive files increases the accuracy and reliability of power distribution monitoring in the reactor core at each moment of the fuel cycle of the WWER-440 reactors at Kozloduy NPP.

  7. Identification and characterization of rock slope instabilities in Val Canaria (TI, Switzerland) based on field and DEM analyses

    Science.gov (United States)

    Ponzio, Maria; Pedrazzini, Andrea; Matasci, Battista; Jaboyedoff, Michel

    2013-04-01

    In Alpine areas, rockslides and rock avalanches are common gravitational hazards that potentially endanger people and infrastructure. The aim of this study is to characterize and understand the different factors influencing the distribution of large slope instabilities affecting Val Canaria (southern Switzerland). In particular, the importance of the tectonic and lithological settings as well as the impact of groundwater circulation are investigated in detail. Val Canaria is a SW-NE-trending lateral valley that displays potential for large rock slope failures. As it is located just above one of the main N-S communication routes (highway, railway) through the Alps, the development of large instabilities in Val Canaria might have dramatic consequences for the main valley downstream. The dominant geological structure of the study area is a major tectonic boundary separating two basement nappes of gneissic lithologies: the Gotthard massif and the Lucomagno nappe, located in the northern and southern parts of the valley, respectively. The basement units are separated by the meta-sediments of the Piora syncline, composed of gypsum, dolomitic breccia and fractured calc-mica schists. Along with detailed geological mapping, the use of remote sensing techniques (aerial and terrestrial laser scanning) allows us to propose a multi-disciplinary approach that combines geological mapping and interpretation with periodic monitoring of the most active rockslide areas. A large array of TLS point cloud datasets (first acquisition in 2006) constitutes a notable input for monitoring purposes, as well as for structural and rock mass characterization and failure mechanism interpretation. The analyses highlighted that both valley flanks are affected by deep-seated gravitational slope deformation covering a total area of about 8 km² (corresponding to 40% of the catchment area). The most active area corresponds to the lower part of the valley

  8. Electromagnetic field analyses of two-layer power transmission cables consisting of coated conductors with magnetic and non-magnetic substrates and AC losses in their superconductor layers

    International Nuclear Information System (INIS)

    Nakahata, Masaaki; Amemiya, Naoyuki

    2008-01-01

    Two-dimensional electromagnetic field analyses were undertaken using two representative cross sections of two-layer cables consisting of coated conductors with magnetic and non-magnetic substrates. The following two arrangements were used for the coated conductors between the inner and outer layers: (1) tape-on-tape and (2) alternate. The calculated magnetic flux profile around each coated conductor was visualized. In the case of the non-magnetic substrate, the magnetic field to which coated conductors in the outer layer are exposed contains a larger component perpendicular to the conductor wide face (perpendicular field component) than the field to which those in the inner layer are exposed. On the other hand, for the tape-on-tape arrangement of coated conductors with a magnetic substrate, the reverse is true. In the case of the alternate arrangement of coated conductors with a magnetic substrate, the magnetic field to which the coated conductors in both the inner and outer layers are exposed has only a small perpendicular field component. When using a non-magnetic substrate, the AC loss in the superconductor layer of the coated conductors in the two-layer cables is dominated by that in the outer layer, whereas the reverse is true in the case of a magnetic substrate. When comparing the AC losses in superconductor layers of coated conductors with non-magnetic and magnetic substrates in two-layer cables, the latter is larger than the former, but the influence of the magnetism of the substrates on AC losses in superconductor layers is not remarkable.

  9. Multi-site study of diffusion metric variability: effects of site, vendor, field strength, and echo time on regions-of-interest and histogram-bin analyses.

    Science.gov (United States)

    Helmer, K G; Chou, M-C; Preciado, R I; Gimi, B; Rollins, N K; Song, A; Turner, J; Mori, S

    2016-02-27

    It is now common for magnetic-resonance-imaging (MRI) based multi-site trials to include diffusion-weighted imaging (DWI) as part of the protocol. It is also common for these sites to possess MR scanners of different manufacturers, with different software and hardware, and different software licenses. These differences mean that scanners may not be able to acquire data with the same number of gradient amplitude values and number of available gradient directions. Variability can also occur in achievable b-values and minimum echo times. The challenge of a multi-site study, then, is to create a common protocol by understanding and then minimizing the effects of scanner variability and identifying reliable and accurate diffusion metrics. This study describes the effect of site, scanner vendor, field strength, and TE on two diffusion metrics: the first moment of the diffusion tensor field (mean diffusivity, MD) and the fractional anisotropy (FA), using two common analyses (region-of-interest and mean-bin value of whole-brain histograms). The goal of the study was to identify sources of variability in diffusion-sensitized imaging and their influence on commonly reported metrics. The results demonstrate that site, vendor, field strength, and echo time all contribute to variability in FA and MD, though to different extents. We conclude that characterization of the variability of DTI metrics due to site, vendor, field strength, and echo time is a worthwhile step in the construction of multi-center trials.
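    The two metrics studied, MD and FA, have standard closed-form definitions in terms of the three eigenvalues of the diffusion tensor; a minimal sketch of those textbook formulas (not code from the study):

    ```python
    import math

    def md_fa(eigenvalues):
        """Mean diffusivity (MD) and fractional anisotropy (FA)
        from the three eigenvalues of a diffusion tensor."""
        l1, l2, l3 = eigenvalues
        md = (l1 + l2 + l3) / 3.0  # first moment of the tensor
        num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
        den = l1 * l1 + l2 * l2 + l3 * l3
        fa = math.sqrt(1.5 * num / den) if den > 0 else 0.0
        return md, fa

    # Isotropic diffusion gives FA = 0; a single nonzero eigenvalue gives FA = 1.
    print(md_fa((1.0, 1.0, 1.0)))  # (1.0, 0.0)
    ```

    FA is dimensionless (0 to 1), while MD carries the diffusivity units of the eigenvalues (typically mm2/s).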

  10. Molecular- and cultivation-based analyses of microbial communities in oil field water and in microcosms amended with nitrate to control H{sub 2}S production

    Energy Technology Data Exchange (ETDEWEB)

    Kumaraswamy, Raji; Ebert, Sara; Fedorak, Phillip M.; Foght, Julia M. [Alberta Univ., Edmonton, AB (Canada). Biological Sciences; Gray, Murray R. [Alberta Univ., Edmonton, AB (Canada). Chemical and Materials Engineering

    2011-03-15

    Nitrate injection into oil fields is an alternative to biocide addition for controlling sulfide production ('souring') caused by sulfate-reducing bacteria (SRB). This study examined the suitability of several cultivation-dependent and cultivation-independent methods to assess potential microbial activities (sulfidogenesis and nitrate reduction) and the impact of nitrate amendment on oil field microbiota. Microcosms containing produced waters from two Western Canadian oil fields exhibited sulfidogenesis that was inhibited by nitrate amendment. Most probable number (MPN) and fluorescent in situ hybridization (FISH) analyses of uncultivated produced waters showed low cell numbers ({<=}10{sup 3} MPN/ml) dominated by SRB (>95% relative abundance). MPN analysis also detected nitrate-reducing sulfide-oxidizing bacteria (NRSOB) and heterotrophic nitrate-reducing bacteria (HNRB) at numbers too low to be detected by FISH or denaturing gradient gel electrophoresis (DGGE). In microcosms containing produced water fortified with sulfate, near-stoichiometric concentrations of sulfide were produced. FISH analyses of the microcosms after 55 days of incubation revealed that Gammaproteobacteria increased from undetectable levels to 5-20% abundance, resulting in a decreased proportion of Deltaproteobacteria (50-60% abundance). DGGE analysis confirmed the presence of Delta- and Gammaproteobacteria and also detected Bacteroidetes. When sulfate-fortified produced waters were amended with nitrate, sulfidogenesis was inhibited and Deltaproteobacteria decreased to levels undetectable by FISH, with a concomitant increase in Gammaproteobacteria from below detection to 50-60% abundance. DGGE analysis of these microcosms yielded sequences of Gamma- and Epsilonproteobacteria related to presumptive HNRB and NRSOB (Halomonas, Marinobacterium, Marinobacter, Pseudomonas and Arcobacter), thus supporting chemical data indicating that nitrate-reducing bacteria out-compete SRB when nitrate is

  11. Influences of Biodynamic and Conventional Farming Systems on Quality of Potato (Solanum Tuberosum L.) Crops: Results from Multivariate Analyses of Two Long-Term Field Trials in Sweden.

    Science.gov (United States)

    Kjellenberg, Lars; Granstedt, Artur

    2015-09-15

    The aim of this paper was to present results from two long-term field experiments comparing potato samples from conventional farming systems with samples from biodynamic farming systems. The principal component analyses (PCA) consistently exhibited differences between potato samples from the two farming systems. According to the PCA, potato samples treated with inorganic fertilizers exhibited a variation positively related to amounts of crude protein, yield, cooking or tissue discoloration and extract decomposition. Potato samples treated according to biodynamic principles, with composted cow manure, were more positively related to traits such as Quality- and EAA-indices, dry matter content, taste quality, relative proportion of pure protein and biocrystallization value. Distinctions between years, crop rotation and cultivars used were sometimes more significant than differences between manuring systems. Grown after barley, the potato crop exhibited better quality traits than when grown after ley in both the conventional and the biodynamic farming systems.

  12. Influences of Biodynamic and Conventional Farming Systems on Quality of Potato (Solanum Tuberosum L.) Crops: Results from Multivariate Analyses of Two Long-Term Field Trials in Sweden

    Directory of Open Access Journals (Sweden)

    Lars Kjellenberg

    2015-09-01

    Full Text Available The aim of this paper was to present results from two long-term field experiments comparing potato samples from conventional farming systems with samples from biodynamic farming systems. The principal component analyses (PCA) consistently exhibited differences between potato samples from the two farming systems. According to the PCA, potato samples treated with inorganic fertilizers exhibited a variation positively related to amounts of crude protein, yield, cooking or tissue discoloration and extract decomposition. Potato samples treated according to biodynamic principles, with composted cow manure, were more positively related to traits such as Quality- and EAA-indices, dry matter content, taste quality, relative proportion of pure protein and biocrystallization value. Distinctions between years, crop rotation and cultivars used were sometimes more significant than differences between manuring systems. Grown after barley, the potato crop exhibited better quality traits than when grown after ley in both the conventional and the biodynamic farming systems.
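    The PCA used in both versions of this study is the standard eigendecomposition of the trait covariance matrix; a minimal illustrative sketch (the toy data below are hypothetical, not the trial's measurements):

    ```python
    import numpy as np

    def pca(X, n_components=2):
        """Project samples (rows of X) onto the leading principal components
        of the column (trait) covariance matrix."""
        Xc = X - X.mean(axis=0)           # centre each trait
        cov = np.cov(Xc, rowvar=False)    # trait-by-trait covariance
        vals, vecs = np.linalg.eigh(cov)  # symmetric eigendecomposition
        order = np.argsort(vals)[::-1]    # sort by explained variance
        vals, vecs = vals[order], vecs[:, order]
        return Xc @ vecs[:, :n_components], vals

    # Toy data: 6 samples x 3 traits (e.g. crude protein, dry matter, yield)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 3))
    scores, variances = pca(X)
    print(scores.shape)  # (6, 2)
    ```

    Plotting the scores of the first two components is what makes groupwise separation, such as between the two farming systems, visible.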

  13. Should processed or raw image data be used in mammographic image quality analyses? A comparative study of three full-field digital mammography systems

    International Nuclear Information System (INIS)

    Borg, Mark; Badr, Ishmail; Royle, Gary

    2015-01-01

    The purpose of this study is to compare a number of measured image quality parameters using processed and unprocessed (raw) images in two full-field direct digital units and one computed radiography mammography system. This study shows that the difference between raw and processed image data is system-specific. The results have shown that there are no significant differences between raw and processed data in the mean threshold contrast values using the contrast-detail mammography phantom in any of the systems investigated; however, these results cannot be generalised to all available systems. Notable differences were noted in contrast-to-noise ratios and in other tests, including response function, modulation transfer function, noise equivalent quanta, normalised noise power spectra and detective quantum efficiency as specified in IEC 62220-1-2. Consequently, the authors strongly recommend the use of raw data for all image quality analyses in digital mammography. (authors)
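    The contrast-to-noise ratio in which the raw/processed differences showed up is commonly computed as the mean-signal difference between a detail region and the background, divided by the background noise; a minimal sketch under that common definition (a simplification of the full IEC measurement procedure, with hypothetical pixel values):

    ```python
    import statistics

    def cnr(signal_roi, background_roi):
        """Contrast-to-noise ratio of a detail ROI against a background ROI."""
        contrast = statistics.mean(signal_roi) - statistics.mean(background_roi)
        noise = statistics.stdev(background_roi)  # sample standard deviation
        return contrast / noise

    # Hypothetical pixel values from a phantom image
    detail = [110, 112, 108, 111, 109]
    background = [100, 102, 98, 101, 99]
    print(round(cnr(detail, background), 2))  # 6.32
    ```

    Because image processing rescales both the contrast and the noise, CNR computed on processed data need not match the raw-data value, which is one reason the authors favour raw data.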

  14. Geologic field notes and geochemical analyses of outcrop and drill core from Mesoproterozoic rocks and iron-oxide deposits and prospects of southeast Missouri

    Science.gov (United States)

    Day, Warren C.; Granitto, Matthew

    2014-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources/Missouri Geological Survey, undertook a study from 1988 to 1994 on the iron-oxide deposits and their host Mesoproterozoic igneous rocks in southeastern Missouri. The project resulted in an improvement of our understanding of the geologic setting, mode of formation, and composition of many of the known deposits and prospects and the associated rocks of the St. Francois terrane in Missouri. The goal of this earlier work was to allow comparison of the Missouri iron-oxide deposits with iron oxide-copper-gold ± uranium (IOCG) types of mineral deposits observed globally. The raw geochemical analyses were released originally through the USGS National Geochemical Database (NGDB, http://mrdata.usgs.gov). The data presented herein offer all of the field notes, locations, rock descriptions, and geochemical analyses in a coherent package to facilitate new research efforts on IOCG deposit types. The data are provided in both Microsoft Excel (Version Office 2010) spreadsheet format (*.xlsx) and MS-DOS text format (*.txt) for ease of use by numerous computer programs.

  15. Development of cold moderator vessel for the spallation neutron source. Flow field measurements and thermal hydraulic analyses in cold moderator vessel

    International Nuclear Information System (INIS)

    Aso, Tomokazu; Kaminaga, Masanori; Terada, Atsuhiko; Hino, Ryutaro

    2001-01-01

    The Japan Atomic Energy Research Institute is developing a several-MW-scale spallation target system under the High-Intensity Accelerator Project. A cold moderator using supercritical hydrogen is one of the key components in the target system, and it directly affects the neutronic performance both in intensity and resolution. Since a hydrogen temperature rise in the moderator vessel affects the neutronic performance, it is necessary to suppress the recirculating and stagnant flows that cause hot spots. To support the conceptual design of the moderator structure now in progress, the flow field was measured using a PIV (Particle Image Velocimetry) system under water flow conditions in a flat model that simulated a moderator vessel. From these results, the flow field, including recirculating and stagnant flows, was clarified. The hydraulic analytical results using the standard k-ε model agreed well with the experimental results. Thermal-hydraulic analyses of the moderator vessel were carried out under liquid hydrogen conditions. Based on these results, we clarified the possibility of suppressing the local temperature rise to within 3 K under the 2 MW operating condition. (author)

  16. A flow injection analyser conductometric coupled system for the field analysis of free dissolved CO{sub 2} and total dissolved inorganic carbon in natural waters

    Energy Technology Data Exchange (ETDEWEB)

    Martinotti, Valter; Balordi, Marcella; Ciceri, Giovanni [RSE SpA - Environment and Sustainable Development Department, Milan (Italy)

    2012-05-15

    A flow injection analyser coupled with a gas diffusion membrane and a conductometric microdetector was adapted for the field analysis of natural concentrations of free dissolved CO{sub 2} and dissolved inorganic carbon in natural waters and used in a number of field campaigns for marine water monitoring. The dissolved gaseous CO{sub 2}, present naturally or generated by acidification of the sample, is separated by diffusion through a hydrophobic semipermeable gas-porous membrane, and the permeating gas is incorporated into a stream of deionised water and measured by means of an electrical conductometric microdetector. In order to make the system suitable and easy to use for in-field measurements aboard oceanographic ships, the individual components of the analyser were compacted into a robust, easy-to-use system. Calibration of the system is carried out using standard solutions of potassium bicarbonate at two concentration ranges. Calibration and sample measurements are carried out inside a temperature-controlled chamber at 25 °C and in an inert atmosphere (N{sub 2}). The detection and quantification limits of the method, evaluated as 3 and 10 times the standard deviation of a series of measurements of the matrix solution, were 2.9 and 9.6 {mu}mol/kg of CO{sub 2}, respectively. Data quality for dissolved inorganic carbon was checked with replicate measurements of a certified reference material (A. Dickson, Scripps Institution of Oceanography, University of California, San Diego); accuracy and repeatability were -3.3% and 10%, respectively. Optimization and performance qualification of the system and its application to various natural water samples are reported and discussed. In the future, the calibration step will be automated in order to improve the analytical performance, and applicability will be extended in the course of experimental surveys carried out in both marine and freshwater ecosystems. Considering the present stage of
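    The 3- and 10-standard-deviation convention used for the detection and quantification limits can be sketched directly (the replicate blank values below are hypothetical):

    ```python
    import statistics

    def detection_limits(blank_measurements):
        """Limit of detection (3*sigma) and limit of quantification (10*sigma)
        from replicate measurements of a blank/matrix solution."""
        sigma = statistics.stdev(blank_measurements)  # sample standard deviation
        return 3 * sigma, 10 * sigma

    # Hypothetical replicate blank readings, in umol/kg of CO2
    blanks = [1.0, 1.3, 0.8, 1.1, 0.9, 1.2]
    lod, loq = detection_limits(blanks)
    print(lod, loq)
    ```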

  17. 29 November 2013 - U. Humphrey Orjiako Nigerian Ambassador Extraordinary and Plenipotentiary Permanent Mission to the United Nations Office and other international organisations in Geneva signing the Guest Book with Head of International Relations R. Voss, visiting the LHC tunnel at Point 2 and the ALICE cavern with ALICE Collaboration Deputy Spokesperson Y. Schutz.

    CERN Multimedia

    Noemi Caraban

    2013-01-01

    29 November 2013 - U. Humphrey Orjiako Nigerian Ambassador Extraordinary and Plenipotentiary Permanent Mission to the United Nations Office and other international organisations in Geneva signing the Guest Book with Head of International Relations R. Voss, visiting the LHC tunnel at Point 2 and the ALICE cavern with ALICE Collaboration Deputy Spokesperson Y. Schutz.

  18. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    International Nuclear Information System (INIS)

    Ando, Masami; Sunaguchi, Naoki; Wu, Yanlin; Do, Synho; Sung, Yongjin; Gupta, Rajiv; Louissaint, Abner; Yuasa, Tetsuya; Ichihara, Shu

    2014-01-01

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  19. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Masami [RIST, Tokyo University of Science, Noda, Chiba (Japan); Sunaguchi, Naoki [Gunma University, Graduate School of Engineering, Kiryu, Gunma (Japan); Wu, Yanlin [The Graduate University for Advanced Studies, Department of Materials Structure Science, School of High Energy Accelerator Science, Tsukuba, Ibaraki (Japan); Do, Synho; Sung, Yongjin; Gupta, Rajiv [Massachusetts General Hospital and Harvard Medical School, Department of Radiology, Boston, MA (United States); Louissaint, Abner [Massachusetts General Hospital and Harvard Medical School, Department of Pathology, Boston, MA (United States); Yuasa, Tetsuya [Yamagata University, Faculty of Engineering, Yonezawa, Yamagata (Japan); Ichihara, Shu [Nagoya Medical Center, Department of Pathology, Nagoya, Aichi (Japan)

    2014-02-15

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  20. On the spatio-temporal and energy-dependent response of riometer absorption to electron precipitation: drift-time and conjunction analyses in realistic electric and magnetic fields

    Science.gov (United States)

    Kellerman, Adam; Shprits, Yuri; Makarevich, Roman; Donovan, Eric; Zhu, Hui

    2017-04-01

    Riometers are low-cost passive radiowave instruments located in both the northern and southern hemispheres that are capable of operating during quiet and disturbed conditions. Many instruments have been operating continuously for multiple solar cycles, making them a useful tool for long-term statistical studies and for real-time analysis and forecasting of space weather. Here we present recent and new analyses of the relationship between the riometer-measured cosmic noise absorption and electron precipitation into the D-region and lower E-region ionosphere. We utilize two techniques: a drift-time analysis in realistic electric and magnetic field models, where a particle is traced from one location to another and the energy is determined by the time delay between similar observations; and a conjunction analysis, where we directly compare precipitated fluxes from THEMIS and the Van Allen Probes with the riometer absorption. In both cases we present a statistical analysis of the response of riometer absorption to electron precipitation as a function of MLAT, MLT, and geomagnetic conditions.

  1. Analyses of TIMS and AVIRIS data, integrated with field and laboratory spectra, for lithological and mineralogical interpretation of Vulcano Island, Italy

    Science.gov (United States)

    Buongiorno, M. Fabrizia; Bogliolo, M. Paola; Salvi, Stefano; Pieri, David C.; Geneselli, Francesco

    1995-01-01

    Vulcano Island is part of the Eolian archipelago, located about 25 km from the northeast coast of Sicily. The archipelago comprises seven major volcanic islands, two of which are active volcanoes (Vulcano and Stromboli). Vulcano covers an area of about 50 square km, and is about 10 km long. Explosive volcanic activity has predominated in the geological evolution of Vulcano Island, and there is no evidence that this pattern has ceased. Rather, the current situation is one of unrest, so a strict regimen of continuous geophysical and geochemical monitoring has been undertaken over the last decade. Though the year-round population of Vulcano is small (under 1000), during the summer the island becomes a very popular resort, and has thousands of additional tourists at any time throughout the high season, thus substantially increasing the number of people potentially at risk from an explosive eruption or other hazards such as noxious gas emissions (e.g., CO2, H2S, SO2). During the past ten years, remote sensing data have been repetitively acquired with optical and microwave airborne sensors. The present work shows the preliminary results of a study based on the integration of various remote sensing data sets with field spectroscopy, and other laboratory analyses, for the geological and geomorphological mapping of the island. It is hoped that such work will also usefully contribute to the evaluation of the volcanic hazard potential of the islands as well as to the evaluation of the status of its current activity.

  2. Detection of progression of glaucomatous visual field damage using the point-wise method with the binomial test.

    Science.gov (United States)

    Karakawa, Ayako; Murata, Hiroshi; Hirasawa, Hiroyo; Mayama, Chihiro; Asaoka, Ryo

    2013-01-01

    To compare the performance of the newly proposed point-wise linear regression (PLR) with the binomial test (binomial PLR) against mean deviation (MD) trend analysis and permutation analyses of PLR (PoPLR), in detecting global visual field (VF) progression in glaucoma. 15 VFs (Humphrey Field Analyzer, SITA standard, 24-2) were collected from 96 eyes of 59 open-angle glaucoma patients (6.0 ± 1.5 [mean ± standard deviation] years). Using the total deviation of each point on the 2nd to 16th VFs (VF2-16), linear regression analysis was carried out. The numbers of VF test points with a significant trend at various probability levels (pbinomial test (one-side). A VF series was defined as "significant" if the median p-value from the binomial test was binomial PLR method (0.14 to 0.86) was significantly higher than MD trend analysis (0.04 to 0.89) and PoPLR (0.09 to 0.93). The PIS of the proposed method (0.0 to 0.17) was significantly lower than the MD approach (0.0 to 0.67) and PoPLR (0.07 to 0.33). The PBNS of the three approaches were not significantly different. The binomial PLR method gives more consistent results than MD trend analysis and PoPLR, hence it will be helpful as a tool to 'flag' possible VF deterioration.
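    The record's inequality signs are garbled, but the core binomial-PLR idea, asking whether the number of test points with a significant negative slope exceeds what chance alone would produce across the 52 points, can be sketched as follows (the counts and α level are illustrative assumptions, not the paper's thresholds):

    ```python
    from math import comb

    def binomial_p_value(k_significant, n_points, alpha):
        """One-sided probability of observing >= k significant points among
        n_points if each point is a false positive with probability alpha."""
        return sum(
            comb(n_points, i) * alpha**i * (1 - alpha) ** (n_points - i)
            for i in range(k_significant, n_points + 1)
        )

    # E.g. 7 of 52 points with a significant negative slope at alpha = 0.05
    p = binomial_p_value(7, 52, 0.05)
    print(p < 0.05)  # True: unlikely to arise from false positives alone
    ```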

  3. fields

    Directory of Open Access Journals (Sweden)

    Brad J. Arnold

    2014-07-01

    Full Text Available Surface irrigation, such as flood or furrow, is the predominant form of irrigation in California for agronomic crops. Compared to other irrigation methods, however, it is inefficient in terms of water use; large quantities of water, instead of being used for crop production, are lost to excess deep percolation and tail runoff. In surface-irrigated fields, irrigators commonly cut off the inflow of water when the water advance reaches a familiar or convenient location downfield, but this experience-based strategy has not been very successful in reducing the tail runoff water. Our study compared conventional cutoff practices to a retroactively applied model-based cutoff method in four commercially producing alfalfa fields in Northern California, and evaluated the model using a simple sensor system for practical application in typical alfalfa fields. These field tests illustrated that the model can be used to reduce tail runoff in typical surface-irrigated fields, and using it with a wireless sensor system saves time and labor as well as water.

  4. Assessing microbial degradation of o-xylene at field-scale from the reduction in mass flow rate combined with compound-specific isotope analyses

    Science.gov (United States)

    Peter, A.; Steinbach, A.; Liedl, R.; Ptak, T.; Michaelis, W.; Teutsch, G.

    2004-07-01

    In recent years, natural attenuation (NA) has evolved into a possible remediation alternative, especially in the case of BTEX spills. In order to be approved by the regulators, biodegradation needs to be demonstrated which requires efficient site investigation and monitoring tools. Three methods—the Integral Groundwater Investigation method, the compound-specific isotope analysis (CSIA) and a newly developed combination of both—were used in this work to quantify at field scale the biodegradation of o-xylene at a former gasworks site which is heavily contaminated with BTEX and PAHs. First, the Integral Groundwater Investigation method [Schwarz, R., Ptak, T., Holder, T., Teutsch, G., 1998. Groundwater risk assessment at contaminated sites: a new investigation approach. In: Herbert, M. and Kovar, K. (Editors), GQ'98 Groundwater Quality: Remediation and Protection. IAHS Publication 250, pp. 68-71; COH 4 (2000) 170] was applied, which allows the determination of mass flow rates of o-xylene by integral pumping tests. Concentration time series obtained during pumping at two wells were used to calculate inversely contaminant mass flow rates at the two control planes that are defined by the diameter of the maximum isochrone. A reactive transport model was used within a Monte Carlo approach to identify biodegradation as the dominant process for reduction in the contaminant mass flow rate between the two consecutive control planes. Secondly, compound-specific carbon isotope analyses of o-xylene were performed on the basis of point-scale samples from the same two wells. The Rayleigh equation was used to quantify the degree of biodegradation that occurred between the wells. Thirdly, a combination of the Integral Groundwater Investigation method and the compound-specific isotope analysis was developed and applied. It comprises isotope measurements during the integral pumping tests and the evaluation of δ13C time series by an inversion algorithm to obtain spatially
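    The Rayleigh-equation step, converting the downstream enrichment in δ13C into a fraction of o-xylene biodegraded, can be sketched using the common linearised approximation δt − δ0 ≈ ε ln f (the isotope values and enrichment factor below are illustrative, not the study's):

    ```python
    import math

    def fraction_biodegraded(delta_source, delta_downstream, enrichment_factor):
        """Rayleigh model: remaining fraction f = exp((delta_t - delta_0) / eps),
        with the enrichment factor eps (permil) negative for normal isotope
        fractionation; the biodegraded fraction is B = 1 - f."""
        f_remaining = math.exp((delta_downstream - delta_source) / enrichment_factor)
        return 1.0 - f_remaining

    # Hypothetical delta13C values (permil) and enrichment factor
    B = fraction_biodegraded(-24.0, -22.0, -1.5)
    print(round(B, 2))  # 0.74, i.e. about 74% degraded between the wells
    ```

    The enrichment factor is compound- and pathway-specific, so in practice it must come from laboratory degradation experiments.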

  5. FY1995 report on the analyses of functional living systems using magnetic stimulation and magnetic fields; 1995 nendo jiki shigeki oyobi kyojiba ni yoru seitai kino kaimei

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The purpose of the project is to widen the understanding of the biological effects of magnetic fields and to search for potential applications of biomagnetics in medical diagnosis and treatment. We developed a method of localized magnetic stimulation of the brain. By concentrating induced eddy currents on a target with a pair of opposing pulsed magnetic fields produced by a figure-eight coil, we were able to stimulate the human cortex within a 5 mm resolution. We studied the properties of diamagnetic water in static magnetic fields: the surface of the water was observed to be pushed back where the magnetic field gradient was higher. The behavior of oxygen dissolved in an aqueous solution under magnetic fields of up to 8 T with a gradient of 50 T/m was studied. For oxygen concentrations greater than 11 mg/l, a clear redistribution of dissolved oxygen was observed. The effects of strong magnetic fields on the dissolution of fibrin clots were also studied. Fibrin polymers in water drifted magnetophoretically in the direction of increasing magnetic field, and the dissolution of fibrin polymers by plasmin was accelerated. (NEDO)

  6. The groningen longitudinal glaucoma study III. The predictive value of frequency-doubling perimetry and GDx nerve fibre analyser test results for the development of glaucomatous visual field loss

    NARCIS (Netherlands)

    Heeg, G. P.; Jansonius, N. M.

    Purpose: To investigate whether frequency-doubling perimetry (FDT) and nerve fibre analyser (GDx) test results are able to predict glaucomatous visual field loss in glaucoma suspect patients. Methods: A large cohort of glaucoma suspect patients (patients with ocular hypertension or a positive family

  7. Morning Glory Disc Anomaly: A Case Report

    African Journals Online (AJOL)

    Ocular complications in affected eye may include strabismus, retinal detachment and reduced visual acuity. Contralateral ... near acuity was OD J3 and OS J1+ according to Jaeger. ... Using the Automated Humphrey Field analyser (Carl Zeiss.

  8. Evaluating the performance of commonly used gas analysers for methane eddy covariance flux measurements: the InGOS inter-comparison field experiment

    NARCIS (Netherlands)

    Peltola, O.; Hensen, A.; Helfter, C.; Belelli Marchesini, L.; Bosveld, F.C.; Bulk, van de W.C.M.; Elbers, J.A.; Haapanala, S.; Holst, J.; Laurila, T.; Lindroth, A.; Nemitz, E.; Röckmann, T.; Vermeulen, A.T.; Mammarella, I.

    2014-01-01

    The performance of eight fast-response methane (CH4) gas analysers suitable for eddy covariance flux measurements was tested at a grassland site near the Cabauw tall tower (Netherlands) during June 2012. The instruments were positioned close to each other in order to minimise the effect of varying

  9. Evaluating the performance of commonly used gas analysers for methane eddy covariance flux measurements : the InGOS inter-comparison field experiment

    NARCIS (Netherlands)

    Peltola, O.; Hensen, A.; Helfter, C.; Belelli Marchesini, L.; Bosveld, F. C.; Van Den Bulk, W. C M; Elbers, J. A.; Haapanala, S.; Holst, J.; Laurila, T.; Lindroth, A.; Nemitz, E.; Röckmann, T.; Vermeulen, A. T.; Mammarella, I.

    2014-01-01

    The performance of eight fast-response methane (CH4) gas analysers suitable for eddy covariance flux measurements was tested at a grassland site near the Cabauw tall tower (Netherlands) during June 2012. The instruments were positioned close to each other in order to minimise the effect of varying

  10. Evaluating the performance of commonly used gas analysers for methane eddy covariance flux measurements: the InGOS inter-comparison field experiment

    NARCIS (Netherlands)

    Peltola, O.; Hensen, A.; Helfter, C.; Belelli Marchesini, L.; Bosveld, F. C.; Van Den Bulk, W. C. M.; Elbers, J. A.; Haapanala, S.; Holst, J.; Laurila, T.; Lindroth, A.; Nemitz, E.; Röckmann, T.; Vermeulen, A. T.; Mammarella, I.

    2014-01-01

    The performance of eight fast-response methane (CH4) gas analysers suitable for eddy covariance flux measurements were tested at a grassland site near the Cabauw tall tower (Netherlands) during June 2012. The instruments were positioned close to each other in order to minimize the effect of varying

  11. Genetic variation in growth, carbon isotope discrimination, and foliar N concentration in Picea mariana: analyses from a half-diallel mating design using field-grown trees

    Science.gov (United States)

    Kurt H. Johnsen; Lawrence B. Flanagan; Dudley A. Huber; John E. Major

    1999-01-01

The authors performed genetic analyses of growth, carbon isotope discrimination (δ13C), and foliar N concentration using a half-diallel subset of a 7 × 7 complete diallel planted on three sites ranging in water availability. Trees were 22 years old. Heritabilities; general and...

  12. Evaluating the performance of commonly used gas analysers for methane eddy covariance flux measurements: the InGOS inter-comparison field experiment

    Science.gov (United States)

    Peltola, O.; Hensen, A.; Helfter, C.; Belelli Marchesini, L.; Bosveld, F. C.; van den Bulk, W. C. M.; Elbers, J. A.; Haapanala, S.; Holst, J.; Laurila, T.; Lindroth, A.; Nemitz, E.; Röckmann, T.; Vermeulen, A. T.; Mammarella, I.

    2014-06-01

The performance of eight fast-response methane (CH4) gas analysers suitable for eddy covariance flux measurements was tested at a grassland site near the Cabauw tall tower (Netherlands) during June 2012. The instruments were positioned close to each other in order to minimise the effect of varying turbulent conditions. The moderate CH4 fluxes observed at the location, of the order of 25 nmol m-2 s-1, provided a suitable signal for testing the instruments' performance. Generally, all analysers tested were able to quantify the concentration fluctuations at the frequency range relevant for turbulent exchange and were able to deliver high-quality data. The tested cavity ringdown spectrometer (CRDS) instruments from Picarro, models G2311-f and G1301-f, were superior to other CH4 analysers with respect to instrumental noise. As an open-path instrument susceptible to the effects of rain, the LI-COR LI-7700 achieved lower data coverage and also required larger density corrections; however, the system is especially useful for remote sites that are restricted in power availability. In this study the open-path LI-7700 results were compromised due to a data acquisition problem in our data-logging setup. Some of the older closed-path analysers tested do not measure H2O concentrations alongside CH4 (i.e. FMA1 and DLT-100 by Los Gatos Research) and this complicates data processing since the required corrections for dilution and spectroscopic interactions have to be based on external information. To overcome this issue, we used H2O mole fractions measured by other gas analysers, adjusted them with different methods and then applied them to correct the CH4 fluxes. Following this procedure we estimated a bias of the order of 0.1 g (CH4) m-2 (8% of the measured mean flux) in the processed and corrected CH4 fluxes on a monthly scale due to missing H2O concentration measurements. Finally, cumulative CH4 fluxes over 14 days from three closed-path gas analysers, G2311-f (Picarro Inc
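The dilution part of the correction described above converts the measured (wet) CH4 mole fraction to a dry mole fraction using the H2O mole fraction; when the analyser does not measure H2O itself, that value must come from an external instrument, as in the study. A minimal sketch of the conversion (function and variable names are illustrative; the spectroscopic part of the correction is instrument-specific and omitted):

```python
def dry_mole_fraction(ch4_wet_ppb, h2o_mole_fraction):
    """Convert a wet CH4 mole fraction (ppb) to a dry one.

    h2o_mole_fraction is mol H2O per mol of moist air, e.g. from a
    co-located gas analyser when the CH4 instrument lacks an H2O channel.
    """
    if not 0.0 <= h2o_mole_fraction < 1.0:
        raise ValueError("H2O mole fraction must be in [0, 1)")
    # Removing the water vapour fraction rescales CH4 to dry air
    return ch4_wet_ppb / (1.0 - h2o_mole_fraction)

# 1900 ppb measured wet with 2% water vapour -> about 1938.8 ppb dry
print(round(dry_mole_fraction(1900.0, 0.02), 1))  # → 1938.8
```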

  13. [The most cited themes in the research in the field of Mental Health: analyses of six international nursing and medical journals].

    Science.gov (United States)

    Cunico, Laura; Fredo, Susanna; Bernini, Massimo

    2017-01-01

The review aimed to identify and analyse future developments in the field by analysing the main themes discussed in a number of scientific journals focused on Mental Health, by both nurses and physicians. Four international journals focused on mental health and psychiatry (International Journal of Mental Health Nursing, Archives of Psychiatric Nursing, American Journal of Psychiatry, Australian and New Zealand Journal of Psychiatry) as well as two journals focused generically on health (Journal of Advanced Nursing and Lancet) were scrutinized. We analysed the papers of 2012-2015 for the specialised journals, and the last and first 6 months of 2012 and 2013 plus 2014-2015 for the generic ones. Editorials, comments and contributions regarding theoretical models were excluded. From the analysis we identified 9 themes and, for each theme, the pertinent categories. For the diagnostic grouping we used the International Statistical Classification of Diseases and Related Health Problems, 10th Revision. According to the 2099 abstracts analysed, a trend emerged in research about mood disorders, schizophrenia, addictions and comorbidity. Within medical research, antidepressants were the most studied psychotropic medication and cognitive behaviour therapy was the most studied psychotherapy. Within nursing research: the nurse-patient relationship, adherence and monitoring of pharmacological therapy, treatment planning and the working environment, and nursing training and its efficacy. Clinical research trials were twice as frequent in medical as in nursing research, where qualitative research prevails. The research challenge will be to find a new paradigm fit for a future psychiatry that has the patient's genome at its disposition and needs to routinely use biomarkers for personalised therapy. A further challenge might be the promotion of interprofessional research between doctors and nurses and the acquisition of the new competences health professionals need to tackle the

  14. Reprint of "Sequence and phylogenetic analyses of novel totivirus-like double-stranded RNAs from field-collected powdery mildew fungi".

    Science.gov (United States)

    Kondo, Hideki; Hisano, Sakae; Chiba, Sotaro; Maruyama, Kazuyuki; Andika, Ida Bagus; Toyoda, Kazuhiro; Fujimori, Fumihiro; Suzuki, Nobuhiro

    2016-07-02

The identification of mycoviruses contributes greatly to understanding of the diversity and evolutionary aspects of viruses. Powdery mildew fungi are important and widely studied obligate phytopathogenic agents, but there has been no report on mycoviruses infecting these fungi. In this study, we used a deep sequencing approach to analyze the double-stranded RNA (dsRNA) segments isolated from field-collected samples of powdery mildew fungus-infected red clover plants in Japan. Database searches identified the presence of at least ten totivirus (genus Totivirus)-like sequences, termed red clover powdery mildew-associated totiviruses (RPaTVs). The majority of these sequences shared moderate amino acid sequence identity with each other (powdery mildew fungus populations infecting red clover plants in the field.

  15. Spurious effects of electron emission from the grids of a retarding field analyser on secondary electron emission measurements. Results on a (111) copper single crystal

    International Nuclear Information System (INIS)

    Pillon, J.; Roptin, D.; Cailler, M.

    1976-01-01

    Spurious effects of a four grid retarding field analyzer were studied for low energy secondary electron measurements. Their behavior was investigated and two peaks in the energy spectrum were interpreted as resulting from tertiary electrons from the grids. It was shown that the true secondary electron peak has to be separated from these spurious peaks. The spectrum and the yields sigma and eta obtained for a Cu(111) crystal after a surface cleanness control by Auger spectroscopy are given

  16. Succession of methanogenic archaea in rice straw incorporated into a Japanese rice field: estimation by PCR-DGGE and sequence analyses

    Directory of Open Access Journals (Sweden)

    Atsuo Sugano

    2005-01-01

The succession and phylogenetic profiles of methanogenic archaeal communities associated with rice straw decomposition in rice-field soil were studied by polymerase chain reaction-denaturing gradient gel electrophoresis (PCR-DGGE) analysis followed by 16S rDNA sequencing. Nylon bags containing either leaf sheaths or blades were buried in the plowed layer of a Japanese rice field under drained conditions during the off-crop season and under flooded conditions after transplanting. In addition, rice straw samples that had been buried in the rice field under drained conditions during the off-crop season were temporarily removed during spring plowing and then re-buried in the same rice field under flooded conditions at transplanting. Populations of methanogenic archaea were examined by amplification of the 16S rRNA genes in the DNA extracted from the rice straw samples. No PCR product was produced for samples of leaf sheath or blade prior to burial or after burial under drained conditions, indicating that the methanogen population was very small during decomposition of rice straw under oxic conditions. Many common bands were observed in samples of leaf sheath and blade during decomposition of rice straw under flooded conditions. Cluster analysis based on DGGE patterns divided the methanogenic archaeal communities into two groups, before and after the mid-season drainage. Sequence analysis showed that the commonly present DGGE bands were closely related to Methanomicrobiales and Rice cluster I. Methanomicrobiales, Rice cluster I and Methanosarcinales were major members before the mid-season drainage, whereas the DGGE bands that characterized the methanogenic archaeal communities after the mid-season drainage were closely related to Methanomicrobiales. These results indicate that mid-season drainage affected the methanogenic archaeal communities irrespective of their location on the rice straw (sheath or blade) and the previous history of decomposition.

  17. Learning outcomes of in-person and virtual field-based geoscience instruction at Grand Canyon National Park: complementary mixed-methods analyses

    Science.gov (United States)

    Semken, S. C.; Ruberto, T.; Mead, C.; Bruce, G.; Buxner, S.; Anbar, A. D.

    2017-12-01

    Students with limited access to field-based geoscience learning can benefit from immersive, student-centered virtual-reality and augmented-reality field experiences. While no digital modalities currently envisioned can truly supplant field-based learning, they afford students access to geologically illustrative but inaccessible places on Earth and beyond. As leading producers of immersive virtual field trips (iVFTs), we investigate complementary advantages and disadvantages of iVFTs and in-person field trips (ipFTs). Settings for our mixed-methods study were an intro historical-geology class (n = 84) populated mostly by non-majors and an advanced Southwest geology class (n = 39) serving mostly majors. Both represent the diversity of our urban Southwestern research university. For the same credit, students chose either an ipFT to the Trail of Time (ToT) Exhibition at Grand Canyon National Park (control group) or an online Grand Canyon iVFT (experimental group), in the same time interval. Learning outcomes for each group were identically drawn from elements of the ToT and assessed using pre/post concept sketching and inquiry exercises. Student attitudes and cognitive-load factors for both groups were assessed pre/post using the PANAS instrument (Watson et al., 1998) and with affective surveys. Analysis of pre/post concept sketches indicated improved knowledge in both groups and classes, but more so in the iVFT group. PANAS scores from the intro class showed the ipFT students having significantly stronger (p = .004) positive affect immediately prior to the experience than the iVFT students, possibly reflecting their excitement about the trip to come. Post-experience, the two groups were no longer significantly different, possibly due to the fatigue associated with a full-day ipFT. Two lines of evidence suggest that the modalities were comparable in expected effectiveness. First, the information relevant for the concept sketch was specifically covered in both

  18. Analyses of turbulent flow fields and aerosol dynamics of diesel engine exhaust inside two dilution sampling tunnels using the CTAG model.

    Science.gov (United States)

    Wang, Yan Jason; Yang, Bo; Lipsky, Eric M; Robinson, Allen L; Zhang, K Max

    2013-01-15

    Experimental results from laboratory emission testing have indicated that particulate emission measurements are sensitive to the dilution process of exhaust using fabricated dilution systems. In this paper, we first categorize the dilution parameters into two groups: (1) aerodynamics (e.g., mixing types, mixing enhancers, dilution ratios, residence time); and (2) mixture properties (e.g., temperature, relative humidity, particle size distributions of both raw exhaust and dilution gas). Then we employ the Comprehensive Turbulent Aerosol Dynamics and Gas Chemistry (CTAG) model to investigate the effects of those parameters on a set of particulate emission measurements comparing two dilution tunnels, i.e., a T-mixing lab dilution tunnel and a portable field dilution tunnel with a type of coaxial mixing. The turbulent flow fields and aerosol dynamics of particles are simulated inside two dilution tunnels. Particle size distributions under various dilution conditions predicted by CTAG are evaluated against the experimental data. It is found that in the area adjacent to the injection of exhaust, turbulence plays a crucial role in mixing the exhaust with the dilution air, and the strength of nucleation dominates the level of particle number concentrations. Further downstream, nucleation terminates and the growth of particles by condensation and coagulation continues. Sensitivity studies reveal that a potential unifying parameter for aerodynamics, i.e., the dilution rate of exhaust, plays an important role in new particle formation. The T-mixing lab tunnel tends to favor the nucleation due to a larger dilution rate of the exhaust than the coaxial mixing field tunnel. Our study indicates that numerical simulation tools can be potentially utilized to develop strategies to reduce the uncertainties associated with dilution samplings of emission sources.

  19. Production data from five major geothermal fields in Nevada analysed using a physiostatistical algorithm developed for oil and gas: temperature decline forecasts and type curves

    Science.gov (United States)

    Kuzma, H. A.; Golubkova, A.; Eklund, C.

    2015-12-01

Nevada has the second largest output of geothermal energy in the United States (after California) with 14 major power plants producing over 425 megawatts of electricity, meeting 7% of the state's total energy needs. A number of wells, particularly older ones, have shown significant temperature and pressure declines over their lifetimes, adversely affecting economic returns. Production declines are almost universal in the oil and gas (O&G) industry. BetaZi (BZ) is a proprietary algorithm which uses a physiostatistical model to forecast production from the past history of O&G wells and to generate "type curves" which are used to estimate the production of undrilled wells. Although BZ was designed and calibrated for O&G, it is a general purpose diffusion equation solver, capable of modeling complex fluid dynamics in multi-phase systems. In this pilot study, it is applied directly to the temperature data from five Nevada geothermal fields. With the data appropriately normalized, BZ is shown to accurately predict temperature declines. The figure shows several examples of BZ forecasts using historic data from Steamboat Hills field near Reno. BZ forecasts were made using temperature on a normalized scale (blue) with two years of data held out for blind testing (yellow). The forecast is returned in terms of percentiles of probability (red) with the median forecast marked (solid green). Actual production is expected to fall within the majority of the red bounds 80% of the time. Blind tests such as these are used to verify that the probabilistic forecast can be trusted. BZ is also used to compute an accurate type temperature profile for wells that have yet to be drilled. These forecasts can be combined with estimated costs to evaluate the economics and risks of a project or potential capital investment. It is remarkable that an algorithm developed for oil and gas can accurately predict temperature in geothermal wells without significant recasting.
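BetaZi itself is proprietary, so as a much simpler illustration of decline-curve fitting on normalized temperature data, here is a log-linear least-squares fit of an exponential decline (purely a sketch; it does not reproduce the probabilistic physiostatistical model or its percentile forecasts):

```python
import math

def fit_exponential_decline(years, temps):
    """Fit T(t) = T0 * exp(-d * t) by linear regression on log(T).

    Returns (T0, d): the initial normalized temperature and the
    decline rate per year. A stand-in illustration, not BetaZi.
    """
    logs = [math.log(t) for t in temps]
    n = len(years)
    mx = sum(years) / n
    my = sum(logs) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(years, logs)) / \
            sum((x - mx) ** 2 for x in years)
    return math.exp(my - slope * mx), -slope

# Synthetic well: normalized T0 = 1.0 declining 5% per year
years = [0, 1, 2, 3, 4]
temps = [math.exp(-0.05 * t) for t in years]
t0, d = fit_exponential_decline(years, temps)
print(round(t0, 3), round(d, 3))  # → 1.0 0.05
```

On noise-free exponential data the log-linear fit recovers the parameters exactly; real well histories would require the robust probabilistic treatment the abstract describes.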

  20. Sorption data base for the cementitious near-field of L/ILW and ILW repositories for provisional safety analyses for SGT-E2

    International Nuclear Information System (INIS)

    Wieland, E.

    2014-11-01

    The near-field of the planned Swiss repositories for low- and intermediate-level waste (L/ILW) and long-lived intermediate-level waste (ILW) consists of large quantities of cementitious materials. Hardened cement paste (HCP) is considered to be the most important sorbing material present in the near-field of L/ILW and ILW repositories. Interaction of radionuclides with HCP represents the most important mechanism retarding their migration from the near-field into the host rock. This report describes a cement sorption data base (SDB) for the safety-relevant radionuclides in the waste that will be disposed of in the L/ILW and ILW repositories. The current update on sorption values for radionuclides should be read in conjunction with the earlier SDBs CEM-94, CEM-97 and CEM-02. Sorption values have been selected based on procedures reported in these earlier SDBs. The values are revised if corresponding new information and/or data are available. The basic information results from a survey of sorption studies published between 2002 and 2013. The sorption values recommended in this report have either been selected from in-house experimental studies or from literature data, and they were further assessed with a view to the sorption values recently published in the framework of the safety analysis for the planned near surface disposal facility in Belgium. The report summarizes the sorption properties of HCP and compiles sorption values for safety-relevant radionuclides and low-molecular weight organic molecules on undisturbed and degraded HCP. A list of the safety-relevant radionuclides is provided. The radionuclide inventories are determined by the waste streams to be disposed of in the L/ILW and ILW repositories. Information on the elemental and mineral composition of HCP was obtained from hydration studies. The concentrations of the most important impurity elements in cement were obtained from dissolution studies on HCP. Particular emphasis is placed on summarizing our

  1. Higher education and local communities | Humphreys | South ...

    African Journals Online (AJOL)

    ... and advocate for human resource development and the third, as a service provider, building intellectual capital. The article examines the proposition that the most significant contribution that a university or technikon can make to the development of a locality derives from its recruitment of students from the local community.

  3. Neonatal Vitamin A supplementation | Humphrey | South African ...

    African Journals Online (AJOL)

    Vitamin A deficiency is a major public health problem throughout the developing world, affecting an estimated 124 million young children and accounting for more than 1 million child deaths each year.' A meta-analysis of eight controlled trials estimated that community-based vitamin A supplementation resulted in a 23% ...

  4. Project 'WINDBANK mittleres Aaretal' - Analysis, Diagnosis and Forecast of Wind Fields around the Nuclear Power Plant Goesgen; Projekt 'WINDBANK mittleres Aaretal' - Analyse, Diagnose und Prognose der Windverhaeltnisse um das Kernkraftwerk Goesgen

    Energy Technology Data Exchange (ETDEWEB)

    Graber, W.K.; Tinguely, M

    2002-07-01

An emergency decision support system for accidental releases of radioactivity into the atmosphere providing regional wind field information is presented. This system is based on intensive meteorological field campaigns, each lasting 3-4 months, in the regions around the Swiss nuclear power plants. The wind data from temporary and permanent stations are analysed to evaluate the typical wind field patterns occurring in these regions. A cluster analysis of these data sets led to 12 different wind field classes with a high separation quality. In the present report, it is demonstrated that on-line acquisition of meteorological data from existing permanent stations is enough to diagnose the current wind field class in a region with a radius of 25 km around the nuclear power station of Goesgen with a probability of 95% of hitting the correct class. Furthermore, a method is presented that uses a high-resolution weather prediction model to forecast the future wind field classes. An average probability of 76% of hitting the correct class for a forecast time of 24 hours is evaluated. Finally, a method for parameterization of turbulence, providing input for dispersion models from standard meteorological on-line data, is presented. (author)
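The on-line diagnosis step described above amounts to assigning the current station observations to one of the precomputed wind-field classes. A minimal sketch, under the assumption that class assignment is nearest-centroid matching on station wind components (the report's actual cluster-analysis and matching method is not reproduced here):

```python
def nearest_class(observation, centroids):
    """Assign a wind observation to the closest class centroid.

    observation and each centroid are flat tuples of (u, v) wind
    components, one pair per station; distance is Euclidean.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(centroids)), key=lambda i: dist2(observation, centroids[i]))

# Two illustrative class centroids for a hypothetical 2-station network
centroids = [
    (1.0, 0.0, 1.0, 0.5),    # class 0: westerly flow
    (-1.0, 0.2, -0.8, 0.0),  # class 1: easterly flow
]
print(nearest_class((0.9, 0.1, 1.1, 0.4), centroids))  # → 0
```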

  5. Investigating the usefulness of a cluster-based trend analysis to detect visual field progression in patients with open-angle glaucoma.

    Science.gov (United States)

    Aoki, Shuichiro; Murata, Hiroshi; Fujino, Yuri; Matsuura, Masato; Miki, Atsuya; Tanito, Masaki; Mizoue, Shiro; Mori, Kazuhiko; Suzuki, Katsuyoshi; Yamashita, Takehiro; Kashiwagi, Kenji; Hirasawa, Kazunori; Shoji, Nobuyuki; Asaoka, Ryo

    2017-12-01

To investigate the usefulness of the Octopus (Haag-Streit) EyeSuite's cluster trend analysis in glaucoma. Ten visual fields (VFs) with the Humphrey Field Analyzer (Carl Zeiss Meditec), spanning 7.7 years on average, were obtained from 728 eyes of 475 primary open angle glaucoma patients. Mean total deviation (mTD) trend analysis and EyeSuite's cluster trend analysis were performed on various series of VFs (from the 1st to 10th: VF1-10, to the 6th to 10th: VF6-10). The results of the cluster-based trend analysis, based on different lengths of VF series, were compared against mTD trend analysis. Cluster-based trend analysis and mTD trend analysis results were significantly associated in all clusters and with all lengths of VF series. Between 21.2% and 45.9% (depending on VF series length and location) of clusters were deemed to progress when the mTD trend analysis suggested no progression. On the other hand, 4.8% of eyes were observed to progress using the mTD trend analysis when cluster trend analysis suggested no progression in any two (or more) clusters. Whole-field trend analysis can miss local VF progression. Cluster trend analysis appears as robust as mTD trend analysis and useful to assess both sectorial and whole-field progression. Cluster-based trend analyses, in particular the definition of two or more progressing clusters, may help clinicians to detect glaucomatous progression in a timelier manner than using a whole-field trend analysis, without significantly compromising specificity.
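The mTD trend analysis referred to here is, at its core, an ordinary least-squares regression of mean total deviation against follow-up time, with a significantly negative slope indicating progression; the cluster variant applies the same regression to each cluster's mean TD. A minimal sketch of the slope computation (the EyeSuite significance criteria and cluster definitions are not reproduced):

```python
def td_slope(times_years, mean_td_db):
    """Ordinary least-squares slope (dB/year) of mean total deviation vs time."""
    n = len(times_years)
    mx = sum(times_years) / n
    my = sum(mean_td_db) / n
    return sum((t - mx) * (v - my) for t, v in zip(times_years, mean_td_db)) / \
           sum((t - mx) ** 2 for t in times_years)

# Ten annual fields whose mTD worsens by 0.5 dB per year
times = list(range(10))
mtd = [-2.0 - 0.5 * t for t in times]
print(round(td_slope(times, mtd), 2))  # → -0.5
```

A clinician-facing tool would additionally test whether the slope differs significantly from zero before flagging progression.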

  6. Tests of heat techniques in households. Analysis of the results of the field tests; Praktijkprestaties van warmtetechnieken bij huishoudens. Analyse resultaten veldtesten

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, A.; Friedel, P.; Overman, P. [Energy Matters, Driebergen (Netherlands)

    2012-11-15

The development of conventional and new techniques for the recovery and generation of heat in residential construction has created a need among various parties in the supply chain for knowledge of the practical efficiencies of these techniques. AgentschapNL therefore set up a project to gain more insight into these efficiencies. The field tests examine five heat techniques: condensing (HR) boilers, micro-CHP (HRe) boilers, solar water heaters, balanced ventilation systems with heat recovery (referred to as WTW systems) and heat pump water heaters. These are metered at one-minute resolution.

  7. Bridging health technology assessment (HTA) with multicriteria decision analyses (MCDA): field testing of the EVIDEM framework for coverage decisions by a public payer in Canada

    Directory of Open Access Journals (Sweden)

    Tony Michèle

    2011-11-01

Background: Consistent healthcare decision-making requires systematic consideration of decision criteria and the evidence available to inform them. This can be tackled by combining multicriteria decision analysis (MCDA) and health technology assessment (HTA). The objective of this study was to field-test a decision support framework (EVIDEM), explore its utility to a drug advisory committee and test its reliability over time. Methods: Tramadol for chronic non-cancer pain was selected by the health plan as a case study relevant to their context. Based on an extensive literature review, a by-criterion HTA report was developed to provide synthesized evidence for each criterion of the framework (14 criteria for the MCDA Core Model and 6 qualitative criteria for the Contextual Tool). During workshop sessions, committee members tested the framework in three steps by assigning: (1) weights to each criterion of the MCDA Core Model representing individual perspective; (2) scores for tramadol for each criterion of the MCDA Core Model using synthesized data; and (3) qualitative impacts of criteria of the Contextual Tool on the appraisal. Utility and reliability of the approach were explored through discussion, survey and test-retest. Agreement between test and retest data was analyzed by calculating intra-rater correlation coefficients (ICCs) for weights, scores and MCDA value estimates. Results: The framework was found useful by the drug advisory committee in supporting systematic consideration of a broad range of criteria to promote a consistent approach to appraising healthcare interventions. Directly integrated into the framework as a by-criterion HTA report, synthesized evidence for each criterion facilitated its consideration, although this was sometimes limited by lack of relevant data. Test-retest analysis showed fair to good consistency of weights, scores and MCDA value estimates at the individual level (ICCs ranging from 0.676 to 0.698), thus lending some
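MCDA value estimates of this kind are typically computed as a weighted sum: criterion weights normalized to sum to 1, multiplied by criterion scores normalized to [0, 1]. A sketch under that assumption (the criterion names and the 0-3 scoring scale are illustrative, not taken from the EVIDEM specification):

```python
def mcda_value_estimate(weights, scores, score_max=3):
    """Weighted-sum MCDA value in [0, 1].

    weights: criterion -> relative weight (any positive scale);
    scores:  criterion -> score on a 0..score_max scale.
    """
    total_w = sum(weights.values())
    return sum((weights[c] / total_w) * (scores[c] / score_max)
               for c in weights)

# Three illustrative criteria (the EVIDEM Core Model has 14)
w = {"disease severity": 5, "efficacy": 4, "cost": 3}
s = {"disease severity": 3, "efficacy": 2, "cost": 1}
print(round(mcda_value_estimate(w, s), 3))  # → 0.722
```

Test-retest reliability, as in the study, would then be assessed by computing such estimates from each panellist's weights and scores at two time points and correlating them.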

  8. [Clinico-statistical study on availability of Esterman disability score for assessment of mobility difficulty in patients with visual field loss].

    Science.gov (United States)

    Yamagata, Yoshitaka; Terada, Yuko; Suzuki, Atsushi; Mimura, Osamu

    2010-01-01

The visual efficiency scale currently adopted to determine the legal grade of visual disability associated with visual field loss in Japan is not appropriate for the evaluation of disability regarding daily living activities. We investigated whether the Esterman disability score (EDS) is suitable for the assessment of mobility difficulty in patients with visual field loss. The correlation between the EDS calculated from Goldmann's kinetic visual field and the degree of subjective mobility difficulty determined by a questionnaire was investigated in 164 patients with visual field loss. The correlation between the EDS determined using a program built into the Humphrey field analyzer and that calculated from Goldmann's kinetic visual field was also investigated. The EDS based on the kinetic visual field correlated well with the degree of subjective mobility difficulty, and the EDS measured using the Humphrey field analyzer could be estimated from the kinetic visual field-based EDS. Instead of the currently adopted visual efficiency scale, the EDS should be employed for the assessment of mobility difficulty in patients with visual field loss, and to establish new judgment criteria concerning the visual field.
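The Esterman disability score itself is straightforward to compute: it is essentially the percentage of grid stimulus points the patient sees. A minimal sketch (the grid sizes in the comment are the commonly cited ones, not taken from this abstract):

```python
def esterman_score(seen_flags):
    """Esterman disability score: percentage of stimulus points seen.

    seen_flags is a sequence of booleans, one per grid point (the
    binocular Esterman grid has 120 points; the monocular grid 100).
    """
    return 100.0 * sum(seen_flags) / len(seen_flags)

# A patient who sees 90 of 120 binocular grid points scores 75
print(esterman_score([True] * 90 + [False] * 30))  # → 75.0
```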

  9. Far-Field Acoustic Power Level and Performance Analyses of F31/A31 Open Rotor Model at Simulated Scaled Takeoff, Nominal Takeoff, and Approach Conditions: Technical Report I

    Science.gov (United States)

    Sree, Dave

    2015-01-01

Far-field acoustic power level and performance analyses of open rotor model F31/A31 have been performed to determine its noise characteristics at simulated scaled takeoff, nominal takeoff, and approach flight conditions. The nonproprietary parts of the data obtained from experiments in 9- by 15-Foot Low-Speed Wind Tunnel (9×15 LSWT) tests were provided by NASA Glenn Research Center to perform the analyses. The tone and broadband noise components have been separated from raw test data by using a new data analysis tool. Results in terms of sound pressure levels, acoustic power levels, and their variations with rotor speed, angle of attack, thrust, and input shaft power have been presented and discussed. The effect of an upstream pylon on the noise levels of the model has been addressed. Empirical equations relating model's acoustic power level, thrust, and input shaft power have been developed. The far-field acoustic efficiency of the model is also determined for various simulated flight conditions. It is intended that the results presented in this work will serve as a database for comparison and improvement of other open rotor blade designs and also for validating open rotor noise prediction codes.

  10. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

The major use of ANTARES is Accelerator Mass Spectrometry (AMS), with 14C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  12. Biogeochemical typing of paddy field by a data-driven approach revealing sub-systems within a complex environment--a pipeline to filtrate, organize and frame massive dataset from multi-omics analyses.

    Directory of Open Access Journals (Sweden)

    Diogo M O Ogawa

    We propose the technique of biogeochemical typing (BGC typing) as a novel methodology to set forth the sub-systems of organismal communities associated with correlated chemical profiles working within a larger complex environment. Given the intricate character of both organismal and chemical consortia inherent in nature, many environmental studies employ the holistic approach of multi-omics analyses, mining as much information as possible. Due to the massive amount of data produced by multi-omics analyses, the results are hard to visualize and to process. The BGC typing analysis is a pipeline built on integrative statistical analysis that can treat such huge datasets, filtering, organizing and framing the information based on the strength of the various mutual trends of the organismal and chemical fluctuations occurring simultaneously in the environment. To test our technique of BGC typing, we chose a rich environment abounding in chemical nutrients and organismal diversity: the surficial freshwater from Japanese paddy fields and surrounding waters. To identify the community consortia profile we employed metagenomics via high-throughput sequencing (HTS) of the fragments amplified from Archaea rRNA, universal 16S rRNA and 18S rRNA; to assess the elemental content we employed ionomics by inductively coupled plasma optical emission spectroscopy (ICP-OES); and for the organic chemical profile, metabolomics employing both Fourier transform infrared (FT-IR) spectroscopy and proton nuclear magnetic resonance (1H-NMR); all these analyses comprised our multi-omics dataset. The similar trends between the community consortia and the chemical profiles were connected through correlation. The result was then filtered, organized and framed according to correlation strengths and peculiarities. The output gave us four BGC types displaying uniqueness in community and chemical distribution, diversity and richness.
We conclude therefore that
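
The core filtering step the abstract describes, connecting organismal and chemical trends through correlation and keeping only the strong mutual links, can be sketched as a simple Pearson-correlation screen (a simplification, not the authors' full pipeline):

```python
import numpy as np

def strong_links(organisms, chemicals, threshold=0.8):
    """Correlate each organism profile with each chemical profile across
    samples and keep only the strong mutual trends.

    organisms: (n_taxa, n_samples); chemicals: (n_chem, n_samples).
    Returns a list of (taxon_index, chemical_index, r) tuples."""
    links = []
    for i, taxon in enumerate(organisms):
        for j, chem in enumerate(chemicals):
            r = np.corrcoef(taxon, chem)[0, 1]
            if abs(r) >= threshold:
                links.append((i, j, float(r)))
    return links

# Tiny illustration: taxon 0 rises with the chemical, taxon 1 falls,
# so both pass the |r| >= 0.8 screen with opposite signs.
taxa = np.array([[1.0, 2.0, 3.0, 4.0],
                 [4.0, 3.0, 2.0, 1.0]])
chems = np.array([[2.0, 4.0, 6.0, 8.0]])
links = strong_links(taxa, chems)
```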

  13. Can DMCO Detect Visual Field Loss in Neurological Patients? A Secondary Validation Study

    DEFF Research Database (Denmark)

    Olsen, Ane Sophie; Steensberg, Alvilda Thougaard; la Cour, Morten

    2017-01-01

    Unrecognized visual field loss is caused by a range of blinding eye conditions as well as serious brain diseases. The commonest cause of asymptomatic visual field loss is glaucoma. No screening tools have been proven cost-effective. Damato Multifixation Campimetry Online (DMCO), an inexpensive ... online test, has been evaluated as a future cost-beneficial tool to detect glaucoma. To further validate DMCO, this study aimed to test DMCO in a preselected population with neurological visual field loss. Methods: The study design was an evaluation of a diagnostic test. Patients were included ... if they had undergone surgery for epilepsy during 2011-2014, resulting in visual field loss. They were examined with DMCO and results were compared with those obtained with the Humphrey Field Analyzer (30:2 SITA-Fast). DMCO sensitivity and specificity were estimated with 95% confidence intervals. Results...

  14. Equating spatial summation in visual field testing reveals greater loss in optic nerve disease.

    Science.gov (United States)

    Kalloniatis, Michael; Khuu, Sieu K

    2016-07-01

    To test the hypothesis that visual field assessment in ocular disease measured with target stimuli within or close to complete spatial summation results in larger threshold elevation compared to when measured with the standard Goldmann III target size. The hypothesis predicts a greater loss will be identified in ocular disease. Additionally, we sought to develop a theoretical framework that would allow comparisons of thresholds with disease progression when using different Goldmann targets. The Humphrey Field Analyser (HFA) 30-2 grid was used in 13 patients with early/established optic nerve disease using the current Goldmann III target size or a combination of the three smallest stimuli (target size I, II and III). We used data from control subjects at each of the visual field locations for the different target sizes to establish the number of failed points (events) for the patients with optic nerve disease, as well as global indices for mean deviation (MD) and pattern standard deviation (PSD). The 30-2 visual field testing using alternate target size stimuli showed that all 13 patients displayed more defects (events) compared to the standard Goldmann III target size. The median increase for events was seven additional failed points (range 1-26). The global indices also increased when the new testing approach was used (MD -3.47 to -6.25 dB and PSD 4.32 to 6.63 dB). Spatial summation mapping showed an increase in critical area (Ac) in disease and overall increase in thresholds when smaller target stimuli were used. When compared to the current Goldmann III paradigm, the use of alternate sized targets within the 30-2 testing protocol revealed a greater loss in patients with optic nerve disease for both event analysis and global indices (MD and PSD). We therefore provide evidence in a clinical setting that target size is important in visual field testing. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.
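
The prediction being tested, that stimuli within complete spatial summation show larger threshold elevations, follows from Ricco's law (threshold intensity × area = constant). A sketch of the expected size dependence, assuming complete summation and the fact that Goldmann target areas quadruple with each size step (illustrative, not the study's computation):

```python
import math

def threshold_shift_db(area_small, area_large):
    """Expected threshold elevation (dB) for the smaller target relative
    to the larger one, assuming complete spatial summation
    (Ricco's law: threshold_intensity * area = constant)."""
    return 10.0 * math.log10(area_large / area_small)

# Goldmann size III has 16x the area of size I, so under complete
# summation size I needs a ~12 dB more intense stimulus at threshold.
shift = threshold_shift_db(1.0, 16.0)
```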

  15. Low-cost, smartphone based frequency doubling technology visual field testing using virtual reality (Conference Presentation)

    Science.gov (United States)

    Alawa, Karam A.; Sayed, Mohamed; Arboleda, Alejandro; Durkee, Heather A.; Aguilar, Mariela C.; Lee, Richard K.

    2017-02-01

    Glaucoma is the leading cause of irreversible blindness worldwide. Due to its wide prevalence, effective screening tools are necessary. The purpose of this project is to design and evaluate a system that enables portable, cost effective, smartphone based visual field screening based on frequency doubling technology. The system is comprised of an Android smartphone to display frequency doubling stimuli and handle processing, a Bluetooth remote for user input, and a virtual reality headset to simulate the exam. The LG Nexus 5 smartphone and BoboVR Z3 virtual reality headset were used for their screen size and lens configuration, respectively. The system is capable of running the C-20, N-30, 24-2, and 30-2 testing patterns. Unlike the existing system, the smartphone FDT tests both eyes concurrently by showing the same background to both eyes but only displaying the stimulus to one eye at a time. Both the Humphrey Zeiss FDT and the smartphone FDT were tested on five subjects without a history of ocular disease with the C-20 testing pattern. The smartphone FDT successfully produced frequency doubling stimuli at the correct spatial and temporal frequency. Subjects could not tell which eye was being tested. All five subjects preferred the smartphone FDT to the Humphrey Zeiss FDT due to comfort and ease of use. The smartphone FDT is a low-cost, portable visual field screening device that can be used as a screening tool for glaucoma.
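
A frequency-doubling stimulus is a low-spatial-frequency sinusoidal grating counterphase-flickered at a high temporal frequency; the parameter values below are typical of FDT perimetry, not taken from the presentation:

```python
import math

def fdt_luminance(x_deg, t_s, sf_cpd=0.25, tf_hz=25.0,
                  contrast=0.9, mean=0.5):
    """Luminance of a counterphase-flickered sinusoidal grating:
    L(x, t) = mean * (1 + c * sin(2*pi*sf*x) * sin(2*pi*tf*t)).
    At high temporal frequencies this grating appears to have double
    its true spatial frequency (the frequency-doubling illusion)."""
    return mean * (1.0 + contrast
                   * math.sin(2.0 * math.pi * sf_cpd * x_deg)
                   * math.sin(2.0 * math.pi * tf_hz * t_s))

# At t = 0 the counterphase flicker passes through mean luminance
v0 = fdt_luminance(1.0, 0.0)
```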

  16. A Portable FTIR Analyser for Field Measurements of Trace Gases and their Isotopologues: CO2, CH4, N2O, CO, δ13C in CO2 and δD in water vapour

    Science.gov (United States)

    Griffith, D. W.; Bryant, G. R.; Deutscher, N. M.; Wilson, S. R.; Kettlewell, G.; Riggenbach, M.

    2007-12-01

    We describe a portable Fourier Transform InfraRed (FTIR) analyser capable of simultaneous high precision analysis of CO2, CH4, N2O and CO in air, as well as δ13C in CO2 and δD in water vapour. The instrument is based on a commercial 1 cm-1 resolution FTIR spectrometer fitted with a mid-IR globar source, 26 m multipass White cell and thermoelectrically-cooled MCT detector operating between 2000 and 7500 cm-1. Air is passed through the cell and analysed in real time without any pre-treatment except for (optional) drying. An inlet selection manifold allows automated sequential analysis of samples from one or more inlet lines, with typical measurement times of 1-10 minutes per sample. The spectrometer, inlet sampling sequence, real-time quantitative spectrum analysis, data logging and display are all under the control of a single program running on a laptop PC, and can be left unattended for continuous measurements over periods of weeks to months. Selected spectral regions of typically 100-200 cm-1 width are analysed by a least squares fitting technique to retrieve concentrations of trace gases, 13CO2 and HDO. Typical precision is better than 0.1% without the need for calibration gases. Accuracy is similar if measurements are referenced to calibration standard gases. δ13C precision is typically around 0.1‰, and for δD it is 1‰. Applications of the analyser include clean and polluted air monitoring, tower-based flux measurements such as flux gradient or integrated horizontal flux measurements, automated soil chambers, and field-based measurements of isotopic fractionation in soil-plant-atmosphere systems. The simultaneous multi-component advantages can be exploited in tracer-type emission measurements, for example of CH4 from livestock using a co-released tracer gas and downwind measurement. We have also developed an open path variant especially suited to tracer release studies and measurements of NH3 emissions from agricultural sources. An illustrative
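
The least-squares retrieval described above fits each selected spectral window against reference information for the target gases; a heavily simplified linear sketch (the real analysis fits computed line-by-line spectra and instrument parameters, and the reference spectra here are hypothetical):

```python
import numpy as np

def retrieve_concentrations(measured, references):
    """Least-squares fit of a measured spectrum as a linear combination
    of single-gas reference spectra (one row per gas); the returned
    coefficients scale each reference, i.e. act as concentrations."""
    coeffs, *_ = np.linalg.lstsq(references.T, measured, rcond=None)
    return coeffs

# Two hypothetical reference spectra sampled at three spectral points
refs = np.array([[1.0, 0.0, 1.0],
                 [0.0, 1.0, 1.0]])
spectrum = 2.0 * refs[0] + 3.0 * refs[1]
c = retrieve_concentrations(spectrum, refs)  # close to [2, 3]
```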

  17. Perturbation analysis of cyclotron resonance in the electromagnetic field of a TE{sub 011} mode; Analyse par perturbation de la resonance cyclotronique dans le champ electromagnetique en mode TE{sub 011}

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, H [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-07-01

    The motion of an electron orbiting under the combined action of a static magnetic field and the AC azimuthal electric field of a cylindrical TE{sub 011} mode is analyzed with the help of a perturbation technique. The first and second order perturbation results indicate that at cyclotron resonance the electron's center of gyration oscillates slowly at right angles to the magnetic field between two turning points. We find that superimposed upon this nearly static E×B drift the electron cyclically undergoes the process of cyclotron absorption and induced emission. Our results indicate that it is possible to ensure maser action (i.e. induced emission rather than absorption) without special preparation of the electron's velocity provided that the electron is introduced into the field in certain special regions of space pervaded by the TE mode. This is a case where over-population of the upper state is accomplished through 'pumping' in real space. The relation between an electron cyclotron resonance maser based upon this principle and one based upon the principle of velocity space pumping, due to Twiss, is examined. This treatment provides physical interpretations and verifies the numerical results found earlier by Le Gardeur. (author) [French] Le mouvement d'un electron soumis a l'action combinee d'un champ magnetique statique et d'un champ electrique haute frequence azimutal engendre dans une cavite cylindrique en mode TE{sub 011} est analyse a partir d'une methode de perturbation. Les resultats des perturbations au premier et deuxieme ordre indiquent qu'a la resonance cyclotronique, le centre de giration de l'electron oscille lentement dans le plan perpendiculaire au champ magnetique entre deux points de rebroussement. En plus de la derive quasi-statique E×B, l'electron passe par des etats d'absorption et emission cyclotronique.
Les resultats du calcul confirment la possibilite d'avoir une action maser (c'est-a-dire: emission au lieu d'absorption) sans que la vitesse des
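
The slow guiding-centre drift and the resonance condition described in this abstract are, in standard notation, a restatement of textbook definitions (not equations taken from the report):

```latex
\[
  \mathbf{v}_{E} \;=\; \frac{\mathbf{E}\times\mathbf{B}}{B^{2}},
  \qquad
  \omega_{c} \;=\; \frac{eB}{m_{e}},
\]
```

with cyclotron absorption and induced emission alternating when the azimuthal TE{sub 011} field oscillates near \(\omega_{c}\).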

  18. Analyses and estimates of hydraulic conductivity from slug tests in alluvial aquifer underlying Air Force Plant 4 and Naval Air Station-Joint Reserve Base Carswell Field, Fort Worth, Texas

    Science.gov (United States)

    Houston, Natalie A.; Braun, Christopher L.

    2004-01-01

    This report describes the collection, analyses, and distribution of hydraulic-conductivity data obtained from slug tests completed in the alluvial aquifer underlying Air Force Plant 4 and Naval Air Station-Joint Reserve Base Carswell Field, Fort Worth, Texas, during October 2002 and August 2003 and summarizes previously available hydraulic-conductivity data. The U.S. Geological Survey, in cooperation with the U.S. Air Force, completed 30 slug tests in October 2002 and August 2003 to obtain estimates of horizontal hydraulic conductivity to use as initial values in a ground-water-flow model for the site. The tests were done by placing a polyvinyl-chloride slug of known volume beneath the water level in selected wells, removing the slug, and measuring the resulting water-level recovery over time. The water levels were measured with a pressure transducer and recorded with a data logger. Hydraulic-conductivity values were estimated from an analytical relation between the instantaneous displacement of water in a well bore and the resulting rate of head change. Although nearly two-thirds of the tested wells recovered 90 percent of their slug-induced head change in less than 2 minutes, 90-percent recovery times ranged from 3 seconds to 35 minutes. The estimates of hydraulic conductivity range from 0.2 to 200 feet per day. Eighty-three percent of the estimates are between 1 and 100 feet per day.
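
The report summarized here does not name the analytical relation used; the Hvorslev (1951) solution is one widely used relation of this instantaneous-displacement type. A sketch with hypothetical well geometry (the radii, screen length, and recovery time below are made-up values; lengths in feet and time in days, so K comes out in feet per day):

```python
import math

def hvorslev_k(r_casing, r_screen, screen_len, t37):
    """Hydraulic conductivity from a slug test via the Hvorslev (1951)
    relation for a partially penetrating screen (valid for Le/R > 8):
        K = r^2 * ln(Le / R) / (2 * Le * t37)
    where t37 is the time for the head to recover to 37% of the
    initial slug-induced displacement."""
    return (r_casing ** 2 * math.log(screen_len / r_screen)
            / (2.0 * screen_len * t37))

# Hypothetical well: 0.1 ft casing radius, 0.2 ft screen radius,
# 10 ft screen, 37% recovery after ~1.4 minutes (0.001 day)
k = hvorslev_k(0.1, 0.2, 10.0, 0.001)  # ~2 ft/day
```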

  19. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal-processing system for the analysis of the signals generated in the conductors by the incident electrons, and a display for the different characteristics of the electron beam.

  20. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  1. Occupational exposure to electromagnetic fields, leukemia and brain cancer: update of two meta-analysis; Exposition professionnelle aux champs electromagnetiques, leucemie et cancer du cerveau: mise a jour de deux meta-analyses

    Energy Technology Data Exchange (ETDEWEB)

    Anon

    2008-11-15

    This new meta-analysis found a slight increase in the risk of brain cancer and of leukemia in populations occupationally exposed to electromagnetic fields. It does not, however, support the hypothesis that electromagnetic fields have an effect on these cancers. (author)

  2. Prevention of visual field defects after macular hole surgery.

    LENUS (Irish Health Repository)

    Cullinane, A B

    2012-02-03

    BACKGROUND/AIM: The pathogenesis of visual field loss associated with macular hole surgery is uncertain but a number of explanations have been proposed, the most convincing of which is the effect of peeling of the posterior hyaloid, causing either direct damage to the nerve fibre layer or to its blood supply at the optic nerve head. The purpose of this preliminary prospective study was to determine the incidence of visual field defects following macular hole surgery in cases in which peeling of the posterior hyaloid was confined only to the area of the macula. METHODS: 102 consecutive eyes that had macular hole surgery had preoperative and postoperative visual field examination using a Humphrey perimeter. A comparison was made between two groups: I, those treated with vitrectomy with complete posterior cortical vitreous peeling; and II, those treated with a vitrectomy with peeling of the posterior hyaloid in the area of the macula but without attempting a complete posterior vitreous detachment. Specifically, no attempt was made to separate the posterior hyaloid from the optic nerve head. Eyes with stage II or III macular holes were operated. Autologous platelet concentrate and non-expansile gas tamponade were used. Patients were postured prone for 1 week. RESULTS: In group I, 22% of patients were found to have visual field defects. In group II, it was possible to separate the posterior hyaloid from the macula without stripping it from the optic nerve head and in these eyes no pattern of postoperative visual field loss emerged. There were no significant vision-threatening complications in this group. The difference in the incidence of visual field loss between group I and group II was significant (p=0.02). The anatomical and visual success rates were comparable between both groups. CONCLUSION: The results from this preliminary study suggest that the complication of visual field loss after macular surgery may be reduced if peeling of the posterior hyaloid is...

  3. Efficacy of the Amsler Grid Test in Evaluating Glaucomatous Central Visual Field Defects.

    Science.gov (United States)

    Su, Daniel; Greenberg, Andrew; Simonson, Joseph L; Teng, Christopher C; Liebmann, Jeffrey M; Ritch, Robert; Park, Sung Chul

    2016-04-01

    To investigate the efficacy of the Amsler grid test in detecting central visual field (VF) defects in glaucoma. Prospective, cross-sectional study. Patients with glaucoma with reliable Humphrey 10-2 Swedish Interactive Threshold Algorithm standard VF on the date of enrollment or within the previous 3 months. Amsler grid tests were performed for each eye and were considered "abnormal" if there was any perceived scotoma with missing or blurry grid lines within the central 10 degrees ("Amsler grid scotoma"). An abnormal 10-2 VF was defined as ≥3 adjacent points at P grid scotoma area were calculated with the 10-2 VF as the clinical reference standard. Among eyes with an abnormal 10-2 VF, regression analyses were performed between the Amsler grid scotoma area and the 10-2 VF parameters (mean deviation [MD], scotoma extent [number of test points with P grid scotoma area. A total of 106 eyes (53 patients) were included (mean ± standard deviation age, 24-2 MD and 10-2 MD = 66±12 years, -9.61±8.64 decibels [dB] and -9.75±9.00 dB, respectively). Sensitivity, specificity, and positive and negative predictive values of the Amsler grid test were 68%, 92%, 97%, and 46%, respectively. Sensitivity was 40% in eyes with 10-2 MD better than -6 dB, 58% in eyes with 10-2 MD between -12 and -6 dB, and 92% in eyes with 10-2 MD worse than -12 dB. The area under the receiver operating characteristic curve of the Amsler grid scotoma area was 0.810 (95% confidence interval, 0.723-0.880, P grid scotoma area had the strongest relationship with 10-2 MD (quadratic R(2)=0.681), followed by 10-2 scotoma extent (quadratic R(2)=0.611) and 10-2 scotoma mean depth (quadratic R(2)=0.299) (all P grid can be used to screen for moderate to severe central vision loss from glaucoma. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
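
The sensitivity, specificity, and predictive values reported above come from the usual 2×2 contingency-table definitions; a minimal sketch with made-up counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 table in which the
    10-2 visual field result is the clinical reference standard."""
    return {
        "sensitivity": tp / (tp + fn),  # abnormal VFs correctly flagged
        "specificity": tn / (tn + fp),  # normal VFs correctly passed
        "ppv": tp / (tp + fp),          # flagged eyes truly abnormal
        "npv": tn / (tn + fn),          # passed eyes truly normal
    }

# Hypothetical counts for illustration only
m = diagnostic_metrics(tp=8, fp=1, fn=2, tn=9)
```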

  4. Lack of Congruence between Analyses and Conclusions Limits Usefulness of Study of Socio-cultural Influences on Student Choice of LIS Field in Greece. A Review of: Moniarou-Papaconstantinou, V., Tsatsaroni, A., Katsis, A., & Koulaidis, V. (2010). LIS as a field of study: Socio-cultural influences on students' decision making. Aslib Proceedings: New Information Perspectives, 62(3), 321-344.

    Directory of Open Access Journals (Sweden)

    Diana K. Wakimoto

    2010-09-01

    Objective — To determine how social and cultural factors influence students' decision to study library and information science (LIS) as undergraduates. Design — Semi-structured interviews and quantitative analysis of questionnaire data. Setting — Three schools in Greece with LIS programs at the undergraduate level. Subjects — One hundred eighty-seven first-year students enrolled in Greece's LIS schools' undergraduate programs in the autumn semester of the 2005-2006 academic year. Methods — The authors piloted the questionnaire with 52 students at the LIS school in Athens and had three faculty members review the questionnaire. After modification, the two-part questionnaire was administered during the first week of classes to all first-year undergraduate students enrolled in Greece's three LIS schools. The first section of the questionnaire collected data on student gender, age, area of residence, school from which they graduated, and parental occupation and level of education. The second part of the questionnaire covered students' reasons for choosing LIS as a field of study, the degree to which students agreed with dominant public views (i.e., stereotypes) of librarianship, and practical issues that influenced students' decision-making processes. The authors conducted two rounds of semi-structured interviews with students from the same 2005-2006 cohort. They interviewed 41 self-selected students and then interviewed a purposive sample of 15 students from the same cohort in the fifth semester of the students' studies. Main Results — The questionnaire was completed by 187 LIS students, with 177 responses considered relevant and used in the analyses. Demographic information showed that 78% of the respondents were female, 85.8% were from urban areas, and 98.9% graduated from public schools. The authors constructed two indices to assist with further analyses: the Educational Career Index, which quantified students' educational...

  5. Significant correlations between optic nerve head microcirculation and visual field defects and nerve fiber layer loss in glaucoma patients with myopic glaucomatous disk

    Directory of Open Access Journals (Sweden)

    Yokoyama Y

    2011-12-01

    Yu Yokoyama, Naoko Aizawa, Naoki Chiba, Kazuko Omodaka, Masahiko Nakamura, Takaaki Otomo, Shunji Yokokura, Nobuo Fuse, Toru Nakazawa; Department of Ophthalmology, Tohoku University Graduate School of Medicine, Sendai, Japan. Background: Eyes with glaucoma are characterized by optic neuropathy with visual field defects in the areas corresponding to the optic disk damage. The exact cause of the glaucomatous optic neuropathy has not been determined. Myopia has been shown to be a risk factor for glaucoma. The purpose of this study was to determine whether a significant correlation existed between the microcirculation of the optic disk and the visual field defects and the retinal nerve fiber layer thickness (RNFLT) in glaucoma patients with myopic optic disks. Methods: Sixty eyes of 60 patients with myopic disks were studied; 36 eyes with glaucoma (men:women = 19:17) and 24 eyes with no ocular diseases (men:women = 14:10). The mean deviation (MD) determined by the Humphrey field analyzer and the peripapillary RNFLT determined by the Stratus-OCT were compared between the two groups. The ocular circulation was determined by laser speckle flowgraphy (LSFG), and the mean blur rate (MBR) was compared between the two groups. The correlations between the RNFLT and MBR of the corresponding areas of the optic disk and between MD and MBR of the optic disk in the glaucoma group were determined by simple regression analyses. Results: The average MBR for the entire optic disk was significantly lower in the glaucoma group than in the control group. The differences in the MBR for the tissue in the superior, inferior, and temporal quadrants of the optic disk between the two groups were significant. The MBR for the entire optic disk was significantly correlated with the MD (r = 0.58, P = 0.0002) and the average RNFLT (r = 0.53, P = 0.0008). The tissue MBR of the optic disk was significantly correlated with the RNFLT in the superior, inferior, and temporal quadrants...

  6. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS.
The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
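
Newtonian nudging, as described above, relaxes the simulated state toward observations rather than resetting it, which is why approximate balance is preserved in the model. A minimal scalar sketch (the operational scheme applies time-, distance-, and quality-dependent weights, not a single fixed gain):

```python
def nudge(model_state, observation, gain):
    """One Newtonian-nudging update: relax the simulated state toward
    an observation with a relaxation weight 0 < gain <= 1, so the
    model is steered gently instead of being reinitialized abruptly."""
    return model_state + gain * (observation - model_state)

# e.g. modeled SWE 100 mm, observed 120 mm, gentle weight 0.25 -> 105 mm
swe = nudge(100.0, 120.0, 0.25)
```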

  7. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  8. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    ...importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. ... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  9. Detailed molecular analyses of the hexon loop-1 and fibers of fowl aviadenoviruses reveal new insights into the antigenic relationship and confirm that specific genotypes are involved in field outbreaks of inclusion body hepatitis.

    Science.gov (United States)

    Schachner, Anna; Marek, Ana; Grafl, Beatrice; Hess, Michael

    2016-04-15

    Forty-eight fowl aviadenoviruses (FAdVs) isolated from recent IBH outbreaks across Europe were investigated, by utilizing for the first time the two major adenoviral antigenic domains, hexon loop-1 and fiber, for compound molecular characterization of IBH-associated FAdVs. Successful target gene amplification, following virus isolation in cell culture or from FTA-card samples, demonstrated presence of FAdVs in all cases indicative for IBH. Based on hexon loop-1 analysis, 31 European field isolates exhibited highest nucleotide identity (>97.2%) to reference strains FAdV-2 or -11 representing FAdV-D, while 16 and one European isolates shared >96.0% nucleotide identity with FAdV-8a and -8b, or FAdV-7, the prototype strains representing FAdV-E. These results extend recognition of specific FAdV-D and FAdV-E affiliate genotypes as causative agents of IBH to the European continent. In all isolates, species specificity determined by fiber gene analysis correlated with hexon-based typing. A threshold of 72.0% intraspecies nucleotide identity between fibers from investigated prototype and field strains corresponded with demarcation criteria proposed for hexon, suggesting fiber-based analysis as a complementary tool for molecular FAdV typing. A limited number of strains exhibited inconsistencies between hexon and fiber subclustering, indicating potential constraints for single-gene based typing of those FAdVs. Within FAdV-D, field isolate fibers shared a high degree of nucleotide (>96.7%) and aa (>95.8%) identity, while FAdV-E field isolate fibers displayed greater nucleotide divergence of up to 22.6%, resulting in lower aa identities of >81.7%. Furthermore, comparison with FAdVs from IBH outbreaks outside Europe revealed close genetic relationship in the fiber, independent of the strains' geographic origin. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. The pontoon is assumed to be located in a...

  11. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10{sup 7} s{sup -1} has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.
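
The histogramming step that the fast RAM performs in hardware, incrementing one of 256 channels per 8-bit ADC code, can be mimicked in software (a sketch of the operation, not the instrument's design):

```python
def histogram_pulses(adc_codes, n_channels=256):
    """Accumulate 8-bit pulse-height codes into MCA channel counts,
    the software analogue of a histogramming RAM."""
    counts = [0] * n_channels
    for code in adc_codes:
        counts[code & 0xFF] += 1  # mask to the ADC's 8-bit range
    return counts

# Four example pulses land in three channels
spectrum = histogram_pulses([0, 7, 255, 255])
```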

  12. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10{sup 7} s{sup -1} has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)

  13. Geological and production analyses focused on exploration of the eastern part of the Cerro Prieto geothermal field, BC; Analisis geologico-productivo enfocado a la exploracion de la parte oriental del campo geotermico de Cerro Prieto, BC

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar Dumas, Alvaro [Comision Federal de Electricidad, Residencia General de Cerro Prieto, B.C (Mexico)]. E-mail: alvaro.aguilar@cfe.gob.mx

    2008-01-15

    The eastern part of the Cerro Prieto geothermal field (CGCP), known as Poligono Nuevo Leon, is an area with proven geothermal resources, as confirmed by seven directional wells located toward the east and by vertical well M-200 located inside the polygon. Well M-200 was drilled in 1984 and has produced about 4 million tons of steam to date. It is integrated into the CP-2 sector, producing 68 t/h of steam. Presently the eastern part of CGCP, representing 25% of the total field area, is producing over half of the steam for the entire field. In recent years, the steam supply has been maintained only by increasing the number of production wells located in the eastern zone of CGCP (Rodriguez, 2006), where pressure, enthalpy and temperature conditions are better than in other parts of the field. However, in the long term it will be necessary to incorporate Poligono Nuevo Leon into the productive area to extend the productive life of CGCP. This paper includes a geological analysis, plus models for steam production, temperature and enthalpy for Poligono Nuevo Leon. [Spanish] The eastern part of the Cerro Prieto geothermal field (CGCP), known as Poligono Nuevo Leon, represents a potential area with proven geothermal resources, as demonstrated by the seven directional wells that have been drilled toward the east, as well as by vertical well M-200, located inside the polygon. Well M-200 was drilled in 1984 and has to date produced around 4 million tons of steam, being integrated into the CP-2 sector with a production of 68 t/h of steam. Currently the eastern part of the CGCP, which represents 25% of the total field area, produces more than half of the field's total steam.
    The steam supply in recent years has been met by increasing the number of operating wells located in the eastern zone of the CGCP (Rodriguez, 2006), since it is here that reservoir pressure, enthalpy and temperature conditions are better than in other areas of the field

  14. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented on. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis are often much smaller than those needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is no longer available, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  15. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent as straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  16. An electric field in a gravitational field

    International Nuclear Information System (INIS)

    Harpaz, Amos

    2005-01-01

    The behaviour of an electric field in a gravitational field is analysed. It is found that, due to the mass (energy) of the electric field, it is subject to gravity and falls in the gravitational field. This fall curves the electric field; a stress force (a reaction force) is created, and the interaction of this reaction force with the static charge gives rise to the creation of radiation

  17. [New visual field testing possibilities (a preliminary report)].

    Science.gov (United States)

    Erichev, V P; Ermolaev, A P; Antonov, A A; Grigoryan, G L; Kosova, D V

    2018-01-01

    There are currently no portable mobile perimeters that allow visual field testing outside the ophthalmologist's examination room. To develop a mobile perimetry technique based on the use of a virtual reality (VR) headset. The study involved 26 patients (30 eyes) with stage II-III primary open-angle glaucoma (POAG) with compensated IOP. Perimetry was performed for each patient twice: on a Humphrey analyzer (30-2 test, 76 points) and, employing a similar strategy, on a perimeter integrated into a VR headset (Total Vision, Russia). Visual field testing was performed with an interval of 1 hour to 3 days. The results were comparatively analyzed. Patients tolerated the examination well. Comparative analysis of the preliminary perimetry results obtained with both methods showed a high degree of identity, so the results were concluded to be comparable. By visually isolating the wearer, the VR headset eliminates distractions and provides stable light conditions for visual field testing. The headset-perimeter is compact, mobile, easily transportable, and can be used in the work of visiting medical teams and for examination at home.
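    Threshold perimeters of this kind typically bracket the sensitivity at each test point with a staircase procedure. The record does not specify the headset's exact strategy, so the sketch below assumes a conventional full-threshold 4-2 dB staircase (step halves after the first reversal, testing stops at the second, and the threshold estimate is the last seen level); the simulated patient threshold is invented:

```python
def staircase_threshold(true_threshold, start=25.0):
    """Estimate a threshold (dB of attenuation) with a 4-2 dB staircase.

    Higher dB = dimmer stimulus; a stimulus is 'seen' when its
    attenuation is at or below the simulated patient's threshold.
    """
    def seen(level):
        return level <= true_threshold

    level = start
    last_resp = None
    last_seen = None
    steps = [4.0, 2.0]      # step size shrinks after the first reversal
    reversals = 0
    while reversals < 2:    # stop on the second response reversal
        r = seen(level)
        if r:
            last_seen = level
        if last_resp is not None and r != last_resp:
            reversals += 1
            if reversals == 2:
                break
        last_resp = r
        # dim (raise dB) after 'seen', brighten (lower dB) after 'not seen'
        level += steps[reversals] * (1 if r else -1)
    return last_seen
```

    Repeating this at each of the 76 test locations of a 30-2 pattern yields the sensitivity map from which indices such as mean deviation are computed.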

  18. Applied mediation analyses

    DEFF Research Database (Denmark)

    Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke

    2017-01-01

    In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis including: the mathematical framework; required assumptions; and software implementation...

  19. Wet Gas Airfoil Analyses

    OpenAIRE

    Larsen, Tarjei Thorrud

    2011-01-01

    Subsea wet gas compression renders new possibilities for cost savings and enhanced gas recovery on existing gas wells. Technology like this opens the possibility of making traditional offshore processing plants redundant. With new technology follow new challenges. Multiphase flow is regarded as a complex field of study, and increased knowledge of the fundamental mechanisms of wet gas flow is of paramount importance to the efficiency and stability of the wet gas compressor. The scope of this work was ...

  20. Background and analysis of R and D needs in the field of nuclear toxicology; Contexte et analyse des besoins en R et D dans le domaine de la toxicologie nucleaire

    Energy Technology Data Exchange (ETDEWEB)

    Quemeneur, E [CEA Marcoule, Dir. des Sciences du Vivant, 30 (France); Menager, M Th [CEA Fontenay-aux-Roses, Dir. des Sciences du Vivant, 92 (France); Forestier, C [CEA Cadarache, Dir. des Sciences du Vivant, 13 - Saint-Paul-lez-Durance (France); Ansoborlo, E [CEA Marcoule, Dir. de l' Energie Nucleaire, 30 (France)

    2008-07-01

    The CEA confirmed in late 2007 its interest in continuing the coordinated research effort in Nuclear Toxicology. Our working group aimed at developing proposals to organize internal priorities in the CEA, as well as supporting a dynamic and open ambition in terms of partnership. This document provides the main elements of the analysis of the context and needs, a strategic proposal and the arrangements planned for the next five years. The proposed programme covers the field of cellular and molecular toxicology applied to human biology and the environment. Seeking to achieve the right balance between academic and applied research, it addresses some fundamental issues, in large part identified in previous phases (speciation of mobile forms, transport in the biosphere, biochemical targets, and mechanisms of action at different levels of living organisms) and will aim at assessing the potential of novel bio-processes for intervention in accidental situations. Fundamentally multi-disciplinary, this programme will require re-launching, and amplifying, the coordinated efforts of skills from different poles of the CEA, for which internal resources could cover part of the priority needs. However, it seems clear that the establishment of strong relationships with academic partners and industry is required to achieve the expectations raised by the multiple thematic scopes. In the current context of toxicology in France, we chose to approach the field on the basis of a road-map rather than as a unique project. This modular approach seems to be the appropriate way to take into account the concerns of various stakeholders in this vast programme as well as the necessary reflection on the commitment of resources.
    Our analysis of the competitive environment of work and available forces led us to concentrate our efforts on a few chemical elements or radionuclides of specific interest in the nuclear field (tritium, cobalt, iodine, cesium, uranium and plutonium) and to focus

  1. Relationship between progression of visual field defect and intraocular pressure in primary open-angle glaucoma.

    Science.gov (United States)

    Naito, Tomoko; Yoshikawa, Keiji; Mizoue, Shiro; Nanno, Mami; Kimura, Tairo; Suzumura, Hirotaka; Shiraga, Fumio

    2015-01-01

    To analyze the relationship between intraocular pressure (IOP) and the progression of visual field defects in Japanese primary open-angle glaucoma (POAG) and normal-tension glaucoma (NTG) patients. The subjects of the study were patients undergoing treatment for POAG or NTG who had undergone visual field testing at least ten times with a Humphrey field analyzer (Swedish interactive thresholding algorithm standard, C30-2 program). The progression of visual field defects was defined by a significantly negative value of the mean deviation slope at the final visual field test during the follow-up period. The relationships between the progression of visual field defects and IOP, as well as other clinical factors, were retrospectively analyzed. A total of 156 eyes of 156 patients were included in the analysis. Significant progression of visual field defects was observed in 70 eyes of 70 patients (44.9%), while no significant progression was evident in 86 eyes of 86 patients (55.1%). The eyes with visual field defect progression had significantly lower baseline IOP than eyes without progression, and greater fluctuation of IOP was observed in eyes with visual field defect progression than in eyes without, suggesting that lower baseline IOP and larger IOP fluctuation accompany the progression of visual field defects. In NTG, IOP management should take into account not only achieving the target IOP, but also minimizing the fluctuation of IOP during the follow-up period.
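    The progression criterion used in this study reduces to an ordinary least-squares slope of mean deviation (MD) against follow-up time, flagged when significantly negative. A minimal sketch of the slope computation; the ten-examination series below is invented for illustration:

```python
def md_slope(years, md_values):
    """Ordinary least-squares slope of mean deviation (dB) vs. time (years)."""
    n = len(years)
    ty = sum(years) / n
    my = sum(md_values) / n
    return sum((t - ty) * (m - my) for t, m in zip(years, md_values)) / \
           sum((t - ty) ** 2 for t in years)

# hypothetical series: ten six-monthly Humphrey C30-2 examinations
years = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]
md = [-3.1, -3.3, -3.2, -3.6, -3.8, -3.7, -4.1, -4.2, -4.4, -4.6]
slope = md_slope(years, md)   # dB/year; clearly negative for this series
```

    A real analysis would also test the slope for statistical significance (e.g. a confidence interval excluding zero) before declaring progression, as the study's definition requires.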

  2. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article, analyses of the MHD stabilities which govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of the linear MHD instability. The former is the basis of the stability analysis and the latter is closely related to the limiting beta value, which is a very important theoretical issue in tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. We next describe the nonlinear MHD instabilities which are related to disruption phenomena. Lastly, we describe vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming; the parts of the codes which need a lot of CPU time are concentrated in a small portion of the codes, and the codes are usually used by their own developers, which makes it comparatively easy to attain a high performance ratio on a vector processor. (author)

  3. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  4. Determination of the influence of high clouds on the radiation field and on climate by analyzing NOAA AVHRR data. Die Bestimmung des Einflusses von hohen Wolken auf das Strahlungsfeld und auf das Klima durch Analyse von NOAA AVHRR-Daten

    Energy Technology Data Exchange (ETDEWEB)

    Berger, F H

    1991-11-01

    The influence of clouds on the radiation field and on climate is investigated over the North Sea by analyzing NOAA AVHRR data. The derived information was applied to calculate the cloud-climate efficiency at the top of the atmosphere and at the surface. The cloud-climate efficiencies in the shortwave spectrum show a cooling effect of the earth/atmosphere system for all clouds and a strong dependence on the solar insolation. In the longwave region, the cloud-climate efficiency always showed heating of the earth/atmosphere system. Because of the solar zenith angle effect, thick high clouds with the same optical properties may lead to different effects in the earth/atmosphere system. A first comparison of the increasing cloud forcing (heating of the earth/atmosphere system) with an analysis of the relative topography 300/850 hPa shows that the increase in cloud forcing is well correlated with an increase in temperature in this layer. The deviation from the long-term mean of pressure or temperature shows the same behaviour. (orig.).

  5. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now in selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands upon computational analysis made by nuclear rules and regulations. In conclusion, 2 examples are presented of applications of CFD analysis to nuclear problems: Determining deboration in the condenser reflux mode of operation, and protection of the reactor pressure vessel (RPV) against brittle failure. (orig.)

  6. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments allow measurement of the missing energy distribution as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are respectively analysed by two spectrometers and detected in their focal planes. Counting rates are usually low and include time coincidences and accidentals. The signal-to-noise ratio depends on the physics of the experiment and the resolution of the coincidence; it is therefore mandatory to obtain a beam current distribution as flat as possible. The use of new technologies has made it possible to monitor the behavior of the beam pulse in real time and to determine when the duty cycle can be considered good with respect to a numerical basis

  7. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
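    SOBI works by whitening the multichannel recording and then jointly diagonalizing several time-lagged covariance matrices of the whitened data. A single-lag reduction of the same idea (the AMUSE variant) is enough to sketch the mechanism; the two-channel mixture below is synthetic and the mixing matrix is invented, so this illustrates the principle rather than the project's actual EEG pipeline:

```python
import numpy as np

n = 4000
t = np.arange(n)
# two zero-mean sources with distinct lag-1 autocorrelations (different frequencies)
S = np.vstack([np.sin(2 * np.pi * 5 * t / n),
               np.sin(2 * np.pi * 200 * t / n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # hypothetical mixing matrix
X = A @ S                                    # observed "electrode" channels
X = X - X.mean(axis=1, keepdims=True)

# 1) whiten the observations
d, E = np.linalg.eigh((X @ X.T) / n)
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

# 2) eigendecompose the symmetrized lag-1 covariance of the whitened data
C1 = (Z[:, 1:] @ Z[:, :-1].T) / (n - 1)
C1 = (C1 + C1.T) / 2
_, V = np.linalg.eigh(C1)
S_hat = V.T @ Z       # recovered sources, up to sign and permutation
```

    Full SOBI improves robustness by jointly diagonalizing covariances at many lags instead of just one, which is what makes it suitable for noisy EEG.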

  8. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  9. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker's actions usually will be logged as permissible, standard actions, if they are logged at all. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  10. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
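    The superposition matrix T described here is the average of the deterministic state-transition matrices of all members of the network class. A toy version with two invented 2-node Boolean networks shows the construction and the per-row Shannon entropy (zero wherever the whole class agrees on a transition):

```python
import math

def transition_matrix(update, n):
    """Deterministic state-transition matrix of an n-node Boolean network."""
    size = 2 ** n
    T = [[0.0] * size for _ in range(size)]
    for s in range(size):
        state = [(s >> i) & 1 for i in range(n)]
        nxt = update(state)
        T[s][sum(b << i for i, b in enumerate(nxt))] = 1.0
    return T

# two hypothetical members of a network class
net_a = lambda s: [s[1], s[0]]           # swap the two nodes
net_b = lambda s: [s[1], s[0] & s[1]]    # swap, with AND feeding node 1
Ta, Tb = transition_matrix(net_a, 2), transition_matrix(net_b, 2)
T = [[(a + b) / 2 for a, b in zip(ra, rb)] for ra, rb in zip(Ta, Tb)]

# Shannon entropy of each row of T, in bits
row_entropy = [-sum(p * math.log2(p) for p in row if p > 0) for row in T]
```

    States whose diagonal entry of T equals 1 (here 00 and 11) are point attractors for every member of the class, which is how the diagonal of T supports estimating the attractor distribution.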

  11. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could also be used for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx. β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definitions and the fragility analyses. The fragility could be derived either based on scaling procedures or on the basis of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
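    The failure-frequency relation quoted from Franzini et al. can be evaluated numerically once a hazard curve β(x) and a fragility curve P(f|x) are chosen. Both functional forms and all parameters below are invented for illustration (a power-law hazard curve and a lognormal fragility curve, both common modelling choices):

```python
import math

def hazard(x):
    """Hypothetical annual frequency of exceeding peak ground acceleration x [g]."""
    return 1e-3 * (x / 0.1) ** -2.5

def fragility(x, median=0.5, beta=0.4):
    """Hypothetical lognormal conditional failure probability P(f|x)."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

# beta_E = integral of |d(beta)/dx| * P(f|x) dx, midpoint rule on a grid
lo, hi, steps = 0.05, 3.0, 2000
dx = (hi - lo) / steps
beta_E = 0.0
for i in range(steps):
    x = lo + (i + 0.5) * dx
    dens = (hazard(x - dx / 2) - hazard(x + dx / 2)) / dx  # -d(beta)/dx, positive
    beta_E += dens * fragility(x) * dx
```

    Because the fragility is at most 1, the computed annual failure frequency is necessarily bounded above by the hazard frequency at the lowest load level considered.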

  12. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (enhances accuracy of values)?
For some carbohydrates, we

  13. Website analysis; Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    ... or dead ends when he/she visits the site. Studies in the design and analysis of the visual and aesthetic aspects of the planning and use of websites have, however, only to a limited extent been subject to reflective treatment. That is the background for this chapter, which opens with a review of aesthetics... The website is increasingly the preferred medium for information retrieval, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimizing the design and... planning of the functional and content-related aspects of websites. There is a large body of theory and method books specializing in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction...

  14. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that, due to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. Determining the exact shape of the channels at the boundaries is a difficult and laborious process. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed because of the sharing at the channel boundaries. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with the full-scale counts. At the boundaries, all memory locations have counts. This is a direct display of the channel boundaries. (orig.)
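    The measurement can be simulated directly: sweep a precision pulser in fraction-of-a-channel steps, add the ADC's wide-band noise, and count how many events a chosen channel captures at each step. All numerical parameters below are invented for illustration:

```python
import random

random.seed(1)

EVENTS = 2000        # pulses per reference-voltage setting
NOISE = 0.15         # assumed wide-band ADC noise sigma, in channel widths
TARGET = 8           # channel whose boundary profile is being traced

def counts_in_target(amplitude):
    """Pulses at a fixed amplitude that the noisy ADC bins into TARGET."""
    hits = 0
    for _ in range(EVENTS):
        if int(amplitude + random.gauss(0, NOISE)) == TARGET:
            hits += 1
    return hits

# step the pulser across the channel in tenth-of-a-channel increments
profile = [counts_in_target(8.0 + k / 10) for k in range(11)]
```

    Plotted against pulser level, `profile` rises from roughly half the events at the lower boundary to a flat top mid-channel and falls back to half at the upper boundary, which is the boundary-sharing shape the record describes displaying on the oscilloscope.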

  15. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    In 2001, activity in the field of safety analyses focused on verification of the safety analysis reports for NPP V-2 Bohunice and NPP Mochovce concerning the new profiled fuel, and on the probabilistic safety assessment study for NPP Mochovce. Calculational safety analyses were performed and expert reviews were elaborated for internal UJD needs. An important part of the work was also performed in solving scientific and technical tasks appointed within bilateral projects of co-operation between UJD and its international partner organisations, as well as within international projects ordered and financed by the European Commission. All these activities served as independent support for UJD in its deterministic and probabilistic safety assessment of nuclear installations. Special attention was paid to a review of the level 1 probabilistic safety assessment study for NPP Mochovce. The study elaborated the probabilistic safety analysis of the NPP at full power operation, and the contribution of the technical and operational improvements to risk reduction was quantified. The core damage frequency of the reactor was calculated, and the dominant initiating events and accident sequences with the major contributions to the risk were determined. The target of the review was to assess the acceptability of the sources of input information, assumptions, models, data, analyses and obtained results, so that the probabilistic model could give a realistic picture of the NPP. The review of the study was performed by UJD in co-operation with the IAEA (IPSART mission) as well as with other external organisations that were not involved in the elaboration of the reviewed document and the probabilistic model of the NPP. The review was made in accordance with the IAEA guidelines and the methodical documents of UJD and US NRC.
In the field of calculational safety analyses, UJD activity focused on the analysis of an operational event and analyses of the selected accident scenarios

  16. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  17. UMTS signal measurements with digital spectrum analysers

    International Nuclear Information System (INIS)

    Licitra, G.; Palazzuoli, D.; Ricci, A. S.; Silvi, A. M.

    2004-01-01

    The launch of the Universal Mobile Telecommunications System (UMTS), the most recent mobile telecommunications standard, has imposed the requirement of updating measurement instrumentation and methodologies. In order to define the most reliable measurement procedure, aimed at assessing exposure to electromagnetic fields, the features of modern spectrum analysers for correct signal characterisation have been reviewed. (authors)

  18. Visual field defects after temporal lobe resection for epilepsy.

    Science.gov (United States)

    Steensberg, Alvilda T; Olsen, Ane Sophie; Litman, Minna; Jespersen, Bo; Kolko, Miriam; Pinborg, Lars H

    2018-01-01

    To determine visual field defects (VFDs) using methods of varying complexity and compare the results with subjective symptoms in a population of newly operated temporal lobe epilepsy patients. Forty patients were included in the study. Two patients failed to perform VFD testing. Humphrey Field Analyzer (HFA) perimetry was used as the gold-standard test to detect VFDs. All patients performed a web-based visual field test called Damato Multifixation Campimetry Online (DMCO). A bedside confrontation visual field examination ad modum Donders was extracted from the medical records in 27/38 patients. All participants had a consultation with an ophthalmologist. A questionnaire described the subjective complaints. A VFD in the upper quadrant was demonstrated with HFA in 29 (76%) of the 38 patients after surgery. In the 27 patients tested ad modum Donders, the sensitivity of detecting a VFD was 13%. Eight patients (21%) had a severe VFD similar to a quadrantanopia, thus calling into question their permission to drive a car. In this group of patients, a VFD was demonstrated in one of five (sensitivity=20%) ad modum Donders and in seven of eight (sensitivity=88%) with DMCO. Subjective symptoms were reported by only 28% of the patients with a VFD and in two of eight (sensitivity=25%) with a severe VFD. Most patients (86%) considered VFD information mandatory. VFDs continue to be a frequent adverse event after epilepsy surgery in the medial temporal lobe and may affect the permission to drive a car in at least one in five patients. Subjective symptoms and bedside visual field testing ad modum Donders are not sensitive enough to detect even a severe VFD. Newly developed web-based visual field test methods appear sensitive to a severe VFD, but perimetry remains the gold standard for determining whether visual standards for driving are fulfilled. Patients consider VFD information mandatory. Copyright © 2017. Published by Elsevier Ltd.

  19. Application of tailings flow analyses to field conditions

    International Nuclear Information System (INIS)

    Bryant, S.M.

    1983-01-01

    Catastrophic failures of tailings impoundments, in which liquefied tailings flow over substantial distances, pose severe hazards to the health and safety of people in downstream areas, and have a potential for economic and environmental devastation. The purpose of this study, an extension of prior investigations, was to develop procedures to measure Bingham flow parameters for mine tailings. In addition, the analytical procedures developed by Lucia (1981) and Jeyapalan (1980) for predicting the consequences of tailings flow failures were evaluated and applied to the Tenmile Tailings Pond at Climax, Colorado. Revisions in the simplified equilibrium procedure developed by Lucia (1981) make it more compatible with infinite slope solutions. Jeyapalan's model was evaluated using a simple rheological analogy, and it appears there are some numerical difficulties with the operation of the computer program TFLOW used to model the displacements and velocities of flow slides. Comparable flow distances can be determined using either model if the flow volume used in the simplified equilibrium procedure is estimated properly. When both analytical procedures were applied to the Tenmile Pond, it was concluded that there was no potential for a flow slide at the site.

  20. PTSD symptomics: network analyses in the field of psychotraumatology

    Science.gov (United States)

    Armour, Cherie; Fried, Eiko I.; Olff, Miranda

    2017-01-01

    ABSTRACT Recent years have seen increasing attention on posttraumatic stress disorder (PTSD) research. While research has largely focused on the dichotomy between patients diagnosed with mental disorders and healthy controls — in other words, investigations at the level of diagnoses — recent work has focused on psychopathology symptoms. Symptomics research in the area of PTSD has been scarce so far, although several studies have focused on investigating the network structures of PTSD symptoms. The present special issue of EJPT adds to the literature by curating additional PTSD network studies, each looking at a different aspect of PTSD. We hope that this special issue encourages researchers to conceptualize and model PTSD data from a network perspective, which arguably has the potential to inform and improve the efficacy of therapeutic interventions. PMID:29250305

  1. PTSD symptomics: network analyses in the field of psychotraumatology

    NARCIS (Netherlands)

    Armour, Cherie; Fried, Eiko I.; Olff, Miranda

    2017-01-01

    Recent years have seen increasing attention on posttraumatic stress disorder (PTSD) research. While research has largely focused on the dichotomy between patients diagnosed with mental disorders and healthy controls - in other words, investigations at the level of diagnoses - recent work has focused

  2. Analyses of spatial variations of kenaf in experimental field

    African Journals Online (AJOL)

    STORAGESEVER

    2010-03-08

    Mar 8, 2010 ... completely randomized design at Ikenne and Ilora between June and ... and plant height were independently distributed and exhibited a non stationarity principle. ... farming practices as well as land use, topography of the.

  3. PTSD symptomics: network analyses in the field of psychotraumatology.

    Science.gov (United States)

    Armour, Cherie; Fried, Eiko I; Olff, Miranda

    2017-01-01

    Recent years have seen increasing attention on posttraumatic stress disorder (PTSD) research. While research has largely focused on the dichotomy between patients diagnosed with mental disorders and healthy controls - in other words, investigations at the level of diagnoses - recent work has focused on psychopathology symptoms. Symptomics research in the area of PTSD has been scarce so far, although several studies have focused on investigating the network structures of PTSD symptoms. The present special issue of EJPT adds to the literature by curating additional PTSD network studies, each looking at a different aspect of PTSD. We hope that this special issue encourages researchers to conceptualize and model PTSD data from a network perspective, which arguably has the potential to inform and improve the efficacy of therapeutic interventions.

  4. Radiation field analyses in reactor vessels of PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Fukuya, Koji; Nakata, Hayato; Fujii, Katsuhiko; Kimura, Itsuro [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan); Ohmura, Masaki; Kitagawa, Hideo [Mitsubishi Heavy Industries, Ltd., Nuclear Energy Systems Engineering Center, Yokohama, Kanagawa (Japan); Itoh, Taku; Shin, Kazuo [Kyoto Univ. (Japan). Faculty of Engineering

    2002-09-01

    Radiation analyses in the reactor vessels of PWRs were performed using three calculation codes (the two-dimensional transport code DORT, the three-dimensional transport code TORT and the three-dimensional Monte Carlo code MCNP) and three cross-section libraries (ENDF/B-IV, ENDF/B-VI and JENDL3.2) to improve the accuracy of the estimation of neutron flux, gamma-ray flux and displacement per atom (dpa). The calculations using DORT at a surveillance position agreed with the dosimetry measurements for all three cross-section libraries. The calculated neutron spectra using the three libraries at the reactor vessels and the surveillance position were quite similar to each other. The difference in the cross sections had a small impact on the fluence estimation. The ratios of the calculations to the measurements using TORT were similar to those using DORT, indicating that TORT is applicable to radiation analysis in PWRs. The MCNP calculations showed agreement with the dosimeter measurements similar to that of the DORT calculations, but they needed a long computing time; improvement of calculation techniques is needed for the application of MCNP. The calculated dpa agreed within 10% for the three cross-section libraries. (author)

  5. The retest distribution of the visual field summary index mean deviation is close to normal.

    Science.gov (United States)

    Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz

    2016-09-01

    When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilk test determined any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilk normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed, or nearly so, in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.
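    The normality check used in this record (a Shapiro-Wilk test plus a bootstrapped 95% confidence interval for kurtosis) can be sketched in Python with scipy. The simulated retest series below merely stands in for one observer's 40 analysed MD values; the distribution parameters are assumptions for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Stand-in for one observer's retest series: 40 analysed MD values (dB),
# i.e. 42 fields with the first two excluded from analysis.
md_retests = rng.normal(loc=-0.5, scale=0.8, size=40)

# Shapiro-Wilk: a p-value above alpha means no detected departure from normality.
w_stat, p_value = stats.shapiro(md_retests)
passes_normality = p_value > 0.05

# Bootstrapped 95% confidence interval for excess kurtosis
# (0 for a normal distribution), as used in the study.
boot = [stats.kurtosis(rng.choice(md_retests, size=md_retests.size, replace=True))
        for _ in range(2000)]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
kurtosis_consistent_with_normal = ci_low <= 0.0 <= ci_high
```

    A quantile-quantile plot (e.g. `scipy.stats.probplot`) would complete the paper's three-pronged check.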

  6. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

    …have been made observable and reproducible within a physical and a distinct element numerical modelling environment (DEM). A deterministic fractal analytical comminution model (Sammis et al., 1987; Steacy and Sammis, 1991) serves as the link between field evidence gained from the deposits of natural sturzstroms, the physical model within the ETH Geotechnical Drum Centrifuge (Springman et al., 2001), and the numerical model PFC-3D (Cundall and Strack, 1979; Itasca, 2005). This approach allowed the effects of dynamic fragmentation within sturzstroms to be studied at true (macro) scale within the distinct element model, while at the same time allowing for a micro-mechanical, distinct-particle-based, and cyclic description of fragmentation, without losing significant computational efficiency. These experiments indicate rock mass and boundary conditions that allow an alternating fragmenting and dilating dispersive regime to evolve and to be sustained long enough to replicate the spreading and run-out of sturzstroms. The fragmenting-spreading model supported here is able to explain the run-out of a dry granular flow beyond the travel distance predicted by a Coulomb frictional sliding model, without resorting to explanations by mechanics that can only be valid for certain specific boundary conditions. The implications derived suggest that a sturzstrom, because of its strong relation to internal fractal fragmentation and other inertial effects, constitutes a landslide category of its own. Its mechanics differ significantly from all other gravity-driven mass flows. This proposition does not exclude the possible appearance of frictionites, Toma hills or suspension flows etc., but it considers them secondary features. The application of a fractal comminution model to describe natural and experimental sturzstrom deposits turned out to be a useful tool for sturzstrom research. Implemented within the DEM, it allows the key features of sturzstrom to be simulated successfully and…

  7. Contralateral eye comparison on changes in visual field following laser in situ keratomileusis vs photorefractive keratectomy for myopia: a randomized clinical trial.

    Science.gov (United States)

    Mostafaei, A; Sedgipour, M R; Sadeghi-Bazargani, H

    2009-12-01

    The purpose of this study was to compare changes in the visual field (VF) after laser in situ keratomileusis (LASIK) vs photorefractive keratectomy (PRK). This randomized, double-blind study involved 54 eyes of 27 myopic patients who underwent the LASIK or PRK procedure on contralateral eyes in each patient. Using the Humphrey 30-2 SITA standard program, the mean defect (MD) and pattern standard deviation (PSD) were evaluated preoperatively and three months after surgery. At the same examination, optical zone size and pupillary and corneal diameters were also evaluated. There was no clinically significant difference in PSD and MD measurements between eyes treated with LASIK or PRK in any zone, pre- or postoperatively. VF may not be affected by corneal changes induced by LASIK or PRK three months after surgery.

  8. Visual Fields at Presentation and after Trans-sphenoidal Resection of Pituitary Adenomas

    Directory of Open Access Journals (Sweden)

    Renu Dhasmana

    2011-01-01

    Purpose: To evaluate visual field changes in patients with pituitary adenomas following trans-sphenoidal surgery. Methods: Eighteen patients with pituitary adenomas underwent a complete ophthalmic assessment and visual field analysis using the Humphrey Field Analyzer 30-2 program before and after trans-sphenoidal surgical resection at the Himalayan Institute of Medical Sciences over a one-year period. Visual acuity, duration of symptoms, optic nerve head changes, pattern of visual field defects, and variables such as mean deviation and visual field index were compared. Results: Thirty-six eyes of 18 patients, including 10 male and 8 female subjects with a mean age of 35.1±9.9 years and histologically proven pituitary adenoma, were included. Mean visual acuity at presentation was 0.29 logMAR, which improved to 0.21 logMAR postoperatively (P = 0.305). Of 36 eyes, 24 (66.7%) had visual field defects, including temporal defects in 12 eyes (33.3%), non-specific defects in 10 eyes (27.8%), and peripheral field constriction in 2 eyes (5.6%). Mean deviation of visual fields at presentation was -14.28 dB, which improved to -11.32 dB postoperatively. The visual field index improved from 63.5% to 75% postoperatively. Favorable visual field outcomes were correlated with shorter duration of symptoms and absence of optic nerve head changes at presentation. Conclusion: Visual field defects were present in two thirds of patients at presentation. An overall improvement in vision and visual fields was noted after surgical resection. An inverse correlation was found between the duration of symptoms and postoperative visual field recovery, signifying the importance of early surgical intervention.

  9. Thermal and stress analyses with ANSYS program

    International Nuclear Information System (INIS)

    Kanoo, Iwao; Kawaguchi, Osamu; Asakura, Junichi.

    1975-03-01

    Some analyses of heat conduction and elastic/inelastic stresses, carried out in the Power Reactor and Nuclear Fuel Development Corporation (PNC) in fiscal 1973 using the ANSYS (Engineering Analysis System) program, are summarized. Chapter I explains the present state of structural analysis programs available for an FBR (fast breeder reactor) in PNC. Chapter II is a brief description of the current status of ANSYS. Chapter III presents 8 examples of steady-state and transient thermal analyses for fast-reactor plant components, and chapter IV presents 5 examples of inelastic structural analysis. With advances in the finite element method, its applications in design studies should expand progressively in the future. It is hoped that the present report will serve as a reference for similar analyses and at the same time help in understanding the deformation and strain behaviors of structures. (Mori, K.)

  10. How fields vary.

    Science.gov (United States)

    Krause, Monika

    2018-03-01

    Field theorists have long insisted that research needs to pay attention to the particular properties of each field studied. But while much field-theoretical research is comparative, either explicitly or implicitly, scholars have only begun to develop the language for describing the dimensions along which fields can be similar to and different from each other. In this context, this paper articulates an agenda for the analysis of variable properties of fields. It discusses variation in the degree but also in the kind of field autonomy. It discusses different dimensions of variation in field structure: fields can be more or less contested, and more or less hierarchical. The structure of symbolic oppositions in a field may take different forms. Lastly, it analyses the dimensions of variation highlighted by research on fields on the sub- and transnational scale. Post-national analysis allows us to ask how fields relate to fields of the same kind on different scales, and how fields relate to fields on the same scale in other national contexts. It allows us to ask about the role resources from other scales play in structuring symbolic oppositions within fields. A more fine-tuned vocabulary for field variation can help us better describe particular fields and it is a precondition for generating hypotheses about the conditions under which we can expect to observe fields with specified characteristics. © London School of Economics and Political Science 2017.

  11. [Transient elevation of intraocular pressure in primary open-angle glaucoma patients after automated visual field examination in the winter].

    Science.gov (United States)

    Nishino, Kazuaki; Yoshida, Fujiko; Nitta, Akari; Saito, Mieko; Saito, Kazuuchi

    2013-12-01

    To evaluate retrospectively the seasonal fluctuation of transient intraocular pressure (IOP) elevation after automated visual field examination in patients with primary open-angle glaucoma (POAG). We reviewed 53 consecutive patients with POAG who visited Kaimeido Ophthalmic and Dental Clinic from January 2011 to March 2013: 21 men and 32 women aged 67.7 ± 11.2 years. The patients were divided into 4 groups (spring, summer, autumn, and winter) according to the month of the automated visual field examination, and both eyes of each patient were enrolled. IOP was measured immediately after the automated visual field examination (vf IOP) and compared with the average IOP from the previous 3 months (pre IOP) and with the average IOP from the following 3 months (post IOP) in each season. The IOP elevation rate was defined as (vf IOP − pre IOP)/pre IOP × 100% and calculated for each season (paired t-test). Additionally, the correlation between mean deviation (MD) and IOP elevation rate was evaluated (single regression analysis). Exclusion criteria were patients who received cataract surgery during the study or had a history of any previous glaucoma surgery. The automated visual field test was performed with a Humphrey Field Analyzer and the Central 30-2 FASTPAC threshold program. The average vf IOP was 14.5 ± 2.5 mmHg, higher than the pre IOP of 13.8 ± 2.4 mmHg (p < […]) visual field examination, especially in the winter but not in the summer.
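    The elevation-rate definition in this record is a one-line computation; a minimal sketch follows (the function name is mine, not the study's):

```python
def iop_elevation_rate(vf_iop_mmhg, pre_iop_mmhg):
    """IOP elevation rate (%) as defined in the study: the IOP measured just
    after the visual field test minus the mean IOP of the previous 3 months,
    expressed relative to that 3-month mean."""
    return (vf_iop_mmhg - pre_iop_mmhg) / pre_iop_mmhg * 100.0

# With the abstract's average values (vf IOP 14.5 mmHg vs pre IOP 13.8 mmHg):
rate = iop_elevation_rate(14.5, 13.8)  # about +5.1%
```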

  12. Testing of Visual Field with Virtual Reality Goggles in Manual and Visual Grasp Modes

    Directory of Open Access Journals (Sweden)

    Dariusz Wroblewski

    2014-01-01

    Automated perimetry is used for the assessment of visual function in a variety of ophthalmic and neurologic diseases. We report development and clinical testing of a compact, head-mounted, and eye-tracking perimeter (VirtualEye) that provides a more comfortable test environment than the standard instrumentation. VirtualEye performs the equivalent of a full threshold 24-2 visual field in two modes: (1) manual, with the patient response registered with a mouse click, and (2) visual grasp, where the eye tracker senses a change in gaze direction as evidence of target acquisition. 59 patients successfully completed the test in manual mode and 40 in visual grasp mode, with 59 undergoing standard Humphrey field analyzer (HFA) testing. Large visual field defects were reliably detected by VirtualEye. Point-by-point comparison between the results obtained with the different modalities indicates: (1) minimal systematic differences between measurements taken in visual grasp and manual modes, (2) an average standard deviation of the difference distributions of about 5 dB, and (3) a systematic shift (of 4–6 dB) to lower sensitivities for the VirtualEye device, observed mostly in the high dB range. The usability survey suggested patients' acceptance of the head-mounted device. The study appears to validate the concepts of a head-mounted perimeter and the visual grasp mode.

  13. Testing of visual field with virtual reality goggles in manual and visual grasp modes.

    Science.gov (United States)

    Wroblewski, Dariusz; Francis, Brian A; Sadun, Alfredo; Vakili, Ghazal; Chopra, Vikas

    2014-01-01

    Automated perimetry is used for the assessment of visual function in a variety of ophthalmic and neurologic diseases. We report development and clinical testing of a compact, head-mounted, and eye-tracking perimeter (VirtualEye) that provides a more comfortable test environment than the standard instrumentation. VirtualEye performs the equivalent of a full threshold 24-2 visual field in two modes: (1) manual, with patient response registered with a mouse click, and (2) visual grasp, where the eye tracker senses change in gaze direction as evidence of target acquisition. 59 patients successfully completed the test in manual mode and 40 in visual grasp mode, with 59 undergoing the standard Humphrey field analyzer (HFA) testing. Large visual field defects were reliably detected by VirtualEye. Point-by-point comparison between the results obtained with the different modalities indicates: (1) minimal systematic differences between measurements taken in visual grasp and manual modes, (2) the average standard deviation of the difference distributions of about 5 dB, and (3) a systematic shift (of 4-6 dB) to lower sensitivities for VirtualEye device, observed mostly in high dB range. The usability survey suggested patients' acceptance of the head-mounted device. The study appears to validate the concepts of a head-mounted perimeter and the visual grasp mode.

  14. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles, a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack, and tests for a thin ductile metal layer bonding two ceramic blocks have also indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses…, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal.

  15. [Correlation of intraocular pressure variation after visual field examination with 24-hour intraocular pressure variations in primary open-angle glaucoma].

    Science.gov (United States)

    Noro, Takahiko; Nakamoto, Kenji; Sato, Makoto; Yasuda, Noriko; Ito, Yoshinori; Ogawa, Shumpei; Nakano, Tadashi; Tsuneoka, Hiroshi

    2014-10-01

    We retrospectively examined intraocular pressure variation after visual field examination in primary open-angle glaucoma (POAG), together with its influencing factors and its association with 24-hour intraocular pressure variation. Subjects were 94 eyes of 52 POAG patients subjected to measurement of 24-hour intraocular pressure and of the change in intraocular pressure after visual field examination using a Humphrey Visual Field Analyzer. Subjects were classified into three groups according to the magnitude of variation (large, intermediate and small), and 24-hour intraocular pressure variations were compared among the three groups. Factors influencing intraocular pressure variation after visual field examination, and those associated with the large variation group, were investigated. The average intraocular pressure variation after visual field examination was -0.28 ± 1.90 (range -6.0 to +5.0) mmHg. No significant influencing factors were identified. The intraocular pressure at 3 a.m. was significantly higher in the large variation group than in the other two groups (p < […]) visual field examination. Increases in intraocular pressure during the night might be associated with large intraocular pressure variations after visual field examination.

  16. OPHTHALMIC MANIFESTATIONS AND VISUAL FIELD CHANGES WITH SELLAR AND SUPRASELLAR TUMOURS

    Directory of Open Access Journals (Sweden)

    Arvind L.

    2015-08-01

    PURPOSE: To evaluate ocular manifestations and visual field changes in patients with sellar and suprasellar tumours. METHODS: Fifty patients with sellar and suprasellar tumours underwent a complete ophthalmic assessment and visual field analysis using the Humphrey Field Analyzer 30-2 program. Visual acuity, duration of symptoms, optic nerve head changes, and pattern of visual field defects were noted. RESULTS: 50 patients, including 15 male and 35 female subjects with a mean age of 35.1±9.9 years and CT/MRI-proven suprasellar tumours (70% pituitary adenoma and 30% craniopharyngioma), were included. 70% of cases presented with headache, 80% with diminution of vision only, 10% with hypothyroidism, and 50% with abnormal pupillary reaction including RAPD and anisocoria. Mean visual acuity at presentation was 0.46 logMAR. Of 100 eyes, 45 patients (90%) had visual field defects, including temporal defects in 35 patients (70%) and non-specific defects in 4 patients (20%), while 1 patient (10%) was without any defect. Optic nerve head changes were noted: 5 patients (25%) presented with partial optic atrophy and 10 presented with established papilloedema. Visual field outcomes were correlated with duration of symptoms, optic nerve head changes at presentation, and CT/MRI findings. CONCLUSION: Visual field defects were present in two thirds of patients at presentation. An overall deterioration in vision and visual fields was noted before surgical resection. A correlation was found between the duration of symptoms, MRI/CT scan reports and visual fields, signifying the importance of early diagnosis of neurological lesions on the basis of ophthalmic examination.

  17. Relationship between consecutive deterioration of mean deviation value and progression of visual field defect in open-angle glaucoma

    Directory of Open Access Journals (Sweden)

    Naito T

    2015-11-01

    Tomoko Naito,1 Keiji Yoshikawa,2 Shiro Mizoue,3 Mami Nanno,4 Tairo Kimura,5 Hirotaka Suzumura,6 Ryuji Takeda,7 Fumio Shiraga1 1Department of Ophthalmology, Okayama University Graduate School of Medicine, Okayama, 2Yoshikawa Eye Clinic, Tokyo, 3Department of Ophthalmology, Ehime University Graduate School of Medicine, Ehime, 4Kagurazaka Minamino Eye Clinic, 5Ueno Eye Clinic, 6Suzumura Eye Clinic, Tokyo, 7Department of Agriculture, Kinki University, Nara, Japan Purpose: To analyze the relationship between consecutive deterioration of the mean deviation (MD) value and glaucomatous visual field (VF) progression in open-angle glaucoma (OAG), including primary OAG and normal tension glaucoma. Patients and methods: The subjects of the study were patients undergoing treatment for OAG who had performed VF tests at least 10 times with a Humphrey field analyzer (SITA standard, C30-2 program). VF progression was defined by a significantly negative MD slope (MD slope worsening) at the final VF test during the follow-up period. The relationship between MD slope worsening and consecutive deterioration of the MD value was retrospectively analyzed. Results: A total of 165 eyes of 165 patients were included in the analysis. Significant progression of VF defects was observed in 72 eyes of 72 patients (43.6%), while no significant progression was evident in 93 eyes of 93 patients (56.4%). There was a significant relationship between the frequency of consecutive deterioration of the MD value and MD slope worsening (P<0.0001, Cochran–Armitage trend test). In multiple logistic regression analysis, a significant association with MD slope worsening was observed for eyes with three (odds ratio: 2.1, P=0.0224) and four (odds ratio: 3.6, P=0.0008) consecutive deteriorations of the MD value, but no significant association for eyes with two consecutive deteriorations (odds ratio: 1.1, P=0.8282). The eyes with VF progression had a significantly lower intraocular pressure reduction rate (P<0
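    The two quantities this study relates, a significantly negative MD slope over a series of at least 10 fields and runs of consecutive test-to-test MD deteriorations, can be sketched as follows. The helper names, the significance threshold, and the example series are mine for illustration; the study's exact statistics are not reproduced here.

```python
import numpy as np
from scipy import stats

def md_slope_worsening(md_values, alpha=0.05):
    """True if the linear MD trend (dB per test) over the series is
    significantly negative: the paper's criterion for VF progression."""
    fit = stats.linregress(np.arange(len(md_values)), md_values)
    return bool(fit.slope < 0 and fit.pvalue < alpha)

def max_consecutive_deteriorations(md_values):
    """Length of the longest run of consecutive test-to-test drops in MD."""
    longest = run = 0
    for prev, cur in zip(md_values, md_values[1:]):
        run = run + 1 if cur < prev else 0
        longest = max(longest, run)
    return longest

# Made-up 10-test MD series (dB) for a progressing and a stable eye.
progressing = [-2.0, -2.3, -2.1, -2.6, -2.8, -3.1, -3.0, -3.4, -3.7, -4.0]
stable      = [-1.0, -1.2, -0.9, -1.1, -1.0, -0.8, -1.1, -0.9, -1.0, -1.1]
```

    On these made-up series the progressing eye shows a significant negative slope and a longest run of three consecutive deteriorations, in line with the paper's finding that runs of three or more are associated with MD slope worsening.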

  18. Differences between Non-arteritic Anterior Ischemic Optic Neuropathy and Open Angle Glaucoma with Altitudinal Visual Field Defect.

    Science.gov (United States)

    Han, Sangyoun; Jung, Jong Jin; Kim, Ungsoo Samuel

    2015-12-01

    To investigate the differences in retinal nerve fiber layer (RNFL) change and optic nerve head parameters between non-arteritic anterior ischemic optic neuropathy (NAION) and open angle glaucoma (OAG) with altitudinal visual field defect. Seventeen NAION patients and 26 OAG patients were enrolled prospectively. The standard visual field indices (mean deviation, pattern standard deviation) were obtained from the Humphrey visual field test and differences between the two groups were analyzed. Cirrus HD-OCT parameters were used, including optic disc head analysis, average RNFL thickness, and RNFL thickness of each quadrant. The mean deviation and pattern standard deviation were not significantly different between the groups. In the affected eye, although the disc area was similar between the two groups (2.00 ± 0.32 and 1.99 ± 0.33 mm², p = 0.586), the rim area of the OAG group was smaller than that of the NAION group (1.26 ± 0.56 vs. 0.61 ± 0.15 mm² for the NAION and OAG groups, respectively). For altitudinal visual field defects, optic disc head analysis of not only the affected eye, but also the unaffected eye, by using spectral domain optical coherence tomography may be helpful.

  19. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for the successful performance of subsequent steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that errors introduced by wrong sampling and sample treatment cannot be corrected later. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  20. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
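    A minimal sketch of how sampled magnetometer readings could be reduced to an oscillation frequency, in the spirit of the analysis above. The data are synthetic and the method (upward zero crossings) is an assumption for illustration; the record does not describe Sensor Kinetics or Eureqa export formats.

    ```python
    import math

    def estimate_frequency(samples, dt):
        """Estimate oscillation frequency (Hz) from upward zero crossings
        of a mean-removed signal sampled at interval dt (seconds)."""
        mean = sum(samples) / len(samples)
        centred = [s - mean for s in samples]
        crossings = [i for i in range(1, len(centred))
                     if centred[i - 1] < 0 <= centred[i]]
        if len(crossings) < 2:
            raise ValueError("need at least two upward zero crossings")
        # average spacing between successive upward crossings = one period
        period = (crossings[-1] - crossings[0]) * dt / (len(crossings) - 1)
        return 1.0 / period

    # synthetic 5 Hz 'magnetometer' trace sampled at 100 Hz, with a DC offset
    dt = 0.01
    trace = [40.0 + 3.0 * math.sin(2 * math.pi * 5.0 * i * dt + 0.3)
             for i in range(200)]
    print(round(estimate_frequency(trace, dt), 2))  # ≈ 5.0
    ```

    For real sensor data one would low-pass filter first; the 15-20 Hz limit quoted above reflects the magnetometer's sampling rate.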

  1. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical scope, with a view to understanding cultural, sociological, design-related, business-related and many other aspects. One sub-area of this is the systemic analysis and description of products and systems. The present compendium…

  2. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings or the proof of their absence are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist many different criteria, and different variants of these criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit this setting. We show how to formally reason about and compare encodability criteria by mapping them on requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  3. Analysing Children's Drawings: Applied Imagination

    Science.gov (United States)

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  4. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses

  5. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (~10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal
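    A per-droplet growth rate like the one measured above could be estimated from time-series counts with a log-linear fit, as in this sketch. The data are synthetic and the fitting approach is an assumption for illustration; the record does not describe the MDA's actual readout pipeline.

    ```python
    import math

    def growth_rate(times_h, counts):
        """Exponential growth rate (per hour) from a log-linear least-squares
        fit, assuming counts follow N(t) = N0 * exp(r * t)."""
        logs = [math.log(c) for c in counts]
        n = len(times_h)
        mt = sum(times_h) / n
        ml = sum(logs) / n
        num = sum((t - mt) * (l - ml) for t, l in zip(times_h, logs))
        den = sum((t - mt) ** 2 for t in times_h)
        return num / den

    # synthetic doubling-every-hour series: r = ln 2 ≈ 0.693 per hour
    times = [0, 1, 2, 3, 4]
    counts = [100, 200, 400, 800, 1600]
    print(round(growth_rate(times, counts), 3))  # ≈ 0.693
    ```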

  6. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a mostly manual assembly technology for a roller bearing assembly process, executed in a big company with integrated bearing manufacturing processes. In these analyses the delay (work) sampling technique has been used to identify and divide all bearing assemblers' activities, to get information about how the 480 minutes of daily work time are distributed among the activities. The study shows some ways to increase process productivity without supplementary investment, and also indicates that process automation could be the solution to gain maximum productivity.
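    The delay (work) sampling technique mentioned above can be illustrated with a toy computation: random-instant observations of what each worker is doing are tallied, and shift time is allocated in proportion to the tallies. The activity names and counts below are invented.

    ```python
    def work_sampling(observations, shift_minutes=480):
        """Allocate shift time to activities in proportion to how often each
        activity was seen during random-instant observations (snap readings)."""
        total = len(observations)
        counts = {}
        for act in observations:
            counts[act] = counts.get(act, 0) + 1
        return {act: n / total * shift_minutes for act, n in counts.items()}

    # 100 hypothetical snap readings over one 480-minute shift
    obs = ["assemble"] * 60 + ["fetch parts"] * 20 + ["idle/delay"] * 20
    print(work_sampling(obs))
    ```

    With 60% of observations on assembly, the estimate is 288 of the 480 minutes spent assembling; the precision of such estimates grows with the number of observations.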

  7. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences… These have yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species.

  8. Recriticality analyses for CAPRA cores

    International Nuclear Information System (INIS)

    Maschek, W.; Thiem, D.

    1995-01-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  9. Recriticality analyses for CAPRA cores

    Energy Technology Data Exchange (ETDEWEB)

    Maschek, W.; Thiem, D.

    1995-08-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  10. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  11. Multifocal electroretinogram and central visual field testing in central areolar choroidal dystrophy.

    Science.gov (United States)

    Gundogan, Fatih Cakir; Dinç, Umut Asli; Erdem, Uzeyir; Ozge, Gokhan; Sobaci, Gungor

    2010-01-01

    To study the multifocal electroretinogram (mfERG) and its relation to retinal sensitivity assessed by Humphrey visual field (HVF) analysis in central areolar choroidal dystrophy (CACD). Seven eyes of 4 patients with CACD and 15 normal control subjects were examined. mfERG and central 30-2 HVF were tested for each participant. Ring analysis in mfERG was evaluated. HVF results were evaluated in 5 concentric rings in order to compare the results to the concentric ring analysis in mfERG. The differences between control subjects and patients were evaluated by the Mann-Whitney U test and the correlations were assessed by the Spearman test. Mean Snellen acuity was 0.49 ± 0.10 in patients. HVF revealed a central scotoma in 6 of 7 eyes (85.7%), whereas a paracentral scotoma extending to the fixation point was detected in 1 eye. The retinal sensitivities in the 5 concentric rings in HVF were significantly lower (p<0.001 for ring 1 to ring 4, and p=0.017 in ring 5) in CACD patients. Similarly, CACD patients had lower P1/N1 amplitudes (p<0.05) and delayed P1/N1 implicit times (p<0.05). In CACD, in the areas of scotoma detected by HVF, mfERG values were depressed. However, mfERG and HVF abnormalities were also found in ophthalmoscopically normal retinal areas.
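    The Mann-Whitney U comparison used above can be sketched from first principles. The sensitivity values below are invented toy data, not values from the study.

    ```python
    def mann_whitney_u(a, b):
        """Mann-Whitney U statistic for two independent samples: the count
        of pairs where the a-value exceeds the b-value, ties counted 0.5."""
        u = 0.0
        for x in a:
            for y in b:
                if x > y:
                    u += 1.0
                elif x == y:
                    u += 0.5
        return u

    controls = [30, 31, 29, 32, 30]   # hypothetical ring sensitivities (dB)
    patients = [22, 25, 24, 26, 23]
    print(mann_whitney_u(controls, patients))  # 25.0 -> complete separation
    ```

    U equal to the product of the two sample sizes (here 5 × 5 = 25) means every control value exceeds every patient value, the extreme that yields the smallest possible p-value for these sample sizes.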

  12. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different…

  13. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  14. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the existence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In this context, this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  15. Field transformations to multivalued fields

    Energy Technology Data Exchange (ETDEWEB)

    Kleinert, H [Institut fuer Theoretische Physik, Arnimallee 14, D-14195 Berlin (Germany)

    2007-05-15

    Changes of field variables may lead to multivalued fields which do not satisfy the Schwarz integrability conditions. Their quantum field theory needs special care as is shown in an application to the superfluid and superconducting phase transitions.

  16. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. The fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there has been no work studying these properties for bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of bipartite networks, which have two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  17. Gauge fields in a torsion field

    International Nuclear Information System (INIS)

    Rosu, Ion

    2004-01-01

    In this paper we analyse the motion and the field equations in a space with non-null curvature and torsion. In this (4+n)-dimensional space, the connection coefficients are γ^a_bc = (1/2)S^a_bc + (1/2)T^a_bc, where S^a_bc is the symmetrical part and T^a_bc are the components of the torsion tensor. We will consider that all the fields depend on x = x^α, α = 1, 2, 3, 4, and do not depend on y = y^k, k = 1, 2, ..., n. The factor S^a_bc depends on the components of the metric tensor g_αβ(x) and on the gauge fields A^s_ν(x), and the components of the torsion depend only on the gauge fields A^s_ν(x). We take into consideration the particular case for which the geodesic equations coincide with the motion equations in the presence of the gravitational and the gauge fields. In this case the field equations are Einstein equations in a (4+n)-dimensional space. We show that both the geodesic equations and the field equations can be obtained from a variational principle. (author)

  18. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated, partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality (both super-prompt power bursts and quasi steady-state power), which results in large energy deposition in the fuel during power bursts in some accident scenarios. The highest value, 418 cal g⁻¹, was obtained with SIMULATE-3K for an Oskarshamn 3 case with a reflooding rate of 2000 kg s⁻¹. In most cases, however, the predicted energy deposition was smaller, below…

  19. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  20. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
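    A minimal sketch of the kind of Monte Carlo uncertainty propagation such a plan prescribes. The dose model and parameter distributions below are invented for illustration and are not taken from the HEDR documentation.

    ```python
    import random

    def monte_carlo(model, param_dists, n=10000, seed=1):
        """Propagate parameter uncertainty through a model by simple Monte
        Carlo sampling; returns (mean, 5th percentile, 95th percentile)."""
        rng = random.Random(seed)
        outs = sorted(model(**{k: d(rng) for k, d in param_dists.items()})
                      for _ in range(n))
        return sum(outs) / n, outs[int(0.05 * n)], outs[int(0.95 * n)]

    # hypothetical dose model: dose = release * dispersion * dose_factor
    dists = {
        "release":     lambda r: r.uniform(0.8, 1.2),
        "dispersion":  lambda r: r.uniform(0.5, 1.5),
        "dose_factor": lambda r: r.lognormvariate(0.0, 0.1),
    }
    mean, lo, hi = monte_carlo(lambda release, dispersion, dose_factor:
                               release * dispersion * dose_factor, dists)
    print(mean, lo, hi)
    ```

    A sensitivity analysis would extend this by varying one parameter at a time (or computing rank correlations between inputs and output) to see which parameter dominates the spread.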

  1. The hemispherical deflector analyser revisited

    Energy Technology Data Exchange (ETDEWEB)

    Benis, E.P. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece)], E-mail: benis@iesl.forth.gr; Zouros, T.J.M. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece); Department of Physics, University of Crete, P.O. Box 2208, 71003 Heraklion, Crete (Greece)

    2008-04-15

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R₀ and the nominal value of the potential V(R₀) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.

  2. The hemispherical deflector analyser revisited

    International Nuclear Information System (INIS)

    Benis, E.P.; Zouros, T.J.M.

    2008-01-01

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R₀ and the nominal value of the potential V(R₀) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD

  3. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able not only to reason about applications at different levels of abstraction, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  4. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in two variants: the first using data offered by accounting, which lays emphasis on maximizing profit, and the second, which aims to create value. The traditional approach to performance is based on indicators derived from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.

  5. How Many Subjects are Needed for a Visual Field Normative Database? A Comparison of Ground Truth and Bootstrapped Statistics.

    Science.gov (United States)

    Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K

    2018-03-01

    The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine the mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
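    The bootstrap procedure described above can be sketched as follows. The cohort values are hypothetical, and the simple order-statistic percentile convention is a simplifying assumption, not necessarily the authors' exact method.

    ```python
    import random

    def bootstrap_limits(sensitivities, set_size, n_resamples=2000, seed=7):
        """Bootstrap estimate of the mean and the 5th/95th percentile
        normative limits that a cohort of `set_size` subjects would yield."""
        rng = random.Random(seed)
        means, los, his = [], [], []
        for _ in range(n_resamples):
            # resample with replacement from the full cohort
            sample = sorted(rng.choice(sensitivities) for _ in range(set_size))
            means.append(sum(sample) / set_size)
            los.append(sample[int(0.05 * set_size)])   # ~5th percentile
            his.append(sample[int(0.95 * set_size)])   # ~95th percentile
        avg = lambda v: sum(v) / len(v)
        return avg(means), avg(los), avg(his)

    # hypothetical cohort of healthy sensitivities (dB), values 28..34
    cohort = [28 + (i % 7) for i in range(116)]
    print(bootstrap_limits(cohort, set_size=60))
    ```

    Comparing these bootstrapped statistics against those of the full cohort, for increasing set sizes, is how one finds the smallest x at which the estimates stabilize.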

  6. Spin Structure Analyses of Antiferromagnets

    International Nuclear Information System (INIS)

    Chung, Jae Ho; Song, Young Sang; Lee, Hak Bong

    2010-05-01

    We have synthesized a series of powder samples of incommensurate antiferromagnetic multiferroics, (Mn,Co)WO₄ and Al-doped Ba₀.₅Sr₁.₅Zn₂Fe₁₂O₂₂. Their spin structure was studied by using the HRPD. In addition, we have synthesized a series of crystalline samples of incommensurate multiferroics, (Mn,Co)WO₄ and olivines. Their spin structure was investigated using neutron diffraction under high magnetic field. As a result, we were able to draw the phase diagram of (Mn,Co)WO₄ as a function of composition and temperature. We learned how the spin structure changes with increased ionic substitution. Finally, we have drawn the phase diagram of the multicritical olivine Mn₂SiS₄/Mn₂GeS₄ as a function of field and temperature through the spin structure studies

  7. Comparison between visual field defect in pigmentary glaucoma and primary open-angle glaucoma.

    Science.gov (United States)

    Nilforushan, Naveed; Yadgari, Maryam; Jazayeri, Anisalsadat

    2016-10-01

    To compare visual field defect patterns between pigmentary glaucoma and primary open-angle glaucoma. Retrospective, comparative study. Patients with a diagnosis of primary open-angle glaucoma (POAG) or pigmentary glaucoma (PG) in mild to moderate stages were enrolled in this study. Each of the 52 point locations in the total and pattern deviation plots (excluding 2 points adjacent to the blind spot) of the 24-2 Humphrey visual field, as well as six predetermined sectors, were compared using SPSS software version 20. Comparisons between the 2 groups were performed with the Student t test for continuous variables and the Chi-square test for categorical variables. Thirty-eight eyes of 24 patients with a mean age of 66.26 ± 11 years (range 48-81 years) in the POAG group and 36 eyes of 22 patients with a mean age of 50.52 ± 11 years (range 36-69 years) in the PG group were studied (P = 0.00 for the age difference). More deviation was detected in points 1, 3, 4, and 32 in total deviation (P = 0.03, P = 0.015, P = 0.018, P = 0.023) and in points 3, 4, and 32 in pattern deviation (P = 0.015, P = 0.049, P = 0.030) in the POAG group; these are the temporal parts of the field. It seems that the temporal area of the visual field in primary open-angle glaucoma is more susceptible to damage in comparison with pigmentary glaucoma.

  8. Performance of an iPad Application to Detect Moderate and Advanced Visual Field Loss in Nepal.

    Science.gov (United States)

    Johnson, Chris A; Thapa, Suman; George Kong, Yu Xiang; Robin, Alan L

    2017-10-01

    To evaluate the accuracy and efficiency of Visual Fields Easy (VFE), a free iPad app, for performing suprathreshold perimetric screening. Prospective, cross-sectional validation study. We performed screening visual fields using a calibrated iPad 2 with the VFE application on 206 subjects (411 eyes): 210 normal (NL), 183 glaucoma (GL), and 18 diabetic retinopathy (DR) at Tilganga Institute of Ophthalmology, Kathmandu, Nepal. We correlated the results with a Humphrey Field Analyzer using 24-2 SITA Standard tests on 373 of these eyes (198 NL, 160 GL, 15 DR). The number of missed locations on the VFE correlated with mean deviation (MD, r = 0.79), pattern standard deviation (PSD, r = 0.60), and number of locations that were worse than the 95% confidence limits for total deviation (r = 0.51) and pattern deviation (r = 0.68) using SITA Standard. iPad suprathreshold perimetry was able to detect most visual field deficits with moderate (MD of -6 to -12 dB) and advanced (MD worse than -12 dB) loss, but had greater difficulty in detecting early (MD better than -6 dB) loss, primarily owing to an elevated false-positive response rate. The average time to perform the Visual Fields Easy test was 3 minutes, 18 seconds (standard deviation = 16.88 seconds). The Visual Fields Easy test procedure is a portable, fast, effective procedure for detecting moderate and advanced visual field loss. Improvements are currently underway to monitor eye and head tracking during testing, reduce testing time, improve performance, and eliminate the need to touch the video screen surface. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. A knot in a filament naturally affects its configuration and properties, and may be very stable or may disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  10. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data is stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield "pure" individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis.
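
    The calibration step described above, converting an optical density image to tracer concentration via co-exposed standards of known activity, can be sketched as follows. All numbers are made up for illustration, and a linear film response is assumed purely for simplicity; real film response is nonlinear over its full range.

```python
import numpy as np

# Co-exposed standards: measured optical density vs known concentration.
std_od = np.array([0.10, 0.25, 0.50, 0.80, 1.20])    # optical density
std_conc = np.array([5.0, 12.5, 25.0, 40.0, 60.0])   # known conc. (e.g. nCi/g)

# Fit the calibration curve (assumed linear here for illustration).
slope, intercept = np.polyfit(std_od, std_conc, 1)

# Apply the calibration pixel-wise to a small synthetic "OD image".
od_image = np.array([[0.10, 0.50],
                     [0.80, 1.20]])
conc_image = slope * od_image + intercept
```

    In a real system the fitted curve would be applied to the full 512 x 512 digitized image, producing the concentration image mentioned in the abstract.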

  11. Analysing Harmonic Motions with an iPhone's Magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Temiz, Burak Kagan

    2016-01-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone's (or iPad's) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone's magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone's screen using the "Sensor Kinetics"…
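
    The analysis such an experiment calls for, reading the oscillation frequency off a magnetometer trace, can be sketched with a simple FFT peak pick. The sampling rate, field amplitude, and noise level below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Synthetic magnetometer trace of a magnet in harmonic motion.
fs = 100.0                       # sample rate (Hz), typical for phone sensors
t = np.arange(0, 10, 1 / fs)     # 10 s record
f_true = 1.5                     # oscillation frequency (Hz)
b = 40.0 + 12.0 * np.sin(2 * np.pi * f_true * t)        # field (microtesla)
b += np.random.default_rng(0).normal(0, 0.5, t.size)    # sensor noise

# Remove the static field offset, then find the dominant spectral peak.
spectrum = np.abs(np.fft.rfft(b - b.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
f_est = freqs[np.argmax(spectrum)]
```

    The frequency resolution is fs/N (0.1 Hz here), so longer records sharpen the estimate.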

  12. How does glaucoma look?: patient perception of visual field loss.

    Science.gov (United States)

    Crabb, David P; Smith, Nicholas D; Glen, Fiona C; Burton, Robyn; Garway-Heath, David F

    2013-06-01

    To explore patient perception of vision loss in glaucoma and, specifically, to test the hypothesis that patients do not recognize their impairment as a black tunnel effect or as black patches in their field of view. Clinic-based cross-sectional study. Fifty patients (age range, 52-82 years) with visual acuity better than 20/30 and with a range of glaucomatous visual field (VF) defects in both eyes, excluding those with very advanced disease (perimetrically blind). Participants underwent monocular VF testing in both eyes using a Humphrey Field Analyzer (HFA; Carl Zeiss Meditec, Dublin, CA; 24-2 Swedish interactive threshold algorithm standard tests) and other tests of visual function. Participants took part in a recorded interview during which they were asked if they were aware of their VF loss; if so, they were encouraged to describe it in their own words. Participants were shown 6 images modified in a variety of ways on a computer monitor and were asked to select the image that most closely represented their perception of their VF loss. Forced choice of an image best representing glaucomatous vision impairment. Participants had a range of VF defect severity: average HFA mean deviation was -8.7 dB (standard deviation [SD], 5.8 dB) and -10.5 dB (SD, 7.1 dB) in the right and left eyes, respectively. Thirteen patients (26%; 95% confidence interval [CI], 15%-40%) reported being completely unaware of their vision loss. None of the patients chose the images with a distinct black tunnel effect or black patches. Only 2 patients (4%; 95% CI, 0%-14%) chose the image with a tunnel effect with blurred edges. An image depicting blurred patches and another with missing patches were chosen by 54% (95% CI, 39%-68%) and 16% (95% CI, 7%-29%) of the patients, respectively. Content analysis of the transcripts from the recorded interviews indicated a frequent use of descriptors of visual symptoms associated with reported blur and missing features.
Patients with glaucoma do not perceive

  13. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. 
The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding
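
    The self-limiting character of such a super-prompt power burst, with Doppler feedback terminating the excursion and the power settling to a delayed-neutron-driven quasi steady state, can be illustrated with a toy point-kinetics model. This is only a sketch with invented parameter values; the SARA analyses used full core simulators (SIMULATE-3K, APROS, RECRIT), not anything like this.

```python
import numpy as np

# One delayed-neutron group, adiabatic fuel heat-up, linear Doppler feedback.
beta, Lam, lam = 0.005, 2e-5, 0.08   # delayed fraction, generation time (s), precursor decay (1/s)
rho0, alpha = 0.007, -2e-5           # inserted reactivity; Doppler coefficient (1/K)
kappa = 1.0                          # adiabatic heating (K per unit power*second)

dt, n = 1e-5, 30000                  # 0.3 s of transient, explicit Euler
P, T = 1.0, 0.0                      # relative power, fuel temperature rise (K)
C = beta * P / (Lam * lam)           # precursors at initial equilibrium
history = np.empty(n)
for i in range(n):
    rho = rho0 + alpha * T           # Doppler pushes reactivity down as fuel heats
    dP = ((rho - beta) / Lam) * P + lam * C
    dC = (beta / Lam) * P - lam * C
    P += dt * dP
    C += dt * dC
    T += dt * kappa * P              # fuel heats adiabatically during the burst
    history[i] = P

peak = history.max()                 # prompt burst peak
final = history[-1]                  # delayed-neutron-supported quasi steady state
```

    With these (illustrative) numbers the power rises by several decades, is clipped by Doppler feedback, and relaxes to a much lower quasi steady-state level, qualitatively the behaviour the three codes predicted.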

  14. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s{sup -1} injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g{sup -1}, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s{sup -1}. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. 
The highest calculated

  15. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg/s injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg/s. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. 
The highest calculated quasi steady

  16. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window, unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  17. Visual field

    Science.gov (United States)

    ... your visual field. How the Test is Performed Confrontation visual field exam. This is a quick and ...

  18. Persistence, spatial distribution and implications for progression detection of blind parts of the visual field in glaucoma: a clinical cohort study.

    Directory of Open Access Journals (Sweden)

    Francisco G Junoy Montolio

    BACKGROUND: Visual field testing is an essential part of glaucoma care. It is hampered by variability related to the disease itself, response errors and fatigue. In glaucoma, blind parts of the visual field contribute to the diagnosis but--once established--not to progression detection; they only increase testing time. The aims of this study were to describe the persistence and spatial distribution of blind test locations in standard automated perimetry in glaucoma and to explore how the omission of presumed blind test locations would affect progression detection. METHODOLOGY/PRINCIPAL FINDINGS: Data from 221 eyes of 221 patients from a cohort study with the Humphrey Field Analyzer with 30-2 grid were used. Patients were stratified according to baseline mean deviation (MD) in six strata of 5 dB width each. For one, two, three and four consecutive 0.1 for all strata. Omitting test locations with three consecutive <0 dB sensitivities at baseline did not affect the performance of the MD-based Nonparametric Progression Analysis progression detection algorithm. CONCLUSIONS/SIGNIFICANCE: Test locations that have been shown to be reproducibly blind tend to display a reasonable blindness persistence and no longer contribute to progression detection. There is no clinically useful universal MD cut-off value beyond which testing can be limited to 10 degree eccentricity.

  19. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  20. Geophysical Field Theory

    International Nuclear Information System (INIS)

    Eloranta, E.

    2003-11-01

    The geophysical field theory includes the basic principles of electromagnetism, continuum mechanics, and potential theory upon which the computational modelling of geophysical phenomena is based. Vector analysis is the main mathematical tool in the field analyses. Electrostatics, stationary electric current, magnetostatics, and electrodynamics form a central part of electromagnetism in geophysical field theory. Potential theory concerns especially gravity, but also electrostatics and magnetostatics. Solid state mechanics and fluid mechanics are central parts of continuum mechanics. The theories of elastic waves and rock mechanics also belong to geophysical solid state mechanics. The theories of geohydrology and mass transport form one central field theory in geophysical fluid mechanics. Heat transfer is also included in continuum mechanics. (orig.)

  1. Relationship between macular ganglion cell complex parameters and visual field parameters after tumor resection in chiasmal compression.

    Science.gov (United States)

    Ohkubo, Shinji; Higashide, Tomomi; Takeda, Hisashi; Murotani, Eiji; Hayashi, Yasuhiko; Sugiyama, Kazuhisa

    2012-01-01

    To evaluate the relationship between macular ganglion cell complex (GCC) parameters and visual field (VF) parameters in chiasmal compression and the potential for GCC parameters in order to predict the short-term postsurgical VF. Twenty-three eyes of 12 patients with chiasmal compression and 33 control eyes were studied. All patients underwent transsphenoidal tumor resection. Before surgery a 3D scan of the macula was taken using spectral-domain optical coherence tomography. All patients underwent Humphrey 24-2 VF testing after surgery. Spearman's rank correlation coefficients were used to evaluate the relationship between the GCC parameters and VF parameters [mean deviation (MD), pattern standard deviation]. Coefficients of determination (R2) were calculated using linear regression. Average thickness in the patients was significantly thinner than that of controls. Average thickness, global loss volume and focal loss volume (FLV) significantly correlated with the MD. We observed the greatest R2 between FLV and MD. Examining the macular GCC was useful for evaluating structural damage in patients with chiasmal compression. Preoperative GCC parameters, especially FLV, may be useful in predicting visual function following surgical decompression of chiasmal compression.
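
    The statistic the study relies on, Spearman's rank correlation between a GCC parameter and MD, is simply the Pearson correlation of the ranks. A minimal sketch is below; it assumes no tied values, and the data are synthetic, invented for illustration.

```python
import numpy as np

def spearman(x, y):
    # Rank each sample (0..n-1; ties not handled in this minimal version),
    # then compute the Pearson correlation of the ranks.
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.sum(rx * ry) / np.sqrt(np.sum(rx**2) * np.sum(ry**2)))

# Monotone but nonlinear relationship: hypothetical focal loss volume (%)
# versus MD (dB), where MD worsens (grows more negative) with FLV.
flv = np.array([0.2, 0.5, 1.1, 2.3, 4.0, 6.5, 9.8])
md = -np.exp(flv / 3.0)

rho = spearman(flv, md)
```

    Because Spearman's rho depends only on ranks, a perfectly monotone decreasing relationship like this one yields rho = -1 even though the relationship is not linear, which is why it suits structure-function data.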

  2. CFD analyses of coolant channel flowfields

    Science.gov (United States)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.
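
    The claim that the channel length is about twice the hydrodynamic entrance length can be checked with one commonly quoted correlation for turbulent entrance length, L_e ≈ 4.4 D Re^(1/6). The diameter and Reynolds number below are illustrative stand-ins, not values taken from the study, and the correlation itself is a textbook constant-property estimate, which is exactly why the strong property variations discussed above prevent fully developed flow in practice.

```python
# Turbulent hydrodynamic entrance length estimate (constant-property fluid).
D = 1.0e-3       # hydraulic diameter (m), illustrative coolant-channel scale
Re = 1.0e6       # Reynolds number, representative of high-Re channel flow

L_e = 4.4 * D * Re ** (1 / 6)   # entrance length (m), L_e/D ~ 4.4 Re^(1/6)
L_channel = 2 * L_e             # "channel length ~ twice the entrance length"
```

    For these numbers L_e/D is about 44, i.e. a few centimetres of development length for a millimetre-scale channel.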

  3. Magnetic Field

    DEFF Research Database (Denmark)

    Olsen, Nils

    2015-01-01

    The Earth has a large and complicated magnetic field, the major part of which is produced by a self-sustaining dynamo operating in the fluid outer core. Magnetic field observations provide one of the few tools for remote sensing the Earth’s deep interior, especially regarding the dynamics of the fluid flow at the top of the core. However, what is measured at or near the surface of the Earth is the superposition of the core field and fields caused by magnetized rocks in the Earth’s crust, by electric currents flowing in the ionosphere, magnetosphere, and oceans, and by currents induced in the Earth by time-varying external fields. These sources have their specific characteristics in terms of spatial and temporal variations, and their proper separation, based on magnetic measurements, is a major challenge. Such a separation is a prerequisite for remote sensing by means of magnetic field...

  4. Urban Fields in the making

    DEFF Research Database (Denmark)

    Hovgesen, Henrik Harder; Nielsen, Thomas Alexander Sick

    cities and accentuates the concept of the 'urban field' suggested by John Friedmann (1978). The concept of the 'urban field' suggests that mobility has been democratized and increased to a level where several cities can be part of the same functionally integrated urban field. As a consequence, the significance of the single urban centre and the city as an entity will change markedly. This paper aims to analyse the development towards urban travel- and commuter fields in Denmark. The question asked is to what degree urban fields are emerging, and what is the speed of this development.

  5. Phase field

    International Nuclear Information System (INIS)

    Radhakrishnan, B.; Gorti, S.B.; Clarno, K.; Tonks, M.R.

    2015-01-01

    In this work, the phase-field method and its application to microstructure evolution in reactor fuel and clad are discussed. The examples highlight the capability of the phase-field method to capture evolution processes that are influenced by both thermal and elastic stress fields that are caused by microstructural changes in the solid-state. The challenges that need to be overcome before the technique can become predictive and material-specific are discussed. (authors)
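
    The core of any phase-field model is an order parameter relaxing toward its bulk values across a diffuse interface. A minimal 1D Allen-Cahn sketch is below (dphi/dt = eps^2 phi_xx + phi - phi^3, explicit finite differences); all parameters are illustrative and this is a generic textbook example, not the fuel/clad models discussed above.

```python
import numpy as np

# 1D Allen-Cahn relaxation: phi -> +/-1 bulk phases with a diffuse interface.
eps, dx, dt = 0.1, 0.1, 0.01           # interface width scale, grid, time step
x = np.arange(0.0, 10.0, dx)
phi = np.tanh((x - 5.0) / (np.sqrt(2.0) * eps))   # equilibrium kink profile

for _ in range(500):
    # Centered second difference; crude no-flux treatment at the two ends.
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    lap[0] = lap[-1] = 0.0
    phi = phi + dt * (eps**2 * lap + phi - phi**3)
```

    The explicit step must satisfy dt <= dx^2 / (2 eps^2) for stability; coupling such an equation to thermal and elastic fields is what the reactor-fuel applications above add on top of this skeleton.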

  6. Considerations on field problem structure

    International Nuclear Information System (INIS)

    Pavelescu, M.

    1977-01-01

    A survey of the three field problem types known today: equilibrium, eigenvalue and propagation problems, is presented. The place occupied by the neutron field in nuclear reactor systems, in both statics and dynamics, is shown. The special class of approximate solution methods for solving field and boundary equations is analysed. The residual and variational methods, as well as the finite element method, which is of special interest, are examined. (author)

  7. Finite quantum field theories

    International Nuclear Information System (INIS)

    Lucha, W.; Neufeld, H.

    1986-01-01

    We investigate the relation between finiteness of a four-dimensional quantum field theory and global supersymmetry. To this end we consider the most general quantum field theory and analyse the finiteness conditions resulting from the requirement of the absence of divergent contributions to the renormalizations of the parameters of the theory. In addition to the gauge bosons, both fermions and scalar bosons turn out to be a necessary ingredient in a non-trivial finite gauge theory. In all cases discussed, the supersymmetric theory restricted by two well-known constraints on the dimensionless couplings proves to be the unique solution of the finiteness conditions. (Author)

  8. Velocity Analysis Using Nonhyperbolic Move-Out in Anisotropic Media of Arbitrary Symmetry: Synthetic and Field Data Studies Analyse de vitesse par correction non hyperbolique des indicatrices dans les milieux anisotropes de symétrie arbitraire : étude sur des données synthétiques et réelles

    Directory of Open Access Journals (Sweden)

    Tabti H.

    2006-12-01

    A robust method for estimating the interval parameters (i.e., the normal move-out velocity Vnmo and the anisotropy parameter h) of horizontally layered transversely isotropic media from reflected P-wave data has been recently proposed by Alkhalifah (1997), based on the move-out equation from Tsvankin and Thomsen (1994). The method, tested on synthetic and field data, is based first on semblance analysis of nonhyperbolic (i.e., long spread) move-out for the estimation of the effective parameters, and then on a layer stripping process. Sayers and Ebrom (1997) recently proposed another nonhyperbolic traveltime equation and a corresponding interval velocity analysis which can be used for azimuthally anisotropic layered media. Their method was tested on synthetic and physical model data in homogeneous anisotropic media of various symmetry. Here we propose a generalization of the method proposed by Alkhalifah, which can deal with arbitrarily, but moderately (i.e., anisotropy strength of roughly 20%), anisotropic layered media. The parametrization is a natural extension of the parametrization used by the previous author and is based on the generalized Thomsen's parameters (Thomsen, 1986) proposed by Mensch and Rasolofosaon (1997). The method is first applied to synthetic data on a six-layer model of contrasted anisotropy (type and magnitude). The robustness of the method is demonstrated. All the interval parameters (here Vnmo and the horizontal velocity Vh) are estimated with reasonable errors (typically
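
    The kind of parameter estimation discussed here can be sketched as a grid search over (Vnmo, eta) against long-spread traveltimes, using the widely quoted Alkhalifah-Tsvankin nonhyperbolic moveout approximation for a VTI layer: t^2 = t0^2 + x^2/V^2 - 2*eta*x^4 / (V^2 * (t0^2*V^2 + (1+2*eta)*x^2)). All numbers are synthetic and this is not the authors' implementation (which uses semblance analysis and layer stripping).

```python
import numpy as np

def moveout_t2(x, t0, v, eta):
    # Alkhalifah-Tsvankin nonhyperbolic moveout approximation (squared time).
    return (t0**2 + x**2 / v**2
            - 2 * eta * x**4 / (v**2 * (t0**2 * v**2 + (1 + 2 * eta) * x**2)))

# Synthetic "observed" traveltimes for one VTI layer.
t0, v_true, eta_true = 1.0, 2000.0, 0.10     # s, m/s, dimensionless
offsets = np.linspace(100.0, 4000.0, 40)     # long spread, in metres
t_obs = np.sqrt(moveout_t2(offsets, t0, v_true, eta_true))

# Brute-force grid search minimizing the traveltime misfit.
best, best_misfit = None, np.inf
for v in np.arange(1800.0, 2201.0, 10.0):
    for eta in np.arange(0.0, 0.21, 0.01):
        misfit = np.sum((np.sqrt(moveout_t2(offsets, t0, v, eta)) - t_obs) ** 2)
        if misfit < best_misfit:
            best, best_misfit = (v, eta), misfit
v_est, eta_est = best
```

    On noise-free data the search recovers the true pair exactly at the grid resolution; the essential point, visible by experimenting with the offset range, is that eta is only constrained when offsets well beyond the reflector depth are included, which is why long-spread (nonhyperbolic) moveout is needed.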

  9. Clinical study of the visual field defects caused by occipital lobe lesions.

    Science.gov (United States)

    Ogawa, Katsuhiko; Ishikawa, Hiroshi; Suzuki, Yutaka; Oishi, Minoru; Kamei, Satoshi

    2014-01-01

    The central visual field is projected to the region from the occipital tip to the posterior portion of the medial area in the striate cortex. However, central visual field disturbances have not been compared with the location of the lesions in the striate cortex. Thirteen patients with visual field defects caused by partial involvement of the striate cortex were enrolled. The lesions were classified according to their location into the anterior portion, the posterior portion of the medial area, and the occipital tip. Visual field defects were examined by the Goldmann perimetry, the Humphrey perimetry and the auto-plot tangent screen. We defined a defect within the central 10° of vision as a central visual field disturbance. The visual field defects in 13 patients were compared with the location of their lesions in the striate cortex. The medial area was involved in 7 patients with no involvement of the occipital tip. In 2 of them, peripheral homonymous hemianopia without central visual field disturbance was shown, and their lesions were located only in the anterior portion. One patient with a lesion in the posterior portion alone showed incomplete central homonymous hemianopia. Three of 4 patients with lesions located in both the anterior and posterior portions of the medial area showed incomplete central homonymous hemianopia and peripheral homonymous hemianopia. The occipital tip was involved in 6 patients. Five of them had small lesions in the occipital tip alone and showed complete central homonymous hemianopia or quadrantanopia. The other patient with a lesion in the lateral posterior portion and bilateral occipital tip lesions showed bilateral slight peripheral visual field disturbance in addition to complete central homonymous hemianopia on both sides. Lesions in the posterior portion of the medial area as well as the occipital tip caused central visual field disturbance in our study, as indicated in previous reports. 
Central homonymous hemianopia tended to

  10. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... An unsupervised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but did remarkably worse for Finnish and Turkish.

  11. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how a CPM network can be used for analysing complex problems in engineering projects.

  12. Field Report

    DEFF Research Database (Denmark)

    Gorm Hansen, Louise Lyngfeldt

    2012-01-01

    This field report expresses perfectly the kind of confusion almost all of us experience when entering the field. How do we know whether what we’re doing is “right” or not? What in particular should we record when we don’t have time to write down everything among all the myriad impressions thrusting...

  13. Visual field defects of the contralateral eye of non-arteritic ischemic anterior optic neuropathy: are they related to sleep apnea?

    Science.gov (United States)

    Aptel, Florent; Aryal-Charles, Nischal; Tamisier, Renaud; Pépin, Jean-Louis; Lesoin, Antoine; Chiquet, Christophe

    2017-06-01

    To evaluate whether obstructive sleep apnea (OSA) is responsible for the visual field defects found in the fellow eyes of patients with non-arteritic ischemic optic neuropathy (NAION). Prospective cross-sectional study. The visual fields of the fellow eyes of NAION subjects with OSA were compared to the visual fields of control OSA patients matched for OSA severity. All patients underwent comprehensive ophthalmological and general examination including Humphrey 24-2 SITA-Standard visual field and polysomnography. Visual field defects were classified according to the Ischemic Optic Neuropathy Decompression Trial (IONDT) classification. From a cohort of 78 consecutive subjects with NAION, 34 unaffected fellow eyes were compared to 34 control eyes of subjects matched for OSA severity (apnea-hypopnea index [AHI] 35.5 ± 11.6 vs 35.4 ± 9.4 events per hour, respectively, p = 0.63). After adjustment for age and body mass index, all visual field parameters were significantly different between the NAION fellow eyes and those of the control OSA group, including mean deviation (-4.5 ± 3.7 vs -1.3 ± 1.8 dB, respectively, p < 0.05), visual field index (91.6 ± 10 vs 97.4 ± 3.5%, respectively, p = 0.002), pattern standard deviation (3.7 ± 2.3 vs 2.5 ± 2 dB, respectively, p = 0.015), and number of subjects with at least one defect on the IONDT classification (20 vs 10, respectively, p < 0.05). OSA alone does not explain the visual field defects frequently found in the fellow eyes of NAION patients.

  14. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  15. Analysing public relations education through international standards: The Portuguese case

    OpenAIRE

    Gonçalves, Gisela Marques Pereira; Spínola, Susana de Carvalho; Padamo, Celma

    2013-01-01

    By using international reports on PR education as a benchmark we analyse the status of PR higher education in Portugal. Despite differences among the study programs, the findings reveal that the five standard courses recommended by the Commission on Public Relations Education (CPRE) are part of the Portuguese undergraduate curriculum. This includes 12 of the 14 content field guidelines needed to achieve the ideal master's program. Data shows, however, the difficulty of positioning public rel...

  16. Novel Space Exploration Technique for Analysing Planetary Atmospheres

    OpenAIRE

    Dekoulis, George

    2010-01-01

    The chapter presents a new reconfigurable wide-beam radio interferometer system for analysing planetary atmospheres. The system operates at frequencies where the ionisation of the planetary plasma regions induces strong attenuation. For Earth, the attenuation is indistinguishable from the CMB at frequencies over 50 MHz. The system introduces a set of advanced specifications to this field of science, previously unseen in similar suborbital experiments. The reprogrammable dynamic range of the ...

  17. Visual field defects and retinal nerve fiber imaging in patients with obstructive sleep apnea syndrome and in healthy controls.

    Science.gov (United States)

    Casas, Paula; Ascaso, Francisco J; Vicente, Eugenio; Tejero-Garcés, Gloria; Adiego, María I; Cristóbal, José A

    2018-03-02

    To assess the retinal sensitivity in obstructive sleep apnea hypopnea syndrome (OSAHS) patients evaluated with standard automated perimetry (SAP), and to correlate the functional SAP results with structural parameters obtained with optical coherence tomography (OCT). This prospective, observational, case-control study consisted of 63 eyes of 63 OSAHS patients (mean age 51.7 ± 12.7 years, best corrected visual acuity ≥20/25, refractive error less than three spherical or two cylindrical diopters, and intraocular pressure < 21 mmHg) who were enrolled and compared with 38 eyes of 38 age-matched controls. Peripapillary retinal nerve fiber layer (RNFL) thickness was measured by Stratus OCT, and SAP sensitivities and indices were explored with the Humphrey Field Analyzer perimeter. Correlations between functional and structural parameters were calculated, as well as the relationship between ophthalmologic and systemic indices in OSAHS patients. OSAHS patients showed a significant reduction of sensitivity in the superior visual field division (p = 0.034, Student's t-test). When dividing the OSAHS group according to the severity of the disease, nasal peripapillary RNFL thickness was significantly lower in severe OSAHS than in controls and mild-moderate cases (p = 0.031 and p = 0.016 respectively, Mann-Whitney U test). There were no differences between groups for SAP parameters. We found no correlation between structural and functional variables. The central visual field sensitivity of the SAP showed a poor Pearson correlation with the apnea-hypopnea index (0.284, p = 0.024). Retinal sensitivity shows minor differences between healthy subjects and OSAHS patients. Functional deterioration in OSAHS patients is not easy to demonstrate with visual field examination.

  18. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Science.gov (United States)

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to quickly and with high mass accuracy determine the presence of unknown chemical residues in samples. For years, the field has been limited by mass spectrometric methods that were based on knowing what compounds of interest were. Secondly, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts also could dilute the sample sufficiently to minimize the ionization changes from varied matrices. PMID:26784175
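
    The "high mass accuracy" idea can be illustrated with a minimal sketch: enumerate elemental compositions whose exact monoisotopic mass falls within a few ppm of a measured value. The monoisotopic masses are standard values; the element ranges and the example mass (the monoisotopic mass of caffeine) are illustrative choices, not from the article.

```python
# Sketch of formula assignment by mass accuracy: brute-force CHNO search
# for compositions matching a measured neutral monoisotopic mass.
MASS = {"C": 12.0, "H": 1.00782503, "N": 14.00307401, "O": 15.99491462}

def candidate_formulas(measured, ppm=2.0):
    """Return (nC, nH, nN, nO) tuples within `ppm` of `measured` (Da)."""
    tol = measured * ppm * 1e-6        # ppm tolerance in daltons
    hits = []
    for c in range(13):
        for h in range(21):
            for n in range(6):
                for o in range(6):
                    m = (c * MASS["C"] + h * MASS["H"]
                         + n * MASS["N"] + o * MASS["O"])
                    if abs(m - measured) <= tol:
                        hits.append((c, h, n, o))
    return hits

# 194.08038 Da matches caffeine, C8H10N4O2, at sub-ppm error.
hits = candidate_formulas(194.08038)
print(hits)
```

    At 2 ppm the candidate list is very short; relax the tolerance to, say, 50 ppm and many spurious compositions appear, which is exactly why high resolving power matters for unknowns.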

  19. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Directory of Open Access Journals (Sweden)

    Lucy Lim

    2016-01-01

    Full Text Available Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to quickly and with high mass accuracy determine the presence of unknown chemical residues in samples. For years, the field has been limited by mass spectrometric methods that were based on knowing what compounds of interest were. Secondly, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts also could dilute the sample sufficiently to minimize the ionization changes from varied matrices.

  20. Field Notes

    Data.gov (United States)

    US Agency for International Development — This is a mobile application for capturing images , data, and geolocation for USAID projects in the field. The data is then stored on a server in AllNet. The...

  1. Selfdecomposable Fields

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole E.; Sauri, Orimar; Szozda, Benedykt

    In the present paper we study selfdecomposability of random fields, as defined directly rather than in terms of finite-dimensional distributions. The main tools in our analysis are the master Lévy measure and the associated Lévy-Itô representation. We give the dilation criterion for selfdecomposability analogous to the classical one. Next, we give necessary and sufficient conditions (in terms of the kernel functions) for a Volterra field driven by a Lévy basis to be selfdecomposable. In this context we also study the so-called Urbanik classes of random fields. We follow this with the study of existence and selfdecomposability of integrated Volterra fields. Finally, we introduce infinitely divisible field-valued Lévy processes, give the Lévy-Itô representation associated with them and study stochastic integration with respect to such processes. We provide examples in the form of Lévy...

  2. Selfdecomposable Fields

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole E.; Sauri, Orimar; Szozda, Benedykt

    In the present paper we study selfdecomposability of random fields, as defined directly rather than in terms of finite-dimensional distributions. The main tools in our analysis are the master Lévy measure and the associated Lévy-Itô representation. We give the dilation criterion for selfdecomposability analogous to the classical one. Next, we give necessary and sufficient conditions (in terms of the kernel functions) for a Volterra field driven by a Lévy basis to be selfdecomposable. In this context we also study the so-called Urbanik classes of random fields. We follow this with the study of existence and selfdecomposability of integrated Volterra fields. Finally, we introduce infinitely divisible field-valued Lévy processes, give the Lévy-Itô representation associated with them and study stochastic integration with respect to such processes. We provide examples in the form of Lévy...

  3. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
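
    The book's framework of local averages can be illustrated with a small deterministic sketch: for a stationary discrete process with correlation ρ(τ) = r^|τ|, the variance of an n-point local average shrinks as the averaging window grows (the "variance function" idea). The correlation model and parameter here are illustrative assumptions.

```python
# Variance reduction of local averages of a stationary discrete process
# with correlation rho(tau) = r**|tau| (an AR(1)-like model).
def variance_reduction(n, r):
    """Var of the n-point local average divided by the point variance."""
    # Var[mean] = (1/n^2) * sum_{i,j} Cov(X_i, X_j) / Var(X)
    total = sum(r ** abs(i - j) for i in range(n) for j in range(n))
    return total / n**2

for n in (1, 4, 16, 64):
    print(n, variance_reduction(n, 0.5))
```

    For r = 0.5 the reduction factor drops from 1 at n = 1 toward 0 as n grows, but more slowly than the 1/n rate of an uncorrelated sequence; that gap is exactly what the variance function quantifies.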

  4. The necessity for comparative risk analyses as seen from the political point of view

    International Nuclear Information System (INIS)

    Steger, U.

    1981-01-01

    The author describes the current insufficient utilization of risk analyses in the political decision process and investigates whether other technologies encounter the same difficulties of acceptance as in the nuclear energy field. As this seems likely, he tries to determine what contribution comparative risk analyses could make to the process of democratic will-formation so that new technologies are accepted. First the author establishes theses criticizing the recent scientific efforts made in the field of risk analyses and their usability for the political decision process. He then defines the criteria risk analyses have to meet in order to serve as scientific elements for consultative political discussions. (orig./HP) [de

  5. The role of hemifield sector analysis in multifocal visual evoked potential objective perimetry in the early detection of glaucomatous visual field defects

    Directory of Open Access Journals (Sweden)

    Mousa MF

    2013-05-01

    Full Text Available Mohammad F Mousa,1 Robert P Cubbidge,2 Fatima Al-Mansouri,1 Abdulbari Bener3,4 1Department of Ophthalmology, Hamad Medical Corporation, Doha, Qatar; 2School of Life and Health Sciences, Aston University, Birmingham, UK; 3Department of Medical Statistics and Epidemiology, Hamad Medical Corporation, Department of Public Health, Weill Cornell Medical College, Doha, Qatar; 4Department Evidence for Population Health Unit, School of Epidemiology and Health Sciences, University of Manchester, Manchester, UK Objective: The purpose of this study was to examine the effectiveness of a new analysis method of mfVEP objective perimetry in the early detection of glaucomatous visual field defects compared to the gold standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two standard 24-2 visual field tests, one with the Humphrey Field Analyzer, and a single mfVEP test in one session. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal to noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (analysis of variance, P < 0.001), with 95% confidence intervals of 2.82-2.89 for the normal group, 2.25-2.29 for the glaucoma suspect group, and 1.67-1.73 for the glaucoma group. The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma patient group (t-test, P < 0.001), statistically significant in 5/11 pairs of sectors and hemi-rings in the glaucoma suspect group (t-test, P < 0.01), and statistically significant in only 1/11 pairs (t-test, P < 0.9). The sensitivity and specificity of the hemifield sector analysis protocol in detecting glaucoma was 97% and 86

  6. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  7. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  8. Field theories with subcanonical fields

    International Nuclear Information System (INIS)

    Bigi, I.I.Y.

    1976-01-01

    The properties of quantum field theories with spinor fields of dimension less than the canonical value of 3/2 are studied. As a starting point for the application of common perturbation theory we look for the linear version of these theories. A gauge interaction is introduced and the renormalizability of the theory is shown with the aid of power counting. It follows that in the case of a spinor field with negative dimension renormalization can only be attained if the interaction has a further symmetry. By this symmetry the theory is determined in an unequivocal way. The gauge interaction introduced in the theory leads to a spontaneous breakdown of scale invariance whereby masses are produced. At the same time the spinor-field operators can now be separated into two orthogonal sections with opposite norm. It is proposed to use the section with negative (positive) norm to describe hadrons (leptons) respectively. (orig./WL) [de

  9. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  10. Methodological Quality Assessment of Meta-analyses in Endodontics.

    Science.gov (United States)

    Kattan, Sereen; Lee, Su-Min; Kohli, Meetu R; Setzer, Frank C; Karabucak, Bekir

    2018-01-01

    The objectives of this review were to assess the methodological quality of published meta-analyses related to endodontics using the assessment of multiple systematic reviews (AMSTAR) tool and to provide a follow-up to previously published reviews. Three electronic databases were searched for eligible studies according to the inclusion and exclusion criteria: Embase via Ovid, The Cochrane Library, and Scopus. The electronic search was supplemented by a hand search of 6 dental journals (International Endodontic Journal; Journal of Endodontics; Australian Endodontic Journal; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology; Endodontics and Dental Traumatology; and Journal of Dental Research). The searches were conducted to include articles published after July 2009, and the deadline for inclusion of the meta-analyses was November 30, 2016. The AMSTAR assessment tool was used to evaluate the methodological quality of all included studies. A total of 36 reports of meta-analyses were included. The overall quality of the meta-analyses reports was found to be medium, with an estimated mean overall AMSTAR score of 7.25 (95% confidence interval, 6.59-7.90). The most poorly assessed areas were providing an a priori design, the assessment of the status of publication, and publication bias. In recent publications in the field of endodontics, the overall quality of the reported meta-analyses is medium according to AMSTAR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. [Methods, challenges and opportunities for big data analyses of microbiome].

    Science.gov (United States)

    Sheng, Hua-Fang; Zhou, Hong-Wei

    2015-07-01

    Microbiome is a novel research field related to a variety of chronic inflammatory diseases. Technically, there are two major approaches to the analysis of the microbiome: metataxonomics, by sequencing the 16S rRNA variable tags, and metagenomics, by shot-gun sequencing of the total microbial (mainly bacterial) genome mixture. The 16S rRNA sequencing analysis pipeline includes sequence quality control, diversity analyses, taxonomy and statistics; metagenome analysis further includes gene annotation and functional analyses. With the development of sequencing techniques, the cost of sequencing will decrease, and big data analyses will become the central task. Data standardization, accumulation, modeling and disease prediction are crucial for future exploitation of these data. Meanwhile, the informational properties of these data, and functional verification with culture-dependent and culture-independent experiments, remain the focus of future research. Studies of the human microbiome will bring a better understanding of the relations between the human body and the microbiome, especially in the context of disease diagnosis and therapy, which promises rich research opportunities.
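
    As a toy illustration of the "diversity analyses" step of a 16S pipeline, the Shannon index can be computed directly from a vector of OTU/ASV counts; the count vectors below are invented examples, not real community data.

```python
import math

def shannon(counts):
    """Shannon diversity index H' from raw OTU/ASV counts (natural log)."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]   # drop zero-count taxa
    return -sum(p * math.log(p) for p in props)

even = [25, 25, 25, 25]    # evenly distributed community: maximal H' = ln(4)
skewed = [97, 1, 1, 1]     # one dominant taxon: much lower diversity
print(shannon(even), shannon(skewed))
```

    Real pipelines compute this per sample after rarefaction or normalization; the point here is only that evenness, not just richness, drives the index.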

  12. Gauge fields

    International Nuclear Information System (INIS)

    Itzykson, C.

    1978-01-01

    In these notes the author provides some background on the theory of gauge fields, a subject of increasing popularity among particle physicists (and others). Detailed motivations and applications which are covered in the other lectures of this school are not presented. In particular the application to weak interactions is omitted by referring to the introduction given by J. Ilipoulos a year ago (CERN Report 76-11). The aim is rather to stress those aspects which suggest that gauge fields may play some role in a future theory of strong interactions. (Auth.)

  13. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  14. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  15. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  16. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework as follows: comparison of experimental and calculated simplified analyses results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  17. Clinical Correlates of Computationally Derived Visual Field Defect Archetypes in Patients from a Glaucoma Clinic.

    Science.gov (United States)

    Cai, Sophie; Elze, Tobias; Bex, Peter J; Wiggs, Janey L; Pasquale, Louis R; Shen, Lucy Q

    2017-04-01

    To assess the clinical validity of visual field (VF) archetypal analysis, a previously developed machine learning method for decomposing any Humphrey VF (24-2) into a weighted sum of clinically recognizable VF loss patterns. For each of 16 previously identified VF loss patterns ("archetypes," denoted AT1 through AT16), we screened 30,995 reliable VFs to select 10-20 representative patients whose VFs had the highest decomposition coefficients for each archetype. VF global indices and patient ocular and demographic features were extracted retrospectively. Based on resemblances between VF archetypes and clinically observed VF patterns, hypotheses were generated for associations between certain VF archetypes and clinical features, such as an association between AT6 (central island, representing severe VF loss) and large cup-to-disk ratio (CDR). Distributions of the selected clinical features were compared between representative eyes of certain archetypes and all other eyes using the two-tailed t-test or Fisher exact test. 243 eyes from 243 patients were included, representative of AT1 through AT16. CDR was more often ≥ 0.7 among eyes representative of AT6 (central island; p = 0.002), AT10 (inferior arcuate defect; p = 0.048), AT14 (superior paracentral defect; p = 0.016), and AT16 (inferior paracentral defect; p = 0.016) than other eyes. CDR was more often 6D (p = 0.069). Shared clinical features between computationally derived VF archetypes and clinically observed VF patterns support the clinical validity of VF archetypal analysis.
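
    The decomposition described above (expressing a visual field as a nonnegative weighted sum of archetypes) can be sketched as a nonnegative least-squares fit. This is a hypothetical toy with two invented "archetypes" in a four-point field, not the paper's 16-archetype model or its actual fitting procedure.

```python
# Toy archetypal decomposition: fit nonnegative weights w so that
# sum_a w[a] * archetype[a] approximates the observed field, using
# projected gradient descent on the squared error.
def decompose(vf, archetypes, steps=5000, lr=0.01):
    k = len(archetypes)
    w = [1.0 / k] * k                       # start from uniform weights
    for _ in range(steps):
        recon = [sum(w[a] * archetypes[a][i] for a in range(k))
                 for i in range(len(vf))]
        resid = [recon[i] - vf[i] for i in range(len(vf))]
        for a in range(k):
            grad = sum(resid[i] * archetypes[a][i] for i in range(len(vf)))
            w[a] = max(0.0, w[a] - lr * grad)   # project onto w >= 0
    return w

at_superior = [1.0, 1.0, 0.0, 0.0]   # toy "superior defect" pattern
at_inferior = [0.0, 0.0, 1.0, 1.0]   # toy "inferior defect" pattern
vf = [0.6, 0.6, 0.2, 0.2]            # observed field: mostly superior loss
w = decompose(vf, [at_superior, at_inferior])
print([round(x, 3) for x in w])      # [0.6, 0.2]
```

    The dominant coefficient then labels the field's main loss pattern, which is the sense in which a patient's eye can be "representative" of one archetype.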

  18. From polycrystal to multi-crystal: ''numerical meso-scope'' development for a local analysis in the elasto-viscoplastic field; Du polycristal au multicristal: elaboration d'un mesoscope numerique pour une analyse locale en elastoviscoplasticite

    Energy Technology Data Exchange (ETDEWEB)

    Heraud, St

    2000-07-01

    The knowledge of the local mechanical fields over several adjacent grains is needed for a better understanding of damage initiation and intergranular failure in metallic polycrystals. This thesis aimed at the derivation of such fields through a 'numerical meso-scope': this simulation tool relies on the finite element analysis of a multi-crystalline pattern embedded in a large matrix whose mechanical behaviour is derived experimentally from classical tests performed on the studied metal. First, we derived macroscopic elastic-viscoplastic constitutive equations from tensile and creep tests on an AISI 316 stainless steel and we inferred from them the general form of similar, but crystallographic, equations to be used for the single crystals; the corresponding parameters were determined by fitting the computed overall response of an aggregate made of 1000 grains to the macroscopic experimental one. We then investigated a creep-damaged area of the same steel and we simulated the same grain ensemble in the 'numerical meso-scope' so as to compare the computed normal stress on all grain boundaries with the observed de-bonded boundaries: this showed the most damaged boundaries to sustain the largest normal stress. Another application concerned the origin of intergranular damage in aged AISI 321 stainless steel. A similar approach was adopted with the help of the meso-scope: it showed that the observations could not be explained by intragranular hardening alone, as is currently proposed in the literature. The pertinence of the 'numerical meso-scope' concept is thus demonstrated, which opens a number of new and interesting perspectives. (author)

  19. Angular analyses in relativistic quantum mechanics; Analyses angulaires en mecanique quantique relativiste

    Energy Technology Data Exchange (ETDEWEB)

    Moussa, P [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

    This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One-particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. On the way we construct spinorial amplitudes and free fields; we recall how to establish convergence theorems for angular expansions from analyticity hypotheses. Finally we substitute these hypotheses for the idea of a 'potential radius', which gives at low energy the usual 'centrifugal barrier' factors. The presence of such factors had never been deduced from hypotheses compatible with relativistic invariance. (author)

  20. Features of finite quantum field theories

    International Nuclear Information System (INIS)

    Boehm, M.; Denner, A.

    1987-01-01

    We analyse general features of finite quantum field theories. A quantum field theory is considered to be finite if the corresponding renormalization constants, evaluated in the dimensional regularization scheme, are free from divergences in all orders of perturbation theory. We conclude that every finite renormalizable quantum field theory with fields of spin one or less must contain scalar fields, fermion fields, and nonabelian gauge fields. Some specific nonsupersymmetric models are found to be finite at the one- and two-loop level. (orig.)

  1. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte, Marilyn Joyce, The Joyce... Contents include: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the...

  2. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  3. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural environment ... willingness-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  4. Field theory

    CERN Multimedia

    1999-11-08

    In these lectures I will build up the concept of field theory using the language of Feynman diagrams. As a starting point, field theory in zero spacetime dimensions is used as a vehicle to develop all the necessary techniques: path integral, Feynman diagrams, Schwinger-Dyson equations, asymptotic series, effective action, renormalization etc. The theory is then extended to more dimensions, with emphasis on the combinatorial aspects of the diagrams rather than their particular mathematical structure. The concept of unitarity is used to, finally, arrive at the various Feynman rules in an actual, four-dimensional theory. The concept of gauge-invariance is developed, and the structure of a non-abelian gauge theory is discussed, again on the level of Feynman diagrams and Feynman rules.

  5. Seismic response analyses for reactor facilities at Savannah River

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Xu, J.

    1991-01-01

    The reactor facilities at the Savannah River Plant (SRP) were designed during the 1950s. The original seismic criterion defining the input ground motion was 0.1 G, with UBC [Uniform Building Code] provisions used to evaluate structural seismic loads. Later ground motion criteria have defined the free field seismic motion with a 0.2 G ZPA [zero period acceleration] and various spectral shapes. The spectral shapes have included the Housner spectrum, a site-specific spectrum, and the US NRC [Nuclear Regulatory Commission] Reg. Guide 1.60 shape. The development of these free field seismic criteria is discussed in the paper. The more recent seismic analyses have been of the following types: fixed base response spectra, frequency independent lumped parameter soil/structure interaction (SSI), frequency dependent lumped parameter SSI, and current state of the art analyses using computer codes such as SASSI. The results from these computations consist of structural loads and floor response spectra (used for piping and equipment qualification). These results are compared in the paper and the methods used to validate the results are discussed. 14 refs., 11 figs

  6. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  7. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  8. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  9. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the global budget has been analysed, since there is a close relation between investments and consolidation and the required level of maintenance. The purpose of the analysis was to express maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. The analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.
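    The per-family ratio described above (aggregated maintenance cost over aggregated replacement value) is a simple computation; a minimal sketch, with invented family names, figures, and record format that are not taken from the ST analysis:

    ```python
    from collections import defaultdict

    def family_ratios(records):
        """Aggregate maintenance cost and replacement value per equipment
        family, then return the cost/value ratio for each family.
        Hypothetical record format: (family, maintenance_cost, replacement_value)."""
        cost = defaultdict(float)
        value = defaultdict(float)
        for family, c, v in records:
            cost[family] += c
            value[family] += v
        return {family: cost[family] / value[family] for family in cost}

    # Illustrative figures only.
    records = [("cranes", 50.0, 1000.0), ("cranes", 30.0, 600.0),
               ("hvac", 20.0, 400.0)]
    print(family_ratios(records))  # {'cranes': 0.05, 'hvac': 0.05}
    ```

    A ratio computed this way can then be compared across families, or against benchmark ratios from other industries and laboratories, as the abstract suggests.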

  10. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  11. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of the process steps of the RP, simplified safety considerations, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. In particular, an incident analysis of the process steps, an evaluation of the SRL study, and safety analyses of the storage and solidification facilities of the RP are performed. (DG)

  12. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for meltdown, radioactive releases, or harmful effects on the environment. Following risk policies for chemical installations, as expressed in the mandatory External Safety Reports (EVRs) or, e.g., the publication "How to deal with risks", probabilistic risk analyses are required for nuclear power plants.

  13. Iterative categorization (IC): a systematic technique for analysing qualitative data

    Science.gov (United States)

    2016-01-01

    Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  14. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    Science.gov (United States)

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  15. Elemental abundance and analyses with coadded DAO spectrograms

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1987-01-01

    One can improve the quality of elemental abundance analyses by using higher signal-to-noise data than has been the practice at high resolution. The procedures developed at the Dominion Astrophysical Observatory to coadd high-dispersion coude spectrograms are used with a minimum of ten 6.5 Å mm⁻¹ IIa-O spectrograms of each of three field horizontal-branch (FHB) A stars to increase the signal-to-noise ratio of the photographic data over a considerable wavelength region. Fine analyses of the sharp-lined prototype FHB stars HD 109995 and 161817 show an internal consistency which justifies this effort. Their photospheric elemental abundances are similar to those of Population II globular cluster giants. (author)

  16. Relationship between consecutive deterioration of mean deviation value and progression of visual field defect in open-angle glaucoma.

    Science.gov (United States)

    Naito, Tomoko; Yoshikawa, Keiji; Mizoue, Shiro; Nanno, Mami; Kimura, Tairo; Suzumura, Hirotaka; Takeda, Ryuji; Shiraga, Fumio

    2015-01-01

    To analyze the relationship between consecutive deterioration of mean deviation (MD) value and glaucomatous visual field (VF) progression in open-angle glaucoma (OAG), including primary OAG and normal tension glaucoma. The subjects of the study were patients undergoing treatment for OAG who had undergone VF tests at least 10 times with a Humphrey field analyzer (SITA standard, C30-2 program). VF progression was defined by a significantly negative MD slope (MD slope worsening) at the final VF test during the follow-up period. The relationship between MD slope worsening and consecutive deterioration of MD value was retrospectively analyzed. A total of 165 eyes of 165 patients were included in the analysis. Significant progression of VF defects was observed in 72 eyes of 72 patients (43.6%), while no significant progression was evident in 93 eyes of 93 patients (56.4%). There was a significant relationship between the frequency of consecutive deteriorations of MD value and MD slope worsening (P<0.0001, Cochran-Armitage trend test). In multiple logistic regression analysis, a significant association with MD slope worsening was observed for eyes with three (odds ratio: 2.1, P=0.0224) and four (odds ratio: 3.6, P=0.0008) consecutive deteriorations of MD value, but not for eyes with two consecutive deteriorations (odds ratio: 1.1, P=0.8282). The eyes with VF progression had a significantly lower intraocular pressure reduction rate (P<0.01). This retrospective study has shown that three or more consecutive deteriorations of MD value might be a predictor of future significant MD slope worsening in OAG.
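    The two quantities this record tracks, the MD slope over a series of VF tests and runs of consecutive MD deteriorations, are straightforward to compute. A minimal sketch with invented follow-up data, not from the study; the significance test on the slope is omitted:

    ```python
    def md_slope(years, md_values):
        """Ordinary least-squares slope of mean deviation over time (dB/year)."""
        n = len(years)
        x_mean = sum(years) / n
        y_mean = sum(md_values) / n
        sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(years, md_values))
        sxx = sum((x - x_mean) ** 2 for x in years)
        return sxy / sxx

    def max_consecutive_deteriorations(md_values):
        """Length of the longest run of test-to-test decreases in MD."""
        best = run = 0
        for prev, cur in zip(md_values, md_values[1:]):
            run = run + 1 if cur < prev else 0
            best = max(best, run)
        return best

    years = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]      # follow-up time (years)
    md = [-3.0, -3.2, -3.5, -3.9, -3.8, -4.1]   # MD at each VF test (dB)
    print(round(md_slope(years, md), 2))        # -0.44 (dB/year)
    print(max_consecutive_deteriorations(md))   # 3
    ```

    A negative slope indicates worsening; the study's finding suggests that a run of three or more deteriorations flags eyes at risk of a significantly negative slope.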

  17. The Watts New? Collection: Columns by the SEI’s Watts Humphrey

    Science.gov (United States)

    2009-11-01

    be to convince them to explore and ultimately adopt these technologies. Have you ever started what you thought was a two- or three-day job and have...product into test is a mistake but they don’t know how to fight the pressure. Then they feel compelled to rush through requirements and design and to...of us had ever worked before. In our first year, he took a team of rookies to the AAU championship of 13 states. What was most interesting to me was

  18. DIGITAL FLOOD INSURANCE RATE MAP DATABASE, HUMPHREYS COUNTY, TENNESSEE AND INCORPORATED AREAS

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Digital Flood Insurance Rate Map (DFIRM) Database depicts flood risk information and supporting data used to develop the risk data. The primary risk...

  19. 75 FR 32201 - Tennessee National Wildlife Refuge, Henry, Benton, Decatur, and Humphreys Counties, TN

    Science.gov (United States)

    2010-06-07

    ... Alternative A. We would expand control efforts of invasive species through active methods of removal. These... efforts of invasive species through active methods of removal. These methods would work towards reducing... include: (1) Managing for invasive species, migratory birds, and species of special concern; (2) managing...

  20. 76 FR 5194 - Tennessee National Wildlife Refuge, Henry, Benton, Decatur, and Humphreys Counties, TN; Final...

    Science.gov (United States)

    2011-01-28

    ... plants, 303 species of birds, and 280 species of mammals, fish, reptiles, and amphibians. We announce our... inventories for non-game mammals, reptiles, amphibians, fish, and invertebrates. We will also consider...

  1. Gauge fields

    International Nuclear Information System (INIS)

    Mills, R.

    1989-01-01

    This article is a survey of the history and ideas of gauge theory. Described here are the gradual emergence of symmetry as a driving force in the shaping of physical theory; the elevation of Noether's theorem, relating symmetries to conservation laws, to a fundamental principle of nature; and the force of the idea (''the gauge principle'') that the symmetries of nature, like the interactions themselves, should be local in character. The fundamental role of gauge fields in mediating the interactions of physics springs from Noether's theorem and the gauge principle in a remarkably clean and elegant way, leaving, however, some tantalizing loose ends that might prove to be the clue to a future deeper level of understanding. The example of the electromagnetic field as the prototype gauge theory is discussed in some detail and serves as the basis for examining the similarities and differences that emerge in generalizing to non-Abelian gauge theories. The article concludes with a brief examination of the dream of total unification: all the forces of nature in a single unified gauge theory, with the differences among the forces due to the specific way in which the fundamental symmetries are broken in the local environment

  2. The diagnostic use of choroidal thickness analysis and its correlation with visual field indices in glaucoma using spectral domain optical coherence tomography.

    Directory of Open Access Journals (Sweden)

    Zhongjing Lin

    To evaluate the quantitative characteristics of choroidal thickness in primary open-angle glaucoma (POAG), normal tension glaucoma (NTG) and in normal eyes using spectral-domain optical coherence tomography (SD-OCT); to evaluate the diagnostic ability of choroidal thickness in glaucoma; and to determine the correlation between choroidal thickness and visual field parameters in glaucoma. A total of 116 subjects, including 40 POAG, 30 NTG and 46 healthy subjects, were enrolled in this study. Choroidal thickness measurements were acquired in the macular and peripapillary regions using SD-OCT. All subjects underwent white-on-white (W/W) and blue-on-yellow (B/Y) visual field tests using the Humphrey Field Analyzer. The receiver operating characteristic (ROC) curve and the area under the curve (AUC) were generated to assess the discriminating power of choroidal thickness for glaucoma. Pearson's correlation coefficients were calculated to assess the structure-function correlation for glaucoma patients. No significant differences were observed for macular choroidal thickness among the different groups (all P > 0.05). Regarding the peripapillary choroidal thickness (PPCT), significant differences were observed among the three groups (all P < 0.05). In glaucomatous eyes, PPCT showed no significant correlations with W/W MD (all P > 0.05), but showed significant correlations with B/Y MD (all P < 0.05). In the early glaucomatous eyes, PPCT showed significant correlations with W/W MD and B/Y MD (all P < 0.05). In our study, peripapillary choroidal thickness measured on OCT showed a low to moderate but statistically significant diagnostic power and a significant correlation with blue-on-yellow visual field indices in glaucoma. This may indicate a potential adjunct role for peripapillary choroidal thickness in glaucoma diagnosis.
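    The AUC used above to quantify discriminating power can be computed nonparametrically: it equals the Mann-Whitney probability that a randomly chosen diseased eye has a lower marker value than a randomly chosen healthy eye, when lower values (a thinner choroid) indicate disease. A sketch with invented thickness values, not data from the study:

    ```python
    def auc_thinner_is_disease(glaucoma, normal):
        """Nonparametric AUC: fraction of (glaucoma, normal) pairs in which
        the glaucomatous value is lower, counting ties as half a win."""
        wins = sum((g < h) + 0.5 * (g == h) for g in glaucoma for h in normal)
        return wins / (len(glaucoma) * len(normal))

    # Hypothetical peripapillary choroidal thickness values (micrometres).
    glaucoma = [110.0, 120.0, 130.0]
    normal = [150.0, 160.0, 125.0]
    print(round(auc_thinner_is_disease(glaucoma, normal), 3))  # 0.889
    ```

    An AUC of 0.5 means no discrimination and 1.0 means perfect separation; the "low to moderate" diagnostic power reported above corresponds to values between these extremes.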

  3. Rate of progression of total, upper, and lower visual field defects in patients with open-angle glaucoma and high myopia.

    Science.gov (United States)

    Yoshino, Takaiko; Fukuchi, Takeo; Togano, Tetsuya; Sakaue, Yuta; Seki, Masaaki; Tanaka, Takayuki; Ueda, Jun

    2016-03-01

    We evaluated the rate of progression of total, upper, and lower visual field defects in patients with treated primary open-angle glaucoma (POAG) and high myopia (HM). Seventy eyes of 70 POAG patients with HM [≤-8 diopters (D)] were examined. The mean deviation (MD) slope and the upper and lower total deviation (upper TD, lower TD) slopes of the Humphrey Field Analyzer were calculated for patients with high-tension glaucoma (HTG) (>21 mmHg) versus normal-tension glaucoma (NTG) (≤21 mmHg). The mean age of all the patients (29 eyes with HTG and 41 eyes with NTG) was 48.5 ± 9.6 years. The MD slope and the upper and lower TD slopes of the HM group were compared to those of a non-HM (NHM) group (>-8 D) selected from 544 eyes of 325 age-matched POAG patients; in all, 70 eyes each with HM and NHM were examined. The mean MD slope was -0.33 ± 0.33 dB/year in the HM group and -0.38 ± 0.49 dB/year in the NHM group, with no statistically significant difference between them (p = 0.9565). In the comparison of HTG versus NTG patients in both groups, the MD slope and the upper and lower TD slopes were similar. The rate of progression of total, upper, and lower visual field defects was thus similar among patients with HM and NHM. Although HM is a risk factor for the onset of glaucoma, HM may not be a risk factor for progression of visual field defects as assessed by the progression rate under treatment.

  4. Field errors in hybrid insertion devices

    International Nuclear Information System (INIS)

    Schlueter, R.D.

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed

  5. Field errors in hybrid insertion devices

    Energy Technology Data Exchange (ETDEWEB)

    Schlueter, R.D. [Lawrence Berkeley Lab., CA (United States)

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

  6. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass separated neutral particle energy analyser which can simultaneously measure hydrogen and deuterium atoms emitted from a tokamak plasma was constructed. The analyser was calibrated for energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuterons and protons in the JFT-2 tokamak plasma heated by ion cyclotron waves and neutral beam injection, the analyser was installed on the JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The ion temperature and the ratio of deuteron to proton density obtained from the energy spectrum were in good agreement with the values deduced from the Doppler broadening of the Ti XIV line and the line intensities of Hα and Dα, respectively. (author)

  7. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were employed in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted mathematically analyzing the entire vessel with only 1/12 of the vessel geometry, allowing the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  8. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program of inspection of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components; at the same time, the program should ensure that the safety margins remain appropriate. Periodic safety analyses are an important part of the safety inspection program. Periodic safety analysis is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for qualification of the plant components. Separate analyses are devoted to start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for PWR-900 and PWR-1300 units from 1986-1989.

  9. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 μs to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse derandomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse derandomizer unit using tunnel diodes is described. The scheme of the time analyser is then described, showing how the various circuits can be integrated to form a versatile time analyser. (author)

  10. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
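    A bias control chart of the kind described above reduces to a simple check: estimate the mean and standard deviation from baseline measurements of a control standard, then flag new values outside mean ± k·sigma. A minimal sketch with invented measurement values and the conventional k = 3 limits; the statistical supplements the abstract mentions (e.g. normality tests) are omitted:

    ```python
    def control_limits(baseline, k=3.0):
        """Shewhart-style limits (mean ± k sample standard deviations)
        computed from baseline measurements of a control standard."""
        n = len(baseline)
        mean = sum(baseline) / n
        sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
        return mean - k * sd, mean + k * sd

    def in_control(value, limits):
        """True if a new measurement falls inside the control limits."""
        lo, hi = limits
        return lo <= value <= hi

    baseline = [10.00, 10.10, 9.90, 10.05, 9.95]  # invented control values
    limits = control_limits(baseline)
    print(in_control(10.05, limits))  # True
    print(in_control(10.50, limits))  # False: flag for investigation
    ```

    Points outside the limits signal a departure from the controlled state and would prompt the reviewer to investigate the instrument before accepting accounting measurements.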

  11. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  12. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  13. Field electron emission from branched nanotubes film

    International Nuclear Information System (INIS)

    Zeng Baoqing; Tian Shikai; Yang Zhonghai

    2005-01-01

    We describe the preparation and analyses of films composed of branched carbon nanotubes (CNTs). The CNTs were grown on a Ni catalyst film using chemical vapor deposition from a gas containing acetylene. From scanning electron microscope (SEM) and transmission electron microscope (TEM) analyses, the branched structure of the CNTs was determined; the field emission characteristics measured in a vacuum chamber indicated a lower turn-on field for branched CNTs than for normal CNTs.

  14. The Advanced Glaucoma Intervention Study (AGIS): 14. Distinguishing progression of glaucoma from visual field fluctuations.

    Science.gov (United States)

    Kim, Jonghyeon; Dally, Leonard G; Ederer, Fred; Gaasterland, Douglas E; VanVeldhuisen, Paul C; Blackwell, Beth; Sullivan, E Kenneth; Prum, Bruce; Shafranov, George; Beck, Allen; Spaeth, George L

    2004-11-01

    To determine the least worsening of a visual field (VF) and the least number of confirming tests needed to identify progression of glaucomatous VF defects. Cohort study of participants in a clinical trial. Seven hundred fifty-two eyes of 565 patients with advanced glaucoma. Visual field tests were quantified with the Advanced Glaucoma Intervention Study (AGIS) VF defect score and the Humphrey Field Analyzer mean deviation (MD). Follow-up was 8 to 13 years. Two measures based on the AGIS VF defect score: (1) sustained decrease of VF (SDVF), a worsening from baseline by 2 (alternatively, 3 or 4) or more units sustained for 2 (alternatively, 3) consecutive 6-month visits; and (2) after the occurrence of SDVF, the average percent of eyes with worsening by 2 (alternatively, 3 or 4) or more units from baseline. Two similar measures were based on MD. Based on the original AGIS criterion for SDVF (a worsening of 4 units in the AGIS score sustained during 3 consecutive 6-month visits), 31% of eyes had an SDVF. The percent of eyes with a sustained event increases by approximately 10% when either the minimum number of units of field loss or the minimum number of 6-month visits during which the loss is sustained decreases by 1. During 3 years of follow-up after a sustained event, a worsening of at least 2 units was found in 72% of eyes that had a 2-visit sustained event; the same worsening was found in 84% of eyes that had a 3-visit sustained event. Through the next 10 years after a sustained event, based on worsening of 2, 3, or 4 units at 2 or 3 consecutive tests, the loss recurred, on average, in ≥75% of study eyes. Results for MD are similar. In patients with advanced glaucoma, a single confirmatory test 6 months after a VF worsening indicates with at least 72% probability a persistent defect when the worsening is defined by at least 2 units of AGIS score or by at least 2 decibels of MD. When the number of confirmatory tests is increased from 1 to 2, the percentage of
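    The AGIS "sustained decrease of VF" criterion above, a worsening from baseline by at least d units held over m consecutive 6-month visits, is easy to express as a scan over the follow-up scores. A minimal sketch with invented scores; the AGIS defect scoring itself (0-20, higher is worse) is not reproduced here:

    ```python
    def sustained_decrease(scores, baseline, worsen=4, visits=3):
        """True if the defect score exceeds baseline by >= `worsen` units
        at `visits` consecutive follow-up tests. Defaults match the
        original AGIS criterion (4 units over 3 consecutive visits)."""
        run = 0
        for score in scores:
            run = run + 1 if score - baseline >= worsen else 0
            if run >= visits:
                return True
        return False

    follow_up = [7, 9, 10, 9, 8, 10]  # invented 6-monthly AGIS scores
    print(sustained_decrease(follow_up, baseline=5, worsen=4, visits=3))  # True
    print(sustained_decrease(follow_up, baseline=5, worsen=5, visits=3))  # False
    ```

    Loosening either threshold by one unit or one visit enlarges the set of flagged eyes, which is the roughly 10% effect the abstract reports.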

  15. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ~30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied.
The choice of which analysis to use is most critical for detailed transport

  16. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill, infiltration through covers over disposal areas and adjacent soils, and surface drainage of the disposal site. The...

  17. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because the radio medium is difficult to analyse, since it is a medium that is not visualized in the form of images or supported by printed text. The purpose of this article is to describe a new quantitative method for analysing radio that takes particular account of the radio medium's modality: sound structured as a linear progression in time. The method thus supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article suggests that the method is well suited for analysing not only radio but also other media platforms and various journalistic subject areas.

  18. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied, Open Street Map and The Pirate Bay. Results show that ...

  19. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) was used to examine the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  20. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  1. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  2. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  3. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  4. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    What organisational policy on psychosocial workload (PSA) looks like in 2014, and how it relates to other policies and outcome measures, are the central questions in this study. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  5. Exergoeconomic and environmental analyses of CO2

    NARCIS (Netherlands)

    Mosaffa, A. H.; Garousi Farshi, L; Infante Ferreira, C.A.; Rosen, M. A.

    2016-01-01

    Exergoeconomic and environmental analyses are presented for two CO2/NH3 cascade refrigeration systems equipped with (1) two flash tanks and (2) a flash tank along with a flash intercooler with indirect subcooler. A comparative study is performed for the proposed systems, and

  6. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  7. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  8. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and. P. tajacu (four individuals) and were made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  9. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  10. Field training

    International Nuclear Information System (INIS)

    Mumford, G.E.; Hadaway, E.H.

    1991-01-01

    Individualized, personal training can be used to increase an employee's awareness of the HSE program. Such training can stimulate personal commitment and provide personal skills that can be utilized for the benefit of the overall HSE effort. But providing such training within our industry can be a difficult task due to the scheduling, travel arrangements, and cost associated with bringing employees from isolated, remote locations to centrally located training facilities. One method of overcoming these obstacles involves the use of field instructors to provide the training at the many and varied work locations, so that a large number of individuals can be reached with minimal disruption to their work scheduling or to their time off. In fact, this type of on-site training is already used by some oil companies and drilling contractors with encouraging results. This paper describes one drilling contractor's experiences with such a training program. The results after eight years show that this program not only can provide an efficient, economical means of employee training, but also can have a direct application to employee motivation regarding a company's HSE effort

  11. Comparison of design and probabilistic analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Campbell, R.D.

    1995-01-01

    A study was made to evaluate the margin of conservatism introduced into design in-structure response spectra by following standard design analysis procedures according to the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan and Regulatory Guides, by comparing spectra produced by such a design analysis to responses from median-centered probabilistic analyses. Three typical nuclear plant structures were studied: a PWR reactor building, a PWR auxiliary building and a BWR reactor building. Each building was assumed to be situated on three idealized sites: a rock site, a medium soil site and a soft soil site. All buildings were assumed to have embedded foundations. The PWR reactor building was also assumed to have a surface foundation. Each design analysis was performed in accordance with the current SRP criteria. Each probabilistic analysis consisted of 30 earthquake simulations for which the free-field motions and soil and structural properties were varied; the simulated earthquakes were generated such that their mean-plus-one-standard-deviation free-field spectra approximated the Regulatory Guide (RG) 1.60 design spectra. In-structure response spectra from the design analyses were compared with the 84% non-exceedance probability (NEP) spectra from the probabilistic analyses. The comparisons showed that the design method produced conservative results for all cases. The smallest margin was about 10% for buildings on rock sites. Softer sites had larger margins of conservatism; the reactor buildings on the soft soil site had margins of as much as 100% (a factor of 2). The shorter structures and lower locations in all buildings had smaller margins. The margin of conservatism for the surface-founded reactor building was about 20% more than for the embedded reactor building. (author). 3 refs., 5 figs., 1 tab

  12. Thermal analyses. Information on the expected baking process

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final products, but the largest changes occur during the baking process. An overview is given of the different thermal analyses and how the information from these analyses can predict the baking behaviour to be expected in practice. (mk)

  13. Using Ontologies in Cybersecurity Field

    Directory of Open Access Journals (Sweden)

    Tiberiu Marian GEORGESCU

    2017-01-01

    This paper presents exploratory research that aims to improve the cybersecurity field by means of semantic web technologies. The authors present a framework which uses Semantic Web technologies to automatically extract and analyse natural-language text available online. The system provides results that are further analysed by cybersecurity experts to detect black hat hackers' activities. The authors examine several characteristics of how hacking communities communicate and collaborate online and how much information can be obtained by analysing different types of internet text communication channels. Having online sources as input data, the proposed model extracts and analyses natural language related to the cybersecurity field, with the aid of ontologies. The main objective is to generate information about possible black hat hacking actions, which can later be analysed individually by experts. This paper describes the data flow of the framework and proposes technological solutions so that the model can be applied. In their future work, the authors plan to implement the framework described as a software application.

  14. Applicability of two mobile analysers for mercury in urine in small-scale gold mining areas.

    Science.gov (United States)

    Baeuml, Jennifer; Bose-O'Reilly, Stephan; Lettmeier, Beate; Maydl, Alexandra; Messerer, Katalin; Roider, Gabriele; Drasch, Gustav; Siebert, Uwe

    2011-12-01

    Mercury is still used in developing countries to extract gold from ore in small-scale gold mining areas. This is a major health hazard for people living in mining areas. The concentration of mercury in urine was analysed in different mining areas in Zimbabwe, Indonesia and Tanzania. First, the urine samples were analysed by CV-AAS (cold vapour atomic absorption spectrometry) during the field projects with a mobile mercury analyser (Lumex® or Seefelder®) and, secondly, in a laboratory with a stationary CV-AAS mercury analyser (PerkinElmer®). Because the systems use different reduction agents (SnCl2 for the Lumex® and Seefelder® analysers, NaBH4 for the PerkinElmer® analyser), only inorganic mercury was obtained with the mobile analysers, whereas the total mercury concentration was measured with the stationary system. The aims of the study were to determine whether the results obtained in the field with the mobile equipment are comparable with those of the stationary reference method in the laboratory, and whether these mobile analysers can be applied in screening studies of concerned populations to select those who are exposed to critical mercury levels. Overall, the concentrations obtained with the two mobile systems were approximately 25% lower than those determined with the stationary system. Nevertheless, both mobile systems seem to be very useful for screening volunteers in the field. Moreover, regional staff may be trained on such analysers to perform screening tests by themselves.

  15. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  16. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for an ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study in case the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. availability of ignition sources prior to vessel breach; 2. availability and effectiveness of ice in the ice condenser; 3. loads modeling uncertainties related to co-ejected RPV water; 4. other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author)

  17. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for an ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study in case the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. availability of ignition sources prior to vessel breach; 2. availability and effectiveness of ice in the ice condenser; 3. loads modeling uncertainties related to co-ejected RPV water; 4. other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author).

  18. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  19. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  20. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given, and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies, which is often neither recognized nor utilized

  1. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V.a, a code which contains an enhanced geometry package, and a new control module which uses KENO V.a and allows a criticality search on optimum pitch (maximum k-effective) to be performed. The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask response to accident conditions. Together these modules enhance the capability of the SCALE system to provide the cask designer or evaluator with a computational system offering automated procedures and easy-to-understand input that leads to standardization

  2. Quantitative Analyse und Visualisierung der Herzfunktionen

    Science.gov (United States)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and thus increased time expenditure. This work presents an approach that enables the cardiologist to perform a largely automatic analysis of cardiac function from MRT image data, thereby saving time. All relevant cardiac physiological parameters are computed and visualized by means of diagrams and graphs. These computations are evaluated by comparing the determined values with manually measured ones. The resulting mean error, 2.85 mm for wall thickness and 1.61 mm for wall thickening, is still within the range of one pixel of the images used.

  3. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represents the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely, thermodynamic performance and sensitivity to changes in process and/or component design variables

  4. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly adapted to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book precisely defines this method as well as its fields of application. It describes the most effective methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key book for optimizing product design processes within a company. -- Key ideas, by Business Digest

  5. Kinetic stability analyses in a bumpy cylinder

    International Nuclear Information System (INIS)

    Dominguez, R.R.; Berk, H.L.

    1981-01-01

    Recent interest in the ELMO Bumpy Torus (EBT) has prompted a number of stability analyses of both the hot electron rings and the toroidal plasma. Typically these works employ the local approximation, neglecting radial eigenmode structure and ballooning effects to perform the stability analysis. In the present work we develop a fully kinetic formalism for performing nonlocal stability analyses in a bumpy cylinder. We show that the Vlasov-Maxwell integral equations (with one ignorable coordinate) are self-adjoint and hence amenable to analysis using numerical techniques developed for self-adjoint systems of equations. The representation we obtain for the kernel of the Vlasov-Maxwell equations is a differential operator of arbitrarily high order. This form leads to a manifestly self-adjoint system of differential equations for long wavelength modes

  6. Sectorial Group for Incident Analyses (GSAI)

    International Nuclear Information System (INIS)

    Galles, Q.; Gamo, J. M.; Jorda, M.; Sanchez-Garrido, P.; Lopez, F.; Asensio, L.; Reig, J.

    2013-01-01

    In 2008, the UNESA Nuclear Energy Committee (CEN) proposed the creation of a working group formed by experts from all Spanish NPPs with the purpose of jointly analyzing relevant incidents that occurred in each of the plants. This initiative was a response to a historical situation in which the exchange of information on incidents between the Spanish NPPs was below the desired level. In June 2009, UNESA's Guide CEN-29 established the performance criteria for the so-called Sectorial Group for Incident Analyses (GSAI), whose activity would be coordinated by UNESA's Group of Operating Experience, under the Operations Commission (COP). (Author)

  7. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  8. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late...

  9. Visuelle Analyse von E-mail-Verkehr

    OpenAIRE

    Mansmann, Florian

    2003-01-01

    This work describes methods for the visual geographic analysis of e-mail traffic. Host addresses and IP addresses can be filtered out of an e-mail's header. Using a database, geographic coordinates are assigned to these host and IP addresses. A visualization presents several thousand e-mail routes in a clear manner. In addition, interactive manipulation capabilities were presented, which allow a visual exploration of the dat...

  10. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from a new CORA BWR test, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is the interpretation and use of the results of the BWR tests. Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than had previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models, with materials interaction, relocation and blockage models, are currently being implemented in SCDAP/RELAP5 as an optional structural component

  11. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built entirely from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT-policy measures ...... organisations with a digital road map and GPS data can begin to run traffic analyses on these data. It is a requirement that suitable IT competences are present in the organisation....

  12. Neuronal network analyses: premises, promises and uncertainties

    OpenAIRE

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the diffic...

  13. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction-dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth.
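    The curvelet transform itself is beyond a short sketch, but the underlying idea of spectral orientation analysis, that the power spectrum of an oriented texture concentrates energy perpendicular to the fibre direction, can be illustrated with a plain FFT angular histogram. This is a simplified stand-in, not the authors' method:

```python
import numpy as np

def dominant_orientation(image):
    """Estimate the dominant fibre orientation (degrees, mod 180) of a
    2-D greyscale image.  Spectral energy of an oriented texture piles up
    perpendicular to the fibres, so we histogram power by spectral angle
    and rotate the peak by 90 degrees."""
    f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(f) ** 2
    ny, nx = image.shape
    y, x = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
    angles = np.degrees(np.arctan2(y, x)) % 180.0   # orientation, not direction
    bins = np.arange(0.0, 181.0, 5.0)
    energy, _ = np.histogram(angles.ravel(), bins=bins, weights=power.ravel())
    peak = np.argmax(energy)
    spectral_angle = 0.5 * (bins[peak] + bins[peak + 1])
    return (spectral_angle + 90.0) % 180.0          # fibres run perpendicular

# Horizontal synthetic "fibres" (stripes varying along y) -> angle near 0
stripes = np.sin(2 * np.pi * np.arange(64)[:, None] / 8.0) * np.ones((1, 64))
print(dominant_orientation(stripes))
```

On real scans one would replace the synthetic stripes with a paper image; the 5-degree bin width sets the angular resolution of the estimate.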

  14. Kinematic gait analyses in healthy Golden Retrievers

    OpenAIRE

    Silva, Gabriela C.A.; Cardoso, Mariana Trés; Gaiad, Thais P.; Brolio, Marina P.; Oliveira, Vanessa C.; Assis Neto, Antonio; Martins, Daniele S.; Ambrósio, Carlos E.

    2014-01-01

    Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; the interpretation of its data, when there is a change, determines the choice of treatment to be instituted. The objective of this study was to standardize the gait of the healthy Golden Retriever dog to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs,...

  15. Evaluation of periodic safety status analyses

    International Nuclear Information System (INIS)

    Faber, C.; Staub, G.

    1997-01-01

    In order to carry out the evaluation of safety status analyses by the safety assessor within the periodic safety reviews of nuclear power plants, safety-goal-oriented requirements have been formulated together with complementary evaluation criteria. Their application in an interdisciplinary cooperation covering the subject areas involved facilitates a complete safety-goal-oriented assessment of the plant status. The procedure is outlined briefly by an example for the safety goal 'reactivity control' for BWRs. (orig.) [de

  16. Geostatistical methods applied to field model residuals

    DEFF Research Database (Denmark)

    Maule, Fox; Mosegaard, K.; Olsen, Nils

    consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based...
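    A typical first step in a geostatistical treatment of field-model residuals is the empirical semivariogram, which measures how strongly residuals decorrelate with separation distance. The sketch below is an illustration of that standard construction, not the authors' code:

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Empirical semivariogram of residuals: half the mean squared
    difference between residual pairs, binned by separation distance."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    dist, halfsq = [], []
    for i in range(n):
        for j in range(i + 1, n):
            dist.append(np.linalg.norm(coords[i] - coords[j]))
            halfsq.append(0.5 * (values[i] - values[j]) ** 2)
    dist, halfsq = np.array(dist), np.array(halfsq)
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        in_bin = (dist >= lo) & (dist < hi)
        gamma.append(halfsq[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(gamma)

# Identical residuals on a line -> zero semivariance in every lag bin
coords = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]]
gamma = empirical_semivariogram(coords, [0.2, 0.2, 0.2, 0.2], [0.0, 1.5, 3.5])
print(gamma)  # -> [0. 0.]
```

A flat semivariogram near zero indicates spatially uncorrelated residuals, which is the Gaussian white-noise assumption the abstract says is usually made.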

  17. Factoring polynomials over arbitrary finite fields

    NARCIS (Netherlands)

    Lange, T.; Winterhof, A.

    2000-01-01

    We analyse an extension of Shoup's (Inform. Process. Lett. 33 (1990) 261–267) deterministic algorithm for factoring polynomials over finite prime fields to arbitrary finite fields. In particular, we prove the existence of a deterministic algorithm which completely factors all monic polynomials of

  18. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include:
    - the large number of potential flood sources
    - the wide variety of characteristics of flood sources
    - the large possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources
    - the diversity of flood flows from one flood source, depending on the size of the rupture and the mode of operation
    - the applicable isolation times
    - uncertainties in respect of the structural resistance of doors, penetration seals and floors
    - the applicable degrees of obstruction of the floor drainage system
    Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, it is necessary to take into account in order to cover all the possible floods associated with each flood area.
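    The combinatorial growth that motivates a tool like RUNTA can be illustrated with the standard library. All parameter names and values below are hypothetical, chosen only to show how quickly the scenario count multiplies; RUNTA's actual inputs are not documented here:

```python
from itertools import product

# Hypothetical parameter ranges (illustrative only, not RUNTA's inputs)
flood_sources = ["service_water", "fire_main", "circulating_water"]
rupture_modes = ["small_leak", "moderate_break", "guillotine"]
isolation_times_min = [5, 15, 30]
drain_blockage = [0.0, 0.5, 1.0]          # fraction of floor drains obstructed

scenarios = list(product(flood_sources, rupture_modes,
                         isolation_times_min, drain_blockage))
print(len(scenarios))                      # 3*3*3*3 = 81 scenarios to evaluate

def flooded_depth_m(flow_m3_min, isolation_min, floor_area_m2, blockage):
    """Water depth reached before isolation; drains get a crude 50 %
    removal credit, scaled down as they become obstructed."""
    drained = (1.0 - blockage) * 0.5 * flow_m3_min * isolation_min
    return max(flow_m3_min * isolation_min - drained, 0.0) / floor_area_m2

print(flooded_depth_m(2.0, 15, 100.0, blockage=1.0))  # -> 0.3 (m), drains blocked
```

Even these four toy parameters already generate 81 cases; adding a handful more axes (door resistances, inter-area propagation paths) pushes the count into the thousands, which is exactly why an automated calculation tool is needed.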

  19. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)
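    The tracking mode described above, comparing plant readings against the analyser's forecast and alerting on atypical excursions, reduces in the simplest case to a tolerance check. The pressure values and the alarm tolerance below are hypothetical:

```python
def track_against_forecast(forecast, measurements, tolerance):
    """Compare plant readings with the analyser's forecast and return the
    index of the first sample whose deviation exceeds the alarm tolerance,
    or None if the plant is behaving as expected."""
    for i, (f, m) in enumerate(zip(forecast, measurements)):
        if abs(m - f) > tolerance:
            return i
    return None

# Hypothetical pressure trace (MPa): the plant drifts away from the forecast
forecast = [7.00, 7.01, 7.02, 7.02, 7.03]
measured = [7.00, 7.02, 7.05, 7.10, 7.18]
print(track_against_forecast(forecast, measured, tolerance=0.05))  # -> 3
```

A real plant analyser would of course generate the forecast from a calibrated dynamic simulation rather than a fixed list, but the alarm logic is the same shape: deviation from the reference, not the absolute reading, triggers the operator alert.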

  20. Symmetry inheritance of scalar fields

    International Nuclear Information System (INIS)

    Ivica Smolić

    2015-01-01

    Matter fields do not necessarily have to share the symmetries with the spacetime they live in. When this happens, we speak of the symmetry inheritance of fields. In this paper we classify the obstructions of symmetry inheritance by the scalar fields, both real and complex, and look more closely at the special cases of stationary and axially symmetric spacetimes. Since the symmetry noninheritance is present in the scalar fields of boson stars and may enable the existence of the black hole scalar hair, our results narrow the possible classes of such solutions. Finally, we define and analyse the symmetry noninheritance contributions to the Komar mass and angular momentum of the black hole scalar hair. (paper)

  1. Deformation analysis of the high point field Košická Nová Ves

    Directory of Open Access Journals (Sweden)

    Sedlák Vladimír

    2003-09-01

    Full Text Available From the scientific point of view, deformation measurements serve the objective determination of movements; from the technical point of view, they serve the assessment of building technologies and construction procedures. Movements determined by means of geodetic terrestrial or satellite navigation technologies give information about displacements at a concrete time, on the basis of repeated geodetic measurements in concrete time intervals (epochs). The main task of the presented paper is the level deformation investigation of the points of the monitoring station established in the fill-slope territory of Košická Nová Ves. Level measurements were realized in autumn 2000 (epoch 2000.9), considered the first epoch of the deformation measurement, and in spring 2001 (epoch 2001.3), considered the second epoch of the deformation measurement.
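    The epoch-to-epoch comparison underlying such a levelling campaign can be sketched as a displacement plus a significance test. The heights, accuracies and the 2-sigma rule below are hypothetical conventions for illustration, not values from the paper:

```python
import math

def vertical_displacement(h_epoch1, h_epoch2, sigma1, sigma2, k=2.0):
    """Vertical displacement between two levelling epochs, with a simple
    significance test: the movement is taken as real when it exceeds
    k times the combined standard deviation of the two epochs."""
    dz = h_epoch2 - h_epoch1
    sigma = math.hypot(sigma1, sigma2)   # error propagation of both epochs
    return dz, abs(dz) > k * sigma

# Hypothetical point height (m) in epochs 2000.9 and 2001.3, 1 mm accuracy each
dz, significant = vertical_displacement(215.4321, 215.4282, 0.001, 0.001)
print(round(dz * 1000, 1), significant)  # -3.9 (mm) True
```

With 1 mm accuracy per epoch the combined standard deviation is about 1.4 mm, so a 3.9 mm subsidence clears the 2-sigma threshold and would be reported as a real movement of the fill slope rather than measurement noise.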

  2. Barrandian of the Prague Basin: Field Observations, Analyses and Numerical Simulation of Petroleum Generation-Migration

    Czech Academy of Sciences Publication Activity Database

    Mann, U.; Volk, H.; Suchý, Václav; Franců, J.; Filip, Jiří; Glasmacher, U.; Radke, M.; Sýkorová, Ivana; Wagner, G.; Wilkes, H.; Zeman, Antonín

    č. 1 (1999), s. 142 ISSN 0946-8978. [Old Crust - New Problems: Geodynamics and Utilization Includes the final international colloquium of the DFG priority programme Orogenic Processes Quantification and Modeling in the Variscan Belt. 22.02.1999-26.02.1999, Freiberg/Saxony] Subject RIV: DB - Geology ; Mineralogy

  3. Analysing an academic field through the lenses of Internet Science : Digital Humanities as a Virtual Community

    NARCIS (Netherlands)

    Akdag Salah, A.; Scharnhorst, Andrea; Wyatt, S.; Tiropanis, Thanassis; Vakali, Athena; Sartori, Laura; Burnap, Pete

    2015-01-01

    Digital Humanities (DH) has been depicted as an innovative engine for humanities, as a challenge for Data Science, and as an area where libraries, archives and providers of e-research infrastructures join forces with research pioneers. However DH is defined, one thing is certain: DH is a new

  4. Product personality : From analysing to applying

    NARCIS (Netherlands)

    Pourtalebi Hendehkhaleh, S.; Pouralvar, K.

    2012-01-01

    Nowadays, products are expected to perform their functions properly, and the competition for satisfying consumers lies in the field of product attachment and emotional characteristics. Products have a symbolic meaning in addition to their utilitarian benefits. This symbolic meaning, which refers to

  5. Theoretical Analyses of Superconductivity in Iron Based ...

    African Journals Online (AJOL)

    expulsion of magnetic field from the interior of a given superconducting material for temperatures below the critical ... replacing lanthanum by magnetic rare earth elements such as Ce, Sm, Nd or Pr and the critical temperature could be ... addition to a small anomaly in the dc magnetic susceptibility. Optical conductivity and.

  6. Field simulations for large dipole magnets

    International Nuclear Information System (INIS)

    Lazzaro, A.; Cappuzzello, F.; Cunsolo, A.; Cavallaro, M.; Foti, A.; Khouaja, A.; Orrigo, S.E.A.; Winfield, J.S.

    2007-01-01

    The problem of describing the magnetic field of large bending magnets is addressed in relation to the requirements of modern trajectory-reconstruction techniques. The crucial question of the interpolation and extrapolation of fields known at a discrete number of points is analysed. For this purpose a realistic field model of the large dipole of the MAGNEX spectrometer, obtained with three-dimensional finite-element simulations, is used. The influence of uncertainties in the measured field on the quality of the trajectory reconstruction is treated in detail. General constraints for field measurements in terms of required resolutions, step sizes and precisions are thus extracted.

  7. Conflict field energy

    International Nuclear Information System (INIS)

    Krebsbach-Gnath, C.

    1981-01-01

    Violent social controversies characterize the treatment of the energy problem. Solutions to this conflict decisively depend on knowledge and evaluation of its causes and possible development. How can the opinions, views and attitudes of the population towards different kinds of energy be explained? Which factors are decisive for the explosiveness and persistence of the conflict in the field of nuclear energy? What will happen if a shortage of energy arises, and which socio-political effects will such a shortage have? Are there new proposals for resolving the nuclear-energy debate? The contributions in this book are the results of scientific and empirical work. They provide perceptive approaches and analyses of these problems and, by discussing them, give useful orientation for political action. (orig.) [de

  8. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis, the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there is currently no acceptance criterion for this type of analysis approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences in the two analysis techniques.

  9. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.
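    The parallel, per-gene workflow that IDEA automates can be caricatured with the standard library: distribute independent analyses over worker processes and collect the results. The toy statistic below stands in for a real codeml/baseml run, and the mini-alignments are invented for illustration:

```python
from multiprocessing import Pool

def analyse_gene(alignment):
    """Stand-in for one codeml/baseml run: proportion of variable
    alignment columns (a toy statistic, not a real likelihood fit)."""
    length = len(alignment[0])
    variable = sum(1 for i in range(length)
                   if len({seq[i] for seq in alignment}) > 1)
    return variable / length

if __name__ == "__main__":
    # Hypothetical mini-alignments, one per gene
    genes = [
        ["ACGTACGT", "ACGTACGA", "ACGTACGT"],   # one variable column
        ["TTTTCCCC", "TTTTCCCC", "TTTTCCCC"],   # fully conserved
    ]
    with Pool(2) as pool:                       # a computing grid replaces this
        results = pool.map(analyse_gene, genes)
    print(results)  # -> [0.125, 0.0]
```

Because each gene's analysis is independent, the speedup scales with the number of workers, which is what makes genome-wide runs of hundreds of codeml jobs tractable on a local machine or grid.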

  10. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  11. Antarctic observations available for IMS correlative analyses

    International Nuclear Information System (INIS)

    Rycroft, M.J.

    1982-01-01

    A review is provided of the wide-ranging observational programs of 25 stations operating on and around the continent of Antarctica during the International Magnetospheric Study (IMS). Attention is given to observations of geomagnetism, short period fluctuations of the earth's electromagnetic field, observations of the ionosphere and of whistler mode signals, observational programs in ionospheric and magnetospheric physics, upper atmosphere physics observations, details of magnetospheric programs conducted at Kerguelen, H-component magnetograms, magnetic field line oscillations, dynamic spectra of whistlers, and the variation of plasmapause position derived from recorded whistlers. The considered studies suggest that, in principle, if the level of magnetic activity is known, predictions can be made concerning the time at which the trough occurs, and the shape and the movement of the main trough

  12. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed
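    The maximum-entropy construction of prior distributions mentioned above can be checked numerically in its simplest case: with no constraint beyond normalisation, the uniform distribution maximises the Shannon entropy, so any reweighting lowers it. This is a textbook illustration, not the paper's own example:

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a discrete probability distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# With only the normalisation constraint, the maximum-entropy prior on
# n outcomes is uniform: any perturbation strictly lowers the entropy.
uniform = [0.25, 0.25, 0.25, 0.25]
perturbed = [0.40, 0.30, 0.20, 0.10]
print(entropy(uniform), entropy(perturbed))  # ~1.386 vs ~1.280
```

Adding constraints (a known mean, known moments) shrinks the feasible set and the maximum-entropy solution becomes a member of the exponential family consistent with those constraints; the unconstrained case above is the degenerate starting point of that construction.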

  13. Safety analyses for high-temperature reactors

    International Nuclear Information System (INIS)

    Mueller, A.

    1978-01-01

    The safety evaluation of HTRs may be based on the three methods presented here: the licensing procedure, probabilistic risk analysis, and damage extent analysis. Thereby all safety aspects of the HTR - from normal operation to extreme (hypothetical) accidents - are covered. The analyses within the licensing procedure of the HTR-1160 have shown that, for normal operation and for the design basis accidents, the radiation exposures remain clearly below the maximum permissible levels prescribed by the radiation protection ordinance, so that no real hazard for the population will arise from them. (orig./RW) [de

  14. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  15. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with the computed tomography and the solution of differential equations. In both cases as the objective functions for the training process of the neural network we employed residuals of the integral equation or the differential equations. This is different from the conventional neural network training where sum of the squared errors of the output values is adopted as the objective function. For model problems both the methods gave satisfactory results and the methods are considered promising for some kind of problems. (author)

  16. Komparativ analyse - Scandinavian Airlines & Norwegian Air Shuttle

    OpenAIRE

    Kallesen, Martin Nystrup; Singh, Ravi Pal; Boesen, Nana Wiaberg

    2017-01-01

    The project is based on a consideration of how a company the size of Scandinavian Airlines or Norwegian Air Shuttle uses its finances and how it views its external environment. This led us to research the relationship between the companies and their finances as well as their external environments, and how they differ in both. To do this we have utilised a myriad of different methods to analyse the companies, including PESTEL, SWOT, TOWS; DCF, risk analysis, sensitivity, Porter's ...

  17. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  18. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  19. The phaco machine: analysing new technology.

    Science.gov (United States)

    Fishkind, William J

    2013-01-01

    The phaco machine is frequently overlooked as the crucial surgical instrument it is. Understanding how to set parameters is initiated by understanding fundamental concepts of machine function. This study analyses the critical concepts of partial occlusion phaco, occlusion phaco and pump technology. In addition, phaco energy categories as well as variations of phaco energy production are explored. Contemporary power modulations and pump controls allow for the enhancement of partial occlusion phacoemulsification. These significant changes in the anterior chamber dynamics produce a balanced environment for phaco; less complications; and improved patient outcomes.

  20. Nuclear analyses of the Pietroasa gold hoard

    International Nuclear Information System (INIS)

    Cojocaru, V.; Besliu, C.

    1999-01-01

    By means of nuclear analyses the concentrations of Au, Ag, Cu, Ir, Os, Pt, Co and Hg were measured in the 12 artifacts of the gold hoard discovered in 1837 at Pietroasa, Buzau county, in Romania. The concentrations of the first four elements were used to compare different stylistic groups assumed by historians. Comparisons with gold nuggets from the old Dacian territory and with gold Roman imperial coins were also made. Good agreement was found with the oldest hypothesis, which considers that the hoard comprises three styles attributed mainly to the Goths. (author)

  1. An evaluation of the Olympus "Quickrate" analyser.

    Science.gov (United States)

    Williams, D G; Wood, R J; Worth, H G

    1979-02-01

    The Olympus "Quickrate", a photometer built for both kinetic and end point analysis was evaluated in this laboratory. Aspartate transaminase, lactate dehydrogenase, hydroxybutyrate dehydrogenase, creatine kinase, alkaline phosphatase and gamma glutamyl transpeptidase were measured in the kinetic mode and glucose, urea, total protein, albumin, bilirubin, calcium and iron in the end point mode. Overall, good correlation was observed with routine methodologies and the precision of the methods was acceptable. An electrical evaluation was also performed. In our hands, the instrument proved to be simple to use and gave no trouble. It should prove useful for paediatric and emergency work, and as a back up for other analysers.

  2. Magnetic energy analyser for slow electrons

    International Nuclear Information System (INIS)

    Limberg, W.

    1974-08-01

    A differential spectrometer with high time and energy resolution has been developed using the principle of energy analysis with a longitudinal homogeneous magnetic field. This way it is possible to measure the energy distribution of low energy electrons (eV-range) in the presence of high energy electrons without distortions by secondary electrons. The functioning and application of the analyzer is demonstrated by measuring the energy distributions of slow electrons emitted by a filament. (orig.) [de

  3. MILROY, Lesley. Observing and Analysing Natural Language: A Critical Account of Sociolinguistic Method. Oxford: Basil Blackwell, 1987. 230pp.

    Directory of Open Access Journals (Sweden)

    Iria Werlang Garcia

    2008-04-01

    Full Text Available Lesley Milroy's Observing and Analysing Natural Language is a recent addition to an ever-growing number of publications in the field of Sociolinguistics. It carries the weight of one of the most experienced authors currently working in the field and should offer basic information to both newcomers and established investigators of natural language.

  4. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a 'lower bound', 'best estimate', and 'upper bound' failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  5. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. The components themselves are well tested, of course, but when they are composed together, problems occur. Most problems are based on interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; that framework determines the compatibility of the corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyser framework, and determines the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  6. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

    PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs, although the flow through the PSIS stopped temporarily if the break was very small and hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT to limit rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed that the codes are capable of simulating the overall behaviour of the transients. The detailed analyses of the results showed that some models in the codes still need improvement. In particular, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  7. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  8. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
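The "pinching" strategy can be sketched in a few lines: fix one uncertain input at a point value, re-propagate, and compare the resulting output ranges. The model, input intervals, and crude sampling-based propagation below are invented stand-ins, not the report's dike assessment or its evidence-theory machinery.

```python
import random

def model(a, b):
    # Hypothetical black-box model output (invented for illustration).
    return a * (b + 1.0)

def output_width(a_lo, a_hi, b_lo, b_hi, n=20000):
    """Propagate interval uncertainty by crude sampling; return output range width."""
    rnd = random.Random(0)   # fixed seed for repeatability
    ys = [model(rnd.uniform(a_lo, a_hi), rnd.uniform(b_lo, b_hi)) for _ in range(n)]
    return max(ys) - min(ys)

# Both inputs uncertain: a in [1, 2], b in [3, 5].
base = output_width(1.0, 2.0, 3.0, 5.0)

# "Pinch" one input at a time to a point value and re-propagate.
pinched_a = output_width(1.5, 1.5, 3.0, 5.0)
pinched_b = output_width(1.0, 2.0, 4.0, 4.0)

print(f"pinching a removes {1 - pinched_a / base:.0%} of output uncertainty")
print(f"pinching b removes {1 - pinched_b / base:.0%} of output uncertainty")
```

The input whose pinching shrinks the output range the most is the one most worth learning more about, which is the decision the report's sensitivity measures support.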

  9. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at a shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes, and also improve overall ethylene plant operations.

  10. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

    This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IAEA countries descriptions of tests and test results for piping systems or bends (with emphasis on high temperature inelastic tests), to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analyses results. Of the problem descriptions submitted three were selected to be used: a 90°-elbow at 600°C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  11. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses-specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues and rationing decisions that are taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice and which therefore affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution or in a certain medical education system operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners or even of patients. No ethical system or model is inherently right or wrong, they depend on the context in which the educator is working.

  12. In-field analysis

    International Nuclear Information System (INIS)

    Stewart, Richard

    2010-01-01

    Full text: A new technology for in-field measurement of hydrocarbons in soil promises rapid results. Standard industry practice in Australia for measuring hydrocarbons in soil is to send a soil sample to an off-site accredited laboratory for analysis. This typically costs $25-50 per sample and takes 5-7 days to turn around the results. While there are in-field hydrocarbon measurement technologies available in the US, most involve extracting the hydrocarbons from the soil and analysing the resulting liquid. These methods are time-consuming and often involve toxic solvents and clumsy equipment. A new technology developed by Ziltek and CSIRO allows for real-time measurement in the field. The user simply pulls the trigger on a hand-held infrared spectrometer and within a few seconds gets a digital read-out of the hydrocarbon concentration. The technology requires no toxic solvents or consumables, and sampling positions can also be logged automatically using GPS coordinates. The technology is essentially a software application that can be used with any third-party supplied hand-held infrared device. A working prototype has been tested at several contaminated sites across Australia, with very promising results. The site trials involved taking in-situ measurements using an infrared instrument before sending the soil to an external laboratory for conventional analysis - and comparing the results. Ziltek technical director Dr Ben Dearman noted at some sites the variation between the infrared results and lab results was less than 10 per cent. The technology gives a single concentration value in

  13. Long-term occupational exposure to organic solvents affects color vision, contrast sensitivity and visual fields.

    Directory of Open Access Journals (Sweden)

    Thiago Leiros Costa

    Full Text Available The purpose of this study was to evaluate the visual outcome of chronic occupational exposure to a mixture of organic solvents by measuring color discrimination, achromatic contrast sensitivity and visual fields in a group of gas station workers. We tested 25 workers (20 males) and 25 controls with no history of chronic exposure to solvents (10 males). All participants had normal ophthalmologic exams. Subjects had worked in gas stations for an average of 9.6 ± 6.2 years. Color vision was evaluated with the Lanthony D15d and the Cambridge Colour Test (CCT). Visual field assessment consisted of white-on-white 24-2 automatic perimetry (Humphrey II-750i). Contrast sensitivity was measured for sinusoidal gratings of 0.2, 0.5, 1.0, 2.0, 5.0, 10.0 and 20.0 cycles per degree (cpd). Results from both groups were compared using the Mann-Whitney U test. The number of errors in the D15d was higher for workers relative to controls (p<0.01). Their CCT color discrimination thresholds were elevated compared to the control group along the protan, deutan and tritan confusion axes (p<0.01), and their ellipse area and ellipticity were higher (p<0.01). Genetic analysis of subjects with very elevated color discrimination thresholds excluded congenital causes for the visual losses. Automated perimetry thresholds showed elevation at 9°, 15° and 21° of eccentricity (p<0.01) and in the MD and PSD indexes (p<0.01). Contrast sensitivity losses were found for all spatial frequencies measured (p<0.01) except 0.5 cpd. Significant correlation was found between years worked and deutan axis thresholds (rho = 0.59; p<0.05), indexes of the Lanthony D15d (rho = 0.52; p<0.05), and perimetry results in the fovea (rho = -0.51; p<0.05) and at 3, 9 and 15 degrees of eccentricity (rho = -0.46; p<0.05). Extensive and diffuse visual changes were found, suggesting that specific occupational limits should be created.

  14. Relationship between visual field progression and baseline refraction in primary open-angle glaucoma.

    Science.gov (United States)

    Naito, Tomoko; Yoshikawa, Keiji; Mizoue, Shiro; Nanno, Mami; Kimura, Tairo; Suzumura, Hirotaka; Umeda, Yuzo; Shiraga, Fumio

    2016-01-01

    To analyze the relationship between visual field (VF) progression and baseline refraction in Japanese patients with primary open-angle glaucoma (POAG) including normal-tension glaucoma. In this retrospective study, the subjects were patients with POAG who had undergone VF tests at least ten times with a Humphrey Field Analyzer (Swedish interactive thresholding algorithm standard, Central 30-2 program). VF progression was defined as a significantly negative value of mean deviation (MD) slope at the final VF test. Multivariate logistic regression models were applied to detect an association between MD slope deterioration and baseline refraction. A total of 156 eyes of 156 patients were included in this analysis. Significant deterioration of MD slope was observed in 70 eyes of 70 patients (44.9%), whereas no significant deterioration was evident in 86 eyes of 86 patients (55.1%). The eyes with VF progression had significantly higher baseline refraction compared to those without apparent VF progression (-1.9±3.8 diopter [D] vs -3.5±3.4 D, P=0.0048) (mean ± standard deviation). When subject eyes were classified into four groups by the level of baseline refraction applying spherical equivalent (SE): no myopia (SE > -1D), mild myopia (-1D ≥ SE > -3D), moderate myopia (-3D ≥ SE > -6D), and severe myopia (-6D ≥ SE), the Cochran-Armitage trend analysis showed a decreasing trend in the proportion of MD slope deterioration with increasing severity of myopia (P=0.0002). The multivariate analysis revealed that baseline refraction (P=0.0108, odds ratio [OR]: 1.13, 95% confidence interval [CI]: 1.03-1.25) and intraocular pressure reduction rate (P=0.0150, OR: 0.97, 95% CI: 0.94-0.99) had a significant association with MD slope deterioration. In the current analysis of Japanese patients with POAG, baseline refraction was a factor significantly associated with MD slope deterioration as well as intraocular pressure reduction rate. When baseline refraction was classified into

  15. COCAP: a carbon dioxide analyser for small unmanned aircraft systems

    Science.gov (United States)

    Kunz, Martin; Lavric, Jost V.; Gerbig, Christoph; Tans, Pieter; Neff, Don; Hummelgård, Christine; Martin, Hans; Rödjegård, Henrik; Wrenger, Burkhard; Heimann, Martin

    2018-03-01

    Unmanned aircraft systems (UASs) could provide a cost-effective way to close gaps in the observation of the carbon cycle, provided that small yet accurate analysers are available. We have developed a COmpact Carbon dioxide analyser for Airborne Platforms (COCAP). The accuracy of COCAP's carbon dioxide (CO2) measurements is ensured by calibration in an environmental chamber, regular calibration in the field and by chemical drying of sampled air. In addition, the package contains a lightweight thermal stabilisation system that reduces the influence of ambient temperature changes on the CO2 sensor by 2 orders of magnitude. During validation of COCAP's CO2 measurements in simulated and real flights we found a measurement error of 1.2 µmol mol-1 or better with no indication of bias. COCAP is a self-contained package that has proven well suited for the operation on board small UASs. Besides carbon dioxide dry air mole fraction it also measures air temperature, humidity and pressure. We describe the measurement system and our calibration strategy in detail to support others in tapping the potential of UASs for atmospheric trace gas measurements.
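A two-point gas calibration of the kind used to keep such a CO2 sensor accurate can be sketched as follows. All raw readings and reference mole fractions here are hypothetical; COCAP's actual calibration (environmental chamber plus regular field calibration) is considerably more involved.

```python
# Two-point calibration: map raw sensor readings to CO2 dry air mole
# fractions (umol/mol) using two reference gases of known concentration.
# All numbers below are invented for illustration.
raw_low, ref_low = 412.0, 400.0      # reading / known value, low reference gas
raw_high, ref_high = 618.0, 600.0    # reading / known value, high reference gas

gain = (ref_high - ref_low) / (raw_high - raw_low)
offset = ref_low - gain * raw_low

def calibrate(raw):
    """Convert a raw sensor reading to a calibrated mole fraction."""
    return gain * raw + offset

print(round(calibrate(515.0), 1))   # a reading midway between the references
```

Repeating this against reference gases in the field, as the paper describes, corrects for sensor drift between flights.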

  16. Sensitivity of surface meteorological analyses to observation networks

    Science.gov (United States)

    Tyndall, Daniel Paul

    A computationally efficient variational analysis system for two-dimensional meteorological fields is developed and described. This analysis approach is most efficient when the number of analysis grid points is much larger than the number of available observations, such as for large domain mesoscale analyses. The analysis system is developed using MATLAB software and can take advantage of multiple processors or processor cores. A version of the analysis system has been exported as a platform independent application (i.e., can be run on Windows, Linux, or Macintosh OS X desktop computers without a MATLAB license) with input/output operations handled by commonly available internet software combined with data archives at the University of Utah. The impact of observation networks on the meteorological analyses is assessed by utilizing a percentile ranking of individual observation sensitivity and impact, which is computed by using the adjoint of the variational surface assimilation system. This methodology is demonstrated using a case study of the analysis from 1400 UTC 27 October 2010 over the entire contiguous United States domain. The sensitivity of this approach to the dependence of the background error covariance on observation density is examined. Observation sensitivity and impact provide insight on the influence of observations from heterogeneous observing networks as well as serve as objective metrics for quality control procedures that may help to identify stations with significant siting, reporting, or representativeness issues.
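The variational blending of a background field with sparse observations that underlies such an analysis system can be illustrated with a toy linear example. The grid size, covariances, and observation values below are invented, and the real system is far larger (and MATLAB-based); this only shows the structure of the calculation.

```python
import numpy as np

# Toy variational surface analysis on a 5-point grid with 2 point observations.
n = 5                                   # analysis grid points
xb = np.full(n, 280.0)                  # background temperatures (K), invented
H = np.zeros((2, n)); H[0, 1] = 1.0; H[1, 3] = 1.0   # obs at grid points 1 and 3
y = np.array([282.0, 279.0])            # observed values (K), invented
B = 4.0 * np.eye(n)                     # background error covariance (assumed)
R = 1.0 * np.eye(2)                     # observation error covariance (assumed)

# Minimising J(x) = (x-xb)' B^-1 (x-xb) + (Hx-y)' R^-1 (Hx-y) gives the
# analysis xa = xb + K (y - H xb) with gain K = B H' (H B H' + R)^-1.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)

# Each column of K quantifies one observation's influence on the analysis,
# the kind of quantity an adjoint-based sensitivity study ranks.
print(xa)
```

With these (assumed) covariances each observation pulls its grid point 80% of the way toward the observed value, while unobserved points keep the background value.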

  17. Transmission Characteristics of Primate Vocalizations: Implications for Acoustic Analyses

    Science.gov (United States)

    Maciej, Peter; Fischer, Julia; Hammerschmidt, Kurt

    2011-01-01

    Acoustic analyses have become a staple method in field studies of animal vocal communication, with nearly all investigations using computer-based approaches to extract specific features from sounds. Various algorithms can be used to extract acoustic variables that may then be related to variables such as individual identity, context or reproductive state. Habitat structure and recording conditions, however, have strong effects on the acoustic structure of sound signals. The purpose of this study was to identify which acoustic parameters reliably describe features of propagated sounds. We conducted broadcast experiments and examined the influence of habitat type, transmission height, and re-recording distance on the validity (deviation from the original sound) and reliability (variation within identical recording conditions) of acoustic features of different primate call types. Validity and reliability varied independently of each other in relation to habitat, transmission height, and re-recording distance, and depended strongly on the call type. The smallest deviations from the original sounds were obtained by a visually-controlled calculation of the fundamental frequency. Start- and end parameters of a sound were most susceptible to degradation in the environment. Because the recording conditions can have appreciable effects on acoustic parameters, it is advisable to validate the extraction method of acoustic variables from recordings over longer distances before using them in acoustic analyses. PMID:21829682

  18. Deflection type energy analyser for energetic electron beams in a beam-plasma system

    International Nuclear Information System (INIS)

    Michel, J.A.; Hogge, J.P.

    1988-11-01

    An energy analyser for the study of electron beam distribution functions in unmagnetized plasmas is described. This analyser is designed to avoid the large electric fields which are created in multi-grid analysers and to measure the beam distribution function directly, without differentiation. As an example of an application we present results on the propagation of an energetic beam (E_b = 2.0 keV) in a plasma (n_0 = 1×10^10 cm^-3, T_e = 1.4 eV). (author) 7 figs., 10 refs

  19. Improving Climate Communication through Comprehensive Linguistic Analyses Using Computational Tools

    Science.gov (United States)

    Gann, T. M.; Matlock, T.

    2014-12-01

    An important lesson on climate communication research is that there is no single way to reach out and inform the public. Different groups conceptualize climate issues in different ways and different groups have different values and assumptions. This variability makes it extremely difficult to effectively and objectively communicate climate information. One of the main challenges is the following: How do we acquire a better understanding of how values and assumptions vary across groups, including political groups? A necessary starting point is to pay close attention to the linguistic content of messages used across current popular media sources. Careful analyses of that information—including how it is realized in language for conservative and progressive media—may ultimately help climate scientists, government agency officials, journalists and others develop more effective messages. Past research has looked at partisan media coverage of climate change, but little attention has been given to the fine-grained linguistic content of such media. And when researchers have done detailed linguistic analyses, they have relied primarily on hand-coding, an approach that is costly, labor intensive, and time-consuming. Our project, building on recent work on partisan news media (Gann & Matlock, 2014; under review) uses high dimensional semantic analyses and other methods of automated classification techniques from the field of natural language processing to quantify how climate issues are characterized in media sources that differ according to political orientation. In addition to discussing varied linguistic patterns, we share new methods for improving climate communication for varied stakeholders, and for developing better assessments of their effectiveness.
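One simple automated-classification ingredient of the kind the project uses is a smoothed log-odds comparison of word frequencies across partisan corpora. The two mini-corpora below are invented and the smoothing is deliberately simplified; this is an illustration of the technique, not the project's actual pipeline.

```python
from collections import Counter
import math

# Invented toy corpora standing in for conservative and progressive coverage.
conservative = "climate tax cost jobs tax regulation cost".split()
progressive = "climate science warming emissions science risk warming".split()

def log_odds(word, a, b, alpha=0.5):
    """Simplified smoothed log-odds of `word` appearing in corpus a vs corpus b.

    Positive values mean the word is more characteristic of corpus a.
    """
    ca, cb = Counter(a), Counter(b)
    pa = (ca[word] + alpha) / (len(a) + alpha)
    pb = (cb[word] + alpha) / (len(b) + alpha)
    return math.log(pa / pb)

for w in ["tax", "science", "climate"]:
    print(w, round(log_odds(w, conservative, progressive), 2))
```

Words shared equally by both corpora score near zero, while words concentrated in one outlet's coverage score strongly positive or negative, which is how characteristic framings surface from the text.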

  20. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study

  1. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding dependencies of sub-systems and packages in CMS Object Oriented software, to understand architectural structure, and to schedule code release in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front or re-factor the code in case it was already too close to the edge of non-maintainability. The authors discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides
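Detecting the circular dependencies such a tool helps avoid reduces to cycle detection in the package dependency graph. A minimal depth-first-search sketch follows, with invented package names; this is not DepUty's own algorithm.

```python
def find_cycle(deps):
    """Return one dependency cycle as a list of packages, or None.

    `deps` maps each package name to the packages it depends on.
    """
    # Give every mentioned package an entry so lookups never fail.
    graph = {p: list(qs) for p, qs in deps.items()}
    for qs in deps.values():
        for q in qs:
            graph.setdefault(q, [])

    state = dict.fromkeys(graph, "unvisited")
    stack = []                                     # current DFS path

    def visit(p):
        state[p] = "in-progress"
        stack.append(p)
        for q in graph[p]:
            if state[q] == "in-progress":          # back edge: cycle found
                return stack[stack.index(q):] + [q]
            if state[q] == "unvisited":
                cycle = visit(q)
                if cycle:
                    return cycle
        stack.pop()
        state[p] = "done"
        return None

    for p in graph:
        if state[p] == "unvisited":
            cycle = visit(p)
            if cycle:
                return cycle
    return None

# Invented package names: a release cannot be scheduled while the last
# package depends back on the first.
deps = {"Framework": ["Geometry"], "Geometry": ["Reco"], "Reco": ["Framework"]}
print(find_cycle(deps))
```

Run on each candidate release, an empty result means the packages can be built and released in dependency order.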

  2. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of the small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence from the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
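The general idea of replacing an expensive code with a fitted response surface can be sketched as follows. The stand-in "expensive code", the quadratic feature set, and the least-squares fit are illustrative assumptions (the OSE itself is a different estimator); only the sample size of 59 echoes the number of RELAP5 runs mentioned above.

```python
import numpy as np

# Stand-in for 59 expensive code runs: peak cladding temperature (K) as a
# function of two normalised inputs. The function is entirely invented.
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, size=(59, 2))

def expensive_code(x):
    return 900.0 + 150.0 * x[0] - 80.0 * x[1] + 40.0 * x[0] * x[1]

y = np.array([expensive_code(x) for x in X])

# Fit a quadratic response surface to the runs by least squares.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

def surrogate(x):
    """Cheap replacement for the code, usable in large statistical samplings."""
    return float((features(np.atleast_2d(x)) @ coef)[0])

print(surrogate([0.5, 0.5]))
```

Once fitted, the surrogate is evaluated thousands of times at negligible cost in place of the thermal-hydraulic code.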

  3. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world with its well planned urban settlements, advanced handicraft and technology, religious and trade activities. Using a Geographical Information Systems (GIS, this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and the change in some factors, such as topographic features, river passages or sea level changes, will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  4. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out simulation languages were used in the batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture in the context of small models using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced Work Stations and graphics software has enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  5. Abundance analyses of thirty cool carbon stars

    International Nuclear Information System (INIS)

    Utsumi, Kazuhiko

    1985-01-01

    The results were previously obtained by use of the absolute gf-values and the cosmic abundance as a standard. These gf-values were found to contain large systematic errors, and as a result, the solar photospheric abundances were revised. Our previous results, therefore, must be revised by using new gf-values, and abundance analyses are extended to as many carbon stars as possible. In conclusion, in normal cool carbon stars heavy metals are overabundant by factors of 10-100 and rare-earth elements are overabundant by a factor of about 10, and in J-type cool carbon stars the 12C/13C ratio is smaller, the C2 and CN bands and the Li 6708 Å line are stronger than in normal cool carbon stars, and the abundances of s-process elements with respect to Fe are nearly normal. (Mori, K.)

  6. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  7. Precise Chemical Analyses of Planetary Surfaces

    Science.gov (United States)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  8. Seismic analyses of structures. 1st draft

    International Nuclear Information System (INIS)

    David, M.

    1995-01-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as the response to seismic input. The analysis was performed on a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of the dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes, with figures of indicated nodes and important modes of free vibration
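
    The spectrum-generation step described in this record can be sketched as follows: sweep a set of damped single-degree-of-freedom oscillators over a floor acceleration time history and record each one's peak absolute acceleration. The input history, frequencies and damping below are synthetic stand-ins, not Paks NPP data.

```python
import math

dt, t_end = 0.005, 10.0
n = int(t_end / dt)
# Synthetic floor acceleration history: a decaying 5 Hz burst
acc = [math.exp(-0.5 * i * dt) * math.sin(2 * math.pi * 5.0 * i * dt)
       for i in range(n)]

def spectral_acc(f_hz, damping=0.05):
    """Peak absolute acceleration of a damped SDOF oscillator on this floor."""
    w = 2 * math.pi * f_hz
    u = v = peak = 0.0
    for a in acc:  # semi-implicit Euler for u'' + 2*zeta*w*u' + w^2*u = -a
        rel_acc = -a - 2 * damping * w * v - w * w * u
        v += rel_acc * dt
        u += v * dt
        peak = max(peak, abs(rel_acc + a))  # absolute = relative + base
    return peak

spectrum = {f: spectral_acc(f) for f in (1.0, 5.0, 20.0)}
print(max(spectrum, key=spectrum.get))
```

    At 5% damping the 5 Hz oscillator resonates with the synthetic input, so the spectrum peaks at that frequency.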

  9. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.
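
    The kind of semi-quantitative analysis described above can be illustrated with a small impact matrix: direct impacts between variables are summed with indirect impacts along longer paths, and the row sums suggest which variables most strongly drive the system, i.e. candidates for intervention. The three variables and all weights below are invented for illustration; they are not the paper's 16-variable model.

```python
def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def total_influence(direct, order=3):
    """Sum direct impacts and indirect impacts along paths up to `order`."""
    total = [row[:] for row in direct]
    power = [row[:] for row in direct]
    for _ in range(order - 1):
        power = mat_mul(power, direct)
        for i in range(len(direct)):
            for j in range(len(direct)):
                total[i][j] += power[i][j]
    return total

variables = ["funding", "recruitment", "political influence"]
# direct[i][j]: invented direct impact of variable i on variable j
direct = [[0.0, 0.6, 0.2],
          [0.0, 0.0, 0.5],
          [0.3, 0.0, 0.0]]

# Active sum: how strongly each variable drives the rest of the system,
# a candidate criterion for picking government intervention points.
active = {v: round(sum(row), 3)
          for v, row in zip(variables, total_influence(direct))}
print(active)
```

    With these invented weights, "funding" has the largest active sum and would be the prime intervention candidate.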

  10. Seismic analyses of structures. 1st draft

    Energy Technology Data Exchange (ETDEWEB)

    David, M. [David Consulting, Engineering and Design Office (Czech Republic)]

    1995-07-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as the response to seismic input. The analysis was performed on a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of the dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes, with figures of indicated nodes and important modes of free vibration.

  11. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules, involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and a demonstration of how to evaluate and understand the worth of research and development both to JPL and to other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  12. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of the quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with an assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed
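
    The quantification logic of a decomposition event tree can be sketched in a few lines: each DET heading contributes a conditional branch probability, and an end state's probability is the product along its path. The headings, probabilities and coolability rule below are invented, not the values of the referenced analysis.

```python
from itertools import product

# Invented DET headings with conditional branch probabilities
headings = {
    "vessel_pressure": {"low": 0.7, "high": 0.3},
    "debris_config":   {"dispersed": 0.6, "compact": 0.4},
}

def coolable(pressure, config):
    # invented rule: debris is coolable unless pressure is high AND
    # the debris ends up in a compact configuration
    return not (pressure == "high" and config == "compact")

p_coolable = 0.0
for (pres, p1), (cfg, p2) in product(headings["vessel_pressure"].items(),
                                     headings["debris_config"].items()):
    if coolable(pres, cfg):
        p_coolable += p1 * p2  # path probability = product along branches
print(round(p_coolable, 2))
```

    A real DET has many more headings, but the quantification is the same product-and-sum over paths.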

  13. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats and especially insider threats. Especially for the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside...... attackers. Therefore, many attacks are considerably easier to be performed for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex......, if not impossible task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...

  14. ATLAS helicity analyses in beauty hadron decays

    CERN Document Server

    Smizanska, M

    2000-01-01

    The ATLAS detector will allow a precise spatial reconstruction of the kinematics of B hadron decays. In combination with the efficient lepton identification applied already at trigger level, ATLAS is expected to provide large samples of exclusive decay channels cleanly separable from background. These data sets will allow spin-dependent analyses leading to the determination of production and decay parameters, which are not accessible if the helicity amplitudes are not separated. Measurement feasibility studies for the decays B_s^0 → J/ψ φ and Λ_b^0 → Λ J/ψ, presented in this document, show the experimental precisions that can be achieved in the determination of B_s^0 and Λ_b^0 characteristics. (19 refs).

  15. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on description of the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  16. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and the connections between them on the basis of good-quality numeric parameters. Such topographic features are important for a range of natural processes and human activities. Important morphological characteristics are, in particular, the slope angle of the terrain, hypsometry, topographic exposition and so on. Even small and easily overlooked relief slants can deeply affect land configuration, hypsometry, topographic exposition etc. Expositions modify light and heat and, with them, a set of interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow limit etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006

  17. Attitude stability analyses for small artificial satellites

    International Nuclear Information System (INIS)

    Silva, W R; Zanardi, M C; Formiga, J K S; Cabette, R E S; Stuchi, T J

    2013-01-01

    The objective of this paper is to analyse the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by the Andoyer variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here by the Kovalev-Savchenko theorem, with which it is possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined, and regions around these points have been established by variations in the orbital inclination and in the spacecraft's principal moments of inertia. The present analysis can contribute directly to the maintenance of the spacecraft's attitude
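
    A minimal illustration of the first step of such a stability analysis, classifying an equilibrium from the eigenvalues of the linearized dynamics, is sketched below. The 2x2 system is invented; it merely shows the situation the Kovalev-Savchenko theorem addresses: purely imaginary eigenvalues, where linear analysis alone is inconclusive and higher-order terms of the Hamiltonian must be examined.

```python
import cmath

def eig2(a, b, c, d):
    """Eigenvalues of the 2x2 matrix [[a, b], [c, d]]."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def classify(jac, tol=1e-12):
    lam = eig2(*jac)
    if all(l.real < -tol for l in lam):
        return "asymptotically stable"
    if all(abs(l.real) <= tol for l in lam):
        return "marginally stable (nonlinear terms decide)"
    return "unstable"

# Gravity-gradient-like restoring torque with no damping: the eigenvalues
# are purely imaginary, so the linear analysis is inconclusive and normal
# form (higher-order) analysis is required, as in the record above.
print(classify((0, 1, -4, 0)))
```
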

  18. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.

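
    An Engle-Granger-style cointegration check of the kind used in such studies can be sketched on synthetic data (the series below are invented, not the Croatian inflation data): regress one series on the other, then test whether the residual mean-reverts, here read off from the AR(1) slope of the residuals.

```python
import random

random.seed(1)

# Synthetic cointegrated pair: "prices" follow a random walk and "wages"
# track prices plus stationary noise.
n = 500
prices = [0.0]
for _ in range(n - 1):
    prices.append(prices[-1] + random.gauss(0, 1))
wages = [2.0 * p + random.gauss(0, 1) for p in prices]

def ols(x, y):
    """Return (slope, intercept) of an ordinary least squares fit."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

beta, alpha = ols(prices, wages)
resid = [w - (alpha + beta * p) for w, p in zip(wages, prices)]

# AR(1) slope of the residuals: near 0 means strong mean reversion
# (cointegration plausible); near 1 means the residual itself drifts.
rho, _ = ols(resid[:-1], resid[1:])
print(round(beta, 2), round(rho, 2))
```

    A production analysis would use proper Dickey-Fuller critical values for the residual test rather than eyeballing the AR(1) slope.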

  19. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  20. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

    Dua, S.S.; Moody, F.J.; Muralidharan, R.; Claassen, L.B.

    2004-01-01

    Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena alongside of probability methods to evaluate risks associated with severe accidents. Recently GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that with appropriate system modifications, and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core-melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate severe accidents

  1. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and its return period; and the vulnerability, which represents all the property and people that may be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work is concerned essentially with taking vulnerability into account in the management of natural risks. Assessing vulnerability necessarily requires a degree of spatial analysis that takes into account human occupation and the different scales of land use. But spatial assessment, whether of property and people or of indirect effects, runs into numerous problems. The extent of land occupation has to be estimated. Moreover, processing the data involves constant changes of scale in order to pass from point elements to surfaces, something that geographic information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural risks imply. hazard, spatial analysis, natural risks, GIS, vulnerability

  2. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in the Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, excluding the area masked due to Galactic contamination, and compare the results with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through χ2 analyses at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-region analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values surround the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
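
    The local variance test described above can be mimicked on toy data: compute the variance of each sky patch in the "observed" map, compare it with the distribution of the same statistic across an ensemble of simulated maps, and flag patches deviating by more than 2σ. All numbers below are synthetic stand-ins for the Planck products.

```python
import random
import statistics

random.seed(42)
n_patches, n_sims, pix_per_patch = 12, 200, 50

def patch_variances(anomalous=None):
    """Per-patch variance of a synthetic map; one patch may get extra power."""
    out = []
    for p in range(n_patches):
        sigma = 2.0 if p == anomalous else 1.0
        out.append(statistics.pvariance(
            [random.gauss(0, sigma) for _ in range(pix_per_patch)]))
    return out

sims = [patch_variances() for _ in range(n_sims)]
obs = patch_variances(anomalous=3)  # inject one anomalous patch

flagged = []
for p in range(n_patches):
    dist = [s[p] for s in sims]
    mu, sd = statistics.mean(dist), statistics.stdev(dist)
    if abs(obs[p] - mu) / sd > 2.0:  # deviates by more than 2 sigma
        flagged.append(p)
print(flagged)
```

    The injected patch is recovered; on real maps the same comparison is made against the full simulation ensemble, masked pixels excluded.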

  3. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating of components in the outer regions of the machine inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project with respect to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP version 5 1.40 has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analyses for the remote handling port and the diagnostic rack are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided, and the implications for the design are discussed.

  4. Technology advancement for integrative stem cell analyses.

    Science.gov (United States)

    Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi

    2014-12-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among the individual constituents within a given population. The problems associated with such a blanket approach only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and there are many technical challenges that limit vertically integrated analytical tools. Therefore, by introducing the concepts of vertical and horizontal approaches, we propose that adequate methods for the integration of information are needed, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment.

  5. Static electric fields modify the locomotory behaviour of cockroaches.

    Science.gov (United States)

    Jackson, Christopher W; Hunt, Edmund; Sharkh, Suleiman; Newland, Philip L

    2011-06-15

    Static electric fields are found throughout the environment and there is growing interest in how electric fields influence insect behaviour. Here we have analysed the locomotory behaviour of cockroaches (Periplaneta americana) in response to static electric fields at levels equal to and above those found in the natural environment. Walking behaviour (including velocity, distance moved, turn angle and time spent walking) was analysed as cockroaches approached an electric field boundary in an open arena, and also when they were continuously exposed to an electric field. On approaching an electric field boundary, the greater the electric field strength, the more likely a cockroach was to turn away from, or be repelled by, the field. Cockroaches completely exposed to electric fields showed significant changes in locomotion, covering less distance, walking more slowly and turning more often. This study highlights the importance of electric fields for the normal locomotory behaviour of insects.

  6. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band whose precise position depends on the masses of the nuclei involved, i.e. on the isotope. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic

  7. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    Full Text Available In the testing of drugs before marketing and in their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has almost completely replaced gas chromatography in pharmaceutical analysis. The application of a liquid mobile phase, with the possibility of changing its polarity during chromatography, and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage of the separation process in comparison to other methods. The wider choice of stationary phases is the next factor that enables good separation. The separation column is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector); these, together with hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results, and monitor the progress of therapy of a disease.1 The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  8. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the testing of drugs before marketing and in their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has almost completely replaced gas chromatography in pharmaceutical analysis. The application of a liquid mobile phase, with the possibility of changing its polarity during chromatography, and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage of the separation process in comparison to other methods. The wider choice of stationary phases is the next factor that enables good separation. The separation column is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector); these, together with hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results, and monitor the progress of therapy of a disease.1) The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  9. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-09-01

    Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs, and Simple Regression was applied to test the hypotheses, at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers' attitudes towards knowledge sharing. Expected reward was not significantly related to lawyers' attitudes towards knowledge sharing. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict a positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area should deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.

  10. Optimizing Extender Code for NCSX Analyses

    International Nuclear Information System (INIS)

    Richman, M.; Ethier, S.; Pomphrey, N.

    2008-01-01

    Extender is a parallel C++ code for calculating the magnetic field in the vacuum region of a stellarator. The code was optimized for speed and augmented with tools to maintain a specialized NetCDF database. Two parallel algorithms were examined: an even-block work-distribution scheme was comparable in performance to a master-slave scheme. Large speedup factors were achieved by representing the plasma surface with a spline rather than a Fourier series. The accuracy of this representation, and of the resulting calculations, relied on the density of the spline mesh. The Fortran 90 module db_access was written to make it easy to store Extender output in a manageable database. New or updated data can be added to existing databases, and a generalized PBS job script handles the generation of a database from scratch
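
    The remark that accuracy relied on the density of the spline mesh can be illustrated with a toy convergence check: sample a smooth profile on meshes of two densities and measure the interpolation error. Piecewise-linear interpolation of an invented 1-D profile stands in here for Extender's surface splines.

```python
import math

def profile(t):
    # invented smooth 1-D profile standing in for a Fourier-series surface
    return math.sin(t) + 0.3 * math.cos(3 * t)

def interp_error(n_knots, n_test=1000):
    """Max interpolation error over [0, 2*pi] for a mesh of n_knots points."""
    knots = [2 * math.pi * i / (n_knots - 1) for i in range(n_knots)]
    vals = [profile(t) for t in knots]
    err = 0.0
    for j in range(n_test):
        t = 2 * math.pi * j / (n_test - 1)
        # locate the bracketing knots and interpolate linearly between them
        i = min(int(t / (2 * math.pi) * (n_knots - 1)), n_knots - 2)
        w = (t - knots[i]) / (knots[i + 1] - knots[i])
        err = max(err, abs((1 - w) * vals[i] + w * vals[i + 1] - profile(t)))
    return err

coarse, fine = interp_error(16), interp_error(128)
print(coarse > fine)  # the denser mesh gives the smaller error
```
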

  11. Analysing human genomes at different scales

    DEFF Research Database (Denmark)

    Liu, Siyang

    The thriving of the Next-Generation sequencing (NGS) technologies in the past decade has dramatically revolutionized the field of human genetics. We are experiencing a wave of several large-scale whole genome sequencing studies of humans in the world. Those studies vary greatly regarding cohort...... will be reflected by the analysis of real data. This thesis covers studies in two human genome sequencing projects that distinctly differ in terms of studied population, sample size and sequencing depth. In the first project, we sequenced 150 Danish individuals from 50 trio families to 78x coverage....... The sophisticated experimental design enables high-quality de novo assembly of the genomes and provides a good opportunity for mapping the structural variations in the human population. We developed the AsmVar approach to discover, genotype and characterize the structural variations from the assemblies. Our...

  12. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far, few comprehensive error analyses for back projection methods have been conducted, although it is evident that high frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme combining the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources, both in space and in time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth; e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located near the nodal direction of the direct P-waves, the teleseismic P-waves are dominated by the depth phases. In that case, back projections are actually imaging the reflection points of the depth phases rather than the rupture front. Besides depth phases, the strong and long-lasting coda waves caused by 3D effects near the trench, also tested here, can introduce additional complexities. The strength contrast of different frequency contents in the rupture models also produces some variation in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.
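
    A toy delay-and-sum source scan makes the basic imaging idea above concrete: test candidate grid points for the one whose predicted travel times best align the station records. The geometry, wave speed and source below are invented; real back projection stacks full waveforms and, as discussed above, can be biased by depth phases and 3D structure.

```python
import math

v = 3.0                                  # assumed wave speed
stations = [(0, 10), (10, 10), (10, 0), (0, 0)]
true_src = (3.0, 4.0)                    # invented source location

def ttime(src, sta):
    return math.dist(src, sta) / v

arrivals = [ttime(true_src, s) for s in stations]

best, best_score = None, float("inf")
for gx in range(11):
    for gy in range(11):
        pred = [ttime((gx, gy), s) for s in stations]
        # origin time is unknown, so remove the mean shift before scoring
        shifts = [a - p for a, p in zip(arrivals, pred)]
        m = sum(shifts) / len(shifts)
        score = sum((s - m) ** 2 for s in shifts)
        if score < best_score:
            best, best_score = (gx, gy), score
print(best)
```

    With this clean geometry the scan recovers the true grid point; a depth phase would add a second, delayed arrival and pull the best-fit point away from the true source, which is exactly the bias quantified in the record above.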

  13. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are used in extremely varied domains, from academic environments to industrial ones. The overall theoretical bases, the main technical characteristics, and complementary information on practical usage and maintenance are developed in this book. High-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the latest generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Beside these main topics, other analysis and observation techniques are covered, such as EBSD (electron backscattering diffraction), 3-D imaging, FIB (focussed ion beams), Monte Carlo simulations, in-situ tests, etc. This book, in French, is the only one that treats the subject in such an exhaustive way. It is the fully updated version of a previous edition of 1979 and gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France).
Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  14. The ASSET intercomparison of stratosphere and lower mesosphere humidity analyses

    Directory of Open Access Journals (Sweden)

    H. E. Thornton

    2009-02-01

    Full Text Available This paper presents results from the first detailed intercomparison of stratosphere-lower mesosphere water vapour analyses; it builds on earlier results from the EU funded framework V "Assimilation of ENVISAT Data" (ASSET) project. Stratospheric water vapour plays an important role in many key atmospheric processes, and an improved understanding of its daily variability is therefore desirable. With the availability of high resolution, good quality Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) water vapour profiles, the ability of four different atmospheric models to assimilate these data is tested. MIPAS data have been assimilated over September 2003 into the models of the European Centre for Medium Range Weather Forecasts (ECMWF), the Belgian Institute for Space and Aeronomy (BIRA-IASB), the French Service d'Aéronomie (SA-IPSL) and the UK Met Office. The resultant middle atmosphere humidity analyses are compared against independent satellite data from the Halogen Occultation Experiment (HALOE), the Polar Ozone and Aerosol Measurement (POAM III) and the Stratospheric Aerosol and Gas Experiment (SAGE II). The MIPAS water vapour profiles are generally well assimilated in the ECMWF, BIRA-IASB and SA systems, producing stratosphere-mesosphere water vapour fields whose main features compare favourably with the independent observations. However, the models are less capable of assimilating the MIPAS data where water vapour values are locally extreme or in regions of strong humidity gradients, such as the southern hemisphere lower stratosphere polar vortex. Differences in the analyses can be attributed to the choice of humidity control variable, how the background error covariance matrix is generated, the model resolution and its complexity, the degree of quality control of the observations and the use of observations near the model boundaries. Due to the poor performance of the Met Office analyses, the results are not included in the intercomparison.

  15. The ASSET intercomparison of stratosphere and lower mesosphere humidity analyses

    Science.gov (United States)

    Thornton, H. E.; Jackson, D. R.; Bekki, S.; Bormann, N.; Errera, Q.; Geer, A. J.; Lahoz, W. A.; Rharmili, S.

    2009-02-01

    This paper presents results from the first detailed intercomparison of stratosphere-lower mesosphere water vapour analyses; it builds on earlier results from the EU funded framework V "Assimilation of ENVISAT Data" (ASSET) project. Stratospheric water vapour plays an important role in many key atmospheric processes and therefore an improved understanding of its daily variability is desirable. With the availability of high resolution, good quality Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) water vapour profiles, the ability of four different atmospheric models to assimilate these data is tested. MIPAS data have been assimilated over September 2003 into the models of the European Centre for Medium Range Weather Forecasts (ECMWF), the Belgian Institute for Space and Aeronomy (BIRA-IASB), the French Service d'Aéronomie (SA-IPSL) and the UK Met Office. The resultant middle atmosphere humidity analyses are compared against independent satellite data from the Halogen Occultation Experiment (HALOE), the Polar Ozone and Aerosol Measurement (POAM III) and the Stratospheric Aerosol and Gas Experiment (SAGE II). The MIPAS water vapour profiles are generally well assimilated in the ECMWF, BIRA-IASB and SA systems, producing stratosphere-mesosphere water vapour fields where the main features compare favourably with the independent observations. However, the models are less capable of assimilating the MIPAS data where water vapour values are locally extreme or in regions of strong humidity gradients, such as the southern hemisphere lower stratosphere polar vortex. Differences in the analyses can be attributed to the choice of humidity control variable, how the background error covariance matrix is generated, the model resolution and its complexity, the degree of quality control of the observations and the use of observations near the model boundaries. 
Due to the poor performance of the Met Office analyses, the results are not included in the intercomparison.

  16. Quality assessment of published health economic analyses from South America.

    Science.gov (United States)

    Machado, Márcio; Iskedjian, Michael; Einarson, Thomas R

    2006-05-01

    Articles published in South America were rated poor to acceptable and lower than in previous research from other countries. Thus, efforts are needed to improve the reporting quality of these analyses in South America. Future research should examine the region's level of expertise and educational opportunities for those in the field of health economics.

  17. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
This makes it easy to join, extend, and combine datasets and hence work collectively, but
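To make the idea of a self-describing dataset concrete, here is a sketch of reading a QSAR-ML-style document with the standard library. The element and attribute names below are illustrative only, not the published QSAR-ML schema; the point is that each descriptor carries its implementation and version, so the setup is reproducible:

```python
import xml.etree.ElementTree as ET

# Hypothetical, minimal QSAR-ML-style dataset (names are not the real schema).
doc = """<qsarml>
  <structures>
    <structure id="mol1" smiles="CCO"/>
    <structure id="mol2" smiles="c1ccccc1"/>
  </structures>
  <descriptors>
    <descriptor id="XLogP"
                ontologyRef="http://example.org/descriptor-ontology#xlogp"
                implementation="CDK" version="1.4.15"/>
  </descriptors>
</qsarml>"""

root = ET.fromstring(doc)
mols = [s.get("smiles") for s in root.iter("structure")]
desc = root.find("descriptors/descriptor")

print(mols)
print(desc.get("implementation"), desc.get("version"))
```

Because the descriptor element pins both software and version, a second lab parsing the same file can recompute the descriptor matrix with exactly the components the original authors used.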

  18. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
This makes it easy to join

  19. Progression of visual field in patients with primary open-angle glaucoma - ProgF study 1.

    Science.gov (United States)

    Aptel, Florent; Aryal-Charles, Nishal; Giraud, Jean-Marie; El Chehab, Hussam; Delbarre, Maxime; Chiquet, Christophe; Romanet, Jean-Paul; Renard, Jean-Paul

    2015-12-01

    To evaluate the visual field rate of progression of patients with treated ocular hypertension (OHT) and primary open-angle glaucoma (POAG) in clinical practice, using the mean deviation (MD) and the visual field index (VFI). Non-interventional cohort study. From a large multicentre database representative of the French population, 441 eyes of 228 patients with treated OHT or POAG followed up for at least 6 years with Humphrey 24-2 SITA-Standard visual field examinations at least twice a year were identified. From initial data, eyes were classified into five groups: 121 with OHT, 188 with early glaucoma (MD greater than -6 dB), 45 with moderate glaucoma (MD -6 to -12 dB), 41 with advanced glaucoma (MD -12 to -18 dB) and 46 with severe glaucoma (MD less than -18 dB). The rate of progression during the follow-up period was calculated using the trend analysis of the Guided Progression Analysis software. The mean duration of follow-up was 8.4 ± 2.7 years and the mean number of visual field examinations was 18.4 ± 3.5. In eyes with OHT, the rate of progression was -0.09 dB/year (-0.17% VFI/year). In eyes with POAG, the rate of progression was -0.32 dB/year (-0.83% VFI/year) in early glaucoma, -0.52 dB/year (-1.81% VFI/year) in moderate glaucoma, -0.54 dB/year (-2.35% VFI/year) in advanced glaucoma and -0.45 dB/year (-1.97% VFI/year) in severe glaucoma. In eyes with POAG, a significant progression was observed. Primary open-angle glaucoma is a progressive disease in the majority of patients despite treatment and careful follow-up. The rate of progression varies greatly among subjects. © 2015 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
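The trend analysis behind the rates quoted above is, at its core, an ordinary least-squares slope of MD against time, reported in dB/year. The MD series below is invented for illustration and is not from the study:

```python
import numpy as np

# Hypothetical MD values (dB) from twice-yearly Humphrey 24-2 examinations.
years = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
md_db = np.array([-3.1, -3.3, -3.2, -3.6, -3.5, -3.9, -3.8, -4.2, -4.1])

# OLS fit of MD vs time: the slope is the rate of progression in dB/year.
slope, intercept = np.polyfit(years, md_db, 1)
print(f"rate of progression: {slope:.2f} dB/year")
```

A slope around -0.3 dB/year would fall in the early-glaucoma range reported in the study; software such as GPA additionally tests whether the slope differs significantly from zero.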

  20. Magnetic fields of HgMn stars

    DEFF Research Database (Denmark)

    Hubrig, S.; González, J. F.; Ilyin, I.

    2012-01-01

    Context. The frequent presence of weak magnetic fields on the surface of spotted late-B stars with HgMn peculiarity in binary systems has been controversial during the last two decades. Recent studies of magnetic fields in these stars using the least-squares deconvolution (LSD) technique have failed to detect magnetic fields, indicating an upper limit on the longitudinal field between 8 and 15 G. In these LSD studies, assumptions were made that all spectral lines are identical in shape and can be described by a scaled mean profile. Aims. We re-analyse the available spectropolarimetric material...

  1. Multichannel amplitude analyser for nuclear spectrometry

    International Nuclear Information System (INIS)

    Jankovic, S.; Milovanovic, B.

    2003-01-01

    A multichannel amplitude analyser with 4096 channels was designed. It is based on a fast 12-bit analog-to-digital converter. The intended purpose of the instrument is recording nuclear spectra by means of scintillation detectors. The computer link is established through an opto-isolated serial connection cable, thus reducing the instrument's sensitivity to disturbances originating from the digital circuitry. The data displayed on the screen are refreshed every 2.5 seconds. Impulse peak detection is implemented through differentiation of the amplified input signal, while synchronization with the data coming from the converter output is established by taking advantage of the internal pipeline structure of the converter itself. The mode of operation of the built-in microcontroller ensures that no impulses are missed, and a simple logic network prevents the initiation of the amplitude reading sequence for the next impulse if it arrives shortly after the preceding one. The solution proposed here demonstrated good performance at a comparatively low manufacturing cost, and is thus suitable for educational purposes (author)
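The differentiation-based peak detection described above can be sketched in software: a peak is registered where the derivative of the amplified signal crosses zero from positive to negative while the amplitude exceeds a noise threshold, and the peak amplitude is then binned into one of 4096 channels. The pulse shapes, sampling rate and 0-2 V input range below are assumptions for illustration, not the instrument's actual parameters:

```python
import numpy as np

fs = 1_000_000                        # assumed 1 MHz sampling
t = np.arange(0.0, 200e-6, 1.0 / fs)

# Two Gaussian-shaped detector pulses of different amplitude
signal = (np.exp(-((t - 50e-6) / 5e-6) ** 2) * 1.8 +
          np.exp(-((t - 140e-6) / 5e-6) ** 2) * 0.9)

deriv = np.diff(signal)
threshold = 0.2                       # reject baseline noise

# Peak: derivative crosses zero downward while amplitude is above threshold
peaks = [i for i in range(1, len(deriv))
         if deriv[i - 1] > 0 >= deriv[i] and signal[i] > threshold]

amplitudes = [signal[i] for i in peaks]
channels = [int(a / 2.0 * 4096) for a in amplitudes]  # map assumed 0-2 V range
print(len(peaks), channels)
```

The guard against a second impulse arriving "shortly after the preceding one" would correspond here to ignoring any zero crossing that falls within a dead-time window after an accepted peak.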

  2. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral profile (CSP) is of particular relevance in soft and scleral lens fitting. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK) the limbal scleral radii (SR) of 30 subjects (11 M, 19 F; mean age 23.8 ± 2.0 years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0; interquartile range, 46.8-84.8 mm) was significantly different from the other meridians. SR was related to corneal topography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  3. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
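A Bayesian Processor of Forecasts combines a climatological prior on the runoff with the issued forecast to produce a posterior distribution. Under a simple normal-linear model this reduces to the conjugate normal update sketched below; all numbers are hypothetical and the real processor uses empirically fitted likelihoods rather than an assumed unbiased normal error:

```python
import math

# Climatological prior on seasonal runoff W, in thousands of acre-feet
prior_mean, prior_sd = 500.0, 120.0
# Issued forecast, modelled (as an assumption) as W plus unbiased normal error
forecast, forecast_sd = 620.0, 80.0

# Conjugate normal update: posterior precision is the sum of the precisions,
# and the posterior mean is the precision-weighted average.
w_prior = 1.0 / prior_sd**2
w_fcst = 1.0 / forecast_sd**2
post_var = 1.0 / (w_prior + w_fcst)
post_mean = post_var * (w_prior * prior_mean + w_fcst * forecast)
post_sd = math.sqrt(post_var)

print(f"posterior runoff: {post_mean:.0f} +/- {post_sd:.0f}")
```

The posterior sits between climatology and the forecast, pulled toward whichever is more precise, and its spread gives users the calibrated uncertainty that item (iv) above is concerned with communicating.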

  4. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  5. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    Full Text Available This paper provides a case study on the application of wavelet techniques to analyse wind speed and energy (renewable and environmentally friendly energy). Solar and wind are the main sources of energy that give farmers the potential to use kinetic energy captured by a windmill for pumping water, drying crops, heating greenhouses, rural electrification or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study set out to initiate the data gathering process for wavelet analyses, different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37° 50′ N, longitude 30° 33′ E, at a height of 1200 m above mean sea level on a hill near the Süleyman Demirel University campus. 10-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed is 4.5 m/s at 10 m above ground level. Prevalent wind
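When turning 10-minute average speeds like these into an energy potential, the key point is that wind power density scales with the cube of speed, so one must average v³ rather than cube the average. A minimal sketch with made-up speed values (the study itself reports a 4.5 m/s annual mean at 10 m):

```python
import numpy as np

rho = 1.225                                    # air density, kg/m^3 (assumed)
# Hypothetical 10-minute average wind speeds, m/s
speeds = np.array([2.1, 3.4, 5.0, 4.2, 6.8, 3.9, 5.5, 4.6])

mean_speed = speeds.mean()

# Correct: average the cubes, since P/A = 0.5 * rho * v^3 per sample
power_density = 0.5 * rho * np.mean(speeds**3)
# Naive (wrong) alternative: cube the mean speed
naive = 0.5 * rho * mean_speed**3

print(f"mean speed {mean_speed:.2f} m/s")
print(f"power density {power_density:.0f} W/m^2 (cube-of-mean: {naive:.0f})")
```

The gap between the two numbers is why gusty sites outperform steady sites of the same mean speed, and why distribution-aware tools (Weibull fits, wavelet analyses of variability) matter for siting decisions.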

  6. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  7. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150-155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of at most 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50-500 μm; 58-70% moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of vertical movement of deflated soil particles lies at about 25 cm above the soil surface.
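The mode assignment above can be sketched as a simple size classification. The thresholds come from the ranges quoted in the abstract (suspension < 100 μm, saltation 50-500 μm); note the 50-100 μm overlap, where particles can travel in either mode, so it is counted separately here. The particle sizes are invented:

```python
# Hypothetical measured particle diameters in micrometres
sizes_um = [12, 45, 60, 80, 120, 250, 400, 480, 30, 90]

# Size ranges from the abstract; 50-100 um is ambiguous between modes
suspension_only = [d for d in sizes_um if d < 50]
overlap = [d for d in sizes_um if 50 <= d < 100]
saltation_only = [d for d in sizes_um if 100 <= d <= 500]

print(len(suspension_only), len(overlap), len(saltation_only))
```

In the study, the decisive variable for separating the modes was not size alone but trap height, with the crossover near 25 cm above the surface.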

  8. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  9. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double-shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61(c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs, and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev. 1A, Vol. IV, Section 4.16 (Banning 1999)

  10. Achieving reasonable conservatism in nuclear safety analyses

    International Nuclear Information System (INIS)

    Jamali, Kamiar

    2015-01-01

    In the absence of methods that explicitly account for uncertainties, seeking reasonable conservatism in nuclear safety analyses can quickly lead to extreme conservatism. The rate of divergence to extreme conservatism is often beyond the expert analysts’ intuitive feeling, but can be demonstrated mathematically. Too much conservatism in addressing the safety of nuclear facilities is not beneficial to society. Using certain properties of lognormal distributions for representation of input parameter uncertainties, example calculations for the risk and consequence of a fictitious facility accident scenario are presented. Results show that there are large differences between the calculated 95th percentiles and the extreme bounding values derived from using all input variables at their upper-bound estimates. Showing the relationship of the mean values to the key parameters of the output distributions, the paper concludes that the mean is the ideal candidate for representation of the value of an uncertain parameter. The mean value is proposed as the metric that is consistent with the concept of reasonable conservatism in nuclear safety analysis, because its value increases towards higher percentiles of the underlying positively skewed distribution with increasing levels of uncertainty. Insensitivity of the results to the actual underlying distributions is briefly demonstrated. - Highlights: • Multiple conservative assumptions can quickly diverge into extreme conservatism. • Mathematics and attractive properties provide basis for wide use of lognormal distribution. • Mean values are ideal candidates for representation of parameter uncertainties. • Mean values are proposed as reasonably conservative estimates of parameter uncertainties
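The paper's central point, that stacking upper-bound inputs diverges from any realistic percentile of the output, can be demonstrated with a small Monte Carlo experiment. The number of factors and the lognormal spread below are arbitrary choices for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Five multiplicative uncertain inputs, each lognormal(0, sigma); "bounding"
# analysis takes every factor at its own 95th percentile simultaneously.
n_factors, sigma = 5, 0.5
samples = rng.lognormal(mean=0.0, sigma=sigma, size=(100_000, n_factors))
product = samples.prod(axis=1)

p95_product = np.quantile(product, 0.95)          # true 95th pct of the output
bound_each = np.exp(1.645 * sigma) ** n_factors   # every input at its 95th pct
mean_product = product.mean()

print(f"95th pct of product {p95_product:.1f}, "
      f"all-bounding value {bound_each:.1f}, mean {mean_product:.2f}")
```

Analytically the product is lognormal with sigma * sqrt(5) ≈ 1.12, so its true 95th percentile is about exp(1.645 * 1.12) ≈ 6.3, while the all-bounding value exp(1.645 * 0.5 * 5) ≈ 61 is roughly an order of magnitude larger: exactly the divergence to extreme conservatism the abstract describes, with the mean (≈ exp(0.625) ≈ 1.87) rising toward higher percentiles as sigma grows.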

  11. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Full Text Available Abstract Background Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions, substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.

  12. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    The paper by Nguyen et al.1 published in this issue of Epidemiology presents a comparison of the recently suggested inverse odds ratio approach for addressing mediation and a more conventional Baron and Kenny-inspired method. Interestingly, the comparison is not done through a discussion of restr... it simultaneously ensures that the comparison is based on properties which matter in actual applications, and makes the comparison accessible for a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis, namely to put... applications using the inverse odds ratio approach, as it simply has not had enough time to move from theoretical concept to published applied paper, we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope...

  13. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
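The Dice coefficient cited above is the standard overlap metric for comparing two segmentations of the same image. A minimal sketch, using random toy volumes rather than real neuroimaging data, with a small fraction of voxels flipped to mimic a platform-induced difference:

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice coefficient 2|A∩B| / (|A|+|B|) between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

rng = np.random.default_rng(42)
seg_os1 = rng.random((16, 16, 16)) > 0.5       # toy "segmentation" volume
seg_os2 = seg_os1.copy()
flip = rng.random(seg_os1.shape) < 0.02        # perturb ~2% of voxels, as if a
seg_os2[flip] = ~seg_os2[flip]                 # math-library change altered them

d = dice(seg_os1, seg_os2)
print(f"Dice: {d:.3f}")
```

Identical segmentations give Dice 1.0; the perturbed pair lands just below, which is why values like the 0.59 reported above signal a substantial cross-platform divergence.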

  14. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m², respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the FW-vanadium activation is 3 orders of magnitude less than the other alloys' FW activation. 2 refs., 7 figs

  15. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report summarizes the results of the project ''Statistical analyses of extreme food habits'', which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposition by emission of radioactive substances from facilities of nuclear technology''. Its aim is to show whether the calculation of the radiation ingested by 95% of the population through food intake, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. It was possible to prove the existence of this overestimation, but its magnitude could only be roughly estimated. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposition as precisely as possible. (orig.) [de

  16. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.

  17. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    S. Tsai

    2005-01-01

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2

  18. Genomic analyses of the CAM plant pineapple.

    Science.gov (United States)

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, the genetic and genomic resources of pineapple have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome of pineapple provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  19. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  20. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) one using the finite element technique to solve the two-dimensional steady state equations of groundwater flow and pollution transport, and (2) a first order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP could be used as a tool for assessment purposes and risk analyses, for instance the assessment of the efficiency of a proposed remediation technique, or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
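
    The kind of exceedance probability PAGAP reports, P(concentration > limit) under uncertain aquifer parameters, can be illustrated with a plain Monte Carlo sketch around a textbook 1-D advection-dispersion solution. All parameter values and distributions below are invented for illustration; PAGAP itself couples a first order reliability method to a finite element solver, not this simple model:

    ```python
    import math
    import random

    def concentration(x, t, v, D, c0=1.0):
        """Textbook 1-D advection-dispersion solution for a continuous source."""
        return 0.5 * c0 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

    def p_exceed(x, t, threshold, n=20000, seed=1):
        """Monte Carlo estimate of P(C(x, t) > threshold) with uncertain velocity."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n):
            v = rng.lognormvariate(0.0, 0.5)   # hypothetical seepage velocity [m/d]
            D = 0.1 * v + 0.01                 # dispersion assumed to grow with velocity
            if concentration(x, t, v, D) > threshold:
                hits += 1
        return hits / n

    print(p_exceed(x=50.0, t=40.0, threshold=0.5))
    ```

    A first order reliability method replaces this brute-force sampling with a search for the most probable failure point, which is far cheaper when each model evaluation is a full finite element solve.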

  1. System for analysing sickness absenteeism in Poland.

    Science.gov (United States)

    Indulski, J A; Szubert, Z

    1997-01-01

    The National System of Sickness Absenteeism Statistics has been functioning in Poland since 1977 as part of the national health statistics. The system is based on a 15-percent random sample of copies of certificates of temporary incapacity for work issued by all health care units and authorised private medical practitioners. A certificate of temporary incapacity for work is received by every insured employee who is compelled to stop working due to sickness, accident, or the necessity to care for a sick member of his/her family. The certificate is required on the first day of sickness. Analyses of disease- and accident-related sickness absenteeism carried out each year in Poland within the statistical system lead to the main conclusions: 1. Diseases of the musculoskeletal and peripheral nervous systems, accounting when combined for 1/3 of the total sickness absenteeism, are a major health problem of the working population in Poland. During the past five years, incapacity for work caused by these diseases in males increased 2.5 times. 2. Circulatory diseases, and arterial hypertension and ischaemic heart disease in particular (41% and 27% of sickness days, respectively), create an essential health problem among males of productive age, especially in the 40-and-older age group. Absenteeism due to these diseases has increased in males more than two times.

  2. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  3. Thermomagnetic Analyses to Test Concrete Stability

    Science.gov (United States)

    Geiss, C. E.; Gourley, J. R.

    2017-12-01

    Over the past decades pyrrhotite-containing aggregate has been used in concrete to build basements and foundations in central Connecticut. The sulphur in the pyrrhotite reacts to form several secondary minerals, and the associated changes in volume lead to a loss of structural integrity. As a result, hundreds of homes have been rendered worthless, as remediation costs often exceed the value of the homes, and the value of many other homes constructed during the same period is in question because concrete provenance and potential future structural issues are unknown. While minor abundances of pyrrhotite are difficult to detect or quantify by traditional means, the mineral is easily identified through its magnetic properties. All concrete samples from affected homes show a clear increase in magnetic susceptibility above 220°C due to the γ-transition of Fe9S10 [1] and a clearly defined Curie temperature near 320°C for Fe7S8. X-ray analyses confirm the presence of pyrrhotite and ettringite in these samples. Synthetic mixtures of commercially available concrete and pyrrhotite show that the method is semiquantitative but needs to be calibrated for specific pyrrhotite mineralogies. 1. Schwarz, E.J., Magnetic properties of pyrrhotite and their use in applied geology and geophysics. Geological Survey of Canada: Ottawa, ON, Canada, 1975.

  5. Validating experimental and theoretical Langmuir probe analyses

    Science.gov (United States)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  6. Nuclear energy and the public opinion: analyses, communication strategy and actions

    International Nuclear Information System (INIS)

    Ansel, P.; Pages, J.P.

    1994-01-01

    A series of papers analysing public opinion reactions concerning nuclear energy, describing the information and communication strategies of some of the main French companies involved in the nuclear field, and presenting some of the actions undertaken in France and abroad.

  7. Analysing Theoretical Frameworks of Moral Education through Lakatos's Philosophy of Science

    Science.gov (United States)

    Han, Hyemin

    2014-01-01

    The structure of studies of moral education is basically interdisciplinary; it includes moral philosophy, psychology, and educational research. This article systematically analyses the structure of studies of moral education from the vantage points of philosophy of science. Among the various theoretical frameworks in the field of philosophy of…

  8. 7 CFR 98.3 - Analyses performed and locations of laboratories.

    Science.gov (United States)

    2010-01-01

    ... the special laboratory analyses rendered by the Science and Technology as a result of an agreement... Sausage Fat, salt 4 Pork Sausage Fat, moisture 4 Pork Sausage Fat 4 Mil-P-44131A (Pork Steaks, Flaked... performed at any one of the Science and Technology (S&T) field laboratories as follows: (1) USDA, AMS...

  9. Analysing Key Debates in Education and Sustainable Development in Relation to ESD Practice in Viet Nam

    Science.gov (United States)

    Balls, Emily

    2016-01-01

    This article is based on qualitative field research carried out in Ha Noi, Viet Nam, in 2013 for an MA dissertation in Education and International Development at the UCL Institute of Education. It analyses interpretations of education for sustainable development (ESD) in Viet Nam, relating these to key debates around instrumental and democratic…

  10. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

    Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing, and is particularly suited for logging for tin and higher atomic number elements

  11. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.
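
    The van Genuchten volumetric water content function varied in the sensitivity study has a standard closed form, θ(h) = θr + (θs − θr)[1 + (α|h|)^n]^(−m) with m = 1 − 1/n. A minimal sketch; the parameter values are illustrative, roughly silt-loam-like, and not taken from the poster:

    ```python
    def van_genuchten_theta(h, theta_r=0.067, theta_s=0.45, alpha=2.0, n=1.41):
        """van Genuchten water content θ(h); h = pressure head [m], negative above the water table."""
        if h >= 0:
            return theta_s                               # saturated at and below the phreatic surface
        m = 1.0 - 1.0 / n
        se = (1.0 + (alpha * abs(h)) ** n) ** (-m)       # effective saturation, 0 < Se <= 1
        return theta_r + (theta_s - theta_r) * se

    print(van_genuchten_theta(0.0))    # saturated: θs = 0.45
    print(van_genuchten_theta(-1.0))   # partially saturated, between θr and θs
    ```

    In a transient seepage solver this curve (and its slope) enters the storage term, which is why the choice of α and n can dominate the computed pore pressures at the levee toe.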

  12. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model’s reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea).To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs.The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.

  13. The ABC (Analysing Biomolecular Contacts) database

    Directory of Open Access Journals (Sweden)

    Walter Peter

    2007-03-01

    As protein-protein interactions are one of the basic mechanisms in most cellular processes, it is desirable to understand the molecular details of protein-protein contacts and ultimately be able to predict which proteins interact. Interface areas on a protein surface that are involved in protein interactions exhibit certain characteristics. Therefore, several attempts were made to distinguish protein interactions from each other and to categorize them. One way of classifying them is into transient and permanent interactions. Previously, two of the authors analysed several properties of transient complexes, such as the amino acid and secondary structure element composition and pairing preferences. Certainly, interfaces can be characterized by many more possible attributes, and this is a subject of intense ongoing research. Although several freely available online databases exist that illuminate various aspects of protein-protein interactions, we decided to construct a new database collecting all desired interface features and allowing for facile selection of subsets of complexes. MySQL was used as the database server, and the program logic was written in Java. Furthermore, several class extensions and tools such as Jmol were included to visualize the interfaces, and JFreeChart for the representation of diagrams and statistics. The contact data is automatically generated from standard PDB files by a tcl/tk script running through the molecular visualization package VMD. Currently the database contains 536 interfaces extracted from 479 PDB files, and it can be queried by various types of parameters. Here, we describe the database design and demonstrate its usefulness with a number of selected features.
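
    The database's contact data is extracted through VMD; as a toy illustration of the underlying idea, an inter-chain residue contact can be defined by a simple distance cutoff. The coordinates and the 6 Å cutoff below are invented for illustration and are unrelated to the database's actual tcl/tk extraction script:

    ```python
    import numpy as np

    # Hypothetical Cα coordinates (Å) for two chains of a toy complex.
    chain_a = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [7.6, 0.0, 0.0]])
    chain_b = np.array([[3.8, 4.0, 0.0], [20.0, 20.0, 20.0]])

    # A residue pair counts as a "contact" if the Cα-Cα distance is below a cutoff.
    cutoff = 6.0
    d = np.linalg.norm(chain_a[:, None, :] - chain_b[None, :, :], axis=-1)
    contacts = np.argwhere(d < cutoff)
    print(contacts.tolist())   # → [[0, 0], [1, 0], [2, 0]]
    ```

    Storing such pairs per interface, together with residue attributes, is the kind of record a contact database can then filter and aggregate.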

  14. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/Q_GM)^b], where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
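
    Both forms of the rating curve can be fitted in log space with ordinary least squares; a minimal sketch, in which the Q and C values are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical paired discharge Q [m^3/s] and suspended-sediment concentration C [mg/L].
    Q = np.array([5.0, 12.0, 30.0, 55.0, 120.0, 260.0])
    C = np.array([20.0, 55.0, 160.0, 330.0, 900.0, 2300.0])

    # Fit log C = log a + b log Q, i.e. the power-law rating curve C = a Q^b.
    b, log_a = np.polyfit(np.log(Q), np.log(C), 1)
    a = np.exp(log_a)

    # Discharge-normalized form C = â (Q / Q_GM)^b, with Q_GM the geometric mean of Q.
    Q_gm = np.exp(np.log(Q).mean())
    a_hat = a * Q_gm ** b        # â: the fitted concentration at Q = Q_GM

    print(b, a_hat)
    ```

    Because â is evaluated at the centre of the sampled discharges rather than at the extrapolated Q = 1, it moves with the vertical offset of the curve instead of co-varying with b.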

  15. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor that contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the frame of an IAEA sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes, were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. A large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within ∼ 5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. A substantial spread noticed for several reactivity coefficients did not have a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and the specific design of the partially MOX fuelled hybrid core. (author)

  16. Analysing biomass torrefaction supply chain costs.

    Science.gov (United States)

    Svanberg, Martin; Olofsson, Ingemar; Flodén, Jonas; Nordin, Anders

    2013-08-01

    The objective of the present work was to develop a techno-economic system model to evaluate how logistics and production parameters affect the torrefaction supply chain costs under Swedish conditions. The model consists of four sub-models: (1) supply system, (2) a complete energy and mass balance of drying, torrefaction and densification, (3) investment and operating costs of a green field, stand-alone torrefaction pellet plant, and (4) distribution system to the gate of an end user. The results show that the torrefaction supply chain reaps significant economies of scale up to a plant size of about 150-200 kiloton dry substance per year (ktonDS/year), for which the total supply chain cost amounts to 31.8 euro per megawatt hour based on lower heating value (€/MWhLHV). Important parameters affecting total cost are the amount of available biomass, biomass premium, logistics equipment, biomass moisture content, drying technology, torrefaction mass yield and torrefaction plant capital expenditures (CAPEX). Copyright © 2013 Elsevier Ltd. All rights reserved.
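
    The economies-of-scale behaviour such models exhibit can be mimicked with a power scaling law on CAPEX (a common six-tenths-style assumption). Every number below is an invented placeholder, not a value from the study:

    ```python
    def supply_cost_per_mwh(capacity_kton, base_capex_meur=20.0, base_capacity=50.0,
                            scale_exp=0.65, opex_eur_per_mwh=12.0, feed_eur_per_mwh=14.0,
                            annuity=0.1, mwh_per_kton=5200.0):
        """Illustrative torrefied-pellet supply cost [EUR/MWh] versus plant size.

        CAPEX follows a power scaling law; all parameter values are assumptions
        for illustration only, not figures from the cited study.
        """
        capex_meur = base_capex_meur * (capacity_kton / base_capacity) ** scale_exp
        annual_mwh = capacity_kton * mwh_per_kton
        capex_per_mwh = capex_meur * 1e6 * annuity / annual_mwh
        return capex_per_mwh + opex_eur_per_mwh + feed_eur_per_mwh

    for size in (50, 100, 150, 200, 400):
        print(size, round(supply_cost_per_mwh(size), 1))
    ```

    Because the per-MWh CAPEX term scales as capacity^(scale_exp − 1), the cost curve falls steeply at first and then flattens, which is consistent with the diminishing returns beyond 150-200 ktonDS/year reported above.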

  17. Scintigraphical analyses of pulmonary function in dogs

    International Nuclear Information System (INIS)

    Clercx, C.

    1988-01-01

    The main goal of this study was to develop a quantitative analysis of 99mTc aerosol inhalation/perfusion (I/P) lung scintigrams. In particular, attention was focused on the regional I/P distribution, concerning the ratio of the mean I and P values in several lung regions, as well as on the local (intraregional) distribution of I/P, under a wide range of circumstances. In Ch. 2, the method and reference material are described. The distribution of the inhalation-to-perfusion ratios (I/P) is studied in anesthetized healthy dogs, with emphasis on inter-regional distribution and intra-regional dispersion of the I/P ratio. Moreover, it provides an insight into canine pulmonary physiology, which is frequently extrapolated from human lung physiology, not always correctly. Ch. 3 deals with possible methodological and physiological influences on the interpretation of scintigraphical measurements, such as age, posture and breed. Investigation of the effects of age and breed was pursued using qualitative studies of canine lung surfactant. Current knowledge in this field points to prospective veterinary clinical applications. Finally, in Ch. 4, the diagnostic value of the measurements was examined in experimental models of important lung disorders with different pathophysiological features, such as lobar and sublobar airway obstruction, and lung embolism. This also permits investigation of the relative contributions of different mechanisms compensating the ventilation-to-perfusion ratio, such as collateral ventilation and hypoxic vasoconstriction. 218 refs.; 31 figs.; 14 tabs

  18. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-02-01

    Full Text Available Objectives: The study examined and identified the factors that affect lawyers' attitudes to knowledge sharing, and their knowledge sharing behaviour. Specifically, it investigated the relationship between the salient beliefs affecting lawyers' attitudes to knowledge sharing, and applied a modified version of the Theory of Reasoned Action (TRA) in the knowledge sharing context to predict how these factors affect their knowledge sharing behaviour. Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs, and Simple Regression was applied to test the hypotheses at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers' attitudes towards knowledge sharing. Expected reward was not significantly related to lawyers' attitudes towards knowledge sharing. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.

  19. Quantized fields in external field. Pt. 2

    International Nuclear Information System (INIS)

    Bellissard, J.

    1976-01-01

    The case of a charged scalar field is considered first. The existence of the corresponding Green's functions is proved. For weak fields, as well as for pure electric or scalar external fields, the Bogoliubov S-operator is shown to be unitary, covariant, and causal up to a phase. These results are generalised to a class of higher-spin quantized fields 'nicely' coupled to external fields, which includes the Dirac theory and, in the case of minimal and magnetic dipole coupling, the spin-one Petiau-Duffin-Kemmer theory. (orig.) [de

  20. Experimental Investigation of Integrated Optical Intensive Impulse Electric Field Sensors

    International Nuclear Information System (INIS)

    Bao, Sun; Fu-Shen, Chen

    2009-01-01

    We design and fabricate an integrated optical electric field sensor with a segmented electrode for intensive impulse electric field measurement. The sensor is based on a Mach–Zehnder interferometer with segmented electrodes. The output/input characteristic of the sensing system is analysed and measured, and the maximal detectable electric field range (−75 kV/m to 245 kV/m) is obtained from these results. The integrated optics electric field sensing system is therefore suitable for transient intensive electric field measurement.
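    The output/input characteristic of a Mach–Zehnder electro-optic sensor follows the familiar raised-cosine transfer function, which is what bounds the usable (quasi-linear) field range. A minimal sketch, in which the half-wave field `E_pi` and the quadrature bias are assumed illustrative values, not the fabricated device's parameters:

```python
import math

def mz_sensor_output(E, I_in=1.0, E_pi=160.0, bias=math.pi / 2):
    """Optical power transmitted by a Mach-Zehnder modulator biased at quadrature.

    E    : applied electric field (kV/m)
    E_pi : half-wave field, i.e. the field producing a pi phase shift (assumed)
    The response is approximately linear in E around the quadrature bias point.
    """
    return 0.5 * I_in * (1.0 + math.cos(math.pi * E / E_pi + bias))
```

    At zero field the output sits at half the input power; increasing field swings it down the cosine, and the measurable range ends where the curve folds over.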

  1. Electro-aerodynamic field aided needleless electrospinning.

    Science.gov (United States)

    Yan, Guilong; Niu, Haitao; Zhou, Hua; Wang, Hongxia; Shao, Hao; Zhao, Xueting; Lin, Tong

    2018-06-08

    Auxiliary fields have been used to enhance the performance of needle electrospinning. However, much less has been reported on how auxiliary fields affect needleless electrospinning. Herein, we report a novel needleless electrospinning technique that consists of an aerodynamic field and a second electric field. The second electric field is generated by setting two grounded inductive electrodes near the spinneret. Both auxiliary fields have to be applied simultaneously for the electrospinning process to work. A synergistic effect was observed between inductive electrode and airflow. The aerodynamic-electric auxiliary field was found to significantly increase fiber production rate (4.5 g h⁻¹), by 350% in comparison to the setup without auxiliary field (1.0 g h⁻¹), whereas it had little effect on fiber diameter. The auxiliary fields allow running needleless electrospinning at an applied voltage equivalent to that in needle electrospinning (e.g. 10-30 kV). The finite element analyses of electric field and airflow field verify that the inductive electrodes increase electric field strength near the spinneret, and the airflow assists in fiber deposition. This novel needleless electrospinning may be useful for development of high-efficiency, low energy-consumption nanofiber production systems.

  3. Noncommutative field gas driven inflation

    Energy Technology Data Exchange (ETDEWEB)

    Barosi, Luciano; Brito, Francisco A [Departamento de Fisica, Universidade Federal de Campina Grande, Caixa Postal 10071, 58109-970 Campina Grande, Paraiba (Brazil); Queiroz, Amilcar R, E-mail: lbarosi@ufcg.edu.br, E-mail: fabrito@df.ufcg.edu.br, E-mail: amilcarq@gmail.com [Centro Internacional de Fisica da Materia Condensada, Universidade de Brasilia, Caixa Postal 04667, Brasilia, DF (Brazil)

    2008-04-15

    We investigate early-time inflationary scenarios in a Universe filled with a dilute noncommutative bosonic gas at high temperature. A noncommutative bosonic gas is a gas composed of a bosonic scalar field with noncommutative field space on a commutative spacetime. Such noncommutative field theories were recently introduced as a generalization of quantum mechanics on a noncommutative spacetime. Key features of these theories are Lorentz invariance violation and CPT violation. In the present study we use a noncommutative bosonic field theory that, besides the noncommutative parameter θ, exhibits a further parameter σ. This parameter σ controls the range of the noncommutativity and acts as a regulator for the theory. Both parameters play a key role in the modified dispersion relations of the noncommutative bosonic field, leading to possibly striking consequences for phenomenology. In this work we obtain an equation of state p = ω(σ,θ;β)ρ for the noncommutative bosonic gas, relating pressure p and energy density ρ in the limit of high temperature. We analyse possible behaviours of the gas parameters σ, θ and β such that −1 ≤ ω < −1/3, which is the region where the Universe enters an accelerated phase.
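    The quoted range −1 ≤ ω < −1/3 is the standard FRW acceleration condition, not specific to the noncommutative gas; it follows directly from the Friedmann acceleration equation:

\[
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\,(\rho + 3p), \qquad p = \omega(\sigma,\theta;\beta)\,\rho ,
\]

    so that \(\ddot a > 0\) requires \(\rho\,(1 + 3\omega) < 0\), i.e. \(\omega < -1/3\) for positive energy density, while the lower bound \(\omega \ge -1\) preserves the null energy condition \(\rho + p \ge 0\).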

  4. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20 - 2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held closed-coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils. The size of the particles, on the other hand, is determined from the spring 'intercoil' distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the mass measurement is executed in situ from the vibration characteristics, based on the spring's first-harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle-spring wall interactions and spring bending. A much more successful alternative is found in a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the deliberate termination of vibration. The system performance has been optimised through variations of the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic
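    The in-situ mass measurement described above treats the loaded spring as a lumped mass-spring oscillator, so the retained powder mass can be inferred from the downward shift of the first-harmonic resonant frequency. A minimal sketch of that inference, with an assumed spring stiffness (the thesis itself found this route unreliable because of particle-wall interactions and spring bending):

```python
import math

def sample_mass_from_resonance(f_loaded_hz, f_empty_hz, k_spring=250.0):
    """Estimate retained powder mass (kg) from the resonant-frequency shift.

    Lumped model: f = (1/2*pi) * sqrt(k/m), so m = k / (2*pi*f)^2.
    k_spring (N/m) is an assumed effective stiffness, purely illustrative.
    """
    m_empty = k_spring / (2.0 * math.pi * f_empty_hz) ** 2
    m_loaded = k_spring / (2.0 * math.pi * f_loaded_hz) ** 2
    return m_loaded - m_empty
```

    The 'static mode' alternative replaces this dynamic inference with a direct deflection reading, which is why it proved more robust.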

  5. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and are compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain the integrity after

  6. YALINA Booster subcritical assembly modeling and analyses

    International Nuclear Information System (INIS)

    Talamo, A.; Gohar, Y.; Aliberti, G.; Cao, Y.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Sadovich, S.

    2010-01-01

    Full text: Accurate simulation models of the YALINA Booster assembly of the Joint Institute for Power and Nuclear Research (JIPNR)-Sosny, Belarus have been developed by Argonne National Laboratory (ANL) of the USA. YALINA-Booster has coupled zones operating with fast and thermal neutron spectra, which requires special attention in the modelling process. Three different uranium enrichments of 90%, 36% or 21% were used in the fast zone and 10% uranium enrichment was used in the thermal zone. Two of the most advanced Monte Carlo computer programs have been utilized for the ANL analyses: MCNP of the Los Alamos National Laboratory and MONK of British Nuclear Fuels Limited and SERCO Assurance. The geometrical models developed for both computer programs include all the details of the YALINA Booster facility as described in the technical specifications defined in the International Atomic Energy Agency (IAEA) report, without any geometrical approximation or material homogenization. Material impurities and the measured material densities have been used in the models. The obtained results for the neutron multiplication factors calculated in criticality mode (keff) and in source mode (ksrc) with an external neutron source from the two Monte Carlo programs are very similar. Different external neutron sources have been investigated, including californium, deuterium-deuterium (D-D), and deuterium-tritium (D-T) neutron sources. The spatial neutron flux profiles and the neutron spectra in the experimental channels were calculated. In addition, the kinetic parameters were determined, including the effective delayed neutron fraction, the prompt neutron lifetime, and the neutron generation time. A new calculation methodology has been developed at ANL to simulate the pulsed neutron source experiments. In this methodology, the MCNP code is used to simulate the detector response from a single pulse of the external neutron source and a C code is used to superimpose the pulse until the

  7. Altools: a user friendly NGS data analyser.

    Science.gov (United States)

    Camiolo, Salvatore; Sablok, Gaurav; Porceddu, Andrea

    2016-02-17

    Genotyping by re-sequencing has become a standard approach to estimating single nucleotide polymorphism (SNP) diversity, haplotype structure and biodiversity, and has been established as an efficient approach to the geographical population genomics of several model species. To access core SNPs and insertion/deletion polymorphisms (indels), and to infer the phyletic patterns of speciation, most such approaches map short reads to the reference genome. Variant calling is important for establishing patterns in genome-wide association studies (GWAS) of quantitative trait loci (QTLs), and for determining the population and haplotype structure based on SNPs, thus allowing content-dependent trait and evolutionary analysis. Several tools have been developed to investigate such polymorphisms, as well as more complex genomic rearrangements such as copy number variations, presence/absence variations and large deletions. The programs available for this purpose have different strengths (e.g. accuracy, sensitivity and specificity) and weaknesses (e.g. low computation speed, complex installation procedures and the absence of a user-friendly interface). Here we introduce Altools, a software package that is easy to install and use, and which allows the precise detection of polymorphisms and structural variations. Altools uses the BWA/SAMtools/VarScan pipeline to call SNPs and indels, and the dnaCopy algorithm to segment the genome according to local coverage differences in order to identify copy number variations. It also uses insert size information from the alignment of paired-end reads to detect potential large deletions. A double mapping approach (BWA/BLASTn) identifies precise breakpoints while ensuring rapid elaboration. Finally, Altools implements several processes that yield deeper insight into the genes affected by the detected polymorphisms. Altools was used to analyse both simulated and real next-generation sequencing (NGS) data and performed satisfactorily in terms of
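    The coverage-based copy-number step can be illustrated with a toy segmentation: flag consecutive read-depth windows that fall well below a baseline and report sufficiently long runs as candidate deletions. Altools itself delegates this to the dnaCopy segmentation algorithm; the function below, with its threshold and window parameters, is only a hypothetical sketch of the underlying idea:

```python
def segment_coverage(depths, baseline, ratio=0.5, min_windows=3):
    """Toy CNV/deletion caller on per-window read depths.

    Flags windows with depth < ratio * baseline and returns runs of at least
    min_windows consecutive low windows as (start, end) half-open intervals
    in window coordinates.
    """
    calls, start = [], None
    for i, d in enumerate(depths):
        low = d < ratio * baseline
        if low and start is None:
            start = i                      # run of low coverage begins
        elif not low and start is not None:
            if i - start >= min_windows:
                calls.append((start, i))   # close a long-enough run
            start = None
    if start is not None and len(depths) - start >= min_windows:
        calls.append((start, len(depths)))  # run extends to the end
    return calls
```

    Real callers additionally normalize for GC content and mappability and use statistical change-point detection rather than a fixed threshold.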

  8. Field-In-Field Technique With Intrafractionally Modulated Junction Shifts for Craniospinal Irradiation

    International Nuclear Information System (INIS)

    Yom, Sue S.; Frija, Erik K. C.; Mahajan, Anita; Chang, Eric; Klein, Kelli C.; Shiu, Almon; Ohrt, Jared; Woo, Shiao

    2007-01-01

    Purpose: To plan craniospinal irradiation with 'field-in-field' (FIF) homogenization in combination with daily, intrafractional modulation of the field junctions, to minimize the possibility of spinal cord overdose. Methods and Materials: Lateral cranial fields and posterior spinal fields were planned using a forward-planned, step-and-shoot FIF technique. Field junctions were automatically modulated and custom-weighted for maximal homogeneity within each treatment fraction. Dose-volume histogram analyses and film dosimetry were used to assess results. Results: Plan inhomogeneity improved with FIF. Planning with daily modulated junction shifts provided consistent dose delivery during each fraction of treatment across the junctions. Modulation minimized the impact of a 5-mm setup error at the junction. Film dosimetry confirmed that no point in the junction exceeded the anticipated dose. Conclusions: Field-in-field planning and modulated junction shifts improve the homogeneity and consistency of daily dose delivery, simplify treatment, and reduce the impact of setup errors

  9. Metatranscriptomic analyses of honey bee colonies.

    Science.gov (United States)

    Tozkar, Cansu Ö; Kence, Meral; Kence, Aykut; Huang, Qiang; Evans, Jay D

    2015-01-01

    Honey bees face numerous biotic threats from viruses to bacteria, fungi, protists, and mites. Here we describe a thorough analysis of microbes harbored by worker honey bees collected from field colonies in geographically distinct regions of Turkey. Turkey is one of the World's most important centers of apiculture, harboring five subspecies of Apis mellifera L., approximately 20% of the honey bee subspecies in the world. We use deep Illumina-based RNA sequencing to capture RNA species for the honey bee and a sampling of all non-endogenous species carried by bees. After trimming and mapping these reads to the honey bee genome, approximately 10% of the sequences (9-10 million reads per library) remained. These were then mapped to a curated set of public sequences containing ca. 60 megabase-pairs of sequence representing known microbial species associated with honey bees. Levels of key honey bee pathogens were confirmed using quantitative PCR screens. We contrast microbial matches across different sites in Turkey, showing new country records of Lake Sinai virus, two Spiroplasma bacterium species, symbionts (Candidatus Schmidhempelia bombi, Frischella perrara, Snodgrassella alvi, Gilliamella apicola, Lactobacillus spp.), neogregarines, and a trypanosome species. By using metagenomic analysis, this study also reveals deep molecular evidence for the presence of bacterial pathogens (Melissococcus plutonius, Paenibacillus larvae), Varroa destructor-1 virus, Sacbrood virus, and fungi. Despite this effort we did not detect KBV, SBPV, Tobacco ringspot virus, VdMLV (Varroa Macula like virus), Acarapis spp., Tropilaeleps spp. and Apocephalus (phorid fly). We discuss possible impacts of management practices and honey bee subspecies on microbial retinues. The described workflow and curated microbial database will be generally useful for microbial surveys of healthy and declining honey bees.

  10. Procurement and execution of PCB analyses: Customer-analyst interactions

    International Nuclear Information System (INIS)

    Erickson, M.D.

    1993-01-01

    The practical application of PCB (polychlorinated biphenyl) analyses begins with a request for the analysis and concludes with provision of the requested analysis. The key to successful execution of this iteration is timely, professional communication between the requester and the analyst. Often PCB analyses are not satisfactorily executed, either because the requester failed to give adequate instructions or because the analyst simply "did what he/she was told." The request for and conduct of a PCB analysis represents a contract for the procurement of a product (information about the sample); if both parties recognize and abide by this contractual relationship, the process generally proceeds smoothly. Requesters may be corporate purchasing agents working from a scope of work, a sample management office, a field team leader, a project manager, a physician's office, or the analyst himself. The analyst with whom the requester communicates may be a laboratory supervisor, a sample-receiving department, a salesperson for the laboratory, or the analyst himself. The analyst conducting the analysis is often a team, with custody of the sample being passed from sample receiving to the extraction laboratory, to the cleanup laboratory, to the gas chromatography (GC) laboratory, to the data reduction person, to the package preparation person, to the quality control (QC) department for verification, to shipping. Where a team of analysts is involved, the requester needs a central point of contact to minimize confusion and frustration. For the requester-analyst interface to work smoothly, it must function as if it were a one-to-one interaction. This article addresses the pitfalls of the requester-analyst interaction and provides suggestions for improving the quality of the analytical product through the requester-analyst interface

  11. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
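    The transit technique described above rests on a simple geometric relation: the fractional dimming of the star equals the squared ratio of planetary to stellar radius, which is why the small size of GJ 1214 makes its super-Earth relatively easy to study. A minimal sketch; the stellar radius used in the example (~0.21 solar radii for the M dwarf GJ 1214) is an assumed literature value, not a figure from this press release:

```python
def transit_depth(r_planet_earth, r_star_sun):
    """Fractional flux drop during transit: depth = (R_planet / R_star)^2.

    r_planet_earth : planet radius in Earth radii
    r_star_sun     : stellar radius in Solar radii
    """
    R_EARTH_KM = 6371.0    # mean Earth radius
    R_SUN_KM = 695_700.0   # nominal Solar radius
    return (r_planet_earth * R_EARTH_KM / (r_star_sun * R_SUN_KM)) ** 2
```

    For a 2.6 Earth-radius planet crossing a ~0.21 solar-radius star, the dip is on the order of one percent of the stellar flux, far larger than the same planet would produce around a Sun-like star.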

  12. Applications of high lateral and energy resolution imaging XPS with a double hemispherical analyser based spectromicroscope

    International Nuclear Information System (INIS)

    Escher, M.; Winkler, K.; Renault, O.; Barrett, N.

    2010-01-01

    The design and applications of an instrument for imaging X-ray photoelectron spectroscopy (XPS) are reviewed. The instrument is based on a photoelectron microscope and a double hemispherical analyser whose symmetric configuration avoids the spherical aberration (α²-term) inherent to standard analysers. The analyser allows high-transmission imaging without sacrificing the lateral and energy resolution of the instrument. The importance of high transmission, especially for highest-resolution imaging XPS with monochromated laboratory X-ray sources, is outlined, and the close interrelation of energy resolution, lateral resolution and analyser transmission is illustrated. Chemical imaging applications using a monochromatic laboratory Al Kα source are shown, with a lateral resolution of 610 nm. Examples of measurements made using synchrotron and laboratory ultraviolet light show the broad field of applications, from imaging of core-level electrons with chemical-shift identification to high-resolution threshold photoelectron emission microscopy (PEEM), work function imaging and band structure imaging.

  13. Systems reliability analyses and risk analyses for the licensing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licensing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant needs to be assessed as a whole, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement follows from the safety criteria for nuclear power plants issued by the Federal Ministry of the Interior (BMI). Systems reliability studies and risk analyses used in licensing procedures under atomic law are identified. The emphasis is on licensing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licensing decisions are shown by means of examples. (orig./HP) [de

  14. Thermal analyses of spent nuclear fuel repository

    International Nuclear Information System (INIS)

    Ikonen, K.

    2003-06-01

    calibrated by numerical analysis. By superposing single line heat sources, the evolution of the temperature field of the repository can be determined efficiently. Efficient visualisation programmes were used for displaying the results. Visualisation is an important element in assuring the reliability of the calculation process. (orig.)
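    Because heat conduction is linear, the repository temperature field can be assembled by summing the contributions of individual line sources, which is the superposition idea the abstract refers to. A toy steady-state sketch in two dimensions; the rock conductivity, line powers and reference radius below are illustrative assumptions, not values from the analysis:

```python
import math

def temperature_rise(x, y, sources, k=3.0):
    """Steady-state temperature rise (K) at (x, y) from superposed infinite
    line heat sources in rock of thermal conductivity k (W/m/K, assumed).

    Each source is (x0, y0, q) with q the line power in W/m. The 2-D log
    solution needs a reference radius r_ref where the rise is taken as zero.
    """
    r_ref = 500.0  # reference radius in metres (assumption)
    total = 0.0
    for x0, y0, q in sources:
        r = math.hypot(x - x0, y - y0)
        # clamp r to avoid the log singularity on the source line itself
        total += q / (2.0 * math.pi * k) * math.log(r_ref / max(r, 0.1))
    return total
```

    The actual repository model is transient and three-dimensional, but linearity means the same summation applies there once a single-source solution has been calibrated.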

  15. Interaction of strong electromagnetic fields with atoms

    International Nuclear Information System (INIS)

    Brandi, H.S.; Davidovich, L.; Zagury, N.

    1982-06-01

    Several non-linear processes involving the interaction of atoms with strong laser fields are discussed, with particular emphasis on the ionization problem. Non-perturbative methods which have been proposed to tackle this problem are analysed and shown to correspond to an expansion in the intra-atomic potential. The relation between tunneling and multiphoton absorption as ionization mechanisms, and the generalization of Einstein's photoelectric equation to the strong-field case, are discussed. (Author) [pt

  16. Gravitation and vacuum field

    International Nuclear Information System (INIS)

    Tevikyan, R.V.

    1986-01-01

    This paper presents equations that describe particles with spins s = 0, 1/2, 1 completely and which also describe 2s + 2 limiting fields as E → ∞. It is shown that the ordinary Hilbert-Einstein action for the gravitation field must be augmented by the action for the Bose vacuum field. This means that one must introduce into the gravitational equations a cosmological term proportional to the square of the strength of the Bose vacuum field. It is shown that the theory of gravitation describes three realities: matter, field, and vacuum field. A new form of matter, the vacuum field, is introduced into field theory

  17. Fractal vector optical fields.

    Science.gov (United States)

    Pan, Yue; Gao, Xu-Zhen; Cai, Meng-Qiang; Zhang, Guan-Lin; Li, Yongnan; Tu, Chenghou; Wang, Hui-Tian

    2016-07-15

    We introduce the concept of a fractal, which provides an alternative approach for flexibly engineering the optical fields and their focal fields. We propose, design, and create a new family of optical fields-fractal vector optical fields, which build a bridge between the fractal and vector optical fields. The fractal vector optical fields have polarization states exhibiting fractal geometry, and may also involve the phase and/or amplitude simultaneously. The results reveal that the focal fields exhibit self-similarity, and the hierarchy of the fractal has the "weeding" role. The fractal can be used to engineer the focal field.

  18. Electric Field Imaging

    Data.gov (United States)

    National Aeronautics and Space Administration — NDE historically has focused technology development in propagating wave phenomena with little attention to the field of electrostatics and emanating electric fields....

  19. Magnetic Field Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Magnetic Field Calculator will calculate the total magnetic field, including components (declination, inclination, horizontal intensity, northerly intensity,...
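    The quantities named in this dataset description (declination, inclination, horizontal intensity, total field) are all derived from the three orthogonal field components. A short sketch of those standard relations, assuming the usual geomagnetic convention of X north, Y east, Z down:

```python
import math

def field_elements(x_north_nt, y_east_nt, z_down_nt):
    """Derive the standard geomagnetic elements from components in nT.

    Returns horizontal intensity H, total field F, declination D and
    inclination I (degrees, positive east of north / downward).
    """
    h = math.hypot(x_north_nt, y_east_nt)                # horizontal intensity
    f = math.hypot(h, z_down_nt)                         # total field
    d = math.degrees(math.atan2(y_east_nt, x_north_nt))  # declination
    i = math.degrees(math.atan2(z_down_nt, h))           # inclination
    return {"H": h, "F": f, "D": d, "I": i}
```

    The calculator itself obtains X, Y, Z from a geomagnetic reference model for a given location and date; the conversion to the reported elements is exactly this arithmetic.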

  20. Therapeutic avenues for hereditary forms of retinal blindness

    Indian Academy of Sciences (India)

    CHITRA KANNABIRAN

    for Ocular Regeneration, and Prof Brien Holden Eye Research Centre, L. V. Prasad Eye Institute, Kallam Anji Reddy … in the mid-peripheral field as measured by the Humphrey … clinical translation of this technology in the near future.