WorldWideScience

Sample records for field correction automated

  1. Automation of one-loop QCD corrections

    CERN Document Server

    Hirschi, Valentin; Frixione, Stefano; Garzelli, Maria Vittoria; Maltoni, Fabio; Pittau, Roberto

    2011-01-01

    We present the complete automation of the computation of one-loop QCD corrections, including UV renormalization, to an arbitrary scattering process in the Standard Model. This is achieved by embedding the OPP integrand reduction technique, as implemented in CutTools, into the MadGraph framework. By interfacing the tool so constructed, which we dub MadLoop, with MadFKS, the fully automatic computation of any infrared-safe observable at the next-to-leading order in QCD is attained. We demonstrate the flexibility and the reach of our method by calculating the production rates for a variety of processes at the 7 TeV LHC.

  2. Automated general temperature correction method for dielectric soil moisture sensors

    Science.gov (United States)

    Kapilaratne, R. G. C. Jeewantinie; Lu, Minjiao

    2017-08-01

    An effective temperature correction method for dielectric sensors is important to ensure the accuracy of soil water content (SWC) measurements in local- to regional-scale soil moisture monitoring networks. These networks rely extensively on highly temperature-sensitive dielectric sensors because of their low cost, ease of use and low power consumption. Yet there is no general temperature correction method for dielectric sensors; instead, sensor- or site-dependent correction algorithms are employed. Such methods become ineffective for soil moisture monitoring networks with different sensor setups and those that cover diverse climatic conditions and soil types. This study attempted to develop a general temperature correction method for dielectric sensors that can be used regardless of differences in sensor type, climatic conditions and soil type, and without rainfall data. An automated general temperature correction method was developed by adapting previously developed temperature correction algorithms for time domain reflectometry (TDR) measurements to ThetaProbe ML2X, Stevens Hydra Probe II and Decagon Devices EC-TM sensor measurements. The removal of rainy-day effects from the SWC data was automated by incorporating a statistical inference technique into the temperature correction algorithms. The temperature correction method was evaluated using 34 stations from the International Soil Moisture Monitoring Network and another nine stations from a local soil moisture monitoring network in Mongolia. The soil moisture monitoring networks used in this study cover four major climates and six major soil types. Results indicated that the automated temperature correction algorithms developed in this study can successfully eliminate temperature effects from dielectric sensor measurements even without on-site rainfall data. Furthermore, it has been found that actual daily average of SWC has been changed due to temperature effects of dielectric sensors with a

  3. Automation of electroweak NLO corrections in general models

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Jean-Nicolas [Universitaet Wuerzburg (Germany)

    2016-07-01

    I discuss the automation of the generation of scattering amplitudes in general quantum field theories at next-to-leading order in perturbation theory. The work is based on Recola, a highly efficient one-loop amplitude generator for the Standard Model, which I have extended so that it can deal with general quantum field theories. Internally, Recola computes off-shell currents, and for new models new rules for off-shell currents emerge, which are derived from the Feynman rules. My work relies on the UFO format, which can be obtained from a suitable model builder, e.g. FeynRules. I have developed tools to derive the necessary counterterm structures and to perform the renormalization within Recola in an automated way. I describe the procedure using the example of the two-Higgs-doublet model.

  4. Quantum electrodynamical corrections in critical fields

    International Nuclear Information System (INIS)

    Soff, G.

    1990-09-01

    We investigate field-theoretical corrections, such as vacuum polarization and self-energy, to study their influence on strongly bound electrons in heavy and superheavy atoms. In critical fields (Z ≈ 170) for spontaneous e⁺e⁻ pair creation, the coupling constant of the external field, Zα, exceeds 1, thereby preventing the ordinary perturbative approach to quantum electrodynamical corrections, which employs an expansion in Zα. For heavy and superheavy elements, radiative corrections therefore have to be treated to all orders in Zα. The dominant effect is provided by the Uehling contribution, which is linear in the external field and thus of order α(Zα). (orig./HSI)
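
    For reference, a standard closed form of the Uehling potential for a point nucleus (natural units, ħ = c = 1) makes the α(Zα) counting explicit; this is the textbook expression, quoted here as context rather than from the record itself:

    ```latex
    V_{\mathrm{Ueh}}(r) \;=\; -\,\frac{Z\alpha}{r}\,\frac{2\alpha}{3\pi}
    \int_{1}^{\infty}\mathrm{d}t\;\frac{\sqrt{t^{2}-1}}{t^{2}}
    \left(1+\frac{1}{2t^{2}}\right)\,e^{-2 m_{e} r t}
    ```

    The single factor of Zα shows the linearity in the external field noted above.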

  5. Longitudinal wake field corrections in circular machines

    International Nuclear Information System (INIS)

    Symon, K.R.

    1996-01-01

    In computations of longitudinal particle motions in accelerators and storage rings, the fields produced by the interactions of the beam with the cavity in which it circulates are usually calculated by multiplying Fourier components of the beam current by the appropriate impedances. This procedure neglects the slow variation with time of the Fourier coefficients and of the beam revolution frequency. When there are cavity elements with decay times that are comparable with or larger than the time during which changes in the beam parameters occur, these changes cannot be neglected. Corrections for this effect have been worked out in terms of the response functions of elements in the ring. The result is expressed as a correction to the impedance which depends on the way in which the beam parameters are changing. A method is presented for correcting a numerical simulation by keeping track of the steady-state and transient terms in the response of a cavity.
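
    A minimal numerical sketch of the idea of tracking the steady-state plus transient cavity response turn by turn, instead of using fixed impedances: the complex mode amplitude decays and rotates between bunch passages and receives a kick at each passage. The resonator model and all parameter values are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    # Track the complex amplitude of one cavity mode excited by a bunch once
    # per turn; the transient is kept explicitly rather than assumed away.
    f_r, Q, R_s = 500e6, 1e5, 1e6     # resonant freq [Hz], quality factor, shunt impedance [ohm]
    T_rev = 3e-6                      # revolution period [s]

    omega_r = 2 * np.pi * f_r
    tau = 2 * Q / omega_r             # field decay time of the mode
    k_loss = omega_r * R_s / (2 * Q)  # loss factor per unit charge

    decay = np.exp((1j * omega_r - 1.0 / tau) * T_rev)  # free evolution over one turn

    V = 0.0 + 0.0j
    history = []
    for turn in range(5000):
        # slowly varying bunch charge: exactly the regime where a fixed
        # impedance description breaks down
        q_turn = 1e-9 * (1.0 + 0.5 * np.sin(2 * np.pi * turn / 1000))
        V = V * decay - k_loss * q_turn   # decay + rotate, then beam-induced kick
        history.append(V.real)            # voltage sampled at the bunch passage
    ```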

  6. Mean Field Analysis of Quantum Annealing Correction.

    Science.gov (United States)

    Matsuura, Shunji; Nishimori, Hidetoshi; Albash, Tameem; Lidar, Daniel A

    2016-06-03

    Quantum annealing correction (QAC) is a method that combines encoding with energy penalties and decoding to suppress and correct errors that degrade the performance of quantum annealers in solving optimization problems. While QAC has been experimentally demonstrated to successfully error correct a range of optimization problems, a clear understanding of its operating mechanism has been lacking. Here we bridge this gap using tools from quantum statistical mechanics. We study analytically tractable models using a mean-field analysis, specifically the p-body ferromagnetic infinite-range transverse-field Ising model as well as the quantum Hopfield model. We demonstrate that for p=2, where the phase transition is of second order, QAC pushes the transition to increasingly larger transverse field strengths. For p≥3, where the phase transition is of first order, QAC softens the closing of the gap for small energy penalty values and prevents its closure for sufficiently large energy penalty values. Thus QAC provides protection from excitations that occur near the quantum critical point. We find similar results for the Hopfield model, thus demonstrating that our conclusions hold in the presence of disorder.
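
    For context, the p-body infinite-range transverse-field Ising Hamiltonian studied in such mean-field analyses is conventionally written as follows (standard form, not quoted from the paper):

    ```latex
    H \;=\; -N\left(\frac{1}{N}\sum_{i=1}^{N}\sigma_i^{z}\right)^{\!p}
    \;-\; \Gamma\sum_{i=1}^{N}\sigma_i^{x}
    ```

    For p = 2 the transition as a function of the transverse field Γ is second order; for p ≥ 3 it is first order, which is the regime where the gap-softening effect of QAC described above applies.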

  7. Comparatively Studied Color Correction Methods for Color Calibration of Automated Microscopy Complex of Biomedical Specimens

    Directory of Open Access Journals (Sweden)

    T. A. Kravtsova

    2016-01-01

    The paper considers the task of generating requirements and creating a calibration target for automated microscopy systems (AMS) of biomedical specimens, to provide invariance of algorithms and software to the hardware configuration. The required number of color fields of the calibration target and their color coordinates are mostly determined by the color correction method, for which the coefficients of the equations are estimated during the calibration process. The paper analyses existing color calibration techniques for digital imaging systems using an optical microscope and shows that there is a lack of published comparative studies demonstrating a particularly useful color correction method for microscopic images. A comparative study of ten image color correction methods in RGB space using polynomials and combinations of color coordinates of different orders was carried out. The method of conditioned least squares was applied to estimate the coefficients in the color correction equations, using captured images of 217 color fields of the calibration target Kodak Q60-E3. The regularization parameter in this method was chosen experimentally. It was demonstrated that the best color correction quality characteristics are provided by the method that uses a combination of color coordinates of the 3rd order. The influence of the number and the set of color fields included in the calibration target on color correction quality for microscopic images was also studied. Six training sets containing 30, 35, 40, 50, 60 and 80 color fields, and a test set of 47 color fields not included in any of the training sets, were formed. It was found that the training set of 60 color fields minimizes the color correction error values for both operating modes of the digital camera: using "default" color settings and with automatic white balance. At the same time, it was established that the use of color fields from the widely used Kodak Q60-E3 target does not
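
    A compact sketch of the class of methods compared here: polynomial color correction in RGB space fitted by regularized least squares. The feature set and the Tikhonov-style regularization below are illustrative stand-ins for the paper's third-order combinations and its "conditioned least squares" estimator.

    ```python
    import numpy as np

    def poly_features(rgb):
        """Polynomial/cross-term expansion of RGB values (one assumed 3rd-order set)."""
        r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
        ones = np.ones_like(r)
        return np.stack([ones, r, g, b,
                         r*r, g*g, b*b, r*g, r*b, g*b,
                         r**3, g**3, b**3, r*g*b], axis=1)

    def fit_color_correction(measured_rgb, reference_rgb, lam=1e-3):
        """Solve (X^T X + lam*I) A = X^T Y, i.e. regularized least squares."""
        X = poly_features(measured_rgb)
        XtX = X.T @ X + lam * np.eye(X.shape[1])
        return np.linalg.solve(XtX, X.T @ reference_rgb)   # A: (n_features, 3)

    # Usage: fit on the calibration-target patches, then apply per pixel.
    measured = np.random.rand(217, 3)    # camera RGB of the 217 target fields
    reference = np.random.rand(217, 3)   # known color coordinates of the fields
    A = fit_color_correction(measured, reference)
    corrected = poly_features(measured) @ A
    ```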

  8. Automated geographic registration and radiometric correction for UAV-based mosaics

    Science.gov (United States)

    Thomasson, J. Alex; Shi, Yeyin; Sima, Chao; Yang, Chenghai; Cope, Dale A.

    2017-05-01

    Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties in the science-based utilization of such mosaics are geographic registration and radiometric calibration. In our current research project, image files are taken to the computer laboratory after the flight, and semi-manual pre-processing is implemented on the raw image data, including ortho-mosaicking and radiometric calibration. Ground control points (GCPs) are critical for high-quality geographic registration of images during mosaicking. Applications requiring accurate reflectance data also require radiometric-calibration references so that reflectance values of image objects can be calculated. We have developed a method for automated geographic registration and radiometric correction with targets that are installed semi-permanently at distributed locations around fields. The targets are a combination of black (≈5% reflectance), dark gray (≈20% reflectance), and light gray (≈40% reflectance) sections that provide for a transformation of pixel value to reflectance in the dynamic range of crop fields. The exact spectral reflectance of each target is known, having been measured with a spectrophotometer. At the time of installation, each target is measured for position with a real-time kinematic GPS receiver to give its precise latitude and longitude. Automated location of the reference targets in the images is required for precise, automated geographic registration; and automated calculation of the digital-number-to-reflectance transformation is required for automated radiometric calibration. To validate the system for radiometric calibration, a calibrated UAV-based image mosaic of a field was compared to a calibrated single image from a manned aircraft. Reflectance
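
    The digital-number-to-reflectance step described above is an "empirical line" fit through the three gray targets. A minimal sketch, with made-up DN values standing in for measured target statistics:

    ```python
    import numpy as np

    # Known target reflectances (black, dark gray, light gray) and the mean
    # digital numbers (DN) measured over each target in one band; the DN
    # values here are illustrative assumptions.
    target_reflectance = np.array([0.05, 0.20, 0.40])
    target_dn = np.array([412.0, 1608.0, 3190.0])

    # Fit DN = gain * reflectance + offset, then invert for the whole image.
    gain, offset = np.polyfit(target_reflectance, target_dn, 1)

    def dn_to_reflectance(dn):
        return (dn - offset) / gain

    image_dn = np.full((100, 100), 2000.0)    # stand-in for a mosaic band
    reflectance = dn_to_reflectance(image_dn)
    ```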

  9. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. The baseline in a spectrum signal can induce uneven amplitude shifts across different wavenumbers and lead to poor results. Therefore, these amplitude shifts should be compensated before further analysis. Many algorithms are used to remove the baseline; however, a fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through a continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly used fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
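
    A rough sketch of the two AWFPSI ingredients named above: feature points found via a continuous wavelet transform, and a baseline interpolated through peak-free segments. Window sizes, wavelet widths and the per-segment anchor rule are illustrative assumptions, not the paper's tuned choices.

    ```python
    import numpy as np
    from scipy.signal import find_peaks_cwt
    from scipy.interpolate import interp1d

    def baseline_correct(y, peak_widths=np.arange(5, 40), guard=10, n_seg=50):
        x = np.arange(y.size)
        peaks = find_peaks_cwt(y, peak_widths)          # CWT feature points
        mask = np.ones(y.size, dtype=bool)
        for p in peaks:                                 # exclude peak neighborhoods
            mask[max(0, p - guard):p + guard] = False
        anchors_x, anchors_y = [], []
        for seg in np.array_split(x, n_seg):            # one anchor per segment
            seg = seg[mask[seg]]
            if seg.size:
                i = seg[np.argmin(y[seg])]
                anchors_x.append(i)
                anchors_y.append(y[i])
        baseline = interp1d(anchors_x, anchors_y, kind="linear",
                            bounds_error=False, fill_value="extrapolate")(x)
        return y - baseline, baseline
    ```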

  10. Automated correction of unwanted phase jumps in reference signals which corrupt MRSI spectra after eddy current correction

    Science.gov (United States)

    Simonetti, A. W.; Melssen, W. J.; van der Graaf, M.; Heerschap, A.; Buydens, L. M. C.

    2002-12-01

    A commonly applied step in the postprocessing of gradient-localized proton MR spectroscopy is correction for eddy current effects using the water signal as a reference. However, this method can degrade some of the metabolite signals, in particular if applied to proton MR spectroscopic imaging data. This artifact arises from the water reference signal in the presence of a second signal which resonates close to the main water resonance. The interference of both resonances introduces jumps in the phase of the reference time domain signal. Using this phase for eddy current correction results in a ringing artifact over the whole frequency range of the metabolite signal. We propose a moving-window correction algorithm, which screens the phase of reference signals and removes phase jumps in the time domain caused by interference of signals from multiple spin systems. The phase jumps may be abrupt or gradually distributed over several time data points. Because the correction algorithm only corrects time data points which contain phase jumps, the phase is minimally disrupted. Furthermore, the algorithm is automated for large datasets, correcting only those water reference signals which are corrupted. After correction of the corrupted reference signals, normal eddy current correction may be performed. The algorithm is compared with a method which uses a low-pass filter and tested on simulated data as well as on in vivo proton spectroscopic imaging data from a healthy volunteer and from patients with a brain tumor.
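
    The moving-window screen can be sketched as follows: phase increments of the reference FID are compared with a local median, and only increments flagged as jumps are replaced before the phase is re-integrated. Window length and threshold are illustrative assumptions.

    ```python
    import numpy as np

    def correct_phase_jumps(fid, window=16, thresh=3.0):
        phase = np.unwrap(np.angle(fid))
        dphi = np.diff(phase)                       # per-sample phase increments
        fixed = dphi.copy()
        for i in range(dphi.size):
            lo, hi = max(0, i - window), min(dphi.size, i + window)
            local = np.median(dphi[lo:hi])
            spread = np.median(np.abs(dphi[lo:hi] - local)) + 1e-12
            if abs(dphi[i] - local) > thresh * spread:
                fixed[i] = local                    # replace only the jump increments
        new_phase = np.concatenate([[phase[0]], phase[0] + np.cumsum(fixed)])
        return np.abs(fid) * np.exp(1j * new_phase)  # corrected reference signal
    ```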

  11. Radiative corrections in strong Coulomb fields

    International Nuclear Information System (INIS)

    Soff, G.

    1993-04-01

    Contributions to the electron Lamb shift in highly charged ions are summarized. Recent theoretical developments as well as current experimental results are considered. Special emphasis is laid on higher-order vacuum polarization corrections as well as on the nuclear size effect on the electron self energy. (orig.). 5 figs

  12. Electromagnetic fields with vanishing quantum corrections

    Czech Academy of Sciences Publication Activity Database

    Ortaggio, Marcello; Pravda, Vojtěch

    2018-01-01

    Vol. 779, 10 April (2018), pp. 393-395. ISSN 0370-2693. R&D Projects: GA ČR GA13-10042S. Institutional support: RVO:67985840. Keywords: nonlinear electrodynamics; quantum corrections. Subject RIV: BA - General Mathematics; OECD field: Applied mathematics. Impact factor: 4.807 (2016). https://www.sciencedirect.com/science/article/pii/S0370269318300327?via%3Dihub

  14. On robust and reliable automated baseline corrections for strong motion seismology

    Science.gov (United States)

    Melgar, Diego; Bock, Yehuda; Sanchez, Dominga; Crowell, Brendan W.

    2013-03-01

    Computation of displacements from strong motion inertial sensors is to date an open problem. Two distinct methodologies have been proposed to solve it. One involves baseline corrections determined from the inertial data themselves, and the other a combination with other geophysical sensors such as GPS. Here we analyze a proposed automated baseline correction algorithm using only accelerometer data and compare it to the results from the real-time combination of strong motion and GPS data. The analysis is performed on 48 collocated GPS and accelerometer stations in Japan that recorded the 2011 Mw 9.0 Tohoku-oki earthquake. We study the time and frequency domain behavior of both methodologies. We find that the error incurred from automated baseline corrections that rely on seismic data alone is complex and can be large in both the time and frequency domains of interest in seismological and engineering applications. The GPS/accelerometer combination has no such problems and can adequately recover broadband strong motion displacements for this event. The problems and ambiguities with baseline corrections and the success of the GPS/accelerometer combination lead us to advocate instrument collocation rather than automated baseline correction algorithms for accelerometers.

  15. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  16. Electromagnetic fields with vanishing quantum corrections

    Science.gov (United States)

    Ortaggio, Marcello; Pravda, Vojtěch

    2018-04-01

    We show that a large class of null electromagnetic fields are immune to any modifications of Maxwell's equations in the form of arbitrary powers and derivatives of the field strength. These are thus exact solutions to virtually any generalized classical electrodynamics containing both non-linear terms and higher derivatives, including, e.g., non-linear electrodynamics as well as QED- and string-motivated effective theories. This result holds not only in a flat or (anti-)de Sitter background, but also in a larger subset of Kundt spacetimes, which allow for the presence of aligned gravitational waves and pure radiation.

  17. Quadratic α′-corrections to heterotic double field theory

    Directory of Open Access Journals (Sweden)

    Kanghoon Lee

    2015-10-01

    We investigate α′-corrections of heterotic double field theory up to quadratic order in the language of supersymmetric O(D,D+dim G) gauged double field theory. After introducing a double-vielbein formalism with a parametrization which reproduces heterotic supergravity, we show that supersymmetry for heterotic double field theory up to leading-order α′-corrections is obtained from supersymmetric gauged double field theory. We discuss the necessary modifications of the symmetries defined in supersymmetric gauged double field theory. Further, we construct the supersymmetric completion at quadratic order in α′.

  18. An Automated Field Bakery System for Bread

    Science.gov (United States)

    1983-10-01

    The unit has two star-shaped impellers (four stainless steel lobes per impeller) built into a machined stainless steel housing that blends, kneads, and ... (the remainder of the scanned report text is illegible; final report by Richard J. Lanza and Robert V. Decareau).

  19. Can small field diode correction factors be applied universally?

    Science.gov (United States)

    Liu, Paul Z Y; Suchowerska, Natalka; McKenzie, David R

    2014-09-01

    Diode detectors are commonly used in dosimetry but have been reported to over-respond in small fields. Diode correction factors have been reported in the literature. The purpose of this study is to determine whether correction factors for a given diode type can be universally applied over a range of irradiation conditions, including beams of different qualities. A mathematical relation for diode over-response as a function of field size was developed using previously published experimental data in which diodes were compared to an air-core scintillation dosimeter. Correction factors calculated from the mathematical relation were then compared with those available in the literature. The mathematical relation established between diode over-response and field size was found to predict the measured diode correction factors for fields between 5 and 30 mm in width. The average deviation between measured and predicted over-response was 0.32% for IBA SFD and PTW Type E diodes. Diode over-response was found not to depend strongly on the type of linac, the method of collimation or the measurement depth. The mathematical relation was found to agree with published diode correction factors derived from Monte Carlo simulations and measurements, indicating that correction factors are robust in their transportability between different radiation beams.

  20. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image, and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field nonuniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse-square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction reduces the errors associated with field nonuniformity from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute <2% of the overall corrected signal.
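
    The log-space step described above amounts to adding a normalized phantom image to the log-transformed mammogram; a minimal sketch (array names are placeholders):

    ```python
    import numpy as np

    def flatfield_correct(mammogram, phantom):
        """Cancel heel, inverse-square and filter/plate effects in log space."""
        log_img = np.log(mammogram.astype(float))
        log_phantom = np.log(phantom.astype(float))
        correction = np.mean(log_phantom) - log_phantom   # zero-mean phantom field
        return log_img + correction
    ```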

  1. Intensity calibration and flat-field correction for fluorescence microscopes.

    Science.gov (United States)

    Model, Michael

    2014-04-01

    Standardization in fluorescence microscopy involves calibration of intensity in reproducible units and correction for spatial nonuniformity of illumination (flat-field or shading correction). Both goals can be achieved using concentrated solutions of fluorescent dyes. When a drop of a highly concentrated fluorescent dye is placed between a slide and a coverslip it produces a spatially uniform field, resistant to photobleaching and with reproducible quantum yield; it can be used as a brightness standard for wide-field and confocal microscopes. For wide-field microscopes, calibration can be further extended to absolute molecular units. This can be done by imaging a solution of known concentration and known depth; the latter can be prepared by placing a small spherical lens in a diluted solution of the same fluorophore that is used in the biological specimen.

  2. Relativistic Scott correction in self-generated magnetic fields

    DEFF Research Database (Denmark)

    Erdös, Laszlo; Fournais, Søren; Solovej, Jan Philip

    2012-01-01

    We consider a large neutral molecule with total nuclear charge $Z$ in a model with self-generated classical magnetic field and where the kinetic energy of the electrons is treated relativistically. To ensure stability, we assume that $Z\alpha < 2/\pi$, where $\alpha$ denotes the fine structure constant. We are interested in the ground state energy in the simultaneous limit $Z \rightarrow \infty$, $\alpha \rightarrow 0$ such that $\kappa=Z \alpha$ is fixed. The leading term in the energy asymptotics is independent of $\kappa$; it is given by the Thomas-Fermi energy of order $Z^{7/3}$ and it is unchanged by including the self-generated magnetic field. We prove the first correction term to this energy, the so-called Scott correction of the form $S(\alpha Z) Z^2$. The current paper extends the result of \cite{SSS} on the Scott correction for relativistic molecules to include a self-generated magnetic field.

  3. Inclusion of Yang–Mills fields in string corrected supergravity

    OpenAIRE

    Bellucci, S.; O'Reilly, D.

    2008-01-01

    We consistently incorporate Yang-Mills matter fields into string-corrected (deformed), D=10, N=1 supergravity. We solve the Bianchi identities within the framework of the modified beta-function-favored constraints to second order in the string slope parameter $\gamma$, also including the Yang-Mills fields. In the torsion, curvature and H sectors we find that a consistent solution is readily obtained with a Yang-Mills modified supercurrent $A_{abc}$. We find a solution in the F sector following our...

  4. Local field corrections in the lattice dynamics of chromium | Ndukwe ...

    African Journals Online (AJOL)

    This work extends the inclusion of local field corrections in the calculation of the phonon dispersion curves to the transition metal chromium (Cr3+) using the formalism of lattice dynamics based on the transition metal model potential approach in the adiabatic and harmonic approximations. The results obtained here have a ...

  5. Mass corrections in string theory and lattice field theory

    International Nuclear Information System (INIS)

    Del Debbio, Luigi; Kerrane, Eoin; Russo, Rodolfo

    2009-01-01

    Kaluza-Klein (KK) compactifications of higher-dimensional Yang-Mills theories contain a number of 4-dimensional scalars corresponding to the internal components of the gauge field. While at tree level the scalar zero modes are massless, it is well known that quantum corrections make them massive. We compute these radiative corrections at 1 loop in an effective field theory framework, using the background field method and proper Schwinger-time regularization. In order to clarify the proper treatment of the sum over KK modes in the effective field theory approach, we consider the same problem in two different UV completions of Yang-Mills theory: string theory and lattice field theory. In both cases, when the compactification radius R is much bigger than the scale of the UV completion ($R \gg \sqrt{\alpha'},\ a$), we recover a mass renormalization that is independent of the UV scale and agrees with the one derived in the effective field theory approach. These results support the idea that the value of the mass corrections is, in this regime, universal for any UV completion that respects locality and gauge invariance. The string analysis suggests that this property holds also at higher loops. The lattice analysis suggests that the mass of the adjoint scalars appearing in N=2,4 super Yang-Mills is highly suppressed, even if the lattice regularization breaks all supersymmetries explicitly. This is due to an interplay between the higher-dimensional gauge invariance and the degeneracy of bosonic and fermionic degrees of freedom.

  6. DEVELOPMENT AND TESTING OF ERRORS CORRECTION ALGORITHM IN ELECTRONIC DESIGN AUTOMATION

    Directory of Open Access Journals (Sweden)

    E. B. Romanova

    2016-03-01

    Subject of Research. We have developed and present a method for correcting design errors in printed circuit boards (PCB) in electronic design automation (EDA). Control of the process parameters of a PCB in EDA is carried out by means of the Design Rule Check (DRC) program. The DRC program monitors compliance with the design rules (minimum width of conductors and gaps, parameters of pads and via-holes, parameters of polygons, etc.) and also checks the route tracing, short circuits, the presence of objects outside the PCB edge and other design errors. The result of running the DRC program is a generated error report. For quality production of circuit boards, DRC errors should be corrected, which is ensured by the creation of an error-free DRC report. Method. A problem of correction repeatability of DRC errors was identified as a result of trial operation of the P-CAD, Altium Designer and KiCAD programs. To solve it, an analysis of DRC errors was carried out and the methods of their correction were studied. It was proposed to cluster the DRC errors: each group contains error types whose correction sequence has no impact on the correction time. An algorithm for the correction of DRC errors is proposed (a sketch of the clustering idea is given below). Main Results. The best correction sequence of DRC errors has been determined. The algorithm has been tested in the following EDA programs: P-CAD, Altium Designer and KiCAD. Testing was carried out on two- and four-layer test PCBs (digital and analog). The DRC error correction time with the algorithm was compared to the time without it. It was shown that the time saved for DRC error correction increases with the number of error types, up to a factor of 3.7. Practical Relevance. Application of the proposed algorithm will reduce PCB design time and improve the quality of the PCB design. We recommend using the developed algorithm when the number of error types is four or more. The proposed algorithm can be used in different
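
    The clustering idea can be sketched as follows; the group ordering used here is an illustrative assumption, not the paper's measured best sequence:

    ```python
    from collections import defaultdict

    # Errors whose correction order does not affect correction time are grouped
    # by type; groups are then processed in a fixed sequence.
    GROUP_ORDER = ["clearance", "width", "via", "silkscreen"]   # assumed ordering

    def correction_queue(errors):
        """errors: list of (error_type, location) pairs from a DRC report."""
        groups = defaultdict(list)
        for err_type, location in errors:
            groups[err_type].append(location)
        queue = []
        for err_type in GROUP_ORDER:
            queue.extend((err_type, loc) for loc in groups.get(err_type, []))
        return queue

    print(correction_queue([("width", "U3.2"), ("clearance", "R1.1")]))
    ```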

  7. Mean field with corrections in lattice gauge theory

    International Nuclear Information System (INIS)

    Flyvbjerg, H.; Zuber, J.B.; Lautrup, B.

    1981-12-01

    A systematic expansion of the path integral for lattice gauge theory is performed around the mean field solution. In this letter the authors present the results for the pure gauge groups Z(2), SU(2) and SO(3). The agreement with Monte Carlo calculations is excellent. For the discrete group the calculation is performed with and without gauge fixing, whereas for the continuous groups gauge fixing is mandatory. In the case of SU(2) the absence of a phase transition is correctly signalled by mean field theory. (Auth.)

  8. Text recognition and correction for automated data collection by mobile devices

    Science.gov (United States)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    Participatory sensing is an approach which allows mobile devices such as mobile phones to be used for data collection, analysis and sharing processes by individuals. Data collection is the first and most important part of a participatory sensing system, but it is time-consuming for the participants. In this paper, we discuss automatic data collection approaches for reducing the time required for collection and increasing the amount of collected data. In this context, we explore automated text recognition on images of store receipts captured by mobile phone cameras, and the correction of the recognized text. Accordingly, our first goal is to evaluate the performance of the Optical Character Recognition (OCR) method with respect to data collection from store receipt images. Images captured by mobile phones exhibit some typical problems, and common image processing methods cannot handle some of them. Consequently, the second goal is to address these types of problems through our proposed Knowledge Based Correction (KBC) method used in support of the OCR, and also to evaluate the KBC method with respect to the improvement of the accurate recognition rate. Results of the experiments show that the KBC method improves the accurate data recognition rate noticeably.
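
    A toy sketch of a knowledge-based correction pass of this general kind: each OCR token is snapped to the closest entry of a domain lexicon when the match is close enough. The lexicon, threshold and matcher are illustrative assumptions, not the paper's KBC method itself.

    ```python
    import difflib

    LEXICON = {"milk", "bread", "butter", "total", "cash"}   # toy receipt lexicon

    def correct_tokens(tokens, cutoff=0.7):
        fixed = []
        for tok in tokens:
            match = difflib.get_close_matches(tok.lower(), LEXICON, n=1, cutoff=cutoff)
            fixed.append(match[0] if match else tok)         # keep token if no match
        return fixed

    print(correct_tokens(["m1lk", "t0tal", "8read"]))        # ['milk', 'total', 'bread']
    ```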

  9. Flat-field and Dark Frame Corrections for IRIS Data

    Science.gov (United States)

    Saar, S. H.; Jaeggli, S. A.; Bush, R. I.; Boerner, P.; Wuelser, J.; Tarbell, T. D.; Lites, B. W.; De Pontieu, B.

    2013-12-01

    We discuss the development of flat-field and dark frame corrections for Interface Region Imaging Spectrograph (IRIS) data. Flat-fields for IRIS were taken prior to launch using a lamp filtered to NUV wavelengths; following launch the Sun itself was used as a flat-field source. The solar flat-field for the slit-jaw imagers is constructed using the Chae method, which extracts the moving object and fixed gain patterns from a set of dithered images. The spectrographic flat-fields are produced by significantly defocusing the telescope and averaging many images in a scan to remove solar structure, so that the average spectral profile can be removed. A given dark frame consists of pedestal and dark current components. In IRIS, both are temperature dependent, though they respond to different measured temperatures; the dark current is also exposure time dependent. Each CCD readout port has a slightly different temperature and exposure time response. We have analyzed a series of dark frames over an IRIS orbit to calibrate for these effects. We plan to continue to monitor the flat-field and dark frames regularly for any changes.
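
    The dark model described above (temperature-dependent pedestal plus temperature- and exposure-time-dependent dark current, fitted per readout port) can be sketched as below; the functional forms and coefficients are assumptions for illustration only.

    ```python
    import numpy as np

    def dark_frame(T_pedestal, T_dark, t_exp, c):
        """Per-port dark model: pedestal(T1) + dark_rate(T2) * exposure time."""
        pedestal = c["p0"] + c["p1"] * T_pedestal          # assumed linear in T
        dark_rate = c["d0"] * np.exp(c["d1"] * T_dark)     # assumed exponential in T
        return pedestal + dark_rate * t_exp

    coeff = {"p0": 250.0, "p1": 1.2, "d0": 0.05, "d1": 0.08}   # illustrative fit
    model_dn = dark_frame(T_pedestal=-5.0, T_dark=-2.0, t_exp=15.0, c=coeff)
    ```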

  10. Efficient Photometry In-Frame Calibration (EPIC) Gaussian Corrections for Automated Background Normalization of Rate-Tracked Satellite Imagery

    Science.gov (United States)

    Griesbach, J.; Wetterer, C.; Sydney, P.; Gerber, J.

    Photometric processing of non-resolved Electro-Optical (EO) images has commonly required the use of dark and flat calibration frames that are obtained to correct for charge coupled device (CCD) dark (thermal) noise and CCD quantum efficiency/optical path vignetting effects respectively. It is necessary to account/calibrate for these effects so that the brightness of objects of interest (e.g. stars or resident space objects (RSOs)) may be measured in a consistent manner across the CCD field of view. Detected objects typically require further calibration using aperture photometry to compensate for sky background (shot noise). For this, annuluses are measured around each detected object whose contained pixels are used to estimate an average background level that is subtracted from the detected pixel measurements. In a new photometric calibration software tool developed for AFRL/RD, called Efficient Photometry In-Frame Calibration (EPIC), an automated background normalization technique is proposed that eliminates the requirement to capture dark and flat calibration images. The proposed technique simultaneously corrects for dark noise, shot noise, and CCD quantum efficiency/optical path vignetting effects. With this, a constant detection threshold may be applied for constant false alarm rate (CFAR) object detection without the need for aperture photometry corrections. The detected pixels may be simply summed (without further correction) for an accurate instrumental magnitude estimate. The noise distribution associated with each pixel is assumed to be sampled from a Poisson distribution. Since Poisson distributed data closely resembles Gaussian data for parameterized means greater than 10, the data may be corrected by applying bias subtraction and standard-deviation division. EPIC performs automated background normalization on rate-tracked satellite images using the following technique. A deck of approximately 50-100 images is combined by performing an independent median
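
    The Gaussian normalization at the heart of the approach can be sketched as follows: a median over the rate-tracked deck estimates the per-pixel background (moving stars are rejected by the median), and bias subtraction plus standard-deviation division yields unit-variance residuals to which one CFAR threshold applies. This is a simplified reading of the EPIC procedure, not its implementation.

    ```python
    import numpy as np

    def normalize_stack(frames):
        frames = np.asarray(frames, dtype=float)       # (n_frames, ny, nx)
        background = np.median(frames, axis=0)         # per-pixel background estimate
        sigma = np.sqrt(np.maximum(background, 1.0))   # Poisson: variance ~ mean
        return (frames - background) / sigma           # bias-subtract, std-divide

    normalized = normalize_stack(np.random.poisson(100.0, (50, 64, 64)))
    detections = normalized > 5.0                      # constant false-alarm threshold
    ```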

  11. A new controller for the JET error field correction coils

    International Nuclear Information System (INIS)

    Zanotto, L.; Sartori, F.; Bigi, M.; Piccolo, F.; De Benedetti, M.

    2005-01-01

    This paper describes the hardware and the software structure of a new controller for the JET error field correction coils (EFCC) system, a set of ex-vessel coils that recently replaced the internal saddle coils. The EFCC controller has been developed on a conventional VME hardware platform using a new software framework, recently designed for real-time applications at JET, and replaces the old disruption feedback controller, increasing the flexibility and the optimization of the system. The use of conventional hardware has required a particular effort in designing the software in order to meet the specifications. The peculiarities of the new controller are highlighted, such as its very useful trigger logic interface, which allows, in principle, exploring various error field experiment scenarios.

  12. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    Science.gov (United States)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
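
    A compressed sketch of the ingredients named above: zero- and first-order phasing, a Whittaker-smoother baseline, and an objective combining a strong negativity penalty with a weak roughness regularization. The Pareto aspect is collapsed to one weighted objective here, and all weights are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def whittaker(y, lam=1e5):
        """Whittaker smoother: minimize |z - y|^2 + lam * |D2 z|^2."""
        D = np.diff(np.eye(y.size), 2, axis=0)         # second-difference operator
        return np.linalg.solve(np.eye(y.size) + lam * D.T @ D, y)

    def objective(phi, spectrum):
        f = np.linspace(-0.5, 0.5, spectrum.size)
        phased = (spectrum * np.exp(-1j * (phi[0] + phi[1] * f))).real
        baseline = whittaker(phased)
        residual = phased - baseline
        negativity = np.sum(np.minimum(residual, 0.0) ** 2)   # strong penalty
        roughness = np.sum(np.diff(baseline, 2) ** 2)         # weak regularization
        return 1e3 * negativity + roughness

    t = np.arange(512)
    fid = np.exp(-t / 100.0 + 2j * np.pi * 0.1 * t) * np.exp(0.4j)  # synthetic FID
    spectrum = np.fft.fftshift(np.fft.fft(fid))
    phi0, phi1 = minimize(objective, x0=[0.0, 0.0], args=(spectrum,),
                          method="Nelder-Mead").x
    ```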

  13. Automation of NLO QCD and EW corrections with Sherpa and Recola

    Energy Technology Data Exchange (ETDEWEB)

    Biedermann, Benedikt; Denner, Ansgar; Pellen, Mathieu [Universitaet Wuerzburg, Institut fuer Theoretische Physik und Astrophysik, Wuerzburg (Germany); Braeuer, Stephan; Schumann, Steffen [Georg-August Universitaet Goettingen, II. Physikalisches Institut, Goettingen (Germany); Thompson, Jennifer M. [Universitaet Heidelberg, Institut fuer Theoretische Physik, Heidelberg (Germany)

    2017-07-15

    This publication presents the combination of the one-loop matrix-element generator Recola with the multipurpose Monte Carlo program Sherpa. Since both programs are highly automated, the resulting Sherpa+Recola framework allows for the computation of - in principle - any Standard Model process at both NLO QCD and EW accuracy. To illustrate this, three representative LHC processes have been computed at NLO QCD and EW: vector-boson production in association with jets, off-shell Z-boson pair production, and the production of a top-quark pair in association with a Higgs boson. In addition to fixed-order computations, when considering QCD corrections, all functionalities of Sherpa, i.e. particle decays, QCD parton showers, hadronisation, underlying events, etc. can be used in combination with Recola. This is demonstrated by the merging and matching of one-loop QCD matrix elements for Drell-Yan production in association with jets to the parton shower. The implementation is fully automatised, thus making it a perfect tool for both experimentalists and theorists who want to use state-of-the-art predictions at NLO accuracy. (orig.)

  14. Automated fetal brain segmentation from 2D MRI slices for motion correction.

    Science.gov (United States)

    Keraudren, K; Kuklisova-Murgasova, M; Kyriakopoulou, V; Malamateniou, C; Rutherford, M A; Kainz, B; Hajnal, J V; Rueckert, D

    2014-11-01

    Motion correction is a key element for imaging the fetal brain in utero using Magnetic Resonance Imaging (MRI). Maternal breathing can introduce motion, but a larger effect is frequently due to fetal movement within the womb. Consequently, imaging is frequently performed slice-by-slice using single-shot techniques, which are then combined into volumetric images using slice-to-volume reconstruction methods (SVR). For successful SVR, a key preprocessing step is to isolate fetal brain tissues from maternal anatomy before correcting for the motion of the fetal head. This has hitherto been a manual or semi-automatic procedure. We propose an automatic method to localize and segment the brain of the fetus when the image data is acquired as stacks of 2D slices with anatomy misaligned due to fetal motion. We combine this segmentation process with a robust motion correction method, enabling the segmentation to be refined as the reconstruction proceeds. The fetal brain localization process uses Maximally Stable Extremal Regions (MSER), which are classified using a Bag-of-Words model with Scale-Invariant Feature Transform (SIFT) features. The segmentation process is a patch-based propagation of the MSER regions selected during detection, combined with a Conditional Random Field (CRF). The gestational age (GA) is used to incorporate prior knowledge about the size and volume of the fetal brain into the detection and segmentation process. The method was tested in a ten-fold cross-validation experiment on 66 datasets of healthy fetuses whose GA ranged from 22 to 39 weeks. In 85% of the tested cases, our proposed method produced a motion-corrected volume of a relevant quality for clinical diagnosis, thus removing the need for manually delineating the contours of the brain before motion correction. Our method automatically generated as a side-product a segmentation of the reconstructed fetal brain with a mean Dice score of 93%, which can be used for further processing.

  15. Field test of distribution automation equipment at Kansas Utilities

    International Nuclear Information System (INIS)

    Pahwa, A.; Shultis, J.K.; Dowling, W.N.

    1991-01-01

    Distribution automation can be beneficial to utilities by reducing operating cost and increasing efficiency. Since the needs of most utilities differ, obtaining first-hand experience with distribution automation equipment is important. In this paper, the experiences of Kansas City Power and Light Co. and Midwest Energy, Inc. (members of the Kansas Electric Utilities Research Program) related to the operation of pilot distribution automation systems are described. Data gathered on equipment failures are also included. As part of this project, a microcomputer program was developed for cost/benefit analysis of eight distribution automation functions. In this paper, salient features of this program are discussed and results of an example are presented.

  16. Correcting GRACE gravity fields for ocean tide effects

    DEFF Research Database (Denmark)

    Knudsen, Per; Andersen, Ole Baltazar

    2002-01-01

    The GRACE mission will be launched in early 2002 and will map the Earth's gravity field and its variations with unprecedented accuracy during its 5-year lifetime. Unless ocean tide signals and their load upon the solid earth are removed from the GRACE data, their long-period aliases obscure the more subtle climate signals at which GRACE aims. The difference between two existing ocean tide models can be used as an estimate of the current tidal model error for the M2, S2, K1, and O1 constituents. When compared with the expected accuracy of the GRACE system, both expressed as spherical harmonic degree variances, we find that the current ocean tide models are not accurate enough to correct GRACE data at harmonic degrees lower than 35. The accumulated tidal errors may affect the GRACE data up to harmonic degree 56. Furthermore, the atmospheric (radiation) tides may cause significant errors in the ocean...
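
    For reference, the degree variances used in such comparisons are conventionally built from the spherical harmonic coefficient differences of the two models (standard geodesy convention, not quoted from the paper):

    ```latex
    \sigma_{\ell}^{2} \;=\; \sum_{m=0}^{\ell}
    \left( \Delta\bar{C}_{\ell m}^{2} + \Delta\bar{S}_{\ell m}^{2} \right)
    ```

    where the ΔC̄ and ΔS̄ terms are the differences of the normalized coefficients between the two ocean tide models, compared degree by degree with the expected GRACE error spectrum.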

  17. Automated eye blink detection and correction method for clinical MR eye imaging.

    Science.gov (United States)

    Wezel, Joep; Garpebring, Anders; Webb, Andrew G; van Osch, Matthias J P; Beenakker, Jan-Willem M

    2017-07-01

    To implement an on-line monitoring system to detect eye blinks during ocular MRI using field probes, and to reacquire corrupted k-space lines by means of an automatic feedback system integrated with the MR scanner. Six healthy subjects were scanned on a 7 Tesla whole-body MRI system using a custom-built receive coil. Subjects were asked to blink multiple times during the MR scan. The local magnetic field changes were detected with an external fluorine-based field probe positioned close to the eye. When an eye blink produced a field shift greater than a threshold level, this was communicated in real time to the MR system, which immediately reacquired the motion-corrupted k-space lines. The uncorrected images, using the original motion-corrupted data, showed severe artifacts, whereas the corrected images, using the reacquired data, provided an image quality similar to images acquired without blinks. Field probes can successfully detect eye blinks during MRI scans. By automatically reacquiring the eye blink-corrupted data, high quality MR images of the eye can be acquired. Magn Reson Med 78:165-171, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  18. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    California utilities have been exploring the use of critical peak prices (CPP) to help reduce needle peaks in customer end-use loads. CPP is a form of price-responsive demand response (DR). Recent experience has shown that customers have limited knowledge of how to operate their facilities in order to reduce their electricity costs under CPP (Quantum 2004). While the lack of knowledge about how to develop and implement DR control strategies is a barrier to participation in DR programs like CPP, another barrier is the lack of automation of DR systems. During 2003 and 2004, the PIER Demand Response Research Center (DRRC) conducted a series of tests of fully automated electric demand response (Auto-DR) at 18 facilities. Overall, the average of the site-specific average coincident demand reductions was 8% from a variety of building types and facilities. Many electricity customers have suggested that automation will help them institutionalize their electric demand savings and improve their overall response and DR repeatability. This report focuses on and discusses the specific results of the Automated Critical Peak Pricing (Auto-CPP, a specific type of Auto-DR) tests that took place during 2005, which build on the automated demand response (Auto-DR) research conducted through PIER and the DRRC in 2003 and 2004. The long-term goal of this project is to understand the technical opportunities of automating demand response and to remove technical and market impediments to large-scale implementation of automated demand response (Auto-DR) in buildings and industry. A second goal of this research is to understand and identify best practices for DR strategies and opportunities. The specific objectives of the Automated Critical Peak Pricing test were as follows: (1) Demonstrate how an automated notification system for critical peak pricing can be used in large commercial facilities for demand response (DR). (2) Evaluate effectiveness of such a system. (3) Determine how customers

  19. Correction of an input function for errors introduced with automated blood sampling

    Energy Technology Data Exchange (ETDEWEB)

    Schlyer, D.J.; Dewey, S.L. [Brookhaven National Lab., Upton, NY (United States)

    1994-05-01

    Accurate kinetic modeling of PET data requires a precise arterial plasma input function. The use of automated blood sampling machines has greatly improved the accuracy, but errors can be introduced by the dispersion of the radiotracer in the sampling tubing. This dispersion results from three effects. The first is the spreading of the radiotracer in the tube due to mass transfer. The second is due to the mechanical action of the peristaltic pump and can be determined experimentally from the width of a step function. The third is the adsorption of the radiotracer on the walls of the tubing during transport through the tube. This is a more insidious effect, since the amount recovered from the end of the tube can be significantly different from that introduced into the tubing. We have measured the simple mass transport using [{sup 18}F]fluoride in water, which we have shown to be quantitatively recovered with no interaction with the tubing walls. We have also carried out experiments with several radiotracers including [{sup 18}F]haloperidol, [{sup 11}C]L-deprenyl, [{sup 18}F]N-methylspiroperidol ([{sup 18}F]NMS) and [{sup 11}C]buprenorphine. In all cases there was some retention of the radiotracer by untreated silicone tubing. The amount retained in the tubing ranged from 6% for L-deprenyl to 30% for NMS. The retention of the radiotracer was essentially eliminated after pretreatment with the relevant unlabeled compound. For example, less than 2% of the [{sup 18}F]NMS was retained in tubing treated with unlabelled NMS. Similar results were obtained with baboon plasma, although the amount retained in the untreated tubing was less in all cases. From these results it is possible to apply a mathematical correction to the measured input function to account for mechanical dispersion, and to apply a chemical passivation to the tubing to reduce the dispersion due to adsorption of the radiotracer on the tubing walls.
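
    For the mechanical-dispersion part, a common correction model treats the tubing as a single-exponential dispersion kernel, whose convolution can be inverted analytically; the sketch below uses that model with an assumed time constant (to be fitted from the measured step-response width):

    ```python
    import numpy as np

    def dispersion_correct(measured, dt=1.0, tau=5.0):
        """If measured = true convolved with (1/tau)*exp(-t/tau),
        then true = measured + tau * d(measured)/dt."""
        return measured + tau * np.gradient(measured, dt)

    # Usage: apply to the sampled input function before kinetic modeling.
    t = np.arange(0, 120.0, 1.0)
    measured = np.exp(-((t - 30.0) / 12.0) ** 2)   # stand-in measured curve
    corrected = dispersion_correct(measured, dt=1.0, tau=5.0)
    ```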

  20. Automated weed detection in the field - possibilities and limits

    Directory of Open Access Journals (Sweden)

    Pflanz, Michael

    2016-02-01

    Unmanned Aerial Vehicles (UAVs) have become omnipresent and adequate tools to generate high-resolution spatial data of agricultural cropland. Their implementation in remote sensing of weeds provides suitable applications for site-specific herbicide management. In general, the increasing use of innovative technologies gradually leads from agricultural research into practical application. This requires an evaluation of the possibilities and limits of UAV-based remote sensing procedures. While spectral data from UAVs are already being used for mapping nutrient or water needs, image-supported weed detection is much more complex and at the moment not relevant in practice. In this regard, weed and crop differentiation through spectral analyses is lacking, and object-based approaches either do not separate different plants species-specifically or are not adapted to morphological changes during growth. Moreover, there is a need for alternative positioning techniques without GPS, as required for precise optical imaging analysis at low altitudes. To evaluate the possibilities and limitations of automated weed identification regarding the optical and sampling requirements, flights were carried out with a hexacopter at an altitude of 5 m over agricultural cropland with variable weed patches. The altitude was controlled by the GPS autopilot. Images were captured at geo-referenced points and the number of different weed species was simultaneously determined by manual counting. The required optical resolution on the ground was estimated by comparing the number of weeds between image analysis on the PC and the field rating data.

  1. Free-field correction values for Interacoustics DD 45 supra-aural audiometric earphones

    DEFF Research Database (Denmark)

    Poulsen, Torben

    2010-01-01

    This paper reports free-field correction values for the Interacoustics DD 45 audiometric earphones. The free-field correction values for earphones provide the loudness-based equivalence to loudspeaker presentation. Correction values are especially used for the calibration of audiometric equipment ...

  2. Automated Critical PeakPricing Field Tests: 2006 Pilot ProgramDescription and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila

    2007-06-19

    During 2006 Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology evaluation for the Pacific Gas and Electric Company (PG&E) Emerging Technologies Programs. This report summarizes the design, deployment, and results from the 2006 Automated Critical Peak Pricing Program (Auto-CPP). The program was designed to evaluate the feasibility of deploying automation systems that allow customers to participate in critical peak pricing (CPP) with a fully-automated response. The 2006 program was in operation during the entire six-month CPP period from May through October. The methodology for this field study included site recruitment, control strategy development, automation system deployment, and evaluation of sites' participation in actual CPP events through the summer of 2006. LBNL recruited sites in PG&E's territory in northern California through contacts from PG&E account managers, conferences, and industry meetings. Each site contact signed a memorandum of understanding with LBNL that outlined the activities needed to participate in the Auto-CPP program. Each facility worked with LBNL to select and implement control strategies for demand response and developed automation system designs based on existing Internet connectivity and building control systems. Once the automation systems were installed, LBNL conducted communications tests to ensure that the Demand Response Automation Server (DRAS) correctly provided and logged the continuous communications of the CPP signals with the energy management and control system (EMCS) for each site. LBNL also observed and evaluated Demand Response (DR) shed strategies to ensure proper commissioning of controls. The communication system allowed sites to receive day-ahead as well as day-of signals for pre-cooling, a DR strategy used at a few sites. Measurement of demand response was conducted using two different baseline models for estimating peak load savings. One

  3. The magnetic field for the ZEUS central detector - analysis and correction of the field measurement

    International Nuclear Information System (INIS)

    Mengel, S.

    1992-06-01

    The magnetic field in the central tracking region of the ZEUS detector - a facility to investigate highly energetic electron-proton collisions at the HERA collider at DESY, Hamburg - is generated by a superconducting coil and reaches 18 kG (1.8 T). Some of the tracking devices, particularly the drift chambers in the proton forward and rear directions (FTD1-3 and RTD), are not fully contained within the coil and are therefore situated in a highly inhomogeneous magnetic field: the radial component B_r reaches up to 6.6 kG, and maximum gradients of 300 G/cm are found for ∂B_r/∂r. Evaluating the space-drift-time relation necessitates a detailed knowledge of the magnetic field. To reach this goal we analysed the field measurements and corrected them for systematic errors. The corrected data were compared with the field calculations (TOSCA maps). Measurements and calculations are confirmed by studying consistency with Maxwell's equations. The accuracy reached is better than 100 G throughout the forward and central drift chambers (FTD1-3, CTD) and better than 150 G in the RTD. (orig.) [de]

  4. Automation surprise : results of a field survey of Dutch pilots

    NARCIS (Netherlands)

    de Boer, R.J.; Hurts, Karel

    2017-01-01

    Automation surprise (AS) has often been associated with aviation safety incidents. Although numerous laboratory studies have been conducted, few data are available from routine flight operations. A survey among a representative sample of 200 Dutch airline pilots was used to determine the prevalence

  5. Correction of the closed orbit and vertical dispersion and the tuning and field correction system in ISABELLE

    Energy Technology Data Exchange (ETDEWEB)

    Parzen, G.

    1979-01-01

    Each ring in ISABELLE will have 10 separately powered systematic field correction coils to make required corrections which are the same in corresponding magnets around the ring. These corrections include changing the ν-value, shaping the working line in ν-space, and correcting field errors due to iron saturation effects, the conductor arrangements, the construction of the coil ends, diamagnetic effects in the superconductor, and rate-dependent induced currents. The twelve insertion quadrupoles in the insertion surrounding each crossing point will each have a quadrupole trim coil. The closed orbit will be controlled by a system of 84 horizontal dipole coils and 90 vertical dipole coils in each ring, each coil being separately powered. This system of dipole coils will also be used to correct the vertical dispersion at the crossing points. Two families of skew quadrupoles per ring will be provided for correction of the coupling between the horizontal and vertical motions. In all, there will be 258 separately powered correction coils in each ring.

  6. One-loop Correction of the Tachyon Action in Boundary Superstring Field Theory

    NARCIS (Netherlands)

    Alishahiha, M.

    2001-01-01

    We compute the one-loop correction to the boundary superstring field theory tachyon action. We would expect the one-loop correction to come from the partition function of the two-dimensional worldsheet theory on the annulus. The annulus correction suggests that the genus expansion is, somehow, governed by the effective string

  7. Evaluation of full field automated photoelastic analysis based on phase stepping

    Science.gov (United States)

    Haake, S. J.; Wang, Z. F.; Patterson, E. A.

    A full-field automated polariscope designed for photoelastic analysis and based on the method of phase stepping is described. The system is evaluated through the analysis of five different photoelastic models using both the automated system and manual analysis employing the Tardy compensation method. The models were chosen to provide a range of different fringe patterns, orders, and stress gradients: a disk in diametral compression, a constrained beam subject to a point load, a tensile plate with a central hole, a turbine blade, and a turbine disk slot. The repeatability of the full-field system was found to compare well with that of point-by-point systems. The worst isochromatic error was approximately 0.007 fringes, and the corresponding isoclinic error was 0.75°. Results from the manual and automated methods showed good agreement. It is concluded that automated photoelastic analysis based on phase-stepping procedures offers a potentially accurate and reliable tool for stress analysts.
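
    The phase-stepping reduction itself is compact. In one widely used six-image formulation (the intensity equations in the comments are the textbook ones and are assumed here, not quoted from this paper), the isoclinic angle and the wrapped isochromatic phase follow from arctangent combinations of the stepped images:

        import numpy as np

        def six_step(i1, i2, i3, i4, i5, i6):
            """Assumes i1 = Im + Ia*cos(d), i2 = Im - Ia*cos(d),
            i3/i5 = Im -/+ Ia*sin(2*theta)*sin(d),
            i4/i6 = Im +/- Ia*cos(2*theta)*sin(d)."""
            theta = 0.5 * np.arctan2(i5 - i3, i4 - i6)   # isoclinic angle
            num = (i5 - i3) * np.sin(2 * theta) + (i4 - i6) * np.cos(2 * theta)
            delta = np.arctan2(num, i1 - i2)             # wrapped isochromatic phase
            return theta, delta / (2 * np.pi)            # fractional fringe order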

  8. Assessment of automated disease detection in diabetic retinopathy screening using two-field photography.

    Directory of Open Access Journals (Sweden)

    Keith Goatman

    Full Text Available To assess the performance of automated disease detection in diabetic retinopathy screening using two-field mydriatic photography. Images from 8,271 sequential patient screening episodes from a South London diabetic retinopathy screening service were processed by the Medalytix iGrading™ automated grading system. For each screening episode macular-centred and disc-centred images of both eyes were acquired and independently graded according to the English national grading scheme. Where discrepancies were found between the automated result and the original manual grade, internal and external arbitration was used to determine the final study grades. Two versions of the software were used: one that detected microaneurysms alone, and one that detected blot haemorrhages and exudates in addition to microaneurysms. Results for each version were calculated once using both fields and once using the macula-centred field alone. Of the 8,271 episodes, 346 (4.2%) were considered unassessable. Referable disease was detected in 587 episodes (7.1%). The sensitivity of the automated system for detecting unassessable images ranged from 97.4% to 99.1% depending on configuration. The sensitivity of the automated system for referable episodes ranged from 98.3% to 99.3%. All the episodes that included proliferative or pre-proliferative retinopathy were detected by the automated system regardless of configuration (192/192, 95% confidence interval 98.0% to 100%). If implemented as the first step in grading, the automated system would have reduced the manual grading effort by between 2,183 and 3,147 patient episodes (26.4% to 38.1%). Automated grading can safely reduce the workload of manual grading using two-field mydriatic photography in a routine screening service.

  9. Field correction for a one meter long permanent-magnet wiggler

    International Nuclear Information System (INIS)

    Fortgang, C.M.

    1992-01-01

    Field errors in wigglers are usually measured and corrected on-axis only, thus ignoring field error gradients. We find that gradient scale lengths are of the same order as electron beam size and therefore can be important. We report measurements of wiggler field errors in three dimensions and expansion of these errors out to first order (including two dipole and two quadrupole components). Conventional techniques for correcting on-axis errors (order zero) create new off-axis (first order) errors. We present a new approach to correcting wiggler fields out to first order. By correcting quadrupole errors in addition to the usual dipole correction, we minimize growth in electron beam size. Correction to first order yields better overlap between the electron and optical beams and should improve laser gain. (Author) 2 refs., 5 figs
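
    To illustrate the first-order expansion described here, the dipole and quadrupole error terms at a given axial position can be extracted from short transverse scans by a linear fit; this generic Python sketch works under that assumption and is not the authors' analysis code.

        import numpy as np

        def first_order_terms(x_cm, by_gauss, bx_gauss):
            """Fit By(x) ~ b0 + b1*x and Bx(x) ~ a0 + a1*x at one z:
            (b0, a0) are the normal/skew dipole errors and (b1, a1)
            the normal/skew quadrupole (gradient) errors."""
            b1, b0 = np.polyfit(x_cm, by_gauss, 1)
            a1, a0 = np.polyfit(x_cm, bx_gauss, 1)
            return {"dipole": (b0, a0), "quadrupole": (b1, a1)}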

  10. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    OpenAIRE

    Bottaro, Márcio; Nagy, Balázs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral

    2017-01-01

    Abstract. Introduction: To analyze edge detection and optical contrast calculation of light field indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give an exact definition for light field edge determination. Methods: An automated light sensor array was used to measure the penumbra zone of the edge in standard X-ray equipment, while trained and naïve human observers were asked to mark the light field edge according t...

  11. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Energy Technology Data Exchange (ETDEWEB)

    Bottaro, Marcio; Nagy, Balazs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral, E-mail: marcio@iee.usp.br [Universidade de Sao Paulo (USP), SP (Brazil); Optics and Engineering Informatics, Budapest University of Technology and Economics, Budapest (Hungary)

    2017-04-15

    Introduction: To analyze edge detection and optical contrast calculation of light field indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give an exact definition for light field edge determination. Methods: An automated light sensor array was used to measure the penumbra zone of the edge in standard X-ray equipment, while trained and naive human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers' edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses for radiology patients. (author)

  12. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Directory of Open Access Journals (Sweden)

    Márcio Bottaro

    Full Text Available Abstract. Introduction: To analyze edge detection and optical contrast calculation of light field indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give an exact definition for light field edge determination. Methods: An automated light sensor array was used to measure the penumbra zone of the edge in standard X-ray equipment, while trained and naïve human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers' edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses for radiology patients.

  13. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    International Nuclear Information System (INIS)

    Bottaro, Marcio; Nagy, Balazs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral

    2017-01-01

    Introduction: To analyze edge detection and optical contrast calculation of light field indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give an exact definition for light field edge determination. Methods: An automated light sensor array was used to measure the penumbra zone of the edge in standard X-ray equipment, while trained and naive human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers' edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses for radiology patients. (author)
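
    A minimal numerical statement of the kind of exact edge definition these records argue for is a fixed fractional threshold on the normalized penumbra profile. In the Python sketch below the 50% level is only a placeholder, since the papers propose anchoring the definition to observer data rather than to any canonical value.

        import numpy as np

        def edge_position(x_mm, profile, fraction=0.5):
            """Edge = interpolated position where the normalized penumbra
            profile crosses `fraction`; assumes the scan runs from inside
            the light field (bright) outward (dark)."""
            i = (profile - profile.min()) / (profile.max() - profile.min())
            idx = int(np.argmax(i <= fraction))  # first sample at/below threshold
            x0, x1 = x_mm[idx - 1], x_mm[idx]
            y0, y1 = i[idx - 1], i[idx]
            return x0 + (fraction - y0) * (x1 - x0) / (y1 - y0)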

  14. Semi-Automated Correction Tools for Mathematics-Based Exercises in MOOC Environments

    Directory of Open Access Journals (Sweden)

    Alberto Corbi

    2015-06-01

    Full Text Available Massive Open Online Courses (MOOCs) allow the participation of hundreds of students who are interested in a wide range of areas. Given the huge numbers enrolled, it is almost impossible to give complex homework to students and have it carefully corrected and reviewed by a tutor or assistant professor. In this paper, we present a software framework that aims at assisting teachers in MOOCs during correction tasks for mathematics exercises. This framework might suit maths, physics or technical subjects. As a test experience, we apply it to 300+ physics homework bulletins from 80+ students. Test results show our solution can prove very useful in guiding assistant teachers during correction shifts.

  15. Automated field testing of a track-type tractor

    Science.gov (United States)

    Taylor, Michael A.; Lay, Keith; Struble, Joshua; Allen, William; Subrt, Michael

    2003-09-01

    During the design process, earthmoving manufacturers routinely subject machines to rigorous, long-term tests to ensure quality. Automating portions of the testing process can potentially reduce the cost and time to complete these tests. We present a system that guides a 175-horsepower track-type tractor (Caterpillar Model D6R XL) along a prescribed route, allowing simple tasks to be completed by the automated machine while more complex tasks, such as site clean-up, are handled by an operator. Additionally, the machine can be operated manually or via remote control and observed over the internet using a remote supervisor program. We envision that safety would be handled using work procedures, multiple override methods and a GPS fence. The current system can follow turns within a half meter and straight sections within a quarter meter. The controller hardware and software are integrated with existing on-board electronic modules and allow for portability. The current system successfully handles the challenges of a clutch-brake drive train and has the potential to improve control over test variables, lower testing costs and enable testing at higher speeds, allowing for higher-impact tests than a human operator can tolerate.

  16. An automated analysis workflow for optimization of force-field parameters using neutron scattering data

    Energy Technology Data Exchange (ETDEWEB)

    Lynch, Vickie E.; Borreguero, Jose M. [Neutron Data Analysis & Visualization Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Bhowmik, Debsindhu [Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Ganesh, Panchapakesan; Sumpter, Bobby G. [Center for Nanophase Material Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Proffen, Thomas E. [Neutron Data Analysis & Visualization Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Goswami, Monojoy, E-mail: goswamim@ornl.gov [Center for Nanophase Material Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States)

    2017-07-01

    Highlights: • An automated workflow to optimize force-field parameters. • The workflow was used to optimize force-field parameters for a system containing nanodiamond and tRNA. • The mechanism relies on molecular dynamics simulation and neutron scattering experimental data. • The workflow can be generalized to other experimental and simulation techniques. Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments to establish a connection between the fundamental physics at the nanoscale and data probed by neutrons. However, to perform simulations at experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail by using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D2O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of nanodiamond than without the ND. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that we can obtain accurate FF parameters by using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.
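
    The loop this workflow automates can be sketched generically: propose FF parameters, run MD, reduce the trajectory to a model S(Q,E), score it against the QENS data, and let a derivative-free optimizer drive the parameters. The Python skeleton below uses placeholder callables throughout and is not the ORNL implementation.

        import numpy as np
        from scipy.optimize import minimize

        def chi2(params, run_md, to_sqe, q, e, sqe_exp, sigma):
            """One workflow pass: simulate, reduce, compare to experiment."""
            traj = run_md(params)          # placeholder MD driver
            sqe_sim = to_sqe(traj, q, e)   # placeholder reduction to S(Q,E)
            return np.sum(((sqe_sim - sqe_exp) / sigma) ** 2)

        # A derivative-free search suits noisy MD observables, e.g.:
        # best = minimize(chi2, x0, args=(run_md, to_sqe, q, e, data, err),
        #                 method="Nelder-Mead")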

  17. Improved field-mapping and artifact correction in multispectral imaging.

    Science.gov (United States)

    Quist, Brady; Shi, Xinwei; Weber, Hans; Hargreaves, Brian A

    2017-11-01

    To develop a method for improved B0 field-map estimation, deblurring, and image combination for multispectral imaging near metal. A goodness-of-fit field-map estimation technique is proposed that uses only the multispectral imaging (MSI) data to estimate the field map. Using the improved field map, a novel deblurring technique is proposed that also employs a new image combination scheme to reduce the effects of noise and other residual MSI artifacts. The proposed field-map estimation and deblurring techniques are compared to the current methods in phantoms and/or in vivo in subjects with knee, hip, and spinal metallic implants. Phantom experiments validate that the goodness-of-fit field-map estimation is less sensitive to noise and bias than the conventional center-of-mass technique, which reduces distortion in the deblurring methods. The new deblurring approach also is substantially less sensitive to noise and distortion than the current deblurring method, as demonstrated in phantoms and in vivo, and is able to find a good tradeoff between deblurring and distortion. The proposed methods not only enable field-mapping with reduced noise sensitivity but are able to create deblurred images with less distortion and better signal-to-noise ratio with no additional scan time, thereby enabling improved visualization of underlying anatomy near metallic implants. Magn Reson Med 78:2022-2034, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
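
    The contrast between the two estimators can be made concrete: a center-of-mass estimate takes the intensity-weighted mean off-resonance over the spectral bins, while a goodness-of-fit estimate scans candidate frequencies for the best match to an expected bin profile. The Python sketch below assumes a generic profile model, since the paper's exact signal model is sequence-specific.

        import numpy as np

        def com_estimate(bins_hz, s):
            """Conventional center-of-mass field-map value for one voxel."""
            w = np.abs(s) ** 2
            return np.sum(bins_hz * w) / np.sum(w)

        def gof_estimate(candidates_hz, s, profile):
            """Goodness-of-fit value: the candidate off-resonance whose
            modeled bin response profile(f) best fits the measured
            bin magnitudes in the least-squares sense."""
            resid = [np.sum((np.abs(s) - profile(f)) ** 2) for f in candidates_hz]
            return candidates_hz[int(np.argmin(resid))]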

  18. Sludge settleability detection using automated SV30 measurement and its application to a field WWTP.

    Science.gov (United States)

    Kim, Y J; Choi, S J; Bae, H; Kim, C W

    2011-01-01

    The need for automation & measurement technologies to detect the process state has been a driving force in the development of various measurements at wastewater treatment plants. While the number of applications of automation & measurement technologies to the field is increasing, there have only been a few cases where they have been applied to the area of sludge settling. This is because it is not easy to develop an automated operation support system for the detection of sludge settleability due to its site-specific characteristics. To automate the human operator's daily test and diagnosis work on sludge settling, an on-line SV30 measurement was developed, together with an automated detection algorithm on settleability that imitates operator heuristics to detect settleability faults. The automated SV30 measurement is based on automatic pumping with a predefined schedule, image capture of the settling test with a digital camera, and analysis of the images to detect the settled sludge height. A sludge settleability detection method was thus developed and its applicability was investigated by field application.
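
    The image-analysis step lends itself to a compact illustration: locate the supernatant/sludge interface as the strongest brightness jump along a column of the cylinder image, then express the settled height as a percentage. This is a toy Python sketch under those assumptions, not the authors' algorithm.

        import numpy as np

        def interface_row(gray_column):
            """Row index of the largest brightness jump along one image
            column (assumed to be the supernatant-to-sludge transition)."""
            return int(np.argmax(np.abs(np.diff(gray_column.astype(float)))))

        def sv30_percent(rows_sludge, rows_total):
            """SV30 as settled-sludge volume percent after 30 min, assuming
            a uniform cylinder so heights are proportional to volumes."""
            return 100.0 * rows_sludge / rows_total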

  19. In vivo robotics: the automation of neuroscience and other intact-system biological fields.

    Science.gov (United States)

    Kodandaramaiah, Suhasa B; Boyden, Edward S; Forest, Craig R

    2013-12-01

    Robotic and automation technologies have played a huge role in in vitro biological science, having proved critical for scientific endeavors such as genome sequencing and high-throughput screening. Robotic and automation strategies are beginning to play a greater role in in vivo and in situ sciences, especially when it comes to the difficult in vivo experiments required for understanding the neural mechanisms of behavior and disease. In this perspective, we discuss the prospects for robotics and automation to influence neuroscientific and intact-system biology fields. We discuss how robotic innovations might be created to open up new frontiers in basic and applied neuroscience and present a concrete example with our recent automation of in vivo whole-cell patch clamp electrophysiology of neurons in the living mouse brain. © 2013 New York Academy of Sciences.

  20. Feasibility study for the computerized automation of the Annapolis Field Office of EPA region III

    International Nuclear Information System (INIS)

    Ames, H.S.; Barton, G.W. Jr.; Bystroff, R.I.; Crawford, R.W.; Kray, A.M.; Maples, M.D.

    1976-08-01

    This report describes a feasibility study for computerized automation of the Annapolis Field Office (AFO) of EPA's Region III. The AFO laboratory provides analytical support for a number of EPA divisions; its primary function at present is analysis of water samples from rivers, estuaries, and the ocean in the Chesapeake Bay area. Automation of the AFO laboratory is found to be not only feasible but also highly desirable. An automation system is proposed which will give major improvements in analytical capacity, quality control, sample management, and reporting capabilities. This system is similar to the LLL-developed automation systems already installed at other EPA laboratories, with modifications specific to the needs of the AFO laboratory and the addition of sample file control. It is estimated that the initial cost of the system, nearly $300,000, would be recouped in about three years by virtue of the increased capacity and efficiency of operation

  1. Automated mass correction and data interpretation for protein open-access liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Wagner, Craig D; Hall, John T; White, Wendy L; Miller, Luke A D; Williams, Jon D

    2007-02-01

    Characterization of recombinant protein purification fractions and final products by liquid chromatography-mass spectrometry (LC/MS) is requested more frequently each year. A protein open-access (OA) LC/MS system was developed in our laboratory to meet this demand. This paper compares the system that we originally implemented in our facilities in 2003 to the one now in use, and discusses, in more detail, recent enhancements that have improved its robustness, reliability, and data reporting capabilities. The system utilizes instruments equipped with reversed-phase chromatography and an orthogonal-acceleration time-of-flight mass spectrometer fitted with an electrospray source. Sample analysis requests are accomplished using a simple form on a web-enabled laboratory information management system (LIMS). This distributed form is accessible from any intranet-connected company desktop computer. Automated data acquisition and processing are performed using a combination of in-house (OA-Self Service, OA-Monitor, and OA-Analysis Engine) and vendor-supplied programs (AutoLynx and OpenLynx) located on acquisition computers and off-line processing workstations. Analysis results are then reported via the same web-based LIMS. Also presented are solutions to problems not addressed on commercially available, small-molecule OA-LC/MS systems. These include automated transforming of mass-to-charge (m/z) spectra to mass spectra and automated data interpretation that considers minor variants to the protein sequence, such as common post-translational modifications (PTMs). Currently, our protein OA-LC/MS platform runs on five LC/MS instruments located in three separate GlaxoSmithKline R&D sites in the US and UK. To date, more than 8000 protein OA-LC/MS samples have been analyzed. With these user-friendly and highly automated OA systems in place, mass spectrometry plays a key role in assessing the quality of recombinant proteins, either produced at our facilities or bought from external
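
    The automated transformation of m/z spectra to mass spectra mentioned above rests on standard electrospray charge-state arithmetic. The Python illustration below shows only that arithmetic and is not GlaxoSmithKline's OA-Analysis Engine:

        PROTON = 1.007276  # proton mass, Da

        def charge_from_adjacent(mz_high, mz_low):
            """Charge z of the envelope peak at mz_high, inferred from the
            adjacent peak at mz_low carrying charge z + 1:
            z = (mz_low - PROTON) / (mz_high - mz_low)."""
            return (mz_low - PROTON) / (mz_high - mz_low)

        def neutral_mass(mz, z):
            """Deconvolved (neutral) protein mass from one m/z peak."""
            return z * (mz - PROTON)

        # Example: peaks at 1112.12 and 1001.01 give z = 9, M ~ 10,000 Da.
        z = round(charge_from_adjacent(1112.12, 1001.01))
        print(z, neutral_mass(1112.12, z))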

  2. Radiative Corrections from Heavy Fast-Roll Fields during Inflation

    DEFF Research Database (Denmark)

    Jain, Rajeev Kumar; Sandora, McCullen; Sloth, Martin S.

    2015-01-01

    to an unobservable small running of the spectral index. An observable level of tensor modes can also be accommodated, but, surprisingly, this requires running to be induced by a curvaton. If upcoming observations are consistent with a small tensor-to-scalar ratio as predicted by small field models of inflation...

  3. N3 Bias Field Correction Explained as a Bayesian Modeling Method

    DEFF Research Database (Denmark)

    Larsen, Christian Thode; Iglesias, Juan Eugenio; Van Leemput, Koen

    2014-01-01

    Although N3 is perhaps the most widely used method for MRI bias field correction, its underlying mechanism is in fact not well understood. Specifically, the method relies on a relatively heuristic recipe of alternating iterative steps that does not optimize any particular objective function. [...] In this paper we explain the successful bias field correction properties of N3 by showing that it implicitly uses the same generative models and computational strategies as expectation maximization (EM) based bias field correction methods. We demonstrate experimentally that purely EM-based methods are capable
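
    The EM view the paper develops can be caricatured in a few lines: alternate between fitting a tissue-intensity mixture model to the bias-corrected log intensities and re-estimating the bias field as a smoothed residual. The Python sketch below is a toy of that generative picture, not N3 itself and not the paper's implementation.

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from sklearn.mixture import GaussianMixture

        def em_style_bias_correct(img, n_classes=3, n_iter=10, sigma_vox=20.0):
            """Toy EM-flavored bias correction of a 2-D/3-D intensity image."""
            log_i = np.log(np.clip(img, 1e-6, None))
            bias = np.zeros_like(log_i)
            for _ in range(n_iter):
                x = (log_i - bias).reshape(-1, 1)
                gmm = GaussianMixture(n_classes).fit(x)
                post = gmm.predict_proba(x)                          # E-step
                expected = (post @ gmm.means_.ravel()).reshape(log_i.shape)
                bias = gaussian_filter(log_i - expected, sigma_vox)  # smooth bias
            return img * np.exp(-bias)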

  4. Optical correction for multiple back reflections in an automated spectroradiometric measurement system

    Science.gov (United States)

    Garland, William C.; Biggar, Stuart F.; Zalewski, Edward F.; Thome, Kurtis J.

    2004-01-01

    The University of Arizona's Remote Sensing Group depends heavily upon automated solar radiometers and transfer radiometers for calibration of sensors. Interference filters are essential for these devices and accuracy in determining filter transmittance characteristics is crucial. The Remote Sensing Group uses a commercially available automated spectroradiometric measurement system equipped with a dual monochromator and a filter transmittance accessory for measuring filter transmittance. Examination of the design of the transmittance attachment and the detector assembly indicated the possibility of multiple back reflections between an interference filter and the detector, and that higher than expected transmittance values were likely. To reduce this, a fine annealed BK7 wedge with an 8-degree deviation angle was placed in the optical path between the transmittance accessory focusing lens and the detector. The purpose of this paper is to evaluate the performance of the system with the BK7 wedge. The effect of the wedge will be negligible for an absorption filter and possibly significant for interference filters in the band-pass region. Two interference filters were analyzed via three repeats for each of the following scenarios: broadband with and without the wedge, and band-pass with and without the wedge. The broadband (low spectral resolution) and band-pass (high spectral resolution) trials had comparable results, while the greatest percent difference in transmittance occurred in the out-of-band region due to the extremely small transmittance values associated with the noise level of the instrument in general and of each interference filter specifically. For the band-pass region, the trials yielded a 0.015 to 0.062 difference in transmittance, with the greatest difference occurring in the large gradient zone between the band-pass and the out-of-band region. The wedge makes a significant difference in transmittance measurements.

  5. Orbit correction in a linear nonscaling fixed field alternating gradient accelerator

    Directory of Open Access Journals (Sweden)

    D. J. Kelliher

    2014-11-01

    Full Text Available In a linear nonscaling fixed field alternating gradient accelerator, the large natural chromaticity of the machine results in a betatron tune that varies by several integers over the momentum range. Orbit correction is complicated by the consequent variation of the phase advance between lattice elements. Here we investigate how the correction of multiple closed orbit harmonics allows correction of both the closed orbit distortion and the accelerated orbit distortion over the momentum range.
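
    One standard way to realize multi-harmonic orbit correction numerically is a truncated-SVD solve of the corrector-to-BPM response matrix, where keeping only the leading singular vectors addresses the dominant closed-orbit harmonics first. The Python sketch below is generic accelerator practice, not the machine-specific scheme of the paper, whose response matrix would additionally vary with momentum.

        import numpy as np

        def corrector_kicks(orbit, response, n_sv=None):
            """Least-squares solve of response @ kicks = -orbit via SVD,
            optionally truncated to the n_sv strongest singular values."""
            u, s, vt = np.linalg.svd(response, full_matrices=False)
            if n_sv is not None:
                u, s, vt = u[:, :n_sv], s[:n_sv], vt[:n_sv]
            return -(vt.T @ ((u.T @ orbit) / s))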

  6. Correction

    CERN Multimedia

    2002-01-01

    Tile Calorimeter modules stored at CERN. The larger modules belong to the Barrel, whereas the smaller ones are for the two Extended Barrels. (The article was about the completion of the 64 modules for one of the latter.) The photo on the first page of the Bulletin n°26/2002, from 24 July 2002, illustrating the article «The ATLAS Tile Calorimeter gets into shape» was published with a wrong caption. We would like to apologise for this mistake and so publish it again with the correct caption.

  7. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.

    2015-01-01

    potential of the TMS coils. Objective: To develop an approach to reconstruct the magnetic vector potential based on automated measurements. Methods: We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel...

  8. Three-loop corrections in a covariant effective field theory

    International Nuclear Information System (INIS)

    McIntire, Jeff

    2008-01-01

    Chiral effective field theories have been used with success in the study of nuclear structure. It is of interest to systematically improve these energy functionals (particularly that of quantum hadrodynamics) through the inclusion of many-body correlations. One possible source of improvement is the loop expansion. Using the techniques of Infrared Regularization, the short-range, local dynamics at each order in the loops is absorbed into the parameterization of the underlying effective Lagrangian. The remaining nonlocal, exchange correlations must be calculated explicitly. Given that the interactions of quantum hadrodynamics are relatively soft, the loop expansion may be manageable or even perturbative in nuclear matter. This work investigates the role played by the three-loop contributions to the loop expansion for quantum hadrodynamics

  9. Automated correction on X-rays calibration using transmission chamber and LabVIEW™

    Energy Technology Data Exchange (ETDEWEB)

    Betti, Flavio; Potiens, Maria da Penha Albuquerque, E-mail: fbetti@ipen.b, E-mail: mppalbu@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2009-07-01

    Measurements during prolonged exposure times in X-ray calibration procedures at the Instruments Calibration facilities at IPEN may suffer from efficiency (and therefore intensity) variations of the industrial X-ray generator used. Using a transmission chamber as an online reference chamber during the whole irradiation process is proposed in order to compensate for such an error source. Temperature (and pressure) fluctuations may also arise from the performance-limited calibration room air conditioning system. As an open ionization chamber, the monitor chamber does require calculation of a correction factor for the effects of temperature and pressure on air density. Sending and processing data from all related instruments (electrometer, thermometer and barometer) can be more easily achieved by interfacing them to a host computer running a specially developed algorithm in the LabVIEW™ environment, which will not only apply the proper correction factors at runtime, but also determine the exact length of time to reach a desired condition, which can be: a time period, collected charge, or air kerma, based on the previous calibration of the whole system using a reference chamber traceable to primary standard dosimetry laboratories. When performing such a calibration, two temperature sensors (secondary standard thermistors) are used simultaneously, one for the transmission chamber and one for the reference chamber. As the substitution method is used during actual customer calibrations, the readings from the second thermistor can also be used when desired for further corrections. Use of the LabVIEW™ programming language allowed for a shorter development time, and it is also extremely convenient when improvements and modifications are called for. (author)
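
    The air-density correction for an open (vented) ionization chamber is standard and worth writing out; in Python, with reference conditions assumed here to be 20 °C and 101.325 kPa:

        def k_tp(t_c, p_kpa, t_ref=20.0, p_ref=101.325):
            """Temperature-pressure factor for a vented ion chamber:
            k_TP = ((273.15 + T) / (273.15 + T_ref)) * (P_ref / P)."""
            return ((273.15 + t_c) / (273.15 + t_ref)) * (p_ref / p_kpa)

        # corrected_reading = raw_reading * k_tp(22.4, 94.1)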

  10. Correction

    Directory of Open Access Journals (Sweden)

    2012-01-01

    Full Text Available Regarding Gorelik, G., & Shackelford, T.K. (2011). Human sexual conflict from molecules to culture. Evolutionary Psychology, 9, 564–587: The authors wish to correct an omission in citation to the existing literature. In the final paragraph on p. 570, we neglected to cite Burch and Gallup (2006) [Burch, R. L., & Gallup, G. G., Jr. (2006). The psychobiology of human semen. In S. M. Platek & T. K. Shackelford (Eds.), Female infidelity and paternal uncertainty (pp. 141–172). New York: Cambridge University Press.]. Burch and Gallup (2006) reviewed the relevant literature on FSH and LH discussed in this paragraph, and should have been cited accordingly. In addition, Burch and Gallup (2006) should have been cited as the originators of the hypothesis regarding the role of FSH and LH in the semen of rapists. The authors apologize for this oversight.

  11. Correction

    CERN Multimedia

    2002-01-01

    The photo on the second page of the Bulletin n°48/2002, from 25 November 2002, illustrating the article «Spanish Visit to CERN» was published with a wrong caption. We would like to apologise for this mistake and so publish it again with the correct caption.   The Spanish delegation, accompanied by Spanish scientists at CERN, also visited the LHC superconducting magnet test hall (photo). From left to right: Felix Rodriguez Mateos of CERN LHC Division, Josep Piqué i Camps, Spanish Minister of Science and Technology, César Dopazo, Director-General of CIEMAT (Spanish Research Centre for Energy, Environment and Technology), Juan Antonio Rubio, ETT Division Leader at CERN, Manuel Aguilar-Benitez, Spanish Delegate to Council, Manuel Delfino, IT Division Leader at CERN, and Gonzalo León, Secretary-General of Scientific Policy to the Minister.

  12. Correction

    Directory of Open Access Journals (Sweden)

    2014-01-01

    Full Text Available Regarding Tagler, M. J., and Jeffers, H. M. (2013). Sex differences in attitudes toward partner infidelity. Evolutionary Psychology, 11, 821–832: The authors wish to correct values in the originally published manuscript. Specifically, incorrect 95% confidence intervals around the Cohen's d values were reported on page 826 of the manuscript, where we reported the within-sex simple effects for the significant Participant Sex × Infidelity Type interaction (first paragraph), and for attitudes toward partner infidelity (second paragraph). Corrected values are presented in bold below. The authors would like to thank Dr. Bernard Beins at Ithaca College for bringing these errors to our attention. Men rated sexual infidelity significantly more distressing (M = 4.69, SD = 0.74) than they rated emotional infidelity (M = 4.32, SD = 0.92), F(1, 322) = 23.96, p < .001, d = 0.44, 95% CI [0.23, 0.65], but there was little difference between women's ratings of sexual (M = 4.80, SD = 0.48) and emotional infidelity (M = 4.76, SD = 0.57), F(1, 322) = 0.48, p = .29, d = 0.08, 95% CI [−0.10, 0.26]. As expected, men rated sexual infidelity (M = 1.44, SD = 0.70) more negatively than they rated emotional infidelity (M = 2.66, SD = 1.37), F(1, 322) = 120.00, p < .001, d = 1.12, 95% CI [0.85, 1.39]. Although women also rated sexual infidelity (M = 1.40, SD = 0.62) more negatively than they rated emotional infidelity (M = 2.09, SD = 1.10), this difference was not as large, and thus in the direction supportive of evolutionary theory, F(1, 322) = 72.03, p < .001, d = 0.77, 95% CI [0.60, 0.94].

  13. Setup accuracy of stereoscopic X-ray positioning with automated correction for rotational errors in patients treated with conformal arc radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    Soete, Guy; Verellen, Dirk; Tournel, Koen; Storme, Guy

    2006-01-01

    We evaluated setup accuracy of NovalisBody stereoscopic X-ray positioning with automated correction for rotational errors with the Robotics Tilt Module in patients treated with conformal arc radiotherapy for prostate cancer. The correction of rotational errors was shown to reduce random and systematic errors in all directions. (NovalisBody™ and Robotics Tilt Module™ are products of BrainLAB A.G., Heimstetten, Germany)

  14. Automated geographic registration and radiometric correction for UAV-based mosaics

    Science.gov (United States)

    Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties to s...

  15. Correcting Inconsistencies and Errors in Bacterial Genome Metadata Using an Automated Curation Tool in Excel (AutoCurE).

    Science.gov (United States)

    Schmedes, Sarah E; King, Jonathan L; Budowle, Bruce

    2015-01-01

    Whole-genome data are invaluable for large-scale comparative genomic studies. Current sequencing technologies have made it feasible to sequence entire bacterial genomes with relative ease and speed at a substantially reduced cost per nucleotide, hence cost per genome. More than 3,000 bacterial genomes have been sequenced and are available at the finished status. Publicly available genomes can be readily downloaded; however, there are challenges to verify the specific supporting data contained within the download and to identify errors and inconsistencies that may be present within the organizational data content and metadata. AutoCurE, an automated tool for bacterial genome database curation in Excel, was developed to facilitate local database curation of supporting data that accompany downloaded genomes from the National Center for Biotechnology Information. AutoCurE provides an automated approach to curate local genomic databases by flagging inconsistencies or errors by comparing the downloaded supporting data to the genome reports to verify genome names, RefSeq accession numbers, the presence of archaea, BioProject/UIDs, and sequence file descriptions. Flags are generated for nine metadata fields if there are inconsistencies between the downloaded genomes and genome reports and if erroneous or missing data are evident. AutoCurE is an easy-to-use tool for local database curation of large-scale genome data prior to downstream analyses.

  16. A few more comments on secularly growing loop corrections in strong electric fields

    International Nuclear Information System (INIS)

    Akhmedov, E.T.; Popov, F.K.

    2015-01-01

    We extend the observations of our previous paper http://dx.doi.org/10.1007/JHEP09(2014)071 [http://arxiv.org/abs/1405.5285]. In particular, we show that the secular growth of the loop corrections to the two-point correlation functions is gauge independent: we observe the same growth in the case of the static gauge for the constant background electric field. Furthermore, we solve the kinetic equation describing photon production from the background fields, which was derived in our previous paper and allows one to sum up the leading secularly growing corrections from all loops. Finally, we show that in the constant electric field background the one-loop correction to the current of the produced pairs is not zero: it also grows with time and violates the time-translational and time-reversal invariance of QED on the constant electric field background.

  17. Thermal corrections to Rényi entropies for conformal field theories

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, Christopher P.; Nian, Jun [C. N. Yang Institute for Theoretical Physics, Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794 (United States)

    2015-06-03

    We compute thermal corrections to Rényi entropies of d-dimensional conformal field theories on spheres. Consider the nth Rényi entropy for a cap of opening angle 2θ on S^(d-1). From a Boltzmann sum decomposition and the operator-state correspondence, the leading correction is related to a certain two-point correlation function of the operator (not equal to the identity) with smallest scaling dimension. More specifically, via a conformal map, the correction can be expressed in terms of the two-point function on a certain conical space with opening angle 2πn. In the case of free conformal field theories, this two-point function can be computed explicitly using the method of images. We perform the computation for the conformally coupled scalar. From the n→1 limit of our results, we extract the leading thermal correction to the entanglement entropy, reproducing results of arXiv:1407.1358.

  18. High magnetic field multipoles generated by superconductor magnetization within a set of nested superconducting correction coils

    International Nuclear Information System (INIS)

    Green, M.A.

    1990-04-01

    Correction elements in colliding beam accelerators such as the SSC can be the source of undesirable higher magnetic field multipoles due to magnetization of the superconductor within the corrector. Quadrupole and sextupole correctors located within the main dipole will produce sextupole and decapole due to magnetization of the superconductor within the correction coils. Lumped nested correction coils can produce a large number of skew and normal magnetization multipoles which may have an adverse effect on a stored beam at injection into a high energy colliding beam machine such as the SSC. 6 refs., 2 figs., 2 tabs

  19. Higher magnetic field multipoles generated by superconductor magnetization within a set of nested superconducting correction coils

    International Nuclear Information System (INIS)

    Green, M.A.

    1990-01-01

    Correction elements in colliding beam accelerators such as the Superconducting Super Collider (SSC) can be the source of undesirable higher magnetic field multipoles due to magnetization of the superconductor within the corrector. Quadrupole and sextupole correctors located within the main dipole will produce sextupole and decapole due to magnetization of the superconductor within the correction coils. Lumped nested correction coils can produce a large number of skew and normal magnetization multipoles which may have an adverse effect on a stored beam at injection into a high energy colliding beam machine such as the SSC. Multipole magnetization field components have been measured within the HERA storage ring dipole magnets. Calculations of these components using the SCMAG04 code, which agree substantially with the measured multipoles, are presented in the report. As a result, in the proposed continuous correction winding for the SSC, dipoles have been replaced with lumped correction elements every six dipole magnets (about 120 meters apart). Nested lumped correction elements will also produce undesirable higher magnetization multipoles. This report shows a method by which the higher multipole generated by nested correction elements can be identified. (author)

  20. Massive Corrections to Entanglement in Minimal E8 Toda Field Theory

    Directory of Open Access Journals (Sweden)

    Olalla A. Castro-Alvaredo

    2017-02-01

    Full Text Available In this letter we study the exponentially decaying corrections to saturation of the second Rényi entropy of one interval of length L in minimal E8 Toda field theory. It has been known for some time that the entanglement entropy of a massive quantum field theory in 1+1 dimensions saturates to a constant value for m1 L >> 1, where m1 is the mass of the lightest particle in the spectrum. Subsequently, results by Cardy, Castro-Alvaredo and Doyon have shown that there are exponentially decaying corrections to this behaviour which are characterised by Bessel functions with arguments proportional to m1 L. For the von Neumann entropy the leading correction to saturation takes the precise universal form −K0(2 m1 L)/8, whereas for the Rényi entropies leading corrections which are proportional to K0(m1 L) are expected. Recent numerical work by Pálmai for the second Rényi entropy of minimal E8 Toda has found next-to-leading order corrections decaying as exp(−2 m1 L) rather than the expected exp(−m1 L). In this paper we investigate the origin of this result and show that it is incorrect. An exact form factor computation of correlators of branch point twist fields reveals that the leading corrections are proportional to K0(m1 L) as expected.

  1. Error-correction learning for artificial neural networks using the Bayesian paradigm. Application to automated medical diagnosis.

    Science.gov (United States)

    Belciug, Smaranda; Gorunescu, Florin

    2014-12-01

    Automated medical diagnosis models are now ubiquitous, and research for developing new ones is constantly growing. They play an important role in medical decision-making, helping physicians to provide a fast and accurate diagnosis. Due to their adaptive learning and nonlinear mapping properties, artificial neural networks are widely used to support the human decision capabilities, avoiding variability in practice and errors based on lack of experience. Among the most common learning approaches, one can mention either the classical back-propagation algorithm based on the partial derivatives of the error function with respect to the weights, or the Bayesian learning method based on the posterior probability distribution of weights, given training data. This paper proposes a novel training technique gathering together the error-correction learning, the posterior probability distribution of weights given the error function, and the Goodman-Kruskal Gamma rank correlation, assembling them in a Bayesian learning strategy. This study had two main purposes: firstly, to develop a novel learning technique based on both the Bayesian paradigm and the error back-propagation, and secondly, to assess its effectiveness. The proposed model's performance is compared with those obtained by traditional machine learning algorithms using real-life breast and lung cancer, diabetes, and heart attack medical databases. Overall, the statistical comparison results indicate that the novel learning approach outperforms the conventional techniques in almost all respects. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. In situ correction of field errors induced by temperature gradient in cryogenic undulators

    Directory of Open Access Journals (Sweden)

    Takashi Tanaka

    2009-12-01

    Full Text Available A new technique of undulator field correction for cryogenic permanent magnet undulators (CPMUs) is proposed to correct the phase error induced by temperature gradient. This technique takes advantage of two important instruments: one is the in-vacuum self-aligned field analyzer with a laser instrumentation system to precisely measure the distribution of the magnetic field generated by the permanent magnet arrays placed in vacuum, and the other is the differential adjuster to correct the local variation of the magnet gap. The details of the two instruments are described together with the method of how to analyze the field measurement data and deduce the gap variation along the undulator axis. The correction technique was applied to a CPMU with a length of 1.7 m and a magnetic period of 14 mm. It was found that the phase error induced during the cooling process was attributable to local gap variations of around 30 μm, which were then corrected by the differential adjuster.

  3. Thermal corrections to the Casimir energy in a general weak gravitational field

    Science.gov (United States)

    Nazari, Borzoo

    2016-12-01

    We calculate finite-temperature corrections to the energy of the Casimir effect of two conducting parallel plates in a general weak gravitational field. After solving the Klein-Gordon equation inside the apparatus, mode frequencies inside the apparatus are obtained in terms of the parameters of the weak background. Using Matsubara's approach to quantum statistical mechanics, gravity-induced thermal corrections of the energy density are obtained. Well-known weak static and stationary gravitational fields are analyzed, and it is found that in the low temperature limit the energy of the system increases compared to that in the zero temperature case.

  4. Near-field antenna testing using the Hewlett Packard 8510 automated network analyzer

    Science.gov (United States)

    Kunath, Richard R.; Garrett, Michael J.

    1990-01-01

    Near-field antenna measurements were made using a Hewlett-Packard 8510 automated network analyzer. This system features measurement sensitivity better than -90 dBm at measurement speeds of one data point per millisecond in the fast data acquisition mode. The system was configured using external, even-harmonic mixers and a fiber-optic distributed local oscillator signal. Additionally, the time-domain capability of the HP8510 made it possible to generate far-field diagnostic results immediately after data acquisition without the use of an external computer.
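
    The near-field-to-far-field step behind such immediate diagnostics is, for a planar scan, essentially a 2-D Fourier transform of the sampled tangential field. The Python sketch below states that textbook angular-spectrum relation (probe correction and obliquity factors omitted) and is not HP's implementation.

        import numpy as np

        def planar_nf2ff(e_t, dx_m, wavelength_m):
            """Unnormalized |far-field| pattern from tangential near-field
            samples e_t on a plane with spacing dx_m; NaN outside the
            radiating (visible) region of k-space."""
            spec = np.fft.fftshift(np.fft.fft2(e_t))
            k = 2 * np.pi / wavelength_m
            kx = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(e_t.shape[0], dx_m))
            ky = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(e_t.shape[1], dx_m))
            visible = kx[:, None] ** 2 + ky[None, :] ** 2 <= k ** 2
            return np.where(visible, np.abs(spec), np.nan)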

  5. Field test of the PNNL Automated Radioxenon Sampler/Analyzer (ARSA)

    International Nuclear Information System (INIS)

    Lagomarsino, R.J.; Ku, E.; Latner, N.; Sanderson, C.G.

    1998-07-01

    As part of the requirements of the Comprehensive Test Ban Treaty (CTBT), the Automated Radioxenon Sampler/Analyzer (ARSA) was designed and engineered by the Pacific Northwest National Laboratory (PNNL). The instrument is to provide near real-time detection and measurement of the radioxenons released into the atmosphere after a nuclear test. Forty-six field tests, designed to determine the performance of the ARSA prototype under simulated field conditions, were conducted at EML from March to December 1997. This final report contains detailed results of the tests with recommendations for improvements in instrument performance.

  6. Field test of the PNNL Automated Radioxenon Sampler/Analyzer (ARSA)

    Energy Technology Data Exchange (ETDEWEB)

    Lagomarsino, R.J.; Ku, E.; Latner, N.; Sanderson, C.G.

    1998-07-01

    As part of the requirements of the Comprehensive Test Ban Treaty (CTBT), the Automated Radioxenon Sampler/Analyzer (ARSA) was designed and engineered by the Pacific Northwest National Laboratory (PNNL). The instrument is to provide near real-time detection and measurement of the radioxenons released into the atmosphere after a nuclear test. Forty-six field tests, designed to determine the performance of the ARSA prototype under simulated field conditions, were conducted at EML from March to December 1997. This final report contains detailed results of the tests with recommendations for improvements in instrument performance.

  7. Copula-based assimilation of radar and gauge information to derive bias-corrected precipitation fields

    Directory of Open Access Journals (Sweden)

    S. Vogl

    2012-07-01

    Full Text Available This study addresses the problem of combining radar information and gauge measurements. Gauge measurements are the best available source of absolute rainfall intensity, although their spatial availability is limited. Precipitation information obtained by radar mimics the spatial patterns well but is biased in its absolute values.

    In this study copula models are used to describe the dependence structure between gauge observations and rainfall derived from radar reflectivity at the corresponding grid cells. After appropriate time series transformation to generate "iid" variates, only the positive pairs (radar > 0, gauge > 0) of the residuals are considered. As not each grid cell can be assigned to one gauge, the integration of point information, i.e. gauge rainfall intensities, is achieved by considering the structure and the strength of dependence between the radar pixels and all the gauges within the radar image. Two different approaches, namely Maximum Theta and Multiple Theta, are presented. They finally allow for generating precipitation fields that mimic the spatial patterns of the radar fields and correct them for biases in their absolute rainfall intensities. The performance of the approach, which can be seen as a bias correction for radar fields, is demonstrated for the Bavarian Alps. The bias-corrected rainfall fields are compared to a field of interpolated gauge values (ordinary kriging) and are validated with available gauge measurements. The simulated precipitation fields are compared to an operationally corrected radar precipitation field (RADOLAN). The copula-based approach performs similarly well as indicated by different validation measures and successfully corrects for errors in the radar precipitation.
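
    The strength-of-dependence parameter at the heart of both approaches can be estimated per gauge from rank correlation; for one-parameter Archimedean families there are closed forms linking Kendall's tau to theta. The Python sketch below shows only that generic step, not the paper's Maximum Theta or Multiple Theta logic for combining per-gauge values.

        import numpy as np
        from scipy.stats import kendalltau

        def theta_from_tau(radar_pos, gauge_pos, family="gumbel"):
            """Dependence parameter from positive (radar > 0, gauge > 0)
            pairs: Gumbel theta = 1/(1 - tau); Clayton theta = 2*tau/(1 - tau)."""
            tau, _ = kendalltau(radar_pos, gauge_pos)
            if family == "gumbel":
                return 1.0 / (1.0 - tau)
            return 2.0 * tau / (1.0 - tau)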

  8. Mapping and correcting respiration-induced field changes in the brain using fluorine field probes

    DEFF Research Database (Denmark)

    Andersen, Mads; Madsen, Kristoffer; Hanson, Lars G.

    2014-01-01

    Purpose: Breathing-induced dynamic B0 field perturbations in the head can lead to artefacts in ultra-high-field MR by causing line broadening in spectroscopy and signal dropout, ghosting, displacement artifacts and blurring in imaging. It has recently been proposed to continuously stabilize the magnetic field by real-time updating of the shim fields, based on synchronous field measurements with external probes [1,2]. A thorough analysis of how accurately such field measurements at a few (e.g. 16) positions outside the head can reflect the spatially varying dynamic fields inside the head is currently [...] The acquisition time for a single volume was 11 s, which was easily tolerated during a breath hold. The field probes were triggered to perform a measurement (3 ms duration) 75 ms prior to the first excitation in each dynamic, when no scanner-generated RF pulses or gradients were applied. In case of real-time shimming, the scanner [...]

  9. Mapping and correcting respiration-induced field changes in the brain using fluorine field probes

    DEFF Research Database (Denmark)

    Andersen, Mads; Madsen, Kristoffer H; Hanson, L.G.

    The experiments were performed on a 7T MRI system (Philips Healthcare, Best, NL) using a 32-channel Nova Medical head coil. Fourteen fluorine T/R NMR field probes [3] were firmly distributed around the transmit/receive head coil. A stand-alone spectrometer [4] digitized the field probe signals, and calculated field...

  10. Treatment planning for SBRT using automated field delivery: A case study

    International Nuclear Information System (INIS)

    Ritter, Timothy A.; Owen, Dawn; Brooks, Cassandra M.; Stenmark, Matthew H.

    2015-01-01

    Stereotactic body radiation therapy (SBRT) treatment planning and delivery can be accomplished using a variety of techniques that achieve highly conformal dose distributions. Herein, we describe a template-based automated treatment field approach that enables rapid delivery of more than 20 coplanar fields. A case study is presented to demonstrate how modest adaptations to traditional SBRT planning can be implemented to take clinical advantage of this technology. Treatment was planned for a left-sided lung lesion adjacent to the chest wall using 25 coplanar treatment fields spaced at 11° intervals. The plan spares the contralateral lung and is in compliance with the conformality standards set forth in Radiation Therapy Oncology Group protocol 0915, and the dose tolerances found in the report of the American Association of Physicists in Medicine Task Group 101. Using a standard template, treatment planning was accomplished in less than 20 minutes, and each 10 Gy fraction was delivered in approximately 5.4 minutes. For those centers equipped with linear accelerators capable of automated treatment field delivery, the use of more than 20 coplanar fields is a viable SBRT planning approach and yields excellent conformality and quality combined with rapid planning and treatment delivery. Although the case study discusses a laterally located lung lesion, this technique can be applied to centrally located tumors with similar results

  11. Consequences of the center-of-mass correction in nuclear mean-field models

    International Nuclear Information System (INIS)

    Bender, M.; Rutz, K.; Reinhard, P.G.; Maruhn, J.A.

    2000-01-01

    We study the influence of the scheme for the correction of spurious center-of-mass motion on the fit of effective interactions for self-consistent nuclear mean-field calculations. We find that interactions with a very simple center-of-mass correction have significantly larger surface coefficients than interactions for which the center-of-mass correction was calculated for the actual many-body state during the fit. The reason is that the effective interaction has to counteract the incorrect trend with nucleon number shared by all simplified schemes for the center-of-mass correction, which builds a wrong trend with mass number into the effective interaction itself. The effect becomes clearly visible when looking at the deformation energy of strongly deformed systems, e.g. superdeformed states or fission barriers of heavy nuclei. (orig.)

  12. Implementation and Application of PSF-Based EPI Distortion Correction to High Field Animal Imaging

    Directory of Open Access Journals (Sweden)

    Dominik Paul

    2009-01-01

    The purpose of this work is to demonstrate the functionality and performance of a PSF-based geometric distortion correction for high-field functional animal EPI. The EPI method was extended to measure the PSF, and a postprocessing chain was implemented in Matlab for offline distortion correction. The correction procedure was applied to phantom and in vivo imaging of mice and rats at 9.4 T using different SE-EPI and DWI-EPI protocols. The results show a significant improvement in image quality for single- and multishot EPI. Using a reduced FOV in the PSF encoding direction reduced the acquisition time for PSF data by an acceleration factor of 2 or 4, without affecting the correction quality.

  13. Improved correction methods for field measurements of particulate light backscattering in turbid waters.

    Science.gov (United States)

    Doxaran, David; Leymarie, Edouard; Nechad, Bouchra; Dogliotti, Ana; Ruddick, Kevin; Gernez, Pierre; Knaeps, Els

    2016-02-22

    Monte Carlo simulations are used to compute the uncertainty associated with light backscattering measurements in turbid waters using the ECO-BB (WET Labs) and Hydroscat (HOBI Labs) scattering sensors. ECO-BB measurements provide an accurate estimate of the particulate volume scattering coefficient after correction for absorption along the short instrument pathlength. For Hydroscat measurements, because of a longer photon pathlength, both absorption and scattering effects must be corrected for. As the standard (sigma) correction potentially leads to large errors, an improved correction method is developed and then validated using field measurements of inherent and apparent optical properties carried out in turbid estuarine waters. Conclusions are also drawn to guide the development of future short-pathlength backscattering sensors for turbid waters.

  14. Field of view extension and truncation correction for MR-based human attenuation correction in simultaneous MR/PET imaging

    International Nuclear Information System (INIS)

    Blumhagen, Jan O.; Ladebeck, Ralf; Fenchel, Matthias; Braun, Harald; Quick, Harald H.; Faul, David; Scheffler, Klaus

    2014-01-01

    Purpose: In quantitative PET imaging, it is critical to accurately measure and compensate for the attenuation of the photons absorbed in the tissue. While in PET/CT the linear attenuation coefficients can be easily determined from a low-dose CT-based transmission scan, in whole-body MR/PET the computation of the linear attenuation coefficients is based on the MR data. However, a constraint of MR-based attenuation correction (AC) is the MR-inherent field-of-view (FoV) limitation due to static magnetic field (B0) inhomogeneities and gradient nonlinearities. Therefore, the MR-based human AC map may be truncated or geometrically distorted toward the edges of the FoV and, consequently, the PET reconstruction with MR-based AC may be biased. This is especially of impact laterally, where the patient's arms rest beside the body and are not fully captured. Methods: A method is proposed to extend the MR FoV by determining an optimal readout gradient field which locally compensates B0 inhomogeneities and gradient nonlinearities. This technique was used to reduce truncation in the AC maps of 12 patients, and the impact on PET quantification was analyzed and compared to truncated data without the FoV extension and, additionally, to an established approach of PET-based FoV extension. Results: The truncation artifacts in the MR-based AC maps were successfully reduced in all patients, and the mean body volume was thereby increased by 5.4%. In some cases large patient-dependent changes in SUV of up to 30% were observed in individual lesions when compared to the standard truncated attenuation map. Conclusions: The proposed technique successfully extends the MR FoV in MR-based attenuation correction and shows an improvement of PET quantification in whole-body MR/PET hybrid imaging. In comparison to the PET-based completion of the truncated body contour, the proposed method is also applicable to specialized PET tracers with little uptake in the arms and might reduce the

  15. Possibilities of the common research-development action in the field of automated logistical engines

    Directory of Open Access Journals (Sweden)

    Pap Lajos

    2003-12-01

    The paper briefly presents the R&D cooperation of the Department of Materials Handling and Logistics and the Departments of Automation. The main fields of cooperation are introduced. Different kinds of linear motor (LM) drives are being developed and tested for warehouse and rolling-conveyor systems. Modern control strategies using AI methods are being investigated and tested for automated guided vehicles. Wireless communication methods are being investigated and developed for mobile material handling devices. Application possibilities of voice recognition and image processing are being tested for the control of material handling robots and devices. Process visualization programs are being developed and investigated. A multi-level industrial communication system is being developed for the laboratories of the cooperating departments.

  16. Coulomb’s law corrections and fermion field localization in a tachyonic de Sitter thick braneworld

    Energy Technology Data Exchange (ETDEWEB)

    Cartas-Fuentevilla, Roberto; Escalante, Alberto [Instituto de Física, Benemérita Universidad Autónoma de Puebla,Apdo. postal J-48, 72570 Puebla, Pue. (Mexico); Germán, Gabriel [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México,Apdo. Postal 48-3, 62251 Cuernavaca, Morelos (Mexico); Rudolf Peierls Centre for Theoretical Physics, University of Oxford, 1 Keble Road,Oxford, OX1 3NP (United Kingdom); Herrera-Aguilar, Alfredo [Instituto de Física, Benemérita Universidad Autónoma de Puebla,Apdo. postal J-48, 72570 Puebla, Pue. (Mexico); Institutode Física y Matemáticas, Universidad Michoacana de San Nicolás de Hidalgo,Edificio C-3, Ciudad Universitaria, CP 58040, Morelia, Michoacán (Mexico); Mora-Luna, Refugio Rigel [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México,Apdo. Postal 48-3, 62251 Cuernavaca, Morelos (Mexico)

    2016-05-11

    Following recent studies which show that it is possible to localize gravity as well as scalar and gauge vector fields in a tachyonic de Sitter thick braneworld, we investigate the solution of the gauge hierarchy problem, the localization of fermion fields in this model, the recovery of the Coulomb law in the non-relativistic limit of the Yukawa interaction between bulk fermions and gauge bosons localized on the brane, and confront the predicted 5D corrections to the photon mass with its upper experimental/observational bounds, finding the model physically viable since it passes these tests. In order to achieve the latter aims we first consider the Yukawa interaction term MF(T)Ψ̄Ψ between the fermionic and the tachyonic scalar fields in the action and analyze four distinct tachyonic functions F(T) that lead to four different structures of the respective fermionic mass spectra with different physics. In particular, localization of the massless left-chiral fermion zero mode is possible for three of these cases. We further analyze the phenomenology of these Yukawa interactions among fermion fields and gauge bosons localized on the brane and obtain the crucial and necessary information to compute the corrections to Coulomb's law coming from massive KK vector modes in the non-relativistic limit. These corrections are exponentially suppressed due to the presence of the mass gap in the mass spectrum of the bulk gauge vector field. From our results we conclude that corrections to Coulomb's law in the thin brane limit have the same form (up to a numerical factor) as far as the left-chiral massless fermion field is localized on the brane. Finally we compute the corrections to Coulomb's law for an arbitrarily thick brane scenario, which can be interpreted as 5D corrections to the photon mass. By performing consistent estimations with brane phenomenology, we found that the predicted corrections to the photon mass, which are well bounded by the experimentally

  17. Normalized gradient fields cross-correlation for automated detection of prostate in magnetic resonance images

    Science.gov (United States)

    Fotin, Sergei V.; Yin, Yin; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter L.

    2012-02-01

    Fully automated prostate segmentation helps to address several problems in prostate cancer diagnosis and treatment: it can assist in objective evaluation of multiparametric MR imagery, provides a prostate contour for MR-ultrasound (or CT) image fusion for computer-assisted image-guided biopsy or therapy planning, may facilitate reporting, and enables direct prostate volume calculation. Among the challenges in automated analysis of MR images of the prostate are the variations of overall image intensities across scanners, the presence of a nonuniform multiplicative bias field within scans, and differences in acquisition setup. Furthermore, images acquired with an endorectal coil suffer from localized high-intensity artifacts at the posterior part of the prostate. In this work, a three-dimensional method for fast automated prostate detection based on normalized gradient fields cross-correlation, insensitive to intensity variations and coil-induced artifacts, is presented and evaluated. The components of the method, offline template learning and the localization algorithm, are described in detail. The method was validated on a dataset of 522 T2-weighted MR images acquired at the National Cancer Institute, USA, which was split in two halves for development and testing. In addition, a second dataset of 29 MR exams from the Centre d'Imagerie Médicale Tourville, France, was used to test the algorithm. The 95% confidence intervals for the mean Euclidean distance between automatically and manually identified prostate centroids were 4.06 ± 0.33 mm and 3.10 ± 0.43 mm for the first and second test datasets, respectively. Moreover, the algorithm provided the centroid within the true prostate volume in 100% of images from both datasets. The obtained results demonstrate the high utility of the detection method for fully automated prostate segmentation.
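
    The similarity measure named in the title can be sketched compactly. The following 2D Python illustration (the paper works in 3D, and its template learning is not reproduced) builds normalized gradient fields and locates a template by brute-force cross-correlation of squared gradient dot products, a measure insensitive to monotonic intensity variations:

        import numpy as np

        def normalized_gradient_field(img, eps=1e-2):
            # n(x) = grad I / sqrt(|grad I|^2 + eps^2): gradient direction,
            # robust to multiplicative intensity changes and bias fields
            gy, gx = np.gradient(img.astype(float))
            mag = np.sqrt(gx**2 + gy**2 + eps**2)
            return gx / mag, gy / mag

        def ngf_cross_correlation(image, template):
            # slide the template's NGF over the image's NGF and sum the
            # squared dot products; the maximum marks the detected position
            # (brute-force loops for clarity, not speed)
            ix, iy = normalized_gradient_field(image)
            tx, ty = normalized_gradient_field(template)
            th, tw = template.shape
            best, best_pos = -np.inf, (0, 0)
            for r in range(image.shape[0] - th + 1):
                for c in range(image.shape[1] - tw + 1):
                    dot = ix[r:r+th, c:c+tw] * tx + iy[r:r+th, c:c+tw] * ty
                    score = np.sum(dot**2)
                    if score > best:
                        best, best_pos = score, (r, c)
            return best_pos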

  18. Saturne II: characteristics of the proton beam, field qualities and corrections, acceleration of the polarized protons

    International Nuclear Information System (INIS)

    Laclare, J.-L.

    1978-01-01

    The specifications of Saturne II are summarized: performance of the injection system, quality of the guidance field (magnetic measurements and multipolar corrections), transverse and longitudinal instabilities, and characteristics of the beam stored in the machine and of the extracted beam. The problem of depolarization along the acceleration cycle is briefly discussed (1 or 2% between injection and 3 GeV). [fr]

  19. Integrals of random fields treated by the model correction factor method

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...

  20. Born--Infeld theory of electroweak and gravitational fields: Possible correction to Newton and Coulomb laws

    OpenAIRE

    Palatnik, Dmitriy

    2002-01-01

    In this note one suggests a possibility of direct observation of the $\theta$-parameter, introduced in the Born--Infeld theory of electroweak and gravitational fields, developed in quant-ph/0202024. Namely, one may treat $\theta$ as a universal constant, responsible for correction to the Coulomb and Newton laws, allowing direct interaction between electrical charges and masses.

  1. Depolarization corrections to the coercive field in thin-film ferroelectrics

    CERN Document Server

    Dawber, M; Littlewood, P B; Scott, J F

    2003-01-01

    Empirically, the coercive field needed to reverse the polarization in a ferroelectric increases with decreasing film thickness. For ferroelectric films of 100 μm to 100 nm in thickness the coercive field has been successfully described by a semi-empirical scaling law. Accounting for depolarization corrections, we show that this scaling behaviour is consistent with field measurements of ultrathin ferroelectric capacitors down to one nanometre in film thickness. Our results also indicate that the minimum film thickness, determined by a polarization instability, can be tuned by the choice of electrodes, and recommendations for next-generation ferroelectric devices are discussed. (letter to the editor)

  2. Far-field beam shaping through static wavefront correction in the near field on the HELEN laser

    Science.gov (United States)

    Bett, Thomas H.; Hopps, N. W.; Nolan, J. R.

    2002-10-01

    This report discusses the design and installation of a phase optic inserted in the near field of the HELEN high power glass laser. The element is designed to shape the intensity distribution at the focal spot of the laser to produce an increase in the peak intensity through correction of static and thermally induced wavefront errors on the beam. A phase element has been fabricated commercially using a magneto-rheological finishing tool. Test data is presented.

  3. Strong-field ionization of polar molecules: Stark-shift-corrected strong-field approximation

    DEFF Research Database (Denmark)

    Dimitrovski, Darko; Martiny, Christian P. J.; Madsen, Lars Bojer

    2010-01-01

    We extend the molecular strong-field approximation for ionization, in the tunneling limit, to include systematically the linear and quadratic static Stark shifts of the ionizing molecular orbital. This approach, simple to implement, is capable of describing the essential physics of the process of...

  4. Differential Effect of Correct Name Translation on Human and Automated Judgments of Translation Acceptability: A Pilot Study

    National Research Council Canada - National Science Library

    Vanni, Michelle; Walrath, James

    2008-01-01

    This study proffers two important findings: (1) automated machine translation (MT) evaluation is insensitive to the cognitive gravitas of proper names, contributing to its weak modeling of human judgments of higher quality MT output...

  5. Local-field correction in the lattice dynamics of b.b.c. transition metals

    International Nuclear Information System (INIS)

    Onwuagba, B.N.

    1984-01-01

    It is shown that the off-diagonal components of the inverse dielectric matrix, which determine the local-field correction associated with s-d interactions, make contributions to the dynamical matrix for phonon dispersion in the body-centred cubic transition metals V, Nb and Ta that tend to cancel the Born-Mayer contribution, just as the diagonal components of the inverse dielectric matrix tend to cancel or screen the long-range (Coulombic) contribution. Numerical calculations show that the cancellation of the Born-Mayer contribution to the dynamical matrix by the local-field correction is such that the effective short-range interatomic potential turns out to be attractive rather than repulsive in these metals, and accounts for some peculiar shapes of the major soft modes observed in these metals.

  6. Markov random field and Gaussian mixture for segmented MRI-based partial volume correction in PET

    International Nuclear Information System (INIS)

    Bousse, Alexandre; Thomas, Benjamin A; Erlandsson, Kjell; Hutton, Brian F; Pedemonte, Stefano; Ourselin, Sébastien; Arridge, Simon

    2012-01-01

    In this paper we propose a segmented magnetic resonance imaging (MRI) prior-based maximum penalized likelihood deconvolution technique for positron emission tomography (PET) images. The model assumes the existence of activity classes that behave like a hidden Markov random field (MRF) driven by the segmented MRI. We utilize a mean field approximation to compute the likelihood of the MRF. We tested our method on both simulated and clinical data (brain PET) and compared our results with PET images corrected with the re-blurred Van Cittert (VC) algorithm, the simplified Guven (SG) algorithm and the region-based voxel-wise (RBV) technique. We demonstrate that our algorithm outperforms the VC algorithm, and outperforms the SG and RBV corrections when the segmented MRI is inconsistent (e.g. mis-segmentation, lesions, etc.) with the PET image. (paper)

  7. Scaling up Ecological Measurements of Coral Reefs Using Semi-Automated Field Image Collection and Analysis

    Directory of Open Access Journals (Sweden)

    Manuel González-Rivero

    2016-01-01

    Ecological measurements in marine settings are often constrained in space and time, with spatial heterogeneity obscuring broader generalisations. While advances in remote sensing, integrative modelling and meta-analysis enable generalisations from field observations, there is an underlying need for high-resolution, standardised and geo-referenced field data. Here, we evaluate a new approach aimed at optimising data collection and analysis to assess broad-scale patterns of coral reef community composition using automatically annotated underwater imagery, captured along 2 km transects. We validate this approach by investigating its ability to detect spatial (e.g., across regions) and temporal (e.g., over years) change, and by comparing automated annotation errors to those of multiple human annotators. Our results indicate that change of coral reef benthos can be captured at high resolution both spatially and temporally, with an average error below 5% among key benthic groups. Cover estimation errors using automated annotation varied between 2% and 12%, slightly larger than human errors (which varied between 1% and 7%), but small enough to detect significant changes among dominant groups. Overall, this approach allows rapid collection of in-situ observations at larger spatial scales (km) than previously possible, and provides a pathway to link, calibrate, and validate broader analyses across even larger spatial scales (10–10,000 km²).

  8. Automated disposal of produced water from a coalbed methane well field, a case history

    International Nuclear Information System (INIS)

    Luckianow, B.J.; Findley, M.L.; Paschal, W.T.

    1994-01-01

    This paper provides an overview of the automated disposal system for produced water designed and operated by Taurus Exploration, Inc. The presentation draws from Taurus' case study in the planning, design, construction, and operation of production water disposal facilities for the Mt. Olive well field, located in the Black Warrior Basin of Alabama. The common method for disposing of water produced from coalbed methane wells in the Warrior Basin is to discharge it into a receiving stream. The limiting factor in the discharge method is the capability of the receiving stream to assimilate the chloride component of the discharged water. During the winter and spring, the major tributaries of the Black Warrior River are capable of assimilating far more production water than operations can generate. During the summer and fall months, however, these same tributaries can approach near-zero flow, resulting in insufficient flow for dilution. During such periods, pumping shut-downs within the well field can be avoided by routing production waters into a storage facility. This paper discusses the automated production water disposal system on Big Sandy Creek designed and operated by Taurus. The system allows for continuous discharge to the receiving stream, thus taking full advantage of Big Sandy Creek's assimilative capacity, while providing for excess produced-water storage and future stream discharge.

  9. Compact and field-portable 3D printed shearing digital holographic microscope for automated cell identification.

    Science.gov (United States)

    Rawat, Siddharth; Komatsu, Satoru; Markman, Adam; Anand, Arun; Javidi, Bahram

    2017-03-20

    We propose a low-cost, compact, and field-portable 3D printed holographic microscope for automated cell identification based on a common-path shearing interferometer setup. Once a hologram is captured with the portable setup, a 3D reconstructed height profile of the cell is created. We extract several morphological cell features from the reconstructed 3D height profiles, including mean physical cell thickness, coefficient of variation, optical volume (OV) of the cell, projected area of the cell (PA), ratio of PA to OV, cell thickness kurtosis, cell thickness skewness, and the dry mass of the cell, for identification using the random forest (RF) classifier. The 3D printed prototype can serve as a low-cost alternative for the developing world, where access to laboratory facilities for disease diagnosis is limited. Additionally, a cell phone sensor is used to capture the digital holograms. This enables the user to send the acquired holograms over the internet to a computational device located remotely for cellular identification and classification (analysis). The 3D printed system presented in this paper can be used as a low-cost, stable, and field-portable digital holographic microscope as well as an automated cell identification system. To the best of our knowledge, this is the first research paper presenting automated cell identification using a low-cost 3D printed digital holographic microscopy setup based on common-path shearing interferometry.
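
    A minimal sketch of the classification stage, assuming a reconstructed height map per cell and scikit-learn for the random forest; the exact feature definitions are not given in the abstract and are approximated here, and the dry-mass feature (which needs a refractive-increment calibration) is omitted:

        import numpy as np
        from scipy.stats import kurtosis, skew
        from sklearn.ensemble import RandomForestClassifier

        def cell_features(height_map, pixel_area_um2):
            # morphological features from a reconstructed height profile;
            # the list follows the abstract, definitions are assumptions
            h = height_map[height_map > 0]            # pixels inside the cell
            pa = h.size * pixel_area_um2              # projected area
            ov = h.sum() * pixel_area_um2             # optical volume
            return [h.mean(), h.std() / h.mean(), ov, pa, pa / ov,
                    kurtosis(h), skew(h)]

        # X: feature rows for many cells, y: known cell classes
        # clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
        # predictions = clf.predict(X_test)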

  10. Effects of Field-Map Distortion Correction on Resting State Functional Connectivity MRI

    Directory of Open Access Journals (Sweden)

    Hiroki Togo

    2017-12-01

    Magnetic field inhomogeneities cause geometric distortions of the echo planar images used for functional magnetic resonance imaging (fMRI). To reduce this problem, distortion correction (DC) with a field map is widely used for both task and resting-state fMRI (rs-fMRI). Although DC with a field map has been reported to improve the quality of task fMRI, little is known about its effects on rs-fMRI. Here, we tested the influence of field-map DC on rs-fMRI results using two rs-fMRI datasets derived from 40 healthy subjects: one with DC (DC+) and the other without correction (DC−). Independent component analysis followed by the dual regression approach was used for evaluation of resting-state functional connectivity networks (RSN). We also computed the ratio of low-frequency to high-frequency signal power (0.01–0.1 Hz and above 0.1 Hz, respectively; the LFHF ratio) to assess the quality of rs-fMRI signals. Comparing RSN between the DC+ and DC− datasets, the default mode network showed more robust functional connectivity in the DC+ dataset than in the DC− dataset. The basal ganglia RSN showed some decreases in functional connectivity, primarily in white matter, indicating imperfect registration/normalization without DC. Supplementary seed-based and simulation analyses supported the utility of DC. Furthermore, we found a higher LFHF ratio after field-map correction in the anterior cingulate cortex, posterior cingulate cortex, ventral striatum, and cerebellum. In conclusion, field-map DC improved detection of functional connectivity derived from low-frequency rs-fMRI signals. We encourage researchers to include a DC step in the preprocessing pipeline of rs-fMRI analysis.
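
    The LFHF ratio can be computed directly from a voxel time series; a small sketch using Welch's power spectral density estimate (the estimator choice is an assumption, only the 0.1 Hz split is from the abstract):

        import numpy as np
        from scipy.signal import welch

        def lfhf_ratio(ts, tr):
            # power in 0.01-0.1 Hz divided by power above 0.1 Hz
            f, pxx = welch(ts, fs=1.0 / tr, nperseg=min(256, ts.size))
            lf = pxx[(f >= 0.01) & (f <= 0.1)].sum()
            hf = pxx[f > 0.1].sum()
            return lf / hf

        # voxel_ts: one voxel's time series, tr: repetition time in seconds
        # print(lfhf_ratio(voxel_ts, tr=2.0))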

  11. Susceptibility correction for improved tractography using high field DT-EPI

    Science.gov (United States)

    Pintjens, W.; Poot, D. H. J.; Verhoye, M.; Van Der Linden, A.; Sijbers, J.

    2008-03-01

    Diffusion Tensor Magnetic Resonance Imaging (DTI) is a well-known technique that can provide information about the neuronal fiber structure of the brain. However, since DTI requires a large amount of data, a high-speed MRI acquisition technique is needed to acquire these data within a reasonable time. Echo Planar Imaging (EPI) is a technique that provides the desired speed. Unfortunately, the advantage of speed is overshadowed by image artifacts, especially at high fields. EPI artifacts originate from susceptibility differences in adjacent tissues, and correction techniques are required to obtain reliable images. In this work, the fieldmap method, which measures the distortion effects, is optimized by using a nonlinear least-squares estimator for calculating pixel shifts. This method is tested on simulated data and proves to be more robust against noise than previously suggested methods. Another advantage of the new method is that other parameters, such as relaxation and the odd/even phase difference, are estimated as well. This new way of estimating the field map is demonstrated on a hardware phantom, which consists of parallel bundles made of woven strands of Micro Dyneema fibers. Using a modified EPI sequence, reference data were measured for the calculation of fieldmaps. This allows one to reposition the pixels in order to obtain images with fewer distortions. The correction is applied to non-diffusion-weighted as well as diffusion-weighted images, and fiber tracking is performed on the corrected data.
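
    For intuition, the fieldmap-based correction amounts to shifting intensities along the phase-encoding direction by an off-resonance-dependent number of pixels; a 1D sketch (the sign convention is an assumption, and the paper's nonlinear least-squares estimation of the field map itself is not reproduced):

        import numpy as np

        def unwarp_column(col, fieldmap_hz, esp, n_pe):
            # EPI voxel shift along phase encoding: shift = df * (N_pe * esp),
            # with df the off-resonance in Hz and esp the echo spacing in s
            y = np.arange(col.size, dtype=float)
            shift = fieldmap_hz * (n_pe * esp)     # per-voxel shift in pixels
            # corrected(y) = distorted(y + shift): resample the distorted
            # column at the displaced positions
            return np.interp(y + shift, y, col)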

  12. Extended Field Laser Confocal Microscopy (EFLCM): Combining automated Gigapixel image capture with in silico virtual microscopy

    International Nuclear Information System (INIS)

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-01-01

    Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds of signal collection for each pixel. This limitation has been overcome by the introduction of parallel-beam illumination techniques in combination with cold CCD camera based image capture. Using the combination of microlens-enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large-scale in silico image processing, we have developed a system allowing the acquisition, presentation and analysis of maximum-resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full-size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single-event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. The observer-independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes.

  13. Enabling full-field physics-based optical proximity correction via dynamic model generation

    Science.gov (United States)

    Lam, Michael; Clifford, Chris; Raghunathan, Ananthan; Fenger, Germain; Adam, Kostas

    2017-07-01

    As extreme ultraviolet lithography comes closer to reality for high-volume production, its peculiar modeling challenges related to both inter- and intrafield effects have necessitated building an optical proximity correction (OPC) infrastructure that operates with field-position dependency. Previous state-of-the-art approaches to modeling field dependency used piecewise constant models in which static input models are assigned to specific x/y-positions within the field; OPC and simulation could then assign the proper static model based on simulation-level placement. However, in the realm of 7 and 5 nm feature sizes, small discontinuities in OPC from piecewise constant model changes can cause unacceptable levels of edge placement error. The introduction of dynamic model generation (DMG) can be shown to effectively avoid these dislocations by providing unique mask and optical models per simulation region, allowing a near continuum of models through the field. DMG allows unique models for the electromagnetic field, apodization, aberrations, etc. to vary through the entire field, and provides the capability to precisely and accurately model systematic field signatures.

  14. Field Demonstration of Automated Demand Response for Both Winter and Summer Events in Large Buildings in the Pacific Northwest

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Kiliccote, Sila; Dudley, Junqiao H.

    2011-11-11

    There are growing strains on the electric grid as cooling peaks grow and equipment ages, and increased penetration of renewables is straining electricity supply systems, so the need for flexible demand is growing. This paper summarizes the results of a series of field tests of automated demand response systems in large buildings in the Pacific Northwest. The objective of the research was twofold. One objective was to evaluate the use of demand response automation technologies. A second objective was to evaluate control strategies that could change the electric load shape in both winter and summer conditions. The winter tests focused on cold winter mornings, a time when the electric grid is often stressed; the summer tests evaluated DR strategies in the afternoon. We found that both winter and summer control strategies could be automated with the Open Automated Demand Response communication standard, and the buildings were able to provide significant demand response in both winter and summer events.

  15. Quantitative evaluation of automated skull-stripping methods applied to contemporary and legacy images: effects of diagnosis, bias correction, and slice location

    DEFF Research Database (Denmark)

    Fennema-Notestine, Christine; Ozyurt, I Burak; Clark, Camellia P

    2006-01-01

    Performance of automated methods to isolate brain from nonbrain tissues in magnetic resonance (MR) structural images may be influenced by MR signal inhomogeneities, type of MR image set, regional anatomy, and age and diagnosis of subjects studied. The present study compared the performance of four [...] Extractor (BSE; Sandor and Leahy [1997] IEEE Trans Med Imag 16:41-54; Shattuck et al. [2001] Neuroimage 13:856-876) to manually stripped images. The methods were applied to uncorrected and bias-corrected datasets; Legacy and Contemporary T1-weighted image sets; and four diagnostic groups (depressed...

  16. Bright-field scanning confocal electron microscopy using a double aberration-corrected transmission electron microscope.

    Science.gov (United States)

    Wang, Peng; Behan, Gavin; Kirkland, Angus I; Nellist, Peter D; Cosgriff, Eireann C; D'Alfonso, Adrian J; Morgan, Andrew J; Allen, Leslie J; Hashimoto, Ayako; Takeguchi, Masaki; Mitsuishi, Kazutaka; Shimojo, Masayuki

    2011-06-01

    Scanning confocal electron microscopy (SCEM) offers a mechanism for three-dimensional imaging of materials, which makes use of the reduced depth of field in an aberration-corrected transmission electron microscope. The simplest configuration of SCEM is the bright-field mode. In this paper we present experimental data and simulations showing the form of bright-field SCEM images. We show that the depth dependence of the three-dimensional image can be explained in terms of two-dimensional images formed in the detector plane. For a crystalline sample, this so-called probe image is shown to be similar to a conventional diffraction pattern. Experimental results and simulations show how the diffracted probes in this image are elongated in thicker crystals and the use of this elongation to estimate sample thickness is explored. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Automated force field parameterization for non-polarizable and polarizable atomic models based on ab initio target data.

    Science.gov (United States)

    Huang, Lei; Roux, Benoît

    2013-08-13

    Classical molecular dynamics (MD) simulations based on atomistic models are increasingly used to study a wide range of biological systems. A prerequisite for meaningful results from such simulations is an accurate molecular mechanical force field. Most biomolecular simulations are currently based on the widely used AMBER and CHARMM force fields, which were parameterized and optimized to cover a small set of basic compounds corresponding to the natural amino acids and nucleic acid bases. Atomic models of additional compounds are commonly generated by analogy to the parameter set of a given force field. While this procedure yields models that are internally consistent, the accuracy of the resulting models can be limited. In this work, we propose a method, General Automated Atomic Model Parameterization (GAAMP), for automatically generating the parameters of atomic models of small molecules using the results from ab initio quantum mechanical (QM) calculations as target data. Force fields that were previously developed for a wide range of model compounds serve as the initial guess, although any of the final parameters can be optimized. The electrostatic parameters (partial charges, polarizabilities and shielding) are optimized on the basis of the QM electrostatic potential (ESP) and, if applicable, the interaction energies between the compound and water molecules. The soft dihedrals are automatically identified and parameterized by targeting QM dihedral scans as well as the energies of stable conformers. To validate the approach, the solvation free energy was calculated for more than 200 small molecules, and MD simulations of 3 different proteins were carried out.
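
    The ESP-based charge optimization reduces to linear least squares with a total-charge constraint; a self-contained sketch in atomic units (restraints and the polarizabilities that GAAMP also handles are omitted):

        import numpy as np

        def fit_esp_charges(atom_xyz, grid_xyz, esp_values, total_charge=0.0):
            # ESP of point charges: V_i = sum_j q_j / |r_i - R_j| (a.u.).
            # Minimize ||A q - V||^2 subject to sum(q) = Q via a Lagrange
            # multiplier; RESP-style restraints are not included here.
            diff = grid_xyz[:, None, :] - atom_xyz[None, :, :]
            A = 1.0 / np.linalg.norm(diff, axis=2)      # (n_grid, n_atoms)
            n = atom_xyz.shape[0]
            # KKT system: [[A^T A, 1], [1^T, 0]] [q; lam] = [A^T V; Q]
            kkt = np.zeros((n + 1, n + 1))
            kkt[:n, :n] = A.T @ A
            kkt[:n, n] = 1.0
            kkt[n, :n] = 1.0
            rhs = np.concatenate([A.T @ esp_values, [total_charge]])
            return np.linalg.solve(kkt, rhs)[:n]        # fitted charges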

  18. Fock-Matrix Corrections in Density Functional Theory and Use in Embedded Mean-Field Theory.

    Science.gov (United States)

    Miyamoto, Kaito; Miller, Thomas F; Manby, Frederick R

    2016-12-13

    We introduce Fock-corrected density functional theory (FCDFT), a semiempirical minimal-basis method part way between density-functional tight binding (DFTB) and DFT. FCDFT contains DFTB-like Fock-matrix contributions calculated using simple pairwise formulas and Slater-Koster transformations, but it also contains the full Kohn-Sham treatment of Coulombic electrostatics. The resulting method is better suited than either minimal-basis DFT or DFTB for modeling the low-level subsystem in embedded mean-field theory (EMFT), improving upon the former by correcting for basis-set incompleteness and upon the latter by properly accounting for electrostatics. EMFT calculations using DFT-in-FCDFT have much smaller errors in orbital energies, dipole moments, and reaction energies than our previous DFT-in-DFT calculations.

  19. Errors of first-order probe correction for higher-order probes in spherical near-field antenna measurements

    DEFF Research Database (Denmark)

    Laitinen, Tommi; Nielsen, Jeppe Majlund; Pivnenko, Sergiy

    2004-01-01

    An investigation is performed to study the error of the far-field pattern determined from a spherical near-field antenna measurement in the case where a first-order (mu = ±1) probe correction scheme is applied to the near-field signal measured by a higher-order probe.

  20. Industrial automation in floating production vessels for deep water oil and gas fields

    International Nuclear Information System (INIS)

    de Garcia, A.L.; Ferrante, A.J.

    1990-01-01

    The process supervision in offshore platforms was performed in the past through the use of local pneumatic instrumentation, based on relays, semi-graphic panels and button-operated control panels. Considering the advanced technology used in the new floating production projects for deep water, it became mandatory to develop supervision systems capable of integrating different control panels, increasing the level of monitoring and reducing the number of operators and control rooms. From the point of view of field integration, a standardized architecture makes communication possible between the different production platforms and the regional headquarters, where all the equipment and support infrastructure for the computerized network is installed. This paper describes the characteristics of the initial systems, the main problems observed, the studies performed and the results obtained in relation to the design and implementation of computational systems with open architecture for the automation of process control in floating production systems for deep water in Brazil.

  1. Automated fault extraction and classification using 3-D seismic data for the Ekofisk field development

    Energy Technology Data Exchange (ETDEWEB)

    Signer, C.; Nickel, M.; Randen, T.; Saeter, T.; Soenneland, H.H.

    1998-12-31

    Mapping of fractures is important for the prediction of fluid flow in many reservoir types. The fluid flow depends mainly on the efficiency of the reservoir seals, and improved spatial mapping of the open and closed fracture systems will allow a better prediction of the fluid flow pattern. The primary objective of this paper is to present fracture characterization at the reservoir scale combined with seismic facies mapping. The complexity of the giant Ekofisk field on the Norwegian continental shelf provides an ideal framework for testing the validity and applicability of an automated seismic fault and fracture detection and mapping tool. The mapping of the faults can be based on seismic attribute grids, meaning that attribute responses related to faults are extracted along key horizons interpreted in the reservoir interval. 3 refs., 3 figs.

  2. Sensitivity of resistive and Hall measurements to local inhomogeneities: Finite-field, intensity, and area corrections

    Science.gov (United States)

    Koon, Daniel W.; Wang, Fei; Petersen, Dirch Hjorth; Hansen, Ole

    2014-10-01

    We derive exact, analytic expressions for the sensitivity of sheet resistance and Hall sheet resistance measurements to local inhomogeneities for the cases of nonzero magnetic fields, strong perturbations, and perturbations over a finite area, extending our earlier results on weak perturbations. We express these sensitivities for conductance tensor components and for other charge transport quantities. Both resistive and Hall sensitivities, for a van der Pauw specimen in a finite magnetic field, are a superposition of the zero-field sensitivities to both sheet resistance and Hall sheet resistance. Strong perturbations produce a nonlinear correction term that depends on the strength of the inhomogeneity. Solution of the specific case of a finite-sized circular inhomogeneity coaxial with a circular specimen suggests a first-order correction for the general case. Our results are confirmed by computer simulations on both a linear four-point probe array on a large circular disc and a van der Pauw square geometry. Furthermore, the results agree well with the published experimental results of Náhlík et al. for physical holes in a circular copper foil disc.
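
    For context, the sheet resistance entering these sensitivities is obtained from van der Pauw measurements via the standard relation exp(-πR_A/R_s) + exp(-πR_B/R_s) = 1; a Newton-iteration sketch (textbook relation, not code from the paper):

        import numpy as np

        def vdp_sheet_resistance(r_a, r_b, tol=1e-12):
            # start from the common first guess pi*(R_A+R_B)/(2*ln 2)
            rs = np.pi * (r_a + r_b) / (2.0 * np.log(2.0))
            for _ in range(100):
                ea = np.exp(-np.pi * r_a / rs)
                eb = np.exp(-np.pi * r_b / rs)
                f = ea + eb - 1.0
                # df/dRs = (pi/Rs^2) * (R_A*ea + R_B*eb)
                df = (np.pi / rs**2) * (r_a * ea + r_b * eb)
                step = f / df
                rs -= step
                if abs(step) < tol * rs:
                    break
            return rs

        # print(vdp_sheet_resistance(10.0, 12.0))  # ohms per square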

  3. Momentum conservation and local field corrections for the response of interacting Fermi gases

    International Nuclear Information System (INIS)

    Morawetz, K.; Fuhrmann, U.

    2000-01-01

    We reanalyze the recently derived response function for interacting systems in relaxation time approximation respecting density, momentum and energy conservation. We find that momentum conservation leads exactly to the local-field corrections for both cases, respecting only density conservation and respecting density and energy conservation. This rewriting simplifies the former formulae dramatically. We discuss the small-wave-vector expansion and find that the response function shows a high-frequency dependence of ω⁻⁵, which allows higher-order sum rules to be fulfilled. Momentum conservation also resolves a puzzle about the conductivity, which should be finite only in multicomponent systems. (authors)

  4. Fully-automated field mapping of a dipole magnet of a multi-passage spectrometer (MPS)

    Energy Technology Data Exchange (ETDEWEB)

    Meissner, Robert; Thirolf, Peter; Weber, Christine [Fakultaet fuer Physik, LMU - Muenchen (Germany)

    2013-07-01

    MLLTRAP is a Penning-trap mass-spectrometer facility, which is currently being commissioned at the Maier-Leibnitz Laboratory in Garching. Here, atomic mass values are determined by measuring the cyclotron frequencies of ions stored in a strong magnetic field. In the future, highly charged ions will be utilized to improve the achievable mass accuracy. For this purpose, singly charged ions will have to be injected into a charge-breeding device, such as an EBIT, and transferred back towards the Penning traps while being q/A selected. To fulfill these tasks, a multi-passage spectrometer (MPS) is being built. It consists of a fast-ramping, round-pole dipole magnet with an electrostatic mirror system. A basic requirement for building the MPS is detailed knowledge of the magnetic field produced by the magnet. It is necessary to simulate the trajectories of the ions and gain knowledge of the design and geometry of the electrostatic mirror system and the vacuum chamber. For this purpose, a robot was designed which, powered by three stepper motors, measures the magnetic field fully automatically. The robot moves a Hall probe in three dimensions with a resolution of 1 mm and an uncertainty of 0.5 mm. In this presentation, the development of the robot, its control and data acquisition via LabView, and the results are presented.

  5. Method for determining correction factors induced by irradiation of ionization chamber cables in large radiation field

    International Nuclear Information System (INIS)

    Rodrigues, L.L.C.

    1988-01-01

    A simple method was developed, to be suggested to hospital physicists for use during large-radiation-field dosimetry, to evaluate the effects of irradiating cables, connectors and extension cables and to determine correction factors for each system or geometry. All quality control tests were performed according to International Electrotechnical Commission recommendations for three clinical dosimeters. Photon and electron irradiation effects for cables, connectors and extension cables were investigated under different experimental conditions by means of measurements of chamber sensitivity to a standard 90Sr radiation source. The radiation-induced leakage current was also measured for cables, connectors and extension cables irradiated by photons and electrons. All measurements were performed at standard dosimetry conditions. Finally, measurements were performed in large fields. Cable factors and leakage factors were determined by the ratio between chamber responses for irradiated and unirradiated cables. (author) [pt]

  6. Automated jitter correction for IR image processing to assess the quality of W7-X high heat flux components

    International Nuclear Information System (INIS)

    Greuner, H; De Marne, P; Herrmann, A; Boeswirth, B; Schindler, T; Smirnow, M

    2009-01-01

    An automated IR image processing method was developed to evaluate the surface temperature distribution of cyclically loaded high heat flux (HHF) plasma-facing components. IPP Garching will perform the HHF testing of a high percentage of the series production of the WENDELSTEIN 7-X (W7-X) divertor targets to minimize the number of undiscovered uncertainties in the finally installed components. The HHF tests will be performed as quality assurance (QA), complementary to the non-destructive examination (NDE) methods used during manufacturing. The IR analysis of an HHF-loaded component detects growing debonding of the plasma-facing material, made of carbon fibre composite (CFC), after a few thermal cycles. For the prototype testing, the IR data were processed manually. A QA method, however, requires a reliable, reproducible and efficient automated procedure. Using the example of the HHF testing of W7-X pre-series target elements, the paper describes the automated IR image processing method that was developed. The algorithm is based on an iterative two-step correlation analysis with an individually defined reference pattern for the determination of the jitter.

  7. Correction factors for ionization chamber dosimetry in CyberKnife: Machine-specific, plan-class, and clinical fields

    International Nuclear Information System (INIS)

    Gago-Arias, Araceli; Antolín, Elena; Fayos-Ferrer, Francisco; Simón, Rocío; González-Castaño, Diego M.; Palmans, Hugo; Sharpe, Peter; Gómez, Faustino; Pardo-Montero, Juan

    2013-01-01

    Purpose: The aim of this work is the application of the formalism for ionization chamber reference dosimetry of small and nonstandard fields [R. Alfonso, P. Andreo, R. Capote, M. S. Huq, W. Kilby, P. Kjäll, T. R. Mackie, H. Palmans, K. Rosser, J. Seuntjens, W. Ullrich, and S. Vatnitsky, "A new formalism for reference dosimetry of small and nonstandard fields," Med. Phys. 35, 5179–5186 (2008)] to the CyberKnife robotic radiosurgery system. Correction factors for intermediate calibration fields, a machine-specific reference field (msr) and two plan-class specific reference fields (pcsr), have been studied. Furthermore, the applicability of the new formalism to clinical dosimetry has been analyzed through the investigation of two clinical treatments. Methods: PTW31014 and Scanditronix-Wellhofer CC13 ionization chamber measurements were performed for the fields under investigation. Absorbed dose to water was determined using alanine reference dosimetry, and experimental correction factors were calculated from ratios of alanine dose to ionization chamber reading. In addition, correction factors were calculated for the intermediate calibration fields and one of the clinical treatment fields using the Monte Carlo method, and these were compared with the experimental values. Results: Overall correction factors deviating from unity by approximately 2% were obtained from both measurements and simulations, with values below and above unity for the studied intermediate calibration fields and clinical fields for the ionization chambers under consideration. Monte Carlo simulations yielded correction factors comparable with those obtained from measurements for the machine-specific reference field, although differences of 1% to 3.3% were observed between measured and calculated correction factors for the composite intermediate calibration fields. Dose distribution inhomogeneities are thought to be responsible for such discrepancies. Conclusions: The differences found between
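
    In the Alfonso et al. formalism applied here, the field correction factor is a ratio of dose-to-reading ratios between the nonstandard field and the reference field; a minimal sketch with hypothetical variable names:

        def field_correction_factor(d_w_clin, m_ch_clin, d_w_ref, m_ch_ref):
            # k = [D_w / M]_fclin / [D_w / M]_fref, with D_w the
            # alanine-derived absorbed dose to water and M the chamber reading
            return (d_w_clin / m_ch_clin) / (d_w_ref / m_ch_ref)

        # e.g. a factor of ~1.02 means the chamber under-responds by ~2%
        # in the nonstandard field relative to the reference field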

  8. UAS imaging for automated crop lodging detection: a case study over an experimental maize field

    Science.gov (United States)

    Chu, Tianxing; Starek, Michael J.; Brewer, Michael J.; Masiane, Tiisetso; Murray, Seth C.

    2017-05-01

    Lodging has been recognized as one of the major destructive factors for crop quality and yield, particularly in corn. A variety of contributing causes, e.g. disease and/or pests, weather conditions, excessive nitrogen, and high plant density, may lead to lodging before the harvesting season. Traditional lodging detection strategies mainly rely on ground data collection, which is insufficient in efficiency and accuracy. To address this problem, this research focuses on the use of unmanned aircraft systems (UAS) for automated detection of crop lodging. The study was conducted over an experimental corn field at the Texas A&M AgriLife Research and Extension Center at Corpus Christi, Texas, during the growing season of 2016. Nadir-view images of the corn field were taken weekly by small UAS platforms equipped with consumer-grade RGB and NIR cameras, enabling timely observation of plant growth. 3D structural information of the plants was reconstructed using structure-from-motion photogrammetry. The structural information was then applied to calculate crop height and rates of growth, and a lodging index for detecting corn lodging was proposed. Ground truth data on lodging were collected on a per-row basis and used for fair assessment and tuning of the detection algorithm. Results show that the UAS-measured height correlates well with the ground-measured height. More importantly, the lodging index can effectively reflect the severity of corn lodging and yield after harvesting.
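
    The abstract does not give the lodging index formula; as a hedged illustration, an index of this kind can be built from the photogrammetric crop height model as a normalized height deficit per row:

        import numpy as np

        def lodging_index(canopy_dsm, terrain_dtm, expected_height):
            # crop height model from UAS photogrammetry: CHM = DSM - DTM;
            # index = 1 - measured/expected height per row, so larger drops
            # in canopy height give values closer to 1. The actual index in
            # the paper may differ; this is an illustrative definition.
            chm = canopy_dsm - terrain_dtm
            row_height = np.nanmedian(chm, axis=1)   # median height per row
            return np.clip(1.0 - row_height / expected_height, 0.0, 1.0)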

  9. Software development based on high speed PC oscilloscope for automated pulsed magnetic field measurement system

    International Nuclear Information System (INIS)

    Sun Yuxiang; Shang Lei; Li Ji; Ge Lei

    2011-01-01

    This paper introduces a software development method, based on a high-speed PC oscilloscope, for a pulsed magnetic field measurement system. The previous design has been improved, and a high-speed virtual oscilloscope has been used in this field for the first time. The design realizes automatic data acquisition, data processing, data analysis and storage. Automated point checking reduces the workload, and the use of a precise motion bench increases the positioning accuracy. The software acquires data from the PC oscilloscope by calling DLLs and includes the oscilloscope functions, such as trigger, range and sample rate settings. Spline interpolation and a bandstop filter are used to denoise the signals. The core of the software is the state machine, which controls the motion of the stepper motors and acquires and stores the data automatically. NI Vision Acquisition Software and the Database Connectivity Toolkit make video surveillance of the laboratory and MySQL database connectivity available. The raw and processed signals are compared in this paper; the waveform is greatly improved by the signal processing. (authors)
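
    The denoising chain described (bandstop filtering followed by spline smoothing) can be sketched with SciPy; the filter order, band edges and smoothing factor below are illustrative assumptions, not values from the paper:

        import numpy as np
        from scipy.signal import butter, filtfilt
        from scipy.interpolate import UnivariateSpline

        def denoise_pulse(signal, fs, notch_lo, notch_hi, smooth=1e-3):
            # bandstop (notch) filter to suppress a known interference band,
            # then a smoothing spline over the filtered trace
            b, a = butter(4, [notch_lo, notch_hi], btype='bandstop', fs=fs)
            filtered = filtfilt(b, a, signal)
            t = np.arange(signal.size) / fs
            spline = UnivariateSpline(t, filtered, s=smooth * signal.size)
            return spline(t)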

  10. Automated Identification of Northern Leaf Blight-Infected Maize Plants from Field Imagery Using Deep Learning.

    Science.gov (United States)

    DeChant, Chad; Wiesner-Hanks, Tyr; Chen, Siyuan; Stewart, Ethan L; Yosinski, Jason; Gore, Michael A; Nelson, Rebecca J; Lipson, Hod

    2017-11-01

    Northern leaf blight (NLB) can cause severe yield loss in maize; however, scouting large areas to accurately diagnose the disease is time consuming and difficult. We demonstrate a system capable of automatically identifying NLB lesions in field-acquired images of maize plants with high reliability. This approach uses a computational pipeline of convolutional neural networks (CNNs) that addresses the challenges of limited data and the myriad irregularities that appear in images of field-grown plants. Several CNNs were trained to classify small regions of images as containing NLB lesions or not; their predictions were combined into separate heat maps, then fed into a final CNN trained to classify the entire image as containing diseased plants or not. The system achieved 96.7% accuracy on test set images not used in training. We suggest that such systems mounted on aerial- or ground-based vehicles can help in automated high-throughput plant phenotyping, precision breeding for disease resistance, and reduced pesticide use through targeted application across a variety of plant and disease categories.
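
    The two-stage inference described above can be sketched generically; patch_model and image_model below stand in for the trained CNNs, which are not reproduced here:

        import numpy as np

        def infer_image(image, patch_model, image_model, patch=224, stride=112):
            # stage 1: a trained CNN scores each patch for lesion presence;
            # stage 2: a second classifier maps the resulting heat map to an
            # image-level decision, as in the abstract's pipeline
            h, w = image.shape[:2]
            rows = list(range(0, h - patch + 1, stride))
            cols = list(range(0, w - patch + 1, stride))
            heat = np.zeros((len(rows), len(cols)))
            for i, r in enumerate(rows):
                for j, c in enumerate(cols):
                    heat[i, j] = patch_model(image[r:r+patch, c:c+patch])
            return image_model(heat)   # e.g. probability the image shows NLB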

  11. Automated measurement of spatial preference in the open field test with transmitted lighting.

    Science.gov (United States)

    Kulikov, Alexander V; Tikhonova, Maria A; Kulikov, Victor A

    2008-05-30

    A new modification of the open field test was designed to improve automation of the test. The main innovations were (1) transmitted lighting and (2) estimation of the probability of finding pixels associated with an animal in a selected region of the arena as an objective index of spatial preference. Transmitted (inverted) lighting significantly improved the contrast between animal and arena and allowed white animals to be tracked as effectively as colored ones. Probability as a measure of preference for a selected region was mathematically justified and experimentally verified. A good correlation between probability and the classic indices of spatial preference (number of region entries and time spent therein) was shown. The algorithm for calculating the probability of finding pixels associated with an animal in the selected region was implemented in the EthoStudio software. Significant interstrain differences in locomotion and central zone preference (an index of anxiety) were shown using the inverted lighting and the EthoStudio software in mice of six inbred strains. The effects of arena shape (circle or square) and of the presence of a novel object in the center of the arena on open field behavior in mice were also studied.
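
    The probability index is straightforward to compute once the animal pixels are segmented; a sketch assuming transmitted lighting makes the animal darker than the arena (the thresholding rule is an assumption):

        import numpy as np

        def region_preference(frames, region_mask, threshold):
            # probability of finding animal pixels in a region: fraction of
            # animal-associated pixels falling inside the region, averaged
            # over frames
            probs = []
            for frame in frames:
                animal = frame < threshold          # boolean animal mask
                if animal.any():
                    probs.append(animal[region_mask].sum() / animal.sum())
            return float(np.mean(probs))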

  12. Towards Automated Large-Scale 3D Phenotyping of Vineyards under Field Conditions

    Directory of Open Access Journals (Sweden)

    Johann Christian Rose

    2016-12-01

    In viticulture, phenotypic data are traditionally collected directly in the field via visual and manual means by an experienced person. This approach is time-consuming, subjective and prone to human error. In recent years, research has therefore focused strongly on developing automated and non-invasive sensor-based methods to increase data acquisition speed, enhance measurement accuracy and objectivity, and reduce labor costs. While many 2D methods based on image processing have been proposed for field phenotyping, only a few 3D solutions are found in the literature. A track-driven vehicle consisting of a camera system, a real-time-kinematic GPS system for positioning, as well as hardware for vehicle control, image storage and acquisition, is used to visually capture a whole vine row canopy with georeferenced RGB images. In the first post-processing step, these images were used within multi-view-stereo software to reconstruct a textured 3D point cloud of the whole grapevine row. In the second step, a classification algorithm is used to automatically classify the raw point cloud data into the semantic plant components, grape bunches and canopy. In the third step, phenotypic data for the semantic objects are gathered from the classification results, yielding the quantity of grape bunches and berries and the berry diameter.

  13. Vector Sky Glint Corrections for Above Surface Retrieval of the Subsurface Polarized Light Field

    Science.gov (United States)

    Gilerson, A.; Foster, R.; McGilloway, A.; Ibrahim, A.; El-habashi, A.; Carrizo, C.; Ahmed, S.

    2016-02-01

    Knowledge of the underwater light field is fundamental to determining the health of the world's oceans and coastal regions. For decades, traditional remote sensing retrieval methods that rely solely on the spectral intensity of the water-leaving light have provided indicators of marine ecosystem health. As the demand for retrieval accuracy rises, use of the polarized nature of light as an additional remote sensing tool is becoming necessary. In order to observe the underwater polarized light field from above the surface (for ship, shore, or satellite applications), a method of correcting the above-water signal for the effects of polarized surface-reflected skylight is needed. For three weeks in July-August 2014, the NASA Ship Aircraft Bio-Optical Research (SABOR) cruise continuously observed the polarized radiance of the ocean and the sky using a HyperSAS-POL system. The system autonomously tracks the Sun position and the heading of the research vessel in order to maintain a fixed relative solar azimuth angle (i.e. ±90°) and therefore avoid the specular reflection of the sunlight. Additionally, in-situ inherent optical properties (IOPs) were continuously acquired using a set of instrument packages modified for underway measurement, hyperspectral radiometric measurements were taken manually at all stations, and an underwater polarimeter was deployed when conditions permitted. All measurements, above and below the sea surface, were combined and compared in an effort to develop a glint (sky + Sun) correction scheme for the upwelling polarized signal from a wind-driven ocean surface and to compare it with one assuming a flat ocean surface. Accurate retrieval of the subsurface vector light field is demonstrated through comparisons with polarized radiative transfer codes and direct measurements made by the underwater polarimeter.

  14. Correcting mean-field approximations for birth-death-movement processes

    Science.gov (United States)

    Baker, Ruth E.; Simpson, Matthew J.

    2010-10-01

    On the microscale, migration, proliferation and death are crucial in the development, homeostasis and repair of an organism; on the macroscale, such effects are important in the sustainability of a population in its environment. Dependent on the relative rates of migration, proliferation and death, spatial heterogeneity may arise within an initially uniform field; this leads to the formation of spatial correlations and can have a negative impact upon population growth. Usually, such effects are neglected in modeling studies and simple phenomenological descriptions, such as the logistic model, are used to model population growth. In this work we outline some methods for analyzing exclusion processes which include agent proliferation, death and motility in two and three spatial dimensions with spatially homogeneous initial conditions. The mean-field description for these types of processes is of logistic form; we show that, under certain parameter conditions, such systems may display large deviations from the mean field, and suggest computationally tractable methods to correct the logistic-type description.
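
    The deviation from the logistic description can be seen in a few lines of simulation. The sketch below, with illustrative rates not taken from the paper, runs a 2D exclusion process with proliferation, death and motility, and integrates the corresponding logistic mean-field equation dC/dt = p_prolif*C*(1-C) - p_death*C for comparison.

    import numpy as np

    rng = np.random.default_rng(0)
    L, steps = 100, 400
    p_prolif, p_death = 0.01, 0.002
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    lattice = rng.random((L, L)) < 0.05          # ~5% initial occupancy
    sim_density = [lattice.mean()]
    for _ in range(steps):
        for i, j in rng.permutation(np.argwhere(lattice)):
            if not lattice[i, j]:                # agent died earlier this sweep
                continue
            di, dj = moves[rng.integers(4)]
            ni, nj = (i + di) % L, (j + dj) % L
            r = rng.random()
            if r < p_death:
                lattice[i, j] = False            # death
            elif r < p_death + p_prolif:
                if not lattice[ni, nj]:          # exclusion: birth into empty site only
                    lattice[ni, nj] = True
            elif not lattice[ni, nj]:            # motility: hop into empty neighbour
                lattice[ni, nj], lattice[i, j] = True, False
        sim_density.append(lattice.mean())

    # Logistic mean-field counterpart (Euler steps).
    C, mf_density = 0.05, [0.05]
    for _ in range(steps):
        C += p_prolif * C * (1.0 - C) - p_death * C
        mf_density.append(C)

    print(sim_density[-1], mf_density[-1])       # simulated vs mean-field density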

  15. Automation of the CHARMM General Force Field (CGenFF) I: bond perception and atom typing.

    Science.gov (United States)

    Vanommeslaeghe, K; MacKerell, A D

    2012-12-21

    Molecular mechanics force fields are widely used in computer-aided drug design for the study of drug-like molecules alone or interacting with biological systems. In simulations involving biological macromolecules, the biological part is typically represented by a specialized biomolecular force field, while the drug is represented by a matching general (organic) force field. In order to apply these general force fields to an arbitrary drug-like molecule, functionality for assignment of atom types, parameters, and charges is required. In the present article, which is part I of a series of two, we present the algorithms for bond perception and atom typing for the CHARMM General Force Field (CGenFF). The CGenFF atom typer first associates attributes to the atoms and bonds in a molecule, such as valence, bond order, and ring membership, among others. Of note are a number of features that are specifically required for CGenFF. This information is then used by the atom typing routine to assign CGenFF atom types based on a programmable decision tree. This allows for straightforward implementation of CGenFF's complicated atom typing rules and for equally straightforward updating of the atom typing scheme as the force field grows. The presented atom typer was validated by assigning correct atom types to 477 model compounds included in the training set as well as 126 test-set molecules that were constructed to specifically verify its different components. The program may be utilized via an online implementation at https://www.paramchem.org/.
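
    A minimal sketch of the programmable-decision-tree idea, assuming each atom arrives as a dict of pre-computed attributes (element, valence, ring membership, etc.). The type names follow CGenFF's naming pattern, but the rules shown are drastically simplified placeholders, not the actual CGenFF typing rules.

    def assign_atom_type(atom):
        # Each branch inspects pre-computed attributes, mirroring how a
        # programmable decision tree walks from element to a specific type.
        if atom["element"] == "C":
            if atom.get("aromatic"):
                return "CG2R61" if atom.get("ring_size") == 6 else "CG2R51"
            if atom["valence"] == 4:
                return "CG331" if atom.get("n_hydrogens") == 3 else "CG321"
            return "CG2D1"  # fallback: sp2 carbon
        if atom["element"] == "N":
            return "NG2S1" if atom.get("amide") else "NG301"
        raise ValueError("no rule for element %s" % atom["element"])

    # Example: an aliphatic CH3 carbon.
    print(assign_atom_type({"element": "C", "valence": 4, "n_hydrogens": 3}))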

  16. Drift correction for single-molecule imaging by molecular constraint field, a distance minimum metric

    International Nuclear Information System (INIS)

    Han, Renmin; Wang, Liansan; Xu, Fan; Zhang, Yongdeng; Zhang, Mingshu; Liu, Zhiyong; Ren, Fei; Zhang, Fa

    2015-01-01

    Recent developments in far-field optical microscopy (single-molecule imaging techniques) have overcome the diffraction barrier of light and improved image resolution by a factor of ten compared with conventional light microscopy. These techniques utilize the stochastic switching of probe molecules to overcome the diffraction limit and determine the precise localizations of molecules, which often requires a long image acquisition time. However, long acquisition times increase the risk of sample drift, and in high-resolution microscopy sample drift decreases the image resolution. In this paper, we propose a novel metric based on the distances between molecules to solve the drift-correction problem. The proposed metric directly uses the position information of molecules to estimate the frame drift. We also designed an algorithm to implement the metric for the general application of drift correction. Our method has two advantages: first, because it does not require spatial binning of molecule positions but operates on the positions directly, it is more natural for single-molecule imaging techniques; second, it can estimate drift from a small number of positions in each temporal bin, which may extend its potential applications. The effectiveness of our method has been demonstrated on both simulated data and experimental single-molecule images.
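
    A minimal sketch of the distance-based idea: shift the localizations of a later temporal bin over a search grid and keep the shift that minimizes the summed nearest-neighbour distance to a reference bin. The brute-force grid search and the search range are illustrative, not the authors' optimizer.

    import numpy as np

    def nn_cost(ref, shifted):
        # Sum over shifted points of the distance to the nearest reference point.
        d = np.linalg.norm(ref[None, :, :] - shifted[:, None, :], axis=2)
        return d.min(axis=1).sum()

    def estimate_drift(ref_xy, later_xy, span=50.0, step=2.0):
        grid = np.arange(-span, span + step, step)
        best_shift, best_cost = (0.0, 0.0), np.inf
        for dx in grid:
            for dy in grid:
                cost = nn_cost(ref_xy, later_xy - np.array([dx, dy]))
                if cost < best_cost:
                    best_shift, best_cost = (dx, dy), cost
        return best_shift  # drift, in the same units as the input (e.g. nm)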

  17. Clearing the waters: Evaluating the need for site-specific field fluorescence corrections based on turbidity measurements

    Science.gov (United States)

    Saraceno, John F.; Shanley, James B.; Downing, Bryan D.; Pellerin, Brian A.

    2017-01-01

    In situ fluorescent dissolved organic matter (fDOM) measurements have gained increasing popularity as a proxy for dissolved organic carbon (DOC) concentrations in streams. One challenge to accurate fDOM measurements in many streams is light attenuation due to suspended particles. Downing et al. (2012) evaluated the need for corrections to compensate for particle interference on fDOM measurements using a single sediment standard in a laboratory study. The application of those results to a large river improved unfiltered field fDOM accuracy. We tested the same correction equation in a headwater tropical stream and found that it overcorrected fDOM when turbidity exceeded ∼300 formazin nephelometric units (FNU). Therefore, we developed a site-specific, field-based fDOM correction equation through paired in situ fDOM measurements of filtered and unfiltered streamwater. The site-specific correction increased fDOM accuracy up to a turbidity of 700 FNU, the maximum observed in this study. The difference in performance between the laboratory-based correction equation of Downing et al. (2012) and our site-specific, field-based correction equation likely arises from differences in particle size distribution between the sediment standard used in the lab (silt) and that observed in our study (fine to medium sand), particularly during high flows. Therefore, a particle interference correction equation based on a single sediment type may not be ideal when field sediment size is significantly different. Given that field fDOM corrections for particle interference under turbid conditions are a critical component in generating accurate DOC estimates, we describe a way to develop site-specific corrections.
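
    A minimal sketch of deriving such a site-specific correction from the paired filtered/unfiltered deployment, assuming (for illustration only) a correction that is linear in turbidity; the paper's actual functional form may differ.

    import numpy as np

    def fit_site_correction(turbidity_fnu, fdom_unfiltered, fdom_filtered):
        """Fit fdom_filtered / fdom_unfiltered = 1 + a * turbidity (zero-intercept LS)."""
        t = np.asarray(turbidity_fnu, float)
        ratio = np.asarray(fdom_filtered, float) / np.asarray(fdom_unfiltered, float)
        return float(np.sum(t * (ratio - 1.0)) / np.sum(t ** 2))

    def correct_fdom(fdom_raw, turbidity_fnu, a):
        # Apply the fitted site-specific particle-interference correction.
        return fdom_raw * (1.0 + a * np.asarray(turbidity_fnu, float))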

  18. Research on Scientific Data Sharing and Distribution Policy in Advanced Manufacturing and Automation Fields

    Directory of Open Access Journals (Sweden)

    Liya Li

    2007-12-01

    Scientific data sharing is a long-term and complicated task, and the associated data sharing and distribution policies are a prime concern. Drawing on both domestic and international experience in scientific data sharing, this paper discusses the sources, distribution, and classification of scientific data in advanced manufacturing and automation, and introduces a preliminary data sharing and distribution policy for the field.

  19. An automated fog water collector suitable for deposition networks: design, operation and field tests

    Energy Technology Data Exchange (ETDEWEB)

    Fuzzi, S.; Orsi, G.; Bonforte, G.; Zardini, B.; Franchini, P.L. [Consiglio Nazionale delle Ricerche, Bologna (Italy). Istituto FISBAT

    1997-01-01

    The study of fog water chemical composition and the contribution of fog droplets to total chemical deposition has become a relevant environmental subject over the past few years. This paper describes a fog water collector suitable for deposition network operation, thanks to its complete automation and its facility for remote acquisition of sampling information. Sampling of fog droplets on teflon strings is activated by an optical fog detector according to a protocol operated by a microprocessor controller. Multiple sample collection, also microprocessor controlled, is possible with this instrument. The problem of fog droplet sampling in sub-freezing conditions is overcome by a microprocessor-implemented sampling schedule that alternates between sampling periods and stand-by periods during which the rime collected on the strings is allowed to melt. Field tests of the reliability and reproducibility of the sampling operations are presented in the paper. Side-by-side operation of the fog collector with a PVM-100 fog liquid water content meter shows that the amount of water per unit volume of air collected by the sampling instrument is proportional to the fog liquid water content averaged over the period of an entire fog event. 16 refs., 7 figs.

  20. Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cheung, Iris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Li, Becky Z [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion of residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include costs to the building or facility manager as well as to the utility or third-party program manager.

  1. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images.

    Science.gov (United States)

    Serag, Ahmed; Macnaught, Gillian; Denison, Fiona C; Reynolds, Rebecca M; Semple, Scott I; Boardman, James P

    2017-01-01

    Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles to expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of the fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved complete brain localization in 96% of cases using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random-forest-based regression methods, and the proposed method showed superior performance. We also show the application of the proposed method to the optimization of fetal motion correction and how it is essential to the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.
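
    A minimal sketch of the two stages described above: a 3D extension of HOG built from gradient-orientation histograms, and a sliding window scored by a classifier. A pre-trained classifier is assumed to exist as `clf` (e.g. an sklearn SVM trained on labelled windows), and the bin counts, window size and stride are illustrative.

    import numpy as np

    def hog3d(volume, n_bins=9):
        gz, gy, gx = np.gradient(volume.astype(float))        # volume indexed [z, y, x]
        mag = np.sqrt(gx**2 + gy**2 + gz**2)
        azimuth = np.arctan2(gy, gx)                          # [-pi, pi]
        elevation = np.arctan2(gz, np.sqrt(gx**2 + gy**2))    # [-pi/2, pi/2]
        h_az, _ = np.histogram(azimuth, bins=n_bins, range=(-np.pi, np.pi), weights=mag)
        h_el, _ = np.histogram(elevation, bins=n_bins, range=(-np.pi / 2, np.pi / 2), weights=mag)
        feat = np.concatenate([h_az, h_el])
        return feat / (np.linalg.norm(feat) + 1e-8)           # L2-normalized descriptor

    def localize_brain(volume, clf, win=64, stride=16):
        best_score, best_corner = -np.inf, None
        for z in range(0, volume.shape[0] - win + 1, stride):
            for y in range(0, volume.shape[1] - win + 1, stride):
                for x in range(0, volume.shape[2] - win + 1, stride):
                    f = hog3d(volume[z:z + win, y:y + win, x:x + win])
                    score = clf.decision_function([f])[0]     # higher = more brain-like
                    if score > best_score:
                        best_score, best_corner = score, (z, y, x)
        return best_corner, best_score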

  2. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images

    Directory of Open Access Journals (Sweden)

    Ahmed Serag

    2017-01-01

    Full Text Available Fetal brain magnetic resonance imaging (MRI is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved 96% of complete brain localization using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest based regression methods and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.

  3. Bench and ambulatory field evaluation of the A & D TM-2420 automated sphygmomanometer.

    Science.gov (United States)

    Jamieson, M J; Fowler, G; MacDonald, T M; Webster, J; Witte, K; Lawson, L; Crichton, W; Jeffers, T A; Petrie, J C

    1990-07-01

    Adequate evaluation of automated sphygmomanometers, in terms of safety, accuracy, mechanical reliability, patient acceptability and ability to record ambulatory blood pressure is essential before these devices are used in clinical practice and in clinical trials. We have evaluated the accuracy and performance of the A & D TM-2420 automated sphygmomanometer, an auscultatory device designed for ambulatory blood pressure recording. Four devices were tested for accuracy by simultaneous comparison against two experienced observers using standard mercury column sphygmomanometers. Two of these devices developed faults that precluded complete evaluation. One of the remaining devices met and one failed to meet the somewhat liberal criteria for accuracy recommended by the American Association for the Advancement of Medical Instrumentation, the current standard for evaluation (mean difference of less than or equal to 5 mmHg and standard deviation of differences less than or equal to 8 mmHg). The mean differences (standard deviation of differences) between observers for simultaneous triplicate observations of systolic/diastolic pressure in 50 subjects, including 35 hypertensives, were 0.8 (3.0)/-0.6 (2.4) mmHg. In comparison, the differences between each device and each observer were: device 11, observer 1, -6.4 (5.4)/-6.3 (9.9); device 11, observer 2, -5.6 (4.7)/-7.0 (10.4); device 12, observer 1, -4.9 (5.2)/-4.0 (7.5); device 12, observer 2, -4.1 (4.9)/-4.5 (7.7) mmHg. Ambulatory trials were carried out with a further 10 devices. Of these, seven developed faults requiring their return to the supplier. Numerous additional problems were encountered with microphones, cuffs, leads and connections, the processing unit, error algorithms and data-handling software. The device was not capable of making truly ambulatory recordings. We do not confirm the previously favourable, but limited, evaluation of this device. We stress the vital importance of subjecting a number of devices to
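
    For reference, the AAMI accuracy criterion quoted above reduces to a two-line check on paired device/observer readings; the input arrays below are placeholders.

    import numpy as np

    def aami_pass(device_mmhg, observer_mmhg):
        # Mean difference <= 5 mmHg and SD of differences <= 8 mmHg.
        diff = np.asarray(device_mmhg, float) - np.asarray(observer_mmhg, float)
        return abs(diff.mean()) <= 5.0 and diff.std(ddof=1) <= 8.0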

  4. SU-C-304-07: Are Small Field Detector Correction Factors Strongly Dependent On Machine-Specific Characteristics?

    International Nuclear Information System (INIS)

    Mathew, D; Tanny, S; Parsai, E; Sperling, N

    2015-01-01

    Purpose: The current small field dosimetry formalism utilizes quality correction factors to compensate for the difference in detector response relative to dose deposited in water. The correction factors are defined on a machine-specific basis for each beam quality and detector combination. Some research has suggested that the correction factors may be only weakly dependent on machine-to-machine variations, allowing for determination of class-specific correction factors for various accelerator models. This research examines the differences in small field correction factors for three detectors across two Varian Truebeam accelerators to determine the correction factor dependence on machine-specific characteristics. Methods: Output factors were measured on two Varian Truebeam accelerators for equivalently tuned 6 MV and 6 FFF beams. Measurements were obtained using a commercial plastic scintillation detector (PSD), two ion chambers, and a diode detector. Measurements were made at a depth of 10 cm with an SSD of 100 cm for jaw-defined field sizes ranging from 3×3 cm² to 0.6×0.6 cm², normalized to values at 5×5 cm². Correction factors for each field on each machine were calculated as the ratio of the detector response to the PSD response. The percent change of correction factors for the chambers is presented relative to the primary machine. Results: The Exradin A26 demonstrates a difference of 9% for 6×6 mm² fields in both the 6FFF and 6MV beams. The A16 chamber demonstrates a 5% and 3% difference in 6FFF and 6MV fields, respectively, at the same field size. The Edge diode exhibits less than 1.5% difference across both evaluated energies. Field sizes larger than 1.4×1.4 cm² demonstrated less than 1% difference for all detectors. Conclusion: Preliminary results suggest that class-specific corrections may not be appropriate for micro-ionization chambers. For diode systems, the correction factor was substantially similar and may be useful for class-specific reference

  5. Reconstructing interacting entropy-corrected holographic scalar field models of dark energy in the non-flat universe

    Energy Technology Data Exchange (ETDEWEB)

    Karami, K; Khaledian, M S [Department of Physics, University of Kurdistan, Pasdaran Street, Sanandaj (Iran, Islamic Republic of); Jamil, Mubasher, E-mail: KKarami@uok.ac.ir, E-mail: MS.Khaledian@uok.ac.ir, E-mail: mjamil@camp.nust.edu.pk [Center for Advanced Mathematics and Physics (CAMP), National University of Sciences and Technology (NUST), Islamabad (Pakistan)

    2011-02-15

    Here we consider the entropy-corrected version of the holographic dark energy (DE) model in the non-flat universe. We obtain the equation of state parameter in the presence of interaction between DE and dark matter. Moreover, we reconstruct the potential and the dynamics of the quintessence, tachyon, K-essence and dilaton scalar field models according to the evolutionary behavior of the interacting entropy-corrected holographic DE model.

  6. Field and laboratory emission cell automation and control system for investigating surface chemistry reactions

    Science.gov (United States)

    Flemmer, Michael M.; Ham, Jason E.; Wells, J. R.

    2007-01-01

    A novel system [field and laboratory emission cell (FLEC) automation and control system] has been developed to deliver ozone to a surface utilizing the FLEC to simulate indoor surface chemistry. Ozone, humidity, and air flow rate to the surface were continuously monitored using an ultraviolet ozone monitor and humidity and flow sensors. Data from these sensors were used as feedback for system control to maintain predetermined experimental parameters. The system was used to investigate the chemistry of ozone with α-terpineol on a vinyl surface over 72 h. Keeping all other experimental parameters the same, volatile organic compound emissions from the vinyl tile with α-terpineol were collected from both zero and 100 ppb (parts per 10⁹) ozone exposures. System stability profiles collected from sensor data indicated experimental parameters were maintained to within a few percent of initial settings. Ozone data from eight experiments at 100 ppb (over 339 h) provided a pooled standard deviation of 1.65 ppb and a 95% tolerance of 3.3 ppb. Humidity data from 17 experiments at 50% relative humidity (over 664 h) provided a pooled standard deviation of 1.38% and a 95% tolerance of 2.77%. Data on the flow rate of air through the FLEC from 14 experiments at 300 ml/min (over 548 h) provided a pooled standard deviation of 3.02 ml/min and a 95% tolerance range of 6.03 ml/min. Initial experimental results yielded long-term emissions of ozone/α-terpineol reaction products, suggesting that surface chemistry could play an important role in indoor environments.
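
    The pooled statistics quoted above can be reproduced with a short helper; note that each quoted 95% tolerance is about twice the pooled standard deviation, consistent with a ±2σ band. The per-experiment SDs and sample sizes below are placeholders.

    import numpy as np

    def pooled_sd(sds, ns):
        """Pooled standard deviation across experiments with sample sizes ns."""
        sds, ns = np.asarray(sds, float), np.asarray(ns, float)
        return float(np.sqrt(np.sum((ns - 1) * sds**2) / np.sum(ns - 1)))

    # 95% tolerance approximated as 2 * pooled SD:
    # e.g. a pooled SD of 1.65 ppb gives a tolerance of ~3.3 ppb, as quoted.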

  7. Influence and Correction from the Human Body on the Measurement of a Power-Frequency Electric Field Sensor

    Directory of Open Access Journals (Sweden)

    Dongping Xiao

    2016-06-01

    According to the operating specifications of existing electric field measuring instruments, measuring technicians must be located far from the instruments to eliminate the influence of the human body occupancy on a spatial electric field. Nevertheless, in order to develop a portable safety protection instrument with an effective electric field warning function for working staff in a high-voltage environment, it is necessary to study the influence of an approaching human body on the measurement of an electric field and to correct the measurement results. A single-shaft electric field measuring instrument called the Type LP-2000, which was developed by our research team, is used as the research object in this study. First, we explain the principle of electric field measurement and describe the capacitance effect produced by the human body. Through a theoretical analysis, we show that the measured electric field value decreases as a human body approaches. Their relationship is linearly proportional. Then, the ratio is identified as a correction coefficient to correct for the influence of human body proximity. The conclusion drawn from the theoretical analysis is proved via simulation. The correction coefficient kb = 1.8010 is obtained on the basis of the linear fitting of simulated data. Finally, a physical experiment is performed. When no human is present, we compare the results from the Type LP-2000 measured with Narda EFA-300 and the simulated value to verify the accuracy of the Type LP-2000. For the case of an approaching human body, the correction coefficient kb* = 1.9094 is obtained by comparing the data measured with the Type LP-2000 to the simulated value. The correction coefficient obtained from the experiment (i.e., kb*) is highly consistent with that obtained from the simulation (i.e., kb). Two experimental programs are set; under these programs, the excitation voltages and distance measuring points are regulated to produce different

  8. Influence and Correction from the Human Body on the Measurement of a Power-Frequency Electric Field Sensor.

    Science.gov (United States)

    Xiao, Dongping; Liu, Huaitong; Zhou, Qiang; Xie, Yutong; Ma, Qichao

    2016-06-10

    According to the operating specifications of existing electric field measuring instruments, measuring technicians must be located far from the instruments to eliminate the influence of the human body occupancy on a spatial electric field. Nevertheless, in order to develop a portable safety protection instrument with an effective electric field warning function for working staff in a high-voltage environment, it is necessary to study the influence of an approaching human body on the measurement of an electric field and to correct the measurement results. A single-shaft electric field measuring instrument called the Type LP-2000, which was developed by our research team, is used as the research object in this study. First, we explain the principle of electric field measurement and describe the capacitance effect produced by the human body. Through a theoretical analysis, we show that the measured electric field value decreases as a human body approaches. Their relationship is linearly proportional. Then, the ratio is identified as a correction coefficient to correct for the influence of human body proximity. The conclusion drawn from the theoretical analysis is proved via simulation. The correction coefficient kb = 1.8010 is obtained on the basis of the linear fitting of simulated data. Finally, a physical experiment is performed. When no human is present, we compare the results from the Type LP-2000 measured with Narda EFA-300 and the simulated value to verify the accuracy of the Type LP-2000. For the case of an approaching human body, the correction coefficient kb* = 1.9094 is obtained by comparing the data measured with the Type LP-2000 to the simulated value. The correction coefficient obtained from the experiment (i.e., kb*) is highly consistent with that obtained from the simulation (i.e., kb). Two experimental programs are set; under these programs, the excitation voltages and distance measuring points are regulated to produce different electric field
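
    A minimal sketch of how such a proximity coefficient could be derived and applied: fit the zero-intercept slope between the reference (simulated) field and the values measured with a body nearby, then scale subsequent readings. The fitting route is illustrative; the paper's exact procedure may differ.

    import numpy as np

    def fit_kb(e_reference, e_measured):
        """Zero-intercept least-squares slope of e_reference vs e_measured."""
        ref = np.asarray(e_reference, float)
        meas = np.asarray(e_measured, float)
        return float(np.sum(ref * meas) / np.sum(meas ** 2))

    def correct_field(e_measured, kb):
        # e.g. kb ~ 1.80 from simulation, ~ 1.91 from experiment (see above).
        return kb * np.asarray(e_measured, float)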

  9. Corrections to classical kinetic and transport theory for a two-temperature, fully ionized plasma in electromagnetic fields

    International Nuclear Information System (INIS)

    Oeien, A.H.

    1977-06-01

    Sets of lower-order and higher-order kinetic and macroscopic equations are developed for a plasma in which collisions are important, but electrons and ions are allowed to have different temperatures when transports, due to gradients and fields, set in. Solving the lower-order kinetic equations and taking appropriate velocity moments, we show that the usual classical transports emerge. From the higher-order kinetic equations, special notice is taken of some new correction terms to the classical transports. These corrections are linear in gradients and fields, some of which are found only in a two-temperature state. (Auth.)

  10. Distortion correction in EPI at ultra-high-field MRI using PSF mapping with optimal combination of shift detection dimension.

    Science.gov (United States)

    Oh, Se-Hong; Chung, Jun-Young; In, Myung-Ho; Zaitsev, Maxim; Kim, Young-Bo; Speck, Oliver; Cho, Zang-Hee

    2012-10-01

    Despite its wide use, echo-planar imaging (EPI) suffers from geometric distortions due to off-resonance effects, i.e., strong magnetic field inhomogeneity and susceptibility. This article reports a novel method for correcting the distortions observed in EPI acquired at ultra-high field strengths such as 7 T. Point spread function (PSF) mapping methods have been proposed for correcting the distortions in EPI. The PSF shift map can be derived along either the nondistorted or the distorted coordinates. Along the nondistorted coordinates, more information about compressed areas is present, but the map is prone to PSF-ghosting artifacts induced by the large k-space shift in the PSF-encoding direction. In contrast, shift maps along the distorted coordinates contain more information in stretched areas and are more robust against PSF-ghosting. In ultra-high-field MRI, an EPI image contains both compressed and stretched regions, depending on the B0 field inhomogeneity and local susceptibility. In this study, we present a new geometric distortion correction scheme, which selectively applies the shift map with the greater information content. We propose a PSF-ghost elimination method to generate an artifact-free pixel shift map along the nondistorted coordinates. The proposed method can correct the effects of local magnetic field inhomogeneity induced by susceptibility effects along with PSF-ghost artifact cancellation. We have experimentally demonstrated the advantages of the proposed method with EPI data acquisitions in a phantom and the human brain using 7-T MRI. Copyright © 2011 Wiley Periodicals, Inc.
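
    A minimal 1D sketch of applying a PSF-derived pixel-shift map along the phase-encode direction: each acquired sample is mapped back to its nondistorted position and the profile is resampled by linear interpolation. This stands in for the full PSF-based reconstruction, and it assumes the stated sign convention and a shift map without fold-over (y + shift strictly increasing).

    import numpy as np

    def unwarp_column(distorted, shift):
        """distorted[y] was recorded at index y; its true position is y + shift[y]."""
        y = np.arange(distorted.size, dtype=float)
        # Resample the distorted profile back onto the regular (nondistorted) grid.
        return np.interp(y, y + shift, distorted)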

  11. Fully automated laboratory and field-portable goniometer used for performing accurate and precise multiangular reflectance measurements

    Science.gov (United States)

    Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily

    2017-10-01

    Field-portable goniometers are created for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on the requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments and biconical reflectance factors in laboratory settings. Unlike some goniometers, this system features a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces, without compromising angular accuracy. The system also features a second upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading to measure the opposition effect, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems while not sacrificing consistency or repeatability in laboratory environments.

  12. Characterization and correction of eddy-current artifacts in unipolar and bipolar diffusion sequences using magnetic field monitoring

    Science.gov (United States)

    Chan, Rachel W.; von Deuster, Constantin; Giese, Daniel; Stoeck, Christian T.; Harmer, Jack; Aitken, Andrew P.; Atkinson, David; Kozerke, Sebastian

    2014-07-01

    Diffusion tensor imaging (DTI) of moving organs is gaining increasing attention but robust performance requires sequence modifications and dedicated correction methods to account for system imperfections. In this study, eddy currents in the "unipolar" Stejskal-Tanner and the velocity-compensated "bipolar" spin-echo diffusion sequences were investigated and corrected for using a magnetic field monitoring approach in combination with higher-order image reconstruction. From the field-camera measurements, increased levels of second-order eddy currents were quantified in the unipolar sequence relative to the bipolar diffusion sequence while zeroth and linear orders were found to be similar between both sequences. Second-order image reconstruction based on field-monitoring data resulted in reduced spatial misalignment artifacts and residual displacements of less than 0.43 mm and 0.29 mm (in the unipolar and bipolar sequences, respectively) after second-order eddy-current correction. Results demonstrate the need for second-order correction in unipolar encoding schemes but also show that bipolar sequences benefit from second-order reconstruction to correct for incomplete intrinsic cancellation of eddy-currents.

  13. Temporal B0 field variation effects on MRSI of the human prostate at 7 T and feasibility of correction using an internal field probe.

    Science.gov (United States)

    Arteaga de Castro, C S; Boer, V O; Luttje, M P; van der Velden, T A; Bhogal, A; van Vulpen, M; Luijten, P R; van der Heide, U A; Klomp, D W J

    2014-11-01

    Spectral degradations as a result of temporal field variations are observed in MRSI of the human prostate. Moving organs generate substantial temporal and spatial field fluctuations as a result of susceptibility mismatch with the surrounding tissue (i.e. periodic breathing, cardiac motion or random bowel motion). Nine patients with prostate cancer were scanned with an endorectal coil (ERC) on a 7-T MR scanner. Temporal B0 field variations were observed with fast dynamic B0 mapping in these patients. Simulations of dynamic B0 corrections were performed using zero- to second-order shim terms. In addition, the temporal B0 variations were applied to simulated MR spectra causing, on average, 15% underestimation of the choline/citrate ratio. Linewidth distortions and frequency shifts (up to 30 and 8 Hz, respectively) were observed. To demonstrate the concept of observing local field fluctuations in real time during MRSI data acquisition, a field probe (FP) tuned and matched for the ¹⁹F frequency was incorporated into the housing of the ERC. The data acquired with the FP were compared with the B0 field map data and used to correct the MRSI datasets retrospectively. The dynamic B0 mapping data showed variations of up to 30 Hz (0.1 ppm) over 72 s at 7 T. The simulated zero-order corrections, calculated as the root mean square, reduced the standard deviation (SD) of the dynamic variations by an average of 41%. When using second-order corrections, the reduction in the SD was, on average, 56%. The FP data showed the same variation range as the dynamic B0 data and the variation patterns corresponded. After retrospective correction, the MRSI data showed artifact reduction and improved spectral resolution. B0 variations can degrade the MRSI substantially. The simple incorporation of an FP into an ERC can improve prostate cancer MRSI without prior knowledge of the origin of the dynamic field distortions. Copyright © 2014 John Wiley & Sons, Ltd.

  14. Study of Static Magnetic Properties of Transformer Oil Based Magnetic Fluids for Various Technical Applications Using Demagnetizing Field Correction

    Directory of Open Access Journals (Sweden)

    Oana Maria Marinica

    2017-01-01

    Static magnetization data of eight transformer oil based magnetic fluid samples, with saturation magnetization ranging over a large interval from 9 kA/m to 90 kA/m, have been subjected to the demagnetizing field correction. Using the tabulated demagnetization factors and the differential magnetic susceptibility of the samples, the values of the radial magnetometric demagnetization factor were obtained for the particular case of the VSM880 magnetometer. It was found that the demagnetizing field correction leaves the saturation magnetization values unchanged, while the initial magnetic susceptibility of the magnetic fluid samples changes considerably. The mean magnetic diameter, obtained through magnetogranulometry from the measured data, is larger than that obtained from the corrected data, and the difference grows with increasing magnetic particle volume fraction.
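
    For reference, the correction itself is a one-liner: the internal field is H_int = H_app - N*M, so each measured point is re-plotted at the corrected field, and the intrinsic initial susceptibility follows chi_int = chi_app / (1 - N*chi_app). The sketch assumes SI units and a dimensionless magnetometric factor N.

    import numpy as np

    def demag_correct(h_applied, magnetization, n_factor):
        """Return (H_int, M): the magnetization curve re-plotted at the internal field."""
        h = np.asarray(h_applied, float)
        m = np.asarray(magnetization, float)
        return h - n_factor * m, m

    def intrinsic_susceptibility(chi_apparent, n_factor):
        # chi_int = chi_app / (1 - N * chi_app); saturation magnetization is unchanged.
        return chi_apparent / (1.0 - n_factor * chi_apparent)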

  15. Study of Static Magnetic Properties of Transformer Oil Based Magnetic Fluids for Various Technical Applications Using Demagnetizing Field Correction

    OpenAIRE

    Oana Maria Marinica

    2017-01-01

    Static magnetization data of eight transformer oil based magnetic fluid samples, with saturation magnetization ranging in a large interval from 9 kA/m to 90 kA/m, have been subjected to the demagnetizing field correction. Using the tabulated demagnetization factors and the differential magnetic susceptibility of the samples, the values of the radial magnetometric demagnetization factor were obtained in the particular case of VSM880 magnetometer. It was found that the demagnetizing field corre...

  16. Corrective measures technology for shallow land burial at arid sites: field studies of biointrusion barriers and erosion control

    Energy Technology Data Exchange (ETDEWEB)

    Nyhan, J.W.; Hakonson, T.E.; Lopez, E.A.

    1986-03-01

    The field research program involving corrective measures technologies for arid shallow land burial (SLB) sites is described. Results of field testing of a biointrusion barrier installed at a close-out waste disposal site (Area B) at Los Alamos are presented. Soil erosion and infiltration of water into a simulated trench cap with various surface treatments were measured, and the interaction between erosion control and subsurface water dynamics is discussed relative to waste management.

  17. Application of the iterative probe correction technique for a high-order probe in spherical near-field antenna measurements

    DEFF Research Database (Denmark)

    Laitinen, Tommi; Pivnenko, Sergey; Breinbjerg, Olav

    2006-01-01

    An iterative probe-correction technique for spherical near-field antenna measurements is examined. This technique has previously been shown to be well-suited for non-ideal first-order probes. In this paper, its performance in the case of a high-order probe (a dual-ridged horn) is evaluated.

  18. Recent progress in the field of automated welding applied to maintenance activities

    International Nuclear Information System (INIS)

    Cullafroz, M.

    2004-01-01

    Automated and robot welding has five advantages compared to manual welding: -) under some conditions, automated circular welding does not require the requalification testing that manual welding does; -) welding heads on robots are smaller than manual gear, so they can enter and treat complex piping; -) with an adequate viewing system, the operator can stay more than 10 meters away from the welding site, which cuts the radiation dose received by a factor of 1.5 to 2; -) whatever the configuration, the deposition rate in automated welding stays high, the quality standard is steady and the risk of repair is low; -) a gain in productivity if adequate equipment is used. In general, automated welding requires a TIG welding process and is applied in maintenance activities to: -) the main primary system and other circuits in austenitic stainless steels, -) the main secondary system and other circuits in low-percentage carbon steels, and -) the closure of spent fuel canisters. An application to the repair of BWR pipes is shown. (A.C.)

  19. Models of Automation surprise: results of a field survey in aviation

    NARCIS (Netherlands)

    De Boer, Robert; Dekker, Sidney

    2017-01-01

    Automation surprises in aviation continue to be a significant safety concern, and the community's search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration.

  20. Development of automated welding process for field fabrication of thick walled pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, U A

    1981-01-01

    Research on automatic welding processes for the fabrication of thick-walled pressure vessels continued. A literature review on the subject was completed. A laboratory study of criteria for judging acceptable root parameters continued. Equipment for a demonstration facility to test the components and processes of the automated welding system has been specified and is being obtained. (LCL)

  1. Findings from Seven Years of Field Performance Data for Automated Demand Response in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Mathieu, Johanna; Parrish, Kristen

    2010-05-14

    California is a leader in automating demand response (DR) to promote low-cost, consistent, and predictable electric grid management tools. Over 250 commercial and industrial facilities in California participate in fully-automated programs providing over 60 MW of peak DR savings. This paper presents a summary of Open Automated DR (OpenADR) implementation by each of the investor-owned utilities in California. It provides a summary of participation, DR strategies and incentives. Commercial buildings can reduce peak demand from 5 to 15 percent, with an average of 13 percent. Industrial facilities shed much higher loads. For buildings with multi-year savings we evaluate their load variability and shed variability. We provide a summary of control strategies deployed, along with costs to install automation. We report on how the electric DR control strategies perform over many years of events. We benchmark the peak demand of this sample of buildings against their past baselines to understand the differences in building performance over the years. This is done with peak demand intensities and load factors. The paper also describes the importance of these data in helping to understand possible techniques to reach net zero energy using peak day dynamic control capabilities in commercial buildings. We present an example in which the electric load shape changed as a result of a lighting retrofit.

  2. Unitarity corrections and high field strengths in high energy hard collisions

    International Nuclear Information System (INIS)

    Kovchegov, Y.V.; Mueller, A.H.

    1997-01-01

    Unitarity corrections to the BFKL description of high energy hard scattering are viewed in large-N_c QCD in light-cone quantization. In a center of mass frame, unitarity corrections to high energy hard scattering are manifestly perturbatively calculable and unrelated to questions of parton saturation. In a frame where one of the hadrons is initially at rest, unitarity corrections are related to parton saturation effects and involve potential strengths A_μ ∝ 1/g. In such a frame we describe the high energy scattering in terms of the expectation value of a Wilson loop. The large potentials A_μ ∝ 1/g are shown to be pure gauge terms, allowing perturbation theory to again describe unitarity corrections and parton saturation effects. Genuine nonperturbative effects only come in at energies well beyond those energies where unitarity constraints first become important. (orig.)

  3. Physiologic noise regression, motion regression, and TOAST dynamic field correction in complex-valued fMRI time series.

    Science.gov (United States)

    Hahn, Andrew D; Rowe, Daniel B

    2012-02-01

    As more evidence is presented suggesting that the phase, as well as the magnitude, of functional MRI (fMRI) time series may contain important information and that there are theoretical drawbacks to modeling functional response in the magnitude alone, removing noise in the phase is becoming more important. Previous studies have shown that retrospective correction of noise from physiologic sources can remove significant phase variance and that dynamic main magnetic field correction and regression of estimated motion parameters also remove significant phase fluctuations. In this work, we investigate the performance of physiologic noise regression in a framework along with correction for dynamic main field fluctuations and motion regression. Our findings suggest that including physiologic regressors provides some benefit in terms of reduction in phase noise power, but it is small compared to the benefit of dynamic field corrections and use of estimated motion parameters as nuisance regressors. Additionally, we show that the use of all three techniques reduces phase variance substantially, removes undesirable spatial phase correlations and improves detection of the functional response in magnitude and phase. Copyright © 2011 Elsevier Inc. All rights reserved.
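
    A minimal sketch of the nuisance-regression framework for one voxel's phase time series: a design matrix of motion estimates and physiologic (cardiac/respiratory) regressors, plus an intercept, is projected out by ordinary least squares. The regressor names are placeholders for whatever the pipeline provides; dynamic field correction would enter as additional columns.

    import numpy as np

    def regress_out(phase_ts, motion, physio):
        """Remove nuisance structure from a (T,) phase time series."""
        t = phase_ts.size
        design = np.column_stack([np.ones(t), motion, physio])  # (T, 1 + k + m)
        beta, *_ = np.linalg.lstsq(design, phase_ts, rcond=None)
        return phase_ts - design @ beta                         # residual phase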

  4. Intra-field on-product overlay improvement by application of RegC and TWINSCAN corrections

    Science.gov (United States)

    Sharoni, Ofir; Dmitriev, Vladimir; Graitzer, Erez; Perets, Yuval; Gorhad, Kujan; van Haren, Richard; Cekli, Hakki E.; Mulkens, Jan

    2015-03-01

    The on-product overlay specification and Advanced Process Control (APC) are getting extremely challenging, particularly after the introduction of multi-patterning applications like Spacer Assisted Double Patterning (SADP) and multi-patterning techniques like N-repetitive Litho-Etch steps (LE^N, N >= 2). When the latter is considered, most of the intra-field overlay contributors drop out of the overlay budget. This is a direct consequence of the fact that the scanner settings (like dose, illumination settings, etc.) as well as the subsequent processing steps can be made very similar for two consecutive Litho-Etch layers. The major overlay contributor that may require additional attention is the Image Placement Error (IPE). When the inter-layer overlay is considered, controlling the intra-field overlay contribution gets more complicated. In addition to the IPE contribution, the TWINSCAN™ lens fingerprint in combination with the exposure settings is going to play a role as well. Generally speaking, two subsequent functional layers have different exposure settings. This results in an additional (non-reticle) overlay contribution. In this paper, we have studied the wafer overlay correction capability of RegC® in addition to the TWINSCAN™ intra-field corrections to improve the on-product overlay performance. RegC® is a reticle intra-volume laser writing technique that causes a predictable deformation element (RegC® deformation element) inside the quartz (Qz) material of a reticle. This technique makes it possible to post-process an existing reticle to correct, for instance, for IPE. Alternatively, a pre-determined intra-field fingerprint can be added to the reticle such that it results in a straight field after exposure. This second application might be very powerful for correcting, for instance, (cold) lens fingerprints that cannot be corrected by the scanner itself. Another possible application is the intra-field processing fingerprint. One should realize that a RegC® treatment of a

  5. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  6. Speed of sound in a quark-gluon plasma with one loop correction in mean-field potential

    Science.gov (United States)

    Singh, S. Somorendro; Ramanathan, R.

    2018-02-01

    We study the thermodynamic properties and the speed of sound in a free-energy evolution of the quark-gluon plasma with a one-loop correction factor in the mean-field potential. The values of the thermodynamic properties, like pressure, entropy and specific heat, are calculated for a range of temperatures, and the results agree with recent lattice results. The speed of sound calculated with the one-loop correction is found to be c_s² = 0.3 at the parameters γ_q = 1/8 and γ_g = 8γ_q, for which the largest stable droplet is formed. The value agrees asymptotically with the lattice result for the speed of sound, indicating that the loop correction contributes to the speed of sound.
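
    For reference, a standard thermodynamic route from the free energy to the quantities quoted above (at vanishing chemical potential) is the following; the paper's own definitions may differ in detail:

    P = -\frac{\partial F}{\partial V}, \qquad
    s = \frac{\partial P}{\partial T}, \qquad
    c_v = \frac{\partial \varepsilon}{\partial T}, \qquad
    c_s^{2} = \frac{\partial P}{\partial \varepsilon}
            = \frac{\partial P/\partial T}{\partial \varepsilon/\partial T}
            = \frac{s}{c_v}.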

  7. Analysis and correction of intrinsic non-axisymmetric magnetic fields in high-β DIII-D plasmas

    International Nuclear Information System (INIS)

    Garofalo, A.M.; La Haye, R.J.; Scoville, J.T.

    2002-01-01

    Rapid plasma toroidal rotation, sufficient for stabilization of the n=1 resistive wall mode, can be sustained by improving the axisymmetry of the toroidal magnetic field geometry of DIII-D. The required symmetrization is determined experimentally both by optimizing currents in external n=1 correction coils with respect to the plasma rotation, and by use of the n=1 magnetic feedback to detect and minimize the plasma response to non-axisymmetric fields as β increases. Both methods point to an intrinsic ∼7 G (0.03% of the toroidal field), m/n=2/1 resonant helical field at the q=2 surface as the cause of the plasma rotation slowdown above the no-wall β limit. The drag exerted by this field on the plasma rotation is consistent with the behaviour of 'slipping' in a simple induction motor model. (author)

  8. A voxelation-corrected non-stationary 3D cluster-size test based on random field theory.

    Science.gov (United States)

    Li, Huanjie; Nickerson, Lisa D; Zhao, Xuna; Nichols, Thomas E; Gao, Jia-Hong

    2015-09-01

    Cluster-size tests (CSTs) based on random field theory (RFT) are commonly adopted to identify significant differences in brain images. However, the use of RFT in CSTs rests on the assumption of uniform smoothness (stationarity). When images are non-stationary, CSTs based on RFT will likely lead to increased false positives in smooth regions and reduced power in rough regions. An adjustment to the cluster size according to the local smoothness at each voxel has been proposed for the standard test based on RFT to address non-stationarity; however, this technique requires images with a large degree of spatial smoothing, large degrees of freedom and high intensity thresholding. Recently, we proposed a voxelation-corrected 3D CST based on Gaussian random field theory that does not place constraints on the degree of spatial smoothness. However, this approach is only applicable to stationary images, requiring further modification to enable use for non-stationary images. In this study, we present modifications of this method to develop a voxelation-corrected non-stationary 3D CST based on RFT. Both simulated and real data were used to compare the voxelation-corrected non-stationary CST to the standard cluster-size-adjusted non-stationary CST based on RFT and the voxelation-corrected stationary CST. We found that the voxelation-corrected stationary CST is liberal for non-stationary images, and that the voxelation-corrected non-stationary CST performs better than the cluster-size-adjusted non-stationary CST based on RFT under low smoothness, low intensity threshold and low degrees of freedom. Published by Elsevier Inc.

  9. Fator de correção para indivíduos com capacidade acomodativa baseado no uso do refrator automático Correction factor for individuals with accommodative capacity based on automated refractor

    Directory of Open Access Journals (Sweden)

    Rodrigo Ueno Takahagi

    2009-12-01

    PURPOSE: To determine a correction factor for the evaluation of refractive error without the use of cycloplegia. METHODS: A total of 623 patients (1,246 eyes) of both sexes, aged between 3 and 40 years, were studied. Static and dynamic refractometry were obtained using the Shin-Nippon Accuref-K 9001 automated refractor. Cycloplegia was induced by instilling one drop of 1% cyclopentolate eye drops, with static refractometry performed 30 minutes later. Data were analyzed statistically using linear regression and multiple regression models of the dioptric value with and without cycloplegia as a function of age. RESULTS: The correlation between dioptric values without and with cycloplegia for the astigmatic error ranged from 81.52% to 92.27%. For the spherical dioptric value the correlation was lower (53.57% to 87.78%), as it was for the astigmatism axis (28.86% to 58.80%). The multiple regression model as a function of age showed a higher multiple coefficient of determination for myopia (86.38%) and astigmatism (79.79%); the lowest coefficient was observed for the astigmatism axis (17.70%). CONCLUSION: Evaluating refractive errors with and without cycloplegia, a high correlation was observed for cylindrical ametropias. Mathematical equations were developed as correction factors for the refractometry of patients with cylindrical and spherical ametropias examined without cycloplegia.

  10. Analysis of the Failures and Corrective Actions for the LHC Cryogenics Radiation Tolerant Electronics and its Field Instruments

    CERN Document Server

    Balle, Ch; Vauthier, N

    2014-01-01

    The LHC cryogenic system radiation tolerant electronics and their associated field instruments have been in nominal conditions since before the commissioning of the first LHC beams in September 2008. This system is made of about 15'000 field instruments (thermometers, pressure sensors, liquid helium level gauges, electrical heaters and position switches), 7'500 electronic cards and 853 electronic crates. Since mid-2008, a software tool has been deployed that allows an operator to report a problem and then lists the corrective actions. The tool is a great help in detecting recurrent problems that may be tackled by a hardware or software consolidation. The corrective actions range from simple resets, exchange of defective equipment, repair of electrical connectors, etc. However, a recurrent problem that heals by itself is present on some channels. This type of fault is extremely difficult to diagnose; it appears as a temporary opening of an electrical circuit, and its duration can range from a few minutes to ...

  11. Can Lucifer Yellow Indicate Correct Permeability of Biological Cell Membrane under An Electric and Magnetic Field?

    OpenAIRE

    Pourmirjafari Firoozabadi, Tahereh; Shankayi, Zeinab; Izadi, Azam; Pourmirjafari Firoozabadi, Seyed Mohammad

    2015-01-01

    The effect of external magnetic and electric fields, in the range of electroporation and magnetoporation, on Lucifer Yellow (LY) fluorescence in the absence of cells is studied. Electric-field-induced quenching and a magnetic-field-induced increase are observed for the fluorescence intensity of LY. Given that field-induced variation of fluorescence can be observed even in the absence of cells, the application of LY as a marker is debatable in electroporation and magnetoporation...

  12. Can Lucifer Yellow Indicate Correct Permeability of Biological Cell Membrane under An Electric and Magnetic Field?

    Science.gov (United States)

    Pourmirjafari Firoozabadi, Tahereh; Shankayi, Zeinab; Izadi, Azam; Pourmirjafari Firoozabadi, Seyed Mohammad

    2015-01-01

    The effect of external magnetic and electric fields, in the range of electroporation and magnetoporation, on Lucifer Yellow (LY) fluorescence in the absence of cells is studied. Electric-field-induced quenching and a magnetic-field-induced increase are observed for the fluorescence intensity of LY. Given that field-induced variation of fluorescence can be observed even in the absence of cells, the application of LY as a marker is debatable in electroporation and magnetoporation techniques.

  13. OBSERVATIONS OF SIMILARITY THEORY STABILITY CORRECTION TERMS FOR MOMENTUM AND TEMPERATURE, OVER AGRICULTURAL FIELDS AND FORESTS.

    Science.gov (United States)

    Many observations of temperature and wind speed profiles have been taken over "ideal" terrain and analyzed to develop the stability correction terms which are commonly used in the application of similarity theory. Fewer observations have been taken and analyzed in this manner ov...

  14. Vertex corrections to the mean-field electrical conductivity in disordered electron systems

    Czech Academy of Sciences Publication Activity Database

    Pokorný, Vladislav; Janiš, Václav

    2013-01-01

    Vol. 25, No. 17 (2013), 175502-1–175502-10. ISSN 0953-8984. Institutional support: RVO:68378271. Keywords: disordered electron systems * electrical conductivity * vertex corrections. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 2.223, year: 2013

  15. On the covariant formalism of the effective field theory of gravity and leading order corrections

    DEFF Research Database (Denmark)

    Codello, Alessandro; Jain, Rajeev Kumar

    2016-01-01

    as gravity coupled to matter. By means of heat kernel methods we renormalize and compute the leading quantum corrections to quadratic order in a curvature expansion. The final effective action in our covariant formalism is generally non-local and can be readily used to understand the phenomenology...

  16. Voxel Spread Function (VSF) Method for Correction of Magnetic Field Inhomogeneity Effects in Quantitative Gradient-Echo-Based MRI

    Science.gov (United States)

    Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Luo, Jie; Wang, Xiaoqi

    2012-01-01

    Purpose Macroscopic magnetic field inhomogeneities adversely affect different aspects of MRI images. In quantitative MRI, when the goal is to quantify biological tissue parameters, they bias and often corrupt such measurements. The goal of this paper is to develop a method for correction of macroscopic field inhomogeneities that can be applied to a variety of quantitative gradient-echo-based MRI techniques. Methods We have re-analyzed a basic theory of gradient echo (GE) MRI signal formation in the presence of background field inhomogeneities and derived equations that allow for correction of magnetic field inhomogeneity effects based on the phase and magnitude of GE data. We verified our theory by mapping the R2* relaxation rate in computer-simulated, phantom, and in vivo human data collected with multi-GE sequences. Results The proposed technique takes into account voxel spread function (VSF) effects and allowed us to obtain R2* maps virtually free from artifacts for all simulated, phantom and in vivo data, except in edge areas with very steep field gradients. Conclusion The VSF method, allowing quantification of tissue-specific R2*-related properties, has the potential to breed new MRI biomarkers serving as surrogates for tissue biological properties, similar to the R1 and R2 relaxation rate constants widely used in clinical and research MRI. PMID:23233445

  17. Dose corrections for field obliquity for 45-MV x-ray therapy

    International Nuclear Information System (INIS)

    McGinley, P.H.; Clanton, A.; Downes, B.; Nuskind, J.

    1983-01-01

    The degree of dose perturbation produced by a 25.7-cm-diam circular water phantom was determined for a 45-MV x-ray beam by direct measurement. Data obtained in a circular and a cubical water phantom were utilized to test three accepted techniques (isodose shift, TAR method, and effective SSD method) for the correction of isodose levels to account for patient curvature. In general, the effective SSD method yielded the most accurate results for all depths, including the buildup region. An isodose shift factor of 0.8 was found for the 45-MV x-ray beam. Key words: curvature corrections, 45-MV x ray, isodose shift, TAR, effective SSD method

  18. Self-propelled in-tube shuttle and control system for automated measurements of magnetic field alignment

    International Nuclear Information System (INIS)

    Boroski, W.N.; Nicol, T.H.; Pidcoe, S.V.

    1990-03-01

    A magnetic field alignment gauge is used to measure the field angle as a function of axial position in each of the magnets for the Superconducting Super Collider (SSC). Present measurements are made by manually pushing the gauge through the magnet bore tube and stopping at intervals to record field measurements. Gauge location is controlled through graduation marks and alignment pins on the push rods. Field measurements are recorded on a logging multimeter with tape output. Described is a computerized control system being developed to replace the manual procedure for field alignment measurements. The automated system employs a pneumatic walking device to move the measurement gauge through the bore tube. Movement of the device, called the Self-Propelled In-Tube Shuttle (SPITS), is accomplished through an integral, gas driven, double-acting cylinder. The motion of the SPITS is transferred to the bore tube by means of a pair of controlled, retractable support feet. Control of the SPITS is accomplished through an RS-422 interface from an IBM-compatible computer to a series of solenoid-actuated air valves. Direction of SPITS travel is determined by the air-valve sequence, and is managed through the control software. Precise axial position of the gauge within the magnet is returned to the control system through an optically-encoded digital position transducer attached to the shuttle. Discussed is the performance of the transport device and control system during preliminary testing of the first prototype shuttle. 1 ref., 7 figs

  19. POGO satellite orbit corrections: an opportunity to improve the quality of the geomagnetic field measurements?

    DEFF Research Database (Denmark)

    Stockmann, Reto; Christiansen, Freddy; Olsen, Nils

    2015-01-01

    We present an attempt to improve the quality of the geomagnetic field measurements from the Polar Orbiting Geophysical Observatory (POGO) satellite missions in the late 1960s. Inaccurate satellite positions are believed to be a major source of errors for using the magnetic observations for field...

  20. Instrumentation, Field Network And Process Automation for the LHC Cryogenic Line Tests

    CERN Document Server

    Bager, T; Bertrand, G; Casas-Cubillos, J; Gomes, P; Parente, C; Riddone, G; Suraci, A

    2000-01-01

    This paper describes the cryogenic control system and associated instrumentation of the test facility for 3 pre-series units of the LHC Cryogenic Distribution Line. For each unit, the process automation is based on a Programmable Logic Controller implementing more than 30 closed control loops and handling alarms, interlocks and overall process management. More than 160 sensors and actuators are distributed over 150 m on a Profibus DP/PA network. Parameterization, calibration and diagnosis are remotely available through the bus. Considering the diversity, amount and geographical distribution of the instrumentation involved, this is a representative approach to the cryogenic control system for CERN's next accelerator.

  1. Automated data model evaluation

    International Nuclear Information System (INIS)

    Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana

    2012-01-01

    The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis, and its relations with ontology tools, have been presented. Key words: Database modeling, Data model correctness, Evaluation

  2. Fully automated prostate segmentation in 3D MR based on normalized gradient fields cross-correlation initialization and LOGISMOS refinement

    Science.gov (United States)

    Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter

    2012-02-01

    Manual delineation of the prostate is a challenging task for a clinician due to its complex and irregular shape. Furthermore, the need for precisely targeting the prostate boundary continues to grow. Planning for radiation therapy, MR-ultrasound fusion for image-guided biopsy, multi-parametric MRI tissue characterization, and context-based organ retrieval are examples where accurate prostate delineation can play a critical role in a successful patient outcome. Therefore, a robust automated full prostate segmentation system is desired. In this paper, we present an automated prostate segmentation system for 3D MR images. In this system, the prostate is segmented in two steps: the prostate displacement and size are first detected, and then the boundary is refined by a shape model. The detection approach is based on normalized gradient fields cross-correlation. This approach is fast, robust to intensity variation and provides good accuracy to initialize a prostate mean shape model. The refinement model is based on a graph-search-based framework, which contains both shape and topology information during deformation. We generated the graph cost using trained classifiers and used coarse-to-fine search and region-specific classifier training. The proposed algorithm was developed using 261 training images and tested on another 290 cases. The segmentation performance, with mean DSC ranging from 0.89 to 0.91 depending on the evaluation subset, demonstrates state-of-the-art performance. Running time for the system is about 20 to 40 seconds depending on image size and resolution.
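
    The normalized gradient fields cross-correlation used for initialization can be sketched compactly. The following is a minimal illustration, not the authors' implementation: it assumes 3D numpy volumes, and the smoothing parameter and template are placeholders.

```python
# Illustrative (not the authors' code): normalized gradient field (NGF)
# cross-correlation to localize a template in a 3D volume.
import numpy as np
from scipy.signal import fftconvolve

def normalized_gradient_field(vol, eps=1e-2):
    g = np.stack(np.gradient(vol.astype(float)))        # shape (3, z, y, x)
    norm = np.sqrt((g ** 2).sum(axis=0) + eps ** 2)     # eps damps flat regions
    return g / norm

def ngf_detect(image, template):
    """Return the voxel index maximizing the NGF cross-correlation score."""
    fi = normalized_gradient_field(image)
    ft = normalized_gradient_field(template)
    score = sum(fftconvolve(fi[c], ft[c][::-1, ::-1, ::-1], mode="same")
                for c in range(3))                      # correlate each component
    return np.unravel_index(np.argmax(score), score.shape)
```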

  3. Two Ramond-Ramond corrections to type II supergravity via field-theory amplitude

    Energy Technology Data Exchange (ETDEWEB)

    Bakhtiarizadeh, Hamid R. [Sirjan University of Technology, Department of Physics, Sirjan (Iran, Islamic Republic of)

    2017-12-15

    Motivated by the standard form of the string-theory amplitude, we calculate the field-theory amplitude to complete the higher-derivative terms in type II supergravity theories in their conventional form. We derive explicitly the O(α′³) interactions for the RR (Ramond-Ramond) fields with graviton, B-field and dilaton in the low-energy effective action of type II superstrings. We check our results by comparison with previous work that has been done by other methods, and we find exact agreement. (orig.)

  4. Particle crossing versus field crossing; a corrective response to Duff's recent account of string theory

    International Nuclear Information System (INIS)

    Schroer, Bert; FU-Berlin

    2012-02-01

    Using recent results of advanced quantum field theory, we confute some of M. Duff's claims about string theory which he wrote as an invited paper to the project 'Forty Years Of String Theory: Reflecting on the Foundations' (author)

  5. Monte Carlo-based diode design for correction-less small field dosimetry.

    Science.gov (United States)

    Charles, P H; Crowe, S B; Kairn, T; Knight, R T; Hill, B; Kenny, J; Langton, C M; Trapp, J V

    2013-07-07

    Due to their small collecting volume, diodes are commonly used in small field dosimetry. However, the relative sensitivity of a diode increases with decreasing small field size. Conversely, small air gaps have been shown to cause a significant decrease in the sensitivity of a detector as the field size is decreased. Therefore, this study uses Monte Carlo simulations to investigate introducing air upstream of diodes such that they measure with a constant sensitivity across all field sizes in small field dosimetry. Varying thicknesses of air were introduced onto the upstream end of two commercial diodes (PTW 60016 photon diode and PTW 60017 electron diode), as well as a theoretical unenclosed silicon chip, using field sizes as small as 5 mm × 5 mm. The metric D(w,Q)/D(Det,Q) used in this study represents the ratio of the dose to a point of water to the dose to the diode active volume, for a particular field size and location. The optimal thickness of air required to provide a constant sensitivity across all small field sizes was found by plotting D(w,Q)/D(Det,Q) as a function of introduced air gap size for various field sizes, and finding the intersection point of these plots; that is, the point at which D(w,Q)/D(Det,Q) was constant for all field sizes. The optimal thickness of air was calculated to be 3.3, 1.15 and 0.10 mm for the photon diode, electron diode and unenclosed silicon chip, respectively. The variation in these results was due to the different design of each detector. When calculated with the new diode design incorporating the upstream air gap, k_{Q_clin,Q_msr}^{f_clin,f_msr} was equal to unity to within statistical uncertainty (0.5%) for all three diodes. Cross-axis profile measurements were also improved with the new detector design. The upstream air gap could be implemented on the commercial diodes via a cap consisting of the air cavity surrounded by water-equivalent material. The results for the unenclosed silicon chip show that an ideal small
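
    The intersection-finding step can be illustrated with a short script. The numbers below are invented stand-ins for the simulated D(w,Q)/D(Det,Q) curves; the idea is simply to locate the air-gap thickness at which the curves for different field sizes agree.

```python
# Illustrative sketch: find the upstream air-gap thickness at which the
# dose-ratio curves for several field sizes intersect (values invented).
import numpy as np

gaps_mm = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
# rows: field sizes (e.g., 5, 10, 30 mm); columns: gap thicknesses
ratios = np.array([
    [0.950, 0.975, 0.995, 1.010, 1.030],   # 5 mm field
    [0.970, 0.985, 0.998, 1.008, 1.022],   # 10 mm field
    [0.990, 0.995, 1.000, 1.005, 1.012],   # 30 mm field
])

fine = np.linspace(gaps_mm[0], gaps_mm[-1], 1000)
curves = np.array([np.polyval(np.polyfit(gaps_mm, r, 2), fine) for r in ratios])
spread = curves.max(axis=0) - curves.min(axis=0)   # field-size dependence
best = fine[np.argmin(spread)]
print(f"optimal air gap ~ {best:.2f} mm (ratio spread {spread.min():.4f})")
```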

  6. High-resolution wide-field microscopy with adaptive optics for spherical aberration correction and motionless focusing.

    Science.gov (United States)

    Kner, P; Sedat, J W; Agard, D A; Kam, Z

    2010-02-01

    Live imaging in cell biology requires three-dimensional data acquisition with the best resolution and signal-to-noise ratio possible. Depth aberrations are a major source of image degradation in three-dimensional microscopy, causing a significant loss of resolution and intensity deep into the sample. These aberrations occur because of the mismatch between the sample refractive index and the immersion medium index. We have built a wide-field fluorescence microscope that incorporates a large-throw deformable mirror to simultaneously focus and correct for depth aberration in three-dimensional imaging. Imaging fluorescent beads in water and glycerol with an oil immersion lens, we demonstrate a corrected point spread function and a 2-fold improvement in signal intensity. We apply this new microscope to imaging biological samples, and show sharper images and improved deconvolution.

  7. Loop corrections and other many-body effects in relativistic field theories

    International Nuclear Information System (INIS)

    Ainsworth, T.L.; Brown, G.E.; Prakash, M.; Weise, W.

    1988-01-01

    Incorporation of effective masses into negative energy states (nucleon loop corrections) gives rise to repulsive many-body forces, as has been known for some time. Rather than renormalizing away the three- and four-body terms, we introduce medium corrections into the effective σ-exchange, which roughly cancel the nucleon loop terms for densities ρ ≅ ρ_nm, where ρ_nm is nuclear matter density. Going to higher densities, the repulsive contributions tend to saturate whereas the attractive ones keep on growing in magnitude. The latter is achieved through use of a density-dependent effective mass for the σ-particle, m_σ = m_σ(ρ), such that m_σ(ρ) decreases with increasing density. Such a behavior is seen e.g. in the Nambu-Jona-Lasinio model. It is argued that a smooth transition to chiral restoration implies a similar behavior. The resulting nuclear equation of state is, because of the self-consistency in the problem, immensely insensitive to changes in the mass or coupling constant of the σ-particle. (orig.)

  8. Evaluating the potential of automated telephony systems in rural communities: Field assessment for project Lwazi of HLT Meraka

    CSIR Research Space (South Africa)

    Gumede, T

    2008-11-01

    Full Text Available This study evaluates the potential role of automated telephony services in improving access to important government information and services. Our interviews, focus groups and surveys revealed that an automated telephony service could greatly support current government efforts...

  9. Development of Automated Image Analysis Tools for Verification of Radiotherapy Field Accuracy with AN Electronic Portal Imaging Device.

    Science.gov (United States)

    Dong, Lei

    1995-01-01

    The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field shaping devices with the patient must be repeated daily up to eighty times during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), patient's portal images can be visualized daily in real-time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were
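
    A minimal sketch of the moments idea follows: translation from the centroid of a binary field mask and orientation from its second central moments. The masks and numbers are illustrative, not from the dissertation.

```python
# Illustrative moments-based alignment: centroid gives translation,
# second central moments give the principal-axis rotation.
import numpy as np

def field_pose(mask):
    """Centroid (y, x) and principal-axis angle (degrees) of a binary mask."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    angle = 0.5 * np.degrees(np.arctan2(2 * mu11, mu20 - mu02))
    return (cy, cx), angle

# Aligning a daily portal field mask to the reference field (toy masks):
ref = np.zeros((100, 100), bool); ref[40:60, 30:70] = True
day = np.zeros((100, 100), bool); day[42:62, 33:73] = True
(cy0, cx0), a0 = field_pose(ref)
(cy1, cx1), a1 = field_pose(day)
print(f"shift = ({cy1 - cy0:.1f}, {cx1 - cx0:.1f}) px, rotation = {a1 - a0:.2f} deg")
```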

  10. Non linear field correction effects on the dynamic aperture of the FCC-hh

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00361058; Seryi, Andrei; Maclean, Ewen Hamish; Martin, Roman; Tomas Garcia, Rogelio

    2017-01-01

    The Future Circular Collider (FCC) design study aims to develop the designs of possible circular colliders in the post-LHC era. In particular, the FCC-hh will aim to produce proton-proton collisions at a center of mass energy of 100 TeV. Given the large beta functions and integrated length of the quadrupoles of the final focus triplet, the effect of systematic and random non-linear errors in the magnets is expected to have a severe impact on the stability of the beam. Following the experience on the HL-LHC, this work explores the implementation of non-linear correctors to minimize the resonance driving terms arising from the errors of the triplet. Dynamic aperture studies are then performed to study the impact of this correction.

  11. Online corrections - Evidence based practice utilizing electronic portal imaging to improve the accuracy of field placement for locally advanced prostate cancer

    International Nuclear Information System (INIS)

    Middleton, M.; Medwell, S.; Rolfo, A.; Joon, M.L.

    2003-01-01

    The requirement of accurate field placement in the treatment of locally advanced prostate cancer is of great significance given the onset of dose escalation and increased Planning Target Volume (PTV) conformity. With these factors in mind, it becomes essential to ensure accurate field placement for the duration of a course of radiotherapy. This study examines the role of Online Corrections in increasing the accuracy of field placement, utilizing Varian Vision EPI equipment. The study also examines the hypothetical scenario of the effect on three-dimensional computer dosimetry if Online Corrections were not performed, incorporating TCP and NTCP data. Field placement data were collected on patients receiving radical radiotherapy to the prostate utilizing the Varian Vision™ EPI software. Both intra- and inter-field data were collected, with Online Corrections being carried out within the confines of the BAROC PROSTATE EPI POLICY. Analysis was performed on the data to illustrate the value of Online Corrections in the pursuit of accurate field placement. This evidence was further supported by computer dosimetry presenting the worst possible impact upon a patient's total course of treatment if Online Corrections were not performed. The use of Online Corrections can prove to be of enormous benefit to both patient and practitioner. For centres with the available technology, it places the responsibility of field placement upon the Radiation Therapist. This responsibility in turn impacts on the education, training and empowerment of the Radiation Therapy group. These are issues of the utmost importance to centres considering the use of Online Corrections.

  12. Automated extraction of faults and porous reservoir bodies. Examples from the Valhall Field

    Energy Technology Data Exchange (ETDEWEB)

    Barkved, Olav Inge; Whitman, Doug; Kunz, Tim

    1998-12-31

    The Norwegian Valhall field is located 250 km south-west of Stavanger. Production is primarily from the highly porous and fractured chalk of the Tor formation. Fractures evidently play a significant role in enhancing flow properties, as production rates are significantly higher than expected from matrix permeability alone. The fractures are primarily tectonically induced and related to faulting. Syn-depositional faulting is believed to be a controlling factor on the reservoir thickness variations observed across the field. Due to the low acoustic contrast and weak appearance of the highly porous chalk, direct evidence of faulting in well bore logs is limited. The seismic data quality in the most central area of the field is very poor due to tertiary gas charging, but in the flank area of the field the quality is excellent. 1 ref., 5 figs.

  13. On random field Completely Automated Public Turing Test to Tell Computers and Humans Apart generation.

    Science.gov (United States)

    Kouritzin, Michael A; Newton, Fraser; Wu, Biao

    2013-04-01

    Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
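
    A toy version of the Gibbs-resampling loop is sketched below. The energy terms and parameters are invented for illustration; they stand in for the site marginals and covariances embedded in the paper's conditional probabilities.

```python
# Toy sketch of Gibbs resampling toward a word mask: re-draw each site of a
# binary random field conditioned on its neighbours plus a pull toward the
# target CAPTCHA word. All parameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_step(field, target, beta=1.5, gamma=2.0):
    h, w = field.shape
    for y in range(h):
        for x in range(w):
            nb = field[max(y-1, 0):y+2, max(x-1, 0):x+2].sum() - field[y, x]
            # local energy: neighbour agreement + pull toward the word mask
            e1 = beta * nb + gamma * target[y, x]
            p1 = 1.0 / (1.0 + np.exp(-(e1 - beta * 4)))   # logistic conditional
            field[y, x] = rng.random() < p1
    return field

word_mask = np.zeros((20, 60)); word_mask[8:13, 10:50] = 1.0   # stand-in "word"
field = rng.random((20, 60)) < 0.5                              # scattered pieces
for _ in range(30):
    field = gibbs_step(field, word_mask)   # word gradually becomes readable
```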

  14. Rapid mapping of compound eye visual sampling parameters with FACETS, a highly automated wide-field goniometer.

    Science.gov (United States)

    Douglass, John K; Wehling, Martin F

    2016-12-01

    A highly automated goniometer instrument (called FACETS) has been developed to facilitate rapid mapping of compound eye parameters for investigating regional visual field specializations. The instrument demonstrates the feasibility of analyzing the complete field of view of an insect eye in a fraction of the time required if using non-motorized, non-computerized methods. Faster eye mapping makes it practical for the first time to employ sample sizes appropriate for testing hypotheses about the visual significance of interspecific differences in regional specializations. Example maps of facet sizes are presented from four dipteran insects representing the Asilidae, Calliphoridae, and Stratiomyidae. These maps provide the first quantitative documentation of the frontal enlarged-facet zones (EFZs) that typify asilid eyes, which, together with the EFZs in male Calliphoridae, are likely to be correlated with high-spatial-resolution acute zones. The presence of EFZs contrasts sharply with the almost homogeneous distribution of facet sizes in the stratiomyid. Moreover, the shapes of EFZs differ among species, suggesting functional specializations that may reflect differences in visual ecology. Surveys of this nature can help identify species that should be targeted for additional studies, which will elucidate fundamental principles and constraints that govern visual field specializations and their evolution.

  15. Arbitrary magnetic field gradient waveform correction using an impulse response based pre-equalization technique.

    Science.gov (United States)

    Goora, Frédéric G; Colpitts, Bruce G; Balcom, Bruce J

    2014-01-01

    The time-varying magnetic fields used in magnetic resonance applications result in the induction of eddy currents on conductive structures in the vicinity of both the sample under investigation and the gradient coils. These eddy currents typically result in undesired degradations of image quality for MRI applications. Their ubiquitous nature has resulted in the development of various approaches to characterize and minimize their impact on image quality. This paper outlines a method that utilizes the magnetic field gradient waveform monitor method to directly measure the temporal evolution of the magnetic field gradient from a step-like input function and extracts the system impulse response. With the basic assumption that the gradient system is sufficiently linear and time invariant to permit system theory analysis, the impulse response is used to determine a pre-equalized (optimized) input waveform that provides a desired gradient response at the output of the system. An algorithm has been developed that calculates a pre-equalized waveform that may be accurately reproduced by the amplifier (is physically realizable) and accounts for system limitations including system bandwidth, amplifier slew rate capabilities, and noise inherent in the initial measurement. Significant improvements in magnetic field gradient waveform fidelity after pre-equalization have been realized and are summarized. Copyright © 2013 Elsevier Inc. All rights reserved.
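
    Under the LTI assumption stated in the abstract, pre-equalization amounts to deconvolving the desired waveform by the measured impulse response. The sketch below uses a Tikhonov-regularized frequency-domain inverse as a stand-in for the paper's constrained algorithm; the system model and waveforms are toy examples.

```python
# Toy sketch under the LTI assumption: frequency-domain pre-equalization
# of a gradient waveform by the measured impulse response.
import numpy as np

def pre_equalize(desired, impulse_response, lam=1e-3):
    """Return input u such that (u convolved with h) approximates desired."""
    n = len(desired) + len(impulse_response)
    H = np.fft.rfft(impulse_response, n)
    D = np.fft.rfft(desired, n)
    U = D * np.conj(H) / (np.abs(H) ** 2 + lam)   # Tikhonov-regularized inverse
    return np.fft.irfft(U, n)[:len(desired)]

dt = 1e-5                                          # 10 us sampling, illustrative
t = np.arange(0, 5e-3, dt)
h = np.exp(-t / 3e-4); h /= h.sum()                # toy first-order system response
desired = np.clip(t / 1e-3, 0.0, 1.0)              # ramp-and-hold target gradient
u = pre_equalize(desired, h)
achieved = np.convolve(u, h)[:len(desired)]
print(f"max deviation after pre-equalization: {np.abs(achieved - desired).max():.4f}")
```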

  16. First-order correction terms in the weak-field asymptotic theory of tunneling ionization

    DEFF Research Database (Denmark)

    Trinh, Vinh H.; Tolstikhin, Oleg I.; Madsen, Lars Bojer

    2013-01-01

    of the WFAT at the quantitative level toward stronger fields, practically up to the boundary between tunneling and over-the-barrier regimes of ionization. The results apply to any atom or molecule treated in the single-active-electron and frozen-nuclei approximations. The theory is illustrated by calculations...... for hydrogen and noble-gas atoms....

  17. Effect of error field correction coils on W7-X limiter loads

    Science.gov (United States)

    Bozhenkov, S. A.; Jakubowski, M. W.; Niemann, H.; Lazerson, S. A.; Wurden, G. A.; Biedermann, C.; Kocsis, G.; König, R.; Pisano, F.; Stephey, L.; Szepesi, T.; Wenzel, U.; Pedersen, T. S.; Wolf, R. C.; W7-X Team

    2017-12-01

    In the first campaign Wendelstein 7-X was operated with five poloidal graphite limiters installed stellarator-symmetrically. In an ideal situation the power losses would be equally distributed between the limiters. The limiter shape was designed to smoothly distribute the heat flux over two strike lines. Vertically the strike lines are not uniform because of different connection lengths. In this paper it is demonstrated both numerically and experimentally that the heat flux distribution can be significantly changed by a non-resonant n=1 perturbation field of the order of 10⁻⁴. Numerical studies are performed with field line tracing. In experiments perturbation fields are excited with five error field trim coils. The limiters are diagnosed with infrared cameras, neutral gas pressure gauges, thermocouples and spectroscopic diagnostics. Experimental results are qualitatively consistent with the simulations. With a suitable choice of the phase and amplitude of the perturbation a more symmetric plasma-limiter interaction can potentially be achieved. These results are also of interest for the later W7-X divertor operation.

  18. Geometric corrections due to inhomogeneous field in the magnetospheric double current layer

    International Nuclear Information System (INIS)

    Callebaut, D.K.; Van den Buys, A.M.

    1985-01-01

    The case of oblique incidence and of a slope in the magnetic field for plane-parallel models of the magnetospheric double layer is considered. The two models are the Magnetospheric Double Layer (MDL) and the Magnetospheric Double Current Layer (MDCL). The latter is more appropriate, but due to some approximations it sometimes gives incorrect results. An improved model uses a triple current layer. (R.P.)

  19. Development of a new error field correction coil (C-coil) for DIII-D

    International Nuclear Information System (INIS)

    Robinson, J.I.; Scoville, J.T.

    1995-12-01

    The C-coil recently installed on the DIII-D tokamak was developed to reduce the error fields created by imperfections in the location and geometry of the existing coils used to confine, heat, and shape the plasma. First results from C-coil experiments include stable operation in a 1.6 MA plasma with a density less than 1.0 × 10¹³ cm⁻³, nearly a factor of three lower density than that achievable without the C-coil. The C-coil has also been used in magnetic braking of the plasma rotation and high energy particle confinement experiments. The C-coil system consists of six individual saddle coils, each 60° wide toroidally, spanning the midplane of the vessel with a vertical height of 1.6 m. The coils are located at a major radius of 3.2 m, just outside of the toroidal field coils. The actual shape and geometry of each coil section varied somewhat from the nominal dimensions due to the large number of obstructions to the desired coil path around the already crowded tokamak. Each coil section consists of four turns of 750 MCM insulated copper cable banded with stainless steel straps within the web of a 3 in. x 3 in. stainless steel angle frame. The C-coil structure was designed to resist peak transient radial forces (up to 1,800 Nm) exerted on the coil by the toroidal and poloidal fields. The coil frames were supported from existing poloidal field coil case brackets, coil studs, and various other structures on the tokamak

  20. Electric field gradients in cuprates: Does LDA+U give the correct charge distribution?

    Czech Academy of Sciences Publication Activity Database

    Blaha, P.; Schwarz, K.; Novák, Pavel

    2005-01-01

    Vol. 101 (2005), pp. 550-566. ISSN 0020-7608. R&D Projects: GA ČR(CZ) GA202/03/0552. Institutional research plan: CEZ:AV0Z10100521. Keywords: cuprates * electric field gradients (EFG) * density functional theory (DFT) * LDA+U * band structure. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.192, year: 2005

  1. Quantum corrections to the classical model of the atom-field system.

    Science.gov (United States)

    Ugulava, A; McHedlishvili, G; Chkhaidze, S; Chotorlishvili, L

    2011-10-01

    The nonlinear-oscillating system in action-angle variables is characterized by the dependence of frequency of oscillation ω(I) on action I. Periodic perturbation is capable of realizing in the system a stable nonlinear resonance at which the action I adapts to the resonance condition ω(I(0))≃ω, that is, "sticking" in the resonance frequency. For a particular physical problem there may be a case when I≫ℏ is the classical quantity, whereas its correction ΔI≃ℏ is the quantum quantity. Naturally, dynamics of ΔI is described by the quantum equation of motion. In particular, in the moderate nonlinearity approximation ɛ≪(dω/dI)(I/ω)≪1/ɛ, where ɛ is the small parameter, the description of quantum state is reduced to the solution of the Mathieu-Schrödinger equation. The state formed as a result of sticking in resonance is an eigenstate of the operator ΔI that does not commute with the Hamiltonian H. Expanding the eigenstate wave functions in Hamiltonian eigenfunctions, one can obtain a probability distribution of energy level population. Thus, an inverse level population for times lower than the relaxation time can be obtained.

  2. No Substitute for Going to the Field: Correcting Lidar DEMs in Salt Marshes

    Science.gov (United States)

    Renken, K.; Morris, J. T.; Lynch, J.; Bayley, H.; Neil, A.; Rasmussen, S.; Tyrrell, M.; Tanis, M.

    2016-12-01

    Models that forecast the response of salt marshes to current and future trends in sea level rise are increasingly used to guide management of these vulnerable ecosystems. Lidar-derived DEMs serve as the foundation for modeling landform change. However, caution is advised when using these DEMs as the starting point for models of salt marsh evolution. While broad vegetation class (i.e., young forest, old forest, grasslands, desert, etc.) has proven to be a significant predictor of vertical displacement error in terrestrial environments, differentiating error among different species or community types within the same ecosystem has received less attention. Salt marshes are dominated by monocultures of grass species and thus are an ideal environment in which to examine the within-species effect on lidar DEM error. We analyzed error of lidar DEMs using elevations from real-time kinematic (RTK) surveys in salt marshes in multiple national parks and wildlife refuge areas from the mouth of the Chesapeake Bay to Massachusetts. Error of the lidar DEMs was sometimes large, on the order of 0.25 m, and varied significantly between sites because vegetation cover varies seasonally and lidar data were not always collected in the same season for each park. Vegetation cover and composition were used to explain differences between RTK elevations and lidar DEMs. This research underscores the importance of collecting RTK elevation data and vegetation cover data coincident with lidar data to produce correction factors specific to individual salt marsh sites.
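
    The correction-factor idea reduces to estimating a per-vegetation-class DEM bias from coincident RTK points. A minimal sketch with hypothetical column names and made-up values:

```python
# Minimal sketch (hypothetical column names and values): deriving
# per-vegetation-class correction offsets for a lidar DEM from RTK points.
import pandas as pd

# survey points: RTK ground elevation, lidar DEM elevation, vegetation class
pts = pd.DataFrame({
    "rtk_z":   [0.82, 0.79, 0.95, 1.01, 0.66, 0.70],
    "lidar_z": [1.04, 1.02, 1.08, 1.21, 0.78, 0.85],
    "veg":     ["S. alterniflora", "S. alterniflora", "S. patens",
                "S. patens", "mudflat", "mudflat"],
})
pts["error"] = pts["lidar_z"] - pts["rtk_z"]          # DEM bias per point
correction = pts.groupby("veg")["error"].mean()       # class-specific offset
print(correction)                                     # subtract from DEM by class
```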

  3. Hartree corrections in a mean-field limit for fermions with Coulomb interaction

    Science.gov (United States)

    Petrat, Sören

    2017-06-01

    We consider the many-body dynamics of fermions with Coulomb interaction in a mean-field scaling limit where the kinetic and potential energy are of the same order for large particle numbers. In the considered limit the spatial variation of the mean-field is small. We prove two results about this scaling limit. First, due to the small variation, i.e., small forces, we show that the many-body dynamics can be approximated by the free dynamics with an appropriate phase factor with the conjectured optimal error term. Second, we show that the Hartree dynamics gives a better approximation with a smaller error term. In this sense, assuming that the error term in the first result is optimal, we derive the Hartree equations from the many-body dynamics with Coulomb interaction in a mean-field scaling limit.

  4. Coulomb-corrected Volkov-type solution for an electron in an intense circularly polarized laser field

    Science.gov (United States)

    Bauer, Jaroslaw

    2001-04-01

    A simple analytical approximation exists for the wavefunction of an unbound electron interacting both with a strong circularly polarized laser field and an atomic Coulomb potential (Reiss and Krainov 1994 Phys. Rev. A 50 R910). This wavefunction is the Volkov state with a first-order Coulomb correction coming from a perturbative expansion of the potential in the Kramers-Henneberger reference frame. The expansion is valid if the distance from the centre of the Coulomb force is smaller than the classical radius of motion of a free electron in a plane-wave field. We improve the approximate Coulomb-Volkov wavefunction by including the next term in the perturbative expansion of the atomic potential.

  5. Automating the mean-field method for large dynamic gossip networks

    NARCIS (Netherlands)

    Bakhshi, Rena; Endrullis, Jörg; Endrullis, Stefan; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    We investigate an abstraction method, called mean-field method, for the performance evaluation of dynamic networks with pairwise communication between nodes. It allows us to evaluate systems with very large numbers of nodes, that is, systems of a size where traditional performance evaluation

  6. Automated TV based system for open field studies: Effects of methamphetamine

    NARCIS (Netherlands)

    Tanger, H.J.; Vanwersch, R.A.P.; Wolthuis, O.L.

    1978-01-01

    A method is described whereby open field behaviour of rats can be automatically registered using a TV camera, a video converter, an X-Y recorder and a papertape puncher. Use is made of the scanning properties of the TV camera to obtain the X and Y coordinates of the rat's position and to print this

  7. Cubic Dresselhaus interaction parameter from quantum corrections to the conductivity in the presence of an in-plane magnetic field

    Science.gov (United States)

    Marinescu, D. C.

    2017-09-01

    We evaluate the quantum corrections to the conductivity of a two-dimensional electron system with competing Rashba (R) and linear and cubic Dresselhaus (D) spin-orbit interactions in the presence of an in-plane magnetic field B . Within a perturbative approximation, we investigate the interplay between the spin-orbit coupling and the magnetic field in determining the transport regime in two different limiting scenarios: when only one of the linear terms, either Rashba or Dresselhaus, dominates, and at equal linear couplings, when the cubic Dresselhaus breaks the spin symmetry. In each instance, we find that for B higher than a critical value, the antilocalization correction is suppressed and the effective dephasing time saturates to a constant value determined only by the spin-orbit interaction. At equal R-D linear couplings, this value is directly proportional with the cubic Dresselhaus contribution. In the same regime, the magnetoconductivity is expressed as a simple logarithmic function dependent only on the cubic Dresselhaus constant.

  8. Bright field microscopy as an alternative to whole cell fluorescence in automated analysis of macrophage images.

    Directory of Open Access Journals (Sweden)

    Jyrki Selinummi

    2009-10-01

    Full Text Available Fluorescence microscopy is the standard tool for detection and analysis of cellular phenomena. This technique, however, has a number of drawbacks such as the limited number of available fluorescent channels in microscopes, overlapping excitation and emission spectra of the stains, and phototoxicity. We here present and validate a method to automatically detect cell population outlines directly from bright field images. By imaging samples with several focus levels forming a bright field z-stack, and by measuring the intensity variations of this stack over the z-dimension, we construct a new two-dimensional projection image of increased contrast. With additional information for locations of each cell, such as stained nuclei, this bright field projection image can be used instead of whole cell fluorescence to locate borders of individual cells, separating touching cells, and enabling single cell analysis. Using the popular CellProfiler freeware cell image analysis software, mainly targeted for fluorescence microscopy, we validate our method by automatically segmenting low contrast and rather complex shaped murine macrophage cells. The proposed approach frees up a fluorescence channel, which can be used for subcellular studies. It also facilitates cell shape measurement in experiments where whole cell fluorescent staining is either not available, or is dependent on a particular experimental condition. We show that whole cell area detection results using our projected bright field images match closely to the standard approach where cell areas are localized using fluorescence, and conclude that the high contrast bright field projection image can directly replace one fluorescent channel in whole cell quantification. Matlab code for calculating the projections can be downloaded from the supplementary site: http://sites.google.com/site/brightfieldorstaining
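
    The projection itself is a one-liner once the z-stack is in memory: the per-pixel intensity variation over the focus dimension becomes the new image. A minimal sketch with a synthetic stack (the published Matlab code is the reference implementation):

```python
# Minimal sketch of the contrast projection: per-pixel intensity variation
# across a bright-field focus series yields a high-contrast 2D image.
import numpy as np

def contrast_projection(stack):
    """stack: (n_z, h, w) bright-field focus series -> (h, w) projection."""
    return stack.astype(float).std(axis=0)   # variation across focus levels

# Toy stack: an out-of-focus blob varies across z, the background does not.
rng = np.random.default_rng(1)
stack = rng.normal(100, 1, (11, 64, 64))
yy, xx = np.mgrid[:64, :64]
blob = ((yy - 32) ** 2 + (xx - 32) ** 2) < 15 ** 2
for z in range(11):
    stack[z][blob] += 20 * np.sin(z / 10 * np.pi)   # focus-dependent intensity
proj = contrast_projection(stack)                    # cells bright, background dark
```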

  9. Corrections for a constant radial magnetic field in the muon g - 2 and electric-dipole-moment experiments in storage rings

    Energy Technology Data Exchange (ETDEWEB)

    Silenko, Alexander J. [Belarusian State University, Research Institute for Nuclear Problems, Minsk (Belarus); Joint Institute for Nuclear Research, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation)

    2017-10-15

    We calculate the corrections for a constant radial magnetic field in muon g - 2 and electric-dipole-moment experiments in storage rings. While the correction is negligible for the current generation of g - 2 experiments, it affects the upcoming muon electric-dipole-moment experiment at Fermilab. (orig.)

  10. Edge-Corrected Mean-Field Hubbard Model: Principle and Applications in 2D Materials

    Science.gov (United States)

    Zhang, Xi; Wang, Tianlei; Chen, Wencong; Wang, Sanmei; Peng, Da

    2017-05-01

    This work reviews the current progress of tight-binding methods and the recent edge-modified mean-field Hubbard model. Undercoordinated atoms and nonbonding electrons exist widely in nanomaterials and in network-structural materials, with their impact underestimated. A quantum theory was proposed to calculate the undercoordination effects on the electronic structure of materials by incorporating the bond order-length-strength (BOLS) correlation theory into the mean-field Hubbard model, i.e. BOLS-HM. Consistency between the BOLS-HM calculation and density functional theory (DFT) calculation on 2D materials verified that i) bond contractions and potential well depression occur at the edge of graphene, phosphorene, and antimonene nanoribbons; ii) the physical origin of the band gap opening of graphene, phosphorene, and antimonene nanoribbons lies in the enhancement of edge potentials and hopping integrals due to the shorter and stronger bonds between undercoordinated atoms; iii) the band gap of 2D material nanoribbons expands as the width decreases due to the increasing under-coordination effects of edges which modulate the conductive behaviors; and iv) nonbond electrons at the edges and atomic vacancies of 2D material accompanied with the broken bond contribute to the Dirac-Fermi polaron (DFP) with a local magnetic moment.

  11. Edge-Corrected Mean-Field Hubbard Model: Principle and Applications in 2D Materials

    Directory of Open Access Journals (Sweden)

    Xi Zhang

    2017-05-01

    Full Text Available This work reviews the current progress of tight-binding methods and the recent edge-modified mean-field Hubbard model. Undercoordinated atoms (atoms not fully coordinated) exist at a high rate in nanomaterials with their impact overlooked. A quantum theory was proposed to calculate the electronic structure of nanomaterials by incorporating bond order-length-strength (BOLS) correlation into the mean-field Hubbard model, i.e., BOLS-HM. Consistency between the BOLS-HM calculation and density functional theory (DFT) calculation on 2D materials verified that (i) bond contractions and potential well depression occur at the edge of graphene, phosphorene, and antimonene nanoribbons; (ii) the physical origin of the band gap opening of graphene, phosphorene, and antimonene nanoribbons lies in the enhancement of edge potentials and hopping integrals due to the shorter and stronger bonds between undercoordinated atoms; (iii) the band gap of 2D material nanoribbons expands as the width decreases due to the increasing under-coordination effects of edges which modulate the conductive behaviors; and (iv) non-bond electrons at the edges and atomic vacancies of 2D material accompanied with the broken bond contribute to the Dirac-Fermi polaron (DFP) with a local magnetic moment.

  12. Prisons and Correctional Facilities, Located during MicroData field address collection 2004-2006. Kept in Spillman database for retrieval., Published in 2004, Vilas County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Prisons and Correctional Facilities dataset current as of 2004. Located during MicroData field address collection 2004-2006. Kept in Spillman database for retrieval..

  13. Automated TV-based system for open field studies: effects of methamphetamine.

    Science.gov (United States)

    Tanger, H J; Vanwersch, R A; Wolthuis, O L

    1978-10-01

    A method is described whereby open field behaviour of rats can be automatically registered using a TV camera, a video converter, an X-Y recorder and a papertape puncher. Use is made of the scanning properties of the TV camera to obtain the X and Y coordinates of the rat's position and to print this position on an X-Y recorder to obtain the running pattern. In addition, the X and Y coordinates at 1 sec intervals are punched on papertape. With computer processing of the tape, one can obtain--for any given period--the distance run, a frequency distribution of speeds, the number of entries into an inner field, the time spent in an inner field as well as the number of changes in corner positions. As an example the effects of 1 and 2 mg/kg methamphetamine are shown. This drug enhances all parameters measured in a dose-dependent fashion, except the changes in corner positions, which were not altered significantly.
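
    The tape post-processing described here maps directly onto a few lines of array code. The sketch below, with an invented track and illustrative thresholds, computes distance run, a speed distribution, and inner-field entries from 1 s position samples.

```python
# Illustrative sketch: open-field metrics from (x, y) positions sampled
# at 1 s intervals. The inner-field box and speed bins are placeholders.
import numpy as np

def open_field_metrics(xy, inner_box=((25, 75), (25, 75))):
    xy = np.asarray(xy, float)
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # cm per 1-s sample
    (x0, x1), (y0, y1) = inner_box
    inside = (xy[:, 0] >= x0) & (xy[:, 0] <= x1) & (xy[:, 1] >= y0) & (xy[:, 1] <= y1)
    entries = int(np.sum(inside[1:] & ~inside[:-1]))      # outside -> inside crossings
    return {
        "distance": steps.sum(),
        "speed_hist": np.histogram(steps, bins=[0, 2, 5, 10, 20, 50])[0],
        "inner_entries": entries,
        "inner_time_s": int(inside.sum()),
    }

rng = np.random.default_rng(2)
path = np.cumsum(rng.normal(0, 3, (600, 2)), axis=0) % 100   # toy 10-min track
print(open_field_metrics(path))
```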

  14. Scott correction for large atoms and molecules in a self-generated magnetic field

    DEFF Research Database (Denmark)

    Erdös, Laszlo; Fournais, Søren; Solovej, Jan Philip

    2012-01-01

    We consider a large neutral molecule with total nuclear charge Z in non-relativistic quantum mechanics with a self-generated classical electromagnetic field. To ensure stability, we assume that Zα² ≤ κ₀ for a sufficiently small κ₀, where α denotes the fine structure constant. We show that, in the simultaneous limit Z → ∞, α → 0 such that κ = Zα² is fixed, the ground state energy of the system is given by a two-term expansion c₁Z^{7/3} + c₂(κ)Z² + o(Z²). The leading term is given by the non-magnetic Thomas-Fermi theory. Our result shows

  15. Quantum-corrected plasmonic field analysis using a time domain PMCHWT integral equation

    KAUST Repository

    Uysal, Ismail E.

    2016-03-13

    When two structures are within sub-nanometer distance of each other, quantum tunneling, i.e., electrons "jumping" from one structure to another, becomes relevant. Classical electromagnetic solvers do not directly account for this additional path of current. In this work, an auxiliary tunnel made of Drude material is used to "connect" the structures as a support for this current path (R. Esteban et al., Nat. Commun., 2012). The plasmonic fields on the resulting connected structure are analyzed using a time domain surface integral equation solver. Time domain samples of the dispersive medium Green function and the dielectric permittivities are computed from the analytical inverse Fourier transform applied to the rational function representation of their frequency domain samples.

  16. Comparison of automated repetitive-sequence-based polymerase chain reaction and spa typing versus pulsed-field gel electrophoresis for molecular typing of methicillin-resistant Staphylococcus aureus.

    Science.gov (United States)

    Church, Deirdre L; Chow, Barbara L; Lloyd, Tracie; Gregson, Daniel B

    2011-01-01

    Automated repetitive polymerase chain reaction (PCR) (DiversiLab, bioMérieux, St. Laurent, Quebec, Canada) and single locus sequence typing of the Staphylococcus protein A (spa) gene with spa-type assignment by StaphType RIDOM software were compared to pulsed-field gel electrophoresis (PFGE) as the "gold standard" method for methicillin-resistant Staphylococcus aureus (MRSA) typing. Fifty-four MRSA isolates were typed by all methods: 10 of known PFGE CMRSA type and 44 clinical isolates. Correct assignment of CMRSA type or cluster occurred for 47 of 54 (87%) of the isolates when using a rep-PCR similarity index (SI) of ≥95%. Rep-PCR gave 7 discordant results [CMRSA1 (3), CMRSA2 (1), CMRSA4 (1), and CMRSA10 (2)], and some CMRSA clusters were not distinguished (CMRSA10/5/9, CMRSA 7/8, and CMRSA3/6). Several spa types occurred within a single PFGE or repetitive PCR types among the 19 different spa types found. spa type t037 was shared by CMRSA3 and CMRSA6 strains, and CMRSA9 and most CMRSA10 strains shared spa type t008. Time to results for PFGE, repetitive PCR, and spa typing was 3-4 days, 24 h, and 48 h, respectively. The annual costs of using spa or repetitive PCR were 2.4× and 1.9× higher, respectively, than PFGE but routine use of spa typing would lower annual labor costs by 0.10 full-time equivalents compared to PFGE. Repetitive PCR is a good method for rapid outbreak screening, but MRSA isolates that share the same repetitive PCR or PFGE patterns can be distinguished by spa typing. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Automated x-ray/light field congruence using the LINAC EPID panel.

    Science.gov (United States)

    Polak, Wojciech; O'Doherty, Jim; Jones, Matt

    2013-03-01

    X-ray/light field alignment is a test described in many guidelines for the routine quality control of clinical linear accelerators (LINAC). Currently, the gold standard method for measuring alignment is through utilization of radiographic film. However, many modern LINACs are equipped with an electronic portal imaging device (EPID) that may be used to perform this test, thereby reducing overall cost and processing and analysis time, removing operator dependency, and eliminating the requirement to sustain the departmental film processor. This work describes a novel method of utilizing the EPID together with a custom in-house-designed jig and automatic image processing software, allowing measurement of the light field size, x-ray field size, and congruence between them. The authors present results of testing the method for aS1000 and aS500 Varian EPID detectors for six LINACs at a range of energies (6, 10, and 15 MV) in comparison with the results obtained from the use of radiographic film. Reproducibility of the software in fully automatic operation under a range of operating conditions for a single image showed a congruence of 0.01 cm with a coefficient of variation of 0. Slight variation in congruence repeatability was noted through semiautomatic processing by four independent operators due to manual marking of positions on the jig. Testing of the methodology using the automatic method shows a high precision of 0.02 mm compared to a maximum of 0.06 mm determined by film processing. Intraindividual examination of operator measurements of congruence was shown to vary as much as 0.75 mm. Similar congruence measurements of 0.02 mm were also determined for a lower resolution EPID (aS500 model), after rescaling of the image to the aS1000 image size. The designed methodology was proven to be time efficient, cost effective, and at least as accurate as using the gold standard radiographic film. Additionally, congruence testing can be easily performed for all four cardinal

  18. Analytical Formulation of the Electric Field Induced by Electrode Arrays: Towards Automated Dielectrophoretic Cell Sorting

    Directory of Open Access Journals (Sweden)

    Vladimir Gauthier

    2017-08-01

    Full Text Available Dielectrophoresis is defined as the motion of an electrically polarisable particle in a non-uniform electric field. Current dielectrophoretic devices enabling sorting of cells are mostly controlled in open loop, applying a predefined voltage on micro-electrodes. Closed-loop control of these devices would enable advanced functionalities and also more robust behavior. Currently, the numerical models of the dielectrophoretic force are too complex to be used in real-time closed-loop control. The aim of this paper is to propose a new type of model usable in this framework. We propose an analytical model of the electric field based on Fourier series to compute the dielectrophoretic force produced by parallel electrode arrays. Indeed, this method provides an analytical expression of the electric potential which decouples the geometrical factors (parameters of our system), the voltages applied on electrodes (inputs of our system), and the position of the cells (outputs of our system). Considering the Newton laws on each cell, it enables us to generate easily a dynamic model of the cell positions (outputs) as a function of the voltages on electrodes (inputs). This dynamic model of our system is required to design the future closed-loop control law. The predicted dielectrophoretic forces are compared to a numerical simulation based on a finite element model using COMSOL software. The model presented in this paper enables computation of the dielectrophoretic force applied to a cell by an electrode array in a few tenths of a millisecond. This model could consequently be used in future works for closed-loop control of dielectrophoretic devices.
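
    The decoupling the authors describe can be sketched with a truncated Fourier series: each mode of the potential decays exponentially with height above the electrode plane, and the voltage-dependent coefficients factor out of the geometry. Periods, mode counts and coefficients below are illustrative placeholders.

```python
# Illustrative sketch: truncated Fourier-series potential above a periodic
# parallel-electrode array; grad|E|^2 sets the DEP force direction.
import numpy as np

L = 100e-6                      # electrode array period (m), assumed
N = 25                          # number of Fourier modes kept

def potential(x, z, coeffs):
    """phi(x, z) = sum_n c_n * sin(k_n x) * exp(-k_n z), k_n = 2*pi*n/L."""
    phi = np.zeros_like(x, dtype=float)
    for n, c in enumerate(coeffs, start=1):
        k = 2 * np.pi * n / L
        phi += c * np.sin(k * x) * np.exp(-k * z)
    return phi

coeffs = [1.0 / n for n in range(1, N + 1)]      # stand-in for electrode voltages
x = np.linspace(0, L, 400)
z = np.full_like(x, 10e-6)                       # cell height above electrodes
eps = 1e-9                                       # finite-difference step (m)
Ex = -(potential(x + eps, z, coeffs) - potential(x - eps, z, coeffs)) / (2 * eps)
Ez = -(potential(x, z + eps, coeffs) - potential(x, z - eps, coeffs)) / (2 * eps)
E2 = Ex ** 2 + Ez ** 2
grad_E2_x = np.gradient(E2, x)                   # proportional to the DEP force
```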

  19. Rough Sets and Stomped Normal Distribution for Simultaneous Segmentation and Bias Field Correction in Brain MR Images.

    Science.gov (United States)

    Banerjee, Abhirup; Maji, Pradipta

    2015-12-01

    The segmentation of brain MR images into different tissue classes is an important task for automatic image analysis techniques, particularly due to the presence of the intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It judiciously integrates the concept of rough sets and the merit of a novel probability distribution, called the stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by an SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of a brain MR image is modeled as a mixture of a finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.

  20. Correction induced by irrelevant operators in the correlators of the two-dimensional Ising model in a magnetic field

    Energy Technology Data Exchange (ETDEWEB)

    Caselle, M.; Grinza, P. [Dipartimento di Fisica Teorica dell' Universita di Torino and Istituto Nazionale di Fisica Nucleare, Sezione di Torino, Torino (Italy)]. E-mails: caselle@to.infn.it; grinza@to.infn.it; Magnoli, N. [Dipartimento di Fisica, Universita di Genova and Istituto Nazionale di Fisica Nucleare, Sezione di Genova, Genova (Italy)]. E-mail: magnoli@ge.infn.it

    2001-10-26

    We investigate the presence of irrelevant operators in the two-dimensional Ising model perturbed by a magnetic field, by studying the corrections induced by these operators in the spin-spin correlator of the model. To this end we perform a set of high-precision simulations for the correlator both along the axes and along the diagonal of the lattice. By comparing the numerical results with the predictions of a perturbative expansion around the critical point we find unambiguous evidence of the presence of such irrelevant operators. It turns out that among the irrelevant operators the one which gives the largest correction is the spin-4 operator T² + T̄², which accounts for the breaking of rotational invariance due to the lattice. This result agrees with what was already known for the correlator evaluated exactly at the critical point and also with recent results obtained in the case of the thermal perturbation of the model. (author)

  1. Field-portable and automated immunosensors for hexavalent uranium, other heavy metals and chelators. Final Report

    International Nuclear Information System (INIS)

    Blake, Diane A.

    2009-01-01

    This is the final technical report for this 10-year project. A better understanding of in situ bioremediation processes and the development of strategies to enhance bacterial remediation of contaminated sites depend either directly or indirectly upon accurate detection and measurement of organics, metals and other toxic elements prior to, during and following the remediation process. Detection and measurement costs are presently high due to the complex methodologies required for analysis. Remediation costs could be significantly reduced through the use of rapid, simple on-site methods. The cost of laboratory analysis continues to climb, and the outlay for the assessment of a single site can frequently reach hundreds of thousands of dollars. One estimate suggests that the use of low-cost field methods (defined as less than $100/test) with 5-20% standard laboratory confirmation could reduce analytical costs by greater than 70%. Perhaps as important as the cost of analysis is the ability to obtain data about the remediation process in near real-time. The instruments normally used for environmental analysis of uranium (atomic absorption spectrophotometer, inductively coupled plasma emission spectrometer, IC-MS and kinetic phosphorescence analyzer) can be quite expensive; these instruments are thus usually located only in centralized facilities. Environmental samples must therefore be transported to these facilities and often wait in a queue before they can be analyzed. Both sample transport and time-in-queue lead to long turn-around times (days to weeks). Such long turn-around times are especially worrisome during site remediation, especially when an unexpected finding might dictate a change in the methodologies being employed at the site. The goal of this project was to develop sensors that could yield reliable data in near real-time (< 1 hour), be field-ready (i.e., simple, durable and accurate) and present low costs (<< $100/assay and <$5,000 for the initial equipment

  2. Automated Ortho-Rectification of UAV-Based Hyperspectral Data over an Agricultural Field Using Frame RGB Imagery

    Directory of Open Access Journals (Sweden)

    Ayman Habib

    2016-09-01

    Full Text Available Low-cost Unmanned Airborne Vehicles (UAVs) equipped with consumer-grade imaging systems have emerged as a potential remote sensing platform that could satisfy the needs of a wide range of civilian applications. Among these applications, UAV-based agricultural mapping and monitoring have attracted significant attention from both the research and professional communities. The interest in UAV-based remote sensing for agricultural management is motivated by the need to maximize crop yield. Remote sensing-based crop yield prediction and estimation are primarily based on imaging systems with different spectral coverage and resolution (e.g., RGB and hyperspectral imaging systems). Due to the data volume, RGB imaging is based on frame cameras, while hyperspectral sensors are primarily push-broom scanners. To cope with the limited endurance and payload constraints of low-cost UAVs, the agricultural research and professional communities have to rely on consumer-grade and lightweight sensors. However, the geometric fidelity of information derived from push-broom hyperspectral scanners is quite sensitive to the position and orientation established through a direct geo-referencing unit onboard the imaging platform (i.e., an integrated Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS)). This paper presents an automated framework for the integration of frame RGB images, push-broom hyperspectral scanner data and consumer-grade GNSS/INS navigation data for accurate geometric rectification of the hyperspectral scenes. The approach relies on utilizing the navigation data, together with a modified Speeded-Up Robust Feature (SURF) detector and descriptor, to automate the identification of conjugate features in the RGB and hyperspectral imagery. The SURF modification takes into consideration the available direct geo-referencing information to improve the reliability of the matching procedure in the presence of repetitive texture
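
    A sketch of the feature-matching step of such a framework, assuming OpenCV is available; ORB is used here as a freely available stand-in for the paper's modified SURF detector, and the navigation-data-based restriction of the search space is omitted. The image file names are placeholders.

        # Sketch: tie-point matching between an RGB frame and a hyperspectral band.
        import cv2

        rgb = cv2.imread("rgb_frame.png", cv2.IMREAD_GRAYSCALE)          # placeholder
        hyper = cv2.imread("hyperspectral_band.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(rgb, None)
        kp2, des2 = orb.detectAndCompute(hyper, None)

        # Cross-checked brute-force matching keeps only mutually best pairs,
        # which helps with the repetitive texture of agricultural scenes.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        print(len(matches), "candidate tie points")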

  3. Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors: Automated measurement development for full field digital mammography

    International Nuclear Information System (INIS)

    Fowler, E. E.; Sellers, T. A.; Lu, B.; Heine, J. J.

    2013-01-01

    Purpose: The Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors are used for standardized mammographic reporting and are assessed visually. This reporting is clinically relevant because breast composition can impact mammographic sensitivity and is a breast cancer risk factor. New techniques are presented and evaluated for generating automated BI-RADS breast composition descriptors using both raw and calibrated full field digital mammography (FFDM) image data. Methods: A matched case-control dataset with FFDM images was used to develop three automated measures for the BI-RADS breast composition descriptors. Histograms of each calibrated mammogram in the percent glandular (pg) representation were processed to create the new BR(pg) measure. Two previously validated measures of breast density derived from calibrated and raw mammograms were converted to the new BR(vc) and BR(vr) measures, respectively. These three measures were compared with the radiologist-reported BI-RADS composition assessments from the patient records. The authors used two optimization strategies with differential evolution to create these measures: method-1 used breast cancer status; and method-2 matched the reported BI-RADS descriptors. Weighted kappa (κ) analysis was used to assess the agreement between the new measures and the reported measures. Each measure's association with breast cancer was evaluated with odds ratios (ORs) adjusted for body mass index, breast area, and menopausal status. ORs were estimated as per unit increase with 95% confidence intervals. Results: The three BI-RADS measures generated by method-1 had κ between 0.25–0.34. These measures were significantly associated with breast cancer status in the adjusted models: (a) OR = 1.87 (1.34, 2.59) for BR(pg); (b) OR = 1.93 (1.36, 2.74) for BR(vc); and (c) OR = 1.37 (1.05, 1.80) for BR(vr). The measures generated by method-2 had κ between 0.42–0.45. Two of these measures were significantly

  4. Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors: Automated measurement development for full field digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, E. E.; Sellers, T. A.; Lu, B. [Department of Cancer Epidemiology, Division of Population Sciences, H. Lee Moffitt Cancer Center, Tampa, Florida 33612 (United States); Heine, J. J. [Department of Cancer Imaging and Metabolism, H. Lee Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2013-11-15

    Purpose: The Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors are used for standardized mammographic reporting and are assessed visually. This reporting is clinically relevant because breast composition can impact mammographic sensitivity and is a breast cancer risk factor. New techniques are presented and evaluated for generating automated BI-RADS breast composition descriptors using both raw and calibrated full field digital mammography (FFDM) image data. Methods: A matched case-control dataset with FFDM images was used to develop three automated measures for the BI-RADS breast composition descriptors. Histograms of each calibrated mammogram in the percent glandular (pg) representation were processed to create the new BR(pg) measure. Two previously validated measures of breast density derived from calibrated and raw mammograms were converted to the new BR(vc) and BR(vr) measures, respectively. These three measures were compared with the radiologist-reported BI-RADS composition assessments from the patient records. The authors used two optimization strategies with differential evolution to create these measures: method-1 used breast cancer status; and method-2 matched the reported BI-RADS descriptors. Weighted kappa (κ) analysis was used to assess the agreement between the new measures and the reported measures. Each measure's association with breast cancer was evaluated with odds ratios (ORs) adjusted for body mass index, breast area, and menopausal status. ORs were estimated as per unit increase with 95% confidence intervals. Results: The three BI-RADS measures generated by method-1 had κ between 0.25–0.34. These measures were significantly associated with breast cancer status in the adjusted models: (a) OR = 1.87 (1.34, 2.59) for BR(pg); (b) OR = 1.93 (1.36, 2.74) for BR(vc); and (c) OR = 1.37 (1.05, 1.80) for BR(vr). The measures generated by method-2 had κ between 0.42–0.45. Two of these

  5. Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors: automated measurement development for full field digital mammography.

    Science.gov (United States)

    Fowler, E E; Sellers, T A; Lu, B; Heine, J J

    2013-11-01

    The Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors are used for standardized mammographic reporting and are assessed visually. This reporting is clinically relevant because breast composition can impact mammographic sensitivity and is a breast cancer risk factor. New techniques are presented and evaluated for generating automated BI-RADS breast composition descriptors using both raw and calibrated full field digital mammography (FFDM) image data. A matched case-control dataset with FFDM images was used to develop three automated measures for the BI-RADS breast composition descriptors. Histograms of each calibrated mammogram in the percent glandular (pg) representation were processed to create the new BR(pg) measure. Two previously validated measures of breast density derived from calibrated and raw mammograms were converted to the new BR(vc) and BR(vr) measures, respectively. These three measures were compared with the radiologist-reported BI-RADS composition assessments from the patient records. The authors used two optimization strategies with differential evolution to create these measures: method-1 used breast cancer status; and method-2 matched the reported BI-RADS descriptors. Weighted kappa (κ) analysis was used to assess the agreement between the new measures and the reported measures. Each measure's association with breast cancer was evaluated with odds ratios (ORs) adjusted for body mass index, breast area, and menopausal status. ORs were estimated as per unit increase with 95% confidence intervals. The three BI-RADS measures generated by method-1 had κ between 0.25-0.34. These measures were significantly associated with breast cancer status in the adjusted models: (a) OR = 1.87 (1.34, 2.59) for BR(pg); (b) OR = 1.93 (1.36, 2.74) for BR(vc); and (c) OR = 1.37 (1.05, 1.80) for BR(vr). The measures generated by method-2 had κ between 0.42-0.45. Two of these measures were significantly associated with breast cancer
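
    Across these three records the agreement statistic is a weighted kappa between the radiologist-reported BI-RADS categories and an automated measure. A minimal sketch with made-up example labels, assuming scikit-learn is available:

        # Sketch: weighted kappa agreement between two categorical raters.
        from sklearn.metrics import cohen_kappa_score

        reported = [1, 2, 2, 3, 4, 1, 2, 3, 3, 4]    # hypothetical BI-RADS categories
        automated = [1, 2, 3, 3, 4, 2, 2, 3, 2, 4]
        kappa = cohen_kappa_score(reported, automated, weights="linear")
        print(f"weighted kappa = {kappa:.2f}")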

  6. Long-term geomagnetically induced current observations in New Zealand: Earth return corrections and geomagnetic field driver

    Science.gov (United States)

    Mac Manus, Daniel H.; Rodger, Craig J.; Dalzell, Michael; Thomson, Alan W. P.; Clilverd, Mark A.; Petersen, Tanja; Wolf, Moritz M.; Thomson, Neil R.; Divett, Tim

    2017-08-01

    Transpower New Zealand Limited has measured DC currents in transformer neutrals in the New Zealand electrical network at multiple South Island locations. Near-continuous archived DC current data exist since 2001, starting with 12 different substations and expanding from 2009 to include 17 substations. From 2001 to 2015 up to 58 individual transformers were simultaneously monitored. Primarily, the measurements were intended to monitor the impact of the high-voltage DC system linking the North and South Islands when it is operating in "Earth return" mode. However, after correcting for Earth return operation, as described here, the New Zealand measurements provide an unusually long and spatially detailed set of geomagnetically induced current (GIC) measurements. We examine the peak GIC magnitudes observed during two large geomagnetic storms on 6 November 2001 and 2 October 2013. Currents of 30-50 A are observed, depending on the measurement location. There are large spatial variations in the GIC observations over comparatively small distances, which likely depend upon network layout and ground conductivity. We then go on to examine the GIC in transformers throughout the South Island during more than 151 h of geomagnetic storm conditions. We compare the GIC to the various magnitude and rate-of-change components of the magnetic field. Our results show that there is a strong correlation between the magnitude of the GIC and the rate of change of the horizontal magnetic field (H'). This correlation is particularly clear for transformers that show large GIC currents during magnetic storms.
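
    A minimal sketch of the driver analysis described here, correlating a GIC time series with the rate of change of the horizontal field, H'. The arrays below are synthetic stand-ins for one transformer's record and the local magnetometer data.

        # Sketch: correlate |GIC| with |H'| on synthetic time series.
        import numpy as np

        t = np.arange(0, 3600, 60.0)                      # time stamps (s), 1-min cadence
        H = 20000 + 50 * np.sin(2 * np.pi * t / 1800)     # horizontal field (nT), synthetic
        gic = np.gradient(H, t) * 80 \
              + np.random.default_rng(2).normal(0, 0.5, t.size)   # synthetic GIC (A)

        H_rate = np.gradient(H, t)                        # H' in nT/s
        corr = np.corrcoef(np.abs(H_rate), np.abs(gic))[0, 1]
        print(f"correlation |GIC| vs |H'|: {corr:.2f}")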

  7. Small field detector correction factors $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ for silicon-diode and diamond detectors with circular 6 MV fields derived using both empirical and numerical methods.

    Science.gov (United States)

    O'Brien, D J; León-Vintró, L; McClean, B

    2016-01-01

    The use of radiotherapy fields smaller than 3 cm in diameter has resulted in the need for accurate detector correction factors for small field dosimetry. However, published factors do not always agree, and errors introduced by biased reference detectors, inaccurate Monte Carlo models, or experimental errors can be difficult to distinguish. The aim of this study was to provide a robust set of detector correction factors for a range of detectors using numerical, empirical, and semiempirical techniques under the same conditions, and to examine the consistency of these factors between techniques. Empirical detector correction factors were derived based on small field output factor measurements for circular field sizes from 3.1 to 0.3 cm in diameter performed with a 6 MV beam. A PTW 60019 microDiamond detector was used as the reference dosimeter. Numerical detector correction factors for the same fields were derived based on calculations from a Geant4 Monte Carlo model of the detectors and the linac treatment head. Semiempirical detector correction factors were derived from the empirical output factors and the numerical dose-to-water calculations. The PTW 60019 microDiamond was found to over-respond at small field sizes, resulting in a bias in the empirical detector correction factors. The over-response was similar in magnitude to that of the unshielded diode. Good agreement was generally found between semiempirical and numerical detector correction factors, except for the PTW 60016 Diode P, where the numerical values showed a greater over-response than the semiempirical values by 3.7% for a 1.1 cm diameter field and more for smaller fields. Detector correction factors based solely on empirical measurement or numerical calculation are subject to potential bias. A semiempirical approach, combining both empirical and numerical data, provided the most reliable results.
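
    Under the small-field formalism of Alfonso et al. referenced throughout these records, a semiempirical correction factor is the Monte Carlo dose-to-water ratio divided by the measured detector-reading ratio. A sketch with illustrative numbers, not values from this study:

        # Sketch: semiempirical k_{Qclin,Qmsr}^{fclin,fmsr} from MC + measurement.
        def k_clin_msr(dose_clin_mc, dose_msr_mc, reading_clin, reading_msr):
            """(MC dose-to-water ratio) / (measured detector reading ratio)."""
            return (dose_clin_mc / dose_msr_mc) / (reading_clin / reading_msr)

        # e.g. a 1.1 cm field against the machine-specific reference field
        print(k_clin_msr(dose_clin_mc=0.652, dose_msr_mc=1.000,
                         reading_clin=0.678, reading_msr=1.000))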

  8. Gypsy moth (Lepidoptera: Lymantriidae) flight behavior and phenology based on field-deployed automated pheromone-baited traps

    Science.gov (United States)

    Patrick C. Tobin; Kenneth T. Klein; Donna S. Leonard

    2009-01-01

    Populations of the gypsy moth, Lymantria dispar (L.), are extensively monitored in the United States through the use of pheromone-baited traps. We report on the use of automated pheromone-baited traps that use a recording sensor and data logger to record the unique date-time stamp of males as they enter the trap. We deployed a total of 352 automated traps...

  9. Correction factors for A1SL ionization chamber dosimetry in TomoTherapy: Machine-specific, plan-class, and clinical fields

    International Nuclear Information System (INIS)

    Gago-Arias, Araceli; Rodriguez-Romero, Ruth; Sanchez-Rubio, Patricia; Miguel Gonzalez-Castano, Diego; Gomez, Faustino; Nunez, Luis; Palmans, Hugo; Sharpe, Peter; Pardo-Montero, Juan

    2012-01-01

    Purpose: Recently, an international working group on nonstandard fields presented a new formalism for ionization chamber reference dosimetry of small and nonstandard fields [Alfonso et al., Med. Phys. 35, 5179-5186 (2008)] which has been adopted by AAPM TG-148. This work presents an experimental determination of the correction factors for reference dosimetry with an Exradin A1SL thimble ionization chamber in a TomoTherapy unit, focusing on: (i) machine-specific reference field, (ii) plan-class-specific reference field, and (iii) two clinical treatments. Methods: Ionization chamber measurements were performed in the TomoTherapy unit for intermediate (machine-specific and plan-class-specific) calibration fields, based on the reference conditions defined by AAPM TG-148, and two clinical treatments (lung and head-and-neck). Alanine reference dosimetry was employed to determine absorbed dose to water at the point of interest for the fields under investigation. The corresponding chamber correction factors were calculated from ratios of alanine to ionization chamber measurements. Results: Two different methods of determining the beam quality correction factor $k_{Q,Q_0}$ for the A1SL ionization chamber in this TomoTherapy unit, where reference conditions for conventional beam quality determination cannot be met, result in consistent values. The observed values of the overall correction factors obtained for intermediate and clinical fields are consistently around 0.98 with a typical expanded relative uncertainty of 2% (k = 2), which when considered makes such correction factors compatible with unity. However, all of them are systematically lower than unity, which is shown to be significant when a hypothesis test assuming a Student's t distribution is performed (p = 1.8×10⁻²). Correction factors $k_{Q_{clin},Q_{pcsr}}^{f_{clin},f_{pcsr}}$ and $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$, which are needed for the computation of field factors for relative dosimetry of clinical beams, have been found to be very

  10. Quantitative evaluation of automated skull-stripping methods applied to contemporary and legacy images: effects of diagnosis, bias correction, and slice location

    DEFF Research Database (Denmark)

    Fennema-Notestine, Christine; Ozyurt, I Burak; Clark, Camellia P

    2006-01-01

    Performance of automated methods to isolate brain from nonbrain tissues in magnetic resonance (MR) structural images may be influenced by MR signal inhomogeneities, type of MR image set, regional anatomy, and age and diagnosis of subjects studied. The present study compared the performance of four methods ... (Alzheimer's, young and elderly control). To provide a criterion for outcome assessment, two experts manually stripped six sagittal sections for each dataset in locations where brain and nonbrain tissue are difficult to distinguish. Methods were compared on Jaccard similarity coefficients, Hausdorff...
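
    A minimal sketch of the two outcome metrics named in this record, computed on synthetic binary masks for a single slice, assuming NumPy and SciPy are available:

        # Sketch: Jaccard overlap and symmetric Hausdorff distance of two masks.
        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        auto = np.zeros((64, 64), bool); auto[16:48, 16:48] = True      # method output
        manual = np.zeros((64, 64), bool); manual[18:50, 14:46] = True  # expert stripping

        jaccard = (auto & manual).sum() / (auto | manual).sum()

        pts_a = np.argwhere(auto)        # pixel coordinates of each mask
        pts_m = np.argwhere(manual)
        hausdorff = max(directed_hausdorff(pts_a, pts_m)[0],
                        directed_hausdorff(pts_m, pts_a)[0])
        print(f"Jaccard = {jaccard:.2f}, Hausdorff = {hausdorff:.1f} px")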

  11. State-of-the art comparability of corrected emission spectra. 2. Field laboratory assessment of calibration performance using spectral fluorescence standards.

    Science.gov (United States)

    Resch-Genger, Ute; Bremser, Wolfram; Pfeifer, Dietmar; Spieles, Monika; Hoffmann, Angelika; DeRose, Paul C; Zwinkels, Joanne C; Gauthier, François; Ebert, Bernd; Taubert, R Dieter; Voigt, Jan; Hollandt, Jörg; Macdonald, Rainer

    2012-05-01

    In the second part of this two-part series on the state-of-the-art comparability of corrected emission spectra, we have extended this assessment to the broader community of fluorescence spectroscopists by involving 12 field laboratories that were randomly selected on the basis of their fluorescence measuring equipment. These laboratories performed a reference material (RM)-based fluorometer calibration with commercially available spectral fluorescence standards following a standard operating procedure that involved routine measurement conditions and the data evaluation software LINKCORR developed and provided by the Federal Institute for Materials Research and Testing (BAM). This instrument-specific emission correction curve was subsequently used for the determination of the corrected emission spectra of three test dyes, X, QS, and Y, revealing an average accuracy of 6.8% for the corrected emission spectra. This compares well with the relative standard uncertainties of 4.2% for physical standard-based spectral corrections demonstrated in the first part of this study (previous paper in this issue) involving an international group of four expert laboratories. The excellent comparability of the measurements of the field laboratories also demonstrates the effectiveness of RM-based correction procedures.

  12. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may involve quite a number of problems. The automation of dynamic operations requires complicated programmes that often interact across several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibility of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will, however, only be turned to profit sufficiently and correctly if the application of these equipment techniques is further improved and if its volume is matched with a definite etc. (orig.) [de

  13. Field correction factors for a PTW-31016 Pinpoint ionization chamber for both flattened and unflattened beams. Study of the main sources of uncertainties.

    Science.gov (United States)

    Puxeu-Vaqué, Josep; Duch, Maria A; Nailon, William H; Cruz Lizuain, M; Ginjaume, Mercè

    2017-05-01

    The primary aim of this study was to determine correction factors $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ for a PTW-31016 ionization chamber on field sizes from 0.5 cm × 0.5 cm to 2 cm × 2 cm for both flattened (FF) and flattening-filter-free (FFF) beams produced by a TrueBeam clinical accelerator. The secondary objective was the determination of field output factors $\Omega_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ over this range of field sizes using both Monte Carlo (MC) simulation and measurements. $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ values for the PTW-31016 chamber were calculated by MC simulation for field sizes of 0.5 cm × 0.5 cm, 1 cm × 1 cm, and 2 cm × 2 cm. MC simulations were performed with the PENELOPE code system using the 10 MV FFF phase space file from a TrueBeam linear accelerator (LINAC) provided by the manufacturer (Varian Medical Systems, Inc., Palo Alto, CA, USA). Simulations were repeated taking into account chamber manufacturing tolerances and accelerator jaw positioning in order to assess the uncertainty of the calculated correction factors. Output ratios were measured on square fields ranging from 0.5 cm × 0.5 cm to 10 cm × 10 cm for 6 MV and 10 MV FF and FFF beams produced by a TrueBeam using a PTW-31016 ionization chamber, a Sun Nuclear Edge detector (Sun Nuclear Corp., Melbourne, FL, USA) and TLD-700R (Harshaw, Thermo Scientific, Waltham, MA, USA). The validity of the proposed correction factors was verified by using them for the determination of $\Omega_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ with a PTW-31016 at the four TrueBeam energies and comparing the results with both TLD-700R measurements and MC simulations. Finally, the proposed correction factors were used to assess the correction factors of the Sun Nuclear Edge detector. The present work provides a set of MC-calculated correction factors for a PTW-31016 chamber used on a TrueBeam in FF and FFF mode. For the 0.5 cm × 0.5 cm square field size, $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ is equal to 1.17 with a combined uncertainty of 2% (k = 1). A detailed

  14. NUMERICAL SIMULATION AND EXPERIMENTAL STUDIES ON AFT HULL LOCAL PARAMETERIZED NON-GEOSIM DEFORMATION FOR CORRECTING SCALE EFFECTS OF NOMINAL WAKE FIELD

    Directory of Open Access Journals (Sweden)

    Tiecheng Wu

    2017-01-01

    Full Text Available The scale effects of an aft hull wake field pose a great challenge to propeller design and its performance prediction. Research into the characteristics of the scale effects and the subsequent correction of the errors caused by such effects play an important role in improving a ship's energy conservation and propulsion performance. For this research, using a KCS ship as the research target, the aft shape of an original ship model has been modified based on the smart dummy model (SDM) to change its nominal wake field. The present study explores the aft hull deformation of a KCS ship through a series of numerical calculations and validates the results using a similar ship model. In addition, wake-field measurements are performed using particle image velocimetry (PIV) to verify the corrected effects of the SDM. The SDM correction method offers a new pathway for correcting the errors associated with the scale effects in the nominal wake field measurements of a ship model.

  15. On the truncation of the azimuthal mode spectrum of high-order probes in probe-corrected spherical near-field antenna measurements

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Laitinen, Tommi

    2011-01-01

    Azimuthal mode (m-mode) truncation of a high-order probe pattern in probe-corrected spherical near-field antenna measurements is studied in this paper. The results of this paper provide rules for appropriate and sufficient m-mode truncation for non-ideal first-order probes and odd-order probes wi...

  16. Automation system for the linear induction accelerator with a strong guiding field for experiments with a hollow electron beam

    International Nuclear Information System (INIS)

    Arkhipov, O.V.; Bobyleva, L.V.; Glejbman, Eh.M.

    1989-01-01

    A system for automation of the JINR linear induction accelerator is described. The system is based on microprocessor subsystems. Each subsystem comprises a set of multiprocessor system (MMS) modules. The central crate and a Pravets-16 personal computer make up the kernel of the system. 9 refs.; 1 fig

  17. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework

    Energy Technology Data Exchange (ETDEWEB)

    Kesner, Adam L [Division of Nuclear Medicine, Department of Radiology, Anschutz Medical Campus, University of Colorado Denver, 12700 E 19th Ave, Box C-278, Aurora, CO 80045 (United States); Schleyer, Paul J [Division of Imaging Sciences and Biomedical Engineering, King’s College London, London, WC2R 2LS (United Kingdom); Büther, Florian [European Institute for Molecular Imaging, University of Münster, Münster, 48149 (Germany); Walter, Martin A [Institute of Nuclear Medicine and Department of Clinical Research, University Hospital Bern, Bern, 3010 (Switzerland); Schäfers, Klaus P [European Institute for Molecular Imaging, University of Münster, Münster, 48149 (Germany); Koo, Phillip J [Division of Nuclear Medicine, Department of Radiology, Anschutz Medical Campus, University of Colorado Denver, 12700 E 19th Ave, Box C-278, Aurora, CO 80045 (United States)

    2014-06-17

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet, for practical reasons, PET remains a modality vulnerable to motion-induced image degradation, and respiratory motion control is not employed in routine clinical operations. In this article, we take the opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form an underpinning for what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big-picture challenges and goals. The online version of this article (doi:10.1186/2197-7364-1-8) contains supplementary material, which is available to authorized users.

  18. Co-optimization of RegC and TWINSCAN corrections to improve the intra-field on-product overlay performance

    Science.gov (United States)

    Gorhad, Kujan; Sharoni, Ofir; Dmitriev, Vladimir; Cohen, Avi; van Haren, Richard; Roelofs, Christian; Cekli, Hakki Ergun; Gallagher, Emily; Leray, Philippe; Beyer, Dirk; Trautzsch, Thomas; Steinert, Steffen

    2016-03-01

    Improving wafer On Product Overlay (OPO) is becoming a major challenge in lithography, especially for multi-patterning techniques like N-repetitive Litho-Etch steps ((LE)^N, N ≥ 2). When different scanner settings and litho processes are used between inter-layer overlays, intra-field overlay control becomes more complicated. In addition to the Image Placement Error (IPE) contribution, the TWINSCAN lens fingerprint in combination with the exposure settings plays a significant role as well. Furthermore, the scanner needs to deal with dynamic fingerprints caused by, for instance, lens and/or reticle heating. This paper demonstrates the complementary RegC® and TWINSCAN solution for improving the OPO by co-optimizing the correction capabilities of the individual tools. As a consequence, the systematic intra-field fingerprints can be decreased along with the overlay (OVL) error at wafer level. Furthermore, the application could be utilized to extend some of the scanner actuator ranges by inducing pre-determined signatures. These solutions fit into the ASML Litho InSight (LIS) product, in which feedforward and feedback corrections based on YieldStar overlay and other measurements are used to improve the OPO. While the TWINSCAN scanner corrects for the global distortions (up to third order) - the scanner Correctable Errors (CE) - the RegC® application can correct for the Non-Correctable Errors (NCE) by turning the high-frequency NCE into a CE of low-frequency nature. The RegC® induces predictable deformation elements inside the quartz (Qz) material of the reticle, and by doing so it can induce a desired pre-defined signature into the reticle. The deformation introduced by the RegC® is optimized for the actual wafer print, taking into account the scale and ortho compensation by the scanner, to correct for the systematic fingerprints and the wafer overlay. These two applications might be very powerful and could contribute to achieving a better

  19. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. Resonance – Journal of Science Education, Volume 2, Issue 3, March ...

  20. Creation of subsonic macro- and microjet facilities and an automated measuring system (AMS-2) for spatial-temporal hot-wire anemometric visualization of the jet flow field

    Science.gov (United States)

    Sorokin, A. M.; Grek, G. R.; Gilev, V. M.; Zverkov, I. D.

    2017-10-01

    Macro- and microjet facilities for generation of round and plane subsonic jets were designed and fabricated, together with an automated measuring system (AMS-2) for spatial-temporal hot-wire anemometric visualization of the jet flow field. A coordinate device and a unit for the measurement, collection, storage, and processing of hot-wire anemometric information were integrated into the AMS. The coordinate device is intended for precision movement of the hot-wire probe in the jet flow field according to a computer program, with a movement accuracy of 5 microns on all three coordinates (x, y, z). The measurement unit performs hot-wire anemometric measurement of the jet flow field parameters (registration of the mean, U, and fluctuation, u′, characteristics of the jet flow velocity), accumulates and stores them in computer memory, and processes them according to dedicated programs.

  1. Interacting viscous entropy-corrected holographic scalar field models of dark energy with time-varying G in modified FRW cosmology

    International Nuclear Information System (INIS)

    Adabi, Farzin; Karami, Kayoomars; Felegary, Fereshte; Azarmi, Zohre

    2012-01-01

    We study the entropy-corrected version of the holographic dark energy (HDE) model in the framework of modified Friedmann-Robertson-Walker cosmology. We consider a non-flat universe filled with an interacting viscous entropy-corrected HDE (ECHDE) with dark matter. Also included in our model is the case of the variable gravitational constant G. We obtain the equation of state and the deceleration parameters of the interacting viscous ECHDE. Moreover, we reconstruct the potential and the dynamics of the quintessence, tachyon, K-essence and dilaton scalar field models according to the evolutionary behavior of the interacting viscous ECHDE model with time-varying G. (research papers)

  2. Comparison of Threshold Saccadic Vector Optokinetic Perimetry (SVOP) and Standard Automated Perimetry (SAP) in Glaucoma. Part II: Patterns of Visual Field Loss and Acceptability.

    Science.gov (United States)

    McTrusty, Alice D; Cameron, Lorraine A; Perperidis, Antonios; Brash, Harry M; Tatham, Andrew J; Agarwal, Pankaj K; Murray, Ian C; Fleck, Brian W; Minns, Robert A

    2017-09-01

    We compared patterns of visual field loss detected by standard automated perimetry (SAP) to saccadic vector optokinetic perimetry (SVOP) and examined patient perceptions of each test. A cross-sectional study was done of 58 healthy subjects and 103 with glaucoma who were tested using SAP and two versions of SVOP (v1 and v2). Visual fields from both devices were categorized by masked graders as: 0, normal; 1, paracentral defect; 2, nasal step; 3, arcuate defect; 4, altitudinal; 5, biarcuate; and 6, end-stage field loss. SVOP and SAP classifications were cross-tabulated. Subjects completed a questionnaire on their opinions of each test. We analyzed 142 (v1) and 111 (v2) SVOP and SAP test pairs. SVOP v2 had a sensitivity of 97.7% and specificity of 77.9% for identifying normal versus abnormal visual fields. SAP and SVOP v2 classifications showed complete agreement in 54% of glaucoma patients, with a further 23% disagreeing by one category. On repeat testing, 86% of SVOP v2 classifications agreed with the previous test, compared to 91% of SAP classifications; 71% of subjects preferred SVOP compared to 20% who preferred SAP. Eye-tracking perimetry can be used to obtain threshold visual field sensitivity values in patients with glaucoma and produce maps of visual field defects, with patterns exhibiting close agreement to SAP. Patients preferred eye-tracking perimetry compared to SAP. This first report of threshold eye tracking perimetry shows good agreement with conventional automated perimetry and provides a benchmark for future iterations.

  3. Correction factors for A1SL ionization chamber dosimetry in TomoTherapy: Machine-specific, plan-class, and clinical fields

    Energy Technology Data Exchange (ETDEWEB)

    Gago-Arias, Araceli; Rodriguez-Romero, Ruth; Sanchez-Rubio, Patricia; Miguel Gonzalez-Castano, Diego; Gomez, Faustino; Nunez, Luis; Palmans, Hugo; Sharpe, Peter; Pardo-Montero, Juan [Departamento de Fisica de Particulas, Facultad de Fisica, Universidad de Santiago de Compostela (Spain); Servicio de Radiofisica, Hospital Universitario Puerta de Hierro, Madrid 28222 (Spain); Departamento de Fisica de Particulas, Facultad de Fisica, Universidad de Santiago de Compostela, 15782 (Spain) and Radiation Physics Laboratory, Universidad de Santiago de Compostela, 15782 (Spain); Servicio de Radiofisica, Hospital Universitario Puerta de Hierro, Madrid, 28222 (Spain); National Physical Laboratory, Teddington, Middx, TW11 OLW (United Kingdom); Departamento de Fisica de Particulas, Facultad de Fisica, Universidad de Santiago de Compostela, 15782 (Spain)

    2012-04-15

    Purpose: Recently, an international working group on nonstandard fields presented a new formalism for ionization chamber reference dosimetry of small and nonstandard fields [Alfonso et al., Med. Phys. 35, 5179-5186 (2008)] which has been adopted by AAPM TG-148. This work presents an experimental determination of the correction factors for reference dosimetry with an Exradin A1SL thimble ionization chamber in a TomoTherapy unit, focusing on: (i) machine-specific reference field, (ii) plan-class-specific reference field, and (iii) two clinical treatments. Methods: Ionization chamber measurements were performed in the TomoTherapy unit for intermediate (machine-specific and plan-class-specific) calibration fields, based on the reference conditions defined by AAPM TG-148, and two clinical treatments (lung and head-and-neck). Alanine reference dosimetry was employed to determine absorbed dose to water at the point of interest for the fields under investigation. The corresponding chamber correction factors were calculated from ratios of alanine to ionization chamber measurements. Results: Two different methods of determining the beam quality correction factor $k_{Q,Q_0}$ for the A1SL ionization chamber in this TomoTherapy unit, where reference conditions for conventional beam quality determination cannot be met, result in consistent values. The observed values of the overall correction factors obtained for intermediate and clinical fields are consistently around 0.98 with a typical expanded relative uncertainty of 2% (k = 2), which when considered makes such correction factors compatible with unity. However, all of them are systematically lower than unity, which is shown to be significant when a hypothesis test assuming a Student's t distribution is performed (p = 1.8×10⁻²). Correction factors $k_{Q_{clin},Q_{pcsr}}^{f_{clin},f_{pcsr}}$ and $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$

  4. Automated Water Extraction Index

    DEFF Research Database (Denmark)

    Feyisa, Gudina Legese; Meilby, Henrik; Fensholt, Rasmus

    2014-01-01

    of various sorts of environmental noise and at the same time offers a stable threshold value. Thus we introduced a new Automated Water Extraction Index (AWEI) improving classification accuracy in areas that include shadow and dark surfaces that other classification methods often fail to classify correctly...
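
    A sketch of the index computation as commonly cited from Feyisa et al. (2014); the band coefficients below should be verified against the paper before use, and the reflectance arrays are random placeholders.

        # Sketch: Automated Water Extraction Index on per-band reflectance arrays.
        import numpy as np

        def awei(blue, green, nir, swir1, swir2, shadow=True):
            """Automated Water Extraction Index; water pixels score > 0."""
            if shadow:   # AWEI_sh, formulated for scenes with shadow/dark surfaces
                return blue + 2.5 * green - 1.5 * (nir + swir1) - 0.25 * swir2
            return 4 * (green - swir1) - (0.25 * nir + 2.75 * swir2)  # AWEI_nsh

        bands = {b: np.random.default_rng(3).random((4, 4)) for b in
                 ("blue", "green", "nir", "swir1", "swir2")}
        water_mask = awei(**bands) > 0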

  5. SU-C-304-06: Determination of Intermediate Correction Factors for Three Dosimeters in Small Composite Photon Fields Used in Robotic Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, E [Medical Physics Unit, McGill University, Montreal (Canada); Belec, J; Vandervoort, E [The Ottawa Hospital Cancer Centre, Ottawa (Canada); Muir, B [National Research Council of Canada, Ottawa (Canada)

    2015-06-15

    Purpose: To calculate, using Monte Carlo simulation, the intermediate and total correction factors (CFs) for two microchambers and a plastic scintillator in composite fields delivered by the CyberKnife system. Methods: A linac model was created in BEAMnrc by matching percentage depth dose (PDD) curves and output factors (OFs), measured using an A16 microchamber, with Monte Carlo calculations performed in egs_chamber to explicitly model detector response. Intermediate CFs were determined for the A16 and A26 microchambers and the W1 plastic scintillator in fourteen different composite fields inside a solid water phantom. Seven of these fields used a 5 mm diameter collimator; the remaining fields employed a 7.5 mm collimator but were otherwise identical to the first seven. Intermediate CFs are reported relative to the respective CF for a 60 mm collimator (800 mm source-to-detector distance and 100 mm depth in water). Results: For microchambers in composite fields, the intermediate CFs that account for detector density and volume were the largest contributors to the total CFs. The total CFs for the A26 were larger than those for the A16, especially for the 5 mm cone (1.227±0.003 to 1.144±0.004 versus 1.142±0.003 to 1.099±0.004), due to the A26's larger active volume (0.015 cc) relative to the A16 (0.007 cc), despite the A26 using similar wall and electrode material. The W1 total and intermediate CFs are closer to unity, due to its smaller active volume and near water-equivalent composition; however, 3-4% detector volume corrections are required for 5 mm collimator fields. In fields using the 7.5 mm collimator, the correction is nearly eliminated for the W1, except for a non-isocentric field. Conclusion: Large and variable CFs are required for microchambers in small composite fields, primarily due to density and volume effects. Corrections are reduced but not eliminated for a plastic scintillator in the same fields.

  6. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  7. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  8. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    Poellaenen, R.; Ilander, T.; Lehtinen, J.; Leppaenen, A.; Nikkinen, M.; Toivonen, H.; Ylaetalo, S.; Smartt, H.; Garcia, R.; Martinez, R.; Glidewell, D.; Krantz, K.

    1999-01-01

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-time radiation monitoring and sample authentication, whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to the headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system was reliable. (orig.)

  9. An image analysis pipeline for automated classification of imaging light conditions and for quantification of wheat canopy cover time series in field phenotyping.

    Science.gov (United States)

    Yu, Kang; Kirchgessner, Norbert; Grieder, Christoph; Walter, Achim; Hund, Andreas

    2017-01-01

    Robust segmentation of canopy cover (CC) from large amounts of images taken under different illumination/light conditions in the field is essential for high throughput field phenotyping (HTFP). We attempted to address this challenge by evaluating different vegetation indices and segmentation methods for analyzing images taken at varying illuminations throughout the early growth phase of wheat in the field. 40,000 images taken on 350 wheat genotypes in two consecutive years were assessed for this purpose. We proposed an image analysis pipeline that allowed for image segmentation using automated thresholding and machine learning based classification methods and for global quality control of the resulting CC time series. This pipeline enabled accurate classification of imaging light conditions into two illumination scenarios, i.e. high light-contrast (HLC) and low light-contrast (LLC), in a series of continuously collected images by employing a support vector machine (SVM) model. Accordingly, the scenario-specific pixel-based classification models employing decision tree and SVM algorithms were able to outperform the automated thresholding methods, as well as improved the segmentation accuracy compared to general models that did not discriminate illumination differences. The three-band vegetation difference index (NDI3) was enhanced for segmentation by incorporating the HSV-V and the CIE Lab-a color components, i.e. the product images NDI3*V and NDI3*a. Field illumination scenarios can be successfully identified by the proposed image analysis pipeline, and the illumination-specific image segmentation can improve the quantification of CC development. The integrated image analysis pipeline proposed in this study provides great potential for automatically delivering robust data in HTFP.
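
    For illustration, the automated-thresholding branch of such a pipeline: a simple excess-green vegetation index segmented with Otsu's method. The paper's NDI3*V / NDI3*a product images and SVM classifiers are not reproduced, and the input image is a random placeholder.

        # Sketch: canopy-cover segmentation via a vegetation index + Otsu threshold.
        import numpy as np
        from skimage.filters import threshold_otsu

        rgb = np.random.default_rng(4).random((100, 100, 3))   # placeholder field image
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

        exg = 2 * g - r - b                    # excess-green vegetation index
        canopy = exg > threshold_otsu(exg)     # binary canopy-cover mask
        cc = canopy.mean()                     # canopy cover fraction
        print(f"canopy cover = {cc:.1%}")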

  10. Visual field

    Science.gov (United States)

    Perimetry; Tangent screen exam; Automated perimetry exam; Goldmann visual field exam; Humphrey visual field exam ... Confrontation visual field exam. This is a quick and basic check of the visual field. The health care provider ...

  11. Automating indicator data reporting from health facility EMR to a national aggregate data system in Kenya: An Interoperability field-test using OpenMRS and DHIS2.

    Science.gov (United States)

    Kariuki, James M; Manders, Eric-Jan; Richards, Janise; Oluoch, Tom; Kimanga, Davies; Wanyee, Steve; Kwach, James O; Santas, Xenophon

    2016-01-01

    Introduction: Developing countries are increasingly strengthening national health information systems (HIS) for evidence-based decision-making. However, the inability to report indicator data automatically from electronic medical record systems (EMR) hinders this process. Data are often printed and manually re-entered into aggregate reporting systems. This affects data completeness, accuracy, and reporting timeliness, and burdens staff who support routine indicator reporting from patient-level data. Method: After conducting a feasibility test to exchange indicator data from the Open Medical Record System (OpenMRS) to the District Health Information System version 2 (DHIS2), we conducted a field test at a health facility in Kenya. We configured a field-test DHIS2 instance, similar to the Kenya Ministry of Health (MOH) DHIS2, to receive HIV care and treatment indicator data, and the KenyaEMR, a customized version of OpenMRS, to generate and transmit the data from a health facility. After training facility staff how to send data using the DHIS2 reporting module, we compared the completeness, accuracy and timeliness of automated indicator reporting with facility monthly reports manually entered into the MOH DHIS2. Results: All 45 data values in the automated reporting process were 100% complete and accurate, while in the manual entry process data completeness ranged from 66.7% to 100% and accuracy ranged from 33.3% to 95.6% over seven months (July 2013-January 2014). The manual tally and entry process required at least one person to perform each of the five reporting activities; generating data from the EMR with manual entry required at least one person to perform each of the three reporting activities; the automated reporting process had one activity performed by one person. The manual tally and entry observed in October 2013 took 375 minutes. The average time to generate data and manually enter them into DHIS2 was over half an hour (M=32.35 mins, SD=0.29), compared to less than a minute for automated submission (M=0
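
    A sketch of what the automated submission step can look like against the standard DHIS2 Web API dataValueSets resource; the URL, UIDs, data element, and credentials below are placeholders, not values from the study.

        # Sketch: posting one month of aggregate indicator data to DHIS2.
        import requests

        payload = {
            "dataSet": "DATASET_UID",        # hypothetical dataset UID
            "period": "201307",              # July 2013, monthly period format
            "orgUnit": "FACILITY_UID",       # hypothetical facility UID
            "dataValues": [
                {"dataElement": "HIV_TX_NEW_UID", "value": "45"},  # hypothetical
            ],
        }
        resp = requests.post(
            "https://dhis2.example.org/api/dataValueSets",
            json=payload,
            auth=("reporting_user", "password"),
        )
        resp.raise_for_status()              # import of the data value set succeeded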

  12. Political Correctness--Correct?

    Science.gov (United States)

    Boase, Paul H.

    1993-01-01

    Examines the phenomenon of political correctness, its roots and objectives, and its successes and failures in coping with the conflicts and clashes of multicultural campuses. Argues that speech codes indicate failure in academia's primary mission to civilize and educate through talk, discussion, thought, and persuasion. (SR)

  13. M-theory and stringy corrections to anti-de Sitter black holes and conformal field theories

    International Nuclear Information System (INIS)

    Caldarelli, Marco M.; Klemm, Dietmar

    1999-01-01

    We consider black holes in anti-de Sitter space AdS$_{p+2}$ (p = 2, 3, 5), which have hyperbolic, flat or spherical event horizons. The $O(\alpha'^3)$ corrections (or the leading corrections in powers of the eleven-dimensional Planck length, in the case of M-theory compactifications) to the black hole metrics are computed for the various topologies and dimensions. We investigate the consequences of the stringy or M-theory corrections for the black hole thermodynamics. In particular, we show the emergence of a stable branch of small spherical black holes. Surprisingly, for any of the considered dimensions and topologies, the corrected thermodynamical quantities turn out to coincide with those calculated within a simplified approach, which uses only the unperturbed metric. We obtain the corrected Hawking-Page transition temperature for black holes with spherical horizons, and show that for p = 3 this phase transition disappears at a value of $\alpha'$ considerably smaller than that estimated previously by Gao and Li. Using the AdS/CFT correspondence, we determine the $S^1 \times S^3$ N = 4 SYM phase diagram for sufficiently large 't Hooft coupling, and show that the critical point at which the Hawking-Page transition disappears (the correspondence point of Horowitz-Polchinski) occurs at $g^2_{YM} N \sim 20.5$. The d = 4 and d = 7 black hole phase diagrams are also determined, and connection is made with the corresponding boundary CFTs. Finally, for flat and hyperbolic horizons, we show that the leading stringy or M-theory corrections do not give rise to any phase transition. However, if the horizon is compactified to a torus $T^p$ or to a quotient of hyperbolic space, $H^p/\Gamma$, the appearance of light winding modes around non-contractible cycles signals new phase transitions, which in the toroidal case have previously been discussed by Barbon et al. We comment on these phase transitions for SYM on $H^p/\Gamma$ and on $T^p$, when the moduli of the torus are taken into account

  14. Correction magnetic field in electromagnet of proton accelerator using CST software; Correcao do campo magnetico em um eletroima de um acelerador de protons usando o software CST

    Energy Technology Data Exchange (ETDEWEB)

    Rabelo, L.A.; Campos, T.P.R., E-mail: luisarabelo88@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte (Brazil). Dept. de Engenharia Nuclear

    2016-07-01

    The aim of this paper is to present the study and simulation of the uniform-magnetic-field electromagnets of a new circular accelerator model for protons with an energy range between 15 MeV and 64 MeV, and in addition to investigate materials and the changes induced by the presence of 'gaps' for synchronism correction. The predefined electromagnet simulations were made in the electromagnetic field simulation software CST EM Studio® 3D 2015. The results showed a regular distribution of the magnetic field in the compact electromagnet with the homogenization structures. In conclusion, the proposed electromagnet model is shown to be feasible for a circular accelerator and complies with the synchronization requirements. (author)

  15. High precision, continuous measurements of water vapor isotopes using a field deployable analyzer with a novel automated calibration system to facilitate ecohydrological studies

    Science.gov (United States)

    Gupta, P.; Crosson, E.; Richman, B. A.; Apodaca, R. L.; Green, I.

    2009-12-01

    The use of stable isotopic analysis techniques has proved quite valuable in establishing links between ecology and hydrology. We present an alternative and novel approach to isotope ratio mass spectrometry (IRMS) for making high-precision D/H and 18O/16O isotope ratio measurements of water vapor at a field site, using wavelength-scanned cavity ring-down spectroscopy (WS-CRDS) based technology. This WS-CRDS analyzer allows continuous real-time measurements of water vapor with automated periodic calibration using liquid standards, needing no human intervention for weeks during deployment. The new automated calibration system, designed specifically for field deployment, uses syringe pumps and is robust, consistent and reliable. The advanced temperature and pressure control within the analyzer are among the key design features that allow high-precision (0.2‰ for δ18O and 1.0‰ for δD) performance at extremely low drift. To demonstrate the capabilities of the water vapor analyzer, a field trial was conducted where the common isotopologues of water vapor were measured at a local ecological site over a period of a few days. The resulting high-resolution data give us the ability to understand the impact of meteorology and plant physiology on the isotopic composition of water vapor in ambient air. Such measurements of water vapor, when combined with measurements of the isotopic composition of liquid water in plants, soil water and local water bodies, will close the eco-hydrological loop of any region. The ability of the WS-CRDS analyzer to make continuous, real-time measurements with a resolution on the order of a few seconds will aid in understanding the complex interdependencies between ecological and hydrological processes and will provide critical information for refining existing models of water transport in ecosystems. These studies are critical to understanding the impact of global climate change on landscapes.

  16. An Assessment of the Factors in Office Automation Systems Affecting Air Force Middle Managers and Clerical Workers in the Information Management Career Field

    Science.gov (United States)

    1990-12-01

    The purpose of this thesis was to investigate the attitudes and opinions of personnel affected by two office automation systems. Reprographics ... information management personnel to the office automation systems (RAMS and RIMS)? (2) Were information management personnel's perceptions different now from ... the period prior to installation of the office automation systems? (3) What is the frequency of use for each office automation system? (4) What major

  17. Sensitivity and specificity of automated analysis of single-field non-mydriatic fundus photographs by Bosch DR Algorithm-Comparison with mydriatic fundus photography (ETDRS) for screening in undiagnosed diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Pritam Bawankar

    Full Text Available Diabetic retinopathy (DR) is a leading cause of blindness among working-age adults. Early diagnosis through effective screening programs is likely to improve vision outcomes. The ETDRS seven-standard-field 35-mm stereoscopic color retinal imaging (ETDRS) of the dilated eye is elaborate and requires mydriasis, and is unsuitable for screening. We evaluated an image analysis application for the automated diagnosis of DR from non-mydriatic single-field images. Patients suffering from diabetes for at least 5 years were included if they were 18 years or older. Patients already diagnosed with DR were excluded. Physiologic mydriasis was achieved by placing the subjects in a dark room. Images were captured using a Bosch Mobile Eye Care fundus camera. The images were analyzed by the Retinal Imaging Bosch DR Algorithm for the diagnosis of DR. All subjects also subsequently underwent pharmacological mydriasis and ETDRS imaging. Non-mydriatic and mydriatic images were read by ophthalmologists. The ETDRS readings were used as the gold standard for calculating the sensitivity and specificity for the software. 564 consecutive subjects (1128 eyes) were recruited from six centers in India. Each subject was evaluated at a single outpatient visit. Forty-four of 1128 images (3.9%) could not be read by the algorithm, and were categorized as inconclusive. In four subjects, neither eye provided an acceptable image: these four subjects were excluded from the analysis. This left 560 subjects for analysis (1084 eyes). The algorithm correctly diagnosed 531 of 560 cases. The sensitivity, specificity, and positive and negative predictive values were 91%, 97%, 94%, and 95% respectively. The Bosch DR Algorithm shows favorable sensitivity and specificity in diagnosing DR from non-mydriatic images, and can greatly simplify screening for DR. This also has major implications for telemedicine in the use of screening for retinopathy in patients with diabetes mellitus.

  18. Sensitivity and specificity of automated analysis of single-field non-mydriatic fundus photographs by Bosch DR Algorithm-Comparison with mydriatic fundus photography (ETDRS) for screening in undiagnosed diabetic retinopathy.

    Science.gov (United States)

    Bawankar, Pritam; Shanbhag, Nita; K, S Smitha; Dhawan, Bodhraj; Palsule, Aratee; Kumar, Devesh; Chandel, Shailja; Sood, Suneet

    2017-01-01

    Diabetic retinopathy (DR) is a leading cause of blindness among working-age adults. Early diagnosis through effective screening programs is likely to improve vision outcomes. The ETDRS seven-standard-field 35-mm stereoscopic color retinal imaging (ETDRS) of the dilated eye is elaborate and requires mydriasis, and is unsuitable for screening. We evaluated an image analysis application for the automated diagnosis of DR from non-mydriatic single-field images. Patients suffering from diabetes for at least 5 years were included if they were 18 years or older. Patients already diagnosed with DR were excluded. Physiologic mydriasis was achieved by placing the subjects in a dark room. Images were captured using a Bosch Mobile Eye Care fundus camera. The images were analyzed by the Retinal Imaging Bosch DR Algorithm for the diagnosis of DR. All subjects also subsequently underwent pharmacological mydriasis and ETDRS imaging. Non-mydriatic and mydriatic images were read by ophthalmologists. The ETDRS readings were used as the gold standard for calculating the sensitivity and specificity for the software. 564 consecutive subjects (1128 eyes) were recruited from six centers in India. Each subject was evaluated at a single outpatient visit. Forty-four of 1128 images (3.9%) could not be read by the algorithm, and were categorized as inconclusive. In four subjects, neither eye provided an acceptable image: these four subjects were excluded from the analysis. This left 560 subjects for analysis (1084 eyes). The algorithm correctly diagnosed 531 of 560 cases. The sensitivity, specificity, and positive and negative predictive values were 91%, 97%, 94%, and 95% respectively. The Bosch DR Algorithm shows favorable sensitivity and specificity in diagnosing DR from non-mydriatic images, and can greatly simplify screening for DR. This also has major implications for telemedicine in the use of screening for retinopathy in patients with diabetes mellitus.
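
    For reference, the reported screening metrics follow directly from a 2x2 confusion matrix. The individual true/false positive and negative counts are not given in the abstract, so the example counts below are illustrative values chosen to be consistent with the reported totals (560 analyzed subjects, 531 correctly classified) and the quoted metrics:

```python
# Minimal sketch: screening metrics from a 2x2 confusion matrix.
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts (sum to 560, with 531 correct) that reproduce the
# reported 91% / 97% / 94% / 95% figures:
print(screening_metrics(tp=180, fp=11, fn=18, tn=351))
```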

  19. A new method to detect and correct sample tilt in scanning transmission electron microscopy bright-field imaging

    Energy Technology Data Exchange (ETDEWEB)

    Brown, H.G. [School of Physics, University of Melbourne, Parkville, Victoria 3010 (Australia); Ishikawa, R.; Sánchez-Santolino, G. [Institute of Engineering Innovation, School of Engineering, University of Tokyo, Tokyo 113-8656 (Japan); Lugg, N.R., E-mail: shibata@sigma.t.u-tokyo.ac.jp [Institute of Engineering Innovation, School of Engineering, University of Tokyo, Tokyo 113-8656 (Japan); Ikuhara, Y. [Institute of Engineering Innovation, School of Engineering, University of Tokyo, Tokyo 113-8656 (Japan); Allen, L.J. [School of Physics, University of Melbourne, Parkville, Victoria 3010 (Australia); Shibata, N. [Institute of Engineering Innovation, School of Engineering, University of Tokyo, Tokyo 113-8656 (Japan)

    2017-02-15

    Important properties of functional materials, such as ferroelectric shifts and octahedral distortions, are associated with displacements of the positions of lighter atoms in the unit cell. Annular bright-field scanning transmission electron microscopy is a good experimental method for investigating such phenomena due to its ability to image light and heavy atoms simultaneously. To map atomic positions at the required accuracy, precise angular alignment of the sample with the microscope optical axis is necessary, since misalignment (tilt) of the specimen contributes to errors in position measurements of lighter elements in annular bright-field imaging. In this paper it is shown that it is possible to detect tilt with the aid of images recorded using a central bright-field detector placed within the inner radius of the annular bright-field detector. For a probe focus near the middle of the specimen the central bright-field image becomes especially sensitive to tilt, and we demonstrate experimentally that misalignment can be detected with a precision of less than a milliradian, as we also confirm in simulation. Coma in the probe, an aberration that can be misidentified as tilt of the specimen, is also investigated and it is shown how the effects of coma and tilt can be differentiated. The effects of tilt may be offset to a large extent by shifting the diffraction plane detector by an amount equivalent to the specimen tilt, and we provide an experimental proof of principle of this using a segmented detector system. - Highlights: • Octahedral distortions are associated with displacements of lighter atoms. • Annular bright-field imaging is sensitive to light and heavy atoms simultaneously. • Mistilt of the specimen leads to errors in position measurements of lighter elements. • It is possible to detect tilt using images taken by a central bright-field detector. • Tilt may be offset by shifting the diffraction plane detector by an equivalent amount.
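
    The underlying signal, a tilt-dependent displacement of bright-field intensity in the detector plane, can be illustrated with a registration-based sketch. This shows the general idea under stated assumptions (an on-axis reference pattern and a calibrated mrad-per-pixel scale); the paper's own implementation uses a central bright-field detector and a segmented detector system instead:

```python
# Illustrative sketch: treat mistilt as a sub-pixel shift of the
# bright-field disk, measured by registering the recorded diffraction
# pattern against an assumed on-axis reference pattern.
import numpy as np
from skimage.registration import phase_cross_correlation

def estimate_tilt_mrad(ref_pattern, pattern, mrad_per_pixel):
    """Return the apparent (row, col) disk displacement in milliradians."""
    shift, _, _ = phase_cross_correlation(ref_pattern, pattern,
                                          upsample_factor=50)
    return np.asarray(shift) * mrad_per_pixel

# Compensation as described in the paper: shift the diffraction-plane
# detector by an amount equivalent to the measured specimen tilt.
```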

  20. A machine vision system for automated non-invasive assessment of cell viability via dark field microscopy, wavelet feature selection and classification.

    Science.gov (United States)

    Wei, Ning; Flaschel, Erwin; Friehs, Karl; Nattkemper, Tim Wilhelm

    2008-10-21

    Cell viability is one of the basic properties indicating the physiological state of the cell; thus, it has long been one of the major considerations in biotechnological applications. Conventional methods for extracting information about cell viability usually need reagents to be applied on the targeted cells. These reagent-based techniques are reliable and versatile; however, some of them might be invasive and even toxic to the target cells. In support of automated noninvasive assessment of cell viability, a machine vision system has been developed. This system is based on a supervised learning technique. It learns from images of certain kinds of cell populations and trains classifiers. These trained classifiers are then employed to evaluate the images of given cell populations obtained via dark field microscopy. Wavelet decomposition is performed on the cell images. Energy and entropy are computed for each wavelet subimage as features. A feature selection algorithm is implemented to achieve better performance. Correlation between the results from the machine vision system and commonly accepted gold standards becomes stronger if wavelet features are utilized. The best performance is achieved with a selected subset of wavelet features. The machine vision system based on dark field microscopy in conjunction with supervised machine learning and wavelet feature selection automates the cell viability assessment, and yields comparable results to commonly accepted methods. Wavelet features are found to be suitable to describe the discriminative properties of the live and dead cells in viability classification. According to the analysis, live cells exhibit morphologically more details and are intracellularly more organized than dead ones, which display more homogeneous and diffuse gray values throughout the cells. Feature selection increases the system's performance. The reason lies in the fact that feature selection plays a role of excluding redundant or misleading
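
    A minimal sketch of the described feature extraction, computing energy and Shannon entropy per wavelet subimage; the wavelet family and decomposition level are assumptions for illustration, not the paper's reported settings:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_features(image, wavelet="db2", level=2):
    """Energy and entropy of each subimage of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # coeffs[0] is the approximation; each later entry is (cH, cV, cD)
    subimages = [coeffs[0]] + [band for detail in coeffs[1:] for band in detail]
    features = []
    for sub in subimages:
        energy = np.sum(sub ** 2)
        p = (sub ** 2) / energy if energy > 0 else np.zeros_like(sub)
        p = p[p > 0]
        entropy = -np.sum(p * np.log2(p))  # Shannon entropy of coeff. power
        features.extend([energy, entropy])
    return np.asarray(features)  # input to feature selection / classifier
```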

  1. A machine vision system for automated non-invasive assessment of cell viability via dark field microscopy, wavelet feature selection and classification

    Directory of Open Access Journals (Sweden)

    Friehs Karl

    2008-10-01

    Full Text Available Abstract Background Cell viability is one of the basic properties indicating the physiological state of the cell; thus, it has long been one of the major considerations in biotechnological applications. Conventional methods for extracting information about cell viability usually need reagents to be applied on the targeted cells. These reagent-based techniques are reliable and versatile; however, some of them might be invasive and even toxic to the target cells. In support of automated noninvasive assessment of cell viability, a machine vision system has been developed. Results This system is based on a supervised learning technique. It learns from images of certain kinds of cell populations and trains classifiers. These trained classifiers are then employed to evaluate the images of given cell populations obtained via dark field microscopy. Wavelet decomposition is performed on the cell images. Energy and entropy are computed for each wavelet subimage as features. A feature selection algorithm is implemented to achieve better performance. Correlation between the results from the machine vision system and commonly accepted gold standards becomes stronger if wavelet features are utilized. The best performance is achieved with a selected subset of wavelet features. Conclusion The machine vision system based on dark field microscopy in conjunction with supervised machine learning and wavelet feature selection automates the cell viability assessment, and yields comparable results to commonly accepted methods. Wavelet features are found to be suitable to describe the discriminative properties of the live and dead cells in viability classification. According to the analysis, live cells exhibit morphologically more details and are intracellularly more organized than dead ones, which display more homogeneous and diffuse gray values throughout the cells. Feature selection increases the system's performance. The reason lies in the fact that feature

  2. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Full Text Available Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open-source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.
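
    A hedged sketch of the kind of step such a tool automates, detecting cardiac peaks in a pulse waveform so that noise covariates can be derived; the sampling rate and thresholds are illustrative assumptions, not PhysioNoise defaults:

```python
# Detect cardiac peaks in a physiological waveform; the peak times define
# cardiac phase at each fMRI acquisition, from which noise covariates
# (e.g., RETROICOR-style regressors) can be built.
import numpy as np
from scipy.signal import find_peaks

def cardiac_peaks(waveform, fs_hz=500.0, max_rate_hz=3.0):
    """Return peak times in seconds, assuming heart rate below 180 bpm."""
    min_distance = int(fs_hz / max_rate_hz)  # refractory period in samples
    peaks, _ = find_peaks(waveform, distance=min_distance,
                          prominence=np.std(waveform))
    return peaks / fs_hz
```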

  3. Characterization of radiation beams used to determine the correction factor for a CyberKnife® unit reference field using ionization chambers

    Energy Technology Data Exchange (ETDEWEB)

    Aragón-Martínez, Nestor, E-mail: nestoraragon@fisica.unam.mx; Massillon-JL, Guerda, E-mail: massillon@fisica.unam.mx [Instituto de Física, Universidad Nacional Autónoma de México, D.F (Mexico); Gómez-Muñoz, Arnulfo [Hospital de Oncología, Centro Médico Nacional Siglo XXI, D.F (Mexico)

    2014-11-07

    This paper aimed to characterize a 6 MV x-ray beam from a Varian® iX linear accelerator in order to obtain the correction factors needed by the IAEA/AAPM new formalism¹. The experiments were performed in a liquid water phantom under different irradiation conditions: a) calibration of the reference field of 10 cm × 10 cm at 90 cm SSD and 10 cm depth was carried out according to the TRS-398 protocol using three ionization chambers (IC) calibrated in different reference laboratories, and b) measurement of the absorbed dose rate at 70 cm SSD and 10 cm depth in 10 cm × 10 cm and 5.4 cm × 5.4 cm fields was performed in order to simulate the CyberKnife® conditions, where the maximum distance between the source and the detector is equal to 80 cm and the maximum field size is 6 cm diameter. Depending on where the IC was calibrated, differences between 0.16% and 2.24% in the absorbed dose rate measured in the 10 cm × 10 cm field at 90 cm SSD were observed, while for the measurements at 70 cm SSD, differences between 1.27% and 3.88% were obtained. For the 5.4 cm × 5.4 cm field, the absorbed dose measured with the three ICs varies between 1.37% and 3.52%. The increase in the difference in the absorbed dose when decreasing the SSD could possibly be associated with scattering radiation generated from the collimators and/or the energy dependence of the ionization chambers to low-energy radiation. The results presented in this work suggest the importance of simulating the CyberKnife® conditions using another linear accelerator for obtaining the correction factors, as proposed by the IAEA/AAPM new formalism, in order to measure the absorbed dose with acceptable accuracy.

  4. MRI intensity inhomogeneity correction by combining intensity and spatial information

    International Nuclear Information System (INIS)

    Vovk, Uros; Pernus, Franjo; Likar, Bostjan

    2004-01-01

    We propose a novel fully automated method for retrospective correction of intensity inhomogeneity, which is an undesired phenomenon in many automatic image analysis tasks, especially if quantitative analysis is the final goal. Besides the most commonly used intensity features, additional spatial image features are incorporated to improve inhomogeneity correction and to make it more dynamic, so that local intensity variations can be corrected more efficiently. The proposed method is a four-step iterative procedure in which a non-parametric inhomogeneity correction is conducted. First, the probability distribution of image intensities and corresponding second derivatives is obtained. Second, intensity correction forces, condensing the probability distribution along the intensity feature, are computed for each voxel. Third, the inhomogeneity correction field is estimated by regularization of all voxel forces, and fourth, the corresponding partial inhomogeneity correction is performed. The degree of inhomogeneity correction dynamics is determined by the size of the regularization kernel. The method was qualitatively and quantitatively evaluated on simulated and real MR brain images. The obtained results show that the proposed method does not corrupt inhomogeneity-free images and successfully corrects intensity inhomogeneity artefacts even if these are more dynamic.
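
    The four-step non-parametric procedure above is the paper's own contribution. As a readily available point of comparison for retrospective bias-field correction, here is a sketch using the widely used N4 algorithm via SimpleITK; this is a swapped-in technique, not the authors' method, and the file names are placeholders:

```python
import SimpleITK as sitk

# Read the image and build a rough foreground mask (placeholder file names)
image = sitk.ReadImage("t1.nii.gz", sitk.sitkFloat32)
mask = sitk.OtsuThreshold(image, 0, 1, 200)

# N4 retrospective bias-field (inhomogeneity) correction
corrector = sitk.N4BiasFieldCorrectionImageFilter()
corrector.SetMaximumNumberOfIterations([50] * 4)  # 4 resolution levels
corrected = corrector.Execute(image, mask)

sitk.WriteImage(corrected, "t1_biascorrected.nii.gz")
```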

  5. Mixed model phase evolution for correction of magnetic field inhomogeneity effects in 3D quantitative gradient echo-based MRI

    DEFF Research Database (Denmark)

    Fatnassi, Chemseddine; Boucenna, Rachid; Zaidi, Habib

    2017-01-01

    and at the paranasal sinuses, however, this assumption is often broken. Herein, we explored a novel model that considers both linear and stochastic dependences of the phase evolution with echo time in the presence of weak and strong macroscopic field inhomogeneities. We tested the performance of the model at large...

  6. Embedded system for building automation

    OpenAIRE

    Rolih, Andrej

    2014-01-01

    Home automation is a fast-developing field of computer science and electronics. Companies offer many different products for home automation, ranging from complete systems for building management and control to simple smart lights that can be connected to the internet. These products offer the user greater living comfort and lower expenses by reducing energy usage. This thesis shows the development of a simple home automation system that focuses mainly on the enhance...

  7. Comparison of visual field test results obtained through Humphrey matrix frequency doubling technology perimetry versus standard automated perimetry in healthy children

    Directory of Open Access Journals (Sweden)

    Sibel Kocabeyoglu

    2013-01-01

    Full Text Available Aims: The aim of this study was to compare the visual field test results in healthy children obtained via the Humphrey matrix 24-2 threshold program and standard automated perimetry (SAP) using the Swedish interactive threshold algorithm (SITA-Standard) 24-2 test. Materials and Methods: This prospective study included 55 healthy children without ocular or systemic disorders who underwent both SAP and frequency doubling technology (FDT) perimetry visual field testing. Visual field test reliability indices, test duration, global indices (mean deviation [MD] and pattern standard deviation [PSD]) were compared between the 2 tests using the Wilcoxon signed-rank test and paired t-test. The performance of the Humphrey field analyzer (HFA) 24-2 SITA-Standard and frequency-doubling technology Matrix 24-2 tests between genders was compared with the Mann-Whitney U-test. Results: Fifty-five healthy children with a mean age of 12.2 ± 1.9 years (range, 8 to 16 years) were included in this prospective study. The test durations of SAP and FDT were similar (5.2 ± 0.5 and 5.1 ± 0.2 min, respectively, P = 0.651). MD and the PSD values obtained via FDT Matrix were significantly higher than those obtained via SAP (P < 0.001), and fixation losses and false negative errors were significantly less with SAP (P < 0.05). A weak positive correlation between the two tests in terms of MD (r = 0.352, P = 0.008) and PSD (r = 0.329, P = 0.014) was observed. Conclusion: Children were able to complete both visual field test algorithms successfully within 6 min. However, SAP testing appears to be associated with less depression of the visual field indices of healthy children. FDT Matrix and SAP should not be used interchangeably in the follow-up of children.

  8. On the possibility of a relativistic correction to the E and B fields around a current-carrying wire

    International Nuclear Information System (INIS)

    Folman, Ron

    2013-01-01

    It is well known that electric and magnetic fields may change when they are observed from different frames of reference. For example, the motion of a charged probe particle moving parallel to a current-carrying wire would be described by utilizing different electric or magnetic fields, depending on from which frame of reference the system is observed and described. To describe the situation in all frames by utilizing the theory of relativity, one has to first describe the situation in one particular frame, and this choice in the case of a current-carrying wire is the topic of this paper. Specifically, I consider the question of in which frame the current-carrying wire is neutral. The importance of relaxation processes is emphasized. As an example, I examine a specific alternative to the standard choice, and consider its theoretical and experimental validity. An outcome of alternative approaches is that in the rest frame of a wire, running a current also introduces an electric field by giving rise to a minute charge. Present-day experimental sensitivities, specifically those of cold ions, may be able to differentiate between the observable signatures predicted by the different approaches.
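
    The frame dependence referred to here is the standard Lorentz transformation of the fields for a boost with velocity v (SI units), reproduced for clarity:

```latex
\begin{aligned}
\mathbf{E}'_{\parallel} &= \mathbf{E}_{\parallel}, &\qquad
\mathbf{E}'_{\perp} &= \gamma\,(\mathbf{E} + \mathbf{v}\times\mathbf{B})_{\perp},\\
\mathbf{B}'_{\parallel} &= \mathbf{B}_{\parallel}, &\qquad
\mathbf{B}'_{\perp} &= \gamma\,\Bigl(\mathbf{B} - \tfrac{1}{c^{2}}\,\mathbf{v}\times\mathbf{E}\Bigr)_{\perp},
\end{aligned}
\qquad \gamma = \bigl(1 - v^{2}/c^{2}\bigr)^{-1/2}.
```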

  9. Quantitative pixel-wise measurement of myocardial blood flow: the impact of surface coil-related field inhomogeneity and a comparison of methods for its correction.

    Science.gov (United States)

    Miller, Christopher A; Hsu, Li-Yueh; Ta, Allison; Conn, Hannah; Winkler, Susanne; Arai, Andrew E

    2015-02-11

    Surface coil-related field inhomogeneity potentially confounds pixel-wise quantitative analysis of perfusion CMR images. This study assessed the effect of surface coil-related field inhomogeneity on the spatial variation of pixel-wise myocardial blood flow (MBF), and assessed its impact on the ability of MBF quantification to differentiate ischaemic from remote coronary territories. Two surface coil intensity correction (SCIC) techniques were evaluated: 1) a proton density-based technique (PD-SCIC); and 2) a saturation recovery steady-state free precession-based technique (SSFP-SCIC). 26 subjects (18 with significant CAD and 8 healthy volunteers) underwent stress perfusion CMR using a motion-corrected, saturation recovery SSFP dual-sequence protocol. A proton density (PD)-weighted image was acquired at the beginning of the sequence. Surface coil-related field inhomogeneity was approximated using a third-order surface fit to the PD image or a pre-contrast saturation prepared SSFP image. The estimated intensity bias field was subsequently applied to the image series. Pixel-wise MBF was measured from mid-ventricular stress images using the two SCIC approaches and compared to measurements made without SCIC. MBF heterogeneity in healthy volunteers was higher using SSFP-SCIC (24.8 ± 4.1%) compared to PD-SCIC (20.8 ± 3.0%; p = 0.009), however heterogeneity was significantly lower using either SCIC technique compared to analysis performed without SCIC (36.2 ± 6.3%). In CAD patients, the difference in MBF between remote and ischaemic territories was minimal when analysis was performed without SCIC (0.06 ± 0.91 mL/min/kg), and was substantially lower than with either PD-SCIC (0.50 ± 0.63 mL/min/kg; p = 0.013) or with SSFP-SCIC (0.63 ± 0.89 mL/min/kg; p = 0.005). In 6 patients, MBF quantified without SCIC was artifactually higher in the stenosed coronary territory compared to the remote territory. PD-SCIC and SSFP-SCIC had similar differences in MBF between remote and
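
    A minimal sketch of the surface-fit step common to both SCIC techniques: fit a third-order 2-D polynomial to the calibration image (PD or pre-contrast SSFP) and divide the resulting unit-mean bias field out of the perfusion series. Masking and normalization details are simplified assumptions:

```python
import numpy as np

def third_order_surface(calib_image, mask):
    """Fit a 3rd-order 2-D polynomial surface inside a boolean mask."""
    ny, nx = calib_image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    x = x / nx - 0.5
    y = y / ny - 0.5
    # Design matrix with all monomials x^i * y^j, i + j <= 3 (10 terms)
    terms = [x**i * y**j for i in range(4) for j in range(4 - i)]
    A = np.stack([t[mask] for t in terms], axis=1)
    coef, *_ = np.linalg.lstsq(A, calib_image[mask], rcond=None)
    surface = sum(c * t for c, t in zip(coef, terms))
    return surface / surface[mask].mean()  # unit-mean bias field

# corrected_series = image_series / third_order_surface(pd_image, mask)
```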

  10. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  11. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media, and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines, supporting collection, storage, administration, processing, preservation, communication, etc.

  12. The Effective Dynamic Ranges for Glaucomatous Visual Field Progression With Standard Automated Perimetry and Stimulus Sizes III and V.

    Science.gov (United States)

    Wall, Michael; Zamba, Gideon K D; Artes, Paul H

    2018-01-01

    It has been shown that threshold estimates below approximately 20 dB have little effect on the ability to detect visual field progression in glaucoma. We aimed to compare stimulus size V to stimulus size III, in areas of visual damage, to confirm these findings by using (1) a different dataset, (2) different techniques of progression analysis, and (3) an analysis to evaluate the effect of censoring on mean deviation (MD). In the Iowa Variability in Perimetry Study, 120 glaucoma subjects were tested every 6 months for 4 years with size III SITA Standard and size V Full Threshold. Progression was determined with three complementary techniques: pointwise linear regression (PLR), permutation of PLR, and linear regression of the MD index. All analyses were repeated on "censored" datasets in which threshold estimates below a given criterion value were set to equal the criterion value. Our analyses confirmed previous observations that threshold estimates below 20 dB contribute much less to visual field progression than estimates above this range. These findings were broadly similar with stimulus sizes III and V. Censoring of threshold values < 20 dB has relatively little impact on the rates of visual field progression in patients with mild to moderate glaucoma. Size V, which has lower retest variability, performs at least as well as size III for longitudinal glaucoma progression analysis and appears to have a larger useful dynamic range owing to the upper sensitivity limit being higher.
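
    The censoring analysis itself is simple to express. A minimal sketch, assuming per-visit threshold maps and using the mean sensitivity as a crude stand-in for the MD index (the real MD is an age-corrected, weighted deviation):

```python
import numpy as np

def censored_md_slope(fields_db, times_yr, criterion_db=20.0):
    """fields_db: (n_visits, n_locations) thresholds; returns dB/year."""
    censored = np.maximum(fields_db, criterion_db)  # floor at the criterion
    md = censored.mean(axis=1)                      # crude MD surrogate
    slope, _ = np.polyfit(times_yr, md, 1)          # linear regression of MD
    return slope
```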

  13. Interactive and automated application of virtual microscopy.

    Science.gov (United States)

    Kayser, Klaus; Görtler, Jürgen; Borkenfeld, Stephan; Kayser, Gian

    2011-03-30

    Virtual microscopy can be applied in an interactive and an automated manner. Interactive application is performed in close association with conventional microscopy. It includes image standardization suited to the performance of an individual pathologist, such as image colorization, white color balance, or individually adjusted brightness. The steering commands have to include selection of the wanted magnification, easy navigation, notification, and simple measurements (distances, areas). The display of the histological image should be adjusted to the physical limits of the human eye, which are determined by a view angle of approximately 35 seconds. A more sophisticated performance should include acoustic commands that replace the corresponding visual commands. Automated virtual microscopy includes so-called microscopy assistants, which can be defined similarly to the assistants developed in computer-based editing systems (Microsoft Word, etc.). These include automated image standardization and correction algorithms that exclude images of poor quality (for example uni-colored or out-of-focus images), automated selection of the most appropriate field of view, automated selection of the best magnification, and finally proposals of the most probable diagnosis. A quality control of the final diagnosis, and feedback to the laboratory, complete the proposed system. The already developed tools of such a system are described in detail, as well as the results of first trials. In order to enhance the speed of such a system, and to allow further user-independent development, a distributed implementation probably based upon Grid technology seems to be appropriate. The advantages of such a system, as well as the present pathology environment and its expectations, will be discussed in detail.

  14. On the computational assessment of white matter hyperintensity progression: difficulties in method selection and bias field correction performance on images with significant white matter pathology

    Energy Technology Data Exchange (ETDEWEB)

    Valdes Hernandez, Maria del C.; Gonzalez-Castro, Victor; Wang, Xin; Doubal, Fergus; Munoz Maniega, Susana; Wardlaw, Joanna M. [Centre for Clinical Brain Sciences, Department of Neuroimaging Sciences, Edinburgh (United Kingdom); Ghandour, Dina T. [University of Edinburgh, College of Medicine and Veterinary Medicine, Edinburgh (United Kingdom); Armitage, Paul A. [University of Sheffield, Department of Cardiovascular Sciences, Sheffield (United Kingdom)

    2016-05-15

    Subtle inhomogeneities in the scanner's magnetic fields (B₀ and B₁) alter the intensity levels of the structural magnetic resonance imaging (MRI) affecting the volumetric assessment of WMH changes. Here, we investigate the influence that (1) correcting the images for the B₁ inhomogeneities (i.e. bias field correction (BFC)) and (2) selection of the WMH change assessment method can have on longitudinal analyses of WMH progression and discuss possible solutions. We used brain structural MRI from 46 mild stroke patients scanned at stroke onset and 3 years later. We tested three BFC approaches: FSL-FAST, N4 and exponentially entropy-driven homomorphic unsharp masking (E²D-HUM) and analysed their effect on the measured WMH change. Separately, we tested two methods to assess WMH changes: measuring WMH volumes independently at both time points semi-automatically (MCMxxxVI) and subtracting intensity-normalised FLAIR images at both time points following image gamma correction. We then combined the BFC with the computational method that performed best across the whole sample to assess WMH changes. Analysis of the difference in the variance-to-mean intensity ratio in normal tissue between BFC and uncorrected images and visual inspection showed that all BFC methods altered the WMH appearance and distribution, but FSL-FAST in general performed more consistently across the sample and MRI modalities. The WMH volume change over 3 years obtained with MCMxxxVI with vs. without FSL-FAST BFC did not significantly differ (medians (IQR): 3.2 (6.3) ml with BFC vs. 2.9 (7.4) ml without BFC, p = 0.5), but both differed significantly from the WMH volume change obtained from subtracting post-processed FLAIR images without BFC (7.6 (8.2) ml, p < 0.001). This latter method considerably inflated the WMH volume change as subtle WMH at baseline that became more intense at follow-up were counted as increase in the volumetric change. Measurement of WMH volume change remains
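
    A hedged sketch of the subtraction-based change measurement discussed above: intensity-normalise the two FLAIR volumes, apply a gamma correction, subtract, and count voxels that brightened. The normalisation percentiles, gamma and threshold are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

def wmh_change_ml(flair_t1, flair_t2, mask, voxel_ml, gamma=1.5, thresh=0.1):
    """Approximate WMH volume increase (ml) between two FLAIR volumes."""
    def norm(img):
        lo, hi = np.percentile(img[mask], [2, 98])   # robust intensity range
        img = np.clip((img - lo) / (hi - lo), 0, 1)  # normalise to [0, 1]
        return img ** gamma                          # gamma correction
    diff = norm(flair_t2) - norm(flair_t1)
    growing = (diff > thresh) & mask                 # voxels that brightened
    return growing.sum() * voxel_ml
```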

  15. Sodium arsanilate-induced vestibular dysfunction in rats: effects on open-field behavior and spontaneous activity in the automated digiscan monitoring system.

    Science.gov (United States)

    Ossenkopp, K P; Prkacin, A; Hargreaves, E L

    1990-08-01

    Vestibular dysfunction was chemically induced in Long-Evans rats by intratympanic injections (30 mg per side) of sodium arsanilate (atoxyl). Following a one-week recovery period the rats were behaviorally assayed for integrity of the labyrinthine systems. All subjects were tested for presence of the air-righting reflex, the contact-righting reflex (by lightly holding a sheet of Plexiglas against the soles of the rat's feet), and body rotation-induced nystagmus. All animals were then tested for their ability to remain on a small (15 x 15 cm) platform. Next, the subjects were given two 10-min open-field tests during which ambulation, rearing, grooming, and defecation responses were recorded. Four to five weeks later all rats were tested twice (60 min per session) in the automated Digiscan Activity Monitor which provides a multivariate assessment of spontaneous motor activity. The rats with vestibular dysfunction (Group VNX) took significantly less time to fall off the platform (p < 0.01). They also exhibited significantly more open-field ambulation but fewer rearing responses (ps < 0.01). An examination of group correlation coefficients for open-field variables and the platform test scores revealed some interesting group differences (ps < 0.05). In the Digiscan tests the atoxyl-treated rats exhibited a smaller number of horizontal movements, but increased speed for these movements (ps < 0.05). Vertical movements did not differ significantly in incidence, but these movements were greatly reduced in duration (p < 0.001). (ABSTRACT TRUNCATED AT 250 WORDS)

  16. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  17. A robust approach to correct for pronounced errors in temperature measurements by controlling radiation damping feedback fields in solution NMR

    Science.gov (United States)

    Wolahan, Stephanie M.; Li, Zhao; Hsu, Chao-Hsiung; Huang, Shing-Jong; Clubb, Robert; Hwang, Lian-Pin; Lin, Yung-Ya

    2014-11-01

    Accurate temperature measurement is a requisite for obtaining reliable thermodynamic and kinetic information in all NMR experiments. A widely used method to calibrate sample temperature depends on a secondary standard with temperature-dependent chemical shifts to report the true sample temperature, such as the hydroxyl proton in neat methanol or neat ethylene glycol. The temperature-dependent chemical shift of the hydroxyl protons arises from the sensitivity of the hydrogen-bond network to small changes in temperature. The frequency separation between the alkyl and the hydroxyl protons is then converted to sample temperature. Temperature measurements by this method, however, have been reported to be inconsistent and incorrect in modern NMR, particularly for spectrometers equipped with cryogenically-cooled probes. Such errors make it difficult or even impossible to study chemical exchange and molecular dynamics or to compare data acquired on different instruments, as is frequently done in biomolecular NMR. In this work, we identify the physical origins for such errors to be unequal amounts of dynamical frequency shift on the alkyl and the hydroxyl protons induced by strong radiation damping (RD) feedback fields. Common methods used to circumvent RD may not suppress such errors. A simple, easy-to-implement solution was demonstrated that neutralizes the RD effect on the frequency separation by a "selective crushing recovery" pulse sequence to equalize the transverse magnetization of both spin species. Experiments using cryoprobes at 500 MHz and 800 MHz demonstrated that this approach can effectively reduce the errors in temperature measurements from about ±4.0 K to within ±0.4 K in general.
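
    For neat methanol, the conversion from the measured hydroxyl-alkyl frequency separation (in ppm) to temperature uses a well-known empirical calibration; a minimal sketch with the commonly quoted literature coefficients, given here for illustration:

```python
def methanol_temperature_K(delta_ppm: float) -> float:
    """Sample temperature (K) from the OH-CH3 shift separation in ppm.

    Coefficients are the standard literature calibration for neat methanol;
    the RD-induced errors discussed above bias delta_ppm itself, which is
    why they propagate directly into the reported temperature.
    """
    return 409.0 - 36.54 * delta_ppm - 21.85 * delta_ppm ** 2

# Example: a separation of 1.50 ppm corresponds to ~305 K.
print(methanol_temperature_K(1.50))
```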

  18. Beam Trajectory Correction for SNS

    CERN Document Server

    Chu, Chungming

    2005-01-01

    Automated beam trajectory correction with dipole correctors is developed and tested during the Spallation Neutron Source warm linac commissioning periods. The application is based on the XAL Java framework with newly developed optimization tools. Also, dipole corrector polarities and strengths, and beam position monitor (BPM) polarities were checked by an orbit difference program. The on-line model is used in both the trajectory correction and the orbit difference applications. Experimental data for both applications will be presented.
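
    Such trajectory correction is typically posed as a linear least-squares problem through the orbit response matrix; a minimal sketch of that standard approach (the response matrix would come from the on-line model, and all values here are illustrative):

```python
import numpy as np

def correction_kicks(R, bpm_readings, rcond=1e-3):
    """Dipole corrector strengths minimising |R @ theta + x|.

    R: (n_bpms, n_correctors) response matrix, BPM shift per unit kick.
    bpm_readings: measured trajectory offsets x at the BPMs.
    """
    theta = -np.linalg.pinv(R, rcond=rcond) @ bpm_readings
    return theta

# After applying theta, the new orbit x + R @ theta should be near zero.
```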

  19. Decision Making In A High-Tech World: Automation Bias and Countermeasures

    Science.gov (United States)

    Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management system computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples. Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and

  20. Ground-based CCD astrometry with wide field imagers. IV. An improved geometric-distortion correction for the blue prime-focus camera at the LBT

    Science.gov (United States)

    Bellini, A.; Bedin, L. R.

    2010-07-01

    High-precision astrometry requires an accurate geometric-distortion solution. In this work, we present an average correction for the blue camera of the Large Binocular Telescope which enables a relative astrometric precision of ~15 mas for the B_Bessel and V_Bessel broad-band filters. The result of this effort is used in two companion papers: the first to measure the absolute proper motion of the open cluster M 67 with respect to the background galaxies; the second to decontaminate the color-magnitude diagram of M 67 from field objects, enabling the study of the end of its white dwarf cooling sequence. Many other applications might find this distortion correction useful. Based on data acquired using the Large Binocular Telescope (LBT) at Mt. Graham, Arizona, under the Commissioning of the Large Binocular Blue Camera. The LBT is an international collaboration among institutions in the United States, Italy and Germany. LBT Corporation partners are: The University of Arizona on behalf of the Arizona university system; Istituto Nazionale di Astrofisica, Italy; LBT Beteiligungsgesellschaft, Germany, representing the Max-Planck Society, the Astrophysical Institute Potsdam, and Heidelberg University; The Ohio State University, and The Research Corporation, on behalf of The University of Notre Dame, University of Minnesota and University of Virginia. Visiting Ph.D. Student at STScI under the “2008 graduate research assistantship” program.

  1. Lithography focus/exposure control and corrections to improve CDU at post etch step

    Science.gov (United States)

    Kim, Young Ki; Yelverton, Mark; Tristan, John; Lee, Joungchel; Gutjahr, Karsten; Hsu, Ching-Hsiang; Wei, Hong; Wang, Lester; Li, Chen; Subramany, Lokesh; Chung, Woong Jae; Kim, Jeong Soo; Ramanathan, Vidya; Yap, LipKong; Gao, Jie; Karur-Shanmugam, Ram; Golotsvan, Anna; Herrera, Pedro; Huang, Kevin; Pierson, Bill

    2014-04-01

    As leading-edge lithography moves to advanced nodes in a high-mix, high-volume manufacturing environment, automated control of critical dimension (CD) within wafer has become a requirement. Current control methods to improve CD uniformity (CDU) generally rely upon the use of field-by-field exposure corrections via factory automation or through a scanner sub-recipe. Such CDU control methods are limited to the lithography step and cannot be extended to the etch step. In this paper, a new method to improve CDU at the post-etch step by optimizing exposure at the lithography step is introduced. This new solution utilizes GLOBALFOUNDRIES' factory automation system and KLA-Tencor's K-T Analyzer as the infrastructure to calculate and feed the necessary field-by-field exposure corrections back to the scanner, so as to achieve the optimal CDU at the post-etch step. CDs at the post-lithography and post-etch steps are measured by scatterometry metrology tools respectively and are used by K-T Analyzer as the input for correction calculations. This paper will explain in detail the philosophy as well as the methodology behind this novel CDU control solution. In addition, applications and use cases will be reviewed to demonstrate the capability and potential of this solution. The feasibility of adopting this solution in a high-mix, high-volume manufacturing environment will be discussed as well.
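
    The feedback step at the heart of such a scheme can be sketched as converting each field's post-etch CD error into an exposure offset through a locally linear dose sensitivity, damped by a gain; the numbers and names below are illustrative assumptions, not the implementation described in the paper:

```python
import numpy as np

def exposure_offsets(cd_measured, cd_target, dose_sensitivity, gain=0.7):
    """Per-field exposure offsets (mJ/cm^2) from post-etch CD errors.

    cd_measured: per-field CDs (nm); dose_sensitivity: nm per mJ/cm^2;
    gain: damping factor to keep the feedback loop stable.
    """
    cd_error = cd_measured - cd_target
    return -gain * cd_error / dose_sensitivity

# Example: three fields measured against a 45.0 nm target
offsets = exposure_offsets(np.array([45.8, 44.9, 45.3]), 45.0, 2.0)
```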

  2. High-Field Diffusion MR Histology: Image-Based Correction of Eddy-Current Ghosts in Diffusion-Weighted Rapid Acquisition With Relaxation Enhancement (DW-RARE)

    Science.gov (United States)

    Tyszka, J. Michael; Frank, Lawrence R.

    2015-01-01

    High-resolution, diffusion-weighted (DW) MR microscopy is gaining increasing acceptance as a nondestructive histological tool for the study of fixed tissue samples. Spin-echo sequences are popular for high-field diffusion imaging due to their high tolerance to B0 field inhomogeneities. Volumetric DW rapid acquisition with relaxation enhancement (DW-RARE) currently offers the best tradeoff between imaging efficiency and image quality, but is relatively sensitive to residual eddy-current effects on the echo train phase, resulting in encoding direction-dependent ghosting in the DW images. We introduce two efficient, image-based phase corrections for ghost artifact reduction in DW-RARE of fixed tissue samples, neither of which require navigator echo acquisition. Both methods rely on the phase difference in k-space between the unweighted reference image and a given DW image and assume a constant, per-echo phase error arising from residual eddy-current effects in the absence of sample motion. Significant qualitative and quantitative ghost artifact reductions are demonstrated for individual DW and calculated diffusion tensor images. PMID:19097246
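
    A minimal sketch of the stated correction principle, assuming the k-space lines can be grouped by the echo that acquired them: estimate one constant phase error per echo from the reference/DW k-space phase difference, then remove it:

```python
import numpy as np

def correct_per_echo_phase(kspace_dw, kspace_ref, echo_index):
    """kspace_*: (n_lines, n_read) complex arrays; echo_index: echo per line."""
    corrected = kspace_dw.copy()
    for echo in np.unique(echo_index):
        lines = echo_index == echo
        # Constant per-echo phase error, averaged over the echo's lines
        phase = np.angle(np.sum(kspace_dw[lines] * np.conj(kspace_ref[lines])))
        corrected[lines] *= np.exp(-1j * phase)  # remove the phase error
    return corrected
```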

  3. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still required human intervention, improvements were focused on the interactive processing section (data input and correction operations), which necessitates a vast amount of work. As a result, human intervention was eliminated, achieving the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as described below. (1) The interactive processing time required for generating drawings was improved. It was determined that introduction of the CAD system has reduced the time required for generating drawings. (2) The difference in skills between workers preparing drawings has been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  4. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  5. Effects of dynamical paths on the energy gap and the corrections to the free energy in path integrals of mean-field quantum spin systems

    Science.gov (United States)

    Koh, Yang Wei

    2018-03-01

    In current studies of mean-field quantum spin systems, much attention is placed on the calculation of the ground-state energy and the excitation gap, especially the latter, which plays an important role in quantum annealing. In pure systems, the finite gap can be obtained by various existing methods such as the Holstein-Primakoff transform, while the tunneling splitting at first-order phase transitions has also been studied in detail using instantons in many previous works. In disordered systems, however, it remains challenging to compute the gap of large-size systems with specific realization of disorder. Hitherto, only quantum Monte Carlo techniques are practical for such studies. Recently, Knysh [Nature Comm. 7, 12370 (2016), 10.1038/ncomms12370] proposed a method where the exponentially large dimensionality of such systems is condensed onto a random potential of much lower dimension, enabling efficient study of such systems. Here we propose a slightly different approach, building upon the method of static approximation of the partition function widely used for analyzing mean-field models. Quantum effects giving rise to the excitation gap and nonextensive corrections to the free energy are accounted for by incorporating dynamical paths into the path integral. The time-dependence of the trace of the time-ordered exponential of the effective Hamiltonian is calculated by solving a differential equation perturbatively, yielding a finite-size series expansion of the path integral. Formulae for the first excited-state energy are proposed to aid in computing the gap. We illustrate our approach using the infinite-range ferromagnetic Ising model and the Hopfield model, both in the presence of a transverse field.

  6. Author Correction

    DEFF Research Database (Denmark)

    Grundle, D S; Löscher, C R; Krahmann, G

    2018-01-01

    A correction to this article has been published and is linked from the HTML and PDF versions of this paper. The error has not been fixed in the paper.

  7. UTILIZACIÓN DE SOFTWARE DE CORRECCIÓN AUTOMÁTICA EN EL CAMPO DE LAS CIENCIAS DE LA SALUD Using automatic correction software in the field of health sciences

    Directory of Open Access Journals (Sweden)

    Ferrán Prados

    2010-06-01

    Full Text Available We are living through a period of profound change in university education. The implementation of the Bologna plan has led us to propose new teaching methodologies, to review the role of the student, competency-based assessment, and the incorporation of ICT, things that were unthinkable little more than a decade ago. Among the different computing platforms, those that allow automatic correction of exercises deserve particular attention, because they are instruments of great pedagogical interest: they assess students instantly and give them feedback on their knowledge in the form of a help message or a grade. If we combine the power of these tools with that of the Internet, using an e-learning environment, the result makes it possible to work, correct, evaluate, resolve doubts, and so on, from anywhere and at any time. This paper presents part of such a platform and the results of its use in the field of health sciences.

  8. Automated External Defibrillator

    Science.gov (United States)

    Health topic page covering what an automated external defibrillator (AED) is, its role in survival, and training to use an AED.

  9. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  10. Correct Models

    OpenAIRE

    Blacher, René

    2010-01-01

    This report completes the two previous reports and provides a simpler explanation of the earlier results, namely a proof that the sequences obtained are random. In previous reports, we have shown how to transform a text $y_n$ into a random sequence by using Fibonacci functions $T_q$. Now, in this report, we obtain a clearer result by proving that $T_q(y_n)$ has the IID model as a correct model. But it is necessary to define correctly what a correct model is. Then, we also study this pro...

  11. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  12. Automated one-loop calculations with GOSAM

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, Gavin [Edinburgh Univ. (United Kingdom). School of Physics and Astronomy; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Greiner, Nicolas [Illinois Univ., Urbana-Champaign, IL (United States). Dept. of Physics; Max-Planck-Institut fuer Physik, Muenchen (Germany); Heinrich, Gudrun; Reiter, Thomas [Max-Planck-Institut fuer Physik, Muenchen (Germany); Luisoni, Gionata [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Mastrolia, Pierpaolo [Max-Planck-Institut fuer Physik, Muenchen (Germany); Padua Univ. (Italy). Dipt. di Fisica; Ossola, Giovanni [New York City Univ., NY (United States). New York City College of Technology; New York City Univ., NY (United States). The Graduate School and University Center; Tramontano, Francesco [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2011-11-15

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop. (orig.)

  13. Automated one-loop calculations with GOSAM

    International Nuclear Information System (INIS)

    Cullen, Gavin; Greiner, Nicolas; Heinrich, Gudrun; Reiter, Thomas; Luisoni, Gionata

    2011-11-01

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop. (orig.)

  14. Automated MRI segmentation for individualized modeling of current flow in the human head

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

    Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in the resulting HD-tDCS models, and the optimized current flow intensities on cortical targets. Main results. The segmentation tool segments not just the brain but also provides accurate results for CSF, skull and other soft tissues, with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance. Fully
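
    The deviation metric used to compare the automated and manual results is not spelled out in this record; a common way to score agreement between two segmentations is the Dice overlap, sketched below in Python on two invented binary masks (the 64x64 grid and the mask offsets are purely illustrative).

      import numpy as np

      # Dice coefficient between two boolean masks (1.0 = identical).
      # A generic illustration, not the paper's actual deviation metric.
      def dice(a, b):
          a, b = np.asarray(a, bool), np.asarray(b, bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      manual = np.zeros((64, 64), bool)
      manual[16:48, 16:48] = True          # invented manual segmentation
      auto = np.zeros((64, 64), bool)
      auto[18:50, 16:48] = True            # invented automated segmentation
      print(f"Dice overlap: {dice(manual, auto):.3f}")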

  15. Recent progress in the field of automated welding applied to maintenance activities; Avancees recentes dans le domaine du soudage automatique et robotise applique a la maintenance des equipements

    Energy Technology Data Exchange (ETDEWEB)

    Cullafroz, M. [FRAMATOME ANP, 92 - Paris-La-Defence (France)

    2004-07-01

    Automated and robotic welding has five advantages over manual welding: (1) under some conditions, automated circular welding does not require the requalification testing that manual welding does; (2) the welding heads of robots are smaller than manual gear, so they can enter and treat complex piping; (3) with an adequate viewing system the operator can stand more than 10 meters away from the welding site, which cuts the radiation dose he receives by a factor of 1.5 to 2; (4) whatever the configuration, the deposition rate in automated welding stays high, the quality standard is steady, and the risk of repair work is low; (5) there is a gain in productivity if adequate equipment is used. In general, automated welding uses a TIG welding process and is applied in maintenance activities to: the main primary system and other circuits in austenitic stainless steels; the main secondary system and other circuits in low-carbon steels; and the closure of spent fuel canisters. An application to the repair of BWR pipes is shown. (A.C.)

  16. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The effectiveness of automation is estimated by calculating the failure probability of human performance related to the cognitive activities

  17. Publisher Correction

    DEFF Research Database (Denmark)

    Flachsbart, Friederike; Dose, Janina; Gentschew, Liljana

    2018-01-01

    The original version of this Article contained an error in the spelling of the author Robert Häsler, which was incorrectly given as Robert Häesler. This has now been corrected in both the PDF and HTML versions of the Article....

  18. Publisher Correction

    DEFF Research Database (Denmark)

    Stokholm, Jakob; Blaser, Martin J.; Thorsen, Jonathan

    2018-01-01

    The originally published version of this Article contained an incorrect version of Figure 3 that was introduced following peer review and inadvertently not corrected during the production process. Both versions contain the same set of abundance data, but the incorrect version has the children...

  19. Impact of Office Automation: An Empirical Assessment

    Science.gov (United States)

    1988-12-01

    [OCR-damaged scan of a DTIC record; only fragments are recoverable: a Naval Postgraduate School (Monterey, California) thesis titled "Impact of Office Automation: An Empirical Assessment", with the keywords Productivity Assessment, SACONS, and Office Automation.]

  20. Cavendish Balance Automation

    Science.gov (United States)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf, manually operated Cavendish balance to allow automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of the effects of superconducting materials on the local gravitational field strength, to determine if the strength of gravitational fields can be reduced. A Cavendish balance was chosen because it is a fairly simple piece of equipment for measuring the gravitational constant, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) all the components necessary to hold and automate the Cavendish balance in a cryostat were designed; engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) software was written in LabView to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) software was written to take the data collected from the Cavendish balance and reduce it to give a value for the gravitational constant; (4) the components of the system were assembled and fitted to a cryostat, and the LabView hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling, was assembled; and (5) the system was operated for a number of periods, and the data were collected and reduced to give an average value for the gravitational constant.

  1. SU-E-T-225: Correction Matrix for PinPoint Ionization Chamber for Dosimetric Measurements in the Newly Released Incise™ Multileaf Collimator Shaped Small Field for CyberKnife M6™ Machine

    International Nuclear Information System (INIS)

    Zhang, Y; Li, T; Heron, D; Huq, M

    2015-01-01

    Purpose: For small field dosimetry, such as measurements of output factors for cones or MLC-shaped irregular small fields, ion chambers often result in an underestimation of the dose, due to both the volume averaging effect and the lack of lateral charged particle equilibrium. This work presents a mathematical model of a correction matrix for a PTW PinPoint ionization chamber for dosimetric measurements made in the newly released Incise™ multileaf collimator fields of the CyberKnife M6™ machine. Methods: A correction matrix for a PTW 0.015cc PinPoint ionization chamber was developed by modeling its 3D dose response in twelve cone-shaped circular fields created using the 5mm, 7.5mm, 10mm, 12.5mm, 15mm, 20mm, 25mm, 30mm, 35mm, 40mm, 50mm and 60mm cones of a CyberKnife M6™ machine. For each field size, hundreds of readings were recorded, one for every 2mm chamber shift in the horizontal plane. The contribution of each dose pixel to a measurement point depended on the radial distance and the angle to the chamber axis. These readings were then compared with the theoretical dose obtained from a Monte Carlo calculation. A penalized least-squares optimization algorithm was developed to generate the correction matrix. After the parameter fitting, the mathematical model was validated on MLC-shaped irregular fields. Results: The optimization algorithm used for parameter fitting was stable, and the resulting response factors were smooth in the spatial domain. After correction with the mathematical model, the chamber readings matched the calculation for all the tested fields to within 2%. Conclusion: A novel mathematical model has been developed for a PinPoint chamber for dosimetric measurements in small MLC-shaped irregular fields. The correction matrix depends on the detector, the treatment unit and the setup geometry. The model can be applied to non-standard composite fields and provides access to IMRT point dose validation
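
    The record does not give the details of the penalized least-squares fit; the following Python sketch shows the general idea on an invented 1-D stand-in for the 2-D problem: a radial response kernel, parameterized by weights at 2 mm radial bins, is fitted so that the kernel-blurred Monte Carlo dose reproduces the chamber readings, with a second-difference penalty enforcing smoothness. All shapes, kernels and numbers are assumptions, not the authors' formulation.

      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(0)

      # Illustrative 1-D stand-in: a toy "Monte Carlo" dose profile and
      # chamber readings simulated at 2 mm lateral shifts across the field.
      x = np.arange(-20.0, 21.0, 2.0)                  # positions (mm)
      mc_dose = np.exp(-(x / 8.0) ** 2)                # toy reference profile

      def response_matrix(weights):
          # weights[k]: relative contribution of a dose pixel at radial
          # distance 2*k mm from the chamber axis (illustrative binning).
          dist = np.abs(x[:, None] - x[None, :])
          idx = np.minimum((dist / 2.0).astype(int), len(weights) - 1)
          w = weights[idx]
          return w / w.sum(axis=1, keepdims=True)      # reading = weighted mean

      true_w = np.exp(-np.arange(6) / 1.5)             # "true" kernel for demo
      readings = response_matrix(true_w) @ mc_dose
      readings += 0.005 * rng.standard_normal(x.size)  # measurement noise

      def residuals(weights, lam=0.5):
          fit = response_matrix(weights) @ mc_dose - readings
          smooth = lam * np.diff(weights, 2)           # smoothness penalty
          return np.concatenate([fit, smooth])

      sol = least_squares(residuals, x0=np.ones(6), bounds=(0.0, np.inf))
      print("recovered kernel weights:", np.round(sol.x / sol.x.max(), 3))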

  2. AUTOMATION OF CONVEYOR BELT TRANSPORT

    Directory of Open Access Journals (Sweden)

    Nenad Marinović

    1990-12-01

    Full Text Available Belt conveyor transport, although one of the most economical mining transport systems, introduces many problems in maintaining continuity of operation. Every stop causes economic losses. Optimal operation requires correct belt tension, correct belt position and velocity, and faultless rollers, which together are the input conditions for automation. Detection and localization of faults are essential for safety, to eliminate fire hazards, and for efficient maintenance. Detection and location of idler roll faults remain an open problem that has not yet been solved successfully (the paper is published in Croatian).

  3. Correction note.

    Science.gov (United States)

    2014-12-01

    Correction note for Sanders, M., Calam, R., Durand, M., Liversidge, T. and Carmont, S. A. (2008), Does self-directed and web-based support for parents enhance the effects of viewing a reality television series based on the Triple P - Positive Parenting Programme?. Journal of Child Psychology and Psychiatry, 49: 924-932. doi: 10.1111/j.1469-7610.2008.01901.x. © 2014 Association for Child and Adolescent Mental Health.

  4. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors through enhanced operator and system performance. However, the excessive introduction of automation can deteriorate operator performance due to side effects of automation referred to as the Out-of-the-Loop (OOTL) problem, and this is a critical issue that must be resolved. Thus, in order to determine the level of automation that assures the best human operator performance, a quantitative method for optimizing automation is proposed in this paper. To determine the automation levels that enable the best human performance, the automation rate and the ostracism rate, estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of situation awareness recovery (SAR), on the premise that the automation rate yielding the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  5. Autonomous Systems: Habitat Automation

    Data.gov (United States)

    National Aeronautics and Space Administration — The Habitat Automation Project Element within the Autonomous Systems Project is developing software to automate the operation of habitats and other spacecraft. This...

  6. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  7. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  8. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  9. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and the demand for customized products and services on one side, and the need for constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to evolve substantially.

  10. Hospital automation system RFID-based: technology embedded in smart devices (cards, tags and bracelets).

    Science.gov (United States)

    Florentino, Gustavo H P; Paz de Araujo, Carlos A; Bezerra, Heitor U; Junior, Helio B A; Xavier, Marcelo Araujo; de Souza, Vinicius S V; de M Valentim, Ricardo A A; Morais, Antonio H F; Guerreiro, Ana M G; Brandao, Glaucio B

    2008-01-01

    RFID is a technology being adopted in many business fields, especially the medical field. This work presents a system for the automation of a hospital clinical analysis laboratory. The system uses contactless smart cards to store patients' data and to authenticate hospital employees in the system. It also uses RFID tags attached to the containers holding patients' collected samples, for correct identification of the patient who provided the samples. This work depicts a hospital laboratory workflow, presents the system modeling, and addresses security matters related to the information stored in the smart cards.

  11. The effect of individual differences in working memory in older adults on performance with different degrees of automated technology.

    Science.gov (United States)

    Pak, Richard; McLaughlin, Anne Collins; Leidheiser, William; Rovira, Ericka

    2017-04-01

    A leading hypothesis to explain older adults' overdependence on automation is age-related decline in working memory; however, this hypothesis has not been empirically examined. The purpose of the current experiment was to examine how working memory affected performance with different degrees of automation in older adults. The well-supported expectation is that higher degrees of automation benefit performance when the automation is correct but increasingly harm performance when the automation fails. In contrast to this expectation, older adults benefited from higher degrees of automation when the automation was correct but were not differentially harmed by automation failures. Surprisingly, working memory did not interact with degree of automation, but it did interact with automation correctness or failure: when the automation was correct, older adults with higher working memory ability performed better than those with lower ability; when the automation was incorrect, all older adults performed poorly regardless of working memory ability. Practitioner Summary: The design of automation intended for older adults should focus on ways of making the correctness of the automation apparent to the older user, and suggest ways of helping them recover when it is malfunctioning.

  12. Comparison of the FFT/matrix inversion and system matrix techniques for higher-order probe correction in spherical near-field antenna measurements

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Nielsen, Jeppe Majlund; Breinbjerg, Olav

    2011-01-01

    correction of general high-order probes, including non-symmetric dual-polarized antennas with independent ports. The investigation was carried out by processing with each technique the same measurement data for a challenging case with an antenna under test significantly offset from the center of rotation...

  13. Home automation as an example of construction innovation

    NARCIS (Netherlands)

    Vlies, R.D. van der; Bronswijk, J.E.M.H. van

    2009-01-01

    Home automation can contribute to the health of (older) adults. Home automation covers a broad field of ‘intelligent’ electronic or mechanical devices in the home (domestic) environment. Realizing home automation is technically possible, though still not common. In this paper main influential

  14. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw, 2-4 March 2016. It presents research results from top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem, usually followed by a numerical analysis, simulation, and description of the results of implementing the solution to a real-world problem. The theoretical results, practical solutions and guidelines presented will be valuable both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  15. Development of an automated asbestos counting software based on fluorescence microscopy.

    Science.gov (United States)

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While the full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
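
    How the correction mode adjusts low-concentration counts is not described in this record; one simple possibility, sketched below in Python with invented counts and an invented threshold of 15 fibers, is a linear calibration of automated counts against manual reference counts, fitted on the low range only.

      import numpy as np

      # Invented example: automated vs. manual fiber counts, same samples.
      auto_counts = np.array([2.0, 3, 5, 8, 12, 20, 35, 50])
      manual_counts = np.array([4.0, 5, 7, 9, 13, 21, 34, 49])

      # Fit manual ~ a*automated + b on the low range only, where the
      # automated counts were reported to be less accurate.
      LOW = 15
      mask = auto_counts < LOW
      a, b = np.polyfit(auto_counts[mask], manual_counts[mask], 1)

      def corrected(count):
          # Apply the linear correction only below the low-count threshold.
          return a * count + b if count < LOW else count

      print([round(float(corrected(c)), 1) for c in auto_counts])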

  16. Context-Aware user interfaces in automation

    DEFF Research Database (Denmark)

    Olsen, Mikkel Holm

    2007-01-01

    The complexity of automation systems and the level of automation have been rising. This has caused problems regarding the operator's ability to comprehend the overall situation and state of the automation system, in particular in abnormal situations. The amount of data available to the operator results in information overload. ... Because the notion of what information is relevant continually changes, the suggestion is to develop context-aware systems that can assist the operator. In order to create a context-aware system we must first examine what context is, and what kinds of data we should consider constituting the context ... some important differences exist between the notion of context in these systems and in the automation domain. We find important differences in the needs for information between the control room operators and field operators in complex automation systems, and the need for the field operator...

  17. A fully automated and reproducible level-set segmentation approach for generation of MR-based attenuation correction map of PET images in the brain employing single STE-MR imaging modality

    International Nuclear Information System (INIS)

    Kazerooni, Anahita Fathi; Aarabi, Mohammad Hadi; Ay, Mohammadreza; Rad, Hamidreza Saligheh

    2014-01-01

    Generating an MR-based attenuation correction map (μ-map) for quantitative reconstruction of PET images still remains a challenge in hybrid PET/MRI systems, mainly because cortical bone structures are indistinguishable from proximal air cavities in conventional MR images. Recently, the development of short echo-time (STE) MR imaging sequences has shown promise in differentiating cortical bone from air. However, on STE-MR images the bone appears with discontinuous boundaries. Therefore, segmentation techniques based on intensity classification, such as thresholding or fuzzy C-means, fail to homogeneously delineate bone boundaries, especially in the presence of intrinsic noise and intensity inhomogeneity. Consequently, they cannot be fully automated, must be fine-tuned on a case-by-case basis, and require additional morphological operations for segmentation refinement. To overcome these problems, in this study we introduce a new fully automatic and reproducible STE-MR segmentation approach that exploits level sets within a clustering-based intensity inhomogeneity correction framework to reliably delineate bone from soft tissue and air.
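
    As a generic illustration of the level-set idea only (not the authors' pipeline), the Python sketch below runs the morphological Chan-Vese level set from scikit-image on a synthetic image; the intensity inhomogeneity correction is reduced to a crude division by a Gaussian-smoothed bias estimate, purely for demonstration.

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from skimage.segmentation import morphological_chan_vese

      # Synthetic "STE-MR slice": a bright disc (bone stand-in) on a dark
      # background, corrupted by a smooth intensity bias and noise.
      yy, xx = np.mgrid[0:128, 0:128]
      img = ((xx - 64) ** 2 + (yy - 64) ** 2 < 30 ** 2).astype(float)
      bias = 1.0 + 0.5 * xx / 128.0            # smooth multiplicative bias
      img = img * bias
      img += 0.1 * np.random.default_rng(0).standard_normal(img.shape)

      # Crude inhomogeneity correction: divide by a heavily smoothed copy
      # (a stand-in for the paper's clustering-based correction).
      est_bias = gaussian_filter(img, sigma=30)
      corrected = img / np.maximum(est_bias, 1e-6)

      # Level-set segmentation: 50 iterations of morphological Chan-Vese.
      mask = morphological_chan_vese(corrected, 50,
                                     init_level_set="checkerboard")
      print("segmented foreground fraction:", mask.mean())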

  18. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  19. Judson_Mansouri_Automated_Chemical_Curation_QSAREnvRes_Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — Here we describe the development of an automated KNIME workflow to curate and correct errors in the structure and identity of chemicals using the publicly...

  20. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed In this paper we study mobile home automation, a field that emerges from the integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options for how a user might access a mobile home automation service and the controlled devices. As another contrib...

  1. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation.   Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  2. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation. Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  3. Automated Podcasting System for Universities

    Directory of Open Access Journals (Sweden)

    Ypatios Grigoriadis

    2013-03-01

    Full Text Available This paper presents the results achieved at Graz University of Technology (TU Graz) in the field of automating the process of recording and publishing university lectures in a novel way. It outlines cornerstones of the development and integration of an automated recording system, such as the lecture hall setup, the recording hardware and software architecture, as well as the development of a text-based search for the final product by means of indexing video podcasts. Furthermore, the paper looks at didactical aspects, evaluations done in this context, and a future outlook.

  4. Illumination correction in psoriasis lesions images

    DEFF Research Database (Denmark)

    Maletti, Gabriela Mariel; Ersbøll, Bjarne Kjær

    2003-01-01

    An approach to automatically correct illumination problems in dermatological images is presented. The illumination function is estimated after combining the thematic map indicating skin (produced by an automated classification scheme) with the dermatological image data. The user is only required t...

  5. Molecular typing of vancomycin-resistant Enterococcus faecium with an automated repetitive sequence-based PCR microbial typing system compared with pulsed-field gel electrophoresis and multilocus sequence typing.

    Science.gov (United States)

    Kardén-Lilja, Minna; Vuopio, Jaana; Koskela, Markku; Tissari, Päivi; Salmenlinna, Saara

    2013-05-01

    Pulsed-field gel electrophoresis (PFGE) is the main typing method used for the molecular typing of vancomycin-resistant Enterococcus faecium (VREfm). However, more rapid and unambiguous typing methods are needed. DiversiLab, a repetitive sequence-based PCR (rep-PCR), offers an alternative method for strain typing. Thirty-nine VREfm isolates with known epidemiological relationships were characterized by semi-automated rep-PCR (DiversiLab), PFGE, and multilocus sequence typing (MLST). The DiversiLab results were analysed in 2 ways: first relying solely on the DiversiLab software, and second by DiversiLab analysis combined with manual interpretation. The analysis with interpretation yielded more DiversiLab profiles, correlated better with PFGE and MLST, and grouped the isolates better according to their relatedness in time and space. However, most of the DiversiLab groups also included isolates with different PFGE and MLST types. DiversiLab provides rapid information when investigating a potential hospital outbreak. However, the interpretation of E. faecium DiversiLab results cannot be fully automated and is not always straightforward. Other typing methods may be necessary to confirm the analysis.

  6. The Systems Development Life Cycle as a Planning Methodology for Library Automation.

    Science.gov (United States)

    Cheatham, David

    1985-01-01

    Discussion of the systems development life cycle (SDLC) that supports operational and managerial planning of automation projects covers challenges of library automation, evolution and scope of SDLC, lack of dissemination of SDLC literature within library and information science community, and corrective measures to meet library automation demands.…

  7. Automated illustration of patient instructions.

    Science.gov (United States)

    Bui, Duy; Nakamura, Carlos; Bray, Bruce E; Zeng-Treitler, Qing

    2012-01-01

    A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration.

  8. Automated preconcentration of Fe, Zn, Cu, Ni, Cd, Pb, Co, and Mn in seawater with analysis using high-resolution sector field inductively-coupled plasma mass spectrometry.

    Science.gov (United States)

    Rapp, Insa; Schlosser, Christian; Rusiecka, Dagmara; Gledhill, Martha; Achterberg, Eric P

    2017-07-11

    A rapid, automated, high-throughput analytical method capable of simultaneous analysis of multiple elements at trace and ultratrace levels is required to investigate the biogeochemical cycle of trace metals in the ocean. Here we present an analytical approach which uses a commercially available automated preconcentration device (SeaFAST) with accurate volume loading and in-line pH buffering of the sample prior to loading onto a chelating resin (WAKO) and subsequent simultaneous analysis of iron (Fe), zinc (Zn), copper (Cu), nickel (Ni), cadmium (Cd), lead (Pb), cobalt (Co) and manganese (Mn) by high-resolution inductively-coupled plasma mass spectrometry (HR-ICP-MS). Quantification of sample concentration was undertaken using isotope dilution for Fe, Zn, Cu, Ni, Cd and Pb, and standard addition for Co and Mn. The chelating resin is shown to have a high affinity for all analyzed elements, with recoveries between 83 and 100% for all elements, except Mn (60%) and Ni (48%), and showed higher recoveries for Ni, Cd, Pb, Co and Mn in direct comparison to an alternative resin (NOBIAS Chelate-PA1). The reduced recoveries for Ni and Mn using the WAKO resin did not affect the quantification accuracy. A relatively constant retention efficiency on the resin over a broad pH range (pH 5-8) was observed for the trace metals, except for Mn. Mn quantification using standard addition required accurate sample pH adjustment, with optimal recoveries at pH 7.5 ± 0.3. UV digestion was necessary to increase the recovery of Co and Cu in seawater by 15.6% and 11.4%, respectively, and achieved full breakdown of spiked Co-containing vitamin B12 complexes. Low blank levels and detection limits could be achieved (e.g., 0.029 nmol/L for Fe and 0.028 nmol/L for Zn) with the use of high purity reagents. Precision and accuracy were assessed using SAFe S, D1, and D2 reference seawaters, and results were in good agreement with available consensus values. The presented method is ideal for
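
    The isotope dilution step reduces to one closed-form equation; the Python sketch below implements the standard single-spike form for an Fe measurement, with an invented spike composition and measured ratio (the paper's actual calibration values are not reproduced here).

      # Single-spike isotope dilution, illustrative numbers only: solve for
      # the moles of Fe in the sample from the measured 57Fe/56Fe ratio of
      # the spiked mixture.

      def isotope_dilution(n_spike, a_spike, b_spike, a_nat, b_nat, r_meas):
          # n_spike          moles of spike added
          # a_spike, b_spike abundances of 57Fe and 56Fe in the spike
          # a_nat, b_nat     natural abundances in the sample
          # r_meas           measured 57Fe/56Fe ratio in the mixture
          return n_spike * (a_spike - r_meas * b_spike) \
                 / (r_meas * b_nat - a_nat)

      # Natural Fe: 56Fe ~ 91.75%, 57Fe ~ 2.12%; assume a 57Fe-rich spike.
      n = isotope_dilution(n_spike=1.0e-12,        # 1 pmol spike (invented)
                           a_spike=0.95, b_spike=0.04,
                           a_nat=0.0212, b_nat=0.9175,
                           r_meas=0.5)              # invented measurement
      print(f"sample Fe: {n:.3e} mol")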

  9. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improving laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should center on an understanding of the laboratory and its relationship to healthcare delivery, and on the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, together with overall computer-integrated manufacturing approaches to laboratory automation implementation, are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  10. Differences in field effectiveness and adoption between a novel automated chlorination system and household manual chlorination of drinking water in Dhaka, Bangladesh: a randomized controlled trial.

    Science.gov (United States)

    Pickering, Amy J; Crider, Yoshika; Amin, Nuhu; Bauza, Valerie; Unicomb, Leanne; Davis, Jennifer; Luby, Stephen P

    2015-01-01

    The number of people served by networked systems that supply intermittent and contaminated drinking water is increasing. In these settings, centralized water treatment is ineffective, while household-level water treatment technologies have not been brought to scale. This study compares a novel low-cost technology designed to passively (automatically) dispense chlorine at shared handpumps with a household-level intervention providing water disinfection tablets (Aquatab), safe water storage containers, and behavior promotion. Twenty compounds were enrolled in Dhaka, Bangladesh, and randomly assigned to one of three groups: passive chlorinator, Aquatabs, or control. Over a 10-month intervention period, the mean percentage of households whose stored drinking water had detectable total chlorine was 75% in compounds with access to the passive chlorinator, 72% in compounds receiving Aquatabs, and 6% in control compounds. Both interventions also significantly improved microbial water quality. Aquatabs usage fell by 50% after behavioral promotion visits concluded, suggesting intensive promotion is necessary for sustained uptake. The study findings suggest high potential for an automated decentralized water treatment system to increase consistent access to clean water in low-income urban communities.

  11. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method for the automation rate that takes the advantages of automation as the estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate, the greater the decrease in working time. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the proportion of automation among all work processes or facilities. Such expressions indicate how much automation is included, but they do not express the degree to which human performance is enhanced, and many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are then used as the measures of the estimation method: one advantage is a reduction in the number of tasks, and another is a reduction in human cognitive task load. The system automation rate and the cognitive automation rate are proposed as quantitative measures based on these benefits. To quantify the required human cognitive task load and thus derive the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
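
    A toy reading of the two rates, with invented task data, is sketched below in Python: the system automation rate as the automated fraction of tasks, and the cognitive automation rate as the fraction of information-theoretic task load removed by automation, using the Shannon entropy of equally likely states as a crude stand-in for Conant's model (which the paper applies in a more elaborate form).

      import math

      # Invented task list: (name, automated?, number of equally likely
      # plant states the operator must discriminate to do it manually).
      tasks = [
          ("check pump status",      True,  8),
          ("align valves",           True,  16),
          ("verify reactor power",   False, 32),
          ("diagnose alarm pattern", False, 64),
      ]

      # System automation rate: fraction of tasks performed by automation.
      system_rate = sum(auto for _, auto, _ in tasks) / len(tasks)

      # Cognitive load of a task: bits needed to discriminate among its
      # equally likely states, H = log2(n) (a stand-in, not Conant's model).
      def load(n_states):
          return math.log2(n_states)

      total = sum(load(n) for _, _, n in tasks)
      removed = sum(load(n) for _, auto, n in tasks if auto)
      cognitive_rate = removed / total

      print(f"system automation rate:    {system_rate:.2f}")
      print(f"cognitive automation rate: {cognitive_rate:.2f}")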

  12. Toward Automated Benchmarking of Atomistic Force Fields: Neat Liquid Densities and Static Dielectric Constants from the ThermoML Data Archive.

    Science.gov (United States)

    Beauchamp, Kyle A; Behr, Julie M; Rustenburg, Ariën S; Bayly, Christopher I; Kroenlein, Kenneth; Chodera, John D

    2015-10-08

    Atomistic molecular simulations are a powerful way to make quantitative predictions, but the accuracy of these predictions depends entirely on the quality of the force field employed. Although experimental measurements of fundamental physical properties offer a straightforward approach for evaluating force field quality, the bulk of this information has been tied up in formats that are not machine-readable. Compiling benchmark data sets of physical properties from non-machine-readable sources requires substantial human effort and is prone to the accumulation of human errors, hindering the development of reproducible benchmarks of force-field accuracy. Here, we examine the feasibility of benchmarking atomistic force fields against the NIST ThermoML data archive of physicochemical measurements, which aggregates thousands of experimental measurements in a portable, machine-readable, self-annotating IUPAC-standard format. As a proof of concept, we present a detailed benchmark of the generalized Amber small-molecule force field (GAFF) using the AM1-BCC charge model against experimental measurements (specifically, bulk liquid densities and static dielectric constants at ambient pressure) automatically extracted from the archive, and discuss the extent of data available for use in larger scale (or continuously performed) benchmarks. The results of even this limited initial benchmark highlight a general problem with fixed-charge force fields in the representation of low-dielectric environments, such as those seen in binding cavities or biological membranes.
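
    As a schematic of the comparison step only (the real benchmark scores GAFF simulations against ThermoML records), the Python sketch below computes the error statistics one would report, using invented density values.

      import numpy as np

      # Invented benchmark rows: (molecule, simulated density,
      # experimental density), both in g/cm^3 at ambient conditions.
      rows = [
          ("ethanol",     0.775, 0.789),
          ("benzene",     0.900, 0.876),
          ("cyclohexane", 0.760, 0.774),
          ("acetone",     0.801, 0.790),
      ]

      sim = np.array([r[1] for r in rows])
      exp = np.array([r[2] for r in rows])

      rmse = np.sqrt(np.mean((sim - exp) ** 2))   # overall accuracy
      bias = np.mean(sim - exp)                   # systematic offset
      print(f"RMSE {rmse:.3f} g/cm^3, mean signed error {bias:+.3f} g/cm^3")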

  13. Principles and methods for automated palynology.

    Science.gov (United States)

    Holt, K A; Bennett, K D

    2014-08-01

    Pollen grains are microscopic, so their identification and quantification have, for decades, depended upon human observers using light microscopes: a labour-intensive approach. Modern improvements in computing and imaging hardware and software now bring automation of pollen analyses within reach. In this paper, we provide the first review in over 15 yr of progress towards automation of the part of palynology concerned with counting and classifying pollen, bringing together literature published from a wide spectrum of sources. We consider which attempts offer the most potential for an automated palynology system for universal application across all fields of research concerned with pollen classification and counting. We discuss what is required to make the datasets of these automated systems as acceptable as those produced by human palynologists, and present suggestions for how automation will generate novel approaches to counting and classifying pollen that have hitherto been unthinkable.

  14. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would also be interesting for other people. It is also the topic of a workshop paper that I am presenting in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems (oracle problem, combinatorial explosion, ...). Abstract of the paper: Over the last decade, code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only a few of them consider dependencies from outside the Code Under Test’s scope such...
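
    Of the approaches named above, adaptive random testing is the easiest to show compactly; the Python sketch below is a generic fixed-size-candidate-set variant over a 2-D numeric input domain, not tied to any particular tool from the talk.

      import random

      def adaptive_random_tests(n_tests, n_candidates=10, seed=0):
          # Fixed-size-candidate-set ART over the unit square: each new
          # test is the candidate whose nearest already-executed test is
          # farthest away, spreading tests evenly over the input domain.
          rng = random.Random(seed)
          rand = lambda: (rng.random(), rng.random())
          executed = [rand()]
          while len(executed) < n_tests:
              candidates = [rand() for _ in range(n_candidates)]
              def dist_to_nearest(c):
                  return min((c[0] - e[0]) ** 2 + (c[1] - e[1]) ** 2
                             for e in executed)
              executed.append(max(candidates, key=dist_to_nearest))
          return executed

      for x, y in adaptive_random_tests(5):
          print(f"test input: x={x:.3f}, y={y:.3f}")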

  15. Comparison and correction of the light sensor output from 48 wearable light exposure devices by using a side-by-side field calibration method

    DEFF Research Database (Denmark)

    Markvart, Jakob; Hansen, Åse Marie; Christoffersen, Jens

    2015-01-01

    for side-by-side calibration of Actiwatches and similar personal light exposure devices. We suggest that the calibration methods presented can be used for calibration of other practical field devices, with respect to the various sensors already on the market and devices that will be introduced...

  16. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  17. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  18. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  19. [Practical aspects of automated perimetry].

    Science.gov (United States)

    Stan, Cristina

    2014-01-01

    Visual field testing is a subjective method, yet a very important part of the diagnosis and follow-up of ocular and neurological diseases. In order to perform a correct visual field exam, one must know the equipment well, together with all the factors that could induce errors. Basic skills for working with the Optopol or Humphrey perimeter are discussed in this paper.

  20. PROBLEM SETTING AND SOLUTION OF THE RESPONSE CORRECTION OF ARRIVAL AND DEPARTURE AIR TRAFFIC FLOW IN THE VICINITY OF THE FIELD BY MEANS OF THE GENETIC ALGORITHM

    Directory of Open Access Journals (Sweden)

    Georgii N. Lebedev

    2017-01-01

    Full Text Available The effectiveness of airfield operation largely depends on the quality of problem solving at the boundaries between different technological sections. One such hotspot is the use of the same runway by inbound and outbound aircraft. At certain intensities of outbound and inbound air traffic, a conflict of aircraft interests appears in which even experienced controllers may find it difficult to sort out priorities, and mistakes in decision-making unavoidably follow. In this work, the task of promptly correcting the landing and takeoff times of aircraft using the same runway is formulated for the "arrival – departure" conflict of interests at increased operating intensity. The optimal solution is chosen with mutual interests taken into account, without complete enumeration and evaluation of all solutions. To this end, a genetic algorithm is proposed, which offers a simple and effective approach to solving the optimal control problem while keeping flight safety at an acceptably high level. The estimated additional aviation fuel consumption is used as the criterion for evaluating candidate solutions. The advantages of applying the genetic algorithm to decision-making, compared with the current controller-made resolution of the "departure – arrival" conflict in the airfield area, are shown.
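
    The paper's chromosome encoding and operators are not reproduced in this record; the Python sketch below is a generic permutation-based genetic algorithm for sequencing a mixed arrival/departure queue on one runway, with an invented extra-fuel cost model (a fixed runway separation and a per-minute delay fuel penalty that weights arrivals more heavily than departures).

      import random

      rng = random.Random(1)

      # Invented queue: (flight id, 'A'rrival/'D'eparture, ready time, min).
      queue = [("AC1", "A", 0), ("DP1", "D", 1), ("AC2", "A", 2),
               ("DP2", "D", 2), ("AC3", "A", 4), ("DP3", "D", 5)]
      SEP = 2.0                     # runway separation between ops (min)
      FUEL = {"A": 3.0, "D": 1.0}   # extra fuel per minute of delay

      def cost(order):
          # Total extra-fuel cost of serving the queue in the given order.
          t, total = 0.0, 0.0
          for i in order:
              _, kind, ready = queue[i]
              t = max(t, ready) + SEP           # time the operation ends
              total += FUEL[kind] * max(0.0, t - ready)
          return total

      def crossover(a, b):
          # Order crossover: keep a slice of parent a, fill rest from b.
          i, j = sorted(rng.sample(range(len(a)), 2))
          middle = a[i:j]
          rest = [g for g in b if g not in middle]
          return rest[:i] + middle + rest[i:]

      def mutate(order):
          i, j = rng.sample(range(len(order)), 2)
          order[i], order[j] = order[j], order[i]

      pop = [rng.sample(range(len(queue)), len(queue)) for _ in range(30)]
      for _ in range(100):                      # generations
          pop.sort(key=cost)
          parents = pop[:10]                    # truncation selection
          children = []
          while len(children) < 20:
              child = crossover(*rng.sample(parents, 2))
              if rng.random() < 0.3:
                  mutate(child)
              children.append(child)
          pop = parents + children

      best = min(pop, key=cost)
      print("best sequence:", [queue[i][0] for i in best],
            "cost:", cost(best))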

  1. Automation in College Libraries.

    Science.gov (United States)

    Werking, Richard Hume

    1991-01-01

    Reports the results of a survey of the "Bowdoin List" group of liberal arts colleges. The survey obtained information about (1) automation modules in place and when they had been installed; (2) financing of automation and its impacts on the library budgets; and (3) library director's views on library automation and the nature of the…

  2. Automated 3-D Radiation Mapping

    International Nuclear Information System (INIS)

    Tarpinian, J. E.

    1991-01-01

    This work describes an automated radiation detection and imaging system which combines several state-of-the-art technologies to produce a portable but very powerful visualization tool for planning work in radiation environments. The system combines a radiation detection system, a computerized radiation imaging program, and computerized 3-D modeling to automatically locate and measure radiation fields. Measurements are collected automatically, and imaging techniques are used to produce colored 'isodose' images of the measured radiation fields. The isodose lines from the images are then superimposed over the 3-D model of the area. The final display shows the various components in a room and their associated radiation fields. The use of an automated radiation detection system increases the quality of the radiation survey measurements obtained. The additional use of a three-dimensional display allows easier visualization of the area and the associated radiological conditions than two-dimensional sketches

  3. Automated solid-phase subcloning based on beads brought into proximity by magnetic force.

    Science.gov (United States)

    Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate the construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized on separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescence-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors, and more than 95% correct clones were obtained in a number of applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high-throughput applications.

  4. Safeguards Automated Facility Evaluation (SAFE) methodology

    International Nuclear Information System (INIS)

    Chapman, L.D.; Grady, L.M.; Bennett, H.A.; Sasser, D.W.; Engi, D.

    1978-08-01

    An automated approach to evaluating facility safeguards effectiveness has been developed. This automated process, called Safeguards Automated Facility Evaluation (SAFE), consists of a continuous stream of operational modules for characterizing the facility, selecting critical paths, and evaluating safeguards effectiveness along those paths. The technique has been implemented on an interactive computer time-sharing system and makes use of computer graphics for the processing and presentation of information. Using this technique, a comprehensive evaluation of a safeguards system can be provided by systematically varying the parameters that characterize the physical protection components of a facility to reflect the perceived adversary attributes and strategy, environmental conditions, and site operational conditions. The SAFE procedure has broad applications in the nuclear facility safeguards field as well as in the security field in general. Any fixed facility containing valuable materials or components to be protected from theft or sabotage could be analyzed using this same automated evaluation technique

  5. Fast Automated Decoupling at RHIC

    CERN Document Server

    Beebe-Wang, Joanne

    2005-01-01

    Coupling correction is essential for the operational performance of RHIC. The independence of the transverse degrees of freedom makes diagnostics and tune control easier, and it is advantageous to operate an accelerator close to the coupling resonance to minimize nearby nonlinear sidebands. An automated decoupling application has been developed at RHIC for coupling correction during routine operations. The application decouples RHIC globally by minimizing the tune separation through finding the optimal settings of two orthogonal skew quadrupole families. The program provides options for automatic, semi-automatic and manual decoupling operations. It accesses tune information from all RHIC tune measurement systems: the PLL (phase-locked loop), the high-frequency Schottky system, and the tune meter. It also supplies tune and skew quadrupole scans, finds the minimum tune separation, displays real-time results, and interfaces with the RHIC control system. We summarize the capabilities of the decoupling application...
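
    As an illustration of the global-decoupling idea only (not the RHIC application's algorithm), the Python sketch below scans two skew-quadrupole family strengths on a toy linear coupling model and picks the setting that minimizes the measurable tune split; the response coefficients and residual coupling are invented.

      import numpy as np

      # Toy global-coupling model: two orthogonal skew-quad families k1, k2
      # shift the complex coupling coefficient C; the measurable tune
      # separation is dQ = sqrt(delta^2 + |C|^2), minimized when C cancels.
      C0 = 0.008 + 0.005j        # residual machine coupling (invented)
      delta = 0.002              # unperturbed fractional tune split

      def tune_split(k1, k2):
          C = C0 + 0.01 * k1 + 0.01j * k2      # assumed linear response
          return np.sqrt(delta ** 2 + np.abs(C) ** 2)

      # Grid scan over both family strengths, as a stand-in for the
      # automated scan/minimization in the application.
      k = np.linspace(-2.0, 2.0, 201)
      K1, K2 = np.meshgrid(k, k)
      dq = tune_split(K1, K2)
      i, j = np.unravel_index(np.argmin(dq), dq.shape)
      print(f"best settings: k1={K1[i, j]:+.2f}, k2={K2[i, j]:+.2f}, "
            f"dQmin={dq[i, j]:.4f}")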

  6. A Physical Model-based Correction for Charge Traps in the Hubble Space Telescope ’s Wide Field Camera 3 Near-IR Detector and Its Applications to Transiting Exoplanets and Brown Dwarfs

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Yifan; Apai, Dániel; Schneider, Glenn [Department of Astronomy/Steward Observatory, The University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States); Lew, Ben W. P., E-mail: yzhou@as.arizona.edu [Department of Planetary Science/Lunar and Planetary Laboratory, The University of Arizona, 1640 E. University Boulevard, Tucson, AZ 85718 (United States)

    2017-06-01

    The Hubble Space Telescope Wide Field Camera 3 (WFC3) near-IR channel is extensively used in time-resolved observations, especially for transiting exoplanet spectroscopy as well as brown dwarf and directly imaged exoplanet rotational phase mapping. The ramp effect is the dominant source of systematics in the WFC3 for time-resolved observations, which limits its photometric precision. Current mitigation strategies are based on empirical fits and require additional orbits to help the telescope reach thermal equilibrium. We show that the ramp-effect profiles can be explained and corrected with high fidelity using charge trapping theories. We also present a model for this process that can be used to predict and to correct charge trap systematics. Our model is based on a very small number of parameters that are intrinsic to the detector. We find that these parameters are very stable between the different data sets, and we provide best-fit values. Our model is tested with more than 120 orbits (∼40 visits) of WFC3 observations and is shown to provide near photon noise limited corrections for observations made with both staring and scanning modes of transiting exoplanets as well as for staring-mode observations of brown dwarfs. After our model correction, the light curve of the first orbit in each visit has the same photometric precision as subsequent orbits, so data from the first orbit no longer need to be discarded. Near-IR arrays with the same physical characteristics (e.g., JWST/NIRCam) may also benefit from the extension of this model if similar systematic profiles are observed.
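
    The paper's actual charge-trapping model (its trap populations and fitted detector parameters) is not reproduced in this record; the Python sketch below integrates a single-population trapping rate equation, an assumed simplification with invented parameters, to show how a ramp profile of this kind can be generated and divided out of a light curve.

      import numpy as np

      # Single-population charge-trap toy model: a fixed fraction of
      # incoming electrons is captured while traps are empty, and traps
      # slowly release on a timescale TAU. All parameters are invented.
      N_TRAPS = 200.0      # trap capacity per pixel
      ETA = 0.15           # trapped fraction of flux when traps are empty
      TAU = 5.0e3          # trap release timescale, seconds

      def ramp_profile(flux, t):
          # Measured count rate at times t for a constant intrinsic flux.
          # Trapped electrons are missing from the readout, so the rate
          # ramps upward as the traps fill toward saturation.
          rates, n, t_prev = [], 0.0, 0.0
          for ti in t:
              dt = ti - t_prev
              captured = ETA * flux * (1.0 - n / N_TRAPS) * dt
              released = n / TAU * dt
              n = n + captured - released
              rates.append(flux - captured / dt)    # what we read out
              t_prev = ti
          return np.array(rates)

      t = np.arange(10.0, 310.0, 10.0)     # exposure mid-times, one orbit
      shape = ramp_profile(1.0, t)         # unit-flux ramp shape
      observed = 100.0 * shape             # toy ramp-affected light curve
      corrected = observed / shape         # flat once model is divided out
      print(np.round(shape[:4], 4), "->", np.round(corrected[:4], 2))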

  7. A Physical Model-based Correction for Charge Traps in the Hubble Space Telescope’s Wide Field Camera 3 Near-IR Detector and Its Applications to Transiting Exoplanets and Brown Dwarfs

    Science.gov (United States)

    Zhou, Yifan; Apai, Dániel; Lew, Ben W. P.; Schneider, Glenn

    2017-06-01

    The Hubble Space Telescope Wide Field Camera 3 (WFC3) near-IR channel is extensively used in time-resolved observations, especially for transiting exoplanet spectroscopy as well as brown dwarf and directly imaged exoplanet rotational phase mapping. The ramp effect is the dominant source of systematics in the WFC3 for time-resolved observations, which limits its photometric precision. Current mitigation strategies are based on empirical fits and require additional orbits to help the telescope reach a thermal equilibrium. We show that the ramp-effect profiles can be explained and corrected with high fidelity using charge trapping theories. We also present a model for this process that can be used to predict and to correct charge trap systematics. Our model is based on a very small number of parameters that are intrinsic to the detector. We find that these parameters are very stable between the different data sets, and we provide best-fit values. Our model is tested with more than 120 orbits (∼40 visits) of WFC3 observations and is shown to provide near photon noise limited corrections for observations made with both staring and scanning modes of transiting exoplanets as well as for staring-mode observations of brown dwarfs. After our model correction, the light curve of the first orbit in each visit has the same photometric precision as subsequent orbits, so data from the first orbit no longer need to be discarded. Near-IR arrays with the same physical characteristics (e.g., JWST/NIRCam) may also benefit from the extension of this model if similar systematic profiles are observed.

  8. Evaluation of the Analytical Anisotropic Algorithm (AAA) in dose calculation for fields with non-uniform fluences considering heterogeneity correction; Avaliacao do Algoritmo Analitico Anisotropico (AAA) no calculo de dose para campos com fluencia nao uniforme considerando correcao de heterogeneidade

    Energy Technology Data Exchange (ETDEWEB)

    Bornatto, P.; Funchal, M.; Bruning, F.; Toledo, H.; Lyra, J.; Fernandes, T.; Toledo, F.; Marciao, C., E-mail: pricila_bornatto@yahoo.com.br [Hospital Erasto Gaertner (LPCC), Curitiba, PR (Brazil). Departamento de Radioterapia

    2014-08-15

    The purpose of this study is to evaluate the dose distributions calculated by the AAA algorithm (Varian Medical Systems) for fields with non-uniform fluences, considering heterogeneity correction. Five phantoms built from materials of different densities were used. These phantoms were scanned in a BrightSpeed CT scanner (GE Healthcare) on top of the MAPCHECK2 detector array (Sun Nuclear Corporation) and irradiated on a 600 CD linear accelerator (Varian Medical Systems) at 6 MV and a dose rate of 400 MU/min with an isocentric setup. The fluences used were exported from IMRT plans calculated by the ECLIPSE planning system (Varian Medical Systems), together with a 10x10 cm² field to assess the heterogeneity correction for a uniform fluence. The measured dose distribution was compared to the calculated one by gamma analysis with approval criteria of 3%/3 mm and a 10% threshold. The evaluation was performed using the SNCPatient software (Sun Nuclear Corporation), considering absolute dose normalized at the maximum. The best-performing phantoms were those made of low-density materials, with an average of 99.2% of points approved. By contrast, phantoms containing plates of higher-density material produced several fluences with fewer than 95% of points approved, with an average of 94.3%. A dependency between the fluence and the percentage of approved points was observed, whereas for the same fluence 100% of the points were approved in all phantoms. Since the approval criterion recommended in most centers for IMRT plans is 3%/3 mm with at least 95% of points approved, it can be concluded that, under these conditions, IMRT plans with heterogeneity correction can be delivered; however, quality control must be careful because of the difficulty the system has in accurately predicting the dose distribution in certain situations. (author)
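
    For reference, the gamma analysis used for this kind of comparison can be stated compactly: a measured point passes if some nearby calculated point is within 3% in dose and 3 mm in distance, combined in quadrature. A brute-force sketch (illustrative, not the SNCPatient implementation) is:

```python
# Global 2D gamma index with a 3%/3 mm criterion and a 10% dose threshold.
import numpy as np

def gamma_map(measured, calculated, spacing_mm, dd=0.03, dta_mm=3.0,
              threshold=0.10):
    """Gamma index per measured point, brute force over the whole grid."""
    norm = calculated.max()
    ny, nx = measured.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    gamma = np.full(measured.shape, np.nan)
    for iy in range(ny):
        for ix in range(nx):
            if measured[iy, ix] < threshold * norm:
                continue   # apply the 10% low-dose threshold
            dist2 = ((yy - iy)**2 + (xx - ix)**2) * spacing_mm**2
            dose2 = ((calculated - measured[iy, ix]) / (dd * norm))**2
            gamma[iy, ix] = np.sqrt(dist2 / dta_mm**2 + dose2).min()
    return gamma

# Approval rate = fraction of evaluated points with gamma <= 1, e.g.:
# g = gamma_map(m, c, spacing_mm=1.0)
# pass_rate = np.mean(g[~np.isnan(g)] <= 1)
```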

  9. Quantum error correction for beginners

    International Nuclear Information System (INIS)

    Devitt, Simon J; Nemoto, Kae; Munro, William J

    2013-01-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation are now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future. (review article)
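
    In the spirit of the example-driven treatment advocated above, the smallest active code, the three-qubit bit-flip repetition code, can even be simulated classically when the noise is restricted to bit flips. A minimal sketch:

```python
# Three-qubit bit-flip repetition code, simulated classically (bit-flip
# noise only): encoding, independent errors, and majority-vote recovery.
import random

def encode(bit):
    return [bit, bit, bit]              # |0> -> |000>, |1> -> |111>

def apply_noise(codeword, p):
    # Flip each physical bit independently with probability p.
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    # Majority vote plays the role of syndrome measurement + recovery.
    return int(sum(codeword) >= 2)

trials, p = 100_000, 0.05
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(failures / trials)   # ~3p^2 = 0.7%, versus the raw error rate p = 5%
```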

  10. A new way of utilizing pole face windings and magnetic field corrections for independent tuning of betatron wave numbers and chromaticities in the CERN Proton Synchrotron

    CERN Document Server

    Gouiran, R

    1978-01-01

    Precise control of the quadrupole and sextupole components of the magnetic fields in focusing and defocusing sectors respectively was achieved by the combined use of pole-face and yoke windings with three separate power supplies synchronously programmed by a computer. Experience of this technique led to a new philosophy in the design of pole-face windings, in which they become an integral and active part of the magnet. With the arrangement described, focusing and guiding functions are partially separated and an old combined-function accelerator can be transformed effectively into a more flexible separate-function machine without any decrease in available straight- section space. (5 refs).

  11. Brain refractive index measured in vivo with high-NA defocus-corrected full-field OCT and consequences for two-photon microscopy.

    Science.gov (United States)

    Binding, Jonas; Ben Arous, Juliette; Léger, Jean-François; Gigan, Sylvain; Boccara, Claude; Bourdieu, Laurent

    2011-03-14

    Two-photon laser scanning microscopy (2PLSM) is an important tool for in vivo tissue imaging with sub-cellular resolution, but the penetration depth of current systems is potentially limited by sample-induced optical aberrations. To quantify these, we measured the refractive index n' in the somatosensory cortex of 7 rats in vivo using defocus optimization in full-field optical coherence tomography (ff-OCT). We found n' to be independent of imaging depth or rat age. From these measurements, we calculated that two-photon imaging beyond 200 µm into the cortex is limited by spherical aberration, indicating that adaptive optics will improve imaging depth.

  12. Low-field magnetic resonance imaging or combined ultrasonography and anti-cyclic citrullinated peptide antibody improve correct classification of individuals as established rheumatoid arthritis

    DEFF Research Database (Denmark)

    Pedersen, Jens K; Lorenzen, Tove; Ejbjerg, Bo

    2014-01-01

    (RA). METHODS: In 53 individuals from a population-based, cross-sectional study, historic fulfilment of the American College of Rheumatology (ACR) 1987 criteria ("classification") or RA diagnosed by a rheumatologist ("diagnosis") were used as standard references. The sensitivity, specificity and Area....../specificity) was 78% (62%/94%) (classification) and 85% (69%/100%) (diagnosis), while for the total synovitis score of MCP joints plus wrist (cut-off ≥10) it was 78% (62%/94%) (both classification and diagnosis). CONCLUSIONS: Compared with the ACR 1987 criteria, low-field MRI alone or adapted criteria incorporating...

  13. Automated One-Loop Calculations with GoSam

    CERN Document Server

    Cullen, Gavin; Heinrich, Gudrun; Luisoni, Gionata; Mastrolia, Pierpaolo; Ossola, Giovanni; Reiter, Thomas; Tramontano, Francesco

    2012-01-01

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop.

  14. Automated ISS Flight Utilities

    Science.gov (United States)

    Offermann, Jan Tuzlic

    2016-01-01

    EVADES output. As mentioned above, GEnEVADOSE makes extensive use of ROOT version 6, the data analysis framework developed at the European Organization for Nuclear Research (CERN), and the code is written to the C++11 standard (as are the other projects). My second project is the Automated Mission Reference Exposure Utility (AMREU).Unlike GEnEVADOSE, AMREU is a combination of three frameworks written in both Python and C++, also making use of ROOT (and PyROOT). Run as a combination of daily and weekly cron jobs, these macros query the SRAG database system to determine the active ISS missions, and query minute-by-minute radiation dose information from ISS-TEPC (Tissue Equivalent Proportional Counter), one of the radiation detectors onboard the ISS. Using this information, AMREU creates a corrected data set of daily radiation doses, addressing situations where TEPC may be offline or locked up by correcting doses for days with less than 95% live time (the total amount time the instrument acquires data) by averaging the past 7 days. As not all errors may be automatically detectable, AMREU also allows for manual corrections, checking an updated plaintext file each time it runs. With the corrected data, AMREU generates cumulative dose plots for each mission, and uses a Python script to generate a flight note file (.docx format) containing these plots, as well as information sections to be filled in and modified by the space weather environment officers with information specific to the week. AMREU is set up to run without requiring any user input, and it automatically archives old flight notes and information files for missions that are no longer active. My other projects involve cleaning up a large data set from the Charged Particle Directional Spectrometer (CPDS), joining together many different data sets in order to clean up information in SRAG SQL databases, and developing other automated utilities for displaying information on active solar regions, that may be used by the

  15. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  16. Automation of industrial bioprocesses.

    Science.gov (United States)

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems some milestones are highlighted. Special attention is given to the influence of Standards and Guidelines on the development of automation systems.

  17. Quantum corrections to Schwarzschild black hole

    Energy Technology Data Exchange (ETDEWEB)

    Calmet, Xavier; El-Menoufi, Basem Kamal [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom)

    2017-04-15

    Using effective field theory techniques, we compute quantum corrections to spherically symmetric solutions of Einstein's gravity and focus in particular on the Schwarzschild black hole. Quantum modifications are covariantly encoded in a non-local effective action. We work to quadratic order in curvatures simultaneously taking local and non-local corrections into account. Looking for solutions perturbatively close to that of classical general relativity, we find that an eternal Schwarzschild black hole remains a solution and receives no quantum corrections up to this order in the curvature expansion. In contrast, the field of a massive star receives corrections which are fully determined by the effective field theory. (orig.)

  18. 78 FR 75449 - Miscellaneous Corrections; Corrections

    Science.gov (United States)

    2013-12-12

    ... cross- references, correcting grammatical errors, revising language for clarity and consistency, and... practice. Specifically, these amendments are to correct grammatical errors and to revise cross-references.... The final rule contained minor errors in grammar, punctuation, and referencing. This document corrects...

  19. Atmospheric Correction of Ocean Color Imagery: Test of the Spectral Optimization Algorithm with the Sea-Viewing Wide Field-of-View Sensor.

    Science.gov (United States)

    Chomko, R M; Gordon, H R

    2001-06-20

    We implemented the spectral optimization algorithm [SOA; Appl. Opt. 37, 5560 (1998)] in an image-processing environment and tested it with Sea-viewing Wide Field-of-View Sensor (SeaWiFS) imagery from the Middle Atlantic Bight and the Sargasso Sea. We compared the SOA and the standard SeaWiFS algorithm on two days that had significantly different atmospheric turbidities but, because of the location and time of the year, nearly the same water properties. The SOA-derived pigment concentration showed excellent continuity over the two days, with the relative difference in pigments exceeding 10% only in regions that are characteristic of high advection. The continuity in the derived water-leaving radiances at 443 and 555 nm was also within ~10%. There was no obvious correlation between the relative differences in pigments and the aerosol concentration. In contrast, standard processing showed poor continuity in derived pigments over the two days, with the relative differences correlating strongly with atmospheric turbidity. SOA-derived atmospheric parameters suggested that the retrieved ocean and atmospheric reflectances were decoupled on the more turbid day. On the clearer day, for which the aerosol concentration was so low that relatively large changes in aerosol properties resulted in only small changes in aerosol reflectance, water patterns were evident in the aerosol properties. This result implies that SOA-derived atmospheric parameters cannot be accurate in extremely clear atmospheres.

  20. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. This book consists of two parts: I: Agent-Based Complex Automated Negotiations, and II: Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer reviews by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  1. Corrective Jaw Surgery

    Medline Plus

    Full Text Available A cleft lip may require one or more ... Orthognathic surgery is performed to correct the misalignment ...

  2. Automated imaging system for single molecules

    Science.gov (United States)

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided, yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
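
    The uneven-illumination correction mentioned above is commonly implemented as flat-field division; the sketch below assumes a blank reference frame (and optionally a dark frame) is available, and is illustrative rather than the patented system's code.

```python
# Flat-field correction: divide out a normalized illumination profile
# estimated from a blank reference frame.
import numpy as np

def flat_field_correct(raw, flat, dark=None):
    """Correct vignetting/uneven illumination in a microscope image."""
    raw = raw.astype(float)
    flat = flat.astype(float)
    if dark is not None:          # optional dark-frame subtraction
        raw = raw - dark
        flat = flat - dark
    gain = flat / flat.mean()     # unitless illumination pattern
    return raw / np.clip(gain, 1e-6, None)
```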

  3. Automated Testing of Event-Driven Applications

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning

    may be tested by selecting an interesting input (i.e. a sequence of events), and deciding if a failure occurs when the selected input is applied to the event-driven application under test. Automated testing promises to reduce the workload for developers by automatically selecting interesting inputs...... and detect failures. However, it is non-trivial to conduct automated testing of event-driven applications because of, for example, infinite input spaces and the absence of specifications of correct application behavior. In this PhD dissertation, we identify a number of specific challenges when conducting...... automated testing of event-driven applications, and we present novel techniques for solving these challenges. First, we present an algorithm for stateless model-checking of event-driven applications with partial-order reduction, and we show how this algorithm may be used to systematically test web...

  4. Semi-automated Approach to Mapping Sub-hectare Agricultural Fields using Very High Resolution Data in a High-Performance Computing Environment

    Science.gov (United States)

    Wooten, M.; Neigh, C. S. R.; Carroll, M.; McCarty, J. L.

    2017-12-01

    In areas susceptible to drought such as sub-Saharan Africa, Crop Area (CA) and agricultural mapping have become increasingly important as strain on natural ecosystems increases. In Ethiopia alone, the population has grown four-fold in the last 70 years, and rapidly growing human populations bring added stress to ecosystems as more wildlands are converted to pastures and subsistence agriculture. Monitoring change in agriculture is one of the more essential goals of famine early warning systems. However, due to the sub-hectare size of rainfed agricultural fields in regions such as Tigray, Ethiopia, moderate-resolution satellite imagery is insufficient to capture these smallholder farms. Thanks to the increasing density of observations and ease of access to very high resolution (VHR) data, we have developed a generalized method for mapping CA with VHR data and have used it to generate a wall-to-wall CA map for the entire Tigray region and samples in Myanmar, Senegal, and Vietnam. Here we present the methodology and early results as well as potential future applications.

  5. An automated system for whole microscopic image acquisition and analysis.

    Science.gov (United States)

    Bueno, Gloria; Déniz, Oscar; Fernández-Carrobles, María Del Milagro; Vállez, Noelia; Salido, Jesús

    2014-09-01

    The field of anatomic pathology has experienced major changes over the last decade. Virtual microscopy (VM) systems have allowed experts in pathology and other biomedical areas to work in a safer and more collaborative way. VMs are automated systems capable of digitizing microscopic samples that were traditionally examined one by one. The possibility of having digital copies reduces the risk of damaging original samples, and also makes it easier to distribute copies among other pathologists. This article describes the development of an automated high-resolution whole slide imaging (WSI) system tailored to the needs and problems encountered in digital imaging for pathology, from hardware control to the full digitization of samples. The system was built with an additional digital monochrome camera alongside the default color camera, together with LED transmitted illumination (RGB). Monochrome cameras are the preferred method of acquisition for fluorescence microscopy. The system correctly digitizes and assembles large high-resolution microscope images for both brightfield and fluorescence. The quality of the digital images has been quantified using three metrics based on sharpness, contrast and focus. It has been validated on 150 tissue samples of brain autopsies, prostate biopsies and lung cytologies, at five magnifications: 2.5×, 10×, 20×, 40×, and 63×. The article is focused on the hardware set-up and the acquisition software, although results of the implemented image processing techniques included in the software and applied to the different tissue samples are also presented. © 2014 Wiley Periodicals, Inc.
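
    The three image-quality metrics are not specified in detail here, so the sketch below uses common stand-ins: variance of the Laplacian for sharpness, RMS contrast, and the Tenengrad gradient measure for focus. Treat these as representative examples under that assumption, not the system's exact metrics.

```python
# Common no-reference image-quality metrics for digitized slides.
import numpy as np
from scipy import ndimage

def sharpness(img):
    # Variance of the Laplacian: larger values mean more fine detail.
    return ndimage.laplace(img.astype(float)).var()

def rms_contrast(img):
    img = img.astype(float)
    return img.std() / img.mean()

def focus_score(img):
    # Tenengrad measure: mean squared Sobel gradient magnitude.
    img = img.astype(float)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return float(np.mean(gx**2 + gy**2))
```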

  6. Development and Application of Tools for MRI Analysis - A Study on the Effects of Exercise in Patients with Alzheimer's Disease and Generative Models for Bias Field Correction in MR Brain Imaging

    DEFF Research Database (Denmark)

    Larsen, Christian Thode

    exercise” (ADEX), where longitudinal Freesurfer analysis was used to obtain segmentations of the hippocampal subfields and cortical regions in a subgroup of participants before and after a four-month exercise period. The participants performed moderate-to-high aerobic exercise for one hour, three times per...... due to the intervention. However, it was found that exercise load (attendance and training intensity) correlated with changes in the hippocampus and in frontal and cingulate cortical thickness. Furthermore, changes in frontal and cingulate cortical thickness were found to correlate with changes...... for longitudinal correction of the bias field, as well as a model that does not require brain masking or probabilistic, anatomical atlases in order to perform well. Finally, the thesis presents the realization of these models in the software package "Intensity Inhomogeneity Correction”, which will be made publicly...

  7. Advances in Automation and Robotics

    CERN Document Server

    International conference on Automation and Robotics ICAR2011

    2012-01-01

    The international conference on Automation and Robotics, ICAR2011, was held during December 12-13, 2011 in Dubai, UAE. The proceedings of ICAR2011 have been published by Springer Lecture Notes in Electrical Engineering, and include 163 excellent papers selected from more than 400 submissions.   The conference was intended to bring together researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, production and management.   This part of the proceedings includes 81 papers contributed by researchers in relevant topic areas covered at ICAR2011 from various countries such as France, Japan, USA, Korea and China.     Many papers present recent advanced research work; some give new solutions to problems in the field, with strong evidence and detailed demonstrations, while others describe the application of their designed and...

  8. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    Scanning probe microscopy (SPM) techniques rely on computer recordings of interactions between the tip of a minute probe and the surface of the small specimen as a function of position; the measurements are used to depict an image of the atomic-scale surface topography on the computer screen....... Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...... electrochemical measurements as well as elemental analysis look very promising for elucidating corrosion reaction mechanisms. The study of initial surface reactions at the atomic or submicron level is becoming an important field of research in the understanding of corrosion processes. At present, mainly two...

  9. Automating occupational protection records systems

    International Nuclear Information System (INIS)

    Lyon, M.; Martin, J.B.

    1991-10-01

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs

  10. Colorimetric sensor for bad odor detection using automated color correction

    Science.gov (United States)

    Schmitt, K.; Tarantik, K.; Pannek, C.; Benito-Altamirano, I.; Casals, O.; Fàbrega, C.; Romano-Rodríguez, A.; Wöllenstein, J.; Prades, J. D.

    2017-06-01

    Colorimetric sensors based on color-changing dyes offer a convenient approach for the quantitative measurement of gases. An integrated, mobile colorimetric sensor can be particularly helpful for occasional gas measurements, such as informal air quality checks for bad odors. In these situations, the main requirements are high availability, easy usage, and high specificity towards a single chemical compound, combined with cost-efficient production. In this contribution, we show how a well-established colorimetric method can be adapted for easy operation and readout, making it suitable for the untrained end user. As an example, we present the use of pH indicators for the selective and reversible detection of NH3 in air (one relevant gas contributing to bad odors) using gas-sensitive layers dip-coated on glass substrates. Our results show that the method can be adapted to detect NH3 concentrations lower than 1 ppm, with measure-to-result times in the range of a few minutes. We demonstrate that the color measurements can be carried out with the optical signals of RGB sensors, without losing quantitative performance.
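
    A readout chain of this kind typically reduces the RGB triplet to a scalar response and inverts a calibration curve measured at known concentrations. The sketch below uses invented calibration numbers purely for illustration:

```python
# Mapping RGB-sensor readings of a color-changing dye to concentration.
import numpy as np

# Made-up (R, G, B) readings of the dye layer at known NH3 levels [ppm].
calib_ppm = np.array([0.0, 0.5, 1.0, 2.0, 5.0])
calib_rgb = np.array([[200, 40, 60], [185, 55, 70], [170, 70, 80],
                      [140, 95, 100], [90, 130, 135]], dtype=float)

# Reduce each RGB triplet to a scalar response (normalized red channel).
response = calib_rgb[:, 0] / calib_rgb.sum(axis=1)

def ppm_from_rgb(rgb):
    """Invert the calibration curve by piecewise-linear interpolation."""
    r = rgb[0] / sum(rgb)
    # np.interp needs increasing x; the response falls with concentration.
    return float(np.interp(r, response[::-1], calib_ppm[::-1]))

print(ppm_from_rgb([160.0, 78.0, 85.0]))   # ~1.3 ppm on this toy curve
```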

  11. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  12. Guidelines for automated control systems for stoves

    DEFF Research Database (Denmark)

    Illerup, Jytte Boll; Mandl, Christoph; Obernberger, Ingwald

    of the project proposed can be structured as follows. Objectives related to emission reduction: (i) development and implementation of automated control systems for stoves, as a feature of new stoves but also as retrofit units for existing models; automated control systems can help to widely eliminate user-induced...... operation which could be comparable to the emission level of automated small-scale boilers; (ii) evaluation and testing of foam ceramic materials for efficient PM emission reduction; (iii) evaluation of the implementation of modern chimney draught regulators. Objectives related to increasing efficiency and new fields...... partners from 4 European countries collaborated within Woodstoves2020 (see next page). This document summarises the outcomes of the investigations regarding the improvement of wood stoves by the application of automated control concepts as a primary measure for emission reduction. It should support stove

  13. Automation bias: empirical results assessing influencing factors.

    Science.gov (United States)

    Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C

    2014-05-01

    To investigate the rate of automation bias - the propensity of people to over-rely on automated advice - and the factors associated with it. Tested factors were attitudinal (trust and confidence), non-attitudinal (decision support experience and clinical experience), and environmental (task difficulty). The paradigm of simulated decision support advice within a prescribing context was used. The study employed a within-participant before-after design, whereby 26 UK NHS General Practitioners were shown 20 hypothetical prescribing scenarios with prevalidated correct and incorrect answers; advice was incorrect in 6 scenarios. They were asked to prescribe for each case, and were then shown simulated advice. Participants were then asked whether they wished to change their prescription, and the post-advice prescription was recorded. The rate of overall decision switching was captured. Automation bias was measured by negative consultations - correct to incorrect prescription switching. Participants changed prescriptions in 22.5% of scenarios. The pre-advice accuracy rate of the clinicians was 50.38%, which improved to 58.27% post-advice. The CDSS improved decision accuracy in 13.1% of prescribing cases. The rate of automation bias, as measured by decision switches from correct pre-advice to incorrect post-advice, was 5.2% of all cases - a net improvement of 8%. More immediate factors such as trust in the specific CDSS, decision confidence, and task difficulty influenced the rate of decision switching. Lower clinical experience was associated with more decision switching. Age, DSS experience and trust in CDSS generally were not significantly associated with decision switching. This study adds to the literature surrounding automation bias in terms of its potential frequency and influencing factors. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps in the course of RIA, i.e., the preparation of samples for reaction. There were two types of instrumentation: a semi-automatic pipette, and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or diluting of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  15. Automated stopcock actuator

    OpenAIRE

    Vandehey, N. T.; O\\'Neil, J. P.

    2015-01-01

    Introduction We have developed a low-cost stopcock valve actuator for radiochemistry automation built using a stepper motor and an Arduino, an open-source single-board microcontroller. The controller hardware can be programmed to run by serial communication or via two 5–24 V digital lines for simple integration into any automation control system. This valve actuator allows for automated use of a single, disposable stopcock, providing a number of advantages over stopcock manifold systems ...

  16. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...... with the automatic verification of three protocols: a secure exam protocol, Google’s Certificate Transparency, and an improved version of Bingo Voting. We find through automated verification that all three protocols satisfy verifiability while only the first two protocols meet accountability....

  17. Modulation of glacier ablation by tephra coverage from Eyjafjallajökull and Grímsvötn volcanoes, Iceland: an automated field experiment

    Science.gov (United States)

    Möller, Rebecca; Möller, Marco; Kukla, Peter A.; Schneider, Christoph

    2018-01-01

    We report results from a field experiment investigating the influence of volcanic tephra coverage on glacier ablation. These influences are known to be significantly different from those of moraine debris on glaciers due to the contrasting grain size distribution and thermal conductivity. Thus far, the influences of tephra deposits on glacier ablation have rarely been studied. For the experiment, artificial plots of two different tephra types from Eyjafjallajökull and Grímsvötn volcanoes were installed on a snow-covered glacier surface of Vatnajökull ice cap, Iceland. Snow-surface lowering and atmospheric conditions were monitored in summer 2015 and compared to a tephra-free reference site. For each of the two volcanic tephra types, three plots of variable thickness (~1.5, ~8.5 and ~80 mm) were monitored. After limiting the records to a period of reliable measurements, a 50-day data set of hourly records was obtained, which can be downloaded from the Pangaea data repository (https://www.pangaea.de; doi:10.1594/PANGAEA.876656). The experiment shows a substantial increase in snow-surface lowering rates under the ~1.5 and ~8.5 mm tephra plots when compared to uncovered conditions. Under the thick tephra cover some insulating effects could be observed. These results are in contrast to other studies which depicted insulating effects for much thinner tephra coverage on bare-ice glacier surfaces. Differences between the influences of the two different petrological types of tephra exist but are negligible compared to the effect of tephra coverage overall.

  18. Management Planning for Workplace Automation.

    Science.gov (United States)

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  19. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  1. SU-G-IeP1-01: A Novel MRI Post-Processing Algorithm for Visualization of the Prostate LDR Brachytherapy Seeds and Calcifications Based On B0 Field Inhomogeneity Correction and Hough Transform

    Energy Technology Data Exchange (ETDEWEB)

    Nosrati, R [Ryerson University, Toronto, Ontario (Canada); Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Soliman, A; Owrangi, A [Sunnybrook Research Institute, Toronto, Ontario (Canada); Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Ghugre, N [Sunnybrook Research Institute, Toronto, Ontario (Canada); University of Toronto, Toronto, ON (Canada); Morton, G [Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); University of Toronto, Toronto, ON (Canada); Pejovic-Milic, A [Ryerson University, Toronto, Ontario (Canada); Song, W [Ryerson University, Toronto, Ontario (Canada); Sunnybrook Research Institute, Toronto, Ontario (Canada); Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); University of Toronto, Toronto, ON (Canada)

    2016-06-15

    Purpose: This study aims at developing an MRI-only workflow for post-implant dosimetry of the prostate LDR brachytherapy seeds. The specific goal here is to develop a post-processing algorithm to produce positive contrast for the seeds and prostatic calcifications and differentiate between them on MR images. Methods: An agar-based phantom incorporating four dummy seeds (I-125) and five calcifications of different sizes (from sheep cortical bone) was constructed. Seeds were placed arbitrarily in the coronal plane. The phantom was scanned with a 3T Philips Achieva MR scanner using an 8-channel head coil array. Multi-echo turbo spin echo (ME-TSE) and multi-echo gradient recalled echo (ME-GRE) sequences were acquired. Due to minimal susceptibility artifacts around seeds, the ME-GRE sequence (flip angle=15; TR/TE=20/2.3/2.3; resolution=0.7×0.7×2 mm³) was further processed. The induced field inhomogeneity due to the presence of titanium-encapsulated seeds was corrected using a B0 field map. The B0 map was calculated from the ME-GRE sequence using the phase difference at two different echo times. Initially, the product of the first echo and the B0 map was calculated. The features corresponding to the seeds were then extracted in three steps: 1) the edge pixels were isolated using the “Prewitt” operator; 2) the Hough transform was employed to detect ellipses approximately matching the dimensions of the seeds; and 3) at the position and orientation of the detected ellipses an ellipse was drawn on the B0-corrected image. Results: The proposed B0-correction process produced positive contrast for the seeds and calcifications. The Hough transform based on the Prewitt edge operator successfully identified all the seeds according to their ellipsoidal shape and dimensions in the edge image. Conclusion: The proposed post-processing algorithm successfully visualized the seeds and calcifications with positive contrast and differentiates between them according to their shapes. Further
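
    A rough sketch of the described pipeline, using scikit-image equivalents of the steps (phase-difference B0 map, positive-contrast product image, Prewitt edges, Hough ellipse detection), is given below; array names and parameter values are illustrative assumptions, not the authors' code.

```python
# Seed detection sketch: B0 map -> positive contrast -> edges -> ellipses.
import numpy as np
from skimage.filters import prewitt
from skimage.transform import hough_ellipse

def b0_map(phase_te1, phase_te2, te1, te2):
    # Field map (rad/s) from the phase evolution between two echo times.
    return (phase_te2 - phase_te1) / (te2 - te1)

def detect_seeds(mag_echo1, b0, min_size=2, max_size=8):
    contrast = mag_echo1 * b0                 # positive contrast for seeds
    grad = prewitt(contrast)
    edges = grad > 0.5 * grad.max()           # crude edge binarization
    # hough_ellipse is expensive; bound the axes to seed-like dimensions.
    ellipses = hough_ellipse(edges, accuracy=2, threshold=10,
                             min_size=min_size, max_size=max_size)
    ellipses.sort(order='accumulator')        # strongest candidates last
    return ellipses
```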

  2. Magnetospheric ULF wave studies in the frame of Swarm mission: new advanced tools for automated detection of pulsations in magnetic and electric field observations

    Science.gov (United States)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Georgiou, Marina; Giamini, Sigiava A.; Sandberg, Ingmar; Haagmans, Roger

    2014-05-01

    The rekindling of the interest in space science in the last 15 years has led to many successful satellite missions in the Earth's magnetosphere and topside ionosphere, which were able to provide the scientific community with high-quality data on the magnetic and electric fields surrounding our planet. This data pool will be further enriched by the measurements of ESA's Swarm mission, a constellation of three satellites in different polar orbits, flying at altitudes from 400 to 550 km, which was launched on the 22nd of November 2013. Aiming at the best scientific exploitation of this corpus of accumulated data, we have developed a set of analysis tools that can cope with measurements of various spacecraft, at various regions of the magnetosphere and in the topside ionosphere. Our algorithms are based on a combination of wavelet spectral methods and artificial neural network techniques and are suited for the detection of waves and wave-like disturbances as well as the extraction of several physical parameters. Our recent work demonstrates the applicability of our developed analysis tools, both for individual case studies and statistical analysis of ultra low frequency (ULF) waves. We provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz), Pc4 (7-22 mHz) and Pc5 (1-7 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA, GIMA and IMAGE magnetometer networks. Our study shows that the same wave event, characterized by increased activity in the high end of the Pc3 band, was simultaneously observed by all three satellite missions and by certain stations of ground networks. This observation provides a strong argument in favour of the
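
    As an illustration of the wavelet part of such tools, the sketch below computes a complex-Morlet wavelet power spectrum of a magnetometer time series with PyWavelets and flags intervals of elevated Pc3-band (22-100 mHz) power; the wavelet choice and threshold are assumptions, and the neural-network classification stage is omitted.

```python
# Wavelet-based ULF pulsation detection (Pc3 band) on a magnetometer trace.
import numpy as np
import pywt  # PyWavelets

def pc3_band_power(b_field, fs, fmin=0.022, fmax=0.100):
    """Band-averaged complex-Morlet wavelet power in the Pc3 band."""
    wavelet = 'cmor1.5-1.0'
    freqs = np.linspace(fmin, fmax, 30)          # target frequencies [Hz]
    fc = pywt.central_frequency(wavelet)
    scales = fc * fs / freqs                     # scale for each frequency
    coeffs, _ = pywt.cwt(b_field, scales, wavelet, sampling_period=1.0 / fs)
    return (np.abs(coeffs) ** 2).mean(axis=0)    # power vs. time

def flag_pulsations(power, n_sigma=3.0):
    # Flag samples whose band power stands well above the background.
    return power > np.median(power) + n_sigma * power.std()
```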

  3. Bright-field in situ hybridization for HER2 gene amplification in breast cancer using tissue microarrays: correlation between chromogenic (CISH) and automated silver-enhanced (SISH) methods with patient outcome.

    Science.gov (United States)

    Francis, Glenn D; Jones, Mark A; Beadle, Geoffrey F; Stein, Sandra R

    2009-06-01

    with immunohistochemistry results and with breast cancer-specific survival. HER2 SISH testing combines the advantages of automation and bright-field microscopy to facilitate workflow within the laboratory, improves turnaround time, and correlates with patient outcome.

  4. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation must therefore be considered together when determining the appropriate level of automation. The conventional concept of the automation rate is limited in that it does not consider the effects of automation on human operators. Thus, in this paper, a new estimation method for the automation rate is suggested that accounts for the positive and negative effects of automation at the same time, overcoming this limitation.

  5. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  6. FAST AUTOMATED DECOUPLING AT RHIC

    International Nuclear Information System (INIS)

    BEEBE-WANG, J.J.

    2005-01-01

    Coupling correction is essential for the operational performance of RHIC. The independence of the transverse degrees of freedom makes diagnostics and tune control easier, and it is advantageous to operate an accelerator close to the coupling resonance to minimize nearby nonlinear sidebands. An automated coupling correction application iDQmini has been developed for RHIC routine operations. The application decouples RHIC globally by minimizing the tune separation through finding the optimal settings of two orthogonal skew quadrupole families. The program iDQmini provides options of automatic, semi-automatic and manual decoupling operations. It accesses tune information from all RHIC tune measurement systems: the PLL (phase lock loop), the high frequency Schottky system and the tune meter. It also supplies tune and skew quadrupole scans, finds the minimum tune separation, displays the real-time results and interfaces with the RHIC control system. We summarize the capabilities of the coupling correction application iDQmini, and discuss the operational protections incorporated in the program.

  7. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

    This book describes automated debugging approaches for the bugs and the faults which appear in different abstraction levels of a hardware system. The authors employ a transaction-based debug approach to systems at the transaction level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at the RTL and gate level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate level. The various debug approaches described achieve high diagnosis accuracy and reduce the debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  8. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas.

    Science.gov (United States)

    Timmons, Joshua J; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T

    2017-10-12

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high-throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated, in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was lower for auto-segmentation without post-processing than with it, indicating convergence on the manually corrected model. A small but nonzero relative difference between electric field maps from models with and without manual correction was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in
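
    For reference, the Dice coefficient used to compare the segmentations has a one-line definition, 2|A∩B|/(|A|+|B|); a short illustrative implementation on boolean masks (not the authors' code) follows.

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(mask_a, mask_b):
    """Dice = 2|A n B| / (|A| + |B|); 1.0 means identical masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0   # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom
```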

  9. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high-throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated, in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was lower for auto-segmentation without post-processing than with it, indicating convergence on the manually corrected model. A small but nonzero relative difference between electric field maps from models with and without manual correction was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in

  10. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  11. Automation benefits BWR customers

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    A description is given of the increasing use of automation at General Electric's Wilmington fuel fabrication plant. Computerised systems and automated equipment perform a large number of inspections, inventory and process operations, and new advanced systems are being continuously introduced to reduce operator errors and expand product reliability margins. (U.K.)

  12. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  13. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes.

  14. Identity Management Processes Automation

    Directory of Open Access Journals (Sweden)

    A. Y. Lavrukhin

    2010-03-01

    Full Text Available Implementation of identity management systems consists of two main parts, consulting and automation. The consulting part includes development of a role model and identity management processes description. The automation part is based on the results of consulting part. This article describes the most important aspects of IdM implementation.

  15. Work and Programmable Automation.

    Science.gov (United States)

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  16. Library Automation in Pakistan.

    Science.gov (United States)

    Haider, Syed Jalaluddin

    1998-01-01

    Examines the state of library automation in Pakistan. Discusses early developments; financial support by the Netherlands Library Development Project (Pakistan); lack of automated systems in college/university and public libraries; usage by specialist libraries; efforts by private-sector libraries and the National Library in Pakistan; commonly used…

  17. Library Automation Style Guide.

    Science.gov (United States)

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  18. Planning for Office Automation.

    Science.gov (United States)

    Sherron, Gene T.

    1982-01-01

    The steps taken toward office automation by the University of Maryland are described. Office automation is defined and some types of word processing systems are described. Policies developed in the writing of a campus plan are listed, followed by a section on procedures adopted to implement the plan. (Author/MLW)

  19. The Automated Office.

    Science.gov (United States)

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  20. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  1. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings that correctly segmented 67% of the total possible diatom valves and fragments from broad fields of view (183 light microscope images containing 255 diatom particles were examined; of these, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software). Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, giving the software an approximate five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.
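
    The reported five-fold advantage follows directly from the counts and timings quoted above; a quick arithmetic check:

        # Throughput check using only the figures reported in the abstract.
        manual_particles, manual_seconds = 255, 400
        auto_particles, auto_seconds = 216, 68

        manual_rate = manual_particles / manual_seconds  # ~0.64 particles/s
        auto_rate = auto_particles / auto_seconds        # ~3.18 particles/s
        print(f"speed-up = {auto_rate / manual_rate:.1f}x")  # ~5.0x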

  2. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag and drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  3. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  4. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-02-13

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  5. Brane cosmology with curvature corrections

    International Nuclear Information System (INIS)

    Kofinas, Georgios; Maartens, Roy; Papantonopoulos, Eleftherios

    2003-01-01

    We study the cosmology of the Randall-Sundrum brane-world where the Einstein-Hilbert action is modified by curvature correction terms: a four-dimensional scalar curvature from induced gravity on the brane, and a five-dimensional Gauss-Bonnet curvature term. The combined effect of these curvature corrections to the action removes the infinite-density big bang singularity, although the curvature can still diverge for some parameter values. A radiation brane undergoes accelerated expansion near the minimal scale factor, for a range of parameters. This acceleration is driven by the geometric effects, without an inflaton field or negative pressures. At late times, conventional cosmology is recovered. (author)

  6. Automated borehole gravity meter system

    International Nuclear Information System (INIS)

    Lautzenhiser, Th.V.; Wirtz, J.D.

    1984-01-01

    An automated borehole gravity meter system for measuring gravity within a wellbore. The gravity meter includes leveling devices for leveling the borehole gravity meter, displacement devices for applying forces to a gravity sensing device within the gravity meter to bring the gravity sensing device to a predetermined or null position. Electronic sensing and control devices are provided for (i) activating the displacement devices, (ii) sensing the forces applied to the gravity sensing device, (iii) electronically converting the values of the forces into a representation of the gravity at the location in the wellbore, and (iv) outputting such representation. The system further includes electronic control devices with the capability of correcting the representation of gravity for tidal effects, as well as calculating and outputting the formation bulk density and/or porosity.

  7. Automated system of monitoring and positioning of functional units of mining technological machines for coal-mining enterprises

    Directory of Open Access Journals (Sweden)

    Meshcheryakov Yaroslav

    2018-01-01

    Full Text Available This article is devoted to the development of an automated monitoring and positioning system for the functional units of mining technological machines. It describes the structure, element base, and algorithms for identifying the operating states of a walking excavator; the various types of errors in the functioning of microelectromechanical gyroscopes and accelerometers; and methods for their correction based on the Madgwick fusion filter. The results of industrial tests of the automated monitoring and positioning system for functional units at one of the opencast coal mines of Kuzbass are presented. This work is addressed to specialists working in the fields of embedded systems and control systems, radio electronics, mechatronics, and robotics.
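
    The Madgwick fusion filter mentioned above corrects gyroscope drift with accelerometer data. The sketch below is not the Madgwick formulation itself (which operates on quaternions) but a much simpler complementary filter on a single tilt angle, included only to illustrate the fusion idea; all signals and constants are made up.

        import math

        def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
            """Blend the integrated gyro rate (smooth but drifting) with the
            accelerometer tilt estimate (noisy but drift-free)."""
            gyro_angle = angle + gyro_rate * dt          # short-term: integrate gyro
            accel_angle = math.atan2(accel_x, accel_z)   # long-term: gravity direction
            return alpha * gyro_angle + (1.0 - alpha) * accel_angle

        # Made-up scenario: constant true tilt of 0.1 rad, a gyro that only
        # reports bias (pure drift), and a clean accelerometer.
        angle = 0.0
        for _ in range(1000):
            angle = complementary_filter(angle, gyro_rate=0.002,
                                         accel_x=math.sin(0.1),
                                         accel_z=math.cos(0.1), dt=0.01)
        print(f"estimated tilt = {angle:.3f} rad (true value 0.100)")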

  8. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book's chapters are written by internationally recognized experts in this field. Topics include the evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs for advanced error correcting techniques.

  9. Automated Comparative Auditing of NCIT Genomic Roles Using NCBI

    Science.gov (United States)

    Cohen, Barry; Oren, Marc; Min, Hua; Perl, Yehoshua; Halper, Michael

    2008-01-01

    Biomedical research has identified many human genes and various knowledge about them. The National Cancer Institute Thesaurus (NCIT) represents such knowledge as concepts and roles (relationships). Due to the rapid advances in this field, it is to be expected that the NCIT's Gene hierarchy will contain role errors. A comparative methodology to audit the Gene hierarchy with the use of the National Center for Biotechnology Information's (NCBI's) Entrez Gene database is presented. The two knowledge sources are accessed via a pair of Web crawlers to ensure up-to-date data. Our algorithms then compare the knowledge gathered from each, identify discrepancies that represent probable errors, and suggest corrective actions. The primary focus is on two kinds of gene-roles: (1) the chromosomal locations of genes, and (2) the biological processes in which genes play a role. Regarding chromosomal locations, the discrepancies revealed are striking and systematic, suggesting a structurally common origin. In regard to the biological processes, difficulties arise because genes frequently play roles in multiple processes, and processes may have many designations (such as synonymous terms). Our algorithms make use of the roles defined in the NCIT Biological Process hierarchy to uncover many probable gene-role errors in the NCIT. These results show that automated comparative auditing is a promising technique that can identify a large number of probable errors and corrections for them in a terminological genomic knowledge repository, thus facilitating its overall maintenance. PMID:18486558
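
    The comparative-audit idea (fetch the same attribute from both knowledge sources and flag disagreements as probable errors) can be sketched as follows; the gene symbols and locations below are hypothetical placeholders, and the real system crawls NCIT and Entrez Gene rather than reading in-memory dictionaries.

        # Minimal comparative audit: flag genes whose chromosomal location
        # differs between two sources. All entries here are placeholders.
        ncit_locations = {"GENE_A": "7p11.2", "GENE_B": "17q21", "GENE_C": "3p25"}
        entrez_locations = {"GENE_A": "7p11.2", "GENE_B": "17q12", "GENE_D": "Xq28"}

        def audit(ncit, entrez):
            for gene in sorted(ncit.keys() & entrez.keys()):
                if ncit[gene] != entrez[gene]:
                    print(f"{gene}: probable error, "
                          f"{ncit[gene]!r} (NCIT) vs {entrez[gene]!r} (Entrez)")
            for gene in sorted(ncit.keys() - entrez.keys()):
                print(f"{gene}: present only in NCIT")

        audit(ncit_locations, entrez_locations)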

  10. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  11. On-Site School Library Automation: Automation Anywhere with Laptops.

    Science.gov (United States)

    Gunn, Holly; Oxner, June

    2000-01-01

    Four years after the Halifax Regional School Board was formed through amalgamation, over 75% of its school libraries were automated. On-site automation with laptops was a quicker, more efficient way of automating than sending a shelf list to the Technical Services Department. The Eastern Shore School Library Automation Project was a successful…

  12. Diagnostic Ability of Automated Pupillography in Glaucoma.

    Science.gov (United States)

    Rao, Harsha L; Kadambi, Sujatha V; Mehta, Pooja; Dasari, Srilakshmi; Puttaiah, Narendra K; Pradhan, Zia S; Rao, Dhanraj A S; Shetty, Rohit

    2017-05-01

    To evaluate the diagnostic ability of automated pupillography measurements in glaucoma and study the effect of inter-eye asymmetry in glaucomatous damage on the diagnostic ability. In an observational, cross-sectional study, 47 glaucoma patients and 42 control subjects underwent automated pupillography using a commercially available device. Diagnostic abilities of the pupillary response measurements were evaluated using areas under receiver operating characteristic (ROC) curves (AUC) and sensitivities at fixed specificities. The influence of inter-eye asymmetry in glaucoma [inter-eye mean deviation (MD) difference on visual fields (VF)] on the diagnostic ability of pupillography parameters was evaluated by a ROC regression approach. The AUCs of automated pupillography parameters ranged from 0.60 (amplitude score with peripheral blue stimulus) to 0.82 (amplitude score with full field white stimulus, Amp-FF-W). Sensitivity at 95% specificity ranged between 5% (amplitude score with full field blue stimulus) and 45% (amplitude score with full field green stimulus). Inter-eye MD difference significantly affected the diagnostic performance of automated pupillography parameters: the performance of these pupillography measurements in detecting glaucoma increased significantly with greater inter-eye asymmetry in the glaucomatous damage.
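
    The AUC figures quoted above can be reproduced in form (though not in value) from case/control labels and a continuous measurement; the sketch below assumes scikit-learn and synthetic scores, not the study's pupillography data.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        # Hypothetical pupil-response amplitude scores: 42 controls and 47
        # glaucoma patients, with patients showing lower (worse) amplitudes.
        controls = rng.normal(loc=1.0, scale=0.3, size=42)
        patients = rng.normal(loc=0.6, scale=0.3, size=47)

        labels = np.r_[np.zeros_like(controls), np.ones_like(patients)]
        scores = np.r_[controls, patients]
        # Lower amplitudes indicate disease, so negate the scores before scoring.
        auc = roc_auc_score(labels, -scores)
        print(f"AUC = {auc:.2f}")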

  13. Enhancing Cooperative Loan Scheme Through Automated Loan ...

    African Journals Online (AJOL)

    2013-03-01

    The concept of automation has been variously applied in most computing fields. It involves the use of computing or electronic devices to undertake tasks that would otherwise be handled by people, and it is a pertinent factor in a profitable and soundly run financial institution. Financial transactions ...

  14. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  15. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  16. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, in which the stimulus size is modulated during testing based on stimulus intensity, with that of conventional standard automated perimetry with the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with the Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between the two modalities. Global indices and point-wise threshold values obtained with the two modalities were moderately to strongly correlated, but the visual-field defect size was smaller on size modulation standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response rate, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry, and the test duration differed between the two modalities (p = 0.02). Global indices and the point-wise threshold values of the two testing modalities correlated well. However, a large stimulus presented at an area of decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  17. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... more surgeries depending on the extent of the repair needed. ...

  18. Corrective Jaw Surgery

    Medline Plus

    Full Text Available Corrective jaw, or orthognathic, surgery is performed by ... your treatment. Correction of Common Dentofacial Deformities. The information provided here is not intended as a substitute ...

  19. NWS Corrections to Observations

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Form B-14 is the National Weather Service form entitled 'Notice of Corrections to Weather Records.' The forms are used to make corrections to observations on forms...

  20. LandingNav: Terrain Guided Automated Precision Landing, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The Phase I effort successfully demonstrated the feasibility of a terrain guided automated precision landing sensor using an innovative multi-field-of-view stereo...

  1. Teaching Politically Correct Language

    Science.gov (United States)

    Tsehelska, Maryna

    2006-01-01

    This article argues that teaching politically correct language to English learners provides them with important information and opportunities to be exposed to cultural issues. The author offers a brief review of how political correctness became an issue and how being politically correct influences the use of language. The article then presents…

  2. Precision corrections and supersymmetric unification

    Science.gov (United States)

    Matchev, Konstantin Tzvetanov

    1998-07-01

    In this thesis we compute a full set of one-loop corrections to the masses and couplings in the minimal supersymmetric standard model and study their implications in various precision analyses: (1) We use the weak-scale gauge and Yukawa threshold corrections, including the non-logarithmic terms, in a complete next-to-leading order analysis of gauge and Yukawa coupling unification, both for the case of the minimal supergravity and gauge-mediated models. We then examine the effects of unification-scale threshold corrections in the minimal and missing-doublet SU(5) models. (2) We show the generic size of the one-loop mass corrections to the supersymmetric spectrum and provide a set of compact approximations which hold over the unified parameter space of the supergravity models. (3) We compute the superpartner spectrum across the entire parameter space of the gauge-mediated models, comparing it to that of the minimal supergravity model. We delineate the regions where the lightest neutralino or tau slepton is the next-to-lightest supersymmetric particle, and compute its lifetime and various branching ratios. (4) We make a classification of the tree-level mass sum rules derived in the supergravity and gauge-mediated unification models, and study their stability against radiative corrections. (5) We calculate the leading order QCD correction to K-K̄ mixing within a general supersymmetric model. Using an effective field theory language, we construct ΔS = 2 effective Lagrangians for different hierarchies of the gluino and the first two generation squark masses. For each case, we show the size of the corrections and find that they usually modify previous bounds on intergenerational squark mass mixing by more than a factor of two.

  3. Automation Interface Design Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Our research makes its contributions at two levels. At one level, we addressed the problems of interaction between humans and computers/automation in a particular...

  4. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  5. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

    This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  6. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  7. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the need for careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry drove the huge diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on technological development is also considered.

  8. Automated generation of lattice QCD Feynman rules

    International Nuclear Information System (INIS)

    Hart, A.; Mueller, E.H.; Horgan, R.R.

    2009-04-01

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  9. Automated generation of lattice QCD Feynman rules

    Energy Technology Data Exchange (ETDEWEB)

    Hart, A.; Mueller, E.H. [Edinburgh Univ. (United Kingdom). SUPA School of Physics and Astronomy; von Hippel, G.M. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Horgan, R.R. [Cambridge Univ. (United Kingdom). DAMTP, CMS

    2009-04-15

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  10. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  11. Automated ISMS control auditability

    OpenAIRE

    Suomu, Mikko

    2015-01-01

    This thesis focuses on researching a possible reference model for automated ISMS’s (Information Security Management System) technical control auditability. The main objective was to develop a generic framework for automated compliance status monitoring of the ISO27001:2013 standard which could be re‐used in any ISMS system. The framework was tested with Proof of Concept (PoC) empirical research in a test infrastructure which simulates the framework target deployment environment. To fulfi...

  12. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades has been a time of major changes in marketing. Digitalization has become a permanent part of marketing and at the same time enabled efficient collection of data. Personalization and customization of content are playing a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation more information of the customers is gathered ...

  13. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  14. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    Full Text Available The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  15. Automated ship image acquisition

    Science.gov (United States)

    Hammond, T. R.

    2008-04-01

    The experimental Automated Ship Image Acquisition System (ASIA) collects high-resolution ship photographs at a shore-based laboratory, with minimal human intervention. The system uses Automatic Identification System (AIS) data to direct a high-resolution SLR digital camera to ship targets and to identify the ships in the resulting photographs. The photo database is then searchable using the rich data fields from AIS, which include the name, type, call sign and various vessel identification numbers. The high-resolution images from ASIA are intended to provide information that can corroborate AIS reports (e.g., extract identification from the name on the hull) or provide information that has been omitted from the AIS reports (e.g., missing or incorrect hull dimensions, cargo, etc). Once assembled into a searchable image database, the images can be used for a wide variety of marine safety and security applications. This paper documents the author's experience with the practicality of composing photographs based on AIS reports alone, describing a number of ways in which this can go wrong, from errors in the AIS reports, to fixed and mobile obstructions and multiple ships in the shot. The frequency with which various errors occurred in automatically-composed photographs collected in Halifax harbour in winter time were determined by manual examination of the images. 45% of the images examined were considered of a quality sufficient to read identification markings, numbers and text off the entire ship. One of the main technical challenges for ASIA lies in automatically differentiating good and bad photographs, so that few bad ones would be shown to human users. Initial attempts at automatic photo rating showed 75% agreement with manual assessments.

  16. Determination of the correction factor for attenuation, dispersion and production of electrons (Kwall) in the wall of graphite of a ionization chamber Pattern National Type CC01 in fields of gamma radiation of 60Co

    International Nuclear Information System (INIS)

    Alvarez R, J.T.; Morales P, J.; Cruz E, P.

    2001-12-01

    The Kwall correction factor for the graphite wall of the national standard chamber type CC01, series 133, was determined for a 60Co gamma radiation field. To this end, the ionization currents I(x) were measured as a function of the chamber wall thickness: x = 4, 8, 12, 16 and 20 mm. The measurements for each thickness consisted of three groups of n = 30 or 60 data points per group; eight complete, independent sets of measurements were obtained on eight different dates. The factor was determined using three regression models: linear, logarithmic and quadratic. Attempts were made to validate these models with the following tests: i) Shapiro-Wilk and χ² tests for normality of the input data; ii) Bartlett's test for homogeneity of variances among the groups for each thickness; iii) Duncan's test for the means among the groups of each thickness; and iv) lack-of-fit (LOF) tests for the models used. However, only the models for the measurement sets of 01-03-2000 and 17-08-2001 could be validated by LOF, and not by the tests of normality and homogeneity of variances. Among the assignable causes of variation: i) the values of the influence quantities (pressure, temperature and relative humidity) captured by the measurement system did not correspond to those present at the moment the currents I(x) were recorded; ii) the measurement room exhibited air currents, so its volume was reduced and the air flows were eliminated; iii) a measurement protocol was established, consisting of pre-irradiating the chamber for 5 minutes after each change of polarity and build-up cap, with a stabilization period of 5 minutes after pre-irradiation, and pre-irradiating for 5 minutes before taking readings, in order to eliminate sources of variation attributable to leakage currents or transients; and iv) corrections for relative humidity were made in accordance with the
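
    The regression step described above fits the ionization current I(x) against wall thickness; a common way to obtain a wall correction is to extrapolate the fitted line to zero wall thickness. The sketch below is a generic illustration with invented currents, and the choice of reference thickness in the final ratio is an assumption, not the laboratory's documented procedure.

        import numpy as np

        # Wall thicknesses (mm) and invented ionization currents (pA).
        x = np.array([4.0, 8.0, 12.0, 16.0, 20.0])
        i_meas = np.array([100.0, 98.1, 96.2, 94.4, 92.5])

        # Linear model I(x) = a*x + b; the intercept b estimates I(0),
        # the current an ideal wall-less chamber would collect.
        a, b = np.polyfit(x, i_meas, deg=1)
        k_wall = b / i_meas[0]  # ratio to the 4 mm reading (assumed reference)
        print(f"I(0) = {b:.2f} pA, k_wall = {k_wall:.4f}")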

  17. Automated rapid chemistry in heavy element research

    International Nuclear Information System (INIS)

    Schaedel, M.

    1994-01-01

    With the increasingly short half-lives of the heavy element isotopes in the transition region from the heaviest actinides to the transactinide elements the demand for automated rapid chemistry techniques is also increasing. Separation times of significantly less than one minute, high chemical yields, high repetition rates, and an adequate detection system are prerequisites for many successful experiments in this field. The development of techniques for separations in the gas phase and in the aqueous phase for applications of chemical or nuclear studies of the heaviest elements are briefly outlined. Typical examples of results obtained with automated techniques are presented for studies up to element 105, especially those obtained with the Automated Rapid Chemistry Apparatus, ARCA. The prospects to investigate the properties of even heavier elements with chemical techniques are discussed

  18. Future of Automated Insulin Delivery Systems.

    Science.gov (United States)

    Castle, Jessica R; DeVries, J Hans; Kovatchev, Boris

    2017-06-01

    Advances in continuous glucose monitoring (CGM) have brought on a paradigm shift in the management of type 1 diabetes. These advances have enabled the automation of insulin delivery, where an algorithm determines the insulin delivery rate in response to the CGM values. There are multiple automated insulin delivery (AID) systems in development. A system that automates basal insulin delivery has already received Food and Drug Administration approval, and more systems are likely to follow. As the field of AID matures, future systems may incorporate additional hormones and/or multiple inputs, such as activity level. All AID systems are impacted by CGM accuracy and future CGM devices must be shown to be sufficiently accurate to be safely incorporated into AID. In this article, we summarize recent achievements in AID development, with a special emphasis on CGM sensor performance, and discuss the future of AID systems from the point of view of their input-output characteristics, form factor, and adaptability.

  19. Automatic Contextual Text Correction Using The Linguistic Habits Graph Lhg

    Directory of Open Access Journals (Sweden)

    Marcin Gadamer

    2009-01-01

    Full Text Available Automatic text correction is an essential feature of today's text processors and editors. This paper introduces a novel algorithm for the automation of contextual text correction using a Linguistic Habit Graph (LHG), also introduced in this paper. A specialist internet crawler has been constructed for searching through web sites in order to build a Linguistic Habit Graph from text corpora gathered from Polish web sites. The correction results achieved with this LHG-based algorithm were compared with commercial programs which also enable text correction: Microsoft Word 2007, Open Office Writer 3.0 and the search engine Google. The achieved results of text correction were much better than the corrections made by these commercial tools.
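
    The abstract does not spell out the LHG algorithm, but the underlying idea (score candidate words by how often they follow their left neighbour in a corpus) can be sketched with a simple bigram graph; the toy corpus and candidates below stand in for the crawled web data.

        from collections import defaultdict

        def build_habit_graph(corpus):
            """Directed bigram counts: edge (w1 -> w2) weighted by co-occurrence."""
            graph = defaultdict(lambda: defaultdict(int))
            words = corpus.lower().split()
            for w1, w2 in zip(words, words[1:]):
                graph[w1][w2] += 1
            return graph

        def rank_candidates(graph, prev_word, candidates):
            """Prefer the candidate seen most often after prev_word."""
            return max(candidates, key=lambda w: graph[prev_word].get(w, 0))

        corpus = ("the cat sat on the mat . the cat ate the fish . "
                  "the dog sat on the rug")
        graph = build_habit_graph(corpus)
        # Contextual choice between a correct and a mistyped word after 'cat':
        print(rank_candidates(graph, "cat", ["sat", "sit"]))  # -> 'sat'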

  20. Quantum gravitational corrections for spinning particles

    International Nuclear Information System (INIS)

    Fröb, Markus B.

    2016-01-01

    We calculate the quantum corrections to the gauge-invariant gravitational potentials of spinning particles in flat space, induced by loops of both massive and massless matter fields of various types. While the corrections to the Newtonian potential induced by massless conformal matter for spinless particles are well known, and the same corrections due to massless minimally coupled scalars http://dx.doi.org/10.1088/0264-9381/27/24/245008, massless non-conformal scalars http://dx.doi.org/10.1103/PhysRevD.87.104027 and massive scalars, fermions and vector bosons http://dx.doi.org/10.1103/PhysRevD.91.064047 have been recently derived, spinning particles receive additional corrections which are the subject of the present work. We give both fully analytic results valid for all distances from the particle, and present numerical results as well as asymptotic expansions. At large distances from the particle, the corrections due to massive fields are exponentially suppressed in comparison to the corrections from massless fields, as one would expect. However, a surprising result of our analysis is that close to the particle itself, on distances comparable to the Compton wavelength of the massive fields running in the loops, these corrections can be enhanced with respect to the massless case.

  1. Design of Correction Coil for ITER

    International Nuclear Information System (INIS)

    Kubo, Hiroatsu; Yoshida, Kiyoshi; Omine, Takeshi

    1998-11-01

    ITER (International Thermonuclear Experimental Reactor) is a joint project of the EU, Japan, Russia and the US. In ITER, the magnetic fields that confine the plasma are applied by superconducting coils. The coils called poloidal field coils (PF coils) are installed to control the position and cross-sectional shape of the plasma in the vacuum vessel. Magnetic field errors arise from the manufacturing tolerances of the PF coils. Coils called correction coils are installed around the PF coils to correct these magnetic errors. The correction coils consist of three sets of superconducting coils. A stress analysis of the correction coils was performed and the supporting structure of the coils was designed. The bolts for the clamps and the positions of the clamps were determined from this analysis. (J.P.N.)

  2. A precise technique for manufacturing correction coil

    International Nuclear Information System (INIS)

    Schieber, L.

    1992-01-01

    An automated method of manufacturing correction coils has been developed which provides a precise embodiment of the coil design. Numerically controlled machines have been developed to accurately position coil windings on the beam tube. Two types of machines have been built. One machine bonds the wire to a substrate which is wrapped around the beam tube after it is completed while the second machine bonds the wire directly to the beam tube. Both machines use the Multiwire® technique of bonding the wire to the substrate utilizing an ultrasonic stylus. These machines are being used to manufacture coils for both the SSC and RHIC

  3. Practical automation for mature producing areas

    International Nuclear Information System (INIS)

    Luppens, J.C.

    1995-01-01

    Successful installation and operation of supervisory control and data acquisition (SCADA) systems on two US gulf coast platforms prompted the installation of the first SCADA, or automation, system in Oklahoma in 1989. The initial installation consisted of four remote terminal units (RTU's) at four beam-pumped leases and a PC-based control system communicating by means of a 900-MHz data repeater. This first installation was a building block for additional wells to be automated, and additional systems, consisting of RTU's, a PC, and a data repeater, were then installed. By the end of 1992 there were 98 RTU's operating on five separate systems, and additional RTU's are being installed on a regular basis. This paper outlines the logical development of automation systems on properties in Oklahoma operated by Phillips Petroleum Co. The factors critical to the success of the effort are (1) designing data-gathering and control capability in conjunction with the field operations staff to meet and not exceed their needs; (2) selection of a computer operating system and automation software package; (3) selection of computer, RTU, and end-device hardware; and (4) continuous involvement of the field operations staff in the installation, operation, and maintenance of the systems. Additionally, specific tangible and intangible results are discussed

  4. The repeatability of automated and clinician refraction.

    Science.gov (United States)

    Bullimore, M A; Fusaro, R E; Adams, C W

    1998-08-01

    Auto-refractors are used as a starting point for clinicians' refractions and in studies of refractive error. We investigated the repeatability of the Hoya AR-570 and clinician refraction. Eighty-six subjects, aged 11 to 60 years, were recruited by mailing inquiries to 500 randomly selected patients who had received recent examinations at the University of California Optometric Eye Center. Contact lens wearers, patients with best corrected visual acuity worse than 20/30 in either eye, and patients with a history of diabetes were excluded. Each subject was examined by two clinicians during one visit. The first clinician obtained five auto-refractor readings for each eye (which were later averaged), performed a balanced subjective refraction (with spherical masking lenses in the phoropter), and repeated the automated refractor measurements. This protocol was then repeated by the second clinician. Clinicians were randomized with regard to testing order and masked to automated refractor results, each other's refractions, and previous spectacle prescriptions. To quantify repeatability, we used mixed model analyses of variance to estimate the appropriate variance components while accounting for the correlation among, for example, repeated measurements of the same eye. Astigmatic data were analyzed by converting into Fourier form: two cross-cylinders at axis 0 degrees (J0) and axis 45 degrees (J45). For mean spherical equivalent, the average difference between five averaged automated refractor readings, taken by two different optometrists, was +0.02 D (95% limits of agreement = -0.36 to +0.40 D). The average difference between the two optometrists' subjective refractions was -0.12 D (95% limits of agreement = -0.90 to +0.65 D). The 95% limits of agreement for the automated refractor were about half those of the clinician for both astigmatic terms (J0 and J45) and for all comparisons. Automated refraction is more repeatable than subjective refraction and therefore more
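
    Converting astigmatic data "into Fourier form" as described above corresponds to the standard power-vector transformation M = S + C/2, J0 = -(C/2)cos 2α, J45 = -(C/2)sin 2α. The sketch below uses these textbook formulae and is not code from the study.

        import math

        def power_vector(sphere: float, cylinder: float, axis_deg: float):
            """Sphero-cylindrical refraction -> power-vector components
            (M, J0, J45) following the standard Fourier decomposition."""
            theta = math.radians(axis_deg)
            m = sphere + cylinder / 2.0                      # mean spherical equivalent
            j0 = -(cylinder / 2.0) * math.cos(2.0 * theta)   # cross-cylinder, axis 0
            j45 = -(cylinder / 2.0) * math.sin(2.0 * theta)  # cross-cylinder, axis 45
            return m, j0, j45

        # Example: -2.00 DS / -1.00 DC x 90
        m, j0, j45 = power_vector(-2.0, -1.0, 90.0)
        print(f"M = {m:+.2f}, J0 = {j0:+.2f}, J45 = {j45:+.2f}")
        # -> M = -2.50, J0 = -0.50, J45 = +0.00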

  5. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun

    2014-01-01

    Since automation was introduced in various industrial fields, it has been known to provide positive effects, such as greater efficiency and fewer human errors, alongside a negative effect known as the out-of-the-loop (OOTL) problem. Thus, before introducing automation in the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, focusing on CPS, an optimization method for finding an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate with the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduction in human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by experiment, and the automation rate is estimated by the suggested estimation method. This approach is expected to yield an appropriate proportion of automation that avoids the OOTL problem while achieving maximum efficacy.

  6. Applications of the soft computing in the automated history matching

    Energy Technology Data Exchange (ETDEWEB)

    Silva, P.C.; Maschio, C.; Schiozer, D.J. [Unicamp (Brazil)

    2006-07-01

    Reservoir management is a research field in petroleum engineering that optimizes reservoir performance based on environmental, political, economic and technological criteria. Reservoir simulation is based on geological models that simulate fluid flow. Models must be constantly corrected to yield the observed production behaviour. The process of history matching is controlled by the comparison of production data, well test data and measured data from simulations. Parametrization, objective function analysis, sensitivity analysis and uncertainty analysis are important steps in history matching. One of the main challenges facing automated history matching is to develop algorithms that find the optimal solution in multidimensional search spaces. Optimization algorithms can be either global optimizers that work with noisy multi-modal functions, or local optimizers that cannot work with noisy multi-modal functions. The problem with global optimizers is the very large number of function calls, which is an inconvenience due to the long reservoir simulation time. For that reason, techniques such as least squares, thin plate splines, kriging and artificial neural networks (ANN) have been used as substitutes for reservoir simulators. This paper described the use of optimization algorithms to find the optimal solution in automated history matching. Several ANNs were used, including the generalized regression neural network, a fuzzy system with subtractive clustering and a radial basis network. The UNIPAR soft computing method was used along with a modified Hooke-Jeeves optimization method. Two case studies with synthetic and real reservoirs are examined. It was concluded that the combination of global and local optimization has the potential to improve the history matching process and that the use of substitute models can reduce computational efforts. 15 refs., 11 figs.
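
    The Hooke-Jeeves method mentioned above is a derivative-free pattern search. The sketch below implements a plain, unmodified version on a stand-in quadratic misfit rather than a real reservoir-simulation objective; the optimum location is invented for the example.

        def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
            """Plain Hooke-Jeeves pattern search (unconstrained minimization)."""
            def explore(base, s):
                x = list(base)
                for i in range(len(x)):
                    for delta in (s, -s):
                        trial = list(x)
                        trial[i] += delta
                        if f(trial) < f(x):
                            x = trial
                            break
                return x

            best = list(x0)
            for _ in range(max_iter):
                new = explore(best, step)
                if f(new) < f(best):
                    # Pattern move: extrapolate along the successful direction.
                    pattern = [2.0 * n - b for n, b in zip(new, best)]
                    best = new
                    trial = explore(pattern, step)
                    if f(trial) < f(best):
                        best = trial
                else:
                    step *= shrink  # no improvement: refine the step size
                    if step < tol:
                        break
            return best

        # Stand-in misfit: squared distance to a hypothetical optimum (3, -2).
        misfit = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 2.0) ** 2
        print(hooke_jeeves(misfit, [0.0, 0.0]))  # converges near [3.0, -2.0]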

  7. Automating the CMS DAQ

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  8. Control and automation systems

    International Nuclear Information System (INIS)

    Schmidt, R.; Zillich, H.

    1986-01-01

    A survey is given of the development of control and automation systems for energy uses. General remarks about control and automation schemes are followed by a description of modern process control systems along with process control processes as such. After discussing the particular process control requirements of nuclear power plants the paper deals with the reliability and availability of process control systems and refers to computerized simulation processes. The subsequent paragraphs are dedicated to descriptions of the operating floor, ergonomic conditions, existing systems, flue gas desulfurization systems, the electromagnetic influences on digital circuits as well as of light wave uses. (HAG) [de]

  9. Automated nuclear materials accounting

    International Nuclear Information System (INIS)

    Pacak, P.; Moravec, J.

    1982-01-01

    An automated state system of accounting for nuclear materials data was established in Czechoslovakia in 1979. A file was compiled of 12 programs in the PL/1 language. The file is divided into four groups according to logical associations, namely programs for data input and checking, programs for handling the basic data file, programs for report outputs in the form of worksheets and magnetic tape records, and programs for book inventory listing, document inventory handling and materials balance listing. A similar automated system of nuclear fuel inventory for a light water reactor was introduced for internal purposes in the Institute of Nuclear Research (UJV). (H.S.)

  10. Robotic Tool Changer For Automated Welding

    Science.gov (United States)

    Gilbert, Jeffrey L.; Spencer, Carl N.

    1994-01-01

    Prototype robotic tool changer for automated welding system eliminates need for most manual tool setups and attendant problems: operates rapidly, always chooses designated tool, maneuvers tip of welding torch or other tool in correct position, and reliably connects water, gas, welding wire, high-voltage electrical signals, and ground. Also loads tools other than those for welding. Intended for use in robotic work cell producing all good parts, no rejects. In production, robot welds part, tests for flaws, and reworks as necessary before releasing it.

  11. AUTOSIM: An automated repetitive software testing tool

    Science.gov (United States)

    Dunham, J. R.; Mcbride, S. E.

    1985-01-01

    AUTOSIM is a software tool which automates the repetitive run testing of software. This tool executes programming tasks previously performed by a programmer with one year of programming experience. Use of the AUTOSIM tool requires a knowledge base containing information about known faults, code fixes, and the fault diagnosis-correction process. AUTOSIM can be considered as an expert system which replaces a low level of programming expertise. Reference information about the design and implementation of the AUTOSIM software test tool provides flowcharts to assist in maintaining the software code and a description of how to use the tool.

  12. System for Automated Calibration of Vector Modulators

    Science.gov (United States)

    Lux, James; Boas, Amy; Li, Samuel

    2009-01-01

    Vector modulators are used to impose baseband modulation on RF signals, but non-ideal behavior limits the overall performance. The non-ideal behavior of the vector modulator is compensated using data collected with the use of an automated test system driven by a LabVIEW program that systematically applies thousands of control-signal values to the device under test and collects RF measurement data. The technology innovation automates several steps in the process. First, an automated test system, using computer-controlled digital-to-analog converters (DACs) and a computer-controlled vector network analyzer (VNA), can systematically apply different I and Q signals (which represent the complex number by which the RF signal is multiplied) to the vector modulator under test (VMUT), while measuring the RF performance, specifically gain and phase. The automated test system uses the LabVIEW software to control the test equipment, collect the data, and write it to a file. The input to the LabVIEW program is either user input for systematic variation, or a file containing the specific test values that should be fed to the VMUT. The output file contains both the control signals and the measured data. The second step is to post-process the file to determine the correction functions as needed. The result of the entire process is a tabular representation, which allows translation of a desired I/Q value to the required analog control signals to produce a particular RF behavior. In some applications, corrected performance is needed only for a limited range. If the vector modulator is being used as a phase shifter, there is only a need to correct I and Q values that represent points on a circle, not the entire plane. This innovation has been used to calibrate 2-GHz MMIC (monolithic microwave integrated circuit) vector modulators in the High EIRP Cluster Array project (EIRP is high effective isotropic radiated power). These calibrations were then used to create
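    A minimal sketch of the tabular inversion step described above, assuming synthetic sweep data in place of real VNA measurements (the toy gain model, control grid and target amplitude are illustrative, not project values):

```python
import numpy as np

# Pretend calibration sweep: control values on a grid, with a toy complex
# gain response standing in for VNA measurements of a non-ideal modulator.
i_ctl, q_ctl = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
measured = (0.90 * i_ctl + 0.02) + 1j * (1.05 * q_ctl - 0.03)

controls = np.column_stack([i_ctl.ravel(), q_ctl.ravel()])
gains = measured.ravel()

def controls_for(target_gain):
    """Invert the calibration table: the nearest measured gain wins."""
    idx = np.argmin(np.abs(gains - target_gain))
    return controls[idx]

# Phase-shifter use case: only points on a circle need correction.
for phase_deg in (0, 90, 180, 270):
    target = 0.5 * np.exp(1j * np.deg2rad(phase_deg))
    print(phase_deg, controls_for(target))
```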

  13. Altering user' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  14. Idaho: Library Automation and Connectivity.

    Science.gov (United States)

    Bolles, Charles

    1996-01-01

    Provides an overview of the development of cooperative library automation and connectivity in Idaho, including telecommunications capacity, library networks, the Internet, and the role of the state library. Information on six shared automation systems in Idaho is included. (LRW)

  15. Automated registration of multispectral MR vessel wall images of the carotid artery

    Energy Technology Data Exchange (ETDEWEB)

    Klooster, R. van 't; Staring, M.; Reiber, J. H. C.; Lelieveldt, B. P. F.; Geest, R. J. van der, E-mail: rvdgeest@lumc.nl [Department of Radiology, Division of Image Processing, Leiden University Medical Center, 2300 RC Leiden (Netherlands); Klein, S. [Department of Radiology and Department of Medical Informatics, Biomedical Imaging Group Rotterdam, Erasmus MC, Rotterdam 3015 GE (Netherlands); Kwee, R. M.; Kooi, M. E. [Department of Radiology, Cardiovascular Research Institute Maastricht, Maastricht University Medical Center, Maastricht 6202 AZ (Netherlands)

    2013-12-15

    Purpose: Atherosclerosis is the primary cause of heart disease and stroke. The detailed assessment of atherosclerosis of the carotid artery requires high-resolution imaging of the vessel wall using multiple MR sequences with different contrast weightings. These images allow manual or automated classification of plaque components inside the vessel wall. Automated classification requires all sequences to be in alignment, which is hampered by patient motion. In clinical practice, correction of this motion is performed manually. Previous studies applied automated image registration to correct for motion using only nondeformable transformation models and did not perform a detailed quantitative validation. The purpose of this study is to develop an automated, accurate 3D registration method, and to extensively validate this method on a large set of patient data. In addition, the authors quantified patient motion during scanning to investigate the need for correction. Methods: MR imaging studies (1.5T, dedicated carotid surface coil, Philips) from 55 TIA/stroke patients with ipsilateral <70% carotid artery stenosis were randomly selected from a larger cohort. Five MR pulse sequences were acquired around the carotid bifurcation, each containing nine transverse slices: T1-weighted turbo field echo, time-of-flight, T2-weighted turbo spin-echo, and pre- and postcontrast T1-weighted turbo spin-echo images (T1W TSE). The images were manually segmented by delineating the lumen contour in each vessel wall sequence and were manually aligned by applying through-plane and in-plane translations to the images. To find the optimal automatic image registration method, different masks, choices of the fixed image, different types of the mutual information image similarity metric, and transformation models, including 3D deformable transformation models, were evaluated. Evaluation of the automatic registration results was performed by comparing the lumen segmentations of the fixed image and
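    The components evaluated above (mutual-information metrics, rigid through deformable transforms) are generic registration building blocks; as a hedged sketch, not the authors' actual pipeline, a rigid mutual-information alignment of two such sequences could look as follows in SimpleITK (file names are illustrative):

```python
import SimpleITK as sitk

# Fixed and moving vessel-wall sequences (illustrative file names).
fixed = sitk.ReadImage("t1w_tse.mha", sitk.sitkFloat32)
moving = sitk.ReadImage("tof.mha", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=32)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.2)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)

# Rigid 3D starting point: translations dominate inter-sequence motion.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(fixed, moving)
aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(aligned, "tof_aligned.mha")
```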

  16. Automation of finite element methods

    CERN Document Server

    Korelc, Jože

    2016-01-01

    New finite elements are needed in research as well as in industrial environments for the development of virtual prediction techniques. The design and implementation of novel finite elements for specific purposes is a tedious and time-consuming task, especially for nonlinear formulations. Automating this process can speed it up considerably, since the generation of the final computer code can be accelerated by several orders of magnitude. This book provides the reader with the knowledge required to employ modern automatic tools like AceGen within solid mechanics in a successful way. It covers the range from the theoretical background and algorithmic treatments to many different applications. The book is written for advanced students in the engineering field and for researchers in educational and industrial environments.

  17. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...

  18. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system will detect any human head that appears in the side mirrors. The detected head will be tracked and recorded for further action.
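    As an illustration of the background-subtraction and blob stages named above, here is a generic OpenCV sketch under assumed parameters; it is not the paper's implementation, and the video file name, thresholds and blob size cut are hypothetical:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("side_mirror.avi")       # illustrative input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
kernel = np.ones((3, 3), np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # MOG2 marks shadows as 127; keep only confident foreground pixels.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # Blob extraction via connected components on the foreground mask.
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):                       # label 0 is the background
        x, y, w, h, area = stats[i]
        if area > 300:                          # drop small noise blobs
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(30) & 0xFF == 27:            # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```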

  19. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  20. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  1. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    computer that can switch between predefined settings. In some cases the computer can be controlled remotely over the internet, so that the status of the home can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classics within home automation, additional functionality has appeared...

  2. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Myths in the automation of software testing are an issue of discussion that echoes around the service areas of software validation. Probably the first thought to appear in a knowledgeable reader's mind would be: why this old topic again? What is new to discuss on the matter? But everyone agrees that automation testing today is not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks to facilitate the testing of applications developed on various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in people's perspective on and knowledge of automation has altered the terrain. This article reflects the author's point of view and experience regarding the transformation of the original myths into new versions and how they are derived; it also provides his thoughts on the new generation of myths.

  3. Addressing Correctional Officer Stress: Programs and Strategies. Issues and Practices.

    Science.gov (United States)

    Finn, Peter

    A review of the literature and interviews with over 50 people in the field revealed that job-related stress is widespread and possibly increasing among correctional officers. This publication is intended to help correctional administrators develop an effective program for preventing and treating correctional officers' stress. A variety of…

  4. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  5. Holographic bulk reconstruction with α' corrections

    Science.gov (United States)

    Roy, Shubho R.; Sarkar, Debajyoti

    2017-10-01

    We outline a holographic recipe to reconstruct α' corrections to anti-de Sitter (AdS) (quantum) gravity from an underlying CFT in the strictly planar limit (N → ∞). Assuming that the boundary CFT can be solved in principle to all orders of the 't Hooft coupling λ, for scalar primary operators, the λ⁻¹ expansion of the conformal dimensions can be mapped to higher curvature corrections of the dual bulk scalar field action. Furthermore, for the metric perturbations in the bulk, the AdS/CFT operator-field isomorphism forces these corrections to be of the Lovelock type. We demonstrate this by reconstructing the coefficient of the leading Lovelock correction, also known as the Gauss-Bonnet term, in a bulk AdS gravity action using the expression of the stress-tensor two-point function up to subleading order in λ⁻¹.
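    For reference, the leading Lovelock correction mentioned above enters the bulk action in the standard Gauss-Bonnet form; the following is a sketch in conventional notation, with the overall normalization of the coefficient being convention-dependent:

```latex
% Leading Lovelock (Gauss-Bonnet) correction to the bulk gravity action;
% \lambda_{\rm GB} is the coefficient reconstructed from the CFT
% stress-tensor two-point function (normalizations vary by convention):
S = \frac{1}{16\pi G_N}\int\! d^{d+1}x\,\sqrt{-g}\,
    \Big[ R - 2\Lambda + \alpha'\lambda_{\rm GB}
    \big( R^2 - 4R_{\mu\nu}R^{\mu\nu}
          + R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma} \big) \Big]
```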

  6. Operator quantum error-correcting subsystems for self-correcting quantum memories

    International Nuclear Information System (INIS)

    Bacon, Dave

    2006-01-01

    The most general method for encoding quantum information is not to encode the information into a subspace of a Hilbert space, but to encode information into a subsystem of a Hilbert space. Recently this notion has led to a more general notion of quantum error correction known as operator quantum error correction. In standard quantum error-correcting codes, one requires the ability to apply a procedure which exactly reverses on the error-correcting subspace any correctable error. In contrast, for operator error-correcting subsystems, the correction procedure need not undo the error which has occurred; instead one must perform corrections only modulo the subsystem structure. This does not lead to codes which differ from subspace codes, but does lead to recovery routines which explicitly make use of the subsystem structure. Here we present two examples of such operator error-correcting subsystems. These examples are motivated by simple spatially local Hamiltonians on square and cubic lattices. In three dimensions we provide evidence, in the form of a simple mean-field theory, that our Hamiltonian gives rise to a system which is self-correcting. Such a system will be a natural high-temperature quantum memory, robust to noise without external intervening quantum error-correction procedures
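    The subsystem structure referred to above is conventionally written as a decomposition of the Hilbert space; the following is the standard operator-QEC notation, not specific to this paper's lattice examples:

```latex
% Standard operator-QEC (subsystem) decomposition: logical information is
% encoded only in factor A; errors acting on the gauge factor B need not
% be undone, and recovery is performed modulo the subsystem structure:
\mathcal{H} \;=\; (\mathcal{H}_A \otimes \mathcal{H}_B) \,\oplus\, \mathcal{K},
\qquad
\mathcal{R}\!\left(\mathcal{E}(\rho_A \otimes \rho_B)\right)
  \;=\; \rho_A \otimes \rho_B'
```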

  7. Evaluation of Machine Learning Methods for LHC Optics Measurements and Corrections Software

    CERN Document Server

    Henning, Peter

    The field of artificial intelligence is driven by the goal of providing machines with human-like intelligence. However, modern science is currently facing problems of such high complexity that they cannot be solved by humans on the same timescale as by machines, so there is a demand for the automation of complex tasks. Identifying the categories of tasks that can be performed by machines in the domain of optics measurements and corrections at the Large Hadron Collider (LHC) is one of the central research subjects of this thesis. The application of machine learning methods and concepts of artificial intelligence can be found in various industrial and scientific branches. In High Energy Physics these concepts are mostly used in the offline analysis of experimental data and to perform regression tasks. In Accelerator Physics the machine learning approach has not yet found wide application, so potential tasks for machine learning solutions can be specified in this domain. The appropriate methods and their suitability for...

  8. Comparison of automated and manual shielding block fabrication

    International Nuclear Information System (INIS)

    Weeks, K.J.; Fraass, B.A.; McShan, D.L.; Hardybala, S.S.; Hargreaves, E.A.; Lichter, A.S.

    1989-01-01

    This work reports the results of a study comparing computer-controlled and manual shielding block cutting. The general problems inherent in automated block cutting have been identified and minimized. A system whose accuracy is sufficient for clinical applications has been developed. The relative accuracy of our automated system versus experienced technician-controlled cutting was investigated. In general, it is found that automated cutting is somewhat faster and more accurate than manual cutting for very large fields, but that the reverse is true for most smaller fields. The relative cost-effectiveness of automated cutting depends on the percentage of computer-designed blocks generated in the clinical setting. At the present time, the traditional manual method is still favored

  9. Automated interferometric alignment system for paraboloidal mirrors

    Science.gov (United States)

    Maxey, L. Curtis

    1993-01-01

    A systematic method is described for interpreting the interference fringes obtained when using a corner cube retroreflector as an alignment aid while aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable to aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it is applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface, will produce a collimated wavefront like that obtained from the paraboloid when it is correctly aligned to a spherical wavefront.

  10. Phaser.MRage: automated molecular replacement.

    Science.gov (United States)

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J; Oeffner, Robert D; Adams, Paul D; Read, Randy J

    2013-11-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement.

  11. Phaser.MRage: automated molecular replacement

    International Nuclear Information System (INIS)

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.

    2013-01-01

    The functionality of the molecular-replacement pipeline phaser.MRage is introduced and illustrated with examples. Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement

  12. Beam divergence correction method for neutron resonance spin echo spectroscope

    International Nuclear Information System (INIS)

    Maruyama, Ryuji; Tasaki, Seiji; Hino, Masahiro; Kitaguchi, Masaaki; Kawabata, Yuji; Ebisawa, Toru

    2005-01-01

    A beam divergence correction method for a neutron resonance spin echo (NRSE) spectroscope was proposed and its effectiveness evaluated by simulation. When a beam divergence correction coil was introduced into the NRSE spectroscope and the optimum magnetic field applied, the visibility of the spin echo signal was recovered by controlling the spread of phase differences generated by beam divergence, which proves the effectiveness of the correction method. The principle of NRSE spectroscopy, the decrease of the spin polarization rate caused by beam divergence and its correction method, the structure of the divergence-angle correction coil with its magnetic field calculation, and the simulation results are described. (S.Y.)

  13. Foreign Currency Requirements Automated Data System,

    Science.gov (United States)

    1984-12-07

    specific data matrix for each of the six files as described in Attachments 2-7. C. Data entries will be right justified with unused leading positions left...not use low-order negative (no overzone/overpunch). Negative data fields should include a minus symbol immediately preceding the left-order digit. H...manpower costing ADS will generally follow the system development plan for the "OP-32 Automated Data System" (Report 1398-01-83-CR), General Research

  14. Process computers automate CERN power supply installations

    CERN Document Server

    Ullrich, H

    1974-01-01

    Computerized automation systems are being used at CERN, Geneva, to improve the capacity, operational reliability and flexibility of the power supply installations for main ring magnets in the experimental zones of particle accelerators. A detailed account of the technological problem involved is followed in the article by a description of the system configuration, the program system and field experience already gathered in similar schemes. (1 refs).

  15. Automated modelling of spatially-distributed glacier ice thickness and volume

    Science.gov (United States)

    James, William H. M.; Carrivick, Jonathan L.

    2016-07-01

    Ice thickness distribution and volume are both key parameters for glaciological and hydrological applications. This study presents VOLTA (Volume and Topography Automation), a Python script tool for ArcGIS™ that requires just a digital elevation model (DEM) and glacier outline(s) to model distributed ice thickness, volume and bed topography. Ice thickness is initially estimated at points along an automatically generated centreline network based on the perfect-plasticity rheology assumption, taking into account a valley-side drag component of the force balance equation. Distributed ice thickness is subsequently interpolated using a glaciologically correct algorithm. For five glaciers with independent field-measured bed topography, VOLTA-modelled volumes deviated from the field-derived values by between 26.5% (underestimate) and 16.6% (overestimate). Greatest differences were where an asymmetric valley cross-section shape was present or where significant valley infill had occurred. Compared with other methods of modelling ice thickness and volume, key advantages of VOLTA are: a fully automated approach and a user-friendly graphical user interface (GUI), GIS-consistent geometry, fully automated centreline generation, inclusion of a side drag component in the force balance equation, estimation of the basal shear stress for each individual glacier, fully distributed ice thickness output and the ability to process multiple glaciers rapidly. VOLTA is capable of regional-scale ice volume assessment, which is a key parameter for exploring glacier response to climate change. VOLTA also permits subtraction of modelled ice thickness from the input surface elevation to produce an ice-free DEM, which is a key input for reconstruction of former glaciers. VOLTA could assist with prediction of future glacier geometry changes and hence in projection of future meltwater fluxes.
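    The perfect-plasticity step lends itself to a compact sketch: under that assumption the basal shear stress satisfies τ_b = F ρ g h sin α, so centreline thickness follows as h = τ_b / (F ρ g sin α), with F a valley-side drag (shape) factor. The following is a generic illustration of that textbook relation, not VOLTA's actual code; the shear stress, shape factor and elevations are illustrative values.

```python
import numpy as np

RHO_ICE, G = 917.0, 9.81          # ice density (kg m^-3), gravity (m s^-2)

def centreline_thickness(surface_elev, spacing, tau_b=1.0e5,
                         shape_factor=0.8, min_slope=np.deg2rad(1.5)):
    """Ice thickness at centreline points from surface elevations (m)."""
    slope = np.arctan(np.abs(np.gradient(surface_elev, spacing)))
    slope = np.maximum(slope, min_slope)   # floor low slopes, as is common
    return tau_b / (shape_factor * RHO_ICE * G * np.sin(slope))

surface = np.array([3200., 3150., 3080., 3020., 2980., 2955.])  # m a.s.l.
print(centreline_thickness(surface, spacing=200.0))             # metres
```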

  16. Behavioral Cloning A Correction

    OpenAIRE

    Camacho, Rui; Michie, Donald

    1995-01-01

    We recently reported on the application of a machine-learning (ML) technique to automated flight control using a simulated F-16 combat plane (Michie and Camacho 1994). Subsequent tests of our data-induced flying model have broadly confirmed the reported results but have also identified a lack of robustness. We had underestimated the latter and now regard our report (Michie and Camacho 1994) as being, by omission, potentially misleading.

  17. Modern approaches to agent-based complex automated negotiation

    CERN Document Server

    Bai, Quan; Ito, Takayuki; Zhang, Minjie; Ren, Fenghui; Aydoğan, Reyhan; Hadfi, Rafik

    2017-01-01

    This book addresses several important aspects of complex automated negotiations and introduces a number of modern approaches for facilitating agents to conduct complex negotiations. It demonstrates that autonomous negotiation is one of the most important areas in the field of autonomous agents and multi-agent systems. Further, it presents complex automated negotiation scenarios that involve negotiation encounters that may have, for instance, a large number of agents, a large number of issues with strong interdependencies and/or real-time constraints.

  18. Leading gravitational corrections and a unified universe

    DEFF Research Database (Denmark)

    Codello, Alessandro; Jain, Rajeev Kumar

    2016-01-01

    Leading order gravitational corrections to the Einstein-Hilbert action can lead to a consistent picture of the universe by unifying the epochs of inflation and dark energy in a single framework. While the leading local correction induces an inflationary phase in the early universe, the leading nonlocal term leads to an accelerated expansion of the universe at the present epoch. We argue that both the leading UV and IR terms can be obtained within the framework of a covariant effective field theory of gravity. The perturbative gravitational corrections therefore provide a fundamental basis...

  19. Automated Robotic Liquid Handling Assembly of Modular DNA Devices.

    Science.gov (United States)

    Ortiz, Luis; Pavan, Marilene; McCarthy, Lloyd; Timmons, Joshua; Densmore, Douglas M

    2017-12-01

    Recent advances in modular DNA assembly techniques have enabled synthetic biologists to test significantly more of the available "design space" represented by "devices" created as combinations of individual genetic components. However, manual assembly of such large numbers of devices is time-intensive, error-prone, and costly. The increasing sophistication and scale of synthetic biology research necessitates an efficient, reproducible way to accommodate large-scale, complex, and high-throughput device construction. Here, a DNA assembly protocol using the Type IIS restriction endonuclease-based Modular Cloning (MoClo) technique is automated on two liquid-handling robotic platforms. Automated liquid-handling robots require careful, oftentimes tedious optimization of pipetting parameters for liquids of different viscosities (e.g. enzymes, DNA, water, buffers), as well as explicit programming to ensure correct aspiration and dispensing of DNA parts and reagents. This makes manual script writing for complex assemblies just as problematic as manual DNA assembly, and necessitates a software tool that can automate script generation. To this end, we have developed a web-based software tool, http://mocloassembly.com, for generating combinatorial DNA device libraries from basic DNA parts uploaded as GenBank files. We provide access to the tool, and an export file from our liquid-handler software which includes optimized liquid classes, labware parameters, and deck layout. All DNA parts used are available through Addgene, and their digital maps can be accessed via the Boston University BDC ICE Registry. Together, these elements provide a foundation for other organizations to automate modular cloning experiments and similar protocols. The automated DNA assembly workflow presented here enables the repeatable, automated, high-throughput production of DNA devices, and reduces the risk of human error arising from repetitive manual pipetting. Sequencing data show the automated DNA
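    A toy sketch of the script-generation idea described above: enumerate a combinatorial library as the Cartesian product of part bins and emit one pipetting line per part per reaction. The part names, plate layout and volumes are illustrative assumptions, and this is not the output format of the mocloassembly.com tool.

```python
import csv
import itertools

# Part bins: every promoter/CDS/terminator combination becomes one device.
parts = {
    "promoter":   ["pJ23100", "pJ23106"],
    "cds":        ["GFP", "RFP"],
    "terminator": ["B0015"],
}

with open("moclo_worklist.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["reaction_well", "source_part", "volume_ul"])
    wells = (f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13))
    for well, combo in zip(wells, itertools.product(*parts.values())):
        for part in combo:
            writer.writerow([well, part, 1.0])      # 1 uL of each part
        writer.writerow([well, "MoClo master mix", 7.0])
```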

  20. Automating the Photogrammetric Workflow in a National Mapping Agency

    Science.gov (United States)

    Holland, D.; Gladstone, C.; Sargent, I.; Horgan, J.; Gardiner, A.; Freeman, M.

    2012-07-01

    The goal of automating the process of identifying changes to topographic features in aerial photography, extracting the geometry of these features and recording the changes in a database, is yet to be fully realised. At Ordnance Survey, Britain's national mapping agency, research into the automation of these processes has been underway for several years, and is now beginning to be implemented in production systems. At the start of the processing chain is the identification of change - new buildings and roads being constructed, old structures demolished, alterations to field and vegetation boundaries and changes to inland water features. Using eCognition object-based image analysis techniques, a system has been developed to detect the changes to features. This uses four-band digital imagery (red, green, blue and near infra-red), together with a digital surface model derived by image matching, to identify all the topographic features of interest to a mapping agency. Once identified, these features are compared with those in the National Geographic Database and any potential changes are highlighted. These changes will be presented to photogrammetrists in the production area, who will rapidly assess whether or not the changes are real. If the change is accepted, they will manually capture the geometry and attributes of the feature concerned. The change detection process, although not fully automatic, cuts down the amount of time required to update the database, enabling a more efficient data collection workflow. Initial results, on the detection of changes to buildings only, showed a completeness value (proportion of the real changes that were found) of 92% and a correctness value (proportion of the changes found that were real changes) of 22%, with a time saving of around 50% when compared with the equivalent manual process. The completeness value is similar to those obtained by the manual process. Further work on the process has added vegetation, water and many other
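    The completeness and correctness figures quoted above are the standard detection ratios; a minimal sketch of their computation follows, where the example counts are hypothetical values chosen only to be consistent with the reported 92% and 22%:

```python
def completeness(tp, fn):
    # Proportion of real changes that were found.
    return tp / (tp + fn)

def correctness(tp, fp):
    # Proportion of reported changes that were real.
    return tp / (tp + fp)

# Hypothetical counts: 92 real changes found, 8 missed, 326 false alarms.
print(completeness(tp=92, fn=8))    # 0.92
print(correctness(tp=92, fp=326))   # ~0.22
```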

  1. Students' Attitude toward Correction

    Directory of Open Access Journals (Sweden)

    Rinda Fitriana

    2017-10-01

    Students' attitudes influence their decision whether or not to accept their teachers' feedback. Therefore, a questionnaire was administered to one hundred and ninety-six twelfth-grade vocational high school students, ten of whom were also interviewed, to figure out their perspectives on the teachers' correction of their oral production. From both instruments, it was found that the students preferred the teachers as correctors, although they did not mind peer correction. They also expected the teachers to give correction every time they made an error and for all types of errors. Additionally, students agreed that the teachers' personality and way of teaching influenced their willingness to accept corrective feedback.

  2. Corrected Age for Preemies

    Science.gov (United States)


  3. Eyeglasses for Vision Correction

    Science.gov (United States)

    ... light. Another option for vision correction with UV protection is prescription sunglasses. Also, for people who prefer one set of eyeglasses for both inside and outdoors, photochromic lenses are ...

  4. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... is performed by an oral and maxillofacial surgeon (OMS) to correct a wide range of minor and ... when sleeping, including snoring) Your dentist, orthodontist and OMS will work together to determine whether you are ...

  5. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... It can also invite bacteria that lead to gum disease. Click here to find out more. Who We ... It can also invite bacteria that lead to gum disease. Click here to find out more. Corrective Jaw ...

  6. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... can also invite bacteria that lead to gum disease. Click here to find out more. Who We ... can also invite bacteria that lead to gum disease. Click here to find out more. Corrective Jaw ...

  7. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... surgery, orthognathic surgery is performed to correct functional problems. Jaw Surgery can have a dramatic effect on ... without straining Chronic mouth breathing Sleep apnea (breathing problems when sleeping, including snoring) Your dentist, orthodontist and ...

  8. Human-centred automation programme: review of experiment related studies

    International Nuclear Information System (INIS)

    Grimstad, Tone; Andresen, Gisle; Skjerve, Ann Britt Miberg

    2000-04-01

    Twenty-three empirical studies concerning automation and performance have been reviewed. The purposes of the review are to support experimental studies in the Human-Centred Automation (HCA) programme and to develop a general theory on HCA. Each study was reviewed with regard to twelve study characteristics: domain, type of study, purpose, definition of automation, variables, theoretical basis, models of operator performance, methods applied, experimental design, outcome, stated scope of results, strengths and limitations. Seven of the studies involved domain experts, the rest used students as participants. The majority of the articles originated from the aviation domain: only the study conducted in HAMMLAB considered process control in power plants. In the experimental studies, the independent variable was level of automation, or reliability of automation, while the most common dependent variables were workload, situation awareness, complacency, trust, and criteria of performance, e.g., number of correct responses or response time. Although the studies highlight important aspects of human-automation interaction, it is still unclear how system performance is affected. Nevertheless, the fact that many factors seem to be involved is taken as support for the system-oriented approach of the HCA programme. In conclusion, the review provides valuable input both to the design of experiments and to the development of a general theory. (Author). refs

  9. HOO 2012 Error Recognition and Correction Shared Task: Cambridge University Submission Report

    OpenAIRE

    Kochmar, Ekaterina; Andersen, Oeistein Edvin; Briscoe, Edward John

    2012-01-01

    Previous work on automated error recognition and correction of texts written by learners of English as a Second Language has demonstrated experimentally that training classifiers on error-annotated ESL text generally outperforms training on native text alone and that adaptation of error correction models to the native language (L1) of the writer improves performance. Nevertheless, most extant models have poor precision, particularly when attempting error correction, and this limits their usefulness...

  10. Data-driven approaches to decision making in automated tumor grading. An example of astrocytoma grading.

    Science.gov (United States)

    Kolles, H; von Wangenheim, A; Rahmel, J; Niedermayer, I; Feiden, W

    1996-08-01

    To compare four data-driven approaches to automated tumor grading based on morphometric data. Apart from the statistical procedure of linear discriminant analysis, three other approaches from the field of neural computing were evaluated. The numerical basis of this study was computed tomography-guided, stereotactically obtained astrocytoma biopsies from 86 patients, stained with a combination of Feulgen and immunohistochemical Ki-67 (MIB1) staining. In these biopsies the cell nuclei in four consecutive fields of vision were evaluated morphometrically and the following parameters determined: relative nuclei area, secant lengths of the minimal spanning trees and relative volume-weighted mean nuclear volumes of the proliferating nuclei. Based on the analysis of these morphometric features, the multivariate-generated HOM grading system provides the highest correct grading rates (> 90%), whereas the two widely employed qualitative histologic grading systems for astrocytomas yield correct grading rates of about 60%. For automated tumor grading all approaches yield similar grading results; however, backpropagation networks provide reliable results only following an extensive training phase, which requires the use of a supercomputer. All other neurocomputing models can be run on simple UNIX workstations (AT&T, U.S.A.). In contrast to discriminant analysis, backpropagation and Kohonen networks, the newly developed neural network architecture model of self-editing nearest neighbor nets (SEN3) provides incremental learning; i.e., the training phase does not need to be restarted each time there is further information to learn. Trained SEN3 networks can be considered ready-to-use knowledge bases and are appropriate for integrating further morphometric data in a dynamic process that enhances the diagnostic power of such a network.

  11. On the sensitivity of probe-corrected spherical near-field antenna measurements with high-order probes using double phi-step theta-scanning scheme against various measurement uncertainties

    DEFF Research Database (Denmark)

    Laitinen, Tommi; Pivnenko, Sergey; Nielsen, Jeppe Majlund

    2011-01-01

    In this paper, the relatively recently introduced double phi-step theta-scanning scheme and the probe correction technique associated with it are examined against the traditional phi-scanning scheme and the first-order probe correction. The important result of this paper is that the double phi-step theta-scanning scheme is shown to be clearly less sensitive to probe misalignment errors than the phi-scanning scheme. The two methods show similar sensitivity to noise and channel balance error.

  12. SU-G-BRB-05: Automation of the Photon Dosimetric Quality Assurance Program of a Linear Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Lebron, S; Lu, B; Yan, G; Li, J; Liu, C [University of Florida, Gainesville, FL (United States)

    2016-06-15

    Purpose: To develop an automated method to calculate a linear accelerator (LINAC) photon radiation field size, flatness, symmetry, output and beam quality in a single delivery for flattened (FF) and flattening-filter-free (FFF) beams using an ionization chamber array. Methods: The proposed method consists of three control points that deliver 30×30, 10×10 and 5×5 cm² fields (FF or FFF) in a step-and-shoot sequence where the number of monitor units is weighted for each field size. The IC Profiler (Sun Nuclear Inc.) with 5 mm detector spacing was used for this study. The corrected counts (CCs) were calculated, and the locations of the maxima and minima of the first-order gradient delimited the data belonging to each subfield. All CCs for each field size were then summed in order to obtain the final profiles. For each profile, the radiation field size, symmetry, flatness, output factor and beam quality were calculated. For the field size calculation, a parameterized gradient method was used. For method validation, profiles were collected in the detector array both individually and as part of the step-and-shoot plan, with 9.9 cm buildup for FF and FFF beams at 90 cm source-to-surface distance. The same data were collected with the device (plus buildup) placed on a movable platform to achieve a 1 mm resolution. Results: The differences between the dosimetric quantities calculated from both deliveries, individual and step-and-shoot, were within 0.31±0.20% and 0.04±0.02 mm. The differences between the field sizes calculated at 5 mm and 1 mm resolution were ±0.1 mm. Conclusion: The proposed single-delivery method proved to be simple and efficient in automating the monthly and annual photon dosimetric quality assurance.
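    A hedged sketch of the profile analysis: subfield edges appear as extrema of the first-order gradient, so field size can be taken between the gradient maximum and minimum, with flatness and symmetry evaluated over the central ~80% of the field. The formulas below are common textbook-style definitions, not necessarily the authors' exact parameterized-gradient implementation, and the test profile is synthetic.

```python
import numpy as np

def profile_metrics(counts, spacing_mm=5.0):
    """Field size (mm), flatness (%) and symmetry (%) from one profile."""
    grad = np.gradient(counts)
    left, right = np.argmax(grad), np.argmin(grad)   # penumbra positions
    size_mm = (right - left) * spacing_mm
    trim = max((right - left) // 10, 1)              # keep central ~80%
    core = counts[left + trim:right - trim + 1]
    flatness = 100.0 * (core.max() - core.min()) / (core.max() + core.min())
    symmetry = 100.0 * np.max(np.abs(core - core[::-1]) / core)
    return size_mm, flatness, symmetry

# Idealized 10x10 cm field sampled at 5 mm detector spacing:
x = np.arange(-150, 155, 5)                          # detector positions, mm
profile = np.where(np.abs(x) <= 50, 1.0, 0.02) + 0.005 * np.cos(x / 40.0)
print(profile_metrics(profile))
```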

  13. Automated approach to detecting behavioral states using EEG-DABS

    Directory of Open Access Journals (Sweden)

    Zachary B. Loris

    2017-07-01

    Electrocorticographic (ECoG) signals represent cortical electrical dipoles generated by synchronous local field potentials that result from the simultaneous firing of neurons at distinct frequencies (brain waves). Since different brain waves correlate with different behavioral states, ECoG signals present a novel strategy to detect complex behaviors. We developed a program, EEG Detection Analysis for Behavioral States (EEG-DABS), that advances Fast Fourier Transforms through an ECoG time series, separating it into user-defined frequency bands and normalizing them to reduce variability. EEG-DABS determines events if segments of an experimental ECoG record have significantly different power bands than a selected control pattern of EEG. Events are identified at every epoch and frequency band and are then displayed as output graphs by the program. Certain patterns of events correspond to specific behaviors. Once a predetermined pattern was selected for a behavioral state, EEG-DABS correctly identified the desired behavioral event. The selection of frequency band combinations for detection of the behavior affects the accuracy of the method. All instances of certain behaviors, such as freezing, were correctly identified from the event patterns generated with EEG-DABS. Detecting behaviors is typically achieved by visually discerning unique animal phenotypes, a process that is time-consuming, unreliable, and subjective. EEG-DABS removes this variability by using defined parameters of EEG/ECoG for a desired behavior over chronic recordings. EEG-DABS presents a simple and automated approach to quantifying different behavioral states from ECoG signals.
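    A minimal sketch of the detection loop as described above: windowed FFT band powers, normalization, and an event flag wherever a band departs significantly from a control-period baseline. The band edges, sampling rate, epoch length and z-score threshold are illustrative assumptions, not EEG-DABS defaults.

```python
import numpy as np

FS = 500                                        # sampling rate, Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch, fs=FS):
    """Normalized power in each band for one epoch of signal."""
    freqs = np.fft.rfftfreq(epoch.size, 1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    p = np.array([psd[(freqs >= lo) & (freqs < hi)].sum()
                  for lo, hi in BANDS.values()])
    return p / p.sum()                          # normalize to cut variability

def detect_events(record, control, epoch_len=FS, z_thresh=3.0):
    """Flag epochs whose band powers deviate from the control baseline."""
    ctl = np.array([band_powers(control[i:i + epoch_len])
                    for i in range(0, control.size - epoch_len, epoch_len)])
    mu, sd = ctl.mean(axis=0), ctl.std(axis=0)
    events = []
    for i in range(0, record.size - epoch_len, epoch_len):
        z = (band_powers(record[i:i + epoch_len]) - mu) / sd
        events.append(np.abs(z) > z_thresh)     # one flag per band per epoch
    return np.array(events)

rng = np.random.default_rng(1)
control = rng.standard_normal(60 * FS)          # one minute of baseline
record = rng.standard_normal(60 * FS)
print(detect_events(record, control).shape)     # (epochs, bands)
```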

  14. Passive correction of persistent current multipoles in superconducting accelerator dipoles

    International Nuclear Information System (INIS)

    Fisk, H.E.; Hanft, R.A.; Kuchnir, M.; McInturff, A.D.

    1986-07-01

    Correction of the magnetization sextupole and decapole fields with strips of superconductor placed just inside the coil winding is discussed. Calculations have been carried out for such a scheme, and tests have been conducted on a 4 cm aperture magnet. The calculated sextupole correction at the injection excitation of 330 A (5% of full field) was expected to be 77% effective, while the measured correction is 83%, suggesting the scheme may be useful for future accelerators such as the SSC and LHC

  15. Automated campaign system

    Science.gov (United States)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, through acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving through production and fulfillment, to evaluating results: all these processes are currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through technologies, all while providing the benefits of a professionally managed campaign.

  16. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Meyer, R.A.

    1979-01-01

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  17. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots, over 100 PB of storage space on disk or tape. Monitoring of status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  18. Automated Assembly Center (AAC)

    Science.gov (United States)

    Stauffer, Robert J.

    1993-01-01

    The objectives of this project are as follows: to integrate advanced assembly and assembly support technology under a comprehensive architecture; to implement automated assembly technologies in the production of high-visibility DOD weapon systems; and to document the improved cost, quality, and lead time. This will enhance the production of DOD weapon systems by utilizing the latest commercially available technologies combined into a flexible system that will be able to readily incorporate new technologies as they emerge. Automated assembly encompasses the following areas: product data, process planning, information management policies and framework, three schema architecture, open systems communications, intelligent robots, flexible multi-ability end effectors, knowledge-based/expert systems, intelligent workstations, intelligent sensor systems, and PDES/PDDI data standards.

  19. Automated fingerprint identification system

    International Nuclear Information System (INIS)

    Bukhari, U.A.; Sheikh, N.M.; Khan, U.I.; Mahmood, N.; Aslam, M.

    2002-01-01

    In this paper we present selected stages of an automated fingerprint identification system. The software for the system is developed employing algorithms for two-tone conversion, thinning, feature extraction and matching. Taking FBI standards into account, it has been ensured that no details of the image are lost in the comparison process. We have deployed a general parallel thinning algorithm for specialized images like fingerprints, modifying the original algorithm after a series of experiments and selecting the variant giving the best results. We also propose an application-based approach to designing automated fingerprint identification systems that keeps system requirements in view. We will show that by using our system, the precision and efficiency of current fingerprint matching techniques are increased. (author)
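    Two of the stages named above, two-tone conversion and thinning, have compact generic counterparts; the sketch below uses Otsu thresholding and skeletonization from scikit-image as stand-ins, not the paper's own parallel thinning algorithm, and the input is a toy image.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def two_tone_and_thin(gray):
    """gray: 2-D array with dark ridges. Returns a 1-pixel-wide skeleton."""
    binary = gray < threshold_otsu(gray)    # two-tone conversion: ridges True
    return skeletonize(binary)              # thinning to a ridge skeleton

# Toy input: a dark stripe standing in for a ridge pattern.
img = np.ones((64, 64))
img[30:36, 4:60] = 0.1
print(two_tone_and_thin(img).sum(), "skeleton pixels")
```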

  20. Automated breeder fuel fabrication

    International Nuclear Information System (INIS)

    Goldmann, L.H.; Frederickson, J.R.

    1983-01-01

    The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in: reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures