WorldWideScience

Sample records for automatically processed alpha-track

  1. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  2. Automatic spark counting of alpha-tracks in plastic foils

    International Nuclear Information System (INIS)

    Somogyi, G.; Medveczky, L.; Hunyadi, I.; Nyako, B.

    1976-01-01

The possibility of alpha-track counting by a jumping spark counter in cellulose acetate and polycarbonate nuclear track detectors was studied. A theoretical treatment is presented which predicts the optimum residual thickness of the etched foils in which completely through-etched tracks (i.e. holes) can be obtained for alpha-particles of various energies and angles of incidence. In agreement with the theoretical prediction, it is shown that successful spark counting of alpha-tracks can be performed even in polycarbonate foils. Some counting characteristics, such as counting efficiency vs particle energy at various etched foil thicknesses, surface spark density produced by electric breakdowns in unexposed foils vs foil thickness, etc., have been determined. Special attention was given to the spark counting of alpha-tracks entering thin detectors at right angles. The applicability of the spark counting technique is demonstrated in angular distribution measurements of the ²⁷Al(p,α₀)²⁴Mg nuclear reaction at the Ep = 1899 keV resonance energy. For this study, 15 μm thick Makrofol-G foils and a jumping spark counter of improved construction were used. (orig.)

  3. Calibration of alpha-track monitors for measurement of thoron

    International Nuclear Information System (INIS)

    Pearson, M.D.

    1990-03-01

The US Department of Energy (DOE) Office of Remedial Action and Waste Technology established the Technical Measurements Center (TMC) at the DOE Grand Junction Projects Office (GJPO) to provide standardization, calibration, verification of data, quality assurance, and cost-effectiveness for environmental measurements associated with the various DOE remedial action programs. The GJPO Radon Laboratory has conducted a number of studies evaluating the precision and accuracy of alpha-track monitors for the measurement of airborne radon (Rn-222) concentration. These studies have demonstrated the usefulness of alpha-track monitors for measuring radon. Alpha-track devices have also been proposed for estimating concentrations of thoron (Rn-220). 9 refs., 7 figs., 4 tabs.

  4. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

The GRACE system is an excellent tool for automatically calculating the cross section of an elementary process and generating events for it. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed so as to be a user-friendly system. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e⁺e⁻ interaction are going to be calculated and summarized as a catalogue. (author)

  5. Refinement of a thoron insensitive alpha track detector for environmental radon monitoring

    International Nuclear Information System (INIS)

    Davey, J.F.

    1995-01-01

Olympic Dam Operations, a copper/uranium mine in the north of South Australia, currently monitors environmental radon (Rn-222) concentrations at a total of 17 sites in the area surrounding the mining lease and Roxby Downs township. During 1990 a commercial alpha-track radon detector service was replaced with an on-site system, resulting in lower costs, greater confidence in detector calibration, and a reduction in processing time. Alpha-track detectors (ATDs) are placed in triplicate at each of the 17 sites. Flow-through scintillation cell continuous radon monitors are also operated at two of these sites. Comparison of results from the two different types of monitor has raised the question of a possible thoron (Rn-220) contribution in the alpha-track detectors. Laboratory experiments revealed that the diffusion membranes used in the ATDs were in fact 'transparent' to thoron. A new membrane was tested which effectively excluded thoron from the detector cup without affecting the sensitivity to radon. Field comparisons of the different membranes revealed that the thoron component was significant. Since there is only a very minor Rn-220 emission from the mining operation, it is important that the monitoring be specific only to Rn-222, the primary source term. The use of the new membrane will result in more accurate measurements of Rn-222. 4 refs., 4 tabs., 5 figs.

  6. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

Natural language processing techniques for automatic test questions generation using discourse connectives. ... Journal of Computer Science and Its Application.

  7. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory, and strategic processes, which are conscious and subject to capacity limitations. ... the retrieval of information, and provide a heuristic for brand evaluation. Strategic processes govern learning and inference formation. The relative importance of both types of processes will depend on product involvement. The distinction of these two types of processes leads to some conclusions which are at variance with current notions about advertising effects. For example, the attention span problem will be relevant only for strategic processes, not for automatic processes, a certain amount of learning can occur with very little conscious effort, and advertising's effect on brand evaluation may be more stable ...

  8. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer. He outlines the specific role of learning in various manifestations of intelligence. Then, based on the Markov's algorithm theory, the author discusses the notion of learning algorithm. Two main types of learning algorithms are then addressed: firstly, an 'algorithm-teacher dialogue' type sanction-based algorithm which aims at learning how to solve grammatical ambiguities in submitted texts; secondly, an algorithm related to a document system which structures semantic data automatically obtained from a set of texts in order to be able to understand by references to any question on the content of these texts

  9. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation that is performed with the major semi-automatic processes, which would be more productive if a suitable...

  10. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  11. Semi-automatic film processing unit

    International Nuclear Information System (INIS)

    Mohamad Annuar Assadat Husain; Abdul Aziz Bin Ramli; Mohd Khalid Matori

    2005-01-01

The design concept applied in the development of a semi-automatic film processing unit needs creativity and user support in channelling the required information to select materials and an operation system that suit the design produced. Low cost and efficient operation are the challenges that need to be faced abreast with the fast pace of technology advancement. In producing this processing unit, there are a few elements which need to be considered in order to produce a high-quality image. Consistent movement and correct time coordination for developing and drying are a few elements which need to be controlled. Other elements which need serious attention are temperature, liquid density and the amount of time for the chemical liquids to react. Subsequent chemical reactions that take place will cause the liquid chemicals to age, and this will adversely affect the quality of the image produced. This unit is also equipped with a liquid chemical drainage system and a disposal chemical tank. This unit would be useful in GP clinics, especially in rural areas which practice manual developing systems and require low operational cost. (Author)

  12. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  13. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.

    1975-01-01

The problems of automatic deciphering of radiographic pictures, the purpose of which is to draw a conclusion concerning the quality of the inspected product on the basis of the defect images in the picture, are considered. The methods of defect image recognition are listed, and the algorithms and the class features of defects are described. The results of deciphering a small radiographic picture by means of the ''Minsk-22'' computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained with visual deciphering.

  14. Automatic image processing as a means of safeguarding nuclear material

    International Nuclear Information System (INIS)

    Kahnmeyer, W.; Willuhn, K.; Uebel, W.

    1985-01-01

    Problems involved in computerized analysis of pictures taken by automatic film or video cameras in the context of international safeguards implementation are described. They include technical ones as well as the need to establish objective criteria for assessing image information. In the near future automatic image processing systems will be useful in verifying the identity and integrity of IAEA seals. (author)

  15. A method of automatic data processing in radiometric control

    International Nuclear Information System (INIS)

    Adonin, V.M.; Gulyukina, N.A.; Nemirov, Yu.V.; Mogil'nitskij, M.I.

    1980-01-01

Described is the algorithm for automatic data processing in gamma radiography of products. Rapidity, due to the application of recurrent evaluation, is a specific feature of the processing. Experimental data of by-line control are presented. The results obtained have shown the applicability of automatic signal processing to testing under industrial conditions, which would make it possible to increase the testing efficiency, to eliminate subjectivism in the assessment of testing results, and to improve working conditions.

  16. Procedure manual for the estimation of average indoor radon-daughter concentrations using the filtered alpha-track method

    International Nuclear Information System (INIS)

    George, J.L.

    1988-04-01

One of the measurement needs of US Department of Energy (DOE) remedial action programs is the estimation of the annual-average indoor radon-daughter concentration (RDC) in structures. The filtered alpha-track method, using a 1-year exposure period, can be used to accomplish RDC estimations for the DOE remedial action programs. This manual describes the procedure used to obtain filtered alpha-track measurements and to derive average RDC estimates from the measurements. Appropriate quality-assurance and quality-control programs are also presented. The ''prompt'' alpha-track method of exposing monitors for 2 to 6 months during specific periods of the year is also briefly discussed in this manual. However, the prompt alpha-track method has been validated only for use in the Mesa County, Colorado, area. 3 refs., 3 figs.

  17. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

Physical data produced by Large Helical Device (LHD) experiments is supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist amongst the data; i.e., in many cases, the calculation of one data item requires other data items. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; however, because the data is registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and the number of such programs can easily be increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
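
    The core idea described above is dependency-driven calculation: a requested quantity is computed only after the data it depends on has been obtained. The following is a minimal sketch of that idea, with invented diagnostic names and formulas; it is not the actual AutoAna implementation.

```python
"""Minimal sketch of dependency-driven calculation in the spirit of AutoAna.
Diagnostic names and formulas are invented for illustration only."""

registry = {                      # data already registered on the server
    "te_profile": [1.0, 2.0, 3.0],
    "ne_profile": [0.5, 0.5, 0.5],
}

dependencies = {                  # derived quantity -> required inputs
    "pressure": ["te_profile", "ne_profile"],
    "stored_energy": ["pressure"],
}

calculators = {                   # derived quantity -> how to compute it
    "pressure": lambda d: [t * n for t, n in zip(d["te_profile"], d["ne_profile"])],
    "stored_energy": lambda d: sum(d["pressure"]),
}

def ensure(name):
    """Return the requested data, calculating unregistered dependencies first."""
    if name in registry:
        return registry[name]
    for dep in dependencies.get(name, []):
        ensure(dep)               # recurse so that all inputs exist
    registry[name] = calculators[name](registry)
    return registry[name]

print(ensure("stored_energy"))    # -> 3.0
```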

  18. Beyond behaviorism: on the automaticity of higher mental processes.

    Science.gov (United States)

    Bargh, J A; Ferguson, M J

    2000-11-01

    The first 100 years of experimental psychology were dominated by 2 major schools of thought: behaviorism and cognitive science. Here the authors consider the common philosophical commitment to determinism by both schools, and how the radical behaviorists' thesis of the determined nature of higher mental processes is being pursued today in social cognition research on automaticity. In harmony with "dual process" models in contemporary cognitive science, which equate determined processes with those that are automatic and which require no intervening conscious choice or guidance, as opposed to "controlled" processes which do, the social cognition research on the automaticity of higher mental processes provides compelling evidence for the determinism of those processes. This research has revealed that social interaction, evaluation and judgment, and the operation of internal goal structures can all proceed without the intervention of conscious acts of will and guidance of the process.

  19. Neural Correlates of Automatic and Controlled Auditory Processing in Schizophrenia

    Science.gov (United States)

    Morey, Rajendra A.; Mitchell, Teresa V.; Inan, Seniha; Lieberman, Jeffrey A.; Belger, Aysenil

    2009-01-01

    Individuals with schizophrenia demonstrate impairments in selective attention and sensory processing. The authors assessed differences in brain function between 26 participants with schizophrenia and 17 comparison subjects engaged in automatic (unattended) and controlled (attended) auditory information processing using event-related functional MRI. Lower regional neural activation during automatic auditory processing in the schizophrenia group was not confined to just the temporal lobe, but also extended to prefrontal regions. Controlled auditory processing was associated with a distributed frontotemporal and subcortical dysfunction. Differences in activation between these two modes of auditory information processing were more pronounced in the comparison group than in the patient group. PMID:19196926

  20. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

A description is given of a process for automatically analysing separate signal-processing channels in real time, one channel per signal, in a facility with significant background noise in the time-varying signals coming from transducers at selected points, for the continuous monitoring of the operating conditions of the various components of the installation. The signals are used to detect potential breakdowns, to draw conclusions as to the severity of these potential breakdowns, and to indicate to an operator the measures to be taken in consequence. The feature of this process is that it comprises the automatic and successive selection of each channel for spectral analysis, the automatic processing of the signal of each selected channel to produce energy spectral density data at predetermined frequencies, the automatic comparison of the energy spectral density data of each channel with predetermined sets of frequency-dependent limits, and the automatic indication to the operator of the condition of the various components of the installation associated with each channel and of the measures to be taken depending on the set of limits.
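
    The surveillance scheme above reduces to estimating a spectral density per channel and comparing it against a frequency-dependent limit. Below is a small sketch of that comparison; the channel names, limit curve and alarm rule are illustrative assumptions, not the patented process.

```python
"""Sketch of channel-by-channel spectral surveillance against frequency-dependent
limits. Channel names, the limit curve and the alarm rule are assumptions."""
import numpy as np
from scipy.signal import welch

fs = 1000.0                                   # sampling frequency, Hz (assumed)
rng = np.random.default_rng(0)
channels = {"pump_bearing": rng.normal(size=8192),
            "valve_housing": rng.normal(size=8192)}

def limit_curve(freqs):
    # Frequency-dependent limit: allow more energy at low frequency (assumption).
    return 1e-2 / (1.0 + freqs / 50.0)

for name, signal in channels.items():
    freqs, psd = welch(signal, fs=fs, nperseg=1024)   # spectral density estimate
    exceeded = psd > limit_curve(freqs)
    if exceeded.any():
        worst = freqs[np.argmax(psd / limit_curve(freqs))]
        print(f"{name}: limit exceeded near {worst:.0f} Hz -> notify operator")
    else:
        print(f"{name}: within limits")
```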

  1. Resource depletion promotes automatic processing: implications for distribution of practice.

    Science.gov (United States)

    Scheel, Matthew H

    2010-12-01

    Recent models of cognition include two processing systems: an automatic system that relies on associative learning, intuition, and heuristics, and a controlled system that relies on deliberate consideration. Automatic processing requires fewer resources and is more likely when resources are depleted. This study showed that prolonged practice on a resource-depleting mental arithmetic task promoted automatic processing on a subsequent problem-solving task, as evidenced by faster responding and more errors. Distribution of practice effects (0, 60, 120, or 180 sec. between problems) on rigidity also disappeared when groups had equal time on resource-depleting tasks. These results suggest that distribution of practice effects is reducible to resource availability. The discussion includes implications for interpreting discrepancies in the traditional distribution of practice effect.

  2. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

A program ''CRITEST'' in the language PL/1, intended for automatic processing of the results of radioimmunological research on an EC computer, has been elaborated. The program works under the operating system of the EC computer and runs in a 60 kb partition. When compiling the program, Aitken's modified algorithm was used. The program was clinically approved in the determination of a number of hormones: CTH, T4, T3 and TSH. The automatic processing of radioimmunological research data on the computer makes it possible to simplify the labour-consuming analysis and to raise its accuracy.

  3. Automatic processing of multimodal tomography datasets.

    Science.gov (United States)

    Parsons, Aaron D; Price, Stephen W T; Wadeson, Nicola; Basham, Mark; Beale, Andrew M; Ashton, Alun W; Mosselmans, J Frederick W; Quinn, Paul D

    2017-01-01

    With the development of fourth-generation high-brightness synchrotrons on the horizon, the already large volume of data that will be collected on imaging and mapping beamlines is set to increase by orders of magnitude. As such, an easy and accessible way of dealing with such large datasets as quickly as possible is required in order to be able to address the core scientific problems during the experimental data collection. Savu is an accessible and flexible big data processing framework that is able to deal with both the variety and the volume of data of multimodal and multidimensional scientific datasets output such as those from chemical tomography experiments on the I18 microfocus scanning beamline at Diamond Light Source.

  4. Image processing. A system for the automatic sorting of chromosomes

    International Nuclear Information System (INIS)

    Najai, Amor

    1977-01-01

The present paper deals with two aspects of the system: - an automaton (specialized hardware) dedicated to image processing. Images are digitized, divided into sub-units, and computations are carried out on their main parameters. - Software for the automatic recognition and sorting of chromosomes, implemented on a Multi-20 minicomputer connected to the automaton. (author)

  5. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  6. Automatic process control in anaerobic digestion technology: A critical review.

    Science.gov (United States)

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

Anaerobic digestion (AD) is a mature technology that relies upon the synergistic effort of a diverse group of microbial communities to metabolize diverse organic substrates. However, AD is highly sensitive to process disturbances, and thus it is advantageous to use online monitoring and process control techniques to operate the AD process efficiently. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While the complexity of the control strategy ranges from simple feedback control to advanced control systems, there is some debate on the implementation of advanced instrumentation or advanced control strategies. Centralized AD plants could be the answer for applications of progressive automatic process control. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping.

  8. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

In this presentation the automatized material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; one of the basic tools for computer optimisation of decommissioning waste processing is the integral material and radioactivity flow tool; all the calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega is an open, modular system which can be improved; improvement of the module for optimisation of decommissioning waste processing will be performed in the frame of the improvement of material procedures and scenarios.

  9. Monte Carlo treatment planning and high-resolution alpha-track autoradiography for neutron capture therapy

    Energy Technology Data Exchange (ETDEWEB)

    Zamenhof, R.G.; Lin, K.; Ziegelmiller, D.; Clement, S.; Lui, C.; Harling, O.K.

Monte Carlo simulations of thermal neutron flux distributions in a mathematical head model have been compared to experimental measurements in a corresponding anthropomorphic gelatin-based head phantom irradiated by a thermal neutron beam as presently available at the MITR-II Research Reactor. Excellent agreement between Monte Carlo and experimental measurements has encouraged us to employ the Monte Carlo simulation technique to approach treatment planning problems in neutron capture therapy. We have also implemented a high-resolution alpha-track autoradiography technique originally developed in our laboratory at MIT. Initial autoradiograms produced by this technique meet our expectations in terms of the high resolution available and the ability to etch tracks without concomitant destruction of stained tissue. Our preliminary results with computer-aided track distribution analysis indicate that this approach is very promising in being able to quantify boron distributions in tissue at the subcellular level with a minimum amount of operator effort necessary.

  10. Experimental Study for Automatic Colony Counting System Based on Image Processing

    Science.gov (United States)

    Fang, Junlong; Li, Wenzhe; Wang, Guoxin

Colony counting in many colony experiments is at present performed manually, and it is difficult to execute the method quickly and accurately. A new automatic colony counting system was developed. Making use of image-processing technology, a study was made on the feasibility of objectively distinguishing white bacterial colonies from clear plates according to RGB color theory. An optimal chromatic value was obtained based upon a large number of experiments on the distribution of the chromatic value. It has been proved that the method greatly improves the accuracy and efficiency of colony counting and that the counting result is not affected by the inoculation, shape or size of the colony. It is revealed that automatic detection of colony quantity using image-processing technology could be an effective approach.
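
    The basic pipeline described above (chromatic thresholding followed by counting of connected regions) can be sketched in a few lines. The threshold values and minimum colony size below are assumptions for illustration; real plates would need the calibrated chromatic values mentioned in the abstract.

```python
"""Minimal sketch of threshold-plus-labelling colony counting, in the spirit of
the RGB-based approach above. Thresholds and minimum size are assumptions."""
import numpy as np
from scipy import ndimage

def count_colonies(rgb_image, min_pixels=20):
    """Count white colonies on a darker plate in an (H, W, 3) uint8 RGB image."""
    r, g, b = rgb_image[..., 0], rgb_image[..., 1], rgb_image[..., 2]
    # "White" pixels: all three channels above an assumed chromatic threshold.
    mask = (r > 180) & (g > 180) & (b > 180)
    labels, n = ndimage.label(mask)                   # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))           # ignore specks below min size

# Synthetic test image: dark plate with two bright "colonies".
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[20:30, 20:30] = 255
img[60:75, 50:65] = 255
print(count_colonies(img))   # -> 2
```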

  11. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. ... Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions, to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were ...

  12. Active Learning for Automatic Audio Processing of Unwritten Languages (ALAPUL)

    Science.gov (United States)

    2016-07-01

    Vergyri, Dimitra; Kathol, Andreas; Wang, Wen; Bartels, Chris; VanHout, Julian (report AFRL-RH-WP-TR-2016-0074). ... feature transform through deep auto-encoders for better phone recognition performance. We target iterative learning to improve the system through ...

  13. Experience in automatic processing of 340.000 images from ITEF 3-m magnetic spectrometer

    International Nuclear Information System (INIS)

    Dzhelyadin, R.I.; Dukhovskoj, I.A.; Ivanov, L.V.; Kishkurno, V.V.; Krutenkova, A.P.; Kulikov, V.V.; Lyulevich, V.I.; Polikarpov, V.M.; Radkevich, I.A.; Fedorets, V.S.; Fedotov, O.P.

    1974-01-01

A number of conclusions were made regarding the automatic processing of 340.000 pictures (1.020.000 frames) developed on a three-meter magnetic spectrometer with spark chambers. Possibilities for time optimization of the automatic processing programs are discussed. The results of processing a series of photographs were analysed to compare the parameters of automatic and semi-automatic processing. Some problems relating to the organization and technology of picture processing are also outlined.

  14. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  15. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

Wide use of automatic process control systems and the usage of high-performance systems containing a number of computers (processors) give opportunities for the creation of high-quality and fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and processing of big data arrays - all of this requires a high level of productivity and, at the same time, minimum time for data handling and receiving results. In order to reach the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some of the basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are brought to light, and some considerations for their use when developing software for automatic process control systems are made.
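
    One classical heuristic from this family of scheduling methods is longest-processing-time-first (LPT) list scheduling, sketched below. The task durations and processor count are illustrative; the paper surveys several such methods rather than this one specifically.

```python
"""Sketch of LPT list scheduling on a multiprocessor system.
Task durations and the number of processors are illustrative."""
import heapq

def lpt_schedule(durations, n_processors):
    """Assign tasks to processors, longest first, always to the least-loaded one.
    Returns (makespan, assignment) where assignment maps processor -> task list."""
    loads = [(0.0, p) for p in range(n_processors)]   # (current load, processor id)
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_processors)}
    for task, dur in sorted(enumerate(durations), key=lambda t: -t[1]):
        load, p = heapq.heappop(loads)                # least-loaded processor
        assignment[p].append(task)
        heapq.heappush(loads, (load + dur, p))
    makespan = max(load for load, _ in loads)
    return makespan, assignment

print(lpt_schedule([7, 5, 4, 3, 3, 2], n_processors=2))
# -> (12.0, ...): the heuristic balances the total load on the two processors
```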

  16. Measurement uncertainties of long-term 222Rn averages at environmental levels using alpha track detectors

    International Nuclear Information System (INIS)

    Nelson, R.A.

    1987-01-01

    More than 250 replicate measurements of outdoor Rn concentration integrated over quarterly periods were made to estimate the random component of the measurement uncertainty of Track Etch detectors (type F) under outdoor conditions. The measurements were performed around three U mill tailings piles to provide a range of environmental concentrations. The measurement uncertainty was typically greater than could be accounted for by Poisson counting statistics. Average coefficients of variation of the order of 20% for all measured concentrations were found. It is concluded that alpha track detectors can be successfully used to determine annual average outdoor Rn concentrations through the use of careful quality control procedures. These include rapid deployment and collection of detectors to minimize unintended Rn exposure, careful packaging and shipping to and from the manufacturer, use of direct sunlight shields for all detectors and careful and secure mounting of all detectors in as similar a manner as possible. The use of multiple (at least duplicate) detectors at each monitoring location and an exposure period of no less than one quarter are suggested
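
    The precision statistic reported above is the coefficient of variation of replicate detector readings. A small worked example follows; the readings are made-up numbers, not data from the study.

```python
"""Worked example of the coefficient of variation (CV) of replicate readings."""
import statistics

def coefficient_of_variation(readings):
    """CV = sample standard deviation / mean, expressed in percent."""
    return 100.0 * statistics.stdev(readings) / statistics.mean(readings)

# Hypothetical quarterly results (Bq/m3) from co-located replicate detectors.
replicates = [28.0, 34.0, 25.0, 31.0]
print(f"CV = {coefficient_of_variation(replicates):.1f}%")   # roughly 13%
```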

  17. A performance evaluation study of three types of alpha-track detector radon monitors

    International Nuclear Information System (INIS)

    Yeager, W.M.; Lucas, R.M.; Daum, K.A.; Sensintaffar, E.; Poppell, S.; Feldt, L.; Clarkin, M.

    1991-01-01

Three models of alpha-track detector (ATD) Rn monitors were exposed in Environmental Protection Agency (EPA) Rn chambers to obtain estimates of precision and bias for the National Residential Radon Survey (NRRS). Exposures in this study ranged from 37 to 740 Bq·y·m⁻³ (1 to 20 pCi·y·L⁻¹), plus blanks. These exposures correspond to the range expected in most U.S. residences. All detectors were purchased through a Rn mitigation firm to assure that the vendors did not give special attention to the ATDs used in this study. Ten ATDs of each model were studied at 12 exposures. The mean and standard deviation of the reported values for each model were calculated and compared with the continuously monitored chamber concentrations to determine the bias and precision at each exposure. Results of this analysis were discussed with the vendors, who took corrective actions. Changes in track counting procedures and calibrations improved detector performance. Readings of one detector were adjusted based on a regression of the monitored values on the reported values
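
    The bias/precision summary and the regression-based adjustment described above can be illustrated with a short calculation. All reported and chamber values below are invented for illustration.

```python
"""Sketch of bias, precision (CV) and a regression adjustment of one detector
model's readings. Numbers are invented, not data from the study."""
import numpy as np

chamber = 370.0                          # chamber exposure, Bq y/m3 (assumed)
reported = np.array([402.0, 381.0, 355.0, 410.0, 366.0])   # one model's readings

bias = reported.mean() - chamber                     # systematic offset
precision = reported.std(ddof=1) / reported.mean()   # coefficient of variation

# Adjustment: regress monitored (chamber) values on reported values over all
# exposures, then apply the fitted line to the readings.
reported_all = np.array([40.0, 95.0, 210.0, 402.0, 690.0])
monitored_all = np.array([37.0, 92.0, 185.0, 370.0, 740.0])
slope, intercept = np.polyfit(reported_all, monitored_all, deg=1)
adjusted = slope * reported + intercept

print(f"bias = {bias:+.1f}, precision (CV) = {100*precision:.1f}%")
print("adjusted readings:", np.round(adjusted, 1))
```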

  18. Rotor assembly and method for automatically processing liquids

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1992-12-22

    A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body for rotation about an axis and includes a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.

  19. Automatic and controlled processing and the Broad Autism Phenotype.

    Science.gov (United States)

    Camodeca, Amy; Voelker, Sylvia

    2016-01-30

    Research related to verbal fluency in the Broad Autism Phenotype (BAP) is limited and dated, but generally suggests intact abilities in the context of weaknesses in other areas of executive function (Hughes et al., 1999; Wong et al., 2006; Delorme et al., 2007). Controlled processing, the generation of search strategies after initial, automated responses are exhausted (Spat, 2013), has yet to be investigated in the BAP, and may be evidenced in verbal fluency tasks. One hundred twenty-nine participants completed the Delis-Kaplan Executive Function System Verbal Fluency test (D-KEFS; Delis et al., 2001) and the Broad Autism Phenotype Questionnaire (BAPQ; Hurley et al., 2007). The BAP group (n=53) produced significantly fewer total words during the 2nd 15" interval compared to the Non-BAP (n=76) group. Partial correlations indicated similar relations between verbal fluency variables for each group. Regression analyses predicting 2nd 15" interval scores suggested differentiation between controlled and automatic processing skills in both groups. Results suggest adequate automatic processing, but slowed development of controlled processing strategies in the BAP, and provide evidence for similar underlying cognitive constructs for both groups. Controlled processing was predictive of Block Design score for Non-BAP participants, and was predictive of Pragmatic Language score on the BAPQ for BAP participants. These results are similar to past research related to strengths and weaknesses in the BAP, respectively, and suggest that controlled processing strategy use may be required in instances of weak lower-level skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Meeting new challenges in radon measurements service with solid state alpha track analysis

    International Nuclear Information System (INIS)

    Calamosca, M.; Penzo, S.; Mustapha, A.O.

    2005-01-01

Full text: The national and international legislation on the control of exposures to radiation now covers exposures to naturally occurring radioactive materials (NORM), with particular emphasis on radon in workplaces. Consequently, many more working environments have now been brought into the realm of radiation monitoring. This is mirrored by a corresponding increase in the demand for radon monitoring services. The new challenges occasioned by the increase in demand are illustrated in this paper. The paper also describes the new integrated radon measurement system at the ENEA ION-IRP radon-in-air testing laboratory, designed to meet the new challenges. In this presentation emphasis is laid only on the special features, hardware and software, of the measurement system that directly result in enhancement of the overall throughput and the quality of the radon service. In addition, the use of a new technique based on numerical filtering of the main physical parameters of the alpha track analysis used to determine the radon exposure with a CR-39 detector is presented, with particular emphasis on the quality improvements of the measurement. The readout procedure adopted by the ENEA radon service is based on track analysis by light microscopy. Since the dimensions of the tracks are significantly affected by the light intensity, the illumination of the field of view (FOV) has to be as homogeneous as possible, in order to avoid sensitivity dependence on track position in the FOV. The lighting variation could be severe, especially at the borders or when a rectangular FOV is used. The coefficient of variation of the FOV light uniformity has to be periodically checked to keep the variation below an acceptable limit; if the results exceed this value, a remedial action is mandatory. The numerical correction procedure presented here proves effective. (author)

  1. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    Full Text Available In the field of noninvasive sensing techniques for civil infrastructures monitoring, this paper addresses the problem of crack detection, in the surface of the French national roads, by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution is about fine-defect detection in pavement surface. The approach is based on a multi-scale extraction and a Markovian segmentation. Third, an evaluation and comparison protocol which has been designed for evaluating this difficult task—the road pavement crack detection—is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.
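
    As a very simplified stand-in for the fine-defect detection step discussed above, dark and thin structures such as cracks can be emphasised with a morphological black-hat transform and then thresholded. This is not the paper's multi-scale Markovian method, only a common baseline for the same task; the structure size and threshold are assumptions.

```python
"""Simplified crack-enhancement baseline: black-hat transform plus threshold."""
import numpy as np
from scipy import ndimage

def crack_mask(gray, structure_size=7, threshold=30):
    """gray: 2-D uint8 road-surface image; returns a boolean crack mask."""
    closed = ndimage.grey_closing(gray, size=(structure_size, structure_size))
    black_hat = closed.astype(np.int16) - gray.astype(np.int16)  # dark fine details
    return black_hat > threshold

# Synthetic example: uniform pavement with one dark, thin "crack".
img = np.full((64, 64), 120, dtype=np.uint8)
img[32, 5:60] = 40
print(crack_mask(img).sum())   # number of pixels flagged as crack
```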

  2. Childhood trauma exposure disrupts the automatic regulation of emotional processing.

    Science.gov (United States)

    Marusak, Hilary A; Martin, Kayla R; Etkin, Amit; Thomason, Moriah E

    2015-03-13

    Early-life trauma is one of the strongest risk factors for later emotional psychopathology. Although research in adults highlights that childhood trauma predicts deficits in emotion regulation that persist decades later, it is unknown whether neural and behavioral changes that may precipitate illness are evident during formative, developmental years. This study examined whether automatic regulation of emotional conflict is perturbed in a high-risk urban sample of trauma-exposed children and adolescents. A total of 14 trauma-exposed and 16 age-, sex-, and IQ-matched comparison youth underwent functional MRI while performing an emotional conflict task that involved categorizing facial affect while ignoring an overlying emotion word. Engagement of the conflict regulation system was evaluated at neural and behavioral levels. Results showed that trauma-exposed youth failed to dampen dorsolateral prefrontal cortex activity and engage amygdala-pregenual cingulate inhibitory circuitry during the regulation of emotional conflict, and were less able to regulate emotional conflict. In addition, trauma-exposed youth showed greater conflict-related amygdala reactivity that was associated with diminished levels of trait reward sensitivity. These data point to a trauma-related deficit in automatic regulation of emotional processing, and increase in sensitivity to emotional conflict in neural systems implicated in threat detection. Aberrant amygdala response to emotional conflict was related to diminished reward sensitivity that is emerging as a critical stress-susceptibility trait that may contribute to the emergence of mental illness during adolescence. These results suggest that deficits in conflict regulation for emotional material may underlie heightened risk for psychopathology in individuals that endure early-life trauma.

  3. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  4. Automatic processing of unattended object features by functional connectivity

    Directory of Open Access Journals (Sweden)

    Katja Martina Mayer

    2013-05-01

Full Text Available Observers can selectively attend to object features that are relevant for a task. However, unattended task-irrelevant features may still be processed and possibly integrated with the attended features. This study investigated the neural mechanisms for processing both task-relevant (attended) and task-irrelevant (unattended) object features. The Garner paradigm was adapted for functional magnetic resonance imaging (fMRI) to test whether specific brain areas process the conjunction of features or whether multiple interacting areas are involved in this form of feature integration. Observers attended to shape, colour, or non-rigid motion of novel objects while unattended features changed from trial to trial (change blocks) or remained constant (no-change blocks) during a given block. This block manipulation allowed us to measure the extent to which unattended features affected neural responses, which would reflect the extent to which multiple object features are automatically processed. We did not find Garner interference at the behavioural level. However, we designed the experiment to equate performance across block types so that any fMRI results could not be due solely to differences in task difficulty between change and no-change blocks. Attention to specific features localised several areas known to be involved in object processing. No area showed larger responses on change blocks compared to no-change blocks. However, psychophysiological interaction analyses revealed that several functionally-localised areas showed significant positive interactions with areas in occipito-temporal and frontal areas that depended on block type. Overall, these findings suggest that both regional responses and functional connectivity are crucial for processing multi-featured objects.

  5. Automatic Control of Arc Process for Making Carbon Nanotubes

    Science.gov (United States)

    Scott, Carl D.; Pulumbarit, Robert B.; Victor, Joe

    2004-01-01

    An automatic-control system has been devised for a process in which carbon nanotubes are produced in an arc between a catalyst-filled carbon anode and a graphite cathode. The control system includes a motor-driven screw that adjusts the distance between the electrodes. The system also includes a bridge circuit that puts out a voltage proportional to the difference between (1) the actual value of potential drop across the arc and (2) a reference value between 38 and 40 V (corresponding to a current of about 100 A) at which the yield of carbon nanotubes is maximized. Utilizing the fact that the potential drop across the arc increases with the interelectrode gap, the output of the bridge circuit is fed to a motor-control circuit that causes the motor to move the anode toward or away from the cathode if the actual potential drop is more or less, respectively, than the reference potential. Thus, the system regulates the interelectrode gap to maintain the optimum potential drop. The system also includes circuitry that records the potential drop across the arc and the relative position of the anode holder as function of time.
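
    The regulation logic described above (compare the arc voltage with a reference and drive the electrode feed motor to null the error) can be sketched as a simple control step. The voltage source, motor interface, deadband and gain below are illustrative assumptions, not the actual circuit values.

```python
"""Sketch of the arc-gap regulation loop: arc voltage grows with the gap, so an
over-voltage error drives the anode toward the cathode. Gains are assumptions."""

V_REF = 39.0          # reference arc voltage, V (within the 38-40 V band above)
DEADBAND = 0.5        # ignore small errors, V (assumption)
GAIN = 2.0            # motor steps per volt of error (assumption)

def read_arc_voltage():
    """Placeholder for the bridge-circuit measurement."""
    return 41.2

def move_anode(steps):
    """Placeholder for the motor-driven screw; positive moves the anode toward the cathode."""
    print(f"move anode by {steps:+.1f} steps")

def control_step():
    error = read_arc_voltage() - V_REF      # positive error -> gap (and voltage) too large
    if abs(error) > DEADBAND:
        move_anode(GAIN * error)            # close the gap to bring the voltage down

control_step()   # one iteration; in practice this runs periodically
```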

  6. Automatic processing of CERN video, audio and photo archives

    Energy Technology Data Exchange (ETDEWEB)

    Kwiatek, M [CERN, Geneva (Switzerland)], E-mail: Michal.Kwiatek@cem.ch

    2008-07-15

    The digitalization of CERN audio-visual archives, a major task currently in progress, will generate over 40 TB of video, audio and photo files. Storing these files is one issue, but a far more important challenge is to provide long-time coherence of the archive and to make these files available on-line with minimum manpower investment. An infrastructure, based on standard CERN services, has been implemented, whereby master files, stored in the CERN Distributed File System (DFS), are discovered and scheduled for encoding into lightweight web formats based on predefined profiles. Changes in master files, conversion profiles or in the metadata database (read from CDS, the CERN Document Server) are automatically detected and the media re-encoded whenever necessary. The encoding processes are run on virtual servers provided on-demand by the CERN Server Self Service Centre, so that new servers can be easily configured to adapt to higher load. Finally, the generated files are made available from the CERN standard web servers with streaming implemented using Windows Media Services.

  7. Automatic processing of CERN video, audio and photo archives

    International Nuclear Information System (INIS)

    Kwiatek, M

    2008-01-01

    The digitalization of CERN audio-visual archives, a major task currently in progress, will generate over 40 TB of video, audio and photo files. Storing these files is one issue, but a far more important challenge is to provide long-time coherence of the archive and to make these files available on-line with minimum manpower investment. An infrastructure, based on standard CERN services, has been implemented, whereby master files, stored in the CERN Distributed File System (DFS), are discovered and scheduled for encoding into lightweight web formats based on predefined profiles. Changes in master files, conversion profiles or in the metadata database (read from CDS, the CERN Document Server) are automatically detected and the media re-encoded whenever necessary. The encoding processes are run on virtual servers provided on-demand by the CERN Server Self Service Centre, so that new servers can be easily configured to adapt to higher load. Finally, the generated files are made available from the CERN standard web servers with streaming implemented using Windows Media Services
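
    The pipeline described in the two records above amounts to detecting changed master files and re-encoding them into lightweight profiles. The sketch below shows that change-detection-and-encode loop in generic form; the paths, profile options and the ffmpeg invocation are assumptions, and the real system uses CERN's DFS storage, CDS metadata and Windows Media formats rather than this setup.

```python
"""Generic sketch of a change-detection and re-encoding loop. Paths, profiles
and the encoder command are assumptions for illustration."""
import hashlib
import pathlib
import subprocess

PROFILES = {"web480": ["-vf", "scale=-2:480", "-b:v", "1M"]}   # illustrative profile
state = {}                                                     # master file -> digest

def digest(path: pathlib.Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_and_encode(master_dir="masters", out_dir="web"):
    masters = pathlib.Path(master_dir)
    if not masters.is_dir():
        return                               # nothing to scan yet
    pathlib.Path(out_dir).mkdir(exist_ok=True)
    for master in masters.glob("*.mov"):
        d = digest(master)
        if state.get(master.name) == d:
            continue                         # master unchanged -> nothing to do
        for name, opts in PROFILES.items():
            target = pathlib.Path(out_dir) / f"{master.stem}_{name}.mp4"
            # Re-encode with ffmpeg (assumed to be available on the worker).
            subprocess.run(["ffmpeg", "-y", "-i", str(master), *opts, str(target)],
                           check=True)
        state[master.name] = d               # remember the version just encoded

scan_and_encode()
```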

  8. Algorithm of automatic generation of technology process and process relations of automotive wiring harnesses

    Institute of Scientific and Technical Information of China (English)

    XU Benzhu; ZHU Jiman; LIU Xiaoping

    2012-01-01

Identifying each process and its constraint relations from complex wiring harness drawings quickly and accurately is the basis for formulating process routes. According to knowledge of automotive wiring harnesses and the characteristics of wiring harness components, we established a wiring harness graph model. We then study an algorithm for identifying technology processes automatically; finally, we describe the relationships between processes by introducing a constraint matrix, in order to lay a good foundation for harness process planning and production scheduling.
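
    A constraint matrix of the kind mentioned above encodes which process must precede which, and a feasible process sequence can then be read off by topological ordering. The process names below are invented; the paper derives these relations from the wiring harness graph model.

```python
"""Illustrative process constraint (precedence) matrix and a feasible ordering."""
import numpy as np

processes = ["cut_wires", "crimp_terminals", "assemble_connectors", "tape_bundle"]
n = len(processes)

# constraint[i, j] = 1 means process i must be finished before process j starts.
constraint = np.zeros((n, n), dtype=int)
constraint[0, 1] = 1      # cutting before crimping
constraint[1, 2] = 1      # crimping before connector assembly
constraint[2, 3] = 1      # assembly before taping

def feasible_sequence(constraint):
    """Simple topological ordering derived from the constraint matrix."""
    remaining = list(range(len(constraint)))
    order = []
    while remaining:
        # pick a process with no unfinished predecessors
        ready = next(j for j in remaining if not any(constraint[i, j] for i in remaining))
        order.append(ready)
        remaining.remove(ready)
    return [processes[i] for i in order]

print(feasible_sequence(constraint))
# -> ['cut_wires', 'crimp_terminals', 'assemble_connectors', 'tape_bundle']
```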

  9. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

Full Text Available This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive process, while the automatic association concerning safety measured by an Implicit Association Test (IAT) reflects employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variances in safety behavior. Specifically, the safety behaviors of employees with lower level of inhibitory control are influenced more by automatic association, whereas those of employees with higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and the relative importance of these cognitive processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.

  10. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  11. Automatic system of production, transfer and processing of coin targets for the production of metallic radioisotopes

    Science.gov (United States)

    Pellicioli, M.; Ouadi, A.; Marchand, P.; Foehrenbacher, T.; Schuler, J.; Dick-Schuler, N.; Brasse, D.

    2017-05-01

The work presented in this paper gathers three main technical developments aiming at 1) optimizing nuclide production by means of solid targets, 2) automatically transferring coin targets from the vault to the hotcell without human intervention, and 3) automatically processing target dilution and purification in the hotcell. This system has been installed on an ACSI TR24 cyclotron in Strasbourg, France.

  12. Automatic discovery of data-centric and artifact-centric processes

    NARCIS (Netherlands)

    Nooijen, E.H.J.; Dongen, van B.F.; Fahland, D.; La Rosa, M.; Soffer, P.

    2013-01-01

    Process discovery is a technique that allows for automatically discovering a process model from recorded executions of a process as it happens in reality. This technique has successfully been applied for classical processes where one process execution is constituted by a single case with a unique

  13. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, the uniform corrosion is characterized by texture attributes extracted from co-occurrence matrix and the Self Organizing Mapping (SOM) clustering algorithm. We present a technique for automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustful and robust early failure detection systems. (author)
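
    The texture-attribute step described above (co-occurrence statistics computed from the image) can be sketched directly in a few lines. The quantisation level, pixel offset and the two attributes below are illustrative choices, and the SOM clustering stage itself is not shown.

```python
"""Sketch of grey-level co-occurrence matrix (GLCM) texture attributes that could
feed a clustering stage such as a SOM. Parameters are illustrative assumptions."""
import numpy as np

def glcm(gray, levels=8, offset=(0, 1)):
    """Normalised co-occurrence matrix for pixel pairs separated by `offset`."""
    q = (gray.astype(float) / 256.0 * levels).astype(int)      # quantise to `levels`
    dy, dx = offset
    a = q[max(0, -dy):q.shape[0] - max(0, dy), max(0, -dx):q.shape[1] - max(0, dx)]
    b = q[max(0, dy):, max(0, dx):][:a.shape[0], :a.shape[1]]
    m = np.zeros((levels, levels))
    np.add.at(m, (a.ravel(), b.ravel()), 1)
    return m / m.sum()

def texture_features(gray):
    p = glcm(gray)
    i, j = np.indices(p.shape)
    return {"contrast": float(np.sum(p * (i - j) ** 2)),   # high for rough, corroded texture
            "energy": float(np.sum(p ** 2))}               # high for uniform surfaces

rng = np.random.default_rng(1)
smooth = np.full((64, 64), 128, dtype=np.uint8)              # stand-in for a clean patch
rough = rng.integers(0, 256, (64, 64), dtype=np.uint8)       # stand-in for a corroded patch
print(texture_features(smooth), texture_features(rough))
```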

  14. Study of automatic boat loading unit and horizontal sintering process of uranium dioxide pellet

    International Nuclear Information System (INIS)

    He Zhongjing; Chen Yu; Yao Dengfeng; Wang Youliang; Shu Binhua; Wu Genjiu

    2014-01-01

The sintering process is a key process in the manufacture of nuclear fuel UO2 pellets. In our factory, a continuous high-temperature sintering furnace is used for the sintering process. During the sintering of green pellets, the furnace, the boat and the stacking arrangement can influence the quality of the final product. In this text, on the basis of earlier process research, the automatic boat loading unit and the horizontal sintering process are studied successively. The results show that the physical and chemical properties of the products manufactured with the automatic boat loading unit and the horizontal sintering process completely meet the technique requirements, and that the system is reliable and continuous. (authors)

  15. Dissociation between controlled and automatic processes in the behavioral variant of fronto-temporal dementia.

    Science.gov (United States)

    Collette, Fabienne; Van der Linden, Martial; Salmon, Eric

    2010-01-01

A decline of cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes, by using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained a lower performance than CS for the estimates of controlled processes, but no group difference was observed for estimates of automatic processes. The between-groups comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, while a slightly more important contribution of controlled processes was observed in control subjects. These results are clearly indicative of an alteration of controlled memory processes in bv-FTD.

  16. Automatic Gap Detection in Friction Stir Welding Processes (Preprint)

    National Research Council Canada - National Science Library

    Yang, Yu; Kalya, Prabhanjana; Landers, Robert G; Krishnamurthy, K

    2006-01-01

    .... This paper develops a monitoring algorithm to detect gaps in Friction Stir Welding (FSW) processes. Experimental studies are conducted to determine how the process parameters and the gap width affect the welding process...

  17. Automatic welding processes for reactor coolant pipes used in PWR type nuclear power plant

    International Nuclear Information System (INIS)

    Hamada, T.; Nakamura, A.; Nagura, Y.; Sakamoto, N.

    1979-01-01

    The authors developed automatic welding processes (submerged arc welding process and TIG welding process) for application to the welding of reactor coolant pipes which constitute the most important part of the PWR type nuclear power plant. Submerged arc welding process is suitable for flat position welding in which pipes can be rotated, while TIG welding process is suitable for all position welding. This paper gives an outline of the two processes and the results of tests performed using these processes. (author)

  18. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.
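    The abstract names the three strategies without detailing them; the sketch below illustrates only the greedy idea, fixing one configurable node at a time to the option that maximizes a fitness score of the configured model against the event log. `configure` and `fitness` are hypothetical placeholders, not functions from the authors' framework.

    ```python
    # Hypothetical greedy strategy: configure nodes one by one, keeping the
    # option with the best replay fitness against the event log.
    def greedy_configure(config_model, configurable_nodes, options, log,
                         configure, fitness):
        model = config_model
        for node in configurable_nodes:               # note: choices interact, so
            best_opt, best_fit = None, float("-inf")  # the visiting order matters
            for opt in options[node]:
                f = fitness(configure(model, node, opt), log)
                if f > best_fit:
                    best_opt, best_fit = opt, f
            model = configure(model, node, best_opt)
        return model
    ```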

  19. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  20. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    The practice of statistical data processing for radiation monitoring is exemplified, and some of the results obtained are presented. Experience in the practical application of mathematical statistics to radiation monitoring data made it possible to develop a concrete statistical processing algorithm implemented on an M-6000 minicomputer. The algorithm is divided into three parts: parametric data processing with hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The algorithm was used to process data observed over a radioactive waste disposal control region; results of surface water monitoring processing are presented
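    As a rough modern equivalent of the three parts of the algorithm (parametric processing with hypothesis testing, pair correlation, multiple correlation), the sketch below uses NumPy/SciPy; it is illustrative only and obviously not the original M-6000 code.

    ```python
    # Illustrative re-statement of the three processing parts in NumPy/SciPy.
    import numpy as np
    from scipy import stats

    def parametric_summary(x, mu0=0.0):
        """Mean, sample SD and a one-sample t-test of the mean against mu0."""
        t, p = stats.ttest_1samp(x, popmean=mu0)
        return {"mean": float(np.mean(x)), "std": float(np.std(x, ddof=1)),
                "t": float(t), "p": float(p)}

    def pair_correlation(x, y):
        """Pearson correlation coefficient and its p-value."""
        return stats.pearsonr(x, y)

    def multiple_correlation(X, y):
        """Multiple correlation coefficient R of y on the columns of X."""
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        ss_tot = (y - y.mean()) @ (y - y.mean())
        return float(np.sqrt(1.0 - (resid @ resid) / ss_tot))
    ```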

  1. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    Science.gov (United States)

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.

  2. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  3. Towards Automatic Capturing of Manual Data Processing Provenance

    NARCIS (Netherlands)

    Wombacher, Andreas; Huq, M.R.

    2011-01-01

    Often data processing is not implemented by a workflow system or an integration application but is performed manually by humans along the lines of a more or less specified procedure. Collecting provenance information during manual data processing cannot be automated. Further, manual collection of

  4. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    Science.gov (United States)

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
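    The abstract lists ASAP's pre-processing stages without giving formulas; for orientation only, the sketch below shows the single-compartment CBF quantification commonly used for (p)CASL data. It is not necessarily ASAP's exact implementation, and the parameter values are typical 3T defaults rather than values from the paper.

    ```python
    # Hedged sketch: common single-compartment (p)CASL quantification, giving
    # CBF in ml/100 g/min from control, label and M0 images (numpy arrays).
    import numpy as np

    def quantify_cbf(control, label, m0, pld=1.8, tau=1.8,
                     t1_blood=1.65, alpha=0.85, lam=0.9):
        dm = control - label                         # perfusion-weighted signal
        num = 6000.0 * lam * dm * np.exp(pld / t1_blood)
        den = 2.0 * alpha * t1_blood * m0 * (1.0 - np.exp(-tau / t1_blood))
        return num / np.maximum(den, 1e-9)           # guard against division by zero
    ```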

  5. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David; Gereige, Issam; Gourgon, Cé cile

    2013-01-01

    patterns, sticking. In this paper, microscopic imaging combined to a specific processing algorithm is used to detect numerically defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications

  6. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad; Canini, Marco

    2017-01-01

    for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing

  7. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Full Text Available Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.

  8. Measuring automatic retrieval: a comparison of implicit memory, process dissociation, and speeded response procedures.

    Science.gov (United States)

    Horton, Keith D; Wilson, Daryl E; Vonk, Jennifer; Kirby, Sarah L; Nielsen, Tina

    2005-07-01

    Using the stem completion task, we compared estimates of automatic retrieval from an implicit memory task, the process dissociation procedure, and the speeded response procedure. Two standard manipulations were employed. In Experiment 1, a depth of processing effect was found on automatic retrieval using the speeded response procedure although this effect was substantially reduced in Experiment 2 when lexical processing was required of all words. In Experiment 3, the speeded response procedure showed an advantage of full versus divided attention at study on automatic retrieval. An implicit condition showed parallel effects in each study, suggesting that implicit stem completion may normally provide a good estimate of automatic retrieval. Also, we replicated earlier findings from the process dissociation procedure, but estimates of automatic retrieval from this procedure were consistently lower than those from the speeded response procedure, except when conscious retrieval was relatively low. We discuss several factors that may contribute to the conflicting outcomes, including the evidence for theoretical assumptions and criterial task differences between implicit and explicit tests.

  9. Process Concepts for Semi-automatic Dismantling of LCD Televisions

    OpenAIRE

    Elo, Kristofer; Sundin, Erik

    2014-01-01

    There is a large variety of electrical and electronic equipment products, for example liquid crystal display television sets (LCD TVs), in the waste stream today. Many LCD TVs contain mercury, which is a challenge to treat at the recycling plants. Two current used processes to recycle LCD TVs are automated shredding and manual disassembly. This paper aims to present concepts for semi-automated dismantling processes for LCD TVs in order to achieve higher productivity and flexibility, and in tu...

  10. Automatic system for processing the plasma radiation spectra

    International Nuclear Information System (INIS)

    Isakaev, Eh.Kh.; Markin, A.V.; Khajmin, V.A.; Chinnov, V.F.

    2001-01-01

    The problem addressed is providing computer processing of experimental data acquired with present-day data acquisition systems in plasma studies. Rather simple and reliable processing programs were developed. The system is used for quantitative plasma spectroscopy, the classical and most widely used method for analyzing the parameters and properties of low-temperature and high-temperature plasma [ru

  11. Method for Processing Liver Spheroids Using an Automatic Tissue Processor

    Science.gov (United States)

    2016-05-01

    alcohol dehydration and hot liquid wax infiltration. After the water in the tissue is replaced with wax and cooled, it then becomes possible to cut... effective for processing and preparing microscopy slides of liver spheroids. The general process involved formalin fixation, dehydration in a... DPBS; formalin (37% neutral buffered formaldehyde); a series of alcohol solutions: 70, 80, 95, and 100% ethanol in water; xylene

  12. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  13. Dialog system for automatic data input/output and processing with two BESM-6 computers

    International Nuclear Information System (INIS)

    Belyaev, Y.N.; Gorlov, Y.P.; Makarychev, S.V.; Monakov, A.A.; Shcherbakov, S.A.

    1985-01-01

    This paper presents a system for conducting experiments with fully automatic processing of data from multichannel recorders in dialog mode. The system acquires data at a rate of 2.5 × 10³ readings per second, processes them in real time, and outputs digital and graphical material in a multitasking environment

  14. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  15. Microsoft excel's automatic data processing and diagram drawing of RIA internal quality control parameters

    International Nuclear Information System (INIS)

    Zeng Pingfan; Liu Guoqiang

    2006-01-01

    We performed automatic data processing and chart drawing for various parameters of RIA internal quality control (IQC) using Microsoft Excel (ME). Using the AVERAGE and STDEV functions, we obtained x-bar, s, and CV%; using PEARSON, we obtained the serum quality control coefficient (r). By entering the original data into the chart's self-definition item, the chart was drawn automatically. Using logical tests, we obtained quality control judgements including the status, timing, and data of the various quality control parameters. Over the past four years, the ME-based data processing, chart drawing, and quality control judging have proven accurate, convenient, and correct. The approach is quick and easy to manage, and automatic computer processing of RIA IQC was realized. Conclusion: the method is applicable to all types of RIA IQC. (authors)
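    For illustration, the same parameters can be computed outside the spreadsheet; the snippet below mirrors the AVERAGE/STDEV/PEARSON calculations described above (with CV% taken as 100·s/x-bar), using made-up control values.

    ```python
    # Mirror of the spreadsheet calculations (illustrative, made-up data).
    import statistics

    def iqc_parameters(values):
        xbar = statistics.mean(values)
        s = statistics.stdev(values)          # sample SD, like Excel's STDEV
        return xbar, s, 100.0 * s / xbar      # x-bar, s, CV%

    control_a = [5.1, 4.9, 5.3, 5.0, 4.8]     # example control-serum results
    control_b = [5.2, 4.8, 5.4, 5.1, 4.7]
    xbar, s, cv = iqc_parameters(control_a)
    r = statistics.correlation(control_a, control_b)   # Pearson r, Python 3.10+
    print(f"x-bar={xbar:.2f}  s={s:.2f}  CV%={cv:.1f}  r={r:.2f}")
    ```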

  16. Automatic processing of isotopic dilution curves obtained by precordial detection

    International Nuclear Information System (INIS)

    Verite, J.C.

    1973-01-01

    Dilution curves pose two distinct problems: that of their acquisition and that of their processing. A study devoted to the latter aspect only was presented. It was necessary to satisfy two important conditions: the treatment procedure, although applied to a single category of curves (isotopic dilution curves obtained by precordial detection), had to be as general as possible; to allow dissemination of the method the equipment used had to be relatively modest and inexpensive. A simple method, considering the curve processing as a process identification, was developed and should enable the mean heart cavity volume and certain pulmonary circulation parameters to be determined. Considerable difficulties were encountered, limiting the value of the results obtained though not condemning the method itself. The curve processing question raised the problem of their acquisition, i.e. the number of these curves and their meaning. A list of the difficulties encountered is followed by a set of possible solutions, a solution being understood to mean a curve processing combination where the overlapping between the two aspects of the problem is accounted for [fr

  17. Application of parallel processing for automatic inspection of printed circuits

    International Nuclear Information System (INIS)

    Lougheed, R.M.

    1986-01-01

    Automated visual inspection of printed electronic circuits is a challenging application for image processing systems. Detailed inspection requires high speed analysis of gray scale imagery along with high quality optics, lighting, and sensing equipment. A prototype system has been developed and demonstrated at the Environmental Research Institute of Michigan (ERIM) for inspection of multilayer thick-film circuits. The central problem of real-time image processing is solved by a special-purpose parallel processor which includes a new high-speed Cytocomputer. In this chapter the inspection process and the algorithms used are summarized, along with the functional requirements of the machine vision system. Next, the parallel processor is described in detail and then performance on this application is given

  18. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process, and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, owing to the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, speedy training of young specialists is being achieved, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At large hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.

  19. The Development from Effortful to Automatic Processing in Mathematical Cognition.

    Science.gov (United States)

    Kaye, Daniel B.; And Others

    This investigation capitalizes upon information processing models that depend upon measurement of the latency of response to a mathematical problem and the decomposition of reaction time (RT). Simple two-term addition problems were presented with possible solutions for true-false verification, and accuracy and RT were recorded. Total…

  20. Identification and structuring of data for automatic processing

    International Nuclear Information System (INIS)

    Wohland, H.; Rexer, G.; Ruehle, R.

    1976-01-01

    The data structure of a technical and scientific application system is described. The description of the structure is divided into different sections in which the user can describe his own data. By fixing a section of this structure, a high degree of automation of the problem-solving process can be achieved while preserving flexibility. (orig.) [de

  1. Process for automatic filling of nuclear fuel rod cans

    International Nuclear Information System (INIS)

    Bezold, H.

    1977-01-01

    A drying section is inserted in the production line for the automation of the filling process for fuel rods with nuclear fuel pellets. The pellets are taken in a drum magazine to a drying furnace and then pushed out one after the other into the can to be filled. (TK) [de

  2. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.

  3. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. This technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to detect defects in printed patterns numerically. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device that captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.
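    The abstract does not spell out the processing algorithm; purely as an illustration of numerical defect detection and statistics, the sketch below subtracts a defect-free reference image of the grating, thresholds the residual and counts connected components. The reference-image approach and the threshold are assumptions, not the paper's method.

    ```python
    # Generic illustration (not the paper's algorithm): residual thresholding
    # against a defect-free reference, then connected-component statistics.
    import numpy as np
    from scipy import ndimage

    def count_defects(image, reference, rel_threshold=0.2):
        residual = np.abs(image.astype(float) - reference.astype(float))
        mask = residual > rel_threshold * residual.max()
        labels, n = ndimage.label(mask)                 # connected components
        areas = ndimage.sum(mask, labels, index=range(1, n + 1))
        return n, areas                                 # defect count and sizes
    ```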

  4. Automatic Optimization of Hardware Accelerators for Image Processing

    OpenAIRE

    Reiche, Oliver; Häublein, Konrad; Reichenbach, Marc; Hannig, Frank; Teich, Jürgen; Fey, Dietmar

    2015-01-01

    In the domain of image processing, often real-time constraints are required. In particular, in safety-critical applications, such as X-ray computed tomography in medical imaging or advanced driver assistance systems in the automotive domain, timing is of utmost importance. A common approach to maintain real-time capabilities of compute-intensive applications is to offload those computations to dedicated accelerator hardware, such as Field Programmable Gate Arrays (FPGAs). Programming such arc...

  5. The Structure of Processing Resource Demands in Monitoring Automatic Systems.

    Science.gov (United States)

    1981-01-01

    Attempts at modelling the human failure detection process have continually focused on normative predictions of optimal operator behavior (Smallwood...). From Broadbent's filter model (Broadbent, 1957), to Treisman's attenuation model (Treisman, 1964), to Norman's late selection model (Norman, 1968), the concept...

  6. Marcoule pilot work-room: process automatic operation

    International Nuclear Information System (INIS)

    Mus, G.; Linger, C.

    1987-01-01

    Commissioned in the early 1960s, the Marcoule Pilot Plant has undergone a series of sweeping transformations. The research and development resources concerning irradiated fuel processing have been expanded and modified. Its reprocessing capacity has also been raised from 2 to 5 t/year. Simultaneously, the installation control system was completely remodelled. The control consoles, which were previously positioned locally near the different units, have been grouped together in a centralized control room. To do this, the measurement and operating circuits were replaced by new data acquisition and processing systems requiring the use of numerical algorithms. The management and control of certain units, including mechanical fuel preparation, sampling, and sample transport to the laboratories, have been entrusted to programmable logic controllers. Certain unit operations, such as concentration by evaporation, are completely automated. These new arrangements will expand the resources for analysing the operation of the Pilot Plant while offering a more comprehensive view of the operations. They have been made possible by a major effort in sensor development, and represent the indispensable prerequisite for the installation of expert systems [fr

  7. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
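    ExcelAutomat itself is VBA living inside an Excel/LibreOffice workbook; purely to illustrate the kind of repetitive tasks it automates (unique filenames, pulling a selected value out of many output files), here is a small stand-alone sketch. The "TOTAL ENERGY" keyword is a hypothetical example of an extracted parameter, not a documented ExcelAutomat feature.

    ```python
    # Stand-alone illustration of typical file-processing chores (not VBA,
    # not the ExcelAutomat code itself).
    from pathlib import Path

    def unique_name(directory, stem, suffix=".inp"):
        """Return stem.inp, stem_1.inp, stem_2.inp, ... whichever does not exist yet."""
        d = Path(directory)
        candidate, n = d / f"{stem}{suffix}", 1
        while candidate.exists():
            candidate = d / f"{stem}_{n}{suffix}"
            n += 1
        return candidate

    def extract_parameter(out_files, keyword="TOTAL ENERGY"):
        """Map each output file to the last line containing the keyword."""
        table = {}
        for f in out_files:
            lines = Path(f).read_text(errors="ignore").splitlines()
            hits = [ln.strip() for ln in lines if keyword in ln]
            table[Path(f).name] = hits[-1] if hits else None
        return table
    ```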

  8. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad

    2017-09-27

    Optimizing the performance of big-data streaming applications has become a daunting and time-consuming task: parameters may be tuned from a space of hundreds or even thousands of possible configurations. In this paper, we present a framework for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing three benchmark applications in Apache Storm. Our results show that a hill-climbing algorithm that uses a new heuristic sampling approach based on Latin Hypercube provides the best results. Our gray-box algorithm provides comparable results while being two to five times faster.
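    The gray-box algorithm is not described in this abstract; the sketch below illustrates only the reported best performer, hill climbing seeded with a Latin-Hypercube-style sample over a discrete parameter grid. `measure(config)` stands in for an actual benchmark run (for example, measured throughput of a Storm topology) and is not part of the authors' framework.

    ```python
    # Illustrative hill climbing with Latin-Hypercube-style seeding.
    import random

    def lhs_samples(grid, k, rng):
        """One value per stratum for each parameter (simple discrete LHS)."""
        cols = {}
        for name, vals in grid.items():
            idx = [int(i * len(vals) / k) for i in range(k)]
            rng.shuffle(idx)
            cols[name] = [vals[i] for i in idx]
        return [{n: cols[n][i] for n in grid} for i in range(k)]

    def hill_climb(grid, measure, seeds=5, iters=50, seed=0):
        rng = random.Random(seed)
        best = max(lhs_samples(grid, seeds, rng), key=measure)
        best_score = measure(best)
        for _ in range(iters):
            cand = dict(best)
            p = rng.choice(list(grid))              # perturb one parameter
            cand[p] = rng.choice(grid[p])
            score = measure(cand)
            if score > best_score:
                best, best_score = cand, score
        return best, best_score
    ```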

  9. Adaptive Automatic Gauge Control of a Cold Strip Rolling Process

    Directory of Open Access Journals (Sweden)

    ROMAN, N.

    2010-02-01

    Full Text Available The paper addresses the thickness control structure for cold-rolled strips. This structure is based on the roll-position control of a reversible quarto rolling mill. The main feature of the proposed system is the compensation of the errors introduced by the deficient dynamics of the hydraulic servo-system used for roll positioning, by means of a dynamic compensator that approximates the inverse of the servo-system. Because the servo-system is considered time-variant, on-line identification of the servo-system and parameter adaptation of the compensator are performed. The results obtained by numerical simulation are presented together with data taken from the real process. These results illustrate the efficiency of the proposed solutions.
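    The abstract describes the compensator only conceptually; as a hedged sketch of the idea (not the paper's design), the snippet below identifies a first-order discrete servo model online with recursive least squares and uses its approximate inverse to turn the desired roll-position reference into a servo command.

    ```python
    # Hedged sketch: online RLS identification of y[k] = a*y[k-1] + b*u[k-1]
    # and an approximate-inverse compensator built from the current estimates.
    import numpy as np

    class AdaptiveInverseCompensator:
        def __init__(self, forgetting=0.99):
            self.theta = np.array([0.5, 0.5])    # [a, b] initial guess
            self.P = np.eye(2) * 1e3             # RLS covariance
            self.lam = forgetting

        def identify(self, y_prev, u_prev, y_now):
            """Recursive least squares update from one new sample."""
            phi = np.array([y_prev, u_prev])
            k = self.P @ phi / (self.lam + phi @ self.P @ phi)
            self.theta = self.theta + k * (y_now - phi @ self.theta)
            self.P = (self.P - np.outer(k, phi) @ self.P) / self.lam

        def command(self, reference_next, y_now):
            """Approximate inverse: servo input expected to reach the reference."""
            a, b = self.theta
            b = b if abs(b) > 1e-6 else 1e-6     # avoid division by ~zero
            return (reference_next - a * y_now) / b
    ```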

  10. Automatic Methods in Image Processing and Their Relevance to Map-Making.

    Science.gov (United States)

    1981-02-11

    folding frequency = 0.5) and s is the "shaping factor" which controls the spatial frequency content of the signal; the signal bandwidth increases...

  11. Automatic/Control Processing Concepts and Their Implications for the Training of Skills.

    Science.gov (United States)

    1982-04-01

    driving a car are examples of automatic processes. Controlled processing is comparatively slow, serial, limited by short-term memory, and requires subject effort... development has convinced us that motivation is often more important than ... automatic ... Motivation is much more

  12. 10 CFR 95.49 - Security of automatic data processing (ADP) systems.

    Science.gov (United States)

    2010-01-01

    10 CFR 95.49 (Energy, 2010-01-01): Security of automatic data processing (ADP) systems. Section 95.49, Energy, Nuclear Regulatory Commission (Continued), Facility Security Clearance and Safeguarding of National Security Information and Restricted Data, Control of Information. § 95.49 Security of...

  13. Automatic and Controlled Processing in Sentence Recall: The Role of Long-Term and Working Memory

    Science.gov (United States)

    Jefferies, E.; Lambon Ralph, M.A.; Baddeley, A.D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley,…

  14. The Development of Automatic and Controlled Inhibitory Retrieval Processes in True and False Recall

    Science.gov (United States)

    Knott, Lauren M.; Howe, Mark L.; Wimmer, Marina C.; Dewhurst, Stephen A.

    2011-01-01

    In three experiments, we investigated the role of automatic and controlled inhibitory retrieval processes in true and false memory development in children and adults. Experiment 1 incorporated a directed forgetting task to examine controlled retrieval inhibition. Experiments 2 and 3 used a part-set cue and retrieval practice task to examine…

  15. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    Science.gov (United States)

    Melton, Jessica S.

    The objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rationale, had two components: (1) a stored dictionary of words…

  16. REALIZATION OF TRAINING PROGRAMME ON THE BASIS OF LINGUISTIC DATABASE FOR AUTOMATIC TEXTS PROCESSING SYSTEM

    Directory of Open Access Journals (Sweden)

    M. A. Makarych

    2016-01-01

    Full Text Available Owing to the constant increase in electronic textual information, modern society needs automatic processing of natural language (NL). The main purpose of NL automatic text processing systems is to analyze and create texts and represent their content. The purpose of this paper is the development of the linguistic and software bases of an automatic system for processing English publicistic texts. The article discusses examples of different approaches to the creation of linguistic databases for processing systems. The author gives a detailed description of the basic building blocks of a new linguistic processor: lexical-semantic, syntactic and semantic-syntactic. The main advantage of the processor is the use of special semantic codes in the alphabetical dictionary. The semantic codes have been developed in accordance with a lexical-semantic classification. This helps to precisely define the semantic functions of keywords situated in parsing groups and allows the automatic system to avoid typical mistakes. The author also presents the realization of the developed linguistic database in the form of a training computer program.

  17. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind

    NARCIS (Netherlands)

    Nentjes, L.; Bernstein, D.; Arntz, A.; van Breukelen, G.; Slaats, M.

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing on ToM performance in

  18. Memory biases in remitted depression: the role of negative cognitions at explicit and automatic processing levels.

    Science.gov (United States)

    Romero, Nuria; Sanchez, Alvaro; Vazquez, Carmelo

    2014-03-01

    Cognitive models propose that depression is caused by dysfunctional schemas that endure beyond the depressive episode, representing vulnerability factors for recurrence. However, research testing negative cognitions linked to dysfunctional schemas in formerly depressed individuals is still scarce. Furthermore, negative cognitions are presumed to be linked to biases in recalling negative self-referent information in formerly depressed individuals, but no studies have directly tested this association. In the present study, we evaluated differences between formerly and never-depressed individuals in several experimental indices of negative cognitions and their associations with the recall of emotional self-referent material. Formerly (n = 30) and never depressed individuals (n = 40) completed measures of explicit (i.e., scrambled sentence test) and automatic (i.e., lexical decision task) processing to evaluate negative cognitions. Furthermore participants completed a self-referent incidental recall task to evaluate memory biases. Formerly compared to never depressed individuals showed greater negative cognitions at both explicit and automatic levels of processing. Results also showed greater recall of negative self-referent information in formerly compared to never-depressed individuals. Finally, individual differences in negative cognitions at both explicit and automatic levels of processing predicted greater recall of negative self-referent material in formerly depressed individuals. Analyses of the relationship between explicit and automatic processing indices and memory biases were correlational and the majority of participants in both groups were women. Our findings provide evidence of negative cognitions in formerly depressed individuals at both automatic and explicit levels of processing that may confer a cognitive vulnerability to depression. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter

    Directory of Open Access Journals (Sweden)

    Hili Eidlin-Levy

    2017-12-01

    Full Text Available The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  20. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter.

    Science.gov (United States)

    Eidlin-Levy, Hili; Rubinsten, Orly

    2017-01-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  1. Intentional and Automatic Numerical Processing as Predictors of Mathematical Abilities in Primary School Children

    Directory of Open Access Journals (Sweden)

    Violeta ePina

    2015-03-01

    Full Text Available Previous studies have suggested that numerical processing relates to mathematical performance, but it seems that such relationship is more evident for intentional than for automatic numerical processing. In the present study we assessed the relationship between the two types of numerical processing and specific mathematical abilities in a sample of 109 children in grades 1 to 6. Participants were tested in an ample range of mathematical tests and also performed both a numerical and a size comparison task. The results showed that numerical processing related to mathematical performance only when inhibitory control was involved in the comparison tasks. Concretely, we found that intentional numerical processing, as indexed by the numerical distance effect in the numerical comparison task, was related to mathematical reasoning skills only when the task-irrelevant dimension (the physical size was incongruent; whereas automatic numerical processing, indexed by the congruency effect in the size comparison task, was related to mathematical calculation skills only when digits were separated by small distance. The observed double dissociation highlights the relevance of both intentional and automatic numerical processing in mathematical skills, but when inhibitory control is also involved.

  2. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    Science.gov (United States)

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  3. AUTOMR: An automatic processing program system for the molecular replacement method

    International Nuclear Information System (INIS)

    Matsuura, Yoshiki

    1991-01-01

    An automatic processing program system for the molecular replacement method, AUTOMR, is presented. The program solves the initial model of the target crystal structure using a homologous molecule as the search model. It processes the structure-factor calculation of the model molecule, the rotation function, the translation function and the rigid-group refinement successively in one computer job. Test calculations were performed for six protein crystals and the structures were solved in all of these cases. (orig.)

  4. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Science.gov (United States)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  5. Sensitometric comparison of E and F dental radiographic films using manual and automatic processing systems

    Directory of Open Access Journals (Sweden)

    Dabaghi A.

    2008-04-01

    Full Text Available Background and Aim: Processing conditions affect the sensitometric properties of X-ray films. In this study, we aimed to evaluate the sensitometric characteristics of InSight (IP), a new F-speed film, in fresh and used processing solutions under dental office conditions and compare them with Ektaspeed Plus (EP). Materials and Methods: In this experimental in vitro study, an aluminium step wedge was used to construct characteristic curves for InSight and Ektaspeed Plus films (Kodak Eastman, Rochester, USA). All films were processed in Champion solution (X-ray Iran, Tehran, Iran), both manually and automatically, over a period of six days. Unexposed films of both types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the ISO definition. Data were analyzed using one-way ANOVA and t-tests with P<0.05 as the level of significance. Results: IP was 20 to 22% faster than EP and was shown to be an F-speed film when processed automatically and an E-F-speed film when processed manually. It was also F-speed in fresh solution and E-speed in old solution. IP and EP contrasts were similar with automatic processing, but EP contrast was higher when processed manually. Both EP and IP films had standard base-plus-fog values (<0.35), and B+F densities decreased in old solution. Conclusion: Based on the results of this study, InSight is an F-speed film with a speed at least 20% greater than Ektaspeed. In addition, it reduces patient exposure with no damage to image quality.

  6. Process and equipment for automatic measurement of resonant frequencies in seismic detectors

    International Nuclear Information System (INIS)

    Fredriksson, O.A.; Thomas, E.L.

    1977-01-01

    This is a process for the automatic indication of the resonant frequency of one or more detector elements which have operated inside a geophysical data-gathering system. Geophones or hydrophones or groups of both instruments are to be understood as comprising the detector elements. The invention concerns the creation of a process and of equipment working with laboratory precision, although it can be used in the field. (orig./RW) [de

  7. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work presents the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows control strategies for supplying drying air under specified conditions of temperature and humidity to be created and improved. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is a Dynamic Data Exchange (DDE) server that allows direct communication between the control unit and the computer used to build the experimental curves.

  8. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury eShtyrov

    2013-08-01

    Full Text Available Previous electrophysiological studies of automatic language processing revealed early (100-200 ms reflections of access to lexical characteristics of speech signal using the so-called mismatch negativity (MMN, a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention on spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in perifoveal area outside the visual focus of attention, as the subjects’ attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  9. Software of the BESM-6 computer for automatic image processing from liquid-hydrogen bubble chambers

    International Nuclear Information System (INIS)

    Grebenikov, E.A.; Kiosa, M.N.; Kobzarev, K.K.; Kuznetsova, N.A.; Mironov, S.V.; Nasonova, L.P.

    1978-01-01

    A set of programs used in "road guidance" mode on the BESM-6 computer to process picture information taken in liquid-hydrogen bubble chambers is discussed. This mode allows the system to process data from an automatic scanner (AS) taking into account the results of manual scanning. The system hardware includes an automatic scanner, an M-6000 mini-controller and a BESM-6 computer. The software is functionally divided into the following units: computation of event mask parameters and generation of data files controlling the AS; front-end processing of data coming from the AS; filtering of track data; simulation of AS operation and gauging of the AS reference system. To speed up the overall performance, the programs which receive and decode data coming from the AS via the M-6000 controller and the data link to the BESM-6 computer are written in machine language

  10. Enhancement of the automatic ultrasonic signal processing system using digital technology

    International Nuclear Information System (INIS)

    Koo, In Soo; Park, H. Y.; Suh, Y. S.; Kim, D. Hoon; Huh, S.; Sung, S. H.; Jang, G. S.; Ryoo, S. G.; Choi, J. H.; Kim, Y. H.; Lee, J. C.; Kim, D. Hyun; Park, H. J.; Kim, Y. C.; Lee, J. P.; Park, C. H.; Kim, M. S.

    1999-12-01

    The objective of this study is to develop an automatic ultrasonic signal processing system that can be used in inspection equipment to assess the integrity of the reactor vessel, by enhancing the performance of the existing ultrasonic signal processing system. The main activities of this study fall into three categories: development of the circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment; development of the signal processing algorithm and the hardware of the data processing system; and development of the specifications for the application programs and system software of the analysis and evaluation computer. The results of the main activities are as follows: 1) the ultrasonic detector and the automatic ultrasonic signal processing system were designed on the basis of a survey of the state of the art at home and abroad; 2) the hardware and software of the data processing system were developed on the basis of these results. In particular, the hardware of the data processing system, which combines the advantages of digital and analog control through real-time digital signal processing, was developed using a DSP capable of processing the digital signal in real time; in addition, the firmware of the data processing system for the peripherals and the specimen test algorithm for calibration were developed. The application programs and the system software of the analysis/evaluation computer were also developed. The developed equipment was verified by performance tests. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for the nuclear industry is expected through further studies on the hardware configuration for real applications and on the software specification of the analysis computer. (author)

  11. Cognitive regulation of smoking behavior within a cigarette: Automatic and nonautomatic processes.

    Science.gov (United States)

    Motschman, Courtney A; Tiffany, Stephen T

    2016-06-01

    There has been limited research on cognitive processes governing smoking behavior in individuals who are tobacco dependent. In a replication (Baxter & Hinson, 2001) and extension, this study examined the theory (Tiffany, 1990) that drug use may be controlled by automatic processes that develop over repeated use. Heavy and occasional cigarette smokers completed a button-press reaction time (RT) task while concurrently smoking a cigarette, pretending to smoke a lit cigarette, or not smoking. Slowed RT during the button-press task indexed the cognitive disruption associated with nonautomatic control of behavior. Occasional smokers' RTs were slowed when smoking or pretending to smoke compared with when not smoking. Heavy smokers' RTs were slowed when pretending to smoke versus not smoking; however, their RTs were similarly fast when smoking compared with not smoking. The results indicated that smoking behavior was more highly regulated by controlled, nonautomatic processes among occasional smokers and by automatic processes among heavy smokers. Patterns of RT across the interpuff interval indicated that occasional smokers were significantly slowed in anticipation of and immediately after puffing onset, whereas heavy smokers were only slowed significantly after puffing onset. These findings suggest that the entirety of the smoking sequence becomes automatized, with the behaviors leading up to puffing becoming more strongly regulated by automatic processes with experience. These results have relevance to theories on the cognitive regulation of cigarette smoking and support the importance of interventions that focus on routinized behaviors that individuals engage in during and leading up to drug use. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Automatic methods for processing track-detector data at the PAVICOM facility

    International Nuclear Information System (INIS)

    Aleksandrov, A.B.; Goncharova, L.A.; Polukhina, N.G.; Fejnberg, E.L.; Davydov, D.A.; Publichenko, P.A.; Roganova, T.M.

    2007-01-01

    New automatic methods considerably simplify and speed up the treatment of track-detector data. They make it possible to handle large data files and appreciably improve their statistics, which in turn enables new experiments using large-volume targets and emulsion or solid-state track detectors of large area. This makes the problem of training competent physicists able to work with modern automatic equipment highly relevant. Every year about ten Moscow students working at the PAVICOM facility of the LPI master the new methods, whereas most students in high-energy physics otherwise learn only the archaic manual methods of handling track-detector data. In 2005, on the basis of the PAVICOM facility and the physics training programme of MSU, a new laboratory exercise was prepared for determining the energy of neutrons passing through nuclear emulsion; it gives students basic experience in handling track-detector data with an automatic facility and can be included in the curriculum of any physics department. Specialists who master automatic processing methods on the simple and transparent example of track detectors will be able to apply their knowledge in various areas of science and technology. The organization of such upper-division courses is an additional new aspect of the use of the PAVICOM facility described in an earlier paper [4].

  13. Automatic and controlled processing in sentence recall: The role of long-term and working memory

    OpenAIRE

    Jefferies, Elizabeth; Lambon Ralph, Matthew A.; Baddeley, Alan D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley, 2000) proposes that the executive component of working memory plays a crucial role in the formation of links between different representational formats...

  14. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (plus or minus 0.0001 in.) (0.00254 mm) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the implementation of a process acceptance technique (PAT) in Swiss automatic screw machine processes. PAT is a statistical approach developed for the purpose of accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor a determination has been made of the conditions under which PAT can benefit a controlled process and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation.

  15. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    Science.gov (United States)

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).

  16. A dual-task investigation of automaticity in visual word processing

    Science.gov (United States)

    McCann, R. S.; Remington, R. W.; Van Selst, M.

    2000-01-01

    An analysis of activation models of visual word processing suggests that frequency-sensitive forms of lexical processing should proceed normally while unattended. This hypothesis was tested by having participants perform a speeded pitch discrimination task followed by lexical decisions or word naming. As the stimulus onset asynchrony between the tasks was reduced, lexical-decision and naming latencies increased dramatically. Word-frequency effects were additive with the increase, indicating that frequency-sensitive processing was subject to postponement while attention was devoted to the other task. Either (a) the same neural hardware shares responsibility for lexical processing and central stages of choice reaction time task processing and cannot perform both computations simultaneously, or (b) lexical processing is blocked in order to optimize performance on the pitch discrimination task. Either way, word processing is not as automatic as activation models suggest.

  17. Associative priming in a masked perceptual identification task: evidence for automatic processes.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Raaijmakers, Jeroen G W

    2002-10-01

    Two experiments investigated the influence of automatic and strategic processes on associative priming effects in a perceptual identification task in which prime-target pairs are briefly presented and masked. In this paradigm, priming is defined as a higher percentage of correctly identified targets for related pairs than for unrelated pairs. In Experiment 1, priming was obtained for mediated word pairs. This mediated priming effect was affected neither by the presence of direct associations nor by the presentation time of the primes, indicating that automatic priming effects play a role in perceptual identification. Experiment 2 showed that the priming effect was not affected by the proportion (.90 vs. .10) of related pairs if primes were presented briefly to prevent their identification. However, a large proportion effect was found when primes were presented for 1000 ms so that they were clearly visible. These results indicate that priming in a masked perceptual identification task is the result of automatic processes and is not affected by strategies. The present paradigm provides a valuable alternative to more commonly used tasks such as lexical decision.

  18. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    Science.gov (United States)

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-12-01

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects in more distant pairs for both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  19. Automatic tissue image segmentation based on image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in multimodality imaging, especially in the fusion of structural images offered by CT and MRI with functional images collected by optical or other novel imaging technologies. Image segmentation also provides a detailed structural description for quantitative visualization of the distribution of treatment light in the human body when incorporated with a 3D light transport simulation method. Here we used image enhancement, operators, and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM), from 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of the images in a deep learning way, and we also introduced parallel computing. Such approaches greatly reduce the processing time compared to manual and semi-automatic segmentation and are of great importance for improving speed and accuracy as more and more samples are learned. Our results can be used as criteria when diagnosing diseases such as cerebral atrophy, which is caused by pathological changes in grey matter or white matter. We demonstrated the great potential of such combined image processing and deep learning for automatic tissue image segmentation in personalized medicine, especially in monitoring and treatment.
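
    As a purely illustrative sketch of the classical part of such a pipeline (not the authors' code), the snippet below extracts a rough tissue mask from a 2-D head slice with thresholding and morphological clean-up using NumPy/SciPy; the threshold choice and the number of opening iterations are assumptions made for the example, and the deep-learning stage is not reproduced here.

        import numpy as np
        from scipy import ndimage

        def extract_tissue_mask(image, threshold=None, open_iterations=2):
            """Return a binary mask of the largest bright structure in a 2-D image."""
            img = np.asarray(image, dtype=float)
            if threshold is None:
                threshold = img.mean()                 # crude global threshold (assumption)
            mask = img > threshold
            mask = ndimage.binary_opening(mask, iterations=open_iterations)  # remove speckle
            mask = ndimage.binary_fill_holes(mask)                           # close interior gaps
            labels, n = ndimage.label(mask)
            if n == 0:
                return mask
            sizes = ndimage.sum(mask, labels, list(range(1, n + 1)))
            return labels == (int(np.argmax(sizes)) + 1)   # keep the largest connected component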

  20. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic Differentiation is a relatively recent technique for the differentiation of functions; it is applied directly to the source code that computes the function, written in standard programming languages. The technique permits the automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of Automatic Differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
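
    The following minimal sketch illustrates the idea behind forward-mode automatic differentiation using dual numbers; it is a generic illustration and is not tied to the AD tools or simulator discussed in the abstract.

        # Forward-mode AD with dual numbers: each value carries its derivative along.
        class Dual:
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv
            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)
            __radd__ = __add__
            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,
                            self.value * other.deriv + self.deriv * other.value)
            __rmul__ = __mul__

        def derivative(f, x):
            """Derivative of f at x, exact to roundoff."""
            return f(Dual(x, 1.0)).deriv

        # Example: d/dx (3*x*x + 2*x) at x = 1.5 -> 6*1.5 + 2 = 11.0
        print(derivative(lambda x: 3 * x * x + 2 * x, 1.5))

    Applying the same rules operation by operation to the code that evaluates a model residual yields derivatives that are exact to roundoff, which is what makes AD attractive for implicit integrators and consistent initialization.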

  1. ERPs reveal deficits in automatic cerebral stimulus processing in patients with NIDDM.

    Science.gov (United States)

    Vanhanen, M; Karhu, J; Koivisto, K; Pääkkönen, A; Partanen, J; Laakso, M; Riekkinen, P

    1996-11-04

    We compared auditory event-related potentials (ERPs) and neuropsychological test scores in nine patients with non-insulin-dependent diabetes mellitus (NIDDM) and in nine control subjects. The measures of automatic stimulus processing, habituation of auditory N100 and mismatch negativity (MMN) were impaired in patients. No differences were observed in the N2b and P3 components, which presumably reflect conscious cognitive analysis of the stimuli. A trend towards impaired performance in the Digit Span backward was found in diabetic subjects, but in the tests of secondary or long-term memory the groups were comparable. Patients with NIDDM may have defects in arousal and in the automatic ability to redirect attention, which can affect their cognitive performance.

  2. ACTIV - a program for automatic processing of gamma-ray spectra

    International Nuclear Information System (INIS)

    Zlokazov, V.B.

    1982-01-01

    Program ACTIV is intended for precise analysis of γ-ray and X-ray spectra and allows the user to carry out the full cycle of automatic processing of a series of spectra, i.e. calibration, automatic peak search, determination of peak positions and areas, identification of the radioisotopes and transformation of the areas found into masses of isotopes in the irradiated sample. ACTIV uses a complex mathematical technique and is oriented mainly to large computers, but using overlaid loading it can also be run on small computers like the PDP 11/70. Compared with other similar programs, ACTIV has some advantages in the accuracy of peak-shape description and in the reliability of the peak search and its least-squares analysis. The program can be used for activation analysis and can analyze spectra with poor statistics and with broad and narrow peaks. (orig.)
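
    The sketch below shows one common way such programs automate the peak search, comparing smoothed second differences of the channel counts with their Poisson uncertainty; it is a generic illustration, not a reconstruction of ACTIV's actual algorithm, and the significance threshold k and window width are assumed values.

        import numpy as np

        def find_peaks(counts, width=3, k=3.0):
            """Return channel indices where the negative second difference is significant."""
            counts = np.asarray(counts, dtype=float)
            # second difference c[i-1] - 2*c[i] + c[i+1]
            second_diff = np.convolve(counts, np.array([1.0, -2.0, 1.0]), mode="same")
            # Poisson statistics: variance of that combination is c[i-1] + 4*c[i] + c[i+1]
            variance = np.convolve(counts, np.array([1.0, 4.0, 1.0]), mode="same")
            significance = -second_diff / np.sqrt(np.maximum(variance, 1.0))
            # smooth over the expected peak width and keep significant local maxima
            smoothed = np.convolve(significance, np.ones(width) / width, mode="same")
            return [i for i in range(1, len(counts) - 1)
                    if smoothed[i] > k
                    and smoothed[i] >= smoothed[i - 1]
                    and smoothed[i] >= smoothed[i + 1]]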

  3. Learning algorithms and automatic processing of languages; Algorithmes a apprentissage et traitement automatique des langues

    Energy Technology Data Exchange (ETDEWEB)

    Fluhr, Christian Yves Andre

    1977-06-15

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to explain how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on the theory of Markov algorithms, the author discusses the notion of a learning algorithm. Two main types of learning algorithms are addressed: first, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to resolve grammatical ambiguities in submitted texts; second, an algorithm related to a document system, which automatically structures semantic data obtained from a set of texts in order to be able to respond, by reference to these texts, to any question on their content.

  4. The Automatic Conservative: Ideology-Based Attentional Asymmetries in the Processing of Valenced Information

    Science.gov (United States)

    Carraro, Luciana; Castelli, Luigi; Macchiella, Claudia

    2011-01-01

    Research has widely explored the differences between conservatives and liberals, and it has been also recently demonstrated that conservatives display different reactions toward valenced stimuli. However, previous studies have not yet fully illuminated the cognitive underpinnings of these differences. In the current work, we argued that political ideology is related to selective attention processes, so that negative stimuli are more likely to automatically grab the attention of conservatives as compared to liberals. In Experiment 1, we demonstrated that negative (vs. positive) information impaired the performance of conservatives, more than liberals, in an Emotional Stroop Task. This finding was confirmed in Experiment 2 and in Experiment 3 employing a Dot-Probe Task, demonstrating that threatening stimuli were more likely to attract the attention of conservatives. Overall, results support the conclusion that people embracing conservative views of the world display an automatic selective attention for negative stimuli. PMID:22096486

  5. Study of an automatic dosing of neptunium in the industrial process of separation neptunium 237-plutonium 238

    International Nuclear Information System (INIS)

    Ros, Pierre

    1973-01-01

    The objective is to study and adapt a method for the automatic dosing of neptunium to the industrial process of separation and purification of plutonium 238, taking information quality and economic aspects into account. After recalling some generalities on the production of plutonium 238 and on the plutonium-neptunium separation process, the author addresses the dosing of neptunium. The adopted measurement technique is spectrophotometry (of neptunium and of neptunium peroxide), which is the most flexible and economical to adapt to automatic control. The author proposes a design for an automatic chemical analyser, and discusses the complex formed (stoichiometry, form) and some aspects of neptunium dosing (redox reactions, process control). [fr]
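
    For context, a spectrophotometric determination of this kind rests on the Beer-Lambert law; the relation below is the textbook form (not a formula quoted from the thesis), where A is the measured absorbance, \varepsilon the molar absorptivity of the absorbing neptunium species, l the optical path length and c the concentration to be monitored by the automatic control:

        A \;=\; \varepsilon \, l \, c
        \qquad\Longrightarrow\qquad
        c \;=\; \frac{A}{\varepsilon \, l}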

  6. Combination of digital autoradiography and alpha track analysis to reveal the distribution of definite alpha- and beta-emitting nuclides in contaminated samples

    Energy Technology Data Exchange (ETDEWEB)

    Vlasova, I. [Lomonosov MSU (Russian Federation); Kuzmenkova, N. [Vernadsky GEOKHI RAS (Russian Federation); Shiryaev, A. [Frumkin IPCE RAS (Russian Federation); Pryakhin, E. [Urals Research Center for Radiation Medicine (Russian Federation); Kalmykov, S.; Ivanov, I. [PA Mayak (Russian Federation)

    2014-07-01

    Digital autoradiography using Imaging Plates is commonly employed for locating 'hot' particles in contaminated soil, sediment and aerosol samples. However, digital radiography images combined with alpha-track radiography data can provide much more information about the micro-distribution of different alpha- and beta-emitting nuclides. A discrimination method to estimate the distribution of the radionuclides that are the main contributors to the total radioactivity ({sup 90}Sr/{sup 90}Y, {sup 137}Cs, {sup 241}Am) has been developed for the case of the artificial reservoir V-17 (PA 'Mayak'). Samples of bottom sediments and hydrobionts collected from V-17, along with standards of {sup 137}Cs, {sup 90}Sr/{sup 90}Y and {sup 241}Am, were exposed for a short time (15 min) using a stack of 3 Imaging Plates (Cyclone Plus Storage Phosphor System, Perkin Elmer). The attenuation of the photostimulated luminescence (PSL) intensity from layer to layer of the Imaging Plates depends on the type and energy of the radiation. An integrated approach using PSL attenuation in the samples and standards (digital radiography) together with alpha-track radiography and gamma spectroscopy of the preparation was used to estimate the contribution of the main nuclides in specific parts of the contaminated samples. The observation of the {sup 90}Sr/{sup 90}Y and {sup 137}Cs activity maxima could help to identify the phases responsible for preferential sorption of these nuclides. Document available in abstract form only. (authors)
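
    A minimal sketch of the discrimination idea (all numbers below are placeholders, not data from the paper): if the per-layer PSL response of each candidate nuclide is calibrated on the standards, the contributions at a sample spot can be estimated from the layer-to-layer attenuation by solving a small least-squares system.

        import numpy as np

        # response[i, j]: PSL per unit activity of nuclide j in plate layer i (from standards)
        response = np.array([[1.00, 1.00, 1.00],   # layer 1: Sr-90/Y-90, Cs-137, Am-241
                             [0.60, 0.35, 0.02],   # layer 2
                             [0.35, 0.10, 0.00]])  # layer 3
        psl_measured = np.array([120.0, 55.0, 25.0])  # PSL in each layer for one sample spot

        # least-squares estimate of each nuclide's contribution at that spot
        activities, *_ = np.linalg.lstsq(response, psl_measured, rcond=None)
        print(dict(zip(["Sr-90/Y-90", "Cs-137", "Am-241"], activities.round(2))))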

  7. Automatic processing of gamma ray spectra employing classical and modified Fourier transform approach

    International Nuclear Information System (INIS)

    Rattan, S.S.; Madan, V.K.

    1994-01-01

    This report describes methods for automatic processing of gamma ray spectra acquired with HPGe detectors. The processing incorporated both classical and signal processing approach. The classical method was used for smoothing, detecting significant peaks, finding peak envelope limits and a proposed method of finding peak limits, peak significance index, full width at half maximum, detecting doublets for further analysis. To facilitate application of signal processing to nuclear spectra, Madan et al. gave a new classification of signals and identified nuclear spectra as Type II signals, mathematically formalized modified Fourier transform and pioneered its application to process doublet envelopes acquired with modern spectrometers. It was extended to facilitate routine analysis of the spectra. A facility for energy and efficiency calibration was also included. The results obtained by analyzing observed gamma-ray spectra using the above approach compared favourably with those obtained with SAMPO and also those derived from table of radioisotopes. (author). 15 refs., 3 figs., 3 tabs

  8. Generating Impact Maps from Automatically Detected Bomb Craters in Aerial Wartime Images Using Marked Point Processes

    Science.gov (United States)

    Kruse, Christian; Rottensteiner, Franz; Hoberg, Thorsten; Ziems, Marcel; Rebke, Julia; Heipke, Christian

    2018-04-01

    The aftermath of wartime attacks is often felt long after the war ended, as numerous unexploded bombs may still exist in the ground. Typically, such areas are documented in so-called impact maps which are based on the detection of bomb craters. This paper proposes a method for the automatic detection of bomb craters in aerial wartime images that were taken during the Second World War. The object model for the bomb craters is represented by ellipses. A probabilistic approach based on marked point processes determines the most likely configuration of objects within the scene. Adding and removing new objects to and from the current configuration, respectively, changing their positions and modifying the ellipse parameters randomly creates new object configurations. Each configuration is evaluated using an energy function. High gradient magnitudes along the border of the ellipse are favored and overlapping ellipses are penalized. Reversible Jump Markov Chain Monte Carlo sampling in combination with simulated annealing provides the global energy optimum, which describes the conformance with a predefined model. For generating the impact map a probability map is defined which is created from the automatic detections via kernel density estimation. By setting a threshold, areas around the detections are classified as contaminated or uncontaminated sites, respectively. Our results show the general potential of the method for the automatic detection of bomb craters and its automated generation of an impact map in a heterogeneous image stock.
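
    A minimal sketch of the final mapping step only (not the authors' implementation): detected crater centres are turned into a probability map by kernel density estimation and thresholded into contaminated and uncontaminated areas; the coordinates, grid and threshold below are illustrative assumptions.

        import numpy as np
        from scipy.stats import gaussian_kde

        # assumed crater centres in image coordinates (placeholders)
        crater_centres = np.array([[120., 340.], [135., 355.], [150., 330.],
                                   [600., 80.], [615., 95.]])
        kde = gaussian_kde(crater_centres.T)

        xs, ys = np.meshgrid(np.linspace(0, 1000, 200), np.linspace(0, 1000, 200))
        density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)

        threshold = 0.5 * density.max()        # illustrative cut-off
        impact_map = density >= threshold      # True = classified as contaminated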

  9. Adjustment of automatic control systems of production facilities at coal processing plants using multivariant physico- mathematical models

    Science.gov (United States)

    Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.

    2016-10-01

    The structure of multi-variant physical and mathematical models of control system is offered as well as its application for adjustment of automatic control system (ACS) of production facilities on the example of coal processing plant.

  10. Image processing applied to automatic detection of defects during ultrasonic examination

    International Nuclear Information System (INIS)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to the ultrasonic BSCAN images obtained in the non-destructive testing of welds. The goal is to define what image processing techniques can contribute to improving the exploitation of the collected data and, more precisely, what image processing can do to extract the meaningful echoes that make it possible to characterize and size the defects. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specificities of the propagation of ultrasonic waves in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed, based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single description, the relations between the amplitudes of pairs of pixels. From the analysis of this matrix, a new complete and automatic method has been established to define a threshold that separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. A complete validation has been carried out with standard pieces.
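
    As an illustration of the core tool mentioned above, the sketch below builds a grey-level co-occurrence matrix counting how often pairs of amplitudes occur at a fixed pixel offset; the quantisation to 16 levels and the offset are assumptions, and the thresholding strategy built on top of this matrix in the thesis is not reproduced.

        import numpy as np

        def cooccurrence_matrix(image, levels=16, offset=(0, 1)):
            """Count co-occurring grey-level pairs of a 2-D image for a (row, col) offset."""
            img = np.asarray(image, dtype=float)
            # requantise amplitudes to `levels` grey levels
            q = np.floor(levels * (img - img.min()) / (np.ptp(img) + 1e-9)).astype(int)
            q = np.clip(q, 0, levels - 1)
            dr, dc = offset
            matrix = np.zeros((levels, levels), dtype=np.int64)
            rows, cols = q.shape
            for r in range(max(0, -dr), rows - max(0, dr)):
                for c in range(max(0, -dc), cols - max(0, dc)):
                    matrix[q[r, c], q[r + dr, c + dc]] += 1
            return matrix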

  11. Intentional and automatic processing of numerical information in mathematical anxiety: testing the influence of emotional priming.

    Science.gov (United States)

    Ashkenazi, Sarit

    2018-02-05

    Current theoretical approaches suggest that mathematical anxiety (MA) manifests itself as a weakness in quantity manipulations. This study is the first to examine automatic versus intentional processing of numerical information using the numerical Stroop paradigm in participants with high MA. To manipulate anxiety levels, we combined the numerical Stroop task with an affective priming paradigm. We took a group of college students with high MA and compared their performance to a group of participants with low MA. Under low anxiety conditions (neutral priming), participants with high MA showed relatively intact number processing abilities. However, under high anxiety conditions (mathematical priming), participants with high MA showed (1) higher processing of the non-numerical irrelevant information, which aligns with the theoretical view regarding deficits in selective attention in anxiety and (2) an abnormal numerical distance effect. These results demonstrate that abnormal, basic numerical processing in MA is context related.

  12. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these steps are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  13. Information processing requirements for on-board monitoring of automatic landing

    Science.gov (United States)

    Sorensen, J. A.; Karmarkar, J. S.

    1977-01-01

    A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.

  14. Age-related differences in the automatic processing of single letters: implications for selective attention.

    Science.gov (United States)

    Daffner, Kirk R; Alperin, Brittany R; Mott, Katherine K; Holcomb, Phillip J

    2014-01-22

    Older adults exhibit diminished ability to inhibit the processing of visual stimuli that are supposed to be ignored. The extent to which age-related changes in early visual processing contribute to impairments in selective attention remains to be determined. Here, 103 adults, 18-85 years of age, completed a color selective attention task in which they were asked to attend to a specified color and respond to designated target letters. An optimal approach would be to initially filter according to color and then process letter forms in the attend color to identify targets. An asymmetric N170 ERP component (larger amplitude over left posterior hemisphere sites) was used as a marker of the early automatic processing of letter forms. Young and middle-aged adults did not generate an asymmetric N170 component. In contrast, young-old and old-old adults produced a larger N170 over the left hemisphere. Furthermore, older adults generated a larger N170 to letter than nonletter stimuli over the left, but not right hemisphere. More asymmetric N170 responses predicted greater allocation of late selection resources to target letters in the ignore color, as indexed by P3b amplitude. These results suggest that unlike their younger counterparts, older adults automatically process stimuli as letters early in the selection process, when it would be more efficient to attend to color only. The inability to ignore letters early in the processing stream helps explain the age-related increase in subsequent processing of target letter forms presented in the ignore color.

  15. Oscillatory brain dynamics associated with the automatic processing of emotion in words.

    Science.gov (United States)

    Wang, Lin; Bastiaansen, Marcel

    2014-10-01

    This study examines the automaticity of processing the emotional aspects of words, and characterizes the oscillatory brain dynamics that accompany this automatic processing. Participants read emotionally negative, neutral and positive nouns while performing a color detection task in which only perceptual-level analysis was required. Event-related potentials and time-frequency representations were computed from the concurrently measured EEG. Negative words elicited a larger P2 and a larger late positivity than positive and neutral words, indicating deeper semantic/evaluative processing of negative words. In addition, sustained alpha power suppressions were found for the emotional compared to neutral words, in the time range from 500 to 1000 ms post-stimulus. These results suggest that sustained attention was allocated to the emotional words, whereas the attention allocated to the neutral words was released after an initial analysis. This seems to hold even when the emotional content of the words is task-irrelevant. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Sensitometric characteristics of D-, E- and F-speed dental radiographic films in manual and automatic processing

    Directory of Open Access Journals (Sweden)

    Jahangir Haghani

    2012-12-01

    Full Text Available BACKGROUND: The purpose of this study was to evaluate the sensitometric characteristics of Ultraspeed, Ektaspeed Plus and Insight dental radiographic films using manual and automatic processing systems. METHODS: In this experimental in vitro study, an aluminum step-wedge was used to construct characteristic curves for D-, E- and F-speed radiographic films (Kodak Eastman, Rochester, USA). All films were processed in Iranian processing solution (Chemical Industries Co., Tehran, Iran) both manually and automatically over a period of six days. Unexposed films of the three types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the International Standard Organization definition. RESULTS: There was a significant difference in the density obtained with the D-, E- and F-speed films in both the manual and automatic processing systems (P < 0.001), and a significant difference in the density obtained with the Ultraspeed and Insight films. There was no significant difference in the contrast obtained with the D-, E- and F-speed films in the manual and automatic processing systems (P = 0.255, P = 0.26). There was a significant difference in the speed obtained with the D-, E- and F-speed films in both processing systems (P = 0.034, P = 0.04). CONCLUSIONS: The choice of processing system can affect radiographic characteristics. The F-speed film processed in the automatic system has greater speed in comparison with the manual processing system, and it provides a further reduction in radiation exposure without detriment to image quality.

  17. Automatic two- and three-dimensional mesh generation based on fuzzy knowledge processing

    Science.gov (United States)

    Yagawa, G.; Yoshimura, S.; Soneda, N.; Nakao, K.

    1992-09-01

    This paper describes the development of a novel automatic FEM mesh generation algorithm based on the fuzzy knowledge processing technique. A number of local nodal patterns are stored in a nodal pattern database of the mesh generation system. These nodal patterns are determined a priori based on certain theories or on the past experience of experts in FEM analysis. For example, such human experts can determine nodal patterns suitable for stress concentration analyses of cracks, corners, holes and so on. Each nodal pattern possesses a membership function and a procedure for node placement according to this function. In the case of the nodal patterns for stress concentration regions, the membership function utilized in the fuzzy knowledge processing has two meanings, i.e. the “closeness” of a nodal location to each stress concentration field as well as the “nodal density”. This is attributed to the fact that a denser nodal pattern is required near a stress concentration field. What a user has to do in a practical mesh generation process is to choose several local nodal patterns properly and to designate the maximum nodal density of each pattern. After those simple operations by the user, the system places the chosen nodal patterns automatically in the analysis domain and on its boundary, and connects them smoothly by the fuzzy knowledge processing technique. Triangular or tetrahedral elements are then generated by means of the advancing front method. The key issue of the present algorithm is easy control of a complex two- or three-dimensional nodal density distribution by means of the fuzzy knowledge processing technique. To demonstrate the fundamental performance of the present algorithm, a prototype system was constructed with the object-oriented language Smalltalk-80 on a 32-bit microcomputer, a Macintosh II. The mesh generation of several two- and three-dimensional domains with cracks, holes and junctions is presented as an example.
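
    Purely as an illustration of the kind of membership function described (the exponential form, decay length and density bounds are hypothetical choices, not values from the paper), a target nodal density that is largest near a stress-concentration point might look like this:

        import math

        def nodal_density(point, concentration_point, max_density,
                          background_density, decay_length):
            """Target node density at `point`, highest near the stress concentration."""
            dx = point[0] - concentration_point[0]
            dy = point[1] - concentration_point[1]
            distance = math.hypot(dx, dy)
            membership = math.exp(-distance / decay_length)   # "closeness" in [0, 1]
            return background_density + membership * (max_density - background_density)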

  18. Automatic Prompt System in the Process of Mapping plWordNet on Princeton WordNet

    Directory of Open Access Journals (Sweden)

    Paweł Kędzia

    2015-06-01

    Full Text Available The paper offers a critical evaluation of the power and usefulness of an automatic prompt system based on the extended Relaxation Labelling algorithm in the process of (manual) mapping of plWordNet on Princeton WordNet. To this end the results of manual mapping – that is, inter-lingual relations between plWN and PWN synsets – are juxtaposed with the automatic prompts that were generated for the source-language synsets to be mapped. We check the number and type of inter-lingual relations introduced on the basis of automatic prompts and the distance of the respective prompt synsets from the actual target-language synsets.

  19. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in catchments and is especially important in the modeling of peri-urban catchments. Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and of flow networks, allow these elements to be represented in an explicit way, preserving the natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models based on irregular hydro-landscape units. When computing fluxes between HRUs, geometrical parameters such as the distance between the centroid of an HRU and the river network, or the length of the perimeter, can affect the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing tool implemented in the open-source GRASS-GIS software, for which several Python scripts were developed and some already available algorithms, such as the Triangle software, were used. First, scripts were developed to improve the topology of the various elements, for example by snapping the river network to the closest contours. When data are derived from remote sensing, such as vegetation areas, their perimeters have many right angles, which were smoothed. Second, the algorithms more particularly address badly shaped elements of the model mesh, such as polygons with narrow shapes, markedly irregular contours and/or a centroid lying outside the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
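
    The shape descriptor mentioned above can be illustrated with a small self-contained sketch: a convexity index computed as the polygon area divided by the area of its convex hull, so that compact HRUs score close to 1 and badly shaped ones score lower; the actual threshold used in the pre-processing is not reproduced here.

        def polygon_area(points):
            """Shoelace formula for a simple polygon given as (x, y) vertices."""
            area = 0.0
            n = len(points)
            for i in range(n):
                x1, y1 = points[i]
                x2, y2 = points[(i + 1) % n]
                area += x1 * y2 - x2 * y1
            return abs(area) / 2.0

        def convex_hull(points):
            """Andrew's monotone chain convex hull."""
            pts = sorted(set(points))
            if len(pts) <= 2:
                return pts
            def cross(o, a, b):
                return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
            lower, upper = [], []
            for p in pts:
                while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                    lower.pop()
                lower.append(p)
            for p in reversed(pts):
                while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                    upper.pop()
                upper.append(p)
            return lower[:-1] + upper[:-1]

        def convexity_index(points):
            return polygon_area(points) / polygon_area(convex_hull(points))

        # An L-shaped unit scores below 1: area 3 over hull area 3.5, i.e. about 0.86
        print(convexity_index([(0, 0), (2, 0), (2, 1), (1, 1), (1, 2), (0, 2)]))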

  20. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA
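
    A simplified sketch of the first workflow step only, lane segmentation (assuming dark bands on a bright background; GELect's actual procedure, which also handles lane distortion, is more elaborate): lanes are located as peaks of a smoothed column-darkness profile.

        import numpy as np

        def lane_centres(gel_image, smooth=15, min_separation=20):
            """Return approximate x-positions of lanes in a 2-D grey-level gel image."""
            darkness = (255.0 - np.asarray(gel_image, dtype=float)).sum(axis=0)  # per column
            kernel = np.ones(smooth) / smooth
            profile = np.convolve(darkness, kernel, mode="same")                 # suppress noise
            centres = []
            for x in range(1, len(profile) - 1):
                if profile[x] >= profile[x - 1] and profile[x] > profile[x + 1]:
                    if not centres or x - centres[-1] >= min_separation:
                        centres.append(x)
            return centres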

  1. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  2. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques: tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method with a detection accuracy of 82% and an average detection time of 9.64 days.
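
    A minimal EWMA control-chart sketch in the spirit of the paper (the baseline window, smoothing weight and control-limit width are textbook defaults, not the authors' settings):

        import numpy as np

        def ewma_alerts(transfer_times, lam=0.2, L=3.0):
            """Flag days on which the EWMA of daily transfer times leaves its control limits."""
            x = np.asarray(transfer_times, dtype=float)
            mu, sigma = x[:14].mean(), x[:14].std(ddof=1)   # baseline from an assumed 14-day phase I
            z, alerts = mu, []
            for t, value in enumerate(x):
                z = lam * value + (1 - lam) * z
                # variance of the EWMA statistic after t+1 observations
                var = sigma**2 * lam / (2 - lam) * (1 - (1 - lam)**(2 * (t + 1)))
                if abs(z - mu) > L * np.sqrt(var):
                    alerts.append(t)                         # possible change in functional ability
            return alerts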

  3. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance to a full-powered cluster; and (2) effectively handle a spike in the processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference for optimizing the performance of geospatial applications to address data- and computation-intensive challenges in GIScience in a more cost-efficient manner.
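
    As a purely illustrative scaling rule (not the paper's algorithm; the function name and all numbers are hypothetical), the cluster size can be chosen from the queued workload and an estimated per-node throughput, within fixed bounds:

        import math

        def target_cluster_size(pending_tasks, tasks_per_node_per_hour, deadline_hours,
                                min_nodes=2, max_nodes=32):
            """Number of worker nodes needed to finish the queued work before the deadline."""
            needed = math.ceil(pending_tasks / (tasks_per_node_per_hour * deadline_hours))
            return max(min_nodes, min(max_nodes, needed))

        # e.g. 5000 DEM tiles, 40 tiles/node/hour, 4-hour target window -> 32 nodes (capped)
        print(target_cluster_size(5000, 40, 4))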

  4. The origins of levels-of-processing effects in a conceptual test: evidence for automatic influences of memory from the process-dissociation procedure.

    Science.gov (United States)

    Bergerbest, Dafna; Goshen-Gottstein, Yonatan

    2002-12-01

    In three experiments, we explored automatic influences of memory in a conceptual memory task, as affected by a levels-of-processing (LoP) manipulation. We also explored the origins of the LoP effect by examining whether the effect emerged only when participants in the shallow condition truncated the perceptual processing (the lexical-processing hypothesis) or even when the entire word was encoded in this condition (the conceptual-processing hypothesis). Using the process-dissociation procedure and an implicit association-generation task, we found that the deep encoding condition yielded higher estimates of automatic influences than the shallow condition. In support of the conceptual processing hypothesis, the LoP effect was found even when the shallow task did not lead to truncated processing of the lexical units. We suggest that encoding for meaning is a prerequisite for automatic processing on conceptual tests of memory.
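
    For reference, the estimates of controlled (C) and automatic (A) influences in the process-dissociation procedure follow the standard equations below (the usual Jacoby-style formulation, not equations quoted from this paper), where inclusion and exclusion denote the probabilities of producing a studied item under the two sets of instructions:

        P(\text{inclusion}) = C + A\,(1 - C), \qquad
        P(\text{exclusion}) = A\,(1 - C)

        \Rightarrow\quad C = P(\text{inclusion}) - P(\text{exclusion}), \qquad
        A = \frac{P(\text{exclusion})}{1 - C}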

  5. DEVELOPING A SPATIAL PROCESSING SERVICE FOR AUTOMATIC CALCULATION OF STORM INUNDATION

    Directory of Open Access Journals (Sweden)

    H. Jafari

    2017-09-01

    Full Text Available With the increase in urbanization, the surface of the Earth and its climate are changing. These changes have resulted in more frequent flooding and storm inundation in urban areas. The challenges of flooding can be addressed through several computational procedures. Because of their numerous advantages, accessible web services are a suitable format for determining storm inundation. Web services have facilitated the integration and interactivity of web applications and have made interaction between machines more feasible, enabling heterogeneous software systems to communicate with each other. A Web Processing Service (WPS) makes it possible to process spatial data in different formats. In this study, we developed a WPS to automatically calculate the extent of storm inundation caused by rainfall in urban areas. The method we used for calculating storm inundation is based on a simplified hydrologic model which estimates the final state of inundation. The simulation process and the water transfer between subcatchments are carried out in turn, without user interference. Implementing the processing functions in the form of web processing services makes it possible to reuse the services and apply them in other services, and as a result avoids creating duplicate resources.

  6. A Development Process for Enterprise Information Systems Based on Automatic Generation of the Components

    Directory of Open Access Journals (Sweden)

    Adrian ALEXANDRESCU

    2008-01-01

    Full Text Available This paper contains some ideas concerning the development of Enterprise Information Systems (EIS). It combines known elements from the software engineering domain with original elements which the author has conceived and experimented with. The author has followed two major objectives: to use a simple description for the concepts of an EIS, and to achieve a rapid and reliable EIS development process with minimal cost. The first goal was achieved by defining some models which describe the conceptual elements of the EIS domain: entities, events, actions, states and attribute-domains. The second goal is based on a predefined architectural model for the EIS, on predefined analysis and design models for the elements of the domain and, finally, on the automatic generation of the system components. The proposed methods do not depend on a particular programming language or database management system. They are general and may be applied to any combination of such technologies.

  7. Application of digital process controller for automatic pulse operation in the NSRR

    International Nuclear Information System (INIS)

    Ishijima, K.; Ueda, T.; Saigo, M.

    1992-01-01

    The NSRR at JAERI is a modified TRIGA reactor. It was built for investigating reactor fuel behavior under reactivity-initiated accident (RIA) conditions. Recently, there has been a need to improve the flexibility of pulsing operations in the NSRR to cover a wide range of accidental situations, including RIA events at elevated power levels and various abnormal power transients. To satisfy this need, we developed a new reactor control system which allows us to perform 'Shaped Pulse Operation: SP' and 'Combined Pulse Operation: CP'. Quick, accurate and complicated manipulation of the control rods was required to realize these operations. We therefore installed a new reactor control system, which we call the automatic pulse control system. This control system is composed of digital processing controllers and other digital equipment, and is fully automated and highly accurate. (author)

  8. Developing an Intelligent Automatic Appendix Extraction Method from Ultrasonography Based on Fuzzy ART and Image Processing

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2015-01-01

    Full Text Available Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is the most important, so clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images, as a basic building block for developing such an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we apply a series of image processing techniques to find the fascia line correctly, and then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix.
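
    For reference, the standard Fuzzy ART operations on a (complement-coded) input I and category weight vectors w_j are given below in their usual textbook form (these are not equations quoted from this paper); \wedge is the component-wise minimum, |\cdot| the L1 norm, \alpha the choice parameter, \rho the vigilance and \beta the learning rate:

        T_j = \frac{|I \wedge w_j|}{\alpha + |w_j|} \quad\text{(category choice)}, \qquad
        \frac{|I \wedge w_J|}{|I|} \ge \rho \quad\text{(vigilance test for the winner } J\text{)}

        w_J^{\text{new}} = \beta\,(I \wedge w_J) + (1 - \beta)\,w_J^{\text{old}} \quad\text{(learning)}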

  9. Neural dynamics of morphological processing in spoken word comprehension: Laterality and automaticity

    Directory of Open Access Journals (Sweden)

    Caroline M. Whiting

    2013-11-01

    Full Text Available Rapid and automatic processing of grammatical complexity is argued to take place during speech comprehension, engaging a left-lateralised fronto-temporal language network. Here we address how neural activity in these regions is modulated by the grammatical properties of spoken words. We used combined magneto- and electroencephalography (MEG, EEG) to delineate the spatiotemporal patterns of activity that support the recognition of morphologically complex words in English with inflectional (-s) and derivational (-er) affixes (e.g. bakes, baker). The mismatch negativity (MMN), an index of linguistic memory traces elicited in a passive listening paradigm, was used to examine the neural dynamics elicited by morphologically complex words. Results revealed an initial peak 130-180 ms after the deviation point with a major source in left superior temporal cortex. The localisation of this early activation showed a sensitivity to two grammatical properties of the stimuli: 1) the presence of morphological complexity, with affixed words showing increased left-laterality compared to non-affixed words; and 2) the grammatical category, with affixed verbs showing greater left-lateralisation in inferior frontal gyrus compared to affixed nouns (bakes vs. beaks). This automatic brain response was additionally sensitive to semantic coherence (the meaning of the stem vs. the meaning of the whole form) in fronto-temporal regions. These results demonstrate that the spatiotemporal pattern of neural activity in spoken word processing is modulated by the presence of morphological structure, predominantly engaging the left-hemisphere’s fronto-temporal language network, and does not require focused attention on the linguistic input.

  10. Event-related potential evidence for separable automatic and controlled retrieval processes in proactive interference.

    Science.gov (United States)

    Bergström, Zara M; O'Connor, Richard J; Li, Martin K-H; Simons, Jon S

    2012-05-21

    Interference between competing memories is a major source of retrieval failure, yet, surprisingly little is known about how competitive memory activation arises in the brain. One possibility is that interference during episodic retrieval might be produced by relatively automatic conceptual priming mechanisms that are independent of strategic retrieval processes. Such priming-driven interference might occur when the competing memories have strong pre-existing associations to the retrieval cue. We used ERPs to measure the neural dynamics of retrieval competition, and investigated whether the ERP correlates of interference were affected by varying task demands for selective retrieval. Participants encoded cue words that were presented either two or four times, paired either with the same or different strongly associated words across repetitions. In a subsequent test, participants either selectively recalled each cue's most recent associate, or simply judged how many times a cue had been presented, without requiring selective recall. Interference effects on test performance were only seen in the recall task. In contrast, ERPs during test revealed an early posterior positivity for high interference items that was present in both retrieval tasks. This early ERP effect likely reflects a conceptual priming-related N400 reduction when many associations to a cue were pre-activated. A later parietal positivity resembling the ERP correlate of conscious recollection was found only in the recall task. The results suggest that early effects of proactive interference are relatively automatic and independent of intentional retrieval processes, consistent with suggestions that interference can arise through conceptual priming. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. The Effect of Orthographic Depth on Letter String Processing: The Case of Visual Attention Span and Rapid Automatized Naming

    Science.gov (United States)

    Antzaka, Alexia; Martin, Clara; Caffarra, Sendy; Schlöffel, Sophie; Carreiras, Manuel; Lallier, Marie

    2018-01-01

    The present study investigated whether orthographic depth can increase the bias towards multi-letter processing in two reading-related skills: visual attention span (VAS) and rapid automatized naming (RAN). VAS (i.e., the number of visual elements that can be processed at once in a multi-element array) was tested with a visual 1-back task and RAN…

  12. Sensitometric characteristics of D-, E- and F-speed dental radiographic films in manual and automatic processing

    Directory of Open Access Journals (Sweden)

    Jahangir Haghani DDS, MSc

    2012-09-01

    Full Text Available BACKGROUND AND AIM: The purpose of this study was to evaluate the sensitometric characteristics of Ultraspeed, Ektaspeed Plus and Insight dental radiographic films using manual and automatic processing systems. METHODS: In this experimental in vitro study, an aluminum step-wedge was used to construct characteristic curves for D-, E- and F-speed radiographic films (Eastman Kodak, Rochester, USA). All films were processed in Iranian processing solution (Chemical Industries Co., Tehran, Iran), both manually and automatically, over a period of six days. Unexposed films of the three types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the International Standard Organization definition. RESULTS: There was a significant difference in density obtained with the D-, E- and F-speed films in both the manual and automatic processing systems (P < 0.001). There was a significant difference in density obtained with the Ultraspeed and Insight films. There was no significant difference in contrast obtained with the D-, E- and F-speed films in either processing system (P = 0.255, P = 0.260). There was a significant difference in speed obtained with the D-, E- and F-speed films in both the manual and automatic processing systems (P = 0.034, P = 0.040). CONCLUSIONS: The choice of processing system can affect radiographic characteristics. The F-speed film processed in the automatic system has greater speed than with manual processing, and it provides a further reduction in radiation exposure without detriment to image quality.

  13. Dynamics of processing invisible faces in the brain: automatic neural encoding of facial expression information.

    Science.gov (United States)

    Jiang, Yi; Shannon, Robert W; Vizueta, Nathalie; Bernat, Edward M; Patrick, Christopher J; He, Sheng

    2009-02-01

    The fusiform face area (FFA) and the superior temporal sulcus (STS) are suggested to process facial identity and facial expression information respectively. We recently demonstrated a functional dissociation between the FFA and the STS as well as correlated sensitivity of the STS and the amygdala to facial expressions using an interocular suppression paradigm [Jiang, Y., He, S., 2006. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023-2029.]. In the current event-related brain potential (ERP) study, we investigated the temporal dynamics of facial information processing. Observers viewed neutral, fearful, and scrambled face stimuli, either visibly or rendered invisible through interocular suppression. Relative to scrambled face stimuli, intact visible faces elicited larger positive P1 (110-130 ms) and larger negative N1 or N170 (160-180 ms) potentials at posterior occipital and bilateral occipito-temporal regions respectively, with the N170 amplitude significantly greater for fearful than neutral faces. Invisible intact faces generated a stronger signal than scrambled faces at 140-200 ms over posterior occipital areas whereas invisible fearful faces (compared to neutral and scrambled faces) elicited a significantly larger negative deflection starting at 220 ms along the STS. These results provide further evidence for cortical processing of facial information without awareness and elucidate the temporal sequence of automatic facial expression information extraction.

  14. Mirion--a software package for automatic processing of mass spectrometric images.

    Science.gov (United States)

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the Life Sciences. In recent years, the development of new instruments employing ion sources that are tailored for spatial scanning allowed the acquisition of large data sets. A subsequent data processing, however, is still a bottleneck in the analytical process, as a manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and easier than plain numbers. Image generation, thus, is a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
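
    The core operation Mirion performs, turning imzML-encoded spectra into a spatial distribution image for a chosen m/z value, can be illustrated with a short sketch. This is not Mirion's code; the sketch assumes the open-source pyimzml parser, and the m/z value and tolerance are illustrative placeholders.

import numpy as np
from pyimzml.ImzMLParser import ImzMLParser

def ion_image(imzml_path, mz, tol=0.1):
    """Sum intensities within mz +/- tol for every pixel of the scan."""
    p = ImzMLParser(imzml_path)
    xs = [c[0] for c in p.coordinates]
    ys = [c[1] for c in p.coordinates]
    img = np.zeros((max(ys), max(xs)))            # imzML pixel coordinates are 1-based
    for i, (x, y, *_) in enumerate(p.coordinates):
        mzs, intensities = p.getspectrum(i)
        mzs = np.asarray(mzs)
        intensities = np.asarray(intensities)
        mask = (mzs >= mz - tol) & (mzs <= mz + tol)
        img[y - 1, x - 1] = intensities[mask].sum()
    return img

# Example: distribution image of a hypothetical analyte at m/z 760.6
# image = ion_image("sample.imzML", mz=760.6, tol=0.25)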

  15. Integrating automatic and controlled processes into neurocognitive models of social cognition.

    Science.gov (United States)

    Satpute, Ajay B; Lieberman, Matthew D

    2006-03-24

    Interest in the neural systems underlying social perception has expanded tremendously over the past few decades. However, gaps between behavioral literatures in social perception and neuroscience are still abundant. In this article, we apply the concept of dual-process models to neural systems in an effort to bridge the gap between many of these behavioral studies and neural systems underlying social perception. We describe and provide support for a neural division between reflexive and reflective systems. Reflexive systems correspond to automatic processes and include the amygdala, basal ganglia, ventromedial prefrontal cortex, dorsal anterior cingulate cortex, and lateral temporal cortex. Reflective systems correspond to controlled processes and include lateral prefrontal cortex, posterior parietal cortex, medial prefrontal cortex, rostral anterior cingulate cortex, and the hippocampus and surrounding medial temporal lobe region. This framework is considered to be a working model rather than a finished product. Finally, the utility of this model and its application to other social cognitive domains such as Theory of Mind are discussed.

  16. Automatic screening and classification of diabetic retinopathy and maculopathy using fuzzy image processing.

    Science.gov (United States)

    Rahim, Sarni Suhaila; Palade, Vasile; Shuttleworth, James; Jayne, Chrisina

    2016-12-01

    Digital retinal imaging is a challenging screening method for which effective, robust and cost-effective approaches are still to be developed. Regular screening for diabetic retinopathy and diabetic maculopathy diseases is necessary in order to identify the group at risk of visual impairment. This paper presents a novel automatic detection of diabetic retinopathy and maculopathy in eye fundus images by employing fuzzy image processing techniques. The paper first introduces the existing systems for diabetic retinopathy screening, with an emphasis on the maculopathy detection methods. The proposed medical decision support system consists of four parts, namely: image acquisition, image preprocessing including four retinal structures localisation, feature extraction and the classification of diabetic retinopathy and maculopathy. A combination of fuzzy image processing techniques, the Circular Hough Transform and several feature extraction methods are implemented in the proposed system. The paper also presents a novel technique for the macula region localisation in order to detect the maculopathy. In addition to the proposed detection system, the paper highlights a novel online dataset and it presents the dataset collection, the expert diagnosis process and the advantages of our online database compared to other public eye fundus image databases for diabetic retinopathy purposes.
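
    As one concrete illustration of the techniques named above, the Circular Hough Transform can be used to localise a roughly circular retinal structure such as the optic disc. The sketch below uses OpenCV's implementation; the parameter values and radius range are illustrative assumptions, not the values used in the paper.

import cv2
import numpy as np

def detect_circular_structure(path, min_r=40, max_r=120):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    gray = cv2.medianBlur(gray, 5)                      # suppress noise before voting
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=gray.shape[0] // 2,
                               param1=100, param2=30,
                               minRadius=min_r, maxRadius=max_r)
    if circles is None:
        return None
    x, y, r = np.around(circles[0, 0]).astype(int)
    return x, y, r                                      # centre and radius in pixels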

  17. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind.

    Science.gov (United States)

    Nentjes, Lieke; Bernstein, David; Arntz, Arnoud; van Breukelen, Gerard; Slaats, Mariëtte

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing, on ToM performance in psychopathy. ToM abilities (as assessed with the Reading the Mind in the Eyes Test; RMET; Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001) were compared between 39 PCL-R diagnosed psychopathic offenders, 37 non-psychopathic offenders, and 26 nonoffender controls. Contrary to our hypothesis, psychopathic individuals presented with intact overall RMET performance when restrictions were imposed on how long task stimuli could be processed. In addition, psychopaths did not over-ascribe hostility to task stimuli (i.e., lack of a hostility bias). However, there was a significant three-way interaction between hostility, processing speed, and psychopathy: when there was no time limit on stimulus presentation, psychopathic offenders made fewer errors in identifying more hostile eye stimuli compared to nonoffender controls, who seemed to be less accurate in detecting hostility. Psychopaths' more realistic appraisal of others' malevolent mental states is discussed in the light of theories that stress its potential adaptive function. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. The role of automaticity and attention in neural processes underlying empathy for happiness, sadness, and anxiety

    Directory of Open Access Journals (Sweden)

    Sylvia A. Morelli

    2013-05-01

    Full Text Available Although many studies have examined the neural basis of experiencing empathy, relatively little is known about how empathic processes are affected by different attentional conditions. Thus, we examined whether instructions to empathize might amplify responses in empathy-related regions and whether cognitive load would diminish the involvement of these regions. 32 participants completed a functional magnetic resonance imaging session assessing empathic responses to individuals experiencing happy, sad, and anxious events. Stimuli were presented under three conditions: watching naturally, while instructed to empathize, and under cognitive load. Across analyses, we found evidence for a core set of neural regions that support empathic processes (dorsomedial prefrontal cortex, DMPFC; medial prefrontal cortex, MPFC; temporoparietal junction, TPJ; amygdala; ventral anterior insula, AI; septal area, SA). Two key regions – the ventral AI and SA – were consistently active across all attentional conditions, suggesting that they are automatically engaged during empathy. In addition, watching versus empathizing with targets was not markedly different and instead led to similar subjective and neural responses to others' emotional experiences. In contrast, cognitive load reduced the subjective experience of empathy and diminished neural responses in several regions related to empathy (DMPFC, MPFC, TPJ, amygdala) and social cognition. The current results reveal how attention impacts empathic processes and provide insight into how empathy may unfold in everyday interactions.

  19. Grid infrastructure for automatic processing of SAR data for flood applications

    Science.gov (United States)

    Kussul, Natalia; Skakun, Serhiy; Shelestov, Andrii

    2010-05-01

    More and more geosciences applications are being put onto Grids. Geosciences applications are complex: they involve complex workflows, computationally intensive environmental models, and the management and integration of heterogeneous data sets; Grid computing offers solutions to tackle these problems. Many geosciences applications, especially those related to disaster management and mitigation, require geospatial services to be delivered in good time. For example, information on flooded areas should be provided to the corresponding organizations (local authorities, civil protection agencies, UN agencies, etc.) within 24 h so that the resources required to mitigate the disaster can be allocated effectively. Therefore, providing infrastructure and services that enable automatic generation of products based on the integration of heterogeneous data is a task of great importance. In this paper we present a Grid infrastructure for automatic processing of synthetic-aperture radar (SAR) satellite images to derive flood products. In particular, we use SAR data acquired by ESA's ENVISAT satellite, and neural networks to derive flood extent. The data are provided in operational mode from the ESA rolling archive (within an ESA Category-1 grant). We developed a portal that is based on the OpenLayers framework and provides an access point to the developed services. Through the portal the user can define a geographical region and search for the required data. Upon selection of data sets, a workflow is automatically generated and executed on the resources of the Grid infrastructure. For workflow execution and management we use the Karajan language. The workflow of SAR data processing consists of the following steps: image calibration, image orthorectification, image processing with neural networks, topographic effects removal, geocoding and transformation to lat/long projection, and visualisation. These steps are executed by different software.
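
    The processing chain listed above is a fixed sequence of steps driven by a workflow engine (Karajan, on Grid resources). A minimal sketch of the same idea, written here in Python with placeholder step functions standing in for the real SAR processing software, is:

from typing import Callable, List

def run_flood_workflow(scene_path: str, steps: List[Callable]):
    """Apply each processing step in order, passing intermediate products along."""
    product = scene_path
    for step in steps:
        product = step(product)
    return product

# Step sequence named in the abstract (the functions themselves are placeholders):
# run_flood_workflow("ENVISAT_ASAR_scene.N1", [
#     calibrate, orthorectify, classify_with_neural_network,
#     remove_topographic_effects, geocode_to_latlon, visualise,
# ])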

  20. Automatic methods of the processing of data from track detectors on the basis of the PAVICOM facility

    Science.gov (United States)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-02-01

    New automatic methods essentially simplify and increase the rate of the processing of data from track detectors. This provides a possibility of processing large data arrays and considerably improves their statistical significance. This fact predetermines the development of new experiments which plan to use large-volume targets, large-area emulsion, and solid-state track detectors [1]. In this regard, the problem of training qualified physicists who are capable of operating modern automatic equipment is very important. Annually, about ten Moscow students master the new methods, working at the Lebedev Physical Institute at the PAVICOM facility [2-4]. Most students specializing in high-energy physics are only given an idea of archaic manual methods of the processing of data from track detectors. In 2005, on the basis of the PAVICOM facility and the physics training course of Moscow State University, a new training exercise was prepared. This exercise is devoted to the determination of the energy of neutrons passing through a nuclear emulsion. It provides the possibility of acquiring basic practical skills in the processing of data from track detectors using automatic equipment and can be included in the educational process of students of any physics faculty. Those who have mastered the methods of automatic data processing using the simple and pictorial example of track detectors will be able to apply their knowledge in various fields of science and technology. The formulation of training exercises for undergraduate and graduate students is a new additional aspect of the application of the PAVICOM facility described earlier in [4].

  1. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large-scale collection of driver, vehicle, and environment data in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for decelerations below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data are growing exponentially over time, this reviewing procedure may not be viable for much longer. This study tested the hypothesis that automatic processing of driver video information could improve the correct classification of safety critical events identified by kinematic triggers in naturalistic driving data. A review of about 400 video sequences recorded from such events, collected by 100 Volvo cars in the euroFOT project, suggested that the driver's individual reaction may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, were compared. The results presented in this paper show that the state-of-the-art subjective review procedures used to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS.
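
    A kinematic trigger of the kind described above can be sketched in a few lines: flag samples whose longitudinal deceleration exceeds a threshold and group them into candidate events for later video review. The threshold value below is illustrative, not the one used in euroFOT.

import numpy as np

def harsh_braking_events(accel_mps2, timestamps, threshold=-4.0):
    """Return (start_time, end_time) pairs where longitudinal acceleration < threshold."""
    flagged = np.asarray(accel_mps2) < threshold
    events, start = [], None
    for t, f in zip(timestamps, flagged):
        if f and start is None:
            start = t                      # a candidate event begins
        elif not f and start is not None:
            events.append((start, t))      # and ends when the deceleration relaxes
            start = None
    if start is not None:
        events.append((start, timestamps[-1]))
    return events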

  2. The N400 and Late Positive Complex (LPC) Effects Reflect Controlled Rather than Automatic Mechanisms of Sentence Processing

    Directory of Open Access Journals (Sweden)

    Boris Kotchoubey

    2012-08-01

    Full Text Available This study compared automatic and controlled cognitive processes that underlie event-related potentials (ERPs) effects during speech perception. Sentences were presented to French native speakers, and the final word could be congruent or incongruent, and presented at one of four levels of degradation (using a modulation with pink noise): no degradation, mild degradation (2 levels), or strong degradation. We assumed that degradation impairs controlled more than automatic processes. The N400 and Late Positive Complex (LPC) effects were defined as the differences between the corresponding wave amplitudes to incongruent words minus congruent words. Under mild degradation, where controlled sentence-level processing could still occur (as indicated by behavioral data), both N400 and LPC effects were delayed and the latter effect was reduced. Under strong degradation, where sentence processing was rather automatic (as indicated by behavioral data), no ERP effect remained. These results suggest that ERP effects elicited in complex contexts, such as sentences, reflect controlled rather than automatic mechanisms of speech processing. These results differ from the results of experiments that used word-pair or word-list paradigms.

  3. A comparison of density of Insight and Ektaspeed plus dental x-ray films using automatic and manual processing

    International Nuclear Information System (INIS)

    Yoon, Suk Ja

    2001-01-01

    To compare the film density of Insight dental X-ray film (Eastman Kodak Co., Rochester, NY, USA) with that of Ektaspeed Plus film (Eastman Kodak) under manual and automatic processing conditions. Insight and Ektaspeed Plus films were exposed through an aluminum step wedge placed on the film at three different exposure times. The exposed films were processed both manually and automatically. The base-plus-fog density and the optical density produced by exposing the step wedge were calculated using a digital densitometer (model 07-443, Victoreen Inc, Cleveland, Ohio, USA). The optical densities of the Insight and Ektaspeed films versus the thickness of the aluminum wedge at the same exposure time were plotted on graphs. Statistical analyses were applied to compare the optical densities of the two films. The film density of both Insight and Ektaspeed Plus films under automatic processing was significantly higher than under manual processing, and the film density of Insight was higher than that of Ektaspeed Plus film. To take full advantage of the reduction in exposure time, Insight film should be processed automatically.

  4. In vitro motility evaluation of aggregated cancer cells by means of automatic image processing.

    Science.gov (United States)

    De Hauwer, C; Darro, F; Camby, I; Kiss, R; Van Ham, P; Decaesteker, C

    1999-05-01

    To set up an automatic image-processing-based method that enables the motility of in vitro aggregated cells to be evaluated over a number of hours. Our biological model included the PC-3 human prostate cancer cell line growing as a monolayer on the bottom of Falcon plastic dishes containing conventional culture media. Our equipment consisted of an incubator, an inverted phase contrast microscope, a Charge Coupled Device (CCD) video camera, and a computer equipped with an image processing software developed in our laboratory. This computer-assisted microscope analysis of aggregated cells enables global cluster motility to be evaluated. This analysis also enables the trajectory of each cell to be isolated and parametrized within a given cluster or, indeed, the trajectories of individual cells outside a cluster. The results show that motility inside a PC-3 cluster is not restricted to slight motion due to cluster expansion, but rather consists of a marked cell movement within the cluster. The proposed equipment enables in vitro aggregated cell motility to be studied. This method can, therefore, be used in pharmacological studies in order to select anti-motility-related compounds. The compounds selected by the described equipment could then be tested in vivo as potential anti-metastatic agents.
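
    Once individual trajectories have been isolated, motility can be parametrized with simple descriptors. The sketch below computes path length, net displacement, and straightness from a list of tracked centroids; it illustrates the kind of quantification described and is not the authors' software.

import math

def motility_metrics(track):
    """track: list of (x, y) centroid positions for one cell over time."""
    path = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    net = math.dist(track[0], track[-1])
    straightness = net / path if path > 0 else 0.0   # 1.0 = perfectly straight path
    return {"path_length": path,
            "net_displacement": net,
            "straightness": straightness}

# Example: motility_metrics([(10, 10), (12, 11), (15, 13), (14, 18)])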

  5. Automatic crack detection method for loaded coal in vibration failure process.

    Directory of Open Access Journals (Sweden)

    Chengwu Li

    Full Text Available In the coal mining process, the destabilization of loaded coal mass is a prerequisite for coal and rock dynamic disaster, and surface cracks of the coal and rock mass are important indicators, reflecting the current state of the coal body. The detection of surface cracks in the coal body plays an important role in coal mine safety monitoring. In this paper, a method for detecting the surface cracks of loaded coal in a vibration failure process is proposed based on the characteristics of the surface cracks of coal and a support vector machine (SVM). A large number of cracked images were obtained by establishing a vibration-induced failure test system with an industrial camera. Histogram equalization and a hysteresis threshold algorithm were used to reduce the noise and emphasize the crack; then, 600 images and regions, including cracks and non-cracks, were manually labelled. In the crack feature extraction stage, eight features of the cracks are extracted to distinguish cracks from other objects. Finally, a crack identification model with an accuracy over 95% was trained by inputting the labelled sample images into the SVM classifier. The experimental results show that the proposed algorithm has a higher accuracy than the conventional algorithm and can effectively identify cracks on the surface of the coal and rock mass automatically.
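
    The preprocessing and classification stages named above (histogram equalization, hysteresis thresholding, feature extraction, SVM training) can be sketched as follows. The threshold values are illustrative, and the two features shown are generic stand-ins for the paper's eight crack descriptors.

import cv2
import numpy as np
from skimage.filters import apply_hysteresis_threshold
from sklearn.svm import SVC

def preprocess(gray):
    """Histogram equalization followed by hysteresis thresholding (8-bit input)."""
    eq = cv2.equalizeHist(gray)
    mask = apply_hysteresis_threshold(eq, low=40, high=90)   # thresholds are illustrative
    return mask.astype(np.uint8)

def region_features(mask):
    """Generic stand-in for the paper's eight crack descriptors."""
    area = float(mask.sum())
    h, w = mask.shape
    return [area, area / (h * w)]

# Training on manually labelled crack / non-crack regions:
# X = [region_features(preprocess(img)) for img in labelled_images]
# clf = SVC(kernel="rbf").fit(X, labels)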

  6. A method for automatic control of the process of producing electrode pitch

    Energy Technology Data Exchange (ETDEWEB)

    Rozenman, E.S.; Bugaysen, I.M.; Chernyshov, Yu.A.; Klyusa, M.D.; Krysin, V.P.; Livshits, B.Ya.; Martynenko, V.V.; Meniovich, B.I.; Sklyar, M.G.; Voytenko, B.I.

    1983-01-01

    A method is proposed for the automatic control of the process for producing electrode pitch through regulation of the feed of the starting raw material, with correction based on the pitch level in the last apparatus of the technological line, and through changes in the feed of air into the reactors based on the flow rates of the starting raw material and the temperature of the liquid phase in the reactors. In order to increase the stability of the quality of the electrode pitch when the properties of the starting resin change, the heating temperature of the dehydrated resin in the pipe furnace is regulated relative to the quality of the medium-temperature pitch produced from it, while the level of the liquid phase in the reactor is regulated relative to the quality of the final product. The proposed method provides an improvement in the quality of process regulation, which makes it possible to improve the properties of the anode mass and to reduce its consumption in the production of aluminum.

  7. Automatic and directed search processes in solving simple semantic-memory problems.

    Science.gov (United States)

    Ben-Zur, H

    1989-09-01

    The cognitive processes involved in simple semantic-memory problems were investigated in four experiments. On each trial of Experiments 1 and 2, two stimulus words were presented, with the instructions to find a third word (i.e., the solution) that, when coupled with each of the stimuli, would yield two word pairs used in everyday language (e.g., surprise and birthday, for which the solution is party). The results of the two experiments indicated that informing the subject whether the solution constituted the first or the second element in the word pairs facilitated both likelihood and speed of solution attainment. In addition, solution attainment was relatively high for items based on frequently used word pairs (Experiment 1) and for items in which the stimuli appear, in everyday language, in a small number of word pairs (Experiment 2). In Experiment 3, the subjects were required to produce word pairs containing one of the two stimulus words from the items used in Experiment 2. Solution production was facilitated by rehearsing the second stimulus word of the specific item. The conclusion, supported by a post hoc analysis of the results of Experiments 2 and 3 (Experiment 4), was that indirect priming from one stimulus word may facilitate solution production from a searched word. These results are interpreted in terms of automatic and controlled processes, and their relevance to two different models for retrieval from semantic memory is discussed.

  8. On the Automatic Generation of Plans for Life Cycle Assembly Processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.

    2000-01-01

    Designing products for easy assembly and disassembly during their entire life cycles, for purposes including product assembly, product upgrade, product servicing and repair, and product disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and considering its intended life cycle. Different goals and manufacturing plan selection criteria, as compared to initial assembly, require re-visiting significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or to applied studies of life cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes. The study of assembly planning is at the very heart of manufacturing research facilities and academic engineering institutions; and, in recent years a number of significant advances in the field of assembly planning have been made. These advances have ranged from the development of automated assembly planning systems, such as Sandia's Automated Assembly Analysis System Archimedes 3.0©, to the startling revolution in microprocessors and computer-controlled production tools such as computer-aided design (CAD), computer-aided manufacturing (CAM), flexible manufacturing systems (FMS), and computer-integrated manufacturing (CIM). These results have kindled considerable interest in the study of algorithms for life cycle related assembly processes, which has blossomed into a field of intense interest. The intent of this manuscript is to bring together the fundamental results in this area, so that the unifying principles and underlying concepts of algorithm design may more easily be implemented in practice.

  9. Welding qualification procedure for fuel rods tubes of Zr-Sn alloys by the TIG automatic process

    International Nuclear Information System (INIS)

    1984-11-01

    The requirements for the welding qualification procedure for tubes of Zr-Sn alloys, as specified in the ASTM B353 standard, used in the fabrication of fuel rods for PWR reactors by the automatic TIG process, are presented. (E.G.) [pt]

  10. An Automatic Framework Using Space-Time Processing and TR-MUSIC for Subsurface and Through-Wall Multitarget Imaging

    Directory of Open Access Journals (Sweden)

    Si-hao Tan

    2012-01-01

    Full Text Available We present an automatic framework combining space-time signal processing with Time Reversal electromagnetic (EM) inversion for subsurface and through-wall multitarget imaging using electromagnetic waves. This framework is composed of a frequency-wavenumber (FK) filter to suppress the direct wave and medium bounce, an FK migration algorithm to automatically estimate the number of targets and identify target regions, which can be used to reduce the computational complexity of the subsequent imaging algorithm, and an EM inversion algorithm using Time Reversal Multiple Signal Classification (TR-MUSIC) to reconstruct hidden objects. The feasibility of the framework is demonstrated with simulated data generated by GPRMAX.
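
    The TR-MUSIC stage can be illustrated with a simplified scalar sketch: take the SVD of the multistatic response matrix, keep the noise subspace, and evaluate the pseudospectrum on a grid of trial points. The free-space steering vector, array geometry, and target count below are placeholder assumptions; a subsurface or through-wall model would use the appropriate Green's function.

import numpy as np

def greens_vector(rx_positions, point, k):
    """Free-space scalar steering vector from each receiver to a trial point."""
    d = np.linalg.norm(rx_positions - point, axis=1)
    return np.exp(1j * k * d) / d

def tr_music_map(K, rx_positions, grid, k, n_targets):
    """K: N x N multistatic response matrix; grid: M x 2 array of trial points."""
    U, s, Vh = np.linalg.svd(K)
    noise = U[:, n_targets:]                          # noise-subspace basis
    spectrum = np.empty(len(grid))
    for i, r in enumerate(grid):
        g = greens_vector(rx_positions, r, k)
        g = g / np.linalg.norm(g)
        spectrum[i] = 1.0 / (np.linalg.norm(noise.conj().T @ g) ** 2 + 1e-12)
    return spectrum                                   # peaks indicate target locations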

  11. Seeing race: N170 responses to race and their relation to automatic racial attitudes and controlled processing.

    Science.gov (United States)

    Ofan, Renana H; Rubin, Nava; Amodio, David M

    2011-10-01

    We examined the relation between neural activity reflecting early face perception processes and automatic and controlled responses to race. Participants completed a sequential evaluative priming task, in which two-tone images of Black faces, White faces, and cars appeared as primes, followed by target words categorized as pleasant or unpleasant, while electroencephalography was recorded. Half of these participants were alerted that the task assessed racial prejudice and could reveal their personal bias ("alerted" condition). To assess face perception processes, the N170 component of the ERP was examined. For all participants, stronger automatic pro-White bias was associated with larger N170 amplitudes to Black than White faces. For participants in the alerted condition only, larger N170 amplitudes to Black versus White faces were also associated with less controlled processing on the word categorization task. These findings suggest that preexisting racial attitudes affect early face processing and that situational factors moderate the link between early face processing and behavior.

  12. [Use of nondeclarative and automatic memory processes in motor learning: how to mitigate the effects of aging].

    Science.gov (United States)

    Chauvel, Guillaume; Maquestiaux, François; Didierjean, André; Joubert, Sven; Dieudonné, Bénédicte; Verny, Marc

    2011-12-01

    Does normal aging inexorably lead to diminished motor learning abilities? This article provides an overview of the literature on the question, with particular emphasis on the functional dissociation between two sets of memory processes: declarative, effortful processes, and non-declarative, automatic processes. There is abundant evidence suggesting that aging does impair learning when past memories of former actions are required (episodic memory) and recollected through controlled processing (working memory). However, other studies have shown that aging does not impair learning when motor actions are performed non verbally and automatically (tapping procedural memory). These findings led us to hypothesize that one can minimize the impact of aging on the ability to learn new motor actions by favouring procedural learning. Recent data validating this hypothesis are presented. Our findings underline the importance of developing new motor learning strategies, which "bypass" declarative, effortful memory processes.

  13. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
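
    The central idea of the processing engine, tracking upstream dependencies and skipping work that is already done, can be conveyed with a toy sketch. The real aa framework is implemented in Matlab with XML configuration; the Python below is only an illustration of the concept.

done = set()

def run_module(name, depends_on, action):
    """Run a pipeline module once its upstream dependencies are complete."""
    for dep in depends_on:
        if dep not in done:
            raise RuntimeError(f"{name} is waiting for {dep}")
    if name in done:
        print(f"{name}: already done, skipping")      # avoids redundant recomputation
        return
    action()
    done.add(name)

# run_module("realign", [], lambda: ...)
# run_module("first_level_stats", ["realign"], lambda: ...)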

  14. Acute stress shifts the balance between controlled and automatic processes in prospective memory.

    Science.gov (United States)

    Möschl, Marcus; Walser, Moritz; Plessow, Franziska; Goschke, Thomas; Fischer, Rico

    2017-10-01

    In everyday life we frequently rely on our abilities to postpone intentions until later occasions (prospective memory; PM) and to deactivate completed intentions even in stressful situations. Yet, little is known about the effects of acute stress on these abilities. In the present work we investigated the impact of acute stress on PM functioning under high task demands. (1) Different from previous studies, in which intention deactivation required mostly low processing demands, we used salient focal PM cues to induce high processing demands during intention-deactivation phases. (2) We systematically manipulated PM-monitoring demands in a nonfocal PM task that required participants to monitor for either one or six specific syllables that could occur in ongoing-task words. Eighty participants underwent the Trier Social Stress Test, a standardized stress induction protocol, or a standardized control situation, before performing a computerized PM task. Our primary interests were whether PM performance, PM-monitoring costs, aftereffects of completed intentions and/or commission-error risk would differ between stressed and non-stressed individuals and whether these effects would differ under varying task demands. Results revealed that PM performance and aftereffects of completed intentions during subsequent performance were not affected by acute stress induction, replicating previous findings. Under high demands on intention deactivation (focal condition), however, acute stress produced a nominal increase in erroneous PM responses after intention completion (commission errors). Most importantly, under high demands on PM monitoring (nonfocal condition), acute stress led to a substantial reduction in PM-monitoring costs. These findings support ideas of selective and demand-dependent effects of acute stress on cognitive functioning. Under high task demands, acute stress might induce a shift in processing strategy towards resource-saving behavior.

  15. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  16. Fidelity of Automatic Speech Processing for Adult and Child Talker Classifications.

    Directory of Open Access Journals (Sweden)

    Mark VanDam

    Full Text Available Automatic speech processing (ASP) has recently been applied to very large datasets of naturalistically collected, daylong recordings of child speech via an audio recorder worn by young children. The system developed by the LENA Research Foundation analyzes children's speech for research and clinical purposes, with special focus on identifying and tagging family speech dynamics and the at-home acoustic environment from the auditory perspective of the child. A primary issue for researchers, clinicians, and families using the Language ENvironment Analysis (LENA) system is to what degree the segment labels are valid. This classification study evaluates the performance of the computer ASP output against 23 trained human judges who made about 53,000 judgements of classification of segments tagged by the LENA ASP. Results indicate performance consistent with modern ASP such as those using HMM methods, with acoustic characteristics of fundamental frequency and segment duration most important for both human and machine classifications. Results are likely to be important for interpreting and improving ASP output.
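
    A validation study of this kind boils down to comparing machine labels with human labels over many segments. A hedged sketch of the sort of agreement statistics one might compute (confusion matrix and Cohen's kappa, here via scikit-learn) is shown below; the label values are placeholders, not actual LENA output.

from sklearn.metrics import cohen_kappa_score, confusion_matrix

machine = ["adult_male", "child", "adult_female", "child"]   # ASP segment tags (placeholder data)
human = ["adult_male", "child", "child", "child"]            # human judges' tags (placeholder data)

labels = ["adult_male", "adult_female", "child"]
print(confusion_matrix(human, machine, labels=labels))
print("kappa:", cohen_kappa_score(human, machine))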

  17. A method of automatic control of the process of compressing pyrogas in olefin production

    Energy Technology Data Exchange (ETDEWEB)

    Podval' niy, M.L.; Bobrovnikov, N.R.; Kotler, L.D.; Shib, L.M.; Tuchinskiy, M.R.

    1982-01-01

    In the known method of automatically controlling the process of compressing pyrogas in olefin production, the supply of cooling agents to the interstage coolers of the compression unit is regulated depending on the flow of hydrocarbons to the compression unit. To raise performance by lowering the deposition of polymers on the flow-through surfaces of the equipment, the coolant supply is also regulated as a function of the flows of hydrocarbons from the upper and lower parts of the demethanizer and the bottoms of the stripping tower. The coolant supply is regulated proportional to the difference between the flow of stripping tower bottoms and the ratio of the hydrocarbon flow from the upper and lower parts of the demethanizer to the hydrocarbon flow in the compression unit. With an increase in the proportion of light hydrocarbons (the sum of the upper and lower demethanizer products) in the total flow of pyrogas going to compression, the flow of coolant to the compression unit is reduced. Condensation of the given fractions in the separators, and their amount in the condensate going through the piping to the stripping tower, is thereby reduced. With a reduction in the proportion of light hydrocarbons in the pyrogas, the flow of coolant is increased, thus improving condensation of heavy hydrocarbons in the separators and removing them from the compression unit in the bottoms of the stripping tower.

  18. Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images.

    Science.gov (United States)

    Kimori, Yoshitaka; Baba, Norio; Morone, Nobuhiro

    2010-07-08

    A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. A new method to extract closely located, small target spots from biological images is proposed. This method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed the better performance of our method. The spots of real microscope images can be quantified to confirm that the method is applicable in a given practice. Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. The features of our method allow its broad application in biological and biomedical image information analysis.
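
    The core of the method, opening rotated copies of the image with a straight line-segment structuring element, unifying the openings and subtracting the result from the original, can be sketched as follows. This is an interpretation of the published procedure, not the authors' implementation; the line length and set of rotation angles are illustrative.

import numpy as np
from scipy import ndimage

def rotational_line_tophat(image, line_length=15, angles=range(0, 180, 15)):
    image = np.asarray(image, dtype=float)
    line = np.ones((1, line_length), dtype=bool)          # straight line-segment structuring element
    background = np.zeros_like(image)
    for a in angles:
        rotated = ndimage.rotate(image, a, reshape=False, order=1)
        opened = ndimage.grey_opening(rotated, footprint=line)
        restored = ndimage.rotate(opened, -a, reshape=False, order=1)
        background = np.maximum(background, restored)     # union of the directional openings
    return image - np.minimum(background, image)          # small spots remain in the residual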

  19. An Overview of Automaticity and Implications For Training the Thinking Process

    National Research Council Canada - National Science Library

    Holt, Brian

    2002-01-01

    ...., visual search to battlefield thinking). The results of this examination suggest that automaticity can be developed using consistent rules and extensive practice that vary depending on the type of task...

  20. VACTIV: A graphical dialog based program for an automatic processing of line and band spectra

    Science.gov (United States)

    Zlokazov, V. B.

    2013-05-01

    The program VACTIV (Visual ACTIV) has been developed for the automatic analysis of spectrum-like distributions, in particular gamma-ray spectra or alpha-spectra, and is a standard graphical dialog based Windows XX application, driven by menu, mouse and keyboard. On the one hand, it was a conversion of an existing Fortran program ACTIV [1] to the DELPHI language; on the other hand, it is a transformation of the sequential syntax of Fortran programming to a new object-oriented style, based on the organization of event interactions. New features implemented in the algorithms of both versions consisted in the following: as peak model, both an analytical function and a graphical curve could be used; the peak search algorithm was able to recognize not only Gauss peaks but also peaks with an irregular form; both narrow peaks (2-4 channels) and broad ones (50-100 channels); the regularization technique in the fitting guaranteed a stable solution in the most complicated cases of strongly overlapping or weak peaks. The graphical dialog interface of VACTIV is much more convenient than the batch mode of ACTIV. [1] V.B. Zlokazov, Computer Physics Communications, 28 (1982) 27-37.
    NEW VERSION PROGRAM SUMMARY
    Program Title: VACTIV
    Catalogue identifier: ABAC_v2_0
    Licensing provisions: no
    Programming language: DELPHI 5-7 Pascal
    Computer: IBM PC series
    Operating system: Windows XX
    RAM: 1 MB
    Keywords: Nuclear physics, spectrum decomposition, least squares analysis, graphical dialog, object-oriented programming
    Classification: 17.6
    Catalogue identifier of previous version: ABAC_v1_0
    Journal reference of previous version: Comput. Phys. Commun. 28 (1982) 27
    Does the new version supersede the previous version?: Yes
    Nature of problem: Program VACTIV is intended for precise analysis of arbitrary spectrum-like distributions, e.g. gamma-ray and X-ray spectra, and allows the user to carry out the full cycle of automatic processing of such spectra, i.e. calibration, automatic peak search
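
    VACTIV itself is a Delphi/Pascal application, but the basic operation behind its analytical peak model, fitting a Gaussian peak plus background to a spectrum window by non-linear least squares, can be illustrated independently. The sketch below uses SciPy with naive initial guesses; it omits the regularization and irregular-peak handling described above.

import numpy as np
from scipy.optimize import curve_fit

def peak_model(x, amp, centre, sigma, slope, offset):
    """One Gaussian peak on a linear background."""
    return amp * np.exp(-0.5 * ((x - centre) / sigma) ** 2) + slope * x + offset

def fit_peak(channels, counts):
    x = np.asarray(channels, dtype=float)
    y = np.asarray(counts, dtype=float)
    guess = [y.max() - y.min(), x[np.argmax(y)], (x[-1] - x[0]) / 10.0, 0.0, y.min()]
    params, cov = curve_fit(peak_model, x, y, p0=guess)
    return params, np.sqrt(np.diag(cov))                  # fitted values and 1-sigma errors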

  1. Discrete data processing from a scintillation sensor for exposure automatic machine in medical radiography

    International Nuclear Information System (INIS)

    Karadimov, D.; Petukhov, N.; Dimitrova, S.; Vasilev, M.

    1983-01-01

    An exposure automatic machine with a thin plastic scintillator plate collimated by a light guide with a photoconverter is described. Owing to the nature of the physical processes in the scintillator registered by the photomultiplier, both radiation analysis and digital processing of the results are possible. Two alternative devices are shown. In the first of them, the photomultiplier output signal is fed to a pulse-height selector, at the output of which standard logical level pulses are obtained. These pulses are fed to a data register and then to a digital comparator, where they are compared with a preset quantity selected depending on the X-ray film sensitivity and foil combinations as well as on the desired film darkening. The interruption of the ionizing radiation is controlled by a switch unit. The detector spectral sensitivity correction is accomplished by changing the photomultiplier supply voltage. In the second alternative, a noise and ionizing radiation discriminator is used, in which pulse-height selection according to the radiant energy is carried out. A digital comparator and a switching circuit control the ionizing radiation. By a second switching circuit, the spectrally distributed pulses from the discriminator are fed to a spectral analyser which dynamically controls the digital comparator, compensating for the influence of the ionizing radiation spectral response. The advantage of the second alternative is that it allows the radiation parameters to be controlled both in radiograph mode and in X-ray examination mode. Owing to the fast action of the system, the device can be used to measure very short exposures as well as in serial examinations.

  2. Process and device for automatic control of air ratio in combustion

    Energy Technology Data Exchange (ETDEWEB)

    Rohr, F J; Holick, H

    1976-06-24

    The invention concerns a process for the automatic control of the air ratio in combustion, by setting the fuel-air mixture for combustion depending on the air number lambda. The control of the air ratio of combustion engines is carried out using a zirconium dioxide measuring probe, which is situated in the exhaust gas. Its disadvantage is that it is only sensitive at an air number lambda of 1. In order to achieve control of the air ratio for air numbers greater or smaller than 1, according to the invention an auxiliary gas is mixed with the hot exhaust gas, or a component of the gas is withdrawn, so that a corrected exhaust gas flow is produced, whose air number is detected by the measuring sensor and controlled to a value of about 1. The auxiliary gas flow is chosen so that an air ratio differing from lambda equals 1 is obtained when the air number of the corrected exhaust gas flow is regulated to a value of approximately lambda equals 1. In order to keep the demand for auxiliary gas low, only part of the exhaust gas flow is used for the measurement. The exhaust gas part flow is kept constant while the auxiliary gas flow or the removed component gas flow is altered. Hydrogen or oxygen is used as the auxiliary gas, depending on whether excess or reduced air is required. Instead of hydrogen, fuel or its combustion products can be used. According to the invention, the hydrogen or oxygen can be produced electrolytically. Dosing takes place via the current used for electrolysis.

  3. Methodology for automatic process of the fired ceramic tile's internal defect using IR images and artificial neural network

    OpenAIRE

    Andrade, Roberto Márcio de; Eduardo, Alexandre Carlos

    2011-01-01

    In the ceramic industry, testing systems have rarely been employed to detect on-line the presence of defects in ceramic tiles. This paper is concerned with the problem of automatic inspection of ceramic tiles using infrared images and an Artificial Neural Network (ANN). The performance of the technique has been evaluated theoretically and experimentally from laboratory and on-line tile samples. A system for IR image processing has been implemented and, utilizing an Artificial Neural Network (ANN), det...

  4. Automatic Service Derivation from Business Process Model Repositories via Semantic Technology

    NARCIS (Netherlands)

    Leopold, H.; Pittke, F.; Mendling, J.

    2015-01-01

    Although several approaches for service identification have been defined in research and practice, there is a notable lack of fully automated techniques. In this paper, we address the problem of manual work in the context of service derivation and present an approach for automatically deriving services from business process model repositories.

  5. Experiences with automatic N and P measurements of an activated sludge process in a research environment

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Temmink, H.

    1996-01-01

    Some of the advantages of on-line automatic measurement of ammonia, nitrate and phosphate for studying activated sludge systems are pointed out with the help of examples of batch experiments. Sample taking is performed by cross-flow filtration and measurement of all three analytes is performed by...

  6. It's a two-way street: Automatic and controlled processes in children's emotional responses to moral transgressions.

    Science.gov (United States)

    Dys, Sebastian P; Malti, Tina

    2016-12-01

    This study examined children's automatic, spontaneous emotional reactions to everyday moral transgressions and their relations with self-reported emotions, which are more complex and infused with controlled cognition. We presented children (N = 242; 4-, 8-, and 12-year-olds) with six everyday moral transgression scenarios in an experimental setting, and both their spontaneous facial emotional reactions and self-reported emotions in the role of the transgressor were recorded. We found that across age self-reported guilt was positively associated with spontaneous fear, and self-reported anger was positively related to spontaneous sadness. In addition, we found a developmental increase in spontaneous sadness and decrease in spontaneous happiness. These results support the importance of automatic and controlled processes in evoking children's emotional responses to everyday moral transgressions. We conclude by providing potential explanations for how automatic and controlled processes function in children's everyday moral experiences and how these processes may change with age. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Development of a CCD based system called DIGITRACK for automatic track counting and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Molnar, J.; Somogyi, G.; Szilagyi, S.; Sepsy, K. (Magyar Tudomanyos Akademia, Debrecen. Atommag Kutato Intezete)

    1984-01-01

    We have developed, to the best of our knowledge, the first automatic track analysis system (DIGITRACK) in which the video signals are processed by a new type of video-receiver called charge-coupled device (CCD). The photosensitive semi-conductor device is a 2.5 cm long line imager of type Fairchild CCD 121HC which converts one row of the picture seen through a low magnification microscope into 1728 binary signals by a thresholding logic. The picture elements are analysed by a microcomputer equipped with two INTEL 8080 microprocessors and interfaced to a PDP-11/40 computer. The microcomputer also controls the motion of the stage of microscope. For pattern recognition and analysis a software procedure is developed which is able to differentiate between overlapping tracks and to determine the number, surface opening and x-y coordinates of the tracks occurring in a given detector area. The distribution of track densities and spot areas on the detector surface can be visualized on a graphic display. The DIGITRACK system has been tested for analysis of alpha-tracks registered in CR-39 and LR-115 detectors.
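
    The thresholding logic described above, which turns one row of the picture into 1728 binary signals, can be mimicked in a few lines: binarise the CCD scan line and count contiguous above-threshold runs as candidate track openings. The threshold value is illustrative, and the sketch ignores the microscope-stage control and overlapping-track analysis handled by DIGITRACK.

import numpy as np

def count_track_runs(scan_line, threshold=128):
    """Binarise one CCD scan line and count contiguous above-threshold runs."""
    binary = np.asarray(scan_line) > threshold
    # a run starts wherever the binary signal steps from 0 to 1
    starts = np.flatnonzero(np.diff(np.concatenate(([0], binary.astype(int)))) == 1)
    return binary, len(starts)

# line = np.random.randint(0, 256, 1728)   # stand-in for one 1728-sample imager row
# _, n_candidates = count_track_runs(line)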

  8. Automatic and Intentional Number Processing Both Rely on Intact Right Parietal Cortex: A Combined fMRI and Neuronavigated TMS Study

    Science.gov (United States)

    Cohen Kadosh, Roi; Bien, Nina; Sack, Alexander T.

    2012-01-01

    Practice and training usually lead to performance increase in a given task. In addition, a shift from intentional toward more automatic processing mechanisms is often observed. It is currently debated whether automatic and intentional processing is subserved by the same or by different mechanism(s), and whether the same or different regions in the brain are recruited. Previous correlational evidence provided by behavioral, neuroimaging, modeling, and neuropsychological studies addressing this question yielded conflicting results. Here we used transcranial magnetic stimulation (TMS) to compare the causal influence of disrupting either left or right parietal cortex during automatic and intentional numerical processing, as reflected by the size congruity effect and the numerical distance effect, respectively. We found a functional hemispheric asymmetry within parietal cortex with only the TMS-induced right parietal disruption impairing both automatic and intentional numerical processing. In contrast, disrupting the left parietal lobe with TMS, or applying sham stimulation, did not affect performance during automatic or intentional numerical processing. The current results provide causal evidence for the functional relevance of right, but not left, parietal cortex for intentional, and automatic numerical processing, implying that at least within the parietal cortices, automatic, and intentional numerical processing rely on the same underlying hemispheric lateralization. PMID:22347175

  9. Automatic processing of facial affects in patients with borderline personality disorder: associations with symptomatology and comorbid disorders.

    Science.gov (United States)

    Donges, Uta-Susan; Dukalski, Bibiana; Kersting, Anette; Suslow, Thomas

    2015-01-01

    Instability of affects and interpersonal relations are important features of borderline personality disorder (BPD). Interpersonal problems of individuals suffering from BPD might develop based on abnormalities in the processing of facial affects and high sensitivity to negative affective expressions. The aims of the present study were to examine automatic evaluative shifts and latencies as a function of masked facial affects in patients with BPD compared to healthy individuals. As BPD comorbidity rates for mental and personality disorders are high, we also investigated the relationships of affective processing characteristics with specific borderline symptoms and comorbidity. Twenty-nine women with BPD and 38 healthy women participated in the study. The majority of patients suffered from additional Axis I disorders and/or additional personality disorders. In the priming experiment, an angry, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Evaluative decisions and response latencies were registered. Borderline-typical symptomatology was assessed with the Borderline Symptom List. In the total sample, valence-congruent evaluative shifts and delays of evaluative decision due to facial affect were observed. No between-group differences were obtained for evaluative decisions and latencies. The presence of comorbid anxiety disorders was found to be positively correlated with evaluative shifting owing to masked happy primes, regardless of baseline (neutral or no facial expression condition). The presence of comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression were significantly correlated with response delay due to masked angry faces, regardless of baseline. In the present affective priming study, no abnormalities in the automatic recognition and processing of facial affects were observed in BPD patients compared to healthy individuals.

  10. Pedestrians' intention to jaywalk: Automatic or planned? A study based on a dual-process model in China.

    Science.gov (United States)

    Xu, Yaoshan; Li, Yongjuan; Zhang, Feng

    2013-01-01

    The present study investigates the determining factors of Chinese pedestrians' intention to violate traffic laws using a dual-process model. This model divides the cognitive processes of intention formation into controlled analytical processes and automatic associative processes. Specifically, the process explained by the augmented theory of planned behavior (TPB) is controlled, whereas the process based on past behavior is automatic. The results of a survey conducted on 323 adult pedestrian respondents showed that the two added TPB variables had different effects on the intention to violate, i.e., personal norms were significantly related to traffic violation intention, whereas descriptive norms were non-significant predictors. Past behavior significantly but uniquely predicted the intention to violate: the results of the relative weight analysis indicated that the largest percentage of variance in pedestrians' intention to violate was explained by past behavior (42%). According to the dual-process model, therefore, pedestrians' intention formation relies more on habit than on cognitive TPB components and social norms. The implications of these findings for the development of intervention programs are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Automatic and controlled attentional orienting in the elderly: A dual-process view of the positivity effect.

    Science.gov (United States)

    Gronchi, G; Righi, S; Pierguidi, L; Giovannelli, F; Murasecco, I; Viggiano, M P

    2018-04-01

    The positivity effect in the elderly consists of an attentional preference for positive information as well as avoidance of negative information. Extant theories predict either that the positivity effect depends on controlled attentional processes (socio-emotional selectivity theory), or on an automatic gating selection mechanism (dynamic integration theory). This study examined the role of automatic and controlled attention in the positivity effect. Two dot-probe tasks (with the duration of the stimuli lasting 100 ms and 500 ms, respectively) were employed to compare the attentional bias of 35 elderly people to that of 35 young adults. The stimuli used were expressive faces displaying neutral, disgusted, fearful, and happy expressions. In comparison to young people, the elderly allocated more attention to happy faces at 100 ms and they tended to avoid fearful faces at 500 ms. The findings are not predicted by either theory taken alone, but support the hypothesis that the positivity effect in the elderly is driven by two different processes: an automatic attention bias toward positive stimuli, and a controlled mechanism that diverts attention away from negative stimuli. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Development of Automatic Live Linux Rebuilding System with Flexibility in Science and Engineering Education and Applying to Information Processing Education

    Science.gov (United States)

    Sonoda, Jun; Yamaki, Kota

    We develop an automatic Live Linux rebuilding system for science and engineering education, such as information processing education, numerical analysis and so on. Our system makes it easy to automatically rebuild a customized Live Linux from an ISO image of Ubuntu, which is one of the Linux distributions. It is also easy to install/uninstall packages and to enable/disable init daemons. When we rebuild a Live Linux CD using our system, the number of operations is 8, and the rebuilding time is about 33 minutes for the CD version and about 50 minutes for the DVD version. Moreover, we have applied the rebuilt Live Linux CD in an information processing class at our college. From a questionnaire survey of the 43 students who used the Live Linux CD, we found that the Live Linux was useful for about 80 percent of the students. From these results, we conclude that our system can easily and automatically rebuild a useful Live Linux in a short time.

  13. Analysis and synthesis of a system for optimal automatic regulation of the process of mechanical cutting by a combine

    Energy Technology Data Exchange (ETDEWEB)

    Pop, E.; Coroescu, T.; Poanta, A.; Pop, M.

    1978-01-01

    An uncontrolled dynamic operating regime of a combine has several negative effects. Uncontrolled changes in productivity and cutting rate reduce overall productivity, the cutters of the cutting mechanism wear out prematurely, and the quality of the coal decreases. Complications with combine control further reduce productivity, while the motor is exposed to maximum loads, its service life decreases, and electricity is consumed inefficiently. Studies of the optimal automatic regulation of the cutting process were made by modelling analysis on digital and analog computers. The method uses an electronic automatic device with integrated circuits of domestic production (A-741, A-723). This device monitors and regulates the current parameters of the drive motor. The device includes an information element of the Hall TH transducer type; the regulating elements are an electronic relay, an electronic power distributor, etc.

  14. Automatic macroscopic characterization of diesel sprays by means of a new image processing algorithm

    Science.gov (United States)

    Rubio-Gómez, Guillermo; Martínez-Martínez, S.; Rua-Mojica, Luis F.; Gómez-Gordo, Pablo; de la Garza, Oscar A.

    2018-05-01

    A novel algorithm is proposed for the automatic segmentation of diesel spray images and the calculation of their macroscopic parameters. The algorithm automatically detects each spray present in an image, and therefore it is able to work with diesel injectors with a different number of nozzle holes without any modification. The main characteristic of the algorithm is that it splits each spray into three different regions and then segments each one with an individually calculated binarization threshold. Each threshold level is calculated from the analysis of a representative luminosity profile of each region. This approach makes it robust to irregular light distribution along a single spray and between different sprays of an image. Once the sprays are segmented, the macroscopic parameters of each one are calculated. The algorithm is tested with two sets of diesel spray images taken under normal and irregular illumination setups.
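
    A minimal sketch of the per-region thresholding idea described above, assuming a grayscale region of interest containing a single spray with the nozzle at row 0. The split into three regions follows the abstract; the fraction-of-peak threshold rule, the background estimate and the pixel size are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np

    def segment_spray(spray_roi, n_regions=3, frac=0.35):
        """spray_roi: 2D grayscale array containing one spray, nozzle at row 0."""
        h = spray_roi.shape[0]
        bounds = np.linspace(0, h, n_regions + 1, dtype=int)
        mask = np.zeros(spray_roi.shape, dtype=bool)
        for r0, r1 in zip(bounds[:-1], bounds[1:]):
            region = spray_roi[r0:r1]
            profile = region.mean(axis=0)               # representative luminosity profile
            background = np.percentile(profile, 10)     # assumed background estimate
            threshold = background + frac * (profile.max() - background)
            mask[r0:r1] = region > threshold            # per-region binarization
        return mask

    def macroscopic_parameters(mask, pixel_mm=0.1):
        """Spray tip penetration (axial extent) and projected area from the binary mask."""
        rows = np.where(mask.any(axis=1))[0]
        penetration = (rows.max() + 1) * pixel_mm if rows.size else 0.0
        area = mask.sum() * pixel_mm ** 2
        return penetration, area

    roi = np.zeros((90, 40))
    roi[:, 15:25] = 120.0          # bright spray core on a dark background
    print(macroscopic_parameters(segment_spray(roi)))
    ```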

  15. Adaptive Filter Used as a Dynamic Compensator in Automatic Gauge Control of Strip Rolling Processes

    Directory of Open Access Journals (Sweden)

    N. ROMAN

    2010-12-01

    The paper deals with a control structure for strip thickness in a rolling mill of quarto type (AGC – Automatic Gauge Control). It performs two functions: compensation of the errors induced by the non-ideal dynamics of the tracking systems driven by the AGC system, and adaptation of the control to changes in the dynamic properties of the tracking systems. The compensation of dynamic errors is achieved through inverse models of the tracking systems, implemented as adaptive filters.
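
    A hedged sketch of the compensation idea: an LMS adaptive filter learns an approximate inverse model of a tracking (actuator) system, so that its output reconstructs the original command despite the actuator's non-ideal dynamics. The first-order plant, the tap count and the step size are assumptions for illustration; the paper's actual filter structure is not given in the abstract.

    ```python
    import numpy as np

    def lms_inverse_model(u, d, n_taps=8, mu=0.05):
        """u: measured actuator output; d: original command the inverse model should recover."""
        w = np.zeros(n_taps)
        y = np.zeros_like(u)
        for k in range(n_taps - 1, len(u)):
            x = u[k - n_taps + 1:k + 1][::-1]   # current and past actuator samples
            y[k] = w @ x
            e = d[k] - y[k]                     # reconstruction error
            w += mu * e * x                     # LMS weight update
        return w, y

    # Toy actuator with non-ideal (first-order lag) dynamics: a[k] = 0.7*a[k-1] + 0.3*c[k]
    rng = np.random.default_rng(0)
    cmd = rng.normal(size=2000)
    act = np.zeros_like(cmd)
    for k in range(1, len(cmd)):
        act[k] = 0.7 * act[k - 1] + 0.3 * cmd[k]

    w, rec = lms_inverse_model(act, cmd)
    print("residual error on last 200 samples:", np.mean((cmd[-200:] - rec[-200:]) ** 2))
    ```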

  16. Automatic Generation of Object Models for Process Planning and Control Purposes using an International standard for Information Exchange

    Directory of Open Access Journals (Sweden)

    Petter Falkman

    2003-10-01

    In this paper a formal mapping between static information models and dynamic models is presented. The static information models are given according to an international standard for product, process and resource information exchange (ISO 10303-214). The dynamic models are described as Discrete Event Systems. The product, process and resource information is automatically converted into product routes and used for simulation, controller synthesis and verification. A high-level language, combining Petri nets and process algebra, is presented and used for specification of the desired routes. A main implication of the presented method is that it enables the reuse of process information when creating dynamic models for process control. This method also enables simulation and verification to be conducted early in the development chain.

  17. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with an unmanned aerial vehicle (UAV) system has used the UAV and remote control through a connection to the ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, this existing method of using an RF modem has limitations in long-distance communication. The smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system is an image-capturing device for the drone in areas that need image capturing, together with software for loading and managing the smart camera. The system is composed of automatic shooting using the sensors of the smart camera and shooting catalog management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open source tools used for development were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  18. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Recently, aerial photography with an unmanned aerial vehicle (UAV) system has used the UAV and remote control through a connection to the ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, this existing method of using an RF modem has limitations in long-distance communication. The smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system is an image-capturing device for the drone in areas that need image capturing, together with software for loading and managing the smart camera. The system is composed of automatic shooting using the sensors of the smart camera and shooting catalog management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open source tools used for development were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  19. A Review of Automatic Methods Based on Image Processing Techniques for Tuberculosis Detection from Microscopic Sputum Smear Images.

    Science.gov (United States)

    Panicker, Rani Oomman; Soman, Biju; Saini, Gagan; Rajan, Jeny

    2016-01-01

    Tuberculosis (TB) is an infectious disease caused by the bacteria Mycobacterium tuberculosis. It primarily affects the lungs, but it can also affect other parts of the body. TB remains one of the leading causes of death in developing countries, and its recent resurgences in both developed and developing countries warrant global attention. The number of deaths due to TB is very high (as per the WHO report, 1.5 million died in 2013), although most are preventable if diagnosed early and treated. There are many tools for TB detection, but the most widely used one is sputum smear microscopy. It is done manually and is often time consuming; a laboratory technician is expected to spend at least 15 min per slide, limiting the number of slides that can be screened. Many countries, including India, have a dearth of properly trained technicians, and they often fail to detect TB cases due to the stress of a heavy workload. Automatic methods are generally considered as a solution to this problem. Attempts have been made to develop automatic approaches to identify TB bacteria from microscopic sputum smear images. In this paper, we provide a review of automatic methods based on image processing techniques published between 1998 and 2014. The review shows that the accuracy of algorithms for the automatic detection of TB increased significantly over the years and gladly acknowledges that commercial products based on published works also started appearing in the market. This review could be useful to researchers and practitioners working in the field of TB automation, providing a comprehensive and accessible overview of methods of this field of research.

  20. The development of automatic chemical processing system for 67Ga production

    International Nuclear Information System (INIS)

    Lee, Dong Hoon; Suh, Yong Sup; Yang, Seung Dae; Chun, Kwon Soo; Hur, Min Goo; Yun, Yong Ki; Kim, Yoon Jong; Hong, Seung Hong

    2003-01-01

    An automatic system for the production of 67Ga, used for the diagnosis of malignant tumors, has been developed. Solvent extraction and ion exchange chromatography were used for the separation of 67Ga from the irradiated enriched 68Zn. This system consisted of a solvent separation unit, which was composed of micro conductivity cells, air supply tubes, solvent transfer tubes, solenoid valves and glassware, a PLC-based controller and a PMU user interface unit for automation. The radiation exposure to the workers and the production time can both be reduced by employing this system during the 67Ga production phase. As a result, mass production of 67Ga with high efficiency became possible.

  1. Well scintillation counter with automatic sample changing and data processing: an inexpensive instrument incorporating consumer products

    International Nuclear Information System (INIS)

    Dudley, R.A.; Figdor, H.C.; Keroe, E.A.; Morris, A.C. Jr.; Mutz, O.J.

    1977-01-01

    An automatic well scintillation-counting system suitable for in vitro assays with 125I has been designed with the express purpose of allowing effective operation and maintenance in laboratories in developing countries. The system incorporates comparatively simple components, notably two consumer products: a Kodak Carousel slide projector as sample changer and a Hewlett-Packard HP-97 programmable printing calculator as system controller and data processor. The instrument can accommodate 80 counting vials of dimensions 12 mm diameter x 75 mm, or 40 vials of 16 mm diameter x 100 mm. The calculator provides on-line control and data reduction with the mediation of an interface somewhat resembling that required between a scaler and a printer. Its program capacity is adequate for fairly complicated on-line operations, including interpolation from a standard curve in logit-log space, calculation of the error in hormone concentration, and termination of counting when the counting error is reduced to a prescribed fraction of the composite of other random assay errors (as stored in the calculator's memory). This system is inexpensive, robust, and capable of being operated manually if the automatic accessories fail. It could be improved in several ways, particularly by providing for operation from batteries and, no doubt in the immediate future, substitution of the next generation of cheaper and more powerful calculators. The instrument may be cost-effective in any small to medium-sized laboratory. (orig.)
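
    The counting termination rule mentioned above can be made concrete: for N accumulated counts the relative Poisson counting error is 1/sqrt(N), and counting stops once this falls below a prescribed fraction of the other random assay errors. The sketch below illustrates the arithmetic with assumed values; it is not the HP-97 program itself.

    ```python
    import math

    def counts_needed(other_rel_error=0.05, fraction=0.5):
        """Counts N such that the relative counting error 1/sqrt(N) <= fraction * other errors."""
        target = fraction * other_rel_error
        return math.ceil(1.0 / target ** 2)

    def should_stop(counts_so_far, other_rel_error=0.05, fraction=0.5):
        """Termination test applied after each counting interval."""
        if counts_so_far == 0:
            return False
        return 1.0 / math.sqrt(counts_so_far) <= fraction * other_rel_error

    print(counts_needed())                      # 1600 counts give a 2.5% counting error
    print(should_stop(1200), should_stop(1700))
    ```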

  2. Cognition and balance control: does processing of explicit contextual cues of impending perturbations modulate automatic postural responses?

    Science.gov (United States)

    Coelho, Daniel Boari; Teixeira, Luis Augusto

    2017-08-01

    Processing of predictive contextual cues of an impending perturbation is thought to induce adaptive postural responses. Cueing in previous research has been provided through repeated perturbations with a constant foreperiod. This experimental strategy confounds explicit predictive cueing with adaptation and with non-specific properties of temporal cueing. Two experiments were performed to assess these factors separately. To perturb upright balance, the base of support was suddenly displaced backwards by one of three amplitudes: 5, 10 and 15 cm. In Experiment 1, we tested the effect of cueing the amplitude of the impending postural perturbation by means of visual signals, and the effect of adaptation to repeated exposures, by comparing block versus random sequences of perturbation. In Experiment 2, we evaluated separately the effects of cueing the characteristics of an impending balance perturbation and cueing the timing of perturbation onset. Results from Experiment 1 showed that the block sequence of perturbations led to increased stability of automatic postural responses, and to modulation of the magnitude and onset latency of muscular responses. Results from Experiment 2 showed that only the condition cueing the timing of platform translation onset led to increased balance stability and modulation of the onset latency of muscular responses. Conversely, cueing platform displacement amplitude failed to induce any effects on automatic postural responses in both experiments. Our findings support the interpretation of improved postural responses via optimized sensorimotor processes, while casting doubt on the notion that cognitive processing of explicit contextual cues advancing the magnitude of an impending perturbation can preset adaptive postural responses.

  3. Controlling the COD removal of an A-stage pilot study with instrumentation and automatic process control.

    Science.gov (United States)

    Miller, Mark W; Elliott, Matt; DeArmond, Jon; Kinyua, Maureen; Wett, Bernhard; Murthy, Sudhir; Bott, Charles B

    2017-06-01

    The pursuit of fully autotrophic nitrogen removal via the anaerobic ammonium oxidation (anammox) pathway has led to an increased interest in carbon removal technologies, particularly the A-stage of the adsorption/bio-oxidation (A/B) process. The high-rate operation of the A-stage and the lack of automatic process control often result in wide variations of chemical oxygen demand (COD) removal that can ultimately impact nitrogen removal in the downstream B-stage process. This study evaluated the use of dissolved oxygen (DO)- and mixed liquor suspended solids (MLSS)-based automatic control strategies through the use of in situ on-line sensors in the A-stage of an A/B pilot study. The objective of using these control strategies was to reduce the variability of COD removal by the A-stage and thus the variability of the effluent C/N. The use of cascade DO control in the A-stage did not impact COD removal at the conditions tested in this study, likely because the bulk DO concentration (>0.5 mg/L) was maintained above the half-saturation coefficient of heterotrophic organisms for DO. MLSS-based solids retention time (SRT) control, where MLSS was used as a surrogate for SRT, did not significantly reduce the effluent C/N variability, but it was able to reduce COD removal variation in the A-stage by 90%.
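
    A minimal sketch of MLSS-based SRT control as described above: the waste sludge flow is adjusted so that measured MLSS tracks a setpoint, with MLSS acting as a surrogate for SRT. The proportional gain, base waste flow and tank values are illustrative assumptions, not parameters of the pilot plant.

    ```python
    def waste_flow(mlss_measured, mlss_setpoint, base_waste_m3d=50.0, gain=0.5):
        """Waste activated sludge flow (m3/d) for the next control interval (proportional rule)."""
        error = mlss_measured - mlss_setpoint          # mg/L above target -> waste more sludge
        return max(0.0, base_waste_m3d + gain * error)

    def srt_days(volume_m3, mlss, waste_flow_m3d, waste_tss):
        """Classical SRT estimate: solids inventory divided by solids wasted per day."""
        return (volume_m3 * mlss) / (waste_flow_m3d * waste_tss)

    print(waste_flow(mlss_measured=2300, mlss_setpoint=2000))                       # 200 m3/d
    print(srt_days(volume_m3=500, mlss=2000, waste_flow_m3d=200, waste_tss=2000))   # 2.5 days
    ```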

  4. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically obtain a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and to inject the required data into the original source files, creating new ones ready to be compiled with all related dependencies. In addition, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed injecting the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in a sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  5. Automatic processes in at-risk adolescents: the role of alcohol-approach tendencies and response inhibition in drinking behavior.

    Science.gov (United States)

    Peeters, Margot; Wiers, Reinout W; Monshouwer, Karin; van de Schoot, Rens; Janssen, Tim; Vollebergh, Wilma A M

    2012-11-01

    This study examined the association between automatic processes and drinking behavior in relation to individual differences in response inhibition in young adolescents who had just started drinking. It was hypothesized that strong automatic behavioral tendencies toward alcohol-related stimuli (alcohol-approach bias) were associated with higher levels of alcohol use, especially amongst adolescents with relatively weak inhibition skills. To test this hypothesis, structural equation modeling (SEM) analyses were performed using a zero-inflated Poisson (ZIP) model. A well-known problem in studying risk behavior is the low incidence rate, resulting in a zero-dominated distribution. A ZIP model accounts for non-normality of the data. Adolescents were selected from secondary Special Education schools (a risk group for the development of substance use problems). Participants were 374 adolescents (mean age of M = 13.6 years). Adolescents completed the alcohol approach avoidance task (a-AAT), the Stroop colour naming task (Stroop) and a questionnaire that assessed alcohol use. The ZIP model established stronger alcohol-approach tendencies for adolescent drinkers (P processes are associated with the drinking behavior of young, at-risk adolescents. It appears that alcohol-approach tendencies are formed shortly after the initiation of drinking and particularly affect the drinking behavior of adolescents with relatively weak inhibition skills. Implications for the prevention of problem drinking in adolescents are discussed. © 2012 The Authors. Addiction © 2012 Society for the Study of Addiction.

  6. Automatic data-processing equipment of moon mark of nail for verifying some experiential theory of Traditional Chinese Medicine.

    Science.gov (United States)

    Niu, Renjie; Fu, Chenyu; Xu, Zhiyong; Huang, Jianyuan

    2016-04-29

    Doctors who practice Traditional Chinese Medicine (TCM) diagnose using four methods - inspection, auscultation and olfaction, interrogation, and pulse feeling/palpation. The shape and shape changes of the moon marks on the nails are an important indication when judging the patient's health. There is a series of classical and experiential theories about moon marks in TCM which lack support from statistical data. The objective is to verify some experiential theories on moon marks in TCM with automatic data-processing equipment. This paper proposes equipment that utilizes image processing technology to collect moon mark data from different target groups conveniently and quickly, building a database that combines this information with that gathered from the health and mental status questionnaire in each test. This equipment has a simple design, a low cost, and an optimized algorithm. In practice it has proven able to quickly complete the automatic acquisition and preservation of key data about moon marks. In the future, some conclusions will likely be obtained from these data, and some changes of moon marks related to specific pathological changes will be established with statistical methods.

  7. Automatic and accurate segmentation of cerebral tissues in fMRI dataset with combination of image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in medical science. One application is multimodality imaging, especially the fusion of structural imaging with functional imaging, which includes CT, MRI and new types of imaging technology such as optical imaging to obtain functional images. The fusion process requires precisely extracted structural information in order to register the image to it. Here we used image enhancement and morphometry methods to extract accurate contours of different tissues such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM) on 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of the images in a deep learning way. This approach greatly reduced the processing time compared to manual and semi-automatic segmentation and is of great importance in improving speed and accuracy as more and more samples are learned. The contours of the borders of the different tissues in all images were accurately extracted and visualized in 3D. This can be used in low-level light therapy and in optical simulation software such as MCVM. We obtained a precise three-dimensional distribution of the brain, which offers doctors and researchers quantitative volume data and detailed morphological characterization for personalized precision medicine of cerebral atrophy/expansion. We hope this technique can bring convenience to medical visualization and personalized medicine.
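
    The abstract does not describe the network architecture, so the sketch below only illustrates the general idea of CNN-based per-pixel tissue labelling (background/CSF/GM/WM) on a single slice, using a deliberately tiny fully convolutional network in PyTorch. All layer sizes and the dummy data are assumptions.

    ```python
    import torch
    import torch.nn as nn

    class TinySegNet(nn.Module):
        """Small fully convolutional net: 1-channel slice -> 4 tissue classes per pixel."""
        def __init__(self, n_classes=4):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, n_classes, 1),     # per-pixel class scores
            )

        def forward(self, x):
            return self.net(x)

    model = TinySegNet()
    image = torch.randn(1, 1, 64, 64)               # dummy 64x64 slice
    labels = torch.randint(0, 4, (1, 64, 64))       # dummy labels (0=bg, 1=CSF, 2=GM, 3=WM)
    loss = nn.CrossEntropyLoss()(model(image), labels)
    loss.backward()                                  # gradients for one training step
    print(model(image).argmax(dim=1).shape)          # predicted label map: (1, 64, 64)
    ```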

  8. Achieving Accurate Automatic Sleep Staging on Manually Pre-processed EEG Data Through Synchronization Feature Extraction and Graph Metrics.

    Science.gov (United States)

    Chriskos, Panteleimon; Frantzidis, Christos A; Gkivogkli, Polyxeni T; Bamidis, Panagiotis D; Kourtidou-Papadeli, Chrysoula

    2018-01-01

    Sleep staging, the process of assigning labels to epochs of sleep depending on the stage of sleep to which they belong, is an arduous, time-consuming and error-prone process, as the initial recordings are quite often polluted by noise from different sources. To properly analyze such data and extract clinical knowledge, noise components must be removed or alleviated. In this paper a pre-processing and subsequent sleep staging pipeline for the sleep analysis of electroencephalographic signals is described. Two novel methods of functional connectivity estimation (Synchronization Likelihood/SL and Relative Wavelet Entropy/RWE) are comparatively investigated for automatic sleep staging through manually pre-processed electroencephalographic recordings. A multi-step process that renders the signals suitable for further analysis is initially described. Then, two methods that rely on extracting synchronization features from electroencephalographic recordings to achieve computerized sleep staging are proposed, based on bivariate features which provide a functional overview of the brain network, contrary to most proposed methods that rely on extracting univariate time and frequency features. Annotation of sleep epochs is achieved through the presented feature extraction methods by training classifiers, which are in turn able to accurately classify new epochs. Analysis of data from sleep experiments in a randomized, controlled bed-rest study, which was organized by the European Space Agency and conducted in the "ENVIHAB" facility of the Institute of Aerospace Medicine at the German Aerospace Center (DLR) in Cologne, Germany, attains high accuracy rates, over 90%, based on ground truth that resulted from manual sleep staging by two experienced sleep experts. Therefore, it can be concluded that the above feature extraction methods are suitable for semi-automatic sleep staging.
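
    Of the two synchronization features named above, Relative Wavelet Entropy is straightforward to sketch: each signal's energy distribution over wavelet decomposition levels is computed, and RWE is the Kullback-Leibler-like divergence between the two distributions. The wavelet family, decomposition level and random stand-in signals below are assumptions; the Synchronization Likelihood feature and the classifier training are not shown.

    ```python
    import numpy as np
    import pywt

    def wavelet_energy_distribution(signal, wavelet="db4", level=5):
        """Relative energy of each wavelet decomposition level (sums to 1)."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        return energies / energies.sum()

    def relative_wavelet_entropy(x, y, wavelet="db4", level=5):
        """KL-like divergence between the wavelet energy distributions of two signals."""
        p = wavelet_energy_distribution(x, wavelet, level)
        q = wavelet_energy_distribution(y, wavelet, level)
        return float(np.sum(p * np.log(p / q)))

    rng = np.random.default_rng(1)
    x = rng.normal(size=3000)    # stand-ins for two EEG channel epochs
    y = rng.normal(size=3000)
    print(relative_wavelet_entropy(x, y))
    ```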

  9. ANALYSIS OF RELIABILITY OF RESERVED AUTOMATIC CONTROL SYSTEMS OF INDUSTRIAL POWER PROCESSES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2014-01-01

    This paper describes a comparative analysis of the main structural schemes for redundant automatic control and regulation devices of important power supply objects with increased reliability requirements. Schemes of passive and active duplication with a monitoring device, passive and active triplication, combined redundancy, and majority redundancy according to the "two out of three" and "three out of five" schemes were analyzed. Based on the calculations performed, these schemes were compared under the assumption of ideal built-in monitoring devices and ideal majority elements. Preference scales of the systems were built according to the criteria of maximum mean time to failure and mean probability of no-failure operation. These scales have a variable character, depending on the interval in which the parameter obtained by multiplying the failure rate by time lies. The order of preference of the systems changes depending on the failures of each system and at the points where the curves of mean probability of no-failure operation cross. Analysis of the calculation results showed the advantages of triplication and combined redundancy in reliability, which is achieved at the cost of the greater expense of creating these systems. Under certain conditions the reliability of a passive triplication system is higher than that of an active duplication system. The majority schemes allow detecting not only complete failures but also single (metrological) failures. A boundary value for the unreliability of the built-in monitoring device is determined, which allows a well-founded choice between systems with active and passive redundancy.
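
    The majority schemes mentioned ("two out of three", "three out of five") have a standard reliability expression for identical, independent elements, which the sketch below evaluates under the paper's idealization of perfect monitoring and voting elements. The failure rate and mission time are arbitrary illustrative values.

    ```python
    from math import comb, exp

    def k_of_n_reliability(r, k, n):
        """P(at least k of n identical, independent elements are still working)."""
        return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

    lam, t = 1e-4, 2000                 # assumed failure rate (1/h) and mission time (h)
    r = exp(-lam * t)                   # single-element probability of no-failure operation
    print("single element :", round(r, 4))
    print("2 out of 3     :", round(k_of_n_reliability(r, 2, 3), 4))
    print("3 out of 5     :", round(k_of_n_reliability(r, 3, 5), 4))
    ```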

  10. Automatic support for product based workflow design : generation of process models from a product data model

    NARCIS (Netherlands)

    Vanderfeesten, I.T.P.; Reijers, H.A.; Aalst, van der W.M.P.; Vogelaar, J.J.C.L.; Meersman, R.; Dillon, T.; Herrero, P.

    2010-01-01

    Product Based Workflow Design (PBWD) is one of the few scientific methodologies for the (re)design of workflow processes. It is based on an analysis of the product that is produced in the workflow process and derives a process model from the product structure. Until now this derivation has been a

  11. Neural changes associated to procedural learning and automatization process in Developmental Coordination Disorder and/or Developmental Dyslexia.

    Science.gov (United States)

    Biotteau, Maëlle; Péran, Patrice; Vayssière, Nathalie; Tallet, Jessica; Albaret, Jean-Michel; Chaix, Yves

    2017-03-01

    Recent theories hypothesize that procedural learning may support the frequent overlap between neurodevelopmental disorders. The neural circuitry supporting procedural learning includes, among others, cortico-cerebellar and cortico-striatal loops. Alteration of these loops may account for the frequent comorbidity between Developmental Coordination Disorder (DCD) and Developmental Dyslexia (DD). The aim of our study was to investigate cerebral changes due to the learning and automatization of a sequence learning task in children with DD, DCD, or both disorders. fMRI on 48 children (aged 8-12) with DD, DCD or DD + DCD was used to explore their brain activity during procedural tasks, performed either after two weeks of training or in the early stage of learning. Firstly, our results indicate that all children were able to perform the task with the same level of automaticity, but recruited different brain processes to achieve the same performance. Secondly, our fMRI results do not appear to confirm Nicolson and Fawcett's model. The neural correlates recruited for procedural learning by the DD and the comorbid groups are very close, while the DCD group presents distinct characteristics. This provides a promising direction on the neural mechanisms associated with procedural learning in neurodevelopmental disorders and for understanding comorbidity. Published by Elsevier Ltd.

  12. Automatic Generation of Assembly Sequence for the Planning of Outfitting Processes in Shipbuilding

    NARCIS (Netherlands)

    Wei, Y.

    2012-01-01

    The most important characteristics of the outfitting processes in shipbuilding are: 1. The processes involve many interferences between yard and different subcontractors. In recent years, the use of outsourcing and subcontracting has become a widespread strategy of western shipyards. There exists

  13. An investigation of the Stroop effect among deaf signers in English and Japanese: automatic processing or memory retrieval?

    Science.gov (United States)

    Flaherty, Mary; Moran, Aidan

    2007-01-01

    Most studies on the Stroop effect (unintentional automatic word processing) have been restricted to English speakers using vocal responses. Little is known about this effect with deaf signers. The study compared Stroop task responses among four different samples: deaf participants from a Japanese-language environment and from an English-language environment; and hearing individuals from Japan and from Australia. Color words were prepared in both English and Japanese and were presented in three conditions: congruent (e.g., the word red printed in red), incongruent (e.g., red printed in blue), and neutral. The magnitude of the effect was greater with the deaf participants than with the hearing participants. The deaf individuals experienced more interference in English than in Japanese.

  14. Effects of spectral complexity and sound duration on automatic complex-sound pitch processing in humans - a mismatch negativity study.

    Science.gov (United States)

    Tervaniemi, M; Schröger, E; Saher, M; Näätänen, R

    2000-08-18

    The pitch of a spectrally rich sound is known to be more easily perceived than that of a sinusoidal tone. The present study compared the importance of spectral complexity and sound duration in facilitated pitch discrimination. The mismatch negativity (MMN), which reflects automatic neural discrimination, was recorded to a 2.5% pitch change in pure tones with only one sinusoidal frequency component (500 Hz) and in spectrally rich tones with three (500-1500 Hz) and five (500-2500 Hz) harmonic partials. During the recordings, subjects concentrated on watching a silent movie. In separate blocks, stimuli were of 100 and 250 ms in duration. The MMN amplitude was enhanced with both spectrally rich sounds when compared with pure tones. The prolonged sound duration did not significantly enhance the MMN. This suggests that increased spectral rather than temporal information facilitates pitch processing of spectrally rich sounds.

  15. Image processing. A system for the automatic sorting of chromosomes; Traitement d'images - Applications au classement des chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    Najai, Amor

    1977-05-27

    The present paper deals with two aspects of the system: - an automaton (specialized hardware) dedicated to image processing: images are digitized, divided into sub-units, and computations are carried out on their main parameters; - software for the automatic recognition and sorting of chromosomes, implemented on a Multi-20 minicomputer connected to the automaton. (author) [French] We describe an automatic chromosome classification system. It consists of: - the A.S.T.I. (Automate Specialise de Traitement d'Images), an image-processing automaton that digitizes the images, isolates sub-images and performs computations on their main parameters; - a program for the automatic recognition and classification of chromosomes implemented on a MULTI-20 minicomputer coupled to the A.S.T.I. (author)

  16. Data-driven management using quantitative metric and automatic auditing program (QMAP) improves consistency of radiation oncology processes.

    Science.gov (United States)

    Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H

    Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with the goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics were introduced in our clinical workflow. We compared the delinquency rates for 4 selected metrics before the implementation of the metric with the delinquency rate of 2016. A one-tailed Student t test was used for statistical analysis. With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completions of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rushed completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
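
    A hedged sketch of the auditing idea: each critical task's completion timestamp (or its absence) is compared against its due time, producing early alerts for the triage team and a list of delinquent tasks. Task names, the alert window and the data layout are illustrative assumptions, not the clinic's actual metrics or software.

    ```python
    from datetime import datetime, timedelta

    def audit_tasks(tasks, now, alert_window=timedelta(hours=24)):
        """tasks: list of dicts with 'name', 'due' (datetime) and optional 'completed' (datetime)."""
        alerts, delinquent = [], []
        for task in tasks:
            done = task.get("completed")
            if done is None and task["due"] - now <= alert_window:
                alerts.append(task["name"])                 # approaching or past due, not complete
            if (done is None and now > task["due"]) or (done is not None and done > task["due"]):
                delinquent.append(task["name"])             # missed or completed late
        return alerts, delinquent

    now = datetime(2016, 5, 10, 9, 0)
    tasks = [
        {"name": "treatment plan completion", "due": now + timedelta(hours=6)},
        {"name": "weekly physics check", "due": now - timedelta(hours=2)},
    ]
    print(audit_tasks(tasks, now))
    ```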

  17. Automatic processing of list of journals and publications in the Nuclear Research Institute

    International Nuclear Information System (INIS)

    Vymetal, L.

    Using an EC 1040 computer, the Institute of Nuclear Research processed the list of journals in the reference library of the Czechoslovak Atomic Energy Commission, including journals acquired by all institutions subordinated to the Commission, i.e., UJV Rez (Nuclear Research Institute), the Nuclear Information Centre Prague, UVVVR Prague (Institute for Research, Production and Application of Radioisotopes) and the Institute of Radioecology and Applied Nuclear Techniques Kosice. Computer processing made it possible to obtain files arranged by library, subject matter of the journals, country of publication, and journal title. Automated processing of publications by UJV staff is being prepared. The preparation of data for computer processing of both files is described and specimens of printouts are shown. (Ha)

  18. A fully automatic processing chain to produce Burn Scar Mapping products, using the full Landsat archive over Greece

    Science.gov (United States)

    Kontoes, Charalampos; Papoutsis, Ioannis; Herekakis, Themistoklis; Michail, Dimitrios; Ieronymidi, Emmanuela

    2013-04-01

    Remote sensing tools for the accurate, robust and timely assessment of the damage inflicted by forest wildfires provide information that is of paramount importance to public environmental agencies and related stakeholders before, during and after the crisis. The Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing of the National Observatory of Athens (IAASARS/NOA) has developed a fully automatic single- and/or multi-date processing chain that takes archived Landsat 4, 5 or 7 raw images as input and produces precise diachronic burnt-area polygons and damage assessments over the Greek territory. The methodology consists of three fully automatic stages: 1) the pre-processing stage, where the metadata of the raw images are extracted, followed by the application of the LEDAPS software platform for calibration and mask production and the Automated Precise Orthorectification Package, developed by NASA, for image geo-registration and orthorectification; 2) the core BSM (Burn Scar Mapping) processing stage, which incorporates a published classification algorithm based on a series of physical indexes, the application of two filters for noise removal using graph-based techniques, and the grouping of pixels classified as burnt into appropriate pixel clusters before conversion from raster to vector; and 3) the post-processing stage, where the products are thematically refined and enriched using auxiliary GIS layers (underlying land cover/use, administrative boundaries, etc.) and human logic/evidence to suppress false alarms and omission errors. The established processing chain has been successfully applied to the entire archive of Landsat imagery over Greece spanning from 1984 to 2012, which has been collected and managed at IAASARS/NOA. The number of full Landsat frames processed in the framework of the study was 415. These burn scar mapping products are generated for the first time to such a temporal and spatial
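
    The abstract cites "a series of physical indexes" without listing them; a commonly used burnt-area index for Landsat data is the Normalized Burn Ratio (NBR) computed from the NIR and SWIR bands, with the pre-/post-fire difference (dNBR) thresholded to flag burnt pixels. The sketch below uses that common index purely as an illustration (it is not necessarily the index set used in the BSM chain); the band values and the threshold are assumptions.

    ```python
    import numpy as np

    def nbr(nir, swir):
        """Normalized Burn Ratio from near-infrared and shortwave-infrared reflectance."""
        return (nir - swir) / (nir + swir + 1e-6)

    def burnt_mask(nir_pre, swir_pre, nir_post, swir_post, dnbr_threshold=0.27):
        """Flag pixels whose pre/post-fire NBR drop exceeds an (assumed) threshold."""
        dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
        return dnbr > dnbr_threshold

    rng = np.random.default_rng(2)
    shape = (100, 100)
    nir_pre = rng.uniform(0.25, 0.40, shape)
    swir_pre = rng.uniform(0.10, 0.20, shape)
    nir_post, swir_post = nir_pre * 0.5, swir_pre * 1.8   # burnt vegetation: NIR drops, SWIR rises
    print("fraction flagged as burnt:", burnt_mask(nir_pre, swir_pre, nir_post, swir_post).mean())
    ```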

  19. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    Science.gov (United States)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and other users a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, the detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, the algorithm for the automatic creation of masks for veterinary thermographic images is presented.
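
    As an illustration of what an automatic mask-creation routine for thermographic images can look like, the sketch below thresholds the image (Otsu), keeps the largest connected warm region and fills holes. This is a generic stand-in written in Python, not the CVIPtools MATLAB implementation, and the synthetic temperatures are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage.filters import threshold_otsu

    def create_mask(thermal_image):
        """Binary mask of the main warm region in a thermographic image."""
        binary = thermal_image > threshold_otsu(thermal_image)   # warm foreground vs background
        labels, n = ndimage.label(binary)
        if n == 0:
            return binary
        sizes = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
        largest = labels == (np.argmax(sizes) + 1)                # keep the largest warm region
        return ndimage.binary_fill_holes(largest)

    img = np.full((80, 80), 20.0)      # synthetic 20 degree C background
    img[20:60, 25:55] = 35.0           # warm region of interest
    print(create_mask(img).sum())      # number of pixels inside the mask
    ```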

  20. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  1. Development of automatic radiographic inspection system using digital image processing and artificial intelligence

    International Nuclear Information System (INIS)

    Itoga, Kouyu; Sugimoto, Koji; Michiba, Koji; Kato, Yuhei; Sugita, Yuji; Onda, Katsuhiro.

    1991-01-01

    The application of computers to welding inspection is expanding rapidly. The applications can be classified into the collection, analysis and processing of data, the graphic display of results, the identification of the kinds of defects, the evaluation of the harmfulness of defects, and the judgement of acceptance or rejection. The application of computer techniques to the automation of data collection was realized at a relatively early stage. Data processing and the graphic display of results are techniques now in progress, and the application of artificial intelligence to the identification of defect kinds and the evaluation of harmfulness is expected to expand rapidly. In order to computerize radiographic inspection, computers must be given the capabilities of image processing technology and knowledge engineering. The object of this system is butt joints produced by arc welding in steel materials of up to 30 mm thickness. The system performs the digitizing transformation of radiographs, the determination and evaluation of transmissivity and gradation by image processing, and, only for those radiographs whose picture quality satisfies the standard, the extraction of defect images, their display, the identification of the kinds of defects and the final judgement. The techniques of image processing, the knowledge for identifying the kinds of defects and the concept of the practical system are reported. (K.I.)

  2. A Quality Sorting of Fruit Using a New Automatic Image Processing Method

    Science.gov (United States)

    Amenomori, Michihiro; Yokomizu, Nobuyuki

    This paper presents an innovative approach for quality sorting of objects such as apples in an agricultural factory, using an image processing algorithm. The objectives of our approach are, firstly, to sort the objects by their colors precisely and, secondly, to detect any irregularity of the colors on the surface of the apples efficiently. An experiment has been conducted and the results have been obtained and compared with those of a human sorting process and of color sensor sorting devices. The results demonstrate that our approach is capable of sorting the objects rapidly and that the percentage of valid classifications was 100%.
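
    A minimal sketch of color-based grading in the spirit of the approach described: convert the fruit pixels to HSV, grade by mean hue, and count off-color pixels as surface irregularities. The hue bands, the irregularity rule and the toy image are assumptions, not the authors' algorithm.

    ```python
    import colorsys
    import numpy as np

    def grade_fruit(rgb_image, mask):
        """rgb_image: HxWx3 floats in [0, 1]; mask: boolean fruit region."""
        pixels = rgb_image[mask]
        hues = np.array([colorsys.rgb_to_hsv(*p)[0] for p in pixels])
        mean_hue = hues.mean()
        if mean_hue < 0.10 or mean_hue > 0.90:
            grade = "red/ripe"
        elif mean_hue < 0.20:
            grade = "yellow"
        else:
            grade = "green/unripe"
        irregular_fraction = float(np.mean(np.abs(hues - mean_hue) > 0.15))  # off-color pixels
        return grade, irregular_fraction

    img = np.zeros((10, 10, 3))
    img[..., 0] = 0.9          # mostly red toy "apple"
    img[..., 1] = 0.1
    print(grade_fruit(img, np.ones((10, 10), dtype=bool)))
    ```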

  3. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  4. Automatic dataflow model extraction from modal real-time stream processing applications

    NARCIS (Netherlands)

    Geuns, S.J.; Hausmans, J.P.H.M.; Bekooij, Marco Jan Gerrit

    2013-01-01

    Many real-time stream processing applications are initially described as a sequential application containing while-loops, which execute for an unknown number of iterations. These modal applications have to be executed in parallel on an MPSoC system in order to meet their real-time throughput

  5. Radiometric installations for automatic control of industrial processes and some possibilities of the specialized computers application

    International Nuclear Information System (INIS)

    Kuzino, S.; Shandru, P.

    1979-01-01

    It is noted that the application of radioisotope devices in circuits for the automation of industrial processes makes it possible to obtain on-line information about some parameters of these processes. This information, when passed to the computer controlling the process, permits obtaining and maintaining optimum technological parameters of the process. Some elements of designing the automation system are given from the point of view of radiometric device tuning; calibration of the radiometric devices so as to obtain a digital answer in the on-line regime with the preset accuracy and trustworthiness levels for supplying it to the controlling computer; determination of the system's reaction on the basis of the preset statistical criteria; and development, on the basis of the data obtained from the computer, of an algorithm for the functional checking of the radiometric devices' characteristics (stability and reproducibility of readings in the operating regime), as well as determination of the threshold value of the response, depending on the measured parameter.

  6. On-line measurement with automatic emulsion analysis system and off-line data processing (E531 neutrino experiment)

    International Nuclear Information System (INIS)

    Miyanishi, Motoaki

    1984-01-01

    The automatic emulsion analysis system developed by the Nagoya cosmic ray observation group was used in practice, for the first time in the world, in the experiment (FNAL-E531) for determining the lifetime of charm particles, and achieved great success. The system consists of four large precise coordinate-measuring stages capable of simultaneous measurement and multiple (currently four) DOMs (digitized on-line microscopes), supported by one minicomputer (ECLIPS S/130). The purpose of the E531 experiment was the determination of charm particle lifetimes. The experiment was carried out at FNAL, USA, with irradiation by a wide-band ν_μ beam corresponding to 7 × 10^18 protons of 350 GeV/c. The detector was a hybrid system of emulsions and a counter spectrometer. The scanning for neutrino reactions, the scanning for charm particles, and the charm event measurement were performed in the emulsions, and the on-line programs for the respective analyses were created. The Nagoya group found 726 neutrino reactions in the first run and obtained 37 charm particle candidates, and found 1442 neutrino reactions in the second run and obtained 56 charm particle candidates. The capability of the automatic emulsion analysis system in terms of the time required for analysis is in total 3.5 hours per event: 15 minutes for C.S. scanning, 15 minutes for coupling to the module, 20 minutes for tracing to the vertex, 1 hour for neutrino reaction measurement, 10 minutes for off-line data processing and 1.5 hours for charm particle scanning. (Wakatsuki, Y.)

  7. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

    In this paper a method of processing radiation sampling data with a computer is discussed. With this method the curve of the radiation sampling data can be drawn and explained, mineral masses can be combined, analysed and calculated, and the results recorded in a notebook. The method has many merits: it is easy to learn, simple to use and highly efficient. It is adaptable to all sorts of mines. (authors)

  8. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

    In this paper a method of processing radiation sampling data with a computer is discussed. With this method the curve of the radiation sampling data can be drawn and explained, mineral masses can be combined, analysed and calculated, and the results recorded in a notebook. The method has many merits: it is easy to learn, simple to use and highly efficient. It is adaptable to all sorts of mines. (authors)

  9. Functional magnetic resonance imaging measure of automatic and controlled auditory processing

    OpenAIRE

    Mitchell, Teresa V.; Morey, Rajendra A.; Inan, Seniha; Belger, Aysenil

    2005-01-01

    Activity within fronto-striato-temporal regions during processing of unattended auditory deviant tones and an auditory target detection task was investigated using event-related functional magnetic resonance imaging. Activation within the middle frontal gyrus, inferior frontal gyrus, anterior cingulate gyrus, superior temporal gyrus, thalamus, and basal ganglia were analyzed for differences in activity patterns between the two stimulus conditions. Unattended deviant tones elicited robust acti...

  10. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liu

    2011-09-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them can cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve the generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD. The improvement has been shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.
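
    SVDD and its quasiconformal-kernel variant are not available in scikit-learn, but the closely related one-class SVM with an RBF kernel illustrates the same workflow: train on defect-free samples only, then flag outliers as defects. The feature vectors below are random stand-ins for features extracted from panel images; this is not the QK-SVDD method itself.

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    normal_train = rng.normal(0.0, 1.0, size=(500, 16))    # features from defect-free panels
    normal_test = rng.normal(0.0, 1.0, size=(100, 16))
    defect_test = rng.normal(4.0, 1.0, size=(20, 16))       # shifted features = defective panels

    detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal_train)
    print("false alarms on normal panels:", np.mean(detector.predict(normal_test) == -1))
    print("detected defective panels:    ", np.mean(detector.predict(defect_test) == -1))
    ```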

  11. Automatic processing, quality assurance and serving of real-time weather data

    Science.gov (United States)

    Williams, Matthew; Cornford, Dan; Bastin, Lucy; Jones, Richard; Parker, Stephen

    2011-03-01

    Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real time data such as temperature, rainfall and wind speed can now be obtained for most of the world. Despite the abundance of available data, the production of usable information about the weather in individual local neighbourhoods requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this instance, this allows a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods provided by the INTAMAP project. A simplified example illustrates how the INTAMAP web processing service can be employed as part of a quality control procedure to estimate the bias and residual variance of user contributed temperature observations, using a reference standard based on temperature observations with carefully controlled quality. We also consider how the uncertainty introduced by the interpolation can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.

  12. Anomaly disentanglement on the basis of automatic processing of the aerial gamma-spectrometric survey data

    International Nuclear Information System (INIS)

    Perlovskij, V.A.; Zaika, S.D.; Lomtadze, V.V.

    1976-01-01

    Automated processing of airborne gamma-spectrometric data has been introduced using the BESM-4 computer. Anomalies are discriminated over a variable 200-300 m interval, within which the maximum values in the general counting channel and their coordinates are found; reference points are also detected, against which the excess and amplitude of the anomalies can be determined. The discriminated anomalies are output on a printer and anomaly charts are constructed. The charts and tables characterizing an anomaly help in assessing its potential and hence underlie the decision to start ground work.

  13. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    Science.gov (United States)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
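
    A bare-bones NumPy sketch of the PCA-based PSF modelling and subtraction step at the heart of such a pipeline. It does not use the VIP library's actual API, the data cube is synthetic, and calibration, derotation and reference-star differential imaging steps are omitted.

```python
# PCA-based PSF subtraction on a cube of coronagraphic frames
# (n_frames x height x width); a simplified stand-in for what VIP wraps.
import numpy as np

def pca_psf_subtract(cube, n_components=5):
    n_frames, ny, nx = cube.shape
    flat = cube.reshape(n_frames, ny * nx)
    centered = flat - flat.mean(axis=0)
    # Principal components of the frame set model the quasi-static stellar PSF.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]                    # (n_components, ny*nx)
    psf_model = centered @ basis.T @ basis       # low-rank PSF model per frame
    residual = centered - psf_model              # starlight-subtracted frames
    return residual.reshape(n_frames, ny, nx)

# Usage with a synthetic cube:
rng = np.random.default_rng(1)
cube = rng.normal(size=(20, 64, 64))
print(pca_psf_subtract(cube, n_components=5).shape)
```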

  14. Image processing for an automatic detection of defect signals from electromagnetic cartographies

    International Nuclear Information System (INIS)

    Benoist, B.; Marqueste, L.; Birac, C.

    1994-01-01

    As the population of nuclear power plants ages, new defects are appearing in steam generator tubes (stress corrosion, corrosion pitting and intergranular corrosion). For more sophisticated expert appraisal of these defects, tubes can be examined by multifrequency eddy-current testing with an absolute coil (diameter of 1 mm). A device, consisting of a push-puller mechanism and a motor-driven probe carrying this absolute coil, gives a helical movement to scan the inner surface of the tube. The signals obtained can be represented in the form of cartographies (3D representations in which the coordinates are the circumference, the length and the amplitude of the X or Y component at a given frequency). The detection of defect signals by visual examination of these eddy-current cartographies is not always reproducible. The article describes an image processing procedure for the detection of defect signals which leads to better reproducibility and hence greater safety.

  15. DEVELOPING UNIVERSAL INSTALLATION WITH AUTOMATIC MONITORING AND CONTROL PROCESS OF MIXING, WHIPPING AND MOLDING BISCUIT DOUGH

    Directory of Open Access Journals (Sweden)

    E. I. Ponomareva

    2013-01-01

    Full Text Available Bakery products made from a mixture of rye and wheat flour with the addition of rye grain can serve as products of high nutritional value. The use of whole grain requires control of its quality according to organoleptic, physico-chemical and hygienic indices. A scanner-metric method for determining the color characteristics of grain is proposed, using an HP ScanJet 3570C flatbed scanner with computer processing of the images in RGB color mode. Application of the method showed that rye prepared in different ways has different intensity of coloring, and the maximum intensity of the color components is observed for native grain.

  16. 13C-detected NMR experiments for automatic resonance assignment of IDPs and multiple-fixing SMFT processing

    International Nuclear Information System (INIS)

    Dziekański, Paweł; Grudziąż, Katarzyna; Jarvoll, Patrik; Koźmiński, Wiktor; Zawadzka-Kazimierczuk, Anna

    2015-01-01

    Intrinsically disordered proteins (IDPs) have recently attracted much interest due to their role in many biological processes, including signaling and regulation mechanisms. High-dimensional 13C direct-detected NMR experiments have proven exceptionally useful in the case of IDPs, providing spectra with superior peak dispersion. Here, two such novel experiments recorded with non-uniform sampling are introduced: 5D HabCabCO(CA)NCO and 5D HNCO(CA)NCO. Together with the 4D (HACA)CON(CA)NCO, an extension of the previously published 3D experiments (Pantoja-Uceda and Santoro in J Biomol NMR 59:43–50, 2014, doi: 10.1007/s10858-014-9827-1), they form a set allowing complete and reliable resonance assignment of difficult IDPs. The processing is performed with a sparse multidimensional Fourier transform based on the concept of restricting (fixing) some of the spectral dimensions to a priori known resonance frequencies. In our study, a multiple-fixing method was developed that allows easy access to spectral data. The experiments were tested on a resolution-demanding alpha-synuclein sample. Due to the superior peak dispersion in the high-dimensional spectra and the availability of sequential connectivities between four consecutive residues, the overwhelming majority of resonances could be assigned automatically using the TSAR program

  17. A fully MEMS-compatible process for 3D high aspect ratio micro coils obtained with an automatic wire bonder

    International Nuclear Information System (INIS)

    Kratt, K; Badilita, V; Burger, T; Wallrabe, U; Korvink, J G

    2010-01-01

    We report the fabrication of 3D micro coils made with an automatic wire bonder. Using standard MEMS processes such as spin coating and UV lithography on silicon and Pyrex® wafers results in high aspect ratio SU-8 posts with diameters down to 100 µm that serve as mechanical stabilization yokes for the coils. The wire bonder is employed to wind 25 µm insulated gold wire around the posts in an arbitrary (e.g. solenoidal) path, yielding arrays of micro coils. Each micro coil is bonded directly on-chip, so that loose wire ends are avoided and, compared to other winding methods, coil re-soldering is unnecessary. The manufacturing time for a single coil is about 200 ms, and although the process is serial, it is batch fabrication compatible due to the high throughput of the machine. Despite the speed of manufacture we obtain high manufacturing precision and reliability. The micro air-core solenoids show an RF quality factor of over 50 when tested at 400 MHz. We present a flexible coil making method where the number of windings is only limited by the post height. The coil diameter is restricted by limits defined by lithography and the mechanical strength of the posts. Based on this technique we present coils ranging from 100 µm diameter and 1 winding up to 1000 µm diameter and 20 windings

  18. Automatic processing of semantic relations in fMRI: neural activation during semantic priming of taxonomic and thematic categories.

    Science.gov (United States)

    Sachs, Olga; Weis, Susanne; Zellagui, Nadia; Huber, Walter; Zvyagintsev, Mikhail; Mathiak, Klaus; Kircher, Tilo

    2008-07-07

    Most current models of knowledge organization are based on hierarchical or taxonomic categories (animals, tools). Another important organizational pattern is thematic categorization, i.e. categories held together by external relations, a unifying scene or event (car and garage). The goal of this study was to compare the neural correlates of these categories under automatic processing conditions that minimize strategic influences. We used fMRI to examine neural correlates of semantic priming for category members with a short stimulus onset asynchrony (SOA) of 200 ms as subjects performed a lexical decision task. Four experimental conditions were compared: thematically related words (car-garage); taxonomically related (car-bus); unrelated (car-spoon); non-word trials (car-derf). We found faster reaction times for related than for unrelated prime-target pairs for both thematic and taxonomic categories. However, the size of the thematic priming effect was greater than that of the taxonomic. The imaging data showed signal changes for the taxonomic priming effects in the right precuneus, postcentral gyrus, middle frontal and superior frontal gyri and thematic priming effects in the right middle frontal gyrus and anterior cingulate. The contrast of neural priming effects showed larger signal changes in the right precuneus associated with the taxonomic but not with thematic priming response. We suggest that the greater involvement of precuneus in the processing of taxonomic relations indicates their reduced salience in the knowledge structure compared to more prominent thematic relations.

  19. Process for the automatic compensation of spectral displacement based on quenching processes in a liquid scintillation counter

    International Nuclear Information System (INIS)

    Nather, R.E.

    1978-01-01

    In measurements with a liquid scintillation counter, the tritium or C 14 isotope to be examined is contained in a scintillator solution. It is excited, according to the energy of the β particle, to emit light. An electrical signal proportional to the light signal is produced, from which selective counting within the β spectrum can be undertaken in a pulse-height analyser. Quenching effects, whether colour quenching or chemical quenching, reduce the gain of the counter. To compensate for the resulting displacement of the spectrum, the required adjustment of a system parameter is carried out by calibration with a sample of low quenching effect. The calibration process is set directly to the energy end-point of the spectrum. Well-known procedures can be used to determine the quenching effect represented by the sample. For example, the system parameter can be the discriminator level of the counter window. (DG) 891 HP [de

  20. Selective and validated data processing techniques for performance improvement of automatic lines

    Directory of Open Access Journals (Sweden)

    D’Aponte Francesco

    2016-01-01

    Full Text Available Optimizing the data processing techniques applied to accelerometer and force-transducer signals provided information that improved the behavior of the cutting stage of a converting machine for diaper production. In particular, different mechanical configurations were studied and compared in order to reduce the stresses caused by the impacts between knives and anvil, to obtain clean and accurate cuts, and to reduce wear of the knives themselves. Reducing the measurement uncertainty made it possible to correctly identify the best configuration for the pneumatic system that couples anvil and knife. The pipe sizes, the working pressure and the type of fluid used in the coupling system were examined. Experimental results obtained from acceleration and force measurements allowed the geometry of the pushing device and the working pressure range of the hydraulic fluid to be identified in a reproducible and coherent way. The marked reduction of knife and anvil vibrations is expected to strongly reduce wear of the cutting stage components.

  1. Different involvement of medial prefrontal cortex and dorso-lateral striatum in automatic and controlled processing of a future conditioned stimulus

    OpenAIRE

    Pérez-Díaz, Francisco; Díaz, Estrella; Sánchez, Natividad; Vargas, Juan Pedro; Pearce, John M.; López, Juan Carlos

    2017-01-01

    Recent studies support the idea that stimulus processing in latent inhibition can vary during the course of preexposure. Controlled attentional mechanisms are said to be important in the early stages of preexposure, while in later stages animals adopt automatic processing of the stimulus to be used for conditioning. Given this distinction, it is possible that both types of processing are governed by different neural systems, affecting differentially the retrieval of information about the stim...

  2. Do patients' faces influence General Practitioners' cancer suspicions? A test of automatic processing of sociodemographic information.

    Directory of Open Access Journals (Sweden)

    Rosalind Adam

    Full Text Available Delayed cancer diagnosis leads to poorer patient outcomes. During short consultations, General Practitioners (GPs) make quick decisions about likelihood of cancer. Patients' facial cues are processed rapidly and may influence diagnosis. To investigate whether patients' facial characteristics influence immediate perception of cancer risk by GPs. Web-based binary forced choice experiment with GPs from Northeast Scotland. GPs were presented with a series of pairs of face prototypes and asked to quickly select the patient more likely to have cancer. Faces were modified with respect to age, gender, and ethnicity. Choices were analysed using Chi-squared goodness-of-fit statistics with Bonferroni corrections. Eighty-two GPs participated. GPs were significantly more likely to suspect cancer in older patients. Gender influenced GP cancer suspicion, but this was modified by age: the male face was chosen as more likely to have cancer than the female face for young (72% of GPs; 95% CI 61.0-87.0) and middle-aged faces (65.9%; 95% CI 54.7-75.5); but 63.4% (95% CI 52.2-73.3) decided the older female was more likely to have cancer than the older male (p = 0.015). GPs were significantly more likely to suspect cancer in the young Caucasian male (65.9%; 95% CI 54.7-75.5) compared to the young Asian male (p = 0.004). GPs' first impressions about cancer risk are influenced by patient age, gender, and ethnicity. Tackling GP cognitive biases could be a promising way of reducing cancer diagnostic delays, particularly for younger patients.
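
    A sketch of the kind of statistical test described above: a chi-squared goodness-of-fit test on one binary forced choice with a Bonferroni-adjusted significance level. The counts and the number of comparisons are illustrative, not the study's data.

```python
# Chi-squared goodness-of-fit on a binary forced choice, Bonferroni-corrected.
from scipy.stats import chisquare

n_gps = 82
chose_male = 59                     # hypothetical: GPs choosing the male face as higher risk
observed = [chose_male, n_gps - chose_male]
expected = [n_gps / 2, n_gps / 2]   # null hypothesis: no preference between the two faces

stat, p_value = chisquare(f_obs=observed, f_exp=expected)

n_comparisons = 6                   # e.g. one test per face pair shown (assumed)
alpha = 0.05 / n_comparisons        # Bonferroni correction
print(f"chi2 = {stat:.2f}, p = {p_value:.4f}, significant: {p_value < alpha}")
```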

  3. Automatic Lung-RADS™ classification with a natural language processing system.

    Science.gov (United States)

    Beyer, Sebastian E; McKee, Brady J; Regis, Shawn M; McKee, Andrea B; Flacke, Sebastian; El Saadawi, Gilan; Wald, Christoph

    2017-09-01

    Our aim was to train a natural language processing (NLP) algorithm to capture imaging characteristics of lung nodules reported in a structured CT report and suggest the applicable Lung-RADS™ (LR) category. Our study included structured, clinical reports of consecutive CT lung screening (CTLS) exams performed from 08/2014 to 08/2015 at an ACR accredited Lung Cancer Screening Center. All patients screened were at high-risk for lung cancer according to the NCCN Guidelines ® . All exams were interpreted by one of three radiologists credentialed to read CTLS exams using LR using a standard reporting template. Training and test sets consisted of consecutive exams. Lung screening exams were divided into two groups: three training sets (500, 120, and 383 reports each) and one final evaluation set (498 reports). NLP algorithm results were compared with the gold standard of LR category assigned by the radiologist. The sensitivity/specificity of the NLP algorithm to correctly assign LR categories for suspicious nodules (LR 4) and positive nodules (LR 3/4) were 74.1%/98.6% and 75.0%/98.8% respectively. The majority of mismatches occurred in cases where pulmonary findings were present not currently addressed by LR. Misclassifications also resulted from the failure to identify exams as follow-up and the failure to completely characterize part-solid nodules. In a sub-group analysis among structured reports with standardized language, the sensitivity and specificity to detect LR 4 nodules were 87.0% and 99.5%, respectively. An NLP system can accurately suggest the appropriate LR category from CTLS exam findings when standardized reporting is used.

  4. Big Data Analysis for Personalized Health Activities: Machine Learning Processing for Automatic Keyword Extraction Approach

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2018-04-01

    Full Text Available The obese population is increasing rapidly due to changes in lifestyle and diet habits. Obesity can cause various complications and is becoming a social disease. Nonetheless, many obese patients are unaware of the medical treatments that are right for them. Although a variety of online and offline obesity management services have been introduced, they are still not enough to attract the attention of users and are not much help in solving the problem. Obesity healthcare and personalized health activities are the important factors. Since obesity is related to lifestyle habits, eating habits, and interests, I concluded that big data analysis of these factors could deduce the problem. Therefore, I collected big data by applying machine learning and crawling methods to the unstructured citizen health data in Korea and the search data of Naver, which is a Korean portal company, and Google for keyword analysis for personalized health activities. The big data were visualized using text mining and word clouds. This study collected and analyzed the data concerning the interests related to obesity, change of interest in obesity, and treatment articles. The analysis showed a wide range of seasonal factors according to spring, summer, fall, and winter. It also visualized and completed the process of extracting the keywords appropriate for treatment of abdominal obesity and lower body obesity. The keyword big data analysis technique for personalized health activities proposed in this paper is based on an individual's interests, level of interest, and body type. Also, the user interface (UI) that visualizes the big data is compatible with Android and Apple iOS. The users can see the data on the app screen. Many graphs and pictures can be seen via the menu, and the significant data values are visualized through machine learning. Therefore, I expect that the big data analysis using various keywords specific to a person will result in measures for personalized

  5. MODELING OF PATTERN FORMING PROCESS OF AUTOMATIC RADIO DIRECTION FINDER OF PHASE VHF IN THE DEVELOPMENT ENVIRONMENT OF LabVIEW APPLIED PROGRAMS

    Directory of Open Access Journals (Sweden)

    G. K. Aslanov

    2015-01-01

    Full Text Available The article develops a model demonstrating the pattern-forming process of the antenna system of an aerodrome quasi-Doppler automatic radio direction-finder station in the LabVIEW application development environment of the National Instruments company.

  6. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate the provision of large-scale topographic base maps by the Geospatial Information

  7. Application of an automatic yarn dismantler to track changes in cotton fiber properties during full scale processing of cotton into carded yarn

    CSIR Research Space (South Africa)

    Fassihi, A

    2016-08-01

    Full Text Available Changes in Upland cotton fiber properties from lint to carded yarn, during full scale processing, were tracked, using a newly developed automatic yarn dismantler for dismantling short staple ring-spun yarns. Opening and cleaning increased fiber neps...

  8. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    Science.gov (United States)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    , ground feeding is determined using interpolation of values before and after the flood; floods during the rise and fall of high water are determined by plotting depletion curves; the groundwater component of runoff is divided into dynamic and static parts. The subdivision algorithm described was formalized as program code in Fortran, with additional modules connected from R-Studio. Using two languages makes it possible, on the one hand, to speed up the processing of a large array of daily water discharges and, on the other hand, to facilitate visualization and interpretation of the results. The algorithm includes the selection of 15 calibration parameters describing the characteristics of each watershed. Verification and calibration of the program were carried out for 20 rivers of European Russia. According to the calculations, there is a significant increase in the groundwater flow component over most of the watersheds and an increase in the role of flooding as a phase of the water regime as a whole. This research was supported by the Russian Foundation for Basic Research (contract No. 16-35-60080).
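
    As a generic illustration of separating a groundwater (baseflow) component from daily discharge, the sketch below applies a one-parameter recursive digital filter of the Lyne-Hollick type. This is a common textbook technique used only for illustration; it is not the authors' 15-parameter Fortran/R algorithm.

```python
# Illustrative hydrograph separation with a one-parameter recursive digital
# filter (Lyne-Hollick type): quickflow is filtered out, baseflow is the rest.
import numpy as np

def baseflow_filter(discharge, alpha=0.925):
    q = np.asarray(discharge, dtype=float)
    quick = np.zeros_like(q)        # quickflow (flood) component
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(quick[t], 0.0), q[t])   # keep both components non-negative
    return q - quick                # groundwater (baseflow) component

daily_q = [5, 5, 6, 20, 45, 30, 18, 12, 9, 7, 6, 5]   # toy daily discharges, m3/s
print(np.round(baseflow_filter(daily_q), 1))
```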

  9. Second-language learning effects on automaticity of speech processing of Japanese phonetic contrasts: An MEG study.

    Science.gov (United States)

    Hisagi, Miwako; Shafer, Valerie L; Miyagawa, Shigeru; Kotek, Hadas; Sugawara, Ayaka; Pantazis, Dimitrios

    2016-12-01

    We examined discrimination of a second-language (L2) vowel duration contrast in English learners of Japanese (JP) with different amounts of experience using the magnetoencephalography mismatch field (MMF) component. Twelve L2 learners were tested before and after a second semester of college-level JP; half attended a regular rate course and half an accelerated course with more hours per week. Results showed no significant change in MMF for either the regular or accelerated learning group from beginning to end of the course. We also compared these groups against nine L2 learners who had completed four semesters of college-level JP. These 4-semester learners did not significantly differ from 2-semester learners, in that only a difference in hemisphere activation (interacting with time) between the two groups approached significance. These findings suggest that targeted training of L2 phonology may be necessary to allow for changes in processing of L2 speech contrasts at an early, automatic level. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Study of structure of marine specialist activity in an ergative system on monitoring and managing automatic control parameters of safe navigation process

    Directory of Open Access Journals (Sweden)

    Kholichev S. N.

    2016-12-01

    Full Text Available For the first time in the theory of ergative systems, the common structural features and dynamics of the tuning circuit of a technical object performing automatic adjustment of safe navigation parameters have been studied. The structure and functioning of an ergative system that includes an automatic control system for safe navigation conditions have been investigated. The signal-selection function that performs reconfiguration of the optimal control law of this system is given, and a sequence of marine specialist activities that solves the navigation safety problem is composed. The ergative system retargeted by the ship specialist has a two-tier hierarchy. The first level is automatic control of the safe navigation parameter, and the second is the reconfiguration level, where the ship specialist changes the parameters of the regulation law. The two-level hierarchical representation of the ergative navigation safety parameter management system makes it possible to treat reconfiguration at the regulation level as ship specialist activity aimed at reducing uncertainty in the environment in which this layer operates. Such a reduction can be achieved by acting on the upper level, associated with the ship specialist's ideas on regulating the vessel's safe navigation parameters, and on the lower level, the level of direct automatic control of the safe navigation parameter. From the study of the ship specialist's activity in the ergative system for monitoring and managing the automatic control parameters of the safe navigation process, it has been found that the specialist's main task within the ergative system ensuring navigation safety is to monitor the input and output of the automatic control system and to decide on the choice of reconfiguration laws for the regulating signal on the basis of information about deviations and the

  11. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short-duration visual cues, which draw attention to solution-relevant information and aid in organizing and integrating it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  12. Changes in automatic threat processing precede and predict clinical changes with exposure-based cognitive-behavior therapy for panic disorder.

    Science.gov (United States)

    Reinecke, Andrea; Waldenmaier, Lara; Cooper, Myra J; Harmer, Catherine J

    2013-06-01

    Cognitive behavioral therapy (CBT) is an effective treatment for emotional disorders such as anxiety or depression, but the mechanisms underlying successful intervention are far from understood. Although it has been a long-held view that psychopharmacological approaches work by directly targeting automatic emotional information processing in the brain, it is usually postulated that psychological treatments affect these processes only over time, through changes in more conscious thought cycles. This study explored the role of early changes in emotional information processing in CBT action. Twenty-eight untreated patients with panic disorder were randomized to a single session of exposure-based CBT or waiting group. Emotional information processing was measured on the day after intervention with an attentional visual probe task, and clinical symptoms were assessed on the day after intervention and at 4-week follow-up. Vigilance for threat information was decreased in the treated group, compared with the waiting group, the day after intervention, before reductions in clinical symptoms. The magnitude of this early effect on threat vigilance predicted therapeutic response after 4 weeks. Cognitive behavioral therapy rapidly affects automatic processing, and these early effects are predictive of later therapeutic change. Such results suggest very fast action on automatic processes mediating threat sensitivity, and they provide an early marker of treatment response. Furthermore, these findings challenge the notion that psychological treatments work directly on conscious thought processes before automatic information processing and imply a greater similarity between early effects of pharmacological and psychological treatments for anxiety than previously thought. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  13. Different involvement of medial prefrontal cortex and dorso-lateral striatum in automatic and controlled processing of a future conditioned stimulus.

    Science.gov (United States)

    Pérez-Díaz, Francisco; Díaz, Estrella; Sánchez, Natividad; Vargas, Juan Pedro; Pearce, John M; López, Juan Carlos

    2017-01-01

    Recent studies support the idea that stimulus processing in latent inhibition can vary during the course of preexposure. Controlled attentional mechanisms are said to be important in the early stages of preexposure, while in later stages animals adopt automatic processing of the stimulus to be used for conditioning. Given this distinction, it is possible that both types of processing are governed by different neural systems, affecting differentially the retrieval of information about the stimulus. In the present study we tested if a lesion to the dorso-lateral striatum or to the medial prefrontal cortex has a selective effect on exposure to the future conditioned stimulus (CS). With this aim, animals received different amounts of exposure to the future CS. The results showed that a lesion to the medial prefrontal cortex enhanced latent inhibition in animals receiving limited preexposure to the CS, but had no effect in animals receiving extended preexposure to the CS. The lesion of the dorso-lateral striatum produced a decrease in latent inhibition, but only in animals with an extended exposure to the future conditioned stimulus. These results suggest that the dorsal striatum and medial prefrontal cortex play essential roles in controlled and automatic processes. Automatic attentional processes appear to be impaired by a lesion to the dorso-lateral striatum and facilitated by a lesion to the prefrontal cortex.

  14. Automatic extraction and processing of small RNAs on a multi-well/multi-channel (M&M) chip.

    Science.gov (United States)

    Zhong, Runtao; Flack, Kenneth; Zhong, Wenwan

    2012-12-07

    The study of the regulatory roles in small RNAs can be accelerated by techniques that permit simple, low-cost, and rapid extraction of small RNAs from a small number of cells. In order to ensure highly specific and sensitive detection, the extracted RNAs should be free of the background nucleic acids and present stably in a small volume. To meet these criteria, we designed a multi-well/multi-channel (M&M) chip to carry out automatic and selective isolation of small RNAs via solid-phase extraction (SPE), followed by reverse-transcription (RT) to convert them to the more stable cDNAs in a final volume of 2 μL. Droplets containing buffers for RNA binding, washing, and elution were trapped in microwells, which were connected by one channel, and suspended in mineral oil. The silica magnetic particles (SMPs) for SPE were moved along the channel from well to well, i.e. in between droplets, by a fixed magnet and a translation stage, allowing the nucleic acid fragments to bind to the SMPs, be washed, and then be eluted for RT reaction within 15 minutes. RNAs shorter than 63 nt were selectively enriched from cell lysates, with recovery comparable to that of a commercial kit. Physical separation of the droplets on our M&M chip allowed the usage of multiple channels for parallel processing of multiple samples. It also permitted smooth integration with on-chip RT-PCR, which simultaneously detected the target microRNA, mir-191, expressed in fewer than 10 cancer cells. Our results have demonstrated that the M&M chip device is a valuable and cost-saving platform for studying small RNA expression patterns in a limited number of cells with reasonable sample throughput.

  15. Automatizovani sistem podrške odlučivanju u procesima javne nabavke / Automatic decision support in a tender process

    Directory of Open Access Journals (Sweden)

    Siniša Borović

    2004-01-01

    Full Text Available In the last phase of a public procurement (tender) process, the commission faces the problem of choosing the best bidder, i.e. the one whose offer best meets the requirements and conditions stated in the tender documentation. A decision has to be made taking into account many different, and often contradictory, criteria. For solving this and similar problems, numerous methods belonging to the field of multicriteria analysis and ranking have been developed. Taking into account the characteristics of the bidder-selection problem in a tender process, the analytic hierarchy process (AHP) method has been chosen from this set as the basic method for ranking bidders. For checking the resulting ranking, the PROMETHEE I-II family of methods is proposed.
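
    A minimal sketch of the AHP step named above: deriving bidder weights from a pairwise comparison matrix via its principal eigenvector (Saaty's method), plus a consistency index. The matrix entries are illustrative, not taken from any real tender.

```python
# AHP priority vector from a pairwise comparison matrix (Saaty's eigenvector method).
import numpy as np

# A[i, j] = how strongly bidder i is preferred over bidder j (1-9 scale, reciprocal matrix).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # normalized priority vector (bidder ranking)

consistency_index = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("bidder weights:", np.round(weights, 3), "CI:", round(consistency_index, 3))
```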

  16. Computer-aided recording of automatic endoscope washing and disinfection processes as an integral part of medical documentation for quality assurance purposes

    Directory of Open Access Journals (Sweden)

    Klein Stefanie

    2010-07-01

    Full Text Available Abstract Background The reprocessing of medical endoscopes is carried out using automatic cleaning and disinfection machines. The documentation and archiving of records of properly conducted reprocessing procedures is the last and increasingly important part of the reprocessing cycle for flexible endoscopes. Methods This report describes a new computer program designed to monitor and document the automatic reprocessing of flexible endoscopes and accessories in fully automatic washer-disinfectors; it does not include or replace the manual cleaning step. The program implements national standards for the monitoring of hygiene in flexible endoscopes and the guidelines for the reprocessing of medical products. No FDA approval has been obtained so far. The advantages of this newly developed computer program are that it simplifies the documentation procedures for medical endoscopes, that it can be used universally with any washer-disinfector, and that it is independent of the various interfaces and software products provided by the individual suppliers of washer-disinfectors. Results The computer program presented here has been tested on a total of four washer-disinfectors in more than 6000 medical examinations over 9 months. Conclusions We present for the first time an electronic documentation system for automated washer-disinfectors for medical devices, e.g. flexible endoscopes, which can be used with any washer-disinfector, documents the procedures involved in the automatic cleaning process, and can easily be connected to most hospital documentation systems.

  17. Sentinel-1 automatic processing chain for volcanic and seismic areas monitoring within the Geohazards Exploitation Platform (GEP)

    Science.gov (United States)

    De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco

    2016-04-01

    these issues, ESA recently funded the development of the Geohazards Exploitation Platform (GEP), a project aimed at putting together data, processing tools and results to make them accessible to the EO scientific community, with particular emphasis to the Geohazard Supersites & Natural Laboratories and the CEOS Seismic Hazards and Volcanoes Pilots. In this work we present the integration of the parallel version of a well-known DInSAR algorithm referred to as Small BAseline Subset (P-SBAS) within the GEP platform for processing Sentinel-1 data. The integration allowed us to set up an operational on-demand web tool, open to every user, aimed at automatically processing S1A data for the generation of SBAS displacement time-series. Main characteristics as well as a number of experimental results obtained by using the implemented web tool will be also shown. This work is partially supported by: the RITMARE project of Italian MIUR, the DPC-CNR agreement and the ESA GEP project.

  18. Linking Automatic Evaluation to Mood and Information Processing Style: Consequences for Experienced Affect, Impression Formation, and Stereotyping

    Science.gov (United States)

    Chartrand, Tanya L.; van Baaren, Rick B.; Bargh, John A.

    2006-01-01

    According to the feelings-as-information account, a person's mood state signals to him or her the valence of the current environment (N. Schwarz & G. Clore, 1983). However, the ways in which the environment automatically influences mood in the first place remain to be explored. The authors propose that one mechanism by which the environment…

  19. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    Science.gov (United States)

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation both pre- and post-surgically. For the proposed method, an observer would place landmarks in a single 3D volume, which then are automatically propagated to the other volumes in a time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients, but with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods but with the distinct advantage of supporting continuous motions during the image acquisition.
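
    A toy computation of the kind of measures named above (angle between femur and tibia, patellar displacement) from 3D landmark coordinates. The landmark definitions are simplified placeholders, not the study's measurement protocol.

```python
# Angle and distance measures from hypothetical 3D landmarks of one 4D CT frame.
import numpy as np

def angle_deg(v1, v2):
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical landmarks / axes (mm), placeholders for the propagated landmarks.
femur_axis = np.array([0.0, 0.0, 1.0])                 # femoral shaft direction
tibia_axis = np.array([0.1, 0.0, 1.0])                 # tibial shaft direction
patella_center = np.array([12.0, 40.0, 5.0])
trochlear_groove = np.array([0.0, 38.0, 5.0])

flexion_angle = angle_deg(femur_axis, tibia_axis)                   # angle between femur and tibia
displacement = np.linalg.norm(patella_center - trochlear_groove)   # patellar displacement, mm

print(f"femur-tibia angle: {flexion_angle:.1f} deg, displacement: {displacement:.1f} mm")
```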

  20. Application of an automatic yarn dismantler to track changes in cotton fibre properties during processing on a miniature spinning line

    CSIR Research Space (South Africa)

    Fassihi, A

    2014-11-01

    Full Text Available . The results obtained on different Upland cottons have clearly demonstrated the practical value of the yarn dismantler in enabling yarns to be automatically dismantled into their constituent fibres, which can then be tested by instrument, such as the AFIS...

  1. Automatic Determination of the Need for Intravenous Contrast in Musculoskeletal MRI Examinations Using IBM Watson's Natural Language Processing Algorithm.

    Science.gov (United States)

    Trivedi, Hari; Mesterhazy, Joseph; Laguna, Benjamin; Vu, Thienkhai; Sohn, Jae Ho

    2018-04-01

    Magnetic resonance imaging (MRI) protocoling can be time- and resource-intensive, and protocols can often be suboptimal dependent upon the expertise or preferences of the protocoling radiologist. Providing a best-practice recommendation for an MRI protocol has the potential to improve efficiency and decrease the likelihood of a suboptimal or erroneous study. The goal of this study was to develop and validate a machine learning-based natural language classifier that can automatically assign the use of intravenous contrast for musculoskeletal MRI protocols based upon the free-text clinical indication of the study, thereby improving efficiency of the protocoling radiologist and potentially decreasing errors. We utilized a deep learning-based natural language classification system from IBM Watson, a question-answering supercomputer that gained fame after challenging the best human players on Jeopardy! in 2011. We compared this solution to a series of traditional machine learning-based natural language processing techniques that utilize a term-document frequency matrix. Each classifier was trained with 1240 MRI protocols plus their respective clinical indications and validated with a test set of 280. Ground truth of contrast assignment was obtained from the clinical record. For evaluation of inter-reader agreement, a blinded second reader radiologist analyzed all cases and determined contrast assignment based on only the free-text clinical indication. In the test set, Watson demonstrated overall accuracy of 83.2% when compared to the original protocol. This was similar to the overall accuracy of 80.2% achieved by an ensemble of eight traditional machine learning algorithms based on a term-document matrix. When compared to the second reader's contrast assignment, Watson achieved 88.6% agreement. When evaluating only the subset of cases where the original protocol and second reader were concordant (n = 251), agreement climbed further to 90.0%. The classifier was
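
    A minimal sketch of the traditional baseline mentioned above: a term-document frequency matrix plus a standard classifier deciding whether intravenous contrast is needed from the free-text indication. The example indications and labels are invented and are not clinical guidance.

```python
# Term-document frequency matrix + logistic regression as a contrast classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

indications = [
    "evaluate osteomyelitis of the foot",        # infection -> hypothetical label: contrast
    "rule out soft tissue tumor of the thigh",   # mass -> hypothetical label: contrast
    "acl tear, assess meniscus",                 # internal derangement -> no contrast
    "chronic knee pain, evaluate cartilage",     # no contrast
]
needs_contrast = [1, 1, 0, 0]

clf = make_pipeline(CountVectorizer(), LogisticRegression())
clf.fit(indications, needs_contrast)

print(clf.predict(["possible osteomyelitis of ankle"]))   # expect class 1 (contrast)
```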

  2. Meta-analytic moderators of experimental exposure to media portrayals of women on female appearance satisfaction: Social comparisons as automatic processes.

    Science.gov (United States)

    Want, Stephen C

    2009-09-01

    Experimental exposure to idealized media portrayals of women is thought to induce social comparisons in female viewers and thereby to be generally detrimental to female viewers' satisfaction with their own appearance. Through meta-analysis, the present paper examines the impact of moderators of this effect, some identified and updated from a prior meta-analysis and some that have hitherto received little attention. Participants' pre-existing appearance concerns and the processing instructions participants were given when exposed to media portrayals were found to significantly moderate effect sizes. With regard to processing instructions, a novel and counter-intuitive pattern was revealed; effect sizes were smallest when participants were instructed to focus on the appearance of women in media portrayals, and largest when participants processed the portrayals on a distracting, non-appearance dimension. These results are interpreted through a framework that suggests that social comparisons are automatic processes, the effects of which can be modified through conscious processing.

  3. Text mining and natural language processing approaches for automatic categorization of lay requests to web-based expert forums.

    Science.gov (United States)

    Himmel, Wolfgang; Reincke, Ulrich; Michelmann, Hans Wilhelm

    2009-07-22

    Both healthy and sick people increasingly use electronic media to obtain medical information and advice. For example, Internet users may send requests to Web-based expert forums, or so-called "ask the doctor" services. To automatically classify lay requests to an Internet medical expert forum using a combination of different text-mining strategies. We first manually classified a sample of 988 requests directed to an involuntary childlessness forum on the German website "Rund ums Baby" ("Everything about Babies") into one or more of 38 categories belonging to two dimensions ("subject matter" and "expectations"). After creating start and synonym lists, we calculated the average Cramer's V statistic for the association of each word with each category. We also used principal component analysis and singular value decomposition as further text-mining strategies. With these measures we trained regression models and determined, on the basis of the best regression models, for any request the probability of belonging to each of the 38 different categories, with a cutoff of 50%. Recall and precision of a test sample were calculated as a measure of quality for the automatic classification. According to the manual classification of 988 documents, 102 (10%) documents fell into the category "in vitro fertilization (IVF)," 81 (8%) into the category "ovulation," 79 (8%) into "cycle," and 57 (6%) into "semen analysis." These were the four most frequent categories in the subject matter dimension (consisting of 32 categories). The expectation dimension comprised six categories; we classified 533 documents (54%) as "general information" and 351 (36%) as a wish for "treatment recommendations." The generation of indicator variables based on the chi-square analysis and Cramer's V proved to be the best approach for automatic classification in about half of the categories. In combination with the two other approaches, 100% precision and 100% recall were realized in 18 (47%) out of the 38
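
    A short sketch of the Cramer's V feature score described above, computed for one hypothetical word-category pair from a 2x2 contingency table; the counts are invented, only loosely scaled to a corpus of roughly 988 requests.

```python
# Cramer's V for the association between a word's presence and a category.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: word present / absent; columns: request in category / not in category (invented counts).
table = np.array([
    [40, 10],
    [62, 876],
])

chi2, p, dof, expected = chi2_contingency(table)
n = table.sum()
k = min(table.shape) - 1
cramers_v = np.sqrt(chi2 / (n * k))
print(f"Cramer's V = {cramers_v:.3f} (p = {p:.2g})")
```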

  4. The development of the Athens Emotional States Inventory (AESI): collection, validation and automatic processing of emotionally loaded sentences.

    Science.gov (United States)

    Chaspari, Theodora; Soldatos, Constantin; Maragos, Petros

    2015-01-01

    To develop ecologically valid procedures for collecting reliable and unbiased emotional data for computer interfaces with social and affective intelligence targeting patients with mental disorders. The Athens Emotional States Inventory (AESI), whose development is presented here, proposes the design, recording and validation of an audiovisual database for five emotional states: anger, fear, joy, sadness and neutral. The items of the AESI consist of sentences, each having content indicative of the corresponding emotion. Emotional content was assessed through a survey of 40 young participants with a questionnaire following the Latin square design. The emotional sentences that were correctly identified by 85% of the participants were recorded in a soundproof room with microphones and cameras. A preliminary validation of the AESI is performed through automatic emotion recognition experiments from speech. The resulting database contains 696 recorded utterances in the Greek language by 20 native speakers and has a total duration of approximately 28 min. Speech classification results yield accuracy of up to 75.15% for automatically recognizing the emotions in the AESI. These results indicate the usefulness of our approach for collecting emotional data with reliable content, balanced across classes and with reduced environmental variability.

  5. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques
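
    A minimal sketch of the idea of L2 (least-squares) projection of scattered terrain samples onto a smooth function space, here a low-order polynomial basis rather than the transfinite element spaces of the paper; the data are synthetic.

```python
# Discrete L2 (least-squares) projection of scattered terrain heights onto a
# tensor-product polynomial basis, as a stand-in for richer function spaces.
import numpy as np

rng = np.random.default_rng(4)
x, y = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
z = np.sin(3 * x) + 0.5 * y + rng.normal(scale=0.02, size=200)   # noisy terrain heights

# Tensor-product polynomial basis up to degree 2 in x and y (9 basis functions).
basis = np.column_stack([x**i * y**j for i in range(3) for j in range(3)])
coeffs, *_ = np.linalg.lstsq(basis, z, rcond=None)

def terrain(px, py):
    """Evaluate the projected terrain model at a point."""
    phi = np.array([px**i * py**j for i in range(3) for j in range(3)])
    return phi @ coeffs

print(round(terrain(0.5, 0.5), 3))
```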

  6. Load-Dependent Interference of Deep Brain Stimulation of the Subthalamic Nucleus with Switching from Automatic to Controlled Processing During Random Number Generation in Parkinson's Disease.

    Science.gov (United States)

    Williams, Isobel Anne; Wilkinson, Leonora; Limousin, Patricia; Jahanshahi, Marjan

    2015-01-01

    Deep brain stimulation of the subthalamic nucleus (STN DBS) ameliorates the motor symptoms of Parkinson's disease (PD). However, some aspects of executive control are impaired with STN DBS. We tested the prediction that (i) STN DBS interferes with switching from automatic to controlled processing during fast-paced random number generation (RNG), and (ii) STN DBS-induced cognitive control changes are load-dependent. Fifteen PD patients with bilateral STN DBS performed paced RNG under three levels of cognitive load synchronised with a pacing stimulus presented at 1, 0.5 and 0.33 Hz (faster rates require greater cognitive control), with DBS on or off. Measures of output randomness were calculated. Countscore 1 (CS1) indicates habitual counting in steps of one. Countscore 2 (CS2) indicates a more controlled strategy of counting in twos. The fastest rate was associated with an increased CS1 score with STN DBS on compared to off. At the slowest rate, patients had higher CS2 scores with DBS off than on, such that the differences between CS1 and CS2 scores disappeared. We provide evidence for a load-dependent effect of STN DBS on paced RNG in PD. Patients could switch to more controlled RNG strategies during conditions of low cognitive load at slower rates only when the STN stimulators were off, but when STN stimulation was on, they engaged in more automatic habitual counting under increased cognitive load. These findings are consistent with the proposal that the STN implements a switch signal from the medial frontal cortex which enables a shift from automatic to controlled processing.
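
    A toy illustration of the count-score idea: how often consecutive responses in a random number generation sequence step by one (habitual counting, CS1) or by two (CS2). The exact published index definitions may differ; this is only a sketch on invented responses.

```python
# Hypothetical count-score computation on a paced-RNG response sequence.
def count_score(sequence, step):
    pairs = zip(sequence, sequence[1:])
    return sum(1 for a, b in pairs if abs(b - a) == step)

responses = [3, 4, 5, 7, 9, 2, 3, 4, 6, 8, 1]    # invented paced-RNG output
print("CS1:", count_score(responses, 1))          # counting in ones (automatic habit)
print("CS2:", count_score(responses, 2))          # counting in twos (more controlled strategy)
```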

  7. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis.

    Science.gov (United States)

    Toledo, Cíntia Matsuda; Cunha, Andre; Scarton, Carolina; Aluísio, Sandra

    2014-01-01

    Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo 18 were used, which included 200 healthy Brazilians of both genders. A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.
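
    A minimal sketch of the classification setup described above: an SVM with an RBF kernel separating writers into education-level classes from linguistic features. The feature values are synthetic stand-ins for Coh-Metrix-Port/AIC output, not real descriptions.

```python
# RBF-kernel SVM classifying synthetic "linguistic feature" vectors into two
# education-level classes, with cross-validated accuracy.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n_per_class = 40
low_edu = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, 12))    # placeholder feature vectors
high_edu = rng.normal(loc=0.8, scale=1.0, size=(n_per_class, 12))

X = np.vstack([low_edu, high_edu])
y = np.array([0] * n_per_class + [1] * n_per_class)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(2))
```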

  8. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis

    Directory of Open Access Journals (Sweden)

    Cíntia Matsuda Toledo

    Full Text Available Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. OBJECTIVE: The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. METHODS: The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo 18 were used, which included 200 healthy Brazilians of both genders. RESULTS AND CONCLUSION: A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.

  9. No strong evidence for abnormal levels of dysfunctional attitudes, automatic thoughts, and emotional information-processing biases in remitted bipolar I affective disorder.

    Science.gov (United States)

    Lex, Claudia; Meyer, Thomas D; Marquart, Barbara; Thau, Kenneth

    2008-03-01

    Beck extended his original cognitive theory of depression by suggesting that mania was a mirror image of depression characterized by extreme positive cognition about the self, the world, and the future. However, there were no suggestions what might be special regarding cognitive features in bipolar patients (Mansell & Scott, 2006). We therefore used different indicators to evaluate cognitive processes in bipolar patients and healthy controls. We compared 19 remitted bipolar I patients (BPs) without any Axis I comorbidity with 19 healthy individuals (CG). All participants completed the Beck Depression Inventory, the Dysfunctional Attitude Scale, the Automatic Thoughts Questionnaire, the Emotional Stroop Test, and an incidental recall task. No significant group differences were found in automatic thinking and the information-processing styles (Emotional Stroop Test, incidental recall task). Regarding dysfunctional attitudes, we obtained ambiguous results. It appears that individuals with remitted bipolar affective disorder do not show cognitive vulnerability as proposed in Beck's theory of depression if they only report subthreshold levels of depressive symptoms. Perhaps, the cognitive vulnerability might only be observable if mood induction procedures are used.

  10. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  11. Automatic dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.

    2008-01-01

    The Catani-Seymour dipole subtraction is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. We automated the procedure in a computer code. The code is especially useful for processes with many parton legs. In this talk, we first explain the algorithm of the dipole subtraction and the whole structure of our code. After that we show the results for some processes where the infrared divergences of real emission processes are subtracted. (author)

  12. Automatically varying the composition of a mixed refrigerant solution for single mixed refrigerant LNG (liquefied natural gas) process at changing working conditions

    International Nuclear Information System (INIS)

    Xu, Xiongwen; Liu, Jinping; Cao, Le; Pang, Weiqiang

    2014-01-01

    The SMR (single mixed refrigerant) process is widely used in the small- and medium-scale liquefaction of NG (natural gas). Operating the MR (mixed-refrigerant) process outside of the design specifications is essential to save energy, but it is difficult to realize because the process needs to alter the working refrigerant composition. To address this challenge, this study investigated the performance diagnosis mechanism for the SMR process. A control strategy was then proposed to control the changes in working refrigerant composition under different working conditions. This strategy separates the working refrigerant flow in the SMR process into three flows through two phase separators before it flows into the cold box. The first liquid flow is rich in the high-temperature component (isopentane). The second liquid flow is rich in the middle-temperature components (ethylene and propane), and the gas flow is rich in the low-temperature components (nitrogen and methane). By adjusting the flow rates, it is easy to decouple the control variables and automate the system. Finally, this approach was validated by process simulation and shown to be highly adaptive and exergy efficient in response to changing working conditions. - Highlights: • The performance diagnosis mechanism of the SMR LNG process is studied. • A measure to automatically change the operating composition according to the working conditions is proposed for the SMR process. • SMR process simulation is performed to verify the validity of the control solution. • The control solution notably improves the energy efficiency of the SMR process at changing working conditions

  13. Automatic trend estimation

    CERN Document Server

    Vamos¸, C˘alin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statements of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
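
    The Monte Carlo idea sketched above can be illustrated in a few lines: an artificial series with a known trend and autocorrelated noise is generated many times, a trend estimator is applied, and its bias and RMSE are accumulated. The AR(1) noise model and the least-squares estimator are assumptions for the example, not the book's original algorithms.

```python
# Illustrative Monte Carlo evaluation of a trend estimator on artificial series.
import numpy as np

def ar1_noise(n, phi=0.6, sigma=1.0, rng=None):
    """Generate AR(1) noise x[t] = phi*x[t-1] + eps[t]."""
    rng = rng if rng is not None else np.random.default_rng()
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(scale=sigma)
    return x

rng = np.random.default_rng(42)
n, true_slope, n_runs = 500, 0.01, 1000
errors = []
for _ in range(n_runs):
    t = np.arange(n)
    series = true_slope * t + ar1_noise(n, rng=rng)
    est_slope = np.polyfit(t, series, 1)[0]      # least-squares linear trend
    errors.append(est_slope - true_slope)
errors = np.array(errors)
print(f"bias = {errors.mean():.2e}, RMSE = {np.sqrt((errors ** 2).mean()):.2e}")
```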

  14. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemes accounting for geometric and semantic data has been motivated to a large degree by the advances of the last decade in mobile laser scanning technology; automation in data processing has also recently influenced the expansion of the new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (cityGML) and mobile mapping technology to provide the features that compose the city model. This paper focuses on traffic signs, discussing their characterization using cityGML in order to ease the implementation of LiDAR technology in road management software, as well as analysing some limitations of the current technology in the task of automatic detection and classification.

  15. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  16. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time, both view direction based selection and the furthest distance for each direction was ...

  17. Stacouf: A new system for automatic processing of eddy current signal from steam generator testing of PWR power plants

    International Nuclear Information System (INIS)

    Ducreux, J.; Eyrolles, P.; Meylogan, T.

    1990-01-01

    A new system called STACOUF will be soon industrialized. The aim is to improve on-site signal processing for eddy testing of steam generators. Testing time, quality and productivity will be improved [fr

  18. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    Science.gov (United States)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure leads to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below a level achievable by classic construction materials, which lead to a reduced energy and cost balance during product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution by composite materials. High-resolution fibre-optic sensors (FOS), due to their small diameter, high measuring-point density and simple handling, show a high applicability potential for automated sensor integration in manufacturing processes, and therefore for the online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes and part tests as well as the component structure during the product life cycle, which allows quality control during production and the optimization of single manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes appearing in composite materials. This leads to a lower reject rate and to products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work shows an automation approach for FOS integration in the braiding process. For that purpose a braiding wheel has been supplemented with an appliance for automatic sensor application, which has been used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic

  19. A universal electronical adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC-signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC-subsystem - BORER 3000 - for adapting analysis instruments in biochemistry to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed and the advantages achieved by applying the CAMAC-specifications are described

  20. THE PROJECT OF ADMINISTRATIVE AND METHODICAL MANAGEMENT AUTOMATIZATION IN EDUCATIONAL INSTITUTION AS A TERM OF EDUCATION PROCESS QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Анна Игоревна Яценко

    2017-12-01

    Full Text Available The article is devoted to the practice of implementing information technologies in the educational process under the conditions of the informatization of education. The relevance of the article's main concept is confirmed by the trend of widespread introduction of information technologies into education, both by the state and by business. Given the increased attention paid to achieving high results in the educational process, information technology tools make it possible to significantly improve the quality of education. In this regard, the article provides examples of the use of various information systems for administering the educational process, together with their advantages and disadvantages. The author then formulates the problem of the lack of integrated information systems. At the same time, the development of information technologies is oriented towards the worldwide network, which reaches a vast audience of users. Educational institutions are involved in the electronic process supported by an electronic environment for educational development. As a result of the study of this issue and the review of modern trends, the article proposes a project description for optimizing the management of an educational organization with the help of an integrated information system used on the Internet.

  1. Automatic adjustment of cycle length and aeration time for improved nitrogen removal in an alternating activated sludge process

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard

    1997-01-01

    The paper examines the nitrogen dynamics in the alternating BIODENITRO and BIODENIPHO processes with a focus on two control handles influencing flow scheduling and aeration: the cycle length and the ammonia concentration at which a nitrifying period is terminated. A steady state analysis examining...

  2. A Dirichlet process mixture model for automatic (18)F-FDG PET image segmentation: Validation study on phantoms and on lung and esophageal lesions.

    Science.gov (United States)

    Giri, Maria Grazia; Cavedon, Carlo; Mazzarotto, Renzo; Ferdeghini, Marco

    2016-05-01

    The aim of this study was to implement a Dirichlet process mixture (DPM) model for automatic tumor edge identification on (18)F-fluorodeoxyglucose positron emission tomography ((18)F-FDG PET) images by optimizing the parameters on which the algorithm depends, to validate it experimentally, and to test its robustness. The DPM model belongs to the class of the Bayesian nonparametric models and uses the Dirichlet process prior for flexible nonparametric mixture modeling, without any preliminary choice of the number of mixture components. The DPM algorithm implemented in the statistical software package R was used in this work. The contouring accuracy was evaluated on several image data sets: on an IEC phantom (spherical inserts with diameter in the range 10-37 mm) acquired by a Philips Gemini Big Bore PET-CT scanner, using 9 different target-to-background ratios (TBRs) from 2.5 to 70; on a digital phantom simulating spherical/uniform lesions and tumors, irregular in shape and activity; and on 20 clinical cases (10 lung and 10 esophageal cancer patients). The influence of the DPM parameters on contour generation was studied in two steps. In the first one, only the IEC spheres having diameters of 22 and 37 mm and a sphere of the digital phantom (41.6 mm diameter) were studied by varying the main parameters until the diameter of the spheres was obtained within 0.2% of the true value. In the second step, the results obtained for this training set were applied to the entire data set to determine DPM based volumes of all available lesions. These volumes were compared to those obtained by applying already known algorithms (Gaussian mixture model and gradient-based) and to true values, when available. Only one parameter was found able to significantly influence segmentation accuracy (ANOVA test). This parameter was linearly connected to the uptake variance of the tested region of interest (ROI). In the first step of the study, a calibration curve was determined to

  3. A Dirichlet process mixture model for automatic {sup 18}F-FDG PET image segmentation: Validation study on phantoms and on lung and esophageal lesions

    Energy Technology Data Exchange (ETDEWEB)

    Giri, Maria Grazia, E-mail: mariagrazia.giri@ospedaleuniverona.it; Cavedon, Carlo [Medical Physics Unit, University Hospital of Verona, P.le Stefani 1, Verona 37126 (Italy); Mazzarotto, Renzo [Radiation Oncology Unit, University Hospital of Verona, P.le Stefani 1, Verona 37126 (Italy); Ferdeghini, Marco [Nuclear Medicine Unit, University Hospital of Verona, P.le Stefani 1, Verona 37126 (Italy)

    2016-05-15

    Purpose: The aim of this study was to implement a Dirichlet process mixture (DPM) model for automatic tumor edge identification on {sup 18}F-fluorodeoxyglucose positron emission tomography ({sup 18}F-FDG PET) images by optimizing the parameters on which the algorithm depends, to validate it experimentally, and to test its robustness. Methods: The DPM model belongs to the class of the Bayesian nonparametric models and uses the Dirichlet process prior for flexible nonparametric mixture modeling, without any preliminary choice of the number of mixture components. The DPM algorithm implemented in the statistical software package R was used in this work. The contouring accuracy was evaluated on several image data sets: on an IEC phantom (spherical inserts with diameter in the range 10–37 mm) acquired by a Philips Gemini Big Bore PET-CT scanner, using 9 different target-to-background ratios (TBRs) from 2.5 to 70; on a digital phantom simulating spherical/uniform lesions and tumors, irregular in shape and activity; and on 20 clinical cases (10 lung and 10 esophageal cancer patients). The influence of the DPM parameters on contour generation was studied in two steps. In the first one, only the IEC spheres having diameters of 22 and 37 mm and a sphere of the digital phantom (41.6 mm diameter) were studied by varying the main parameters until the diameter of the spheres was obtained within 0.2% of the true value. In the second step, the results obtained for this training set were applied to the entire data set to determine DPM based volumes of all available lesions. These volumes were compared to those obtained by applying already known algorithms (Gaussian mixture model and gradient-based) and to true values, when available. Results: Only one parameter was found able to significantly influence segmentation accuracy (ANOVA test). This parameter was linearly connected to the uptake variance of the tested region of interest (ROI). In the first step of the study, a

  4. A Dirichlet process mixture model for automatic 18F-FDG PET image segmentation: Validation study on phantoms and on lung and esophageal lesions

    International Nuclear Information System (INIS)

    Giri, Maria Grazia; Cavedon, Carlo; Mazzarotto, Renzo; Ferdeghini, Marco

    2016-01-01

    Purpose: The aim of this study was to implement a Dirichlet process mixture (DPM) model for automatic tumor edge identification on 18 F-fluorodeoxyglucose positron emission tomography ( 18 F-FDG PET) images by optimizing the parameters on which the algorithm depends, to validate it experimentally, and to test its robustness. Methods: The DPM model belongs to the class of the Bayesian nonparametric models and uses the Dirichlet process prior for flexible nonparametric mixture modeling, without any preliminary choice of the number of mixture components. The DPM algorithm implemented in the statistical software package R was used in this work. The contouring accuracy was evaluated on several image data sets: on an IEC phantom (spherical inserts with diameter in the range 10–37 mm) acquired by a Philips Gemini Big Bore PET-CT scanner, using 9 different target-to-background ratios (TBRs) from 2.5 to 70; on a digital phantom simulating spherical/uniform lesions and tumors, irregular in shape and activity; and on 20 clinical cases (10 lung and 10 esophageal cancer patients). The influence of the DPM parameters on contour generation was studied in two steps. In the first one, only the IEC spheres having diameters of 22 and 37 mm and a sphere of the digital phantom (41.6 mm diameter) were studied by varying the main parameters until the diameter of the spheres was obtained within 0.2% of the true value. In the second step, the results obtained for this training set were applied to the entire data set to determine DPM based volumes of all available lesions. These volumes were compared to those obtained by applying already known algorithms (Gaussian mixture model and gradient-based) and to true values, when available. Results: Only one parameter was found able to significantly influence segmentation accuracy (ANOVA test). This parameter was linearly connected to the uptake variance of the tested region of interest (ROI). In the first step of the study, a calibration curve
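
    The three records above describe the same study. As a rough illustration of the underlying idea (not the authors' R implementation), the sketch below clusters simulated voxel uptake values with scikit-learn's truncated variational approximation of a Dirichlet process mixture, letting the model decide how many components are effectively used.

```python
# Hedged sketch: Dirichlet-process-style clustering of simulated PET uptake values.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
background = rng.normal(1.0, 0.2, size=4000)     # low-uptake voxels
lesion = rng.normal(6.0, 0.8, size=400)          # high-uptake voxels (TBR ~ 6)
uptake = np.concatenate([background, lesion]).reshape(-1, 1)

dpm = BayesianGaussianMixture(
    n_components=10,                             # truncation level, not a fixed choice
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(uptake)

labels = dpm.predict(uptake)
lesion_label = labels[-1]                        # cluster of a known lesion voxel
print("estimated lesion fraction:", np.mean(labels == lesion_label))
```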

  5. A new automatic synthetic aperture radar-based flood mapping application hosted on the European Space Agency's Grid Processing on Demand Fast Access to Imagery environment

    Science.gov (United States)

    Matgen, Patrick; Giustarini, Laura; Hostache, Renaud

    2012-10-01

    This paper introduces an automatic flood mapping application that is hosted on the Grid Processing on Demand (GPOD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver operationally flooded areas using both recent and historical acquisitions of SAR data. Having as a short-term target the flooding-related exploitation of data generated by the upcoming ESA SENTINEL-1 SAR mission, the flood mapping application consists of two building blocks: i) a set of query tools for selecting the "crisis image" and the optimal corresponding "reference image" from the G-POD archive and ii) an algorithm for extracting flooded areas via change detection using the previously selected "crisis image" and "reference image". Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate reference image. Potential users will also be able to apply the implemented flood delineation algorithm. The latter combines histogram thresholding, region growing and change detection as an approach enabling the automatic, objective and reliable flood extent extraction from SAR images. Both algorithms are computationally efficient and operate with minimum data requirements. The case study of the high magnitude flooding event that occurred in July 2007 on the Severn River, UK, and that was observed with a moderate-resolution SAR sensor as well as airborne photography highlights the performance of the proposed online application. The flood mapping application on G-POD can be used sporadically, i.e. whenever a major flood event occurs and there is a demand for SAR-based flood extent maps. In the long term, a potential extension of the application could consist in systematically extracting flooded areas from all SAR images acquired on a daily, weekly or monthly basis.
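
    A much simplified sketch of the flood delineation idea described above: open water appears dark in SAR backscatter, so flooded pixels are those that fall below a water threshold in the crisis image and show a clear decrease relative to the reference image. The synthetic images, the fixed thresholds and the omission of the statistical open-water calibration and region growing are all assumptions made for illustration.

```python
# Hedged sketch: thresholding plus change detection on synthetic backscatter images (dB).
import numpy as np

rng = np.random.default_rng(3)
reference = rng.normal(-8.0, 1.5, size=(200, 200))                 # dry conditions
crisis = reference.copy()
crisis[60:120, 40:160] = rng.normal(-17.0, 1.0, size=(60, 120))    # smooth open water

water_threshold_db = -13.0   # open water backscatters weakly (specular reflection)
min_decrease_db = 4.0        # require a clear drop with respect to the reference

flood_mask = (crisis < water_threshold_db) & ((reference - crisis) > min_decrease_db)
print(f"flooded pixels: {flood_mask.sum()} ({100 * flood_mask.mean():.1f}% of the scene)")
```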

  6. Embodiment and second-language: automatic activation of motor responses during processing spatially associated L2 words and emotion L2 words in a vertical Stroop paradigm.

    Science.gov (United States)

    Dudschig, Carolin; de la Vega, Irmgard; Kaup, Barbara

    2014-05-01

    Converging evidence suggests that understanding our first-language (L1) results in reactivation of experiential sensorimotor traces in the brain. Surprisingly, little is known regarding the involvement of these processes during second-language (L2) processing. Participants saw L1 or L2 words referring to entities with a typical location (e.g., star, mole) (Experiment 1 & 2) or to an emotion (e.g., happy, sad) (Experiment 3). Participants responded to the words' ink color with an upward or downward arm movement. Despite word meaning being fully task-irrelevant, L2 automatically activated motor responses similar to L1 even when L2 was acquired rather late in life (age >11). Specifically, words such as star facilitated upward, and words such as root facilitated downward responses. Additionally, words referring to positive emotions facilitated upward, and words referring to negative emotions facilitated downward responses. In summary our study suggests that reactivation of experiential traces is not limited to L1 processing. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Proposal for future diagnosis and management of vascular tumors by using automatic software for image processing and statistic prediction.

    Science.gov (United States)

    Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I

    2015-01-01

    Infantile Hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, due to their location, dimensions and fast evolution, can cause important functional and esthetic sequelae. To avoid these unfortunate consequences it is necessary to establish the appropriate moment to begin treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations, correlated with imaging data and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm to accurately predict the best final result, from the esthetic and functional point of view, for a certain type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment management they received: medical therapy, sclerotherapy, surgical excision and no treatment. Serial clinical observations were performed each month and all the data were processed using the CAD system. The project goal was to create software that incorporated advanced methods to accurately measure the specific IH lesions and that integrated medical information, statistical methods and computational methods to correlate this information with that obtained from image processing. Based on these correlations, a prediction mechanism of the evolution of hemangioma, which helped determine the best method of therapeutic intervention to minimize further complications, was established.

  8. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes, i.e. floor plan drawings in Computer-aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is first adopted to extract effective information from the original floor plan. Then, an image-processing based recovery method is employed to correct information extracted in the first step. Our proposed method is fully automatic and real-time. Such an analysis system provides high accuracy and has also been evaluated on a public website that, on average, records more than ten thousand effective uses per day and reaches a relatively high satisfaction rate.

  9. Study of the behavior of automatic track detectors for radon determination

    International Nuclear Information System (INIS)

    Moreno C, A.

    1997-01-01

    Both the alpha decay of radon and its alpha- and beta-emitting daughters may affect living cells. In this thesis, experiments have been performed to study the response of different alpha-particle detectors to environmental radon. Two kinds of detectors were studied, both in the laboratory and in the field: a) passive solid state nuclear track detectors, LR 115 type II, capable of integrating the alpha particles over a given period of time, and b) an automatic active detector, Clipperton, that continuously accumulates the alpha counts from radon decay. LR-115 track detectors were exposed in the laboratory to alpha particles from a radioactive source and to a controlled radon atmosphere. The detectors were also exposed to electrons from an electron accelerator. The number of alpha tracks in the detectors was evaluated with two kinds of spark counters. The response of the track detectors as a function of the number of alpha tracks showed a reproducibility of 92%, and the effect of electron doses showed that the bulk etching velocity varied as a function of the electron dose. Additionally, some changes were introduced in an SSNTD exchanger exposed to the radon chamber in order to reduce the background in the non-exposed positions. A conversion factor of 0.016 tracks/cm² per 10 h of exposure per Bq/m³ was obtained. The response of the two spark counters was similar. Field soil radon determinations were performed with track detectors over 11 months and with the active detector over 5 months, with exposures of one month and one hour respectively. When calculated for the same exposure periods, the response of both systems was similar. However, differences were quite striking between the patterns of short- and long-term exposure periods, since short-term fluctuations are explicitly shown by the active detector while they are integrated in the passive one. (Author)
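
    Using the conversion factor quoted in the record above, a mean radon concentration follows directly from the measured track density and the exposure time; the numbers in the example below are illustrative only.

```python
# Back-of-the-envelope radon concentration from alpha-track density.
def radon_concentration(track_density_cm2, exposure_hours,
                        cf=0.016 / 10.0):        # tracks/cm^2 per hour per Bq/m^3
    """Return the mean radon concentration in Bq/m^3."""
    return track_density_cm2 / (cf * exposure_hours)

# e.g. 350 tracks/cm^2 accumulated over a one-month (720 h) exposure
print(f"{radon_concentration(350.0, 720.0):.0f} Bq/m^3")
```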

  10. Automatic Processing and Interpretation of Long Records of Endogenous Micro-Seismicity: the Case of the Super-Sauze Soft-Rock Landslide.

    Science.gov (United States)

    Provost, F.; Malet, J. P.; Hibert, C.; Doubre, C.

    2017-12-01

    The Super-Sauze landslide is a clay-rich landslide located in the Southern French Alps. The landslide exhibits a complex pattern of deformation: a large number of rockfalls are observed in the 100 m high main scarp, while the deformation of the upper part of the accumulated material is mainly affected by material shearing along stable in-situ crests. Several fissures are locally observed. The shallowest layer of the accumulated material tends to behave in a brittle manner but may undergo fluidization and/or rapid acceleration. Previous studies have demonstrated the presence of a rich endogenous micro-seismicity associated with the deformation of the landslide. However, the lack of long-term seismic records and suitable processing chains prevented a full interpretation of the links between the external forcings, the deformation and the recorded seismic signals. Since 2013, two permanent seismic arrays have been installed in the upper part of the landslide. We here present the methodology adopted to process this dataset. The processing chain consists of a set of automated methods for robust detection, classification and location of the recorded seismicity. Thousands of events are detected and further automatically classified. The classification method is based on the description of the signal through attributes (e.g. waveform and spectral content properties). These attributes are used as inputs to classify the signal with a Random Forest machine-learning algorithm into four classes: endogenous micro-quakes, rockfalls, regional earthquakes and natural/anthropogenic noise. The endogenous landslide sources (i.e. micro-quakes and rockfalls) are further located. The location method is adapted to the type of event. The micro-quakes are located with a 3D velocity model derived from a seismic tomography campaign and an optimization of the first arrival picking with the inter-trace correlation of the P-wave arrivals. The rockfalls are located by optimizing the inter
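
    The classification stage described above can be sketched as follows: per-event attributes feed a Random Forest that separates the four classes. The attribute set, the random data and the forest settings are placeholders, not the Super-Sauze catalogue or the authors' exact configuration.

```python
# Hedged sketch: Random Forest classification of seismic events from signal attributes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
classes = ["micro-quake", "rockfall", "earthquake", "noise"]
X = rng.normal(size=(2000, 8))                  # 8 waveform/spectral attributes per event
y = rng.integers(0, len(classes), size=2000)    # placeholder labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
print("attribute importances:", np.round(clf.feature_importances_, 3))
```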

  11. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein

    2013-10-01

    Full Text Available A two-component fully automated flood monitoring system is described and evaluated. It is the result of combining two individual flood services that are currently under development at DLR's (German Aerospace Center) Center for Satellite based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations at different spatial resolutions and in the time-critical and on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web-client. The system is operationally demonstrated and evaluated via the monitoring of two recent flood events in Russia (2013) and Albania/Montenegro (2013).

  12. Do I Have My Attention? Speed of Processing Advantages for the Self-Face Are Not Driven by Automatic Attention Capture

    Science.gov (United States)

    Keyes, Helen; Dlugokencka, Aleksandra

    2014-01-01

    We respond more quickly to our own face than to other faces, but there is debate over whether this is connected to attention-grabbing properties of the self-face. In two experiments, we investigate whether the self-face selectively captures attention, and the attentional conditions under which this might occur. In both experiments, we examined whether different types of face (self, friend, stranger) provide differential levels of distraction when processing self, friend and stranger names. In Experiment 1, an image of a distractor face appeared centrally – inside the focus of attention – behind a target name, with the faces either upright or inverted. In Experiment 2, distractor faces appeared peripherally – outside the focus of attention – in the left or right visual field, or bilaterally. In both experiments, self-name recognition was faster than other name recognition, suggesting a self-referential processing advantage. The presence of the self-face did not cause more distraction in the naming task compared to other types of face, either when presented inside (Experiment 1) or outside (Experiment 2) the focus of attention. Distractor faces had different effects across the two experiments: when presented inside the focus of attention (Experiment 1), self and friend images facilitated self and friend naming, respectively. This was not true for stranger stimuli, suggesting that faces must be robustly represented to facilitate name recognition. When presented outside the focus of attention (Experiment 2), no facilitation occurred. Instead, we report an interesting distraction effect caused by friend faces when processing strangers’ names. We interpret this as a “social importance” effect, whereby we may be tuned to pick out and pay attention to familiar friend faces in a crowd. We conclude that any speed of processing advantages observed in the self-face processing literature are not driven by automatic attention capture. PMID:25338170

  13. Using image processing technology and mathematical algorithm in the automatic selection of vocal cord opening and closing images from the larynx endoscopy video.

    Science.gov (United States)

    Kuo, Chung-Feng Jeffrey; Chu, Yueng-Hsiang; Wang, Po-Chun; Lai, Chun-Yu; Chu, Wen-Lin; Leu, Yi-Shing; Wang, Hsing-Won

    2013-12-01

    The human larynx is an important organ for voice production and respiratory mechanisms. The vocal cord is approximated for voice production and open for breathing. The videolaryngoscope is widely used for vocal cord examination. At present, physicians usually diagnose vocal cord diseases by manually selecting the image of the vocal cord opening to the largest extent (abduction), thus maximally exposing the vocal cord lesion. On the other hand, the severity of diseases such as vocal cord palsy and atrophic vocal cords is largely dependent on the vocal cord closing to the smallest extent (adduction). Therefore, diseases can be assessed by the image of the vocal cord opening to the largest extent, and the seriousness of breathy voice is closely correlated to the gap between the vocal cords when closing to the smallest extent. The aim of the study was to design an automatic vocal cord image selection system to improve the conventional selection process by physicians and enhance diagnosis efficiency. Also, because the examination process produces unwanted fuzzy images caused by human factors, as well as non-vocal-cord images, texture analysis measuring image entropy is added in this study to establish a screening and elimination system that effectively enhances the accuracy of selecting the image of the vocal cord closing to the smallest extent. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
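
    The entropy screening mentioned above can be illustrated with a simple grey-level Shannon entropy: blurred or non-vocal-cord frames tend to have narrow, low-information histograms and can be discarded before frame selection. The threshold and the synthetic frames are assumptions for the example.

```python
# Hedged sketch: discard low-entropy (blurred/uninformative) video frames.
import numpy as np

def shannon_entropy(image, bins=256):
    """Shannon entropy (bits) of an 8-bit greyscale image."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(5)
sharp_frame = rng.integers(0, 256, size=(480, 640))                 # wide histogram
blurred_frame = rng.normal(128, 5, size=(480, 640)).clip(0, 255)    # narrow histogram

for name, frame in [("sharp", sharp_frame), ("blurred", blurred_frame)]:
    h = shannon_entropy(frame)
    print(f"{name}: entropy = {h:.2f} bits, keep = {h > 5.0}")
```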

  14. Attention biases in preoccupation with body image: An ERP study of the role of social comparison and automaticity when processing body size.

    Science.gov (United States)

    Uusberg, Helen; Peet, Krista; Uusberg, Andero; Akkermann, Kirsti

    2018-03-17

    Appearance-related attention biases are thought to contribute to body image disturbances. We investigated how preoccupation with body image is associated with attention biases to body size, focusing on the role of social comparison processes and automaticity. Thirty-six women varying on self-reported preoccupation compared their actual body size to size-modified images of either themselves or a figure-matched peer. Amplification of earlier (N170, P2) and later (P3, LPP) ERP components recorded under low vs. high concurrent working memory load were analyzed. Women with high preoccupation exhibited an earlier bias to larger bodies of both self and peer. During later processing stages, they exhibited a stronger bias to enlarged as well as reduced self-images and a lack of sensitivity to size-modifications of the peer-image. Working memory load did not affect these biases systematically. Current findings suggest that preoccupation with body image involves an earlier attention bias to weight increase cues and later over-engagement with own figure. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

    The main goal of the presented work was to develop a multifunctional beam composed of fiber-reinforced plastics (FRP) and an embedded optical fiber with various fiber Bragg grating sensors (FBG). These beams are developed for use as structural members for bridges or industrial applications. It is now possible to realize large-scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted in order not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analyses were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). Next to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  16. A new automatic SAR-based flood mapping application hosted on the European Space Agency's grid processing on demand fast access to imagery environment

    Science.gov (United States)

    Hostache, Renaud; Chini, Marco; Matgen, Patrick; Giustarini, Laura

    2013-04-01

    There is a clear need for developing innovative processing chains based on earth observation (EO) data to generate products supporting emergency response and flood management at a global scale. Here an automatic flood mapping application is introduced. The latter is currently hosted on the Grid Processing on Demand (G-POD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver flooded areas using both recent and historical acquisitions of SAR data in an operational framework. It is worth mentioning that the method can be applied to both medium and high resolution SAR images. The flood mapping application consists of two main blocks: 1) A set of query tools for selecting the "crisis image" and the optimal corresponding pre-flood "reference image" from the G-POD archive. 2) An algorithm for extracting flooded areas using the previously selected "crisis image" and "reference image". The proposed method is a hybrid methodology, which combines histogram thresholding, region growing and change detection as an approach enabling the automatic, objective and reliable flood extent extraction from SAR images. The method is based on the calibration of a statistical distribution of "open water" backscatter values inferred from SAR images of floods. Change detection with respect to a pre-flood reference image helps reducing over-detection of inundated areas. The algorithms are computationally efficient and operate with minimum data requirements, considering as input data a flood image and a reference image. Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate pre-flood reference image. Potential users will also be able to apply the implemented flood delineation algorithm. Case studies of several recent high magnitude flooding events (e.g. July 2007 Severn River flood

  17. An automatic method for detection and classification of Ionospheric Alfvén Resonances using signal and image processing techniques

    Science.gov (United States)

    Beggan, Ciaran

    2014-05-01

    which is then treated as an image. In combination with the spectrogram image of that day, the SRS are identified using image processing techniques. The peaks can now be mapped as continuous lines throughout the spectrogram. Finally, we can investigate the f and Δf statistics over the entire length of the dataset. We intend to run the coils as a long term experiment. The data and code are available on request.

  18. The Science of and Advanced Technology for Cost-Effective Manufacture of High Precision Engineering Products. Volume 5. Automatic Generation of Process Outlines of Forming and Machining Processes.

    Science.gov (United States)

    1986-08-01

    c - circumscribing (e.g. c-polygon - circumscribing polygon), or subscript denoting compressive; C-P - Computation Parameter; CAPM - computer aided process ... technology. Assumptions about parts and processes apply to the bulk of them, but not necessarily to all of them. Deep ... forming is the assumption about rigid, perfectly plastic bodies. The introduction of elasto-plastic behavior refines these idealizations

  19. Spontaneous Facial Mimicry Is Enhanced by the Goal of Inferring Emotional States: Evidence for Moderation of "Automatic" Mimicry by Higher Cognitive Processes.

    Science.gov (United States)

    Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya

    2016-01-01

    A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target's emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target's emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.

  20. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    Science.gov (United States)

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p < 0.05) for generating entry-level interoperable clinical documents. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  1. DEVELOPMENT AND TESTING OF GEO-PROCESSING MODELS FOR THE AUTOMATIC GENERATION OF REMEDIATION PLAN AND NAVIGATION DATA TO USE IN INDUSTRIAL DISASTER REMEDIATION

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

    Full Text Available This paper introduces research done on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model aims at drawing a parcel clean-up plan. The model tests four different parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan in which the clean-up parcels are least numerous, considering it an optimal spatial configuration. The second model drifts the clean-up parcels of a work plan both vertically and horizontally, following a grid pattern with a sampling distance of a fifth of a parcel width, and keeps the most optimal drifted version, again with the aim of reducing the final number of parcel features. The last model aims at drawing a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model we demonstrated that, depending on the size and geometry of the features of the contaminated area layer, the number of clean-up parcels generated by the model varies in a range of 4% to 38% from plan to plan. Such a significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when 1/ the individual features of the contaminated area present a significant orientation in their geometry (features are long), 2/ the size of the pollution extent features becomes closer to the size of the parcels (scale effect). The second model shows only a 1% difference with the variation of feature number; so this last is less interesting for

  2. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fraction of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; and output of the results. Automation minimizes the scatter of the parameters and, by its simplification, is a great asset in routine work [fr
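
    One plausible reading of the computing sequence above is sketched below: background-subtracted kidney counts are corrected for kidney depth with an exponential attenuation factor and expressed as a fraction of the injected dose. The exponential form, the attenuation coefficient and all numbers are assumptions for illustration, not the system's actual calibration.

```python
# Hedged sketch: relative renal uptake with an assumed depth correction.
import math

MU = 0.12  # assumed effective linear attenuation coefficient in soft tissue (1/cm)

def kidney_fraction(kidney_counts, background_counts, depth_cm, injected_counts):
    """Fraction of the injected dose attributed to one kidney."""
    net = max(kidney_counts - background_counts, 0.0)
    corrected = net * math.exp(MU * depth_cm)    # compensate soft-tissue attenuation
    return corrected / injected_counts

left = kidney_fraction(52_000, 8_000, depth_cm=6.5, injected_counts=1_200_000)
right = kidney_fraction(61_000, 8_500, depth_cm=7.0, injected_counts=1_200_000)
print(f"left: {100 * left:.1f}% of dose, right: {100 * right:.1f}% of dose, "
      f"split L/R = {left / (left + right):.2f}/{right / (left + right):.2f}")
```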

  3. Automatic programmable air ozonizer

    International Nuclear Information System (INIS)

    Gubarev, S.P.; Klosovsky, A.V.; Opaleva, G.P.; Taran, V.S.; Zolototrubova, M.I.

    2015-01-01

    In this paper we describe a compact, economical, easy-to-manage automatic air ozonizer developed at the Institute of Plasma Physics of the NSC KIPT. It is designed for sanitation, disinfection of premises and cleaning the air of foreign odors. A distinctive feature of the developed device is the generation of a given concentration of ozone, approximately 0.7 of the maximum allowable concentration (MAC), and the automatic maintenance of that level. This allows people to be inside the treated premises during operation. A microprocessor controller was developed to control the operation of the ozonizer

  4. Automatic process control for pipeline: an ally in the quest for excellence in quality; Controle automatizado do processo de revestimento de dutos: um aliado na busca da excelencia em qualidade

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Christian E.; Santos, Paulo T.; Nunes, Matheus; Sartori, Marcio [Soco Ril do Brasil S.A., Pindamonhangaba, SP (Brazil); Populin, German [Soco-Ril da Argentina S.A. (Argentina); Ferreira, Joaquim C. [Tenaris Confab, Pindamonhangaba, SP (Brazil)

    2004-07-01

    A process is any specific combination of machines, methods, tools and people whose aim is to obtain products or services of high quality. Any change in one of these elements may result in another process. The concern with assuring its control is entirely due to the need for quick answers to deviations from the standards. In this way, it is possible to obtain as much uniformity as possible in the quality characteristic of interest. Through an automatic thickness control system it is possible to evaluate the process itself in real time, identifying tendencies in sequential or temporal performance. It is therefore possible to act instantly, adjusting the process and complying with the client's specification. The present paper describes the applicability of an automatically controlled system and shows the advantages of its use. (author)

  5. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Full Text Available Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has soared to automatically create procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  6. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. Therefore a fully automated test, which achieves a clearly higher reproducibility compared to partly automated variants, would provide a solution. In addition, the fully automated test reduces the radiation dose for the test personnel. (orig.) [de

  7. Automatic first-break picking using the instantaneous traveltime attribute

    KAUST Repository

    Saragiotis, Christos; Alkhalifah, Tariq Ali

    2012-01-01

    Picking the first breaks is an important step in seismic processing. The large volume of the seismic data calls for automatic and objective picking. We introduce a new automatic first-break picker, which uses specifically designed time windows
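
    The picker above is built on the instantaneous traveltime attribute; as a stand-in illustration of automatic first-break picking in general, the sketch below uses a classic STA/LTA energy-ratio criterion on a synthetic noisy trace. This is a different, deliberately simple technique, not the method of the record.

```python
# Hedged sketch: STA/LTA-style first-break pick on a synthetic trace.
import numpy as np

def sta_lta_pick(trace, sta=20, lta=200):
    """Return the sample where the short-term/long-term average energy ratio peaks."""
    energy = trace ** 2
    ratio = np.zeros(len(trace))
    for i in range(lta, len(trace) - sta):
        sta_avg = energy[i:i + sta].mean()       # short window just after sample i
        lta_avg = energy[i - lta:i].mean()       # long window just before sample i
        ratio[i] = sta_avg / (lta_avg + 1e-12)
    return int(np.argmax(ratio))

rng = np.random.default_rng(11)
trace = rng.normal(0.0, 0.05, 1000)                            # pre-arrival noise
onset = np.arange(600)
trace[400:] += np.sin(2 * np.pi * 0.03 * onset) * np.exp(-0.005 * onset)
print("picked first-break sample:", sta_lta_pick(trace))       # expected near 400
```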

  8. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  9. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic and self-contained data processing system, transportable on site, able to produce images such as ''A-scan'' and ''B-scan'', ... and to present the results of the inspection very quickly. It can be used in the case of pressure vessel inspection [fr

  10. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  11. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  12. Automatic identification in mining

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, D; Patrick, C [Mine Computers and Electronics Inc., Morehead, KY (United States)

    1998-06-01

    The feasibility of monitoring the locations and vital statistics of equipment and personnel in surface and underground mining operations has increased with advancements in radio frequency identification (RFID) technology. This paper addresses the use of RFID technology, which is relatively new to the mining industry, to track surface equipment in mine pits, loading points and processing facilities. Specific applications are discussed, including both simplified and complex truck tracking systems and an automatic pit ticket system. This paper concludes with a discussion of the future possibilities of using RFID technology in mining including monitoring heart and respiration rates, body temperatures and exertion levels; monitoring repetitious movements for the study of work habits; and logging air quality via personnel sensors. 10 refs., 5 figs.

  13. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of perlite in nodular cast irons, the porosity and average grain size in high-density sintered pellets of UO 2 , and the grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with the corresponding ones from the direct counting process: counting of systematic points (grid) to measure volume, and the intersection method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron resulted from the small difference in optical reflectivity of graphite and perlite. The porosity evaluation of sintered UO 2 pellets is also analyzed [pt
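
    The circular test-line idea mentioned above reduces to a one-line calculation: under the usual intercept convention, the mean linear intercept (a common grain-size measure) is the circumference of the test circle divided by the number of grain-boundary intersections. The values below are illustrative.

```python
# Worked example: mean linear intercept from a circular test line.
import math

def mean_intercept_length(radius_um, intersections):
    """Mean linear intercept (micrometres) from a circular test line."""
    return 2 * math.pi * radius_um / intersections

# e.g. a test circle of 100 um radius crossing 38 grain boundaries
print(f"mean intercept length: {mean_intercept_length(100.0, 38):.1f} um")
```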

  14. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  15. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies being used in this multidisciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  16. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S. (BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  17. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  18. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.]

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  19. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic systems have brought many revolutions to existing technologies. One technology that has seen considerable development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative answer to the energy crisis while reducing manpower when used in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems in manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and runs as a continuous process. The magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of electrical loads; the system is powered by a solar panel outputting electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  20. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space and state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  1. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  2. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
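
    The chain-rule propagation described in this record is easy to illustrate with forward-mode dual numbers. The short Python sketch below is purely illustrative and is not the FORTRAN source referenced by the record; the example function is hypothetical.

    class Dual:
        """Dual number: value plus first derivative, propagated by the chain rule."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)

        __rmul__ = __mul__

    def derivative(f, x):
        """Evaluate f and df/dx at the point x, with no truncation error."""
        out = f(Dual(x, 1.0))
        return out.val, out.der

    # Example: f(x) = x**2 * (x + 3) at x = 2  ->  f = 20, f' = 24
    print(derivative(lambda x: x * x * (x + 3), 2.0))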

  3. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    Abstract: An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase locked loop (PLL) is a suitable oscillator for an integrated circuit. It is a feedback system that compares the input phase with the output phase and can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  4. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines Active Contour Model, Live Wire method and Graph Cut approach (CLG). The aim of Live wire method is to provide control to the user on segmentation process during execution. Active Contour Model provides a statistical model of object shape and appearance to a new image which are built during a training phase. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...

  5. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and the maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  6. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for the automatic control of commercial computer programs is presented. A connection was developed between the EXAFS spectrometer automation system (managed by a PC running DOS) and the commercial program controlling the CCD detector (managed by a PC running Windows). The described complex system is used to automate intermediate amplitude spectra processing in EXAFS spectrum measurements at the Kurchatov SR source

  7. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  8. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970's and 1980's the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations in Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved, and the longest time series produced in the campaign spans almost five years nearly without interruption, but the data has never seen widespread use due to the choice of medium. Film is a difficult medium to digitize efficiently. Previously, events of interest were searched for by hand, and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps in preserving old, valuable data that might otherwise go unused.
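
    The core step of such a digitization — turning a scanned trace into numbers — can be sketched as follows. This is an illustrative reconstruction under simple assumptions (dark trace on a light background, assumed horizontal and vertical calibrations), not the authors' actual pipeline: for each image column, the row of the darkest pixel is taken as the trace position and mapped to physical units.

    import numpy as np

    def digitize_trace(film, px_per_hour, nT_per_px, baseline_row):
        """Extract one magnetogram trace from a scanned film image.

        film         : 2D array, dark trace on light background (0 = black).
        px_per_hour  : horizontal scale (assumed calibration).
        nT_per_px    : vertical scale (assumed calibration).
        baseline_row : image row corresponding to zero deflection.
        """
        rows = film.argmin(axis=0)               # darkest pixel in every column
        hours = np.arange(film.shape[1]) / px_per_hour
        values = (baseline_row - rows) * nT_per_px
        return hours, values

    # Synthetic example: a sinusoidal trace drawn into a light image
    img = np.full((200, 500), 255.0)
    cols = np.arange(500)
    trace_rows = (100 + 40 * np.sin(cols / 40.0)).astype(int)
    img[trace_rows, cols] = 0.0
    t, b = digitize_trace(img, px_per_hour=20.0, nT_per_px=2.5, baseline_row=100)
    print(t[:3], b[:3])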

  9. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provided dose-response curves which showed linearity through the use of the logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach proved useful in allowing automatic computation of data derived from the double antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
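
    As a rough illustration of the logit-log linearization attributed to Rodbard, the sketch below transforms the bound fraction to its logit and fits a straight line against log dose; the dose values and bound fractions are invented, and this is not the program written by Dr. Kawamata.

    import numpy as np

    # Hypothetical standard curve: dose vs bound fraction B/B0
    dose = np.array([2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
    bound = np.array([0.85, 0.72, 0.58, 0.42, 0.24, 0.14])

    # Logistic (logit) transformation linearizes the sigmoidal dose-response curve
    logit = np.log(bound / (1.0 - bound))
    logdose = np.log10(dose)

    # Straight-line fit on the transformed scale
    slope, intercept = np.polyfit(logdose, logit, 1)

    def dose_from_bound(b):
        """Invert the fitted standard curve to read an unknown sample."""
        x = (np.log(b / (1.0 - b)) - intercept) / slope
        return 10.0 ** x

    print(round(dose_from_bound(0.50), 1))   # unknown sample with 50% binding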

  10. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and still most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
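
    A toy version of the classic parameter-threshold approach (amplitude plus duration) might look like the sketch below; the threshold values and the synthetic trace are illustrative assumptions, and this is not any of the AESD systems reviewed in the record.

    import numpy as np

    def detect_spikes(eeg, fs, amp_uv=80.0, min_ms=20.0, max_ms=70.0):
        """Flag candidate spikes: runs above an amplitude threshold whose
        duration falls in a typical 20-70 ms range (illustrative values)."""
        above = np.abs(eeg) > amp_uv
        edges = np.diff(above.astype(int))
        starts = np.where(edges == 1)[0] + 1
        ends = np.where(edges == -1)[0] + 1
        events = []
        for s, e in zip(starts, ends):
            dur_ms = (e - s) * 1000.0 / fs
            if min_ms <= dur_ms <= max_ms:
                events.append((s / fs, dur_ms))   # (time in s, duration in ms)
        return events

    # Synthetic trace: background noise plus one sharp transient
    fs = 500
    t = np.arange(0, 2.0, 1.0 / fs)
    sig = 10.0 * np.random.randn(t.size)
    sig[500:515] += 120.0
    print(detect_spikes(sig, fs))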

  11. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  12. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  13. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  14. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  15. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, stop-position control of automatic guided vehicles, stacker cranes and automatic transfer control.

  16. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make the driver feel like they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  17. Observer design for non linear systems: application to automatic fault detection in process engineering; Synthese d'observateurs pour les systemes non lineaires. Application a la detection automatique de pannes en genie des procedes

    Energy Technology Data Exchange (ETDEWEB)

    Armanet, F.

    1999-04-01

    This thesis describes some theoretical contributions to state observer design for non-linear systems and the design of an automatic fault detection system for a petrochemical process. The first chapter is an overview of observer theory for non-linear systems. The second chapter presents a new methodology for high-gain observer design for single-output, U-uniformly observable systems. It consists in calculating a symmetric positive definite matrix which allows the design of a high-gain observer that converges exponentially. This observer is applied to estimate the concentrations in a perfectly mixed tank reactor with a kinetic scheme corresponding to the conversion of a product A into a product B, which is in turn converted into a product C. In the third chapter, the use of the high-gain observer is extended to systems which are not uniformly observable but for which all admissible inputs are locally regularly persistent. A characterization of some of this class of inputs is given, and an application to the preceding reactor illustrates the theory. The fourth chapter includes a summary of the observers used in residual generator design for linear and non-linear systems. Two examples of automatic fault detectors using these methods are described. In the annexes, a detailed study of the process modeling and the main observability properties are presented. (author)

  18. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. Proper Names and Named Entities Recognition in the Automatic Text Processing. Review of the book: Nouvel, D., Ehrmann, M., & Rosset, S. (2016). Named Entities for Computational Linguistics. London; Hoboken: ISTE Ltd; John Wiley & Sons, Inc., 2016.

    Directory of Open Access Journals (Sweden)

    Daria M. Golikova

    2018-03-01

    Full Text Available The reviewed book by Damien Nouvel, Maud Ehrmann, and Sophie Rosset Named Entities for Computational Linguistics deals with automatic processing of texts, written in a natural language, and with named entities recognition, aimed at extracting most important information in these texts. The notion of named entities here extends to the entire set of linguistic units referring to an object. The researchers minutely consider the concept of named entities, juxtaposing this category to that of proper names and comparing their definitions, and describe all the stages of creation and implementation of automatic text annotation algorithms, as well as different ways of evaluating their performance quality. Proper names, in this context, are seen as a particular instance of named entities, one of the typical sources of reference to real objects to be electronically recognized in the text. The book provides a detailed overview and analysis of previous studies in the same field, based mainly on the English language data. It presents instruments and resources required to create and implement the algorithms in question, these may include typologies, knowledge or databases, and various types of corpora. Theoretical considerations, proposed by the authors, are supported by a significant number of exemplary cases, with algorithms operation principles presented in charts. The reviewed book gives quite a comprehensive picture of modern computational linguistic studies focused on named entities recognition and indicates some problems which are unresolved as yet.

  20. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  1. Automatic welding of stainless steel tubing

    Science.gov (United States)

    Clautice, W. E.

    1978-01-01

    The use of automatic welding for making girth welds in stainless steel tubing was investigated as well as the reduction in fabrication costs resulting from the elimination of radiographic inspection. Test methodology, materials, and techniques are discussed, and data sheets for individual tests are included. Process variables studied include welding amperes, revolutions per minute, and shielding gas flow. Strip chart recordings, as a definitive method of insuring weld quality, are studied. Test results, determined by both radiographic and visual inspection, are presented and indicate that once optimum welding procedures for specific sizes of tubing are established, and the welding machine operations are certified, then the automatic tube welding process produces good quality welds repeatedly, with a high degree of reliability. Revised specifications for welding tubing using the automatic process and weld visual inspection requirements at the Kennedy Space Center are enumerated.

  2. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We present a recent state of the art. The book shows the main problems of ADS, the difficulties and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed...
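
    A minimal extractive summarizer of the statistical kind discussed in the book scores sentences by the frequency of their words and keeps the top-ranked ones; the sketch below is only an illustration of the idea, not an algorithm taken from the book.

    import re
    from collections import Counter

    def summarize(text, n_sentences=2):
        """Extractive summary: rank sentences by summed word frequency."""
        sentences = re.split(r'(?<=[.!?])\s+', text.strip())
        freq = Counter(re.findall(r'\w+', text.lower()))
        scored = sorted(
            sentences,
            key=lambda s: sum(freq[w] for w in re.findall(r'\w+', s.lower())),
            reverse=True,
        )
        # keep the top sentences, restored to their original order
        keep = set(scored[:n_sentences])
        return ' '.join(s for s in sentences if s in keep)

    doc = ("Automatic summarization selects the most informative sentences. "
           "Statistical approaches score sentences by word frequency. "
           "The weather was pleasant that day. "
           "Frequent content words usually mark informative sentences.")
    print(summarize(doc))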

  3. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  4. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k 0 and Q 0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  5. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since tracks in solid state track detectors are measured with a microscope, observers are forced to do hard work that consumes time and labour. This leads to poor statistical accuracy or to personal error. Therefore, much research has been done aimed at simplifying and automating track measurement. There are two categories of automated measurement: simple counting of the number of tracks, and measurements that also require geometrical elements such as the size of tracks or their coordinates as well as the number of tracks. The former is called automatic counting and the latter automatic analysis. The method generally used to evaluate the number of tracks in automatic counting is the estimation of the total number of tracks in the total detector area or in a field of view of a microscope. It is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high quality images obtained with a high resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Of the many kinds of automatic measurements reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)

  6. Quality Assessment of Process Measures in Antimicrobial Stewardship: Concordance of Valacyclovir Indication and Automatic Prospective Approval in Computerized Provider Order Entry

    Science.gov (United States)

    Lee, Tiffany; McCoy, Christopher; Mahoney, Monica V

    2017-01-01

    . Furthermore, only 46 orders (39.3%) were per BIDMC-protocol. Conclusion Concordance of CPOE indication selection and suspected/confirmed infection for valacyclovir was low. Using CPOE to grant automatic prospective approval must be monitored and audited for accuracy if employed as an AST tool. Disclosures All authors: No reported disclosures.

  7. Automatization of the radiation control measurements

    International Nuclear Information System (INIS)

    Seki, Akio; Ogata, Harumi; Horikoshi, Yoshinori; Shirai, Kenji

    1988-01-01

    The Plutonium Fuel Production Facility (PFPF) was constructed to fabricate the MOX fuels for the 'MONJU' and 'JOYO' reactors and to develop practical fuel fabrication technology. For the fuel fabrication process in this facility, a centralized control system has been adopted for mass production of the fuel and reduction of the radiation exposure dose. The radiation control systems are also suited to the large-scale facility and to the automatic, remote fuel fabrication process. One of the typical radiation control systems is the self-moving survey system, which has been developed by PNC and adopted for automatic routine monitoring. (author)

  8. Automatic Control Of Length Of Welding Arc

    Science.gov (United States)

    Iceland, William F.

    1991-01-01

    Nonlinear relationships among current, voltage, and length stored in electronic memory. Conceptual microprocessor-based control subsystem maintains constant length of welding arc in gas/tungsten arc-welding system, even when welding current varied. Uses feedback of current and voltage from welding arc. Directs motor to set position of torch according to previously measured relationships among current, voltage, and length of arc. Signal paths marked "calibration" or "welding" used during those processes only. Other signal paths used during both processes. Control subsystem added to existing manual or automatic welding system equipped with automatic voltage control.

  9. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.)

    International Nuclear Information System (INIS)

    Gerard, K.

    2008-11-01

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered a technique of choice to increase the local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of the treatment preparation, two axes of work have been defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm in the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with cancer of the oropharynx showed that the quality of the plan is also increased compared to a manual beam selection. Indeed, for equivalent or even better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results obtained with an ionisation chamber at Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment safety using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)
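
    The S.P.C. step can be illustrated with an individuals control chart: quality-control results are plotted against limits set at the mean plus or minus 2.66 times the average moving range. The sketch below uses invented dose-difference measurements and is not the C.A.V. data or analysis.

    import numpy as np

    # Hypothetical QC results: measured minus planned dose, in percent
    qc = np.array([0.4, -0.2, 0.8, 0.1, -0.5, 0.3, 0.6, -0.1, 0.2, 0.7])

    mean = qc.mean()
    moving_range = np.abs(np.diff(qc)).mean()

    # Standard individuals-chart limits: mean +/- 2.66 * average moving range
    ucl = mean + 2.66 * moving_range
    lcl = mean - 2.66 * moving_range

    out_of_control = np.where((qc > ucl) | (qc < lcl))[0]
    print(f"limits: [{lcl:.2f}, {ucl:.2f}] %, out-of-control points: {out_of_control}")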

  10. Automatic first-break picking using the instantaneous traveltime attribute

    KAUST Repository

    Saragiotis, Christos

    2012-01-01

    Picking the first breaks is an important step in seismic processing. The large volume of the seismic data calls for automatic and objective picking. We introduce a new automatic first-break picker, which uses specifically designed time windows and an iterative procedure based on the instantaneous traveltime attribute. The method is fast as it only uses a few FFTs per trace. We demonstrate the effectiveness of this automatic method by applying it to real test data.
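
    A first-break picker can be sketched with the simpler, classical STA/LTA energy-ratio criterion shown below; note that this is a substitute illustration and not the instantaneous-traveltime attribute introduced in the record. The window lengths and the synthetic trace are assumptions.

    import numpy as np

    def pick_first_break(trace, fs, sta_s=0.01, lta_s=0.1):
        """Crude first-break pick from the short-term/long-term energy ratio."""
        sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
        csum = np.concatenate(([0.0], np.cumsum(trace ** 2)))
        ratio = np.zeros(trace.size)
        for i in range(lta_n, trace.size - sta_n):
            sta = (csum[i + sta_n] - csum[i]) / sta_n      # energy just after i
            lta = (csum[i] - csum[i - lta_n]) / lta_n      # energy just before i
            ratio[i] = sta / (lta + 1e-12)
        return ratio.argmax() / fs    # pick at the maximum ratio, in seconds

    fs = 1000
    t = np.arange(0, 1.0, 1.0 / fs)
    trace = 0.05 * np.random.randn(t.size)
    trace[400:] += np.sin(2 * np.pi * 30 * t[400:])        # arrival near 0.4 s
    print(round(pick_first_break(trace, fs), 3))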

  11. 2nd International Conference on Mechatronics and Automatic Control

    CERN Document Server

    2015-01-01

    This book examines mechatronics and automatic control systems. The book covers important emerging topics in signal processing, control theory, sensors, mechanic manufacturing systems and automation. The book presents papers from the second International Conference on Mechatronics and Automatic Control Systems held in Beijing, China on September 20-21, 2014. It examines how to improve productivity through the latest advanced technologies, covering new systems and techniques in the broad field of mechatronics and automatic control systems.

  12. Cliff : the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us so much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggle by the elderly, people with physical disabilities, and

  13. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...

  14. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final Report (February 1978 - September 1980): Automatic Oscillating Turret System. Appendix: Oscillating Bumper Turret. Description: Turret Controls. Other criteria requirements were: 1. Turret controls inside cab. 2. Automatic oscillation with fixed elevation to range from 20° below the horizontal to

  15. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  16. Automatic sweep circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input is described. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found

  17. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  18. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  19. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  20. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  1. Report on achievements of research and development of an automatic sewing system in fiscal 1985. Sewing preparation and processing technology; 1985 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. Hosei junbi kako gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-03-01

    This paper describes the sewing preparation and processing technology, extracted from the achievement report for fiscal 1985 on the development of an automatic sewing system. Cloth property evaluation technology was discussed using the KES testing machine and the device for measuring composite transformation, large bending and composite surface properties. Multiple regression analysis was performed on the cloth properties required for the operation of each sewing process, and a summary of the cloth properties required for the sewing devices was identified. A case study was performed on a data banking system covering the sewing machine operating process. For the cloth stabilization technology, seven kinds of temporary hardening agents were selected and subjected to performance tests. The literature was surveyed on tolerances for human exposure to working-environment pollution. The continuously operating temporary-hardener processing device that was fabricated achieved the performance as designed. Discussions were given on temporary hardener flow rate and supply pressure, particle diameters and rotation speed of the rotating disk, particle properties and hardening performance, and work uniformity, whereby the performance of the continuous temporary hardening treatment was verified. A high correlation was derived between the relative handling difficulty and the cloth properties. With regard to the hardening standard, discussion is also required of the correlation with sewability. (NEDO)

  2. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.
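
    Supervised multispectral classification of the kind described can be illustrated with a minimum-distance-to-mean classifier over band vectors; the sketch below uses invented three-band training data and is not one of the programs used in the Yellowstone study.

    import numpy as np

    def train_class_means(pixels, labels):
        """Mean spectral signature per terrain class from training pixels."""
        classes = np.unique(labels)
        return classes, np.array([pixels[labels == c].mean(axis=0) for c in classes])

    def classify(pixels, classes, means):
        """Minimum-distance-to-mean classification of multispectral pixels."""
        d = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        return classes[d.argmin(axis=1)]

    # Toy 3-band data for two hypothetical classes: water (dark) and forest
    rng = np.random.default_rng(0)
    water = rng.normal([20, 15, 10], 3, size=(50, 3))
    forest = rng.normal([60, 80, 40], 3, size=(50, 3))
    train = np.vstack([water, forest])
    labels = np.array(['water'] * 50 + ['forest'] * 50)

    classes, means = train_class_means(train, labels)
    print(classify(np.array([[22.0, 14.0, 11.0], [58.0, 82.0, 39.0]]), classes, means))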

  3. METHODS OF AUTOMATIC QUALITY CONTROL OF AGGLUTINANTSANDS IN FOUNDRY

    Directory of Open Access Journals (Sweden)

    D. M. Kukuj

    2004-01-01

    Full Text Available The article is dedicated to a comparative analysis of the well-known methods of automatic quality control of agglutinant sands during their preparation, and to the problems of automating control of the mix preparation processes.

  4. Towards Automatic Decentralized Control Structure Selection

    DEFF Research Database (Denmark)

    A subtask in integration of design and control of chemical processes is the selection of a control structure. Automating the selection of the control structure enables sequential integration of process and control design. As soon as the process is specified or computed, a structure for decentralized control is determined automatically, and the resulting decentralized control structure is automatically tuned using standard techniques. Dynamic simulation of the resulting process system gives immediate feedback to the process design engineer regarding practical operability of the process. The control structure selection problem is formulated as a special MILP employing cost coefficients which are computed using Parseval's theorem combined with RGA and IMC concepts. This approach enables selection and tuning of large-scale plant-wide decentralized controllers through efficient combination...

  5. Towards Automatic Decentralized Control Structure Selection

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2000-01-01

    A subtask in integration of design and control of chemical processes is the selection of a control structure. Automating the selection of the control structure enables sequential integration of process and control design. As soon as the process is specified or computed, a structure for decentralized control is determined automatically, and the resulting decentralized control structure is automatically tuned using standard techniques. Dynamic simulation of the resulting process system gives immediate feedback to the process design engineer regarding practical operability of the process. The control structure selection problem is formulated as a special MILP employing cost coefficients which are computed using Parseval's theorem combined with RGA and IMC concepts. This approach enables selection and tuning of large-scale plant-wide decentralized controllers through efficient combination...
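
    The RGA that enters such cost coefficients is straightforward to compute for a square steady-state gain matrix: Λ = G ⊙ (G⁻¹)ᵀ. The sketch below uses a textbook 2x2 gain matrix for illustration and is not the authors' MILP formulation.

    import numpy as np

    def relative_gain_array(G):
        """RGA of a square steady-state gain matrix: elementwise G * (G^-1)^T."""
        return G * np.linalg.inv(G).T

    # Hypothetical 2x2 steady-state gain matrix of a distillation-like process
    G = np.array([[12.8, -18.9],
                  [ 6.6, -19.4]])
    rga = relative_gain_array(G)
    print(np.round(rga, 2))
    # Pairings with RGA elements close to 1 are the preferred
    # input-output pairs for decentralized (diagonal) control.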

  6. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and of automatic classification are examined [fr

  7. Automatic centring and bonding of lenses

    Science.gov (United States)

    Krey, Stefan; Heinisch, J.; Dumitrescu, E.

    2007-05-01

    We present an automatic bonding station which is able to center and bond individual lenses or doublets to a barrel with sub-micron centring accuracy. The complete manufacturing cycle includes the glue dispensing and UV curing. During the process, the state of centring is continuously monitored by the vision software, and the final result is recorded to a file for process statistics. Simple pass or fail results are displayed to the operator at the end of the process.

  8. A macro-directive mechanism that facilitates automatic updating and processing of the contents of Electronic Healthcare Records: an extension to the CEN architecture.

    Science.gov (United States)

    Deftereos, S; Lambrinoudakis, C; Gritzalis, S; Georgonikou, D; Andriopoulos, P; Aessopos, A

    2003-03-01

    Facilitating data entry, eliminating redundant effort and providing decision support are some of the factors upon which the successful uptake of Electronic Healthcare Record (EHCR) technology is dependent. The European Standardization Committee (CEN), on the other hand, has proposed a standard EHCR architecture, which allows patient record contents to be highly diverse, customized to individual user needs; this makes their processing a challenging task and poses a demand for specially designed mechanisms. We describe the requirements for a macro-directive mechanism, pertaining to CEN-compatible EHCR software that can automate updating and processing of patient records, thus enhancing the functionality of the software. We have implemented the above-mentioned mechanism in an EHCR application that has been customized for use in the care process of patients suffering from beta-Thalassemia. The application is being used during the last two years in the Thalassemia units of four Greek hospitals, as part of their every day practice. We report on the experience we have acquired so far.

  9. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for adapting to the environment at the plant site. Described here are the latest automatic welders in practical use for welding nuclear power apparatus at Toshiba and IHI factories: those for pipes and for lining tanks. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, followed by butt welding through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance at the shops as well as at the plant site. (author)

  10. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  11. Automatic video surveillance of outdoor scenes using track before detect

    DEFF Research Database (Denmark)

    Hansen, Morten; Sørensen, Helge Bjarup Dissing; Birkemark, Christian M.

    2005-01-01

    This paper concerns automatic video surveillance of outdoor scenes using a single camera. The first step in automatic interpretation of the video stream is activity detection based on background subtraction. Usually, this process will generate a large number of false alarms in outdoor scenes due...

  12. 2013 International Conference on Mechatronics and Automatic Control Systems

    CERN Document Server

    2014-01-01

    This book examines mechatronics and automatic control systems. The book covers important emerging topics in signal processing, control theory, sensors, mechanic manufacturing systems and automation. The book presents papers from the 2013 International Conference on Mechatronics and Automatic Control Systems held in Hangzhou, China on August 10-11, 2013. .

  13. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  14. Semi-automatic logarithmic converter of logs

    International Nuclear Information System (INIS)

    Gol'dman, Z.A.; Bondar's, V.V.

    1974-01-01

    Semi-automatic logarithmic converter of logging charts. An original semi-automatic converter was developed for converting BK resistance logging charts and the time interval, ΔT, of acoustic logs from a linear to a logarithmic scale with a specific ratio, for subsequent combination with neutron-gamma logging charts in the operative interpretation of logging materials by a normalization method. The converter increases productivity by giving curves different from those obtained in manual, pointwise processing. The equipment operates reliably and is simple to use. (author)

  15. Development of automatic laser welding system

    International Nuclear Information System (INIS)

    Ohwaki, Katsura

    2002-01-01

    Lasers are a new production tool for high speed, low distortion welding, and their application to automatic welding lines is increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, the welding of airplane engines and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system and so on. (author)

  16. Welding with the TIG automatic process of the end fittings for the execution of the Embalse nuclear power plant fuel channel rechange

    International Nuclear Information System (INIS)

    Suarez, P.O.

    1990-01-01

    This work describes the methodology for cutting the existing weld and for the subsequent welding, by the automatic TIG process, of the coupling composed of the shroud ring and the end-fitting ring of one of the Embalse nuclear power plant's fuel channels. The replacement is determined beforehand by the SLAR-ETTE mechanism, in which, through a displacement operated among the garter spring rings, the pressure tubes are separated from the Calandria tubes. The weld to be carried out has the function of sealing the annular CO 2 gas (thermal insulator) circulating between the pressure tube and the Calandria tube during operation of the plant. (Author) [es

  17. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To get these measurements, a segmentation method for cochlea medical images is needed. An important pre-processing step for good cochlea segmentation involves efficient image registration. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a big challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. This method is based on using small areas that have clear structures from both input images instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent Optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate 3D rigid transform parameters. The state-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset which can be downloaded for free from a public XNAT server.
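
    The Dice Similarity Coefficient used for the quantitative comparison is simple to compute for two binary segmentation masks; the snippet below is a generic illustration with toy masks, not part of the elastix-based 3D Slicer plugin.

    import numpy as np

    def dice(mask_a, mask_b):
        """Dice Similarity Coefficient of two binary masks: 2|A∩B| / (|A|+|B|)."""
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Two toy segmentations that overlap partially
    a = np.zeros((64, 64), dtype=bool); a[20:40, 20:40] = True
    b = np.zeros((64, 64), dtype=bool); b[25:45, 25:45] = True
    print(round(dice(a, b), 3))   # about 0.56 for this overlap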

  18. Method for automatic localization of MR-visible markers using morphological image processing and conventional pulse sequences: feasibility for image-guided procedures.

    Science.gov (United States)

    Busse, Harald; Trampel, Robert; Gründer, Wilfried; Moche, Michael; Kahn, Thomas

    2007-10-01

    To evaluate the feasibility and accuracy of an automated method to determine the 3D position of MR-visible markers. Inductively coupled RF coils were imaged in a whole-body 1.5T scanner using the body coil, two conventional gradient echo sequences (FLASH and TrueFISP), and large imaging volumes of up to (300 mm)(3). To minimize background signals, a flip angle of approximately 1 degree was used. Morphological 2D image processing in orthogonal scan planes was used to determine the 3D positions of a configuration of three fiducial markers (FMC). The accuracies of the marker positions and of the orientation of the plane defined by the FMC were evaluated at various distances r(M) from the isocenter. Fiducial marker detection with conventional equipment (pulse sequences, imaging coils) was very reliable and highly reproducible over a wide range of experimental conditions. For r(M) processing is feasible, simple, and very accurate. In combination with safe wireless markers, the method is found to be useful for image-guided procedures. (c) 2007 Wiley-Liss, Inc.

  19. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI, a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  20. Report on achievements of research and development of an automatic sewing system in fiscal 1986. Sewing preparation and processing technology; 1986 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. Hosei junbi kako gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-01

    This paper describes the sewing preparation and processing technology from the fiscal 1986 achievement report on developing an automatic sewing system. For the cloth property evaluation technology, devices such as the KES testing machine, the measuring device for composite transformation, large transformation and bending, and composite surface properties, the cloth air permeability testing machine, and the thermal property measuring device were used to measure different kinds of cloths and evaluate cloth properties related to sewing performance. The system was verified with regard to data banking to confirm its feasibility for practical application. Among the cloth stabilizing technologies, discussions covered the age-based stability of cloth properties, repetitive transformation resistance, temperature, hardener deposition amount, and bending rigidity in cloths treated with temporary hardening agents. Discoloration, robustness, and damage to equipment caused by the cloths were also investigated. A partial and temporary hardening device was fabricated to make partial hardening possible. A continuous deflocculating device was also fabricated, making deflocculation of temporarily hardened cloth possible. Deflocculation was discussed in relation to workability in the sewing process, and it was revealed that low-level hardening treatment is effective. It was verified that handling is facilitated, particularly for cloths that tend to curl. (NEDO)

  1. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Hj Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2000-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatic selection of the area of interest on the image and then classification of common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. These methods of defect classification result in a high success rate. Our experience showed that sharp film images produced better results

  2. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2001-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper, a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatic selection of the area of interest on the image and then classification of common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. These methods of defect classification result in a high success rate. Our experience showed that sharp film images produced better results. (Author)
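
    A hedged sketch of the feature-analysis step described in both records: candidate defect regions are labelled and their area, elongation and orientation computed, after which a simple rule assigns a defect type. The threshold and the toy classification rule are assumptions, not the authors' algorithms.

```python
# Hedged sketch: region labelling and shape attributes for weld-defect candidates.
import numpy as np
from skimage import measure

def defect_features(image, threshold):
    mask = image > threshold                      # candidate defect pixels
    labelled = measure.label(mask)
    feats = []
    for region in measure.regionprops(labelled):
        feats.append({
            "area": region.area,                                   # size in pixels
            "elongation": region.major_axis_length /
                          max(region.minor_axis_length, 1e-6),     # shape
            "orientation": region.orientation,                     # radians
        })
    return feats

def classify(f):
    # Toy rule: long, thin indications are crack-like; compact ones porosity-like.
    return "crack-like" if f["elongation"] > 4 else "porosity-like"
```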

  3. Culture, attribution and automaticity: a social cognitive neuroscience view.

    Science.gov (United States)

    Mason, Malia F; Morris, Michael W

    2010-06-01

    A fundamental challenge facing social perceivers is identifying the cause underlying other people's behavior. Evidence indicates that East Asian perceivers are more likely than Western perceivers to reference the social context when attributing a cause to a target person's actions. One outstanding question is whether this reflects a culture's influence on automatic or on controlled components of causal attribution. After reviewing behavioral evidence that culture can shape automatic mental processes as well as controlled reasoning, we discuss the evidence in favor of cultural differences in automatic and controlled components of causal attribution more specifically. We contend that insights emerging from social cognitive neuroscience research can inform this debate. After introducing an attribution framework popular among social neuroscientists, we consider findings relevant to the automaticity of attribution, before speculating how one could use a social neuroscience approach to clarify whether culture affects automatic, controlled or both types of attribution processes.

  4. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Statistical learning has been getting more and more interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools, i.e. tools to quickly build DAGs of computation that are fully differentiable, focusing on one such tool, "PyTorch"; easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase where the neural network has to be integrated into CERN's bulk data-processing C++-only environment; and some recent models in deep learning for segmentation and generation that might be useful for particle-physics problems.
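
    A tiny, self-contained illustration of the automatic-differentiation idea referred to in the talk: PyTorch records the computation graph during the forward pass and back-propagates gradients through it. The toy tensors and loss are invented for the example.

```python
# Hedged sketch: reverse-mode automatic differentiation with PyTorch.
import torch

x = torch.tensor([1.5, -2.0, 0.3], requires_grad=True)
w = torch.tensor([0.4, 0.1, -0.7], requires_grad=True)

y = torch.tanh(w @ x)          # forward pass builds the computation DAG
loss = (y - 1.0) ** 2
loss.backward()                # reverse-mode AD through the recorded DAG

print(x.grad)                  # d(loss)/dx
print(w.grad)                  # d(loss)/dw
```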

  5. AUTOMATIC FREQUENCY CONTROL SYSTEM

    Science.gov (United States)

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency-determining element. A low frequency is mixed with the output of the driving oscillator and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal that adjusts a tuning element in the cavity and matches the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.
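
    A hedged sketch of the frequency bookkeeping implied by this description; the symbols below are introduced here for illustration and do not come from the patent abstract.

```latex
% driving oscillator f_d, mixing frequency f_m, cavity-controlled secondary oscillator f_c
\[
  f_{\pm} = f_d \pm f_m, \qquad
  e \;=\; (f_c - f_{-}) - (f_{+} - f_c) \;=\; 2\,(f_c - f_d),
\]
% so the sign and magnitude of the comparison e tell the servo which way, and how far,
% to tune the cavity until f_c = f_d.
```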

  6. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58 %. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge in the case of setting the initial matrix as identity, while this was not achieved by manual data sets. Given the same initial matrix, the repeatability of the automatic was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the US image four corners. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated
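
    At the heart of any point-based US calibration is the estimation of a rigid transform from matched point pairs. The sketch below shows a standard least-squares (Kabsch/SVD) solution; it is a generic stand-in, and the paper's active-echo localization and robot-guided data collection are not modelled.

```python
# Hedged sketch: least-squares rigid transform between matched 3D point sets.
import numpy as np

def rigid_fit(p_img, p_rob):
    """Rigid transform (R, t) mapping points p_img (Nx3) onto p_rob (Nx3)."""
    ci, cr = p_img.mean(0), p_rob.mean(0)
    H = (p_img - ci).T @ (p_rob - cr)            # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cr - R @ ci
    return R, t

# residuals = p_rob - (p_img @ R.T + t)  -> sub-millimetre RMS indicates a good calibration
```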

  7. Executive functions and motivation as moderators of the relationship between automatic associations and alcohol use in problem drinkers seeking online help

    NARCIS (Netherlands)

    van Deursen, D.S.; Salemink, E.; Boendermaker, W.J.; Pronk, T.; Hofmann, W.; Wiers, R.W.

    2015-01-01

    Background: Dual process models posit that problem drinking is maintained by an imbalance between relatively strong automatic processes and weak controlled processes, a combination of executive functions and motivation. Few studies have examined how the interplay between automatic processes and

  8. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: micro computer control and readout, nineteen large area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company-controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  9. The digital ultrasonic test unit for automatic equipment

    International Nuclear Information System (INIS)

    Hiraoka, T.; Matsuyama, H.

    1976-01-01

    The operation and features of the ultrasonic test unit used and the digital data processing techniques employed are described. This unit is used in multi-channel automatic ultrasonic test equipment with a few hundred channels

  10. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  11. VELA Network Evaluation and Automatic Processing Research

    Science.gov (United States)

    1974-12-09

    [The abstract in the source record is a scanned table of seismic event measurements — event number, distance (degrees), readings at T = 20 s, 30 s and 40 s, and an LQ/LR ratio — whose values are not legible.]

  12. Speed and automaticity of word recognition - inseparable twins?

    DEFF Research Database (Denmark)

    Poulsen, Mads; Asmussen, Vibeke; Elbro, Carsten

    'Speed and automaticity' of word recognition is a standard collocation. However, it is not clear whether speed and automaticity (i.e., effortlessness) make independent contributions to reading comprehension. In theory, both speed and automaticity may save cognitive resources for comprehension...... processes. Hence, the aim of the present study was to assess the unique contributions of word recognition speed and automaticity to reading comprehension while controlling for decoding speed and accuracy. Method: 139 Grade 5 students completed tests of reading comprehension and computer-based tests of speed...... of decoding and word recognition together with a test of effortlessness (automaticity) of word recognition. Effortlessness was measured in a dual task in which participants were presented with a word enclosed in an unrelated figure. The task was to read the word and decide whether the figure was a triangle...

  13. Automatic Encoding and Language Detection in the GSDL

    Directory of Open Access Journals (Sweden)

    Otakar Pinkas

    2014-10-01

    Full Text Available Automatic detection of the encoding and language of a text is part of the Greenstone Digital Library Software (GSDL) for building and distributing digital collections. It is developed by the University of Waikato (New Zealand) in cooperation with UNESCO. Automatic encoding and language detection in Slavic languages is difficult and it sometimes fails. The aim is to detect cases of failure. The automatic detection in the GSDL is based on the n-grams method. The most frequent n-grams for Czech are presented. The whole process of automatic detection in the GSDL is described. The input documents to the test collections are plain texts encoded in ISO-8859-1, ISO-8859-2 and Windows-1250. We manually evaluated the quality of automatic detection. The causes of errors include the predominance of an improper language model and an incorrect switch to Windows-1250. We carried out further tests on documents that were more complex.
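
    A simplified sketch of the character n-gram idea on which the GSDL detector is based: rank-ordered n-gram profiles are built per language and a document is assigned to the closest profile. The profile size and the Cavnar-Trenkle-style distance used here are assumptions, not the GSDL implementation.

```python
# Hedged sketch: character n-gram profiles and an out-of-place distance.
from collections import Counter

def ngram_profile(text, n=3, top=300):
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [g for g, _ in grams.most_common(top)]     # rank-ordered n-grams

def out_of_place(doc_profile, lang_profile):
    """Sum of rank differences between the document and a language profile."""
    pos = {g: i for i, g in enumerate(lang_profile)}
    penalty = len(lang_profile)
    return sum(abs(i - pos.get(g, penalty)) for i, g in enumerate(doc_profile))

def detect(text, lang_profiles):
    doc = ngram_profile(text)
    return min(lang_profiles, key=lambda lang: out_of_place(doc, lang_profiles[lang]))
```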

  14. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  15. PACS quality control and automatic problem notifier

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, through sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and adjusted for contrast. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established

  16. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the Foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by this same program, and incorporation of this program into other data processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology

  17. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  18. Automatic change detection to facial expressions in adolescents

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Jiannong, Shi

    2016-01-01

    Adolescence is a critical period for the neurodevelopment of social-emotional processing, wherein the automatic detection of changes in facial expressions is crucial for the development of interpersonal communication. Two groups of participants (an adolescent group and an adult group) were...... in facial expressions between the two age groups. The current findings demonstrated that the adolescent group featured more negative vMMN amplitudes than the adult group in the fronto-central region during the 120–200 ms interval. During the time window of 370–450 ms, only the adult group showed better...... automatic processing of fearful faces than happy faces. The present study indicated that adolescents possess stronger automatic detection of changes in emotional expression relative to adults, and sheds light on the neurodevelopment of automatic processes concerning social-emotional information....

  19. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows defining relationships of different model entities and their merging types at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  20. Research on automatic control system of greenhouse

    Science.gov (United States)

    Liu, Yi; Qi, Guoyang; Li, Zeyu; Wu, Qiannan; Meng, Yupeng

    2017-03-01

    This paper introduces an automatic control system for a greenhouse based on a single-chip microcomputer and a temperature and humidity sensor, describes the system's hardware structure, working principle and process, and reports a large number of experiments on the performance of the control system. The results show that the system can reliably control room temperature and humidity, can be used for indoor breeding and planting, and has versatility and portability.

  1. A Context Dependent Automatic Target Recognition System

    Science.gov (United States)

    Kim, J. H.; Payton, D. W.; Olin, K. E.; Tseng, D. Y.

    1984-06-01

    This paper describes a new approach to automatic target recognizer (ATR) development utilizing artificial intelligence techniques. The ATR system exploits contextual information in its detection and classification processes to provide a high degree of robustness and adaptability. In the system, knowledge about domain objects and their contextual relationships is encoded in frames, separating it from low-level image processing algorithms. This knowledge-based system demonstrates an improvement over the conventional statistical approach through the exploitation of diverse forms of knowledge in its decision-making process.

  2. Evaluation of automatic vacuum- assisted compaction solutions

    Directory of Open Access Journals (Sweden)

    M. Brzeziński

    2011-01-01

    Full Text Available Currently on the mould-making machines market, companies such as DiSA, KUENKEL WAGNER, HAFLINGER, HEINRICH WAGNER SINTO, HUNTER, SAVELLI and TECHNICAL play a significant role. These companies are the manufacturers of various solutions for machines and installations applied in foundry engineering. Automatic foundry machines for compaction of green sand have the major role in the mechanisation and automation of mould-making processes. The concept of operation of automatic machines is based on static and dynamic methods of compacting the green sand. The method which is gaining importance is compaction using the energy of air pressure. It is the initial stage of, or a supporting process for, compacting the green sand. However, in the automatic mould-making machines using this method it is essential to apply additional compaction of the mass in order to obtain the final parameters of the mould. In the constructional solutions of the machines there is an additional division which concerns the method of putting the sand into the mould box. This division distinguishes transport of the sand with simultaneous compaction from putting in the sand without pre-compaction. As the solutions of the major manufacturers are often applied in various foundries, the authors of the paper present their own evaluation process, confirmed by their own research and an independent analysis of the producers' solutions.

  3. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage for the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: hydrological terrain model generation with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally the production was launched. The key points of this work have been managing a big data environment (more than 160,000 LiDAR data files), the infrastructure to store (up to 40 TB between results and intermediate files) and process the data using local virtualization and the Amazon Web Service (AWS), which allowed this automatic production to be completed within 6 months; the software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) has also been important, and finally the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  4. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    Full Text Available National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage for the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: hydrological terrain model generation with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally the production was launched. The key points of this work have been managing a big data environment (more than 160,000 LiDAR data files), the infrastructure to store (up to 40 TB between results and intermediate files) and process the data using local virtualization and the Amazon Web Service (AWS), which allowed this automatic production to be completed within 6 months; the software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) has also been important, and finally the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
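
    To illustrate the "flow accumulation" criterion mentioned in both records, the sketch below routes each DEM cell to its steepest D8 neighbour, accumulates contributing cells and thresholds the result into a channel mask. It is only a toy version of the principle; the production workflow (2 m LiDAR-derived grids, depression filling, hydrographic criteria, TerraScan/ArcGIS tooling) is far more involved, and the threshold and synthetic DEM are assumptions.

```python
# Hedged sketch: D8 flow accumulation on a small synthetic DEM.
import numpy as np

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def flow_accumulation(dem):
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)              # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]          # process from highest to lowest
    for idx in order:
        r, c = divmod(int(idx), cols)
        best, target = 0.0, None
        for dr, dc in NEIGHBOURS:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                drop = dem[r, c] - dem[rr, cc]
                if drop > best:                        # steepest-descent neighbour
                    best, target = drop, (rr, cc)
        if target is not None:
            acc[target] += acc[r, c]                   # pass accumulated area downslope
    return acc

dem = np.random.rand(50, 50) + np.linspace(2, 0, 50)[:, None]   # synthetic sloping DEM
channels = flow_accumulation(dem) > 30                            # assumed channel threshold
```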

  5. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: Automatic decision making was more prevalent in the matrix (with high information accessibility, whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  6. A color hierarchy for automatic target selection.

    Directory of Open Access Journals (Sweden)

    Illia Tchernikov

    Full Text Available Visual processing of color starts at the cones in the retina and continues through ventral stream visual areas, called the parvocellular pathway. Motion processing also starts in the retina but continues through dorsal stream visual areas, called the magnocellular system. Color and motion processing are functionally and anatomically discrete. Previously, motion processing areas MT and MST have been shown to have no color selectivity to a moving stimulus; the neurons were colorblind whenever color was presented along with motion. This occurs when the stimuli are luminance-defined versus the background and is considered achromatic motion processing. Is motion processing independent of color processing? We find that motion processing is intrinsically modulated by color. Color modulated smooth pursuit eye movements produced upon saccading to an aperture containing a surface of coherently moving dots upon a black background. Furthermore, when two surfaces that differed in color were present, one surface was automatically selected based upon a color hierarchy. The strength of that selection depended upon the distance between the two colors in color space. A quantifiable color hierarchy for automatic target selection has wide-ranging implications from sports to advertising to human-computer interfaces.

  7. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...

  8. Discrete Model Reference Adaptive Control System for Automatic Profiling Machine

    Directory of Open Access Journals (Sweden)

    Peng Song

    2012-01-01

    Full Text Available An automatic profiling machine is a motion system with a high degree of parameter variation and a high frequency of transient processes, and it requires accurate control in time. In this paper, the discrete model reference adaptive control system of an automatic profiling machine is discussed. Firstly, the model of the automatic profiling machine is presented according to the parameters of the DC motor. Then the design of the discrete model reference adaptive control is proposed, and the control rules are proven. The results of simulation show that the adaptive control system has favorable dynamic performance.
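
    A hedged sketch of a discrete model-reference adaptive loop with MIT-rule gain adaptation for a first-order plant. The plant, reference model and adaptation gain are illustrative assumptions and are not the profiling machine's parameters or the paper's exact control law.

```python
# Hedged sketch: discrete MRAC with MIT-rule adaptation of a feedforward gain.
dt, gamma = 0.01, 0.5          # sample time and adaptation gain (assumed)
a, b, kp = 0.95, 0.05, 2.0     # "unknown" plant: y[k+1] = a*y[k] + b*kp*u[k]
am, bm = 0.95, 0.05            # reference model: ym[k+1] = am*ym[k] + bm*r[k]

theta, y, ym = 0.0, 0.0, 0.0
for k in range(8000):
    r = 1.0 if (k * dt) % 4 < 2 else -1.0     # square-wave reference
    u = theta * r                              # adjustable feedforward gain
    y = a * y + b * kp * u                     # plant update
    ym = am * ym + bm * r                      # reference model update
    e = y - ym
    theta -= gamma * e * ym * dt               # MIT rule: descend the gradient of e^2/2

print(f"adapted gain theta = {theta:.3f}  (ideal = {bm / (b * kp):.3f})")
```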

  9. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    This paper presents work on automatic grasp generation and grasp learning for reducing the manual setup time and increasing grasp success rates within bin-picking applications. We propose an approach that is able to generate good grasps automatically using a dynamic grasp simulator, a newly developed...... and achieve comparable results, and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented......

  10. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step in detecting the wrong, anomalous data is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation score generation and score interpretation. This paper presents the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the score interpretation, needs to be further investigated on the developed system.
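
    A hedged sketch of the score-generation stage for a single prepared sensor series: each check emits a score in [0, 1] and the scores are aggregated pessimistically, leaving interpretation to a later stage. The limits, window size and file name are assumptions, not the methods applied to the Belgrade data.

```python
# Hedged sketch: simple validation scores for a water-level time series.
import numpy as np

def range_score(x, low, high):
    return np.where((x >= low) & (x <= high), 1.0, 0.0)

def spike_score(x, max_step):
    step = np.abs(np.diff(x, prepend=x[0]))
    return np.clip(1.0 - step / max_step, 0.0, 1.0)

def flatline_score(x, window=10, tol=1e-6):
    scores = np.ones_like(x, dtype=float)
    for i in range(window, len(x)):
        if np.ptp(x[i - window:i + 1]) < tol:     # suspiciously constant sensor
            scores[i] = 0.0
    return scores

x = np.loadtxt("sewer_level.txt")                 # hypothetical prepared series
scores = np.vstack([range_score(x, 0.0, 5.0), spike_score(x, 0.3), flatline_score(x)])
combined = scores.min(axis=0)                     # pessimistic aggregation per sample
```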

  11. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  12. Designing automatic resupply systems.

    Science.gov (United States)

    Harding, M L

    1999-02-01

    This article outlines the process for designing and implementing autoresupply systems. The planning process includes determination of goals and appropriate participation. Different types of autoresupply mechanisms include kanban, breadman, consignment, systems contracts, and direct shipping from an MRP schedule.

  13. Automatic Deficits can lead to executive deficits in ADHD

    Directory of Open Access Journals (Sweden)

    Gabriella Martino

    2017-12-01

    Full Text Available Executive dysfunction has been well documented in children with Attention Deficit Hyperactivity Disorder (ADHD) and with Reading Disorder (RD). The purpose of the present study was to test the alternative hypothesis that deficits in executive functioning within ADHD may be partially due to an impairment of automatic processing. In addition, given the co-occurrence of ADHD and RD, we tested the hypothesis that automatic processing may be a possible common cognitive factor between ADHD and RD. We investigated the automatic processing of selective visual attention through two experiments. 12 children with ADHD, 17 with ADHD+RD and 29 typically developing children, matched for age and gender, performed two tasks: the Visual Information Processing Task and the Clock Test. As expected, the ADHD and ADHD+RD groups differed from the control group in the controlled-process task, suggesting a deficit in executive functioning. All clinical subjects also exhibited a lower performance in automatic processes compared to the control group. The results of this study suggest that executive deficits within ADHD can be partially due to an impairment of automatic processing.

  14. A framework for automatically checking anonymity with μ CRL

    OpenAIRE

    Chothia, T.; Orzan, S.M.; Pang, J.; Torabi Dashti, M.; Montanari, U.; Sannella, D.; Bruni, R.

    2007-01-01

    We present a powerful and flexible method for automatically checking anonymity in a possibilistic general-purpose process algebraic verification toolset. We propose new definitions of a choice anonymity degree and a player anonymity degree, to quantify the precision with which an intruder is able to single out the true originator of a given event or to associate the right event to a given protocol participant. We show how these measures of anonymity can be automatically calculated from a prot...

  15. Adapting Mask-RCNN for Automatic Nucleus Segmentation

    OpenAIRE

    Johnson, Jeremiah W.

    2018-01-01

    Automatic segmentation of microscopy images is an important task in medical image processing and analysis. Nucleus detection is an important example of this task. Mask-RCNN is a recently proposed state-of-the-art algorithm for object detection, object localization, and object instance segmentation of natural images. In this paper we demonstrate that Mask-RCNN can be used to perform highly effective and efficient automatic segmentations of a wide range of microscopy images of cell nuclei, for ...
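
    A hedged sketch of running an off-the-shelf torchvision Mask R-CNN on a microscopy tile; the paper adapts and retrains the model for nuclei, which is not reproduced here, and the file name and confidence threshold are placeholders.

```python
# Hedged sketch: instance segmentation with a pretrained torchvision Mask R-CNN.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Older torchvision API; newer releases prefer the weights= argument instead.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)
model.eval()

img = to_tensor(Image.open("nuclei_tile.png").convert("RGB"))   # hypothetical tile
with torch.no_grad():
    pred = model([img])[0]               # dict with boxes, labels, scores, masks

keep = pred["scores"] > 0.5              # assumed confidence threshold
masks = pred["masks"][keep] > 0.5        # per-instance binary masks
print(f"{masks.shape[0]} instances detected")
```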

  16. Development of a System for Automatic Recognition of Speech

    Directory of Open Access Journals (Sweden)

    Roman Jarina

    2003-01-01

    Full Text Available The article gives a review of research on the processing and automatic recognition of speech signals (ARR) at the Department of Telecommunications of the Faculty of Electrical Engineering, University of Žilina. On-going research is oriented to speech parametrization using 2-dimensional cepstral analysis, and to an application of HMMs and neural networks for speech recognition in the Slovak language. The article summarizes achieved results and outlines the future orientation of our research in automatic speech recognition.
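
    As a generic illustration of a cepstral front-end for such experiments, the sketch below extracts MFCC and delta features with librosa; the article's own 2-dimensional cepstral parametrization and the HMM/neural-network back-end are not reproduced, and the file name and frame settings are assumptions.

```python
# Hedged sketch: standard cepstral feature extraction for an ASR experiment.
import numpy as np
import librosa

y, sr = librosa.load("utterance.wav", sr=16000)          # hypothetical recording
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13,
                            n_fft=400, hop_length=160)   # 25 ms windows, 10 ms hop
delta = librosa.feature.delta(mfcc)                      # temporal derivatives
features = np.vstack([mfcc, delta])                      # 26-dimensional frames
print(features.shape)
```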

  17. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos

    2013-02-08

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.

  18. Mars - robust automatic backbone assignment of proteins

    International Nuclear Information System (INIS)

    Jung, Young-Sang; Zweckstetter, Markus

    2004-01-01

    MARS, a program for robust automatic backbone assignment of 13C/15N-labeled proteins, is presented. MARS does not require tight thresholds for establishing sequential connectivity or detailed adjustment of these thresholds, and it can work with a wide variety of NMR experiments. Using only 13Cα/13Cβ connectivity information, MARS allows automatic, error-free assignment of 96% of the 370-residue maltose-binding protein. MARS can successfully be used when data are missing for a substantial portion of residues or for proteins with very high chemical shift degeneracy, such as partially or fully unfolded proteins. Other sources of information, such as residue-specific information or known assignments from a homologous protein, can be included in the assignment process. MARS exports its result in SPARKY format. This allows visual validation and integration of automated and manual assignment

  19. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos; Alkhalifah, Tariq Ali; Fomel, Sergey

    2013-01-01

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.

  20. Automatic modulation recognition of communication signals

    CERN Document Server

    Azzouz, Elsayed Elsayed

    1996-01-01

    Automatic modulation recognition is a rapidly evolving area of signal analysis. In recent years, interest from the academic and military research institutes has focused around the research and development of modulation recognition algorithms. Any communication intelligence (COMINT) system comprises three main blocks: receiver front-end, modulation recogniser and output stage. Considerable work has been done in the area of receiver front-ends. The work at the output stage is concerned with information extraction, recording and exploitation and begins with signal demodulation, that requires accurate knowledge about the signal modulation type. There are, however, two main reasons for knowing the current modulation type of a signal; to preserve the signal information content and to decide upon the suitable counter action, such as jamming. Automatic Modulation Recognition of Communications Signals describes in depth this modulation recognition process. Drawing on several years of research, the authors provide a cr...

  1. Automatic calibration of gamma spectrometers

    International Nuclear Information System (INIS)

    Tluchor, D.; Jiranek, V.

    1989-01-01

    The principle of energy calibration of the spectrometric path, based on the measurement of a standard of one radionuclide or a set of them, is described. The entire computer-aided process is divided into three main steps, viz.: insertion of the calibration standard by the operator; the start of the calibration program; and energy calibration by the computer. The program was designed such that the spectrum identification should not depend on adjustment of the digital or analog elements of the gamma-spectrometric measuring path. The ECL program for automatic energy calibration is described, as are its control, the organization of the data file ECL.DAT and the necessary hardware support. The computer-multichannel analyzer communication was provided using an interface pair of Canberra 8673V and Canberra 8573 operating in the RS-422 standard. All subroutines for communication with the multichannel analyzer were written in MACRO 11, while the main program and the other subroutines were written in FORTRAN-77. (E.J.). 1 tab., 4 refs
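
    The numerical core of such an energy calibration is a fit of channel number against known gamma energies once the calibration peaks have been located. The sketch below shows a linear fit for three 152Eu lines; the peak-search step is omitted and the channel centroids are invented for illustration.

```python
# Hedged sketch: linear energy calibration from located calibration peaks.
import numpy as np

channels = np.array([295.2, 662.8, 1431.6])      # fitted peak centroids (assumed)
energies = np.array([121.78, 344.28, 778.90])    # known 152Eu gamma energies, keV

slope, offset = np.polyfit(channels, energies, 1)   # E(keV) = slope * channel + offset
print(f"E(keV) = {slope:.4f} * channel + {offset:.2f}")

residuals = energies - (slope * channels + offset)
print("max deviation:", np.abs(residuals).max(), "keV")
```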

  2. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. But for industrial applications, an easily handled tool for fast data analysis is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Based on experience, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific ones and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument which covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  3. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs...... (PP-CPNs) which is a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs caters for a four phase structure-based automatic code generation process directed by the control flow of processes....... The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF)....

  4. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting the concrete mixture quality related to the moisture content of aggregates, since the effectiveness of the concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that the unaccounted moisture of aggregates adversely affects the concrete mixture homogeneity and, accordingly, the strength of building structures. A new control method and the automatic control system of the concrete mixture homogeneity in the technological process of mixing components have been proposed, since the tasks of providing a concrete mixture are performed by the automatic control system of processing kneading-and-mixing machinery with operational automatic control of homogeneity. Theoretical underpinnings of the control of the mixture homogeneity are presented, which are related to a change in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the supply of water is determined depending on the change in the concrete mixture homogeneity during the continuous mixing of components. The following technical means for establishing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system offers a structure flowchart with transfer functions that determine the ACS operation in transient dynamic mode.

  5. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models of information exchange, such as Double Tax Treaties (DTT), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact with each other. Second, the so-called Rubik Strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  6. Development of diaphragm automatic homing equipment

    International Nuclear Information System (INIS)

    Kobayashi, Hidetoshi; Yamada, Koji; Moriya, Shinichi; Koike, Jiro; Okabe, Masao; Oyama, Akihiro.

    1996-01-01

    In steam-turbine overhaul inspection, one of the most important processes is to remove rust and contaminants deposited on the surface of turbine parts while the turbine is in operation, in order to recover thermal efficiency and prepare good surface conditions for color penetrant inspection. These processes are generally done by dry honing, but this generates dust. To protect laborers against these conditions, Hitachi, Ltd. has developed automatic honing equipment for the diaphragms of nuclear steam turbines. This equipment was first used in the first annual inspection and overhaul of Hamaoka Nuclear Power Plant No.4 of Chubu Electric Power Inc. (author)

  7. Automatic Lamp and Fan Control Based on Microcontroller

    Science.gov (United States)

    Widyaningrum, V. T.; Pramudita, Y. D.

    2018-01-01

    In general, automation can be described as a process following pre-determined sequential steps with little or no human exertion. Automation is achieved with the use of various sensors suitable for observing the production processes, actuators, and different techniques and devices. In this research, the automation system developed comprises an automatic lamp and an automatic fan for a smart home. Both of these systems are processed using an Arduino Mega 2560 microcontroller. The microcontroller is used to obtain values of physical conditions through sensors connected to it. The automatic lamp system requires a sensor to detect light, the LDR (Light Dependent Resistor) sensor, while the automatic fan system requires a sensor to detect temperature, the DHT11 sensor. In the tests that have been done, the lamp and fan work properly. The lamp can turn on automatically when the light begins to darken, and it can also turn off automatically when the light becomes bright again. In addition, it can also be concluded that the readings of an LDR sensor placed outside the room differ from the readings of an LDR sensor placed in the room. This is because the light intensity received by the LDR sensor in the room is blocked by the walls of the house or by other objects. The fan can turn on automatically when the temperature is greater than 25°C, and the fan speed can also be adjusted. The fan may also turn off automatically when the temperature is less than or equal to 25°C.
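
    A hedged sketch of the decision logic described above, written in Python purely for illustration (the actual system runs as firmware on the Arduino Mega 2560); the light threshold and the fan-speed mapping are assumptions, while the 25°C temperature threshold follows the abstract.

```python
# Hedged sketch of the lamp/fan threshold logic; pin handling is omitted.
LIGHT_THRESHOLD = 300          # assumed LDR reading below which it is "dark"
TEMP_THRESHOLD = 25.0          # degrees Celsius, as stated in the abstract

def control_step(ldr_value, temperature):
    lamp_on = ldr_value < LIGHT_THRESHOLD          # darker -> lamp on
    fan_on = temperature > TEMP_THRESHOLD          # warmer than 25 C -> fan on
    # Simple proportional fan speed above the threshold (assumed mapping, 0-255).
    fan_speed = min(255, int((temperature - TEMP_THRESHOLD) * 50)) if fan_on else 0
    return lamp_on, fan_speed

print(control_step(ldr_value=180, temperature=28.4))   # -> (True, 170)
```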

  8. Evolutionary game dynamics of controlled and automatic decision-making.

    Science.gov (United States)

    Toupo, Danielle F P; Strogatz, Steven H; Cohen, Jonathan D; Rand, David G

    2015-07-01

    We integrate dual-process theories of human cognition with evolutionary game theory to study the evolution of automatic and controlled decision-making processes. We introduce a model in which agents who make decisions using either automatic or controlled processing compete with each other for survival. Agents using automatic processing act quickly and so are more likely to acquire resources, but agents using controlled processing are better planners and so make more effective use of the resources they have. Using the replicator equation, we characterize the conditions under which automatic or controlled agents dominate, when coexistence is possible and when bistability occurs. We then extend the replicator equation to consider feedback between the state of the population and the environment. Under conditions in which having a greater proportion of controlled agents either enriches the environment or enhances the competitive advantage of automatic agents, we find that limit cycles can occur, leading to persistent oscillations in the population dynamics. Critically, however, these limit cycles only emerge when feedback occurs on a sufficiently long time scale. Our results shed light on the connection between evolution and human cognition and suggest necessary conditions for the rise and fall of rationality.

  9. Evolutionary game dynamics of controlled and automatic decision-making

    Science.gov (United States)

    Toupo, Danielle F. P.; Strogatz, Steven H.; Cohen, Jonathan D.; Rand, David G.

    2015-07-01

    We integrate dual-process theories of human cognition with evolutionary game theory to study the evolution of automatic and controlled decision-making processes. We introduce a model in which agents who make decisions using either automatic or controlled processing compete with each other for survival. Agents using automatic processing act quickly and so are more likely to acquire resources, but agents using controlled processing are better planners and so make more effective use of the resources they have. Using the replicator equation, we characterize the conditions under which automatic or controlled agents dominate, when coexistence is possible and when bistability occurs. We then extend the replicator equation to consider feedback between the state of the population and the environment. Under conditions in which having a greater proportion of controlled agents either enriches the environment or enhances the competitive advantage of automatic agents, we find that limit cycles can occur, leading to persistent oscillations in the population dynamics. Critically, however, these limit cycles only emerge when feedback occurs on a sufficiently long time scale. Our results shed light on the connection between evolution and human cognition and suggest necessary conditions for the rise and fall of rationality.

  10. Automatic production of Iodine-123 with PLC 135/U

    International Nuclear Information System (INIS)

    Moghaddam-Banaem, L.; Afarideh, H.

    2004-01-01

    In this project, the automatic system for the production of Iodine-123 with a Siemens PLC 135/U, designed and installed for the first time in Iran, is discussed. A PLC (Programmable Logic Controller) is used to control industrial processes; it is similar to a computer and consists of a central processing unit, memory, and input/output units. The PLC receives input information from auxiliary units such as sensors and switches, processes the data in memory, and then sends commands to output units such as relays and motors. The target section in iodine production consists of 8 stages. To ensure that the automation works properly, the system can be operated both manually and automatically. The PLC first checks the Manual/Automatic switch and, in automatic mode, runs the program in memory so that processing is done automatically. For this purpose, the PLC takes pressure and temperature values from analog inputs and, after processing them, sends commands to the digital outputs to activate valves, vacuum pumps, or heaters. In this paper the following subjects are discussed: 1) production of Iodine-123, 2) PLC structure and auxiliary boards, 3) sensors and actuators and their connection to the PLC, and 4) the software flowchart.
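
    The abstract describes a scan cycle: check the Manual/Automatic switch, read pressures and temperatures from analog inputs, and drive digital outputs (valves, vacuum pumps, heaters). The sketch below mirrors that cycle in Python purely for illustration; a PLC of this kind would be programmed in its own ladder or statement-list language, and all setpoints, I/O stubs, and names here are assumptions.

```python
# Schematic scan-cycle logic mirroring the description in the abstract. All I/O
# functions are stubs returning fixed demo values; a Siemens PLC of this type would
# be programmed in its own ladder/statement-list language, not Python.

PRESSURE_MAX_KPA = 150.0   # illustrative over-pressure limit
TEMP_SETPOINT_C = 700.0    # illustrative heating setpoint

outputs = {}               # stands in for the PLC's digital output image

def read_switch_automatic() -> bool:
    return True            # stub: pretend the key switch is in AUTO

def read_pressure_kpa() -> float:
    return 180.0           # stub analog input

def read_temperature_c() -> float:
    return 650.0           # stub analog input

def set_output(name: str, on: bool) -> None:
    outputs[name] = on

def scan_cycle() -> None:
    """One pass of the control program: check mode, read analogs, drive outputs."""
    if not read_switch_automatic():
        return                                                # manual mode: operator drives outputs
    pressure = read_pressure_kpa()
    temperature = read_temperature_c()
    set_output("vacuum_pump", pressure > PRESSURE_MAX_KPA)    # pump down on over-pressure
    set_output("heater", temperature < TEMP_SETPOINT_C)       # heat until setpoint reached

if __name__ == "__main__":
    scan_cycle()
    print(outputs)
```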

  11. Cost-benefit analysis of the ATM automatic deposit service

    Directory of Open Access Journals (Sweden)

    Ivica Županović

    2015-03-01

    Bankers and other financial experts have analyzed the value of automated teller machines (ATM) in terms of growing consumer demand, rising costs of technology development, decreasing profitability and market share. This paper presents a step-by-step cost-benefit analysis of the ATM automatic deposit service. The first step is to determine user attitudes towards using ATM automatic deposit service by using the Technology Acceptance Model (TAM). The second step is to determine location priorities for ATMs that provide automatic deposit services using the Analytic Hierarchy Process (AHP) model. The results of the previous steps enable a highly efficient application of cost-benefit analysis for evaluating costs and benefits of automatic deposit services. To understand fully the proposed procedure outside of theoretical terms, a real-world application of a case study is conducted.
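
    For the AHP step, location priorities are usually derived from a pairwise comparison matrix. The following generic Python sketch computes the priority vector as the principal eigenvector and a consistency ratio; the 3x3 comparison matrix is an invented example, not data from the study.

```python
import numpy as np

# Generic AHP priority computation (principal eigenvector method). The 3x3 pairwise
# comparison matrix below is an invented example, not data from the study.

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices

def ahp_priorities(matrix):
    A = np.asarray(matrix, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                       # normalized priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX.get(n, 0) > 0 else 0.0
    return weights, cr

if __name__ == "__main__":
    # Candidate location A moderately preferred to B (3) and strongly preferred to C (5).
    comparisons = [[1,   3,   5],
                   [1/3, 1,   2],
                   [1/5, 1/2, 1]]
    w, cr = ahp_priorities(comparisons)
    print("priorities:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```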

  12. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas.

    Science.gov (United States)

    Chang, Hsien-Tsung; Chang, Yi-Ming; Tsai, Meng-Tze

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning.
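
    The paper's planning algorithm is not reproduced here, but the general idea of filling a time budget with the attractions an individual user scores highest can be illustrated with a toy greedy planner. All attraction data, scores, and the greedy rule below are assumptions for illustration only.

```python
from dataclasses import dataclass

# Toy greedy itinerary builder: fill a time budget with the best-scoring attractions.
# This only illustrates the general idea of preference-driven planning; it is not the
# ATIPS algorithm, and all attraction data below are invented.

@dataclass
class Attraction:
    name: str
    visit_hours: float
    preference: float          # learned user-preference score, higher is better

def plan(attractions, budget_hours):
    itinerary, used = [], 0.0
    # Greedy rule: highest preference per hour first.
    for a in sorted(attractions, key=lambda a: a.preference / a.visit_hours, reverse=True):
        if used + a.visit_hours <= budget_hours:
            itinerary.append(a.name)
            used += a.visit_hours
    return itinerary, used

if __name__ == "__main__":
    pois = [Attraction("Night market", 2.0, 8.0), Attraction("Museum", 3.0, 7.5),
            Attraction("Hot spring", 4.0, 9.0), Attraction("Old street", 1.5, 5.0)]
    stops, hours = plan(pois, budget_hours=8.0)
    print(stops, f"({hours} of 8 hours used)")
```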

  13. An Automatic Indirect Immunofluorescence Cell Segmentation System

    Directory of Open Access Journals (Sweden)

    Yung-Kuan Chan

    2014-01-01

    Indirect immunofluorescence (IIF) with HEp-2 cells has been used for the detection of antinuclear autoantibodies (ANA) in systemic autoimmune diseases. The ANA testing allows us to scan a broad range of autoantibody entities and to describe them by distinct fluorescence patterns. Automatic inspection for fluorescence patterns in an IIF image can assist physicians, without relevant experience, in making correct diagnosis. How to segment the cells from an IIF image is essential in developing an automatic inspection system for ANA testing. This paper focuses on the cell detection and segmentation; an efficient method is proposed for automatically detecting the cells with fluorescence pattern in an IIF image. Cell culture is a process in which cells grow under control. Cell counting technology plays an important role in measuring the cell density in a culture tank. Moreover, assessing medium suitability, determining population doubling times, and monitoring cell growth in cultures all require a means of quantifying cell population. The proposed method also can be used to count the cells from an image taken under a fluorescence microscope.
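
    The paper's specific segmentation method is not reproduced here; as a point of reference, a common baseline for segmenting bright cells in fluorescence images is Gaussian smoothing, Otsu thresholding, and connected-component labeling with a size filter, sketched below with scikit-image. The synthetic test image and the minimum-area value are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, measure

# Baseline fluorescence-cell segmentation: smooth, Otsu-threshold, label, size-filter.
# A generic illustration, not the method proposed in the paper.

def segment_cells(image, min_area=50):
    smoothed = ndi.gaussian_filter(image.astype(float), sigma=2)
    mask = smoothed > filters.threshold_otsu(smoothed)
    labels = measure.label(mask)
    regions = [r for r in measure.regionprops(labels) if r.area >= min_area]
    return labels, regions

if __name__ == "__main__":
    # Synthetic test image: two bright blobs on a dark, noisy background.
    img = np.zeros((128, 128))
    img[20:50, 20:50] = 200
    img[80:110, 70:100] = 180
    img += np.random.default_rng(0).normal(0, 5, img.shape)
    _, regions = segment_cells(img)
    print("cells detected:", len(regions))
```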

  14. Automatization in radiotherapy

    International Nuclear Information System (INIS)

    Schraub, S.; Dutou, L.; Bernard, D.; Koechlin, M.; Beer-Gabel, J.

    1978-01-01

    Data processing in external radiotherapy has to be adapted to each local situation, taking into account the patients to be treated, the irradiation equipment, the data-processing centers available locally, regionally and nationally, and the cost-effectiveness of the data-processing system required. It should be recalled that most dosimetric methods used today can be handled manually, and the question of cost-effectiveness has to be kept in mind when deciding whether to buy a data-processing system. The radiotherapist should therefore prepare a list of costs for each situation and verify the validity of each programme proposed by the supplier. It is difficult to make a definite choice between the presently available systems; the radiotherapist has to choose according to his activity, his availability and the systems accessible to him. It can sometimes be more advantageous to have a terminal linked to a large computer than to readapt a series of programmes for a locally available data-processing system: many such solutions, though original, cannot be 'exported'. A large number of dosimetries can be obtained manually, and on the rare occasions when the aid of a computer is essential, the assistance of better-equipped neighbouring centers can be obtained. The decision as to whether a data-processing system needs to be acquired has to take all these imperatives into account. [fr]

  15. Automatic evidence retrieval for systematic reviews.

    Science.gov (United States)

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
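
    The evaluation metrics in the abstract are standard precision, recall, and F1. The small helper below recomputes them from raw counts; the total number of identifications made (about 648) is not stated in the abstract and is inferred here from the reported 97.7% precision, so it should be read as an assumption.

```python
def precision_recall_f1(true_positives, retrieved, relevant):
    """Standard information-retrieval metrics computed from raw counts."""
    precision = true_positives / retrieved if retrieved else 0.0
    recall = true_positives / relevant if relevant else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

if __name__ == "__main__":
    # 633 citations were correctly identified; ~648 identifications in total is inferred
    # from the reported 97.7% precision. Recall is computed against either all 949
    # included citations or the 740 citations present in MAS, as in the abstract.
    retrieved = round(633 / 0.977)
    for relevant, label in [(949, "all included citations"), (740, "citations in MAS")]:
        p, r, f1 = precision_recall_f1(633, retrieved, relevant)
        print(f"{label}: precision={p:.1%} recall={r:.1%} F1={f1:.1%}")
```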

  16. Sleep-related automatism and the law.

    Science.gov (United States)

    Ebrahim, Irshaad Osman; Fenwick, Peter

    2008-04-01

    Crimes carried out during or arising from sleep highlight many difficulties with our current law and forensic sleep medicine clinical practice. There is a need for clarity in the law and agreement between experts on a standardised form of assessment and diagnosis in these challenging cases. We suggest that the time has come for a standardised, internationally recognised diagnostic protocol to be set as a minimum standard in all cases of suspected sleep-related forensic cases. The protocol of a full medical history, sleep history, psychiatric history, neuropsychiatric and psychometric examination and electroencephalography (EEG), should be routine. It should now be mandatory to carry out routine polysomnography (PSG) to establish the presence of precipitating and modulating factors. Sleepwalking is classified as insane automatism in England and Wales and sudden arousal from sleep in a non-sleepwalker as sane automatism. The recent case in England of R v. Lowe (2005) highlights these anomalies. Moreover, the word insanity stigmatises sleepwalkers and should be dropped. The simplest solution to these problems would be for the law to be changed so that there is only one category of defence for all sleep-related offences--not guilty by reason of sleep disorder. This was rejected by the House of Lords for cases of automatism due to epilepsy, and is likely to be rejected for sleepwalkers. Removing the categories of automatism (sane or insane) would be the best solution. Risk assessment is already standard practice in the UK and follow up, subsequent to disposal, by approved specialists should become part of the sentencing process. This will provide support for the defendant and protection of the public.

  17. Towards Automatic Threat Recognition

    Science.gov (United States)

    2006-12-01

    [Presentation slides; only the outline is recoverable from the extracted text: Preliminaries about Information Fusion; The System Ontology; Unification as Processing Principle; Back to the Example; Conclusion and Outlook. Forschungsinstitut für Kommunikation, Informationsverarbeitung und Ergonomie (FGAN), Informationstechnik und Führungssysteme (KIE).]

  18. Research on Automatic Programming

    Science.gov (United States)

    1975-12-31

    [Fragmentary extracted text. Recoverable content: the report cites a Ph.D. thesis, "Sequential processes, deadlocks, and semaphore primitives" (Harvard University, November 1974; Center for Research in Computing); code generated to effect synchronization makes use of the ECL control extension facility (Prenner's CI, see [Prenner]); a code generator for semaphore operations [Dijkstra] is being developed, with encouraging initial results for the generated code.]

  19. Continuous moisture measurement in metallurgical coke with automatic charge correction

    International Nuclear Information System (INIS)

    Watzke, H.; Mehlhose, D.

    1981-01-01

    A process control system has been developed for automatic batching of the amount of coke necessary for metallurgical processes, taking into account the moisture content. The measurement is performed with a neutron moisture gage consisting of an Am-Be neutron source and a BF3 counter. The output information of the counter is used for computer-controlled batching
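
    The purpose of the moisture measurement is to correct the batched (wet) coke mass so that the required dry mass is actually delivered. A minimal sketch of that correction is shown below; the wet-basis formula and the example numbers are assumptions, and the calibration from neutron count rate to moisture fraction is omitted.

```python
def corrected_wet_charge(dry_target_kg: float, moisture_fraction: float) -> float:
    """Wet coke mass to batch so that the delivered dry mass equals the target.

    moisture_fraction is the wet-basis moisture content (0..1) inferred from the
    neutron gauge; the calibration from count rate to moisture is omitted here.
    """
    if not 0.0 <= moisture_fraction < 1.0:
        raise ValueError("moisture fraction must be in [0, 1)")
    return dry_target_kg / (1.0 - moisture_fraction)

if __name__ == "__main__":
    # Example: 1000 kg of dry coke required, gauge reports 6.5% moisture.
    print(f"charge {corrected_wet_charge(1000, 0.065):.1f} kg of wet coke")
```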

  20. Preliminary Evidence for an Automatic Link between Sex and Power among Men Who Molest Children

    Science.gov (United States)

    Kamphuis, Jan H.; De Ruiter, Corine; Janssen, Bas; Spiering, Mark

    2005-01-01

    Understanding critical motivational processes of sexual offenders may ultimately provide important clues to more effective treatments. Implicit, automatic cognitive processes have received minimal attention; however, a lexical decision experiment revealed automatic links between the concepts of power and sex among participants who self-reported…

  1. Extraction: a system for automatic eddy current diagnosis of steam generator tubes in nuclear power plants

    International Nuclear Information System (INIS)

    Georgel, B.; Zorgati, R.

    1994-01-01

    Improving the speed and quality of eddy-current non-destructive testing of steam generator tubes requires automating all the processes that contribute to the diagnosis. This paper describes how signal processing, pattern recognition and artificial intelligence are used to build a software package that is able to provide an efficient diagnosis automatically. (authors). 2 figs., 5 refs

  2. Event-Related Potential Evidence that Automatic Recollection Can Be Voluntarily Avoided

    Science.gov (United States)

    Bergstrom, Zara M.; de Fockert, Jan; Richardson-Klavehn, Alan

    2009-01-01

    Voluntary control processes can be recruited to facilitate recollection in situations where a retrieval cue fails to automatically bring to mind a desired episodic memory. We investigated whether voluntary control processes can also stop recollection of unwanted memories that would otherwise have been automatically recollected. Participants were…

  3. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with "Automatic Validation of Numerical Solutions". The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings … differential equations, but in this thesis we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives … are described: the forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages …
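
    FADBAD/TADIFF are C++ packages; the forward mode they implement can be illustrated independently with dual numbers, as in the short Python sketch below. This shows the principle only and does not reproduce the packages' actual interface.

```python
import math
from dataclasses import dataclass

# Minimal forward-mode automatic differentiation with dual numbers. Illustrates the
# principle only; it does not reproduce the FADBAD/TADIFF C++ interface.

def _lift(x):
    return x if isinstance(x, Dual) else Dual(float(x), 0.0)

@dataclass
class Dual:
    val: float   # function value
    der: float   # derivative with respect to the seeded input

    def __add__(self, other):
        other = _lift(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = _lift(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

def f(x):
    # f(x) = x^2 * sin(x) + 3x
    return x * x * sin(x) + 3 * x

if __name__ == "__main__":
    x = Dual(1.5, 1.0)                # seed derivative 1 to differentiate w.r.t. x
    y = f(x)
    print(f"f(1.5)  = {y.val:.6f}")
    print(f"f'(1.5) = {y.der:.6f}")   # equals 2x sin(x) + x^2 cos(x) + 3
```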

  4. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  5. [Development of automatic urine monitoring system].

    Science.gov (United States)

    Wei, Liang; Li, Yongqin; Chen, Bihua

    2014-03-01

    An automatic urine monitoring system is presented to replace manual operation. The system is composed of a flow sensor, an MSP430F149 microcontroller, a human-computer interaction module, an LCD module, a clock module and a memory module. The urine volume signal is captured as the urine flows through the flow sensor and is displayed on the LCD after data processing. The experimental results suggest that the monitor provides high stability, accurate measurement and good real-time performance, and meets the demands of clinical application.

  6. Personality in speech assessment and automatic classification

    CERN Document Server

    Polzehl, Tim

    2015-01-01

    This work combines interdisciplinary knowledge and experience from research fields of psychology, linguistics, audio-processing, machine learning, and computer science. The work systematically explores a novel research topic devoted to automated modeling of personality expression from speech. For this aim, it introduces a novel personality assessment questionnaire and presents the results of extensive labeling sessions to annotate the speech data with personality assessments. It provides estimates of the Big 5 personality traits, i.e. openness, conscientiousness, extroversion, agreeableness, and neuroticism. Based on a database built on the questionnaire, the book presents models to tell apart different personality types or classes from speech automatically.

  7. Automatic measurement of images on astrometric plates

    Science.gov (United States)

    Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.

    1994-04-01

    We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: (1) determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems, and estimation of the signal-to-noise ratio in each field; (2) image identification and improvement of its position and size; (3) final image centering; and (4) image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related with faint images and crowded fields will be approached by special techniques (morphological filters, histogram properties and fitting models).

  8. Evaluation of automatic densitometer with laser diode

    International Nuclear Information System (INIS)

    Larrea Cox, Pedro J.; Hernandez Tabares, Lorenzo; Suarez San Pedro, Cirilo E.; Vazquez Cano, Aradys; Reyes Rodriguez, Marlen de los

    2009-01-01

    The evaluation of a prototype of an automatic transmission scanning densitometer is presented. It uses a semiconductor laser diode as the light source and is mainly oriented to the analysis of protein electrophoresis. It was developed at the Center for Technological Applications and Nuclear Development (CEADEN). Its technical specifications were established and certified by the National Institute of Researches on Metrology (INIMET), and the equipment was also submitted for assays to the Process Control Laboratory of the 'Adalberto Pesant' Enterprise for Sera and Hemoderivatives Products in Havana, where it was employed for the partial quality control of the products made there, achieving satisfactory results. (Author)

  9. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.

  10. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  11. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  12. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

    Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach on automatic decomposition of the adjacent vessels into near- and far-vessel regions and computation of the axial plane. We also exemplarily present two applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far- vessel regions are used as input for an optimization process to compute the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane. The viewing direction for the automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting a broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise. A survey with 4 medical experts showed a broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, this approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature. Additional geometric information obtained during automatic decomposition can support correction in case the automatic approach fails. The resulting descriptors can be used for various applications in the field of visualization, exploration and analysis of cerebral aneurysms.

  13. Automatic and controlled components of judgment and decision making

    OpenAIRE

    Ferreira, MB; Garcia-Marques, L; Sherman, SJ; Sherman, JW

    2006-01-01

    The categorization of inductive reasoning into largely automatic processes (heuristic reasoning) and controlled analytical processes (rule-based reasoning) put forward by dual-process approaches of judgment under uncertainty (e.g., K. E. Stanovich & R. F. West, 2000) has been primarily a matter of assumption with a scarcity of direct empirical findings supporting it. The present authors use the process dissociation procedure (L. L. Jacoby, 1991) to provide convergent evidence validating a d...

  14. Design scheme of automatic feeding equipment of domestic uranium chemical concentrate

    International Nuclear Information System (INIS)

    Hu Jinming; Wang Chao; Peng Jinhui; Zhang Libo

    2014-01-01

    To solve the problems of the manual feeding mode in the domestic uranium concentrate purification process, namely low work efficiency, heavy manual labor and environmental pollution, a design scheme for an automatic feeding device was developed, covering the work flow sheet, the composition of the automatic equipment, and its operation. By applying the automatic feeding equipment, the feeding speed can be greatly increased, the labor force can be reduced, and the harm to workers' health can be decreased. (authors)

  15. Automatic Dissection Of Plantlets

    Science.gov (United States)

    Batchelor, B. G.; Harris, I. P.; Marchant, J. A.; Tillett, R. D.

    1989-03-01

    Micropropagation is a technique used in horticulture for generating a monoclonal colony of plants. A tiny plantlet is cut into several parts, each of which is then replanted. At the moment, the cutting is performed manually. Automating this task would have significant economic benefits. A robot designed to dissect plants would need to be equipped with intelligent visual sensing. This article is concerned with the image acquisition and processing techniques which such a machine might use. A program, which can calculate where to cut a plant with an "open" structure, is presented. This is expressed in the ProVision language, which is described in another article presented at this conference. (Article 1002-65)

  16. Automatic micropropagation of plants

    Science.gov (United States)

    Otte, Clemens; Schwanke, Joerg; Jensch, Peter F.

    1996-12-01

    Micropropagation is a sophisticated technique for the rapid multiplication of plants. It has a great commercial potential due to the speed of propagation, the high plant quality, and the ability to produce disease-free plants. However, micropropagation is usually done by hand, which makes the process cost-intensive and tedious for the workers, especially because it requires a sterile workplace. Therefore, we have developed a prototype automation system for the micropropagation of a grass species (Miscanthus sinensis gigantheus). The objective of this paper is to describe the robotic system in an overview and to discuss the vision system more closely, including the implemented morphological operations recognizing the cutting and gripping points of miscanthus plants. Fuzzy controllers are used to adapt the parameters of image operations on-line to each individual plant. Finally, we discuss our experiences with the developed prototype and give a preview of a possible real production line system.

  17. Automatic fuel exchanging device

    International Nuclear Information System (INIS)

    Takahashi, Fuminobu.

    1984-01-01

    Purpose: To enable the identification number of a fuel assembly in a nuclear reactor pressure vessel to be designated, so that the designated assembly can reliably be exchanged within a short time. Constitution: The identification number (or letter) pressed on the grip of a fuel assembly is detected by a two-dimensional ultrasonic probe of a pull-up mechanism. When the detected number matches the designated number, a control signal is output, whereby the pull-up drive control mechanism or pull-up mechanism responds to pull up and exchange the fuel assembly with the identified number. With such a constitution, the fuel assembly can be recognized rapidly and reliably even if the pressed letters deviate to the left or right of the probe, and furthermore the hinge portion and the signal processing portion can be simplified. (Horiuchi, T.)

  18. The automatic lumber planing mill

    Science.gov (United States)

    Peter Koch

    1957-01-01

    It is probable that a truly automatic planing operation could be devised if some of the variables commonly present in mill-run lumber were eliminated and the remaining variables kept under close control. This paper will deal with the more general situation faced by most lumber manufacturing plants. In other words, it will be assumed that the incoming lumber has...

  19. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
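
    The principle behind the transformation, replacing string-built SQL with prepared (parameterized) statements, can be illustrated with Python's sqlite3 placeholders standing in for PREPARE statements. The example below is a generic illustration, not the paper's source-transformation tool, and the table and payload are invented.

```python
import sqlite3

# Unsafe string-built query vs. a parameterized (prepared-style) query. sqlite3
# placeholders stand in for SQL PREPARE statements; the paper itself transforms
# legacy web-application code rather than Python.

def find_user_unsafe(conn, name):
    # Vulnerable: a crafted name such as "x' OR '1'='1" changes the query's meaning.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(conn, name):
    # The driver binds the value; the input can never alter the SQL structure.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO users (name) VALUES (?)", [("alice",), ("bob",)])
    payload = "nobody' OR '1'='1"
    print("unsafe:", find_user_unsafe(conn, payload))   # returns every row
    print("safe:  ", find_user_safe(conn, payload))     # returns no rows
```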

  20. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range … interactive software is also part of a computer-assisted learning program on digital photogrammetry....

  1. Automatic digitization. Experience of magnum 8000 in automatic digitization in EA; Digitalizacion automatica. Experiencias obtenidas durante la utilizacion del sistema magnus 8000 para la digitalizacion automatica en EA

    Energy Technology Data Exchange (ETDEWEB)

    Munoz Garcia, M.

    1995-12-31

    The paper describes the life cycle to be followed for the automatic digitization of files containing rasterised (scanned) images for their conversion into vector files (processable using CAD tools). The main characteristics of each of the five phases: capture, cleaning, conversion, revision and post-processing, that form part of the life cycle, are described. Lastly, the paper gives a comparative analysis of the results obtained using the automatic digitization process and other more conventional methods. (Author)

  2. Human visual system automatically encodes sequential regularities of discrete events.

    Science.gov (United States)

    Kimura, Motohiro; Schröger, Erich; Czigler, István; Ohira, Hideki

    2010-06-01

    For our adaptive behavior in a dynamically changing environment, an essential task of the brain is to automatically encode sequential regularities inherent in the environment into a memory representation. Recent studies in neuroscience have suggested that sequential regularities embedded in discrete sensory events are automatically encoded into a memory representation at the level of the sensory system. This notion is largely supported by evidence from investigations using auditory mismatch negativity (auditory MMN), an event-related brain potential (ERP) correlate of an automatic memory-mismatch process in the auditory sensory system. However, it is still largely unclear whether or not this notion can be generalized to other sensory modalities. The purpose of the present study was to investigate the contribution of the visual sensory system to the automatic encoding of sequential regularities using visual mismatch negativity (visual MMN), an ERP correlate of an automatic memory-mismatch process in the visual sensory system. To this end, we conducted a sequential analysis of visual MMN in an oddball sequence consisting of infrequent deviant and frequent standard stimuli, and tested whether the underlying memory representation of visual MMN generation contains only a sensory memory trace of standard stimuli (trace-mismatch hypothesis) or whether it also contains sequential regularities extracted from the repetitive standard sequence (regularity-violation hypothesis). The results showed that visual MMN was elicited by first deviant (deviant stimuli following at least one standard stimulus), second deviant (deviant stimuli immediately following first deviant), and first standard (standard stimuli immediately following first deviant), but not by second standard (standard stimuli immediately following first standard). These results are consistent with the regularity-violation hypothesis, suggesting that the visual sensory system automatically encodes sequential

  3. Artificial Intelligence In Automatic Target Recognizers: Technology And Timelines

    Science.gov (United States)

    Gilmore, John F.

    1984-12-01

    The recognition of targets in thermal imagery has been a problem exhaustively analyzed in its current localized dimension. This paper discusses the application of artificial intelligence (AI) technology to automatic target recognition, a concept capable of expanding current ATR efforts into a new globalized dimension. Deficiencies of current automatic target recognition systems are reviewed in terms of system shortcomings. Areas of artificial intelligence which show the most promise in improving ATR performance are analyzed, and a timeline is formed in light of how near (as well as far) term artificial intelligence applications may exist. Current research in the area of high level expert vision systems is reviewed and the possible utilization of artificial intelligence architectures to improve low level image processing functions is also discussed. Additional application areas of relevance to solving the problem of automatic target recognition utilizing both high and low level processing are also explored.

  4. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of the aggregated cells. Image processing and analysis for the characterization of RBC aggregation have frequently been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be attractive as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  5. Automatic remote communication system

    International Nuclear Information System (INIS)

    Yamamoto, Yoichi

    1990-05-01

    The Upgraded RECOVER (Remote Continual Verification) system is a communication system for remote continual verification of the security and safeguards status of nuclear material in principal nuclear facilities. The system is composed of a command center and facility sub-systems. The command center is a mini-computer system that processes C/S (Containment and Surveillance) status data. A facility sub-system consists of an OSM (On-site Multiplexer), an MU (Monitoring Unit) and C/S sensors. The system uses the public telephone network for communication between the command center and the facility sub-systems, and it encrypts communication data to prevent falsification and wiretapping by unauthorized persons. This system inherits the design principles of the RECOVER system that was tested by the IAEA before, with capabilities upgraded and expanded beyond those of RECOVER. Development of the system began in 1983 and finished in 1987. Performance tests of the system have been carried out since 1987 and showed fairly good results, with some indications of points needing further improvement. The Upgraded RECOVER system provides timely information about the status of C/S systems, which could contribute to a reduction of inspection effort and an improvement of cost performance. (author)

  6. Rationalization of the electric power utilization for the ferro-alloy production at the HEK 'Jugohrom' by means of the follow up and restriction of the highest level loading as well as automatic processing system for the involved electric power (Macedonia)

    International Nuclear Information System (INIS)

    Koevski, Doncho

    2001-01-01

    The relations between the electric power system and the energy sector in general, the chronic energy crisis, and the price of electric power have prompted an analysis of economical ways of using electric power. This paper presents the rationalization of electric power use for ferro-alloy production at Jugohrom, achieved by installing a system for measuring, controlling and limiting the peak load, together with an automatic processing system for the electric power consumed. (Original)

  7. Rationalization of the electric power utilization for the ferro-alloy production at the HEK 'Jugohrom' by means of the follow up and restriction of the highest level loading as well as automatic processing system for the involved electric power (Macedonia)

    International Nuclear Information System (INIS)

    Koevski, Doncho

    2002-01-01

    The relations between the electric power system and the energy sector in general, the chronic energy crisis, and the price of electric power have prompted an analysis of economical ways of using electric power. This paper presents the rationalization of electric power use for ferro-alloy production at Jugohrom, achieved by installing a system for measuring, controlling and limiting the peak load, together with an automatic processing system for the electric power consumed. (Original)

  8. Methodology for Automatic Ontology Generation Using Database Schema Information

    Directory of Open Access Journals (Sweden)

    JungHyen An

    2018-01-01

    An ontology is a modeling language that supports functions to integrate conceptually distributed domain knowledge and to infer relationships among the concepts. Ontologies are developed based on the target domain knowledge, so methodologies that automatically generate an ontology from metadata characterizing that domain knowledge are becoming important. However, existing methodologies for automatically generating an ontology from metadata require the domain metadata to be supplied in a predetermined template, and it is difficult to manage the growing data on the ontology itself when the domain OWL (Web Ontology Language) individuals increase continuously. A database schema captures features of the domain knowledge and provides structural functions to process the knowledge-based data efficiently. In this paper, we propose a methodology to automatically generate ontologies and manage the OWL individuals through an interaction between the database and the ontology. We describe the automatic ontology generation process with an example schema and demonstrate the effectiveness of the automatically generated ontology by comparing it with existing ontologies using an ontology quality score.
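
    The general idea of schema-driven ontology generation, with tables mapping to OWL classes, columns to datatype properties, and foreign keys to object properties, can be sketched as below. The naming scheme, the toy schema, and the direct emission of Turtle text are assumptions for illustration; they are not the mapping rules defined in the paper.

```python
# Minimal illustration of schema-to-ontology mapping: tables become OWL classes,
# columns become datatype properties, and foreign keys become object properties.
# The naming scheme and the toy schema are assumptions, not the paper's rules.

PREFIXES = (
    "@prefix :     <http://example.org/onto#> .\n"
    "@prefix owl:  <http://www.w3.org/2002/07/owl#> .\n"
    "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .\n"
)

def schema_to_turtle(schema):
    lines = [PREFIXES]
    for table, info in schema.items():
        lines.append(f":{table} a owl:Class .")
        for column in info.get("columns", []):
            lines.append(f":{table}_{column} a owl:DatatypeProperty ; rdfs:domain :{table} .")
        for column, target in info.get("foreign_keys", {}).items():
            lines.append(f":{table}_{column} a owl:ObjectProperty ; "
                         f"rdfs:domain :{table} ; rdfs:range :{target} .")
        lines.append("")
    return "\n".join(lines)

if __name__ == "__main__":
    toy_schema = {
        "Department": {"columns": ["name"]},
        "Employee": {"columns": ["name", "salary"],
                     "foreign_keys": {"dept_id": "Department"}},
    }
    print(schema_to_turtle(toy_schema))
```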

  9. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  10. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. [Automatic adjustment control system for DC glow discharge plasma source].

    Science.gov (United States)

    Wan, Zhen-zhen; Wang, Yong-qing; Li, Xiao-jia; Wang, Hai-zhou; Shi, Ning

    2011-03-01

    There are three important parameters in the DC glow discharge process: the discharge current, the discharge voltage and the argon pressure in the discharge source. These parameters influence each other during the glow discharge process. This paper presents an automatic control system for a DC glow discharge plasma source. The system automatically acquires and controls the discharge voltage by adjusting the discharge source pressure while the discharge current is held constant during the glow discharge process. The design concept, circuit principle and control program of this automatic control system are described. Accuracy is improved by reducing complex operations and manual control errors. The system enhances the control accuracy of the glow discharge voltage and reduces the time needed to reach voltage stability. Glow discharge voltage stability test results with the automatic control system are provided as well: the accuracy with automatic control is better than 1% FS, improved from 4% FS with manual control, and the time to reach discharge voltage stability has been shortened to within 30 s by automatic control from more than 90 s by manual control. Standard samples such as middle-low alloy steel and tin bronze have been tested with this automatic control system, and the precision of the concentration analysis has been significantly improved: the RSDs of all the test results are better than 3.5%. In the middle-low alloy steel standard sample, the RSD range of the concentration results for Ti, Co and Mn is reduced from 3.0%-4.3% with manual control to 1.7%-2.4% with automatic control, and that for S and Mo is reduced from 5.2%-5.9% to 3.3%-3.5%. In the tin bronze standard sample, the RSD range for Sn, Zn and Al is reduced from 2.6%-4.4% to 1.0%-2.4%, and that for Si, Ni and Fe is reduced from 6.6%-13.9% to 2.6%-3.5%. The test data are also shown in this paper.
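
    The control idea, holding the current constant and steering the discharge voltage to a setpoint by adjusting the source pressure, can be illustrated with a toy closed-loop sketch. The "plant" model (voltage falling linearly with pressure), the gain, and the setpoints below are invented for illustration and do not describe the real discharge physics or the published system.

```python
# Toy closed-loop regulation of the discharge voltage by adjusting argon pressure.
# The "plant" model (voltage falling linearly with pressure at constant current) and
# the gain are invented for illustration; they do not describe the real discharge.

def plant_voltage(pressure_pa: float) -> float:
    return 1400.0 - 0.8 * pressure_pa              # fake discharge characteristic

def regulate(setpoint_v=800.0, pressure=600.0, kp=0.05, steps=200):
    for _ in range(steps):
        error = setpoint_v - plant_voltage(pressure)
        # Voltage too low -> reduce pressure (voltage rises), and vice versa.
        pressure -= kp * error
    return pressure, plant_voltage(pressure)

if __name__ == "__main__":
    p, v = regulate()
    print(f"settled at {p:.1f} Pa with discharge voltage {v:.1f} V")
```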

  12. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harlod; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
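
    A simplified version of the described pipeline can be sketched with scikit-image: high-pass filtering by subtracting a weighted Gaussian-smoothed image, CLAHE, and a parameter search that maximizes the entropy of the result. A coarse grid search stands in for the interior-point optimization used in the paper, and the parameter ranges, smoothing sigma, and synthetic test image are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import exposure

# Simplified sketch of the pipeline in the abstract: high-pass filtering (subtract a
# weighted Gaussian-smoothed image), CLAHE, and parameter selection by maximizing the
# entropy of the result. A coarse grid search replaces the paper's interior-point
# optimization; parameter ranges, sigma, and the test image are assumptions.

def enhance(img, weight, clip_limit, kernel_size):
    img = img.astype(float)
    highpass = img - weight * ndi.gaussian_filter(img, sigma=5)
    highpass = np.clip(highpass, 0, None)
    highpass /= highpass.max() + 1e-9               # equalize_adapthist expects [0, 1]
    return exposure.equalize_adapthist(highpass, kernel_size=kernel_size,
                                       clip_limit=clip_limit)

def entropy(img, bins=256):
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

def auto_enhance(img):
    best = None
    for weight in (0.3, 0.5, 0.7):
        for clip in (0.01, 0.02, 0.04):
            for ksize in (32, 64):
                out = enhance(img, weight, clip, ksize)
                h = entropy(out)
                if best is None or h > best[0]:
                    best = (h, out, (weight, clip, ksize))
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    test = rng.normal(100, 5, (256, 256)) + np.linspace(0, 60, 256)  # low-contrast image
    h, enhanced, params = auto_enhance(test)
    print("best (weight, clip_limit, kernel_size):", params, " entropy:", round(h, 2))
```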

  13. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Qiu, J; Li, H. Harlod; Zhang, T; Yang, D; Ma, F

    2015-01-01

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.

  14. Study on automatic control of high uranium concentration solvent extraction with pulse sieve-plate column

    International Nuclear Information System (INIS)

    You Wenzhi; Xing Guangxuan; Long Maoxiong; Zhang Jianmin; Zhou Qin; Chen Fuping; Ye Lingfeng

    1998-01-01

    The author describes the operation of the automatic control system for high-uranium-concentration solvent extraction with a pulse sieve-plate column in a large-scale test. The automatic instrumentation, the automatic control circuit, and the best feedback control point of the solvent extraction process with the pulse sieve-plate column are discussed in detail. The success of this automation experiment is pointed out, and some questions that should be attended to in the future regarding automatic control, instruments and meters in production are presented.

  15. Passive Visual Sensing in Automatic Arc Welding

    DEFF Research Database (Denmark)

    Liu, Jinchao

    For decades much work has been devoted to the research and development of automatic arc welding systems. However, it has remained a challenging problem. Besides the very complex arc welding process itself, the lack of ability to precisely sense the welding process, including the seam geometry … and the weld pool, has also prevented the realization of a closed-loop control system for many years, even though a variety of sensors have been developed. Among all the sensor systems, visual sensors have the advantage of receiving visual information and have drawn more and more attention. Typical … industrial solutions for seam detection, such as laser scanners, suffer from several limitations. For instance, the scanner must be positioned some distance ahead of the molten pool and may cause problems when dealing with shiny surfaces. Existing techniques for weld pool sensing mostly rely on auxiliary light

  16. Towards Automatic Classification of Wikipedia Content

    Science.gov (United States)

    Szymański, Julian

    Wikipedia, the Free Encyclopedia, encounters the problem of properly classifying new articles every day. The assignment of articles to categories is performed manually and is a time-consuming task. It requires knowledge of the Wikipedia structure that is beyond typical editor competence, which leads to human mistakes: omitted or wrong assignments of articles to categories. The article presents an application of an SVM classifier for the automatic classification of documents from the Free Encyclopedia. The classifier has been tested using two text representations: inter-document connections (hyperlinks) and word content. The results of the experiments, evaluated on hand-crafted data, show that the Wikipedia classification process can be partially automated. The proposed approach can be used to build a decision support system that suggests to editors the best categories for new content entered into Wikipedia.
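
    The word-content variant of the described approach, TF-IDF features with a linear SVM, can be sketched with scikit-learn as below. The tiny toy corpus and category labels are invented; the paper trains on Wikipedia articles and additionally evaluates a hyperlink-based representation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Minimal word-content SVM classifier in the spirit of the described approach.
# The tiny corpus and category labels are invented; the paper trains on Wikipedia
# articles and additionally uses a hyperlink-based representation.

train_texts = [
    "The electron is a subatomic particle with negative charge.",
    "Quantum mechanics describes the behaviour of matter at small scales.",
    "The Baroque period produced composers such as Bach and Vivaldi.",
    "A symphony is an extended musical composition for orchestra.",
]
train_labels = ["Physics", "Physics", "Music", "Music"]

model = make_pipeline(TfidfVectorizer(stop_words="english"), LinearSVC())
model.fit(train_texts, train_labels)

new_articles = [
    "The electron carries a negative charge.",
    "The composer wrote a new composition for orchestra.",
]
for article, category in zip(new_articles, model.predict(new_articles)):
    print(f"{category:7s} <- {article}")
```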

  17. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  18. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file

  19. Automatic Conflict Detection on Contracts

    Science.gov (United States)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

    Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  20. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes of this version compared with the former one are presented as well as the better specifications we expect now. We also explain the absolute orientation procedure by means of a laser beam and a corner cube and the method for leveling the fluxgate sensor, which is different from a conventional DIflux theodolite.

  1. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  2. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday-access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  3. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
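
    As a minimal illustration of deriving a transfer function symbolically from circuit equations, the sketch below solves the single node equation of a series-R, shunt-C low-pass filter. It assumes SymPy is available; the patent's netlist/matrix formulation is not reproduced here.

```python
# Derive H(s) = Vout/Vin for an RC low-pass from its node equation.
import sympy as sp

s = sp.symbols('s')
R, C = sp.symbols('R C', positive=True)
Vin, Vout = sp.symbols('Vin Vout')

# Kirchhoff current law at the output node: current through R flows into C.
node_eq = sp.Eq((Vin - Vout) / R, Vout * s * C)
Vout_sol = sp.solve(node_eq, Vout)[0]
H = sp.simplify(Vout_sol / Vin)
print(H)  # 1/(C*R*s + 1)
```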

  4. Automatic wipers with mist control

    OpenAIRE

    Ashik K.P; A.N.Basavaraju

    2016-01-01

    This paper illustrates automatic wipers with mist control. In modern days, accidents are common in commercial vehicles. One of the reasons for these accidents is the formation of mist inside the vehicle due to heavy rain. In rainy seasons, the wiper on a commercial vehicle's windshield has to be controlled by the driver himself, which distracts his attention from driving. Also, when the rain lasts longer (say for about 15 minutes) the formation of mist on t...

  5. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    the economy. Most types of revenues—mainly personal, corporate, and social insurance taxes—are sensitive to the business cycle and account for most of... Medicare taxes for self-employed people, taxes on production and imports, and unemployment insurance taxes. Those six categories account for the bulk of... federal tax revenues. Individual taxes account for most of the automatic stabilizers from revenues, followed by Social Security plus Medicare

  6. Social Signals, their function, and automatic analysis: A survey

    NARCIS (Netherlands)

    Vinciarelli, Alessandro; Pantic, Maja; Bourlard, Hervé; Pentland, Alex

    2008-01-01

    Social Signal Processing (SSP) aims at the analysis of social behaviour in both Human-Human and Human-Computer interactions. SSP revolves around automatic sensing and interpretation of social signals, complex aggregates of nonverbal behaviours through which individuals express their attitudes

  7. Automatic SIMD parallelization of embedded applications based on pattern recognition

    NARCIS (Netherlands)

    Manniesing, R.; Karkowski, I.P.; Corporaal, H.

    2000-01-01

    This paper investigates the potential for automatic mapping of typical embedded applications to architectures with multimedia instruction set extensions. For this purpose a (pattern matching based) code transformation engine is used, which involves a three-step process of matching, condition

  8. Do Automatic Self-Associations Relate to Suicidal Ideation?

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Penninx, Brenda W. J. H.; Kerkhof, Ad J. F. M.; van Dyck, Richard; Ormel, Johan

    Dysfunctional self-schemas are assumed to play an important role in suicidal ideation. According to recent information-processing models, it is important to differentiate between 'explicit' beliefs and automatic associations. Explicit beliefs stem from the weighting of propositions and their

  9. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  10. Automatic identification of temporal sequences in chewing sounds

    NARCIS (Netherlands)

    Amft, O.D.; Kusserow, M.; Tröster, G.

    2007-01-01

    Chewing is an essential part of food intake. The analysis and detection of food patterns is an important component of an automatic dietary monitoring system. However, chewing is a time-variable process depending on food properties. We present an automated methodology to extract sub-sequences of

  11. Automatized welding equipment for manufacturing steel cells for special buildings

    International Nuclear Information System (INIS)

    Weikert, F.; Winter, K.P.

    1986-01-01

    In GDR's nuclear power plant construction, reinforced concrete wall cells are used to construct pressure and full pressure containments for WWER-440 and WWER-1000 reactors, respectively. Welding processes for the prefabrication of steel cells as reinforcement have been automatized in order to increase both labor productivity and quality assurance. 11 figs

  12. Demonstrator for Automatic Target Classification in SAR Imagery

    NARCIS (Netherlands)

    Wit, J.J.M. de; Broek, A.C. van den; Dekker, R.J.

    2006-01-01

    Due to the increasing use of unmanned aerial vehicles (UAV) for reconnaissance, surveillance, and target acquisition applications, the interest in synthetic aperture radar (SAR) systems is growing. In order to facilitate the processing of the enormous amount of SAR data on the ground, automatic

  13. Automatic web site authoring with SiteGuide

    NARCIS (Netherlands)

    de Boer, V.; Hollink, V.; van Someren, M.W.; Kłopotek, M.A.; Przepiórkowski, A.; Wierzchoń, S.T.; Trojanowski, K.

    2009-01-01

    An important step in the design process for a web site is to determine which information is to be included and how the information should be organized on the web site’s pages. In this paper we describe ’SiteGuide’, a tool that automatically produces an information architecture for a web site that a

  14. Automatic Tuning of the Superheat Controller in a Refrigeration Plant

    DEFF Research Database (Denmark)

    Rasmussen, Henrik; Thybo, Claus; Larsen, Lars F. S.

    2006-01-01

    This paper proposes an automatic tuning of the superheat control in a refrigeration system using a relay method. By means of a simple evaporator model that captures the important dynamics and non-linearities of the superheat a gain-scheduling that compensates for the variation of the process gain...
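
    The sketch below shows the standard relay (Åström–Hägglund) tuning step often used in such schemes: the relay amplitude d, the measured output oscillation amplitude a, and the oscillation period Tu yield an ultimate gain from which PI settings are derived. The Ziegler–Nichols style constants are illustrative; the paper's evaporator-specific gain scheduling is not reproduced here.

```python
# Relay-experiment tuning: ultimate gain from the describing function, then PI settings.
import math

def relay_tuning(d: float, a: float, Tu: float) -> dict:
    Ku = 4.0 * d / (math.pi * a)          # ultimate gain estimate
    return {"Kp": 0.45 * Ku, "Ti": Tu / 1.2}   # Ziegler-Nichols style PI rules

print(relay_tuning(d=1.0, a=0.8, Tu=30.0))
```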

  15. Automatic Coding of Short Text Responses via Clustering in Educational Assessment

    Science.gov (United States)

    Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank

    2016-01-01

    Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
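
    A minimal sketch of clustering-based coding is shown below, assuming scikit-learn is installed: responses are embedded with TF-IDF and grouped with k-means, after which each cluster can be assigned a code. The specific open-source components and models used in the paper may differ.

```python
# Cluster short text responses; responses in the same cluster receive the same code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

responses = [
    "reading books helps me relax",
    "i relax by reading a good book",
    "playing football with friends",
    "soccer with my friends after school",
]
X = TfidfVectorizer().fit_transform(responses)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```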

  16. Testing the Automatization Deficit Hypothesis of Dyslexia via a Dual-Task Paradigm.

    Science.gov (United States)

    Yap, Regina L.; van der Leij, Aryan

    1994-01-01

    Fourteen Dutch children with dyslexia were compared with controls on automatic processing under a dual task (motor balance task and auditory choice task) model. Results indicated the dyslexic group was more impaired in the dual task condition than in the single task condition, compared with controls. Findings support the automatization deficit…

  17. An interactive system for the automatic layout of printed circuit boards (ARAIGNEE)

    International Nuclear Information System (INIS)

    Combet, M.; Eder, J.; Pagny, C.

    1974-12-01

    A software package for the automatic layout of printed circuit boards is presented. The program permits user interaction during the layout process. The automatic searching of paths can be interrupted at any step and convenient corrections can be inserted. This procedure greatly improves the performance of the program in terms of the number of unresolved connections.
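
    The sketch below illustrates the kind of grid-based path search (a breadth-first, Lee-style router) commonly used for this task, on a toy grid where 1 marks an occupied cell. It is an illustrative assumption, not ARAIGNEE's actual router or its interactive correction step.

```python
# Breadth-first path search on a routing grid; returns None for unresolved connections.
from collections import deque

def route(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    prev, queue = {start: None}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            path, node = [], (r, c)
            while node is not None:       # walk back through predecessors
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # unresolved connection: left for manual correction

grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
print(route(grid, (0, 0), (2, 0)))
```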

  18. Mindfulness decouples the relation between automatic alcohol motivation and drinking behavior

    NARCIS (Netherlands)

    Ostafin, Brian D.; Bauer, Chris; Myxter, Peter

    Dual-process models of addiction propose that alcohol and drug use are influenced by automatic motivational responses to substance use cues. With increasing evidence that automatic alcohol motivation is related to heavy drinking, researchers have begun to examine interventions that may modulate the

  19. Automatic gas-levitation system for vacuum deposition of laser-fusion targets

    International Nuclear Information System (INIS)

    Jordan, C.W.; Cameron, G.R.; Krenik, R.M.; Crane, J.K.

    1981-01-01

    A simple, improved system has been developed to gas-levitate microspheres during vacuum-deposition processes. Automatic operation relies on two effects: a lateral stabilizing force provided by a centering ring, and an automatically incremented gas-metering system that offsets weight increases during coating

  20. CONCEPT OF AUTOMATIC CONTROL SYSTEM FOR IMPROVING THE EFFICIENCY OF THE ABSORPTION REFRIGERATING UNITS

    Directory of Open Access Journals (Sweden)

    O. Titlova

    2016-12-01

    The general concept of constructing automatic control systems to increase the efficiency of artificial cold production in absorption refrigerating units is substantiated. The described automatic control system provides the necessary degree of ammonia vapor purification from water in all operating modes of absorption refrigerating units and minimizes heat loss from the dephlegmator surface.