WorldWideScience

Sample records for automated multistep genetic

  1. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    Directory of Open Access Journals (Sweden)

    Chinmay A. Shukla

    2017-05-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its application in synthesis, viz. autosampling and inline monitoring, optimization, and process control. Subsequently, we critically review several multistep flow syntheses and suggest possible control strategies whose implementation would help to reliably transfer a laboratory-scale synthesis to pilot scale at its optimum conditions. Owing to the vast literature on multistep synthesis, we have classified the case studies according to a few criteria, viz. type of reaction, heating method, processes involving in-line separation units, telescopic synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers a broad range of the multistep synthesis literature.

  2. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics.

    Science.gov (United States)

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-07-21

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format.

  3. Applying flow chemistry: methods, materials, and multistep synthesis.

    Science.gov (United States)

    McQuade, D Tyler; Seeberger, Peter H

    2013-07-05

    The synthesis of complex molecules requires control over both chemical reactivity and reaction conditions. While reactivity drives the majority of chemical discovery, advances in reaction condition control have accelerated method development/discovery. Recent tools include automated synthesizers and flow reactors. In this Synopsis, we describe how flow reactors have enabled chemical advances in our groups in the areas of single-stage reactions, materials synthesis, and multistep reactions. In each section, we detail the lessons learned and propose future directions.

  4. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    Science.gov (United States)

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

    This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility of gene-finding studies underlying distinct behavioral phenotypes; however, such a resource poses a challenge for behavioral phenotyping. To address the specifics of large-scale screening, we describe (1) how to assess mouse behavior systematically across a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  5. Statistical theory of multi-step compound and direct reactions

    International Nuclear Information System (INIS)

    Feshbach, H.; Kerman, A.; Koonin, S.

    1980-01-01

    The theory of nuclear reactions is extended so as to include a statistical treatment of multi-step processes. Two types are distinguished, the multi-step compound and the multi-step direct. The wave functions for the system are grouped according to their complexity. The multi-step direct process involves explicitly those states which are open, while the multi-step compound involves those which are bound. In addition to the random phase assumption which is applied differently to the multi-step direct and to the multi-step compound cross-sections, it is assumed that the residual interaction will have non-vanishing matrix elements between states whose complexities differ by at most one unit. This is referred to as the chaining hypothesis. Explicit expressions for the double differential cross-section giving the angular distribution and energy spectrum are obtained for both reaction types. The statistical multi-step compound cross-sections are symmetric about 90°. The classical statistical theory of nuclear reactions is a special limiting case. The cross-section for the statistical multi-step direct reaction consists of a set of convolutions of single-step direct cross-sections. For the many step case it is possible to derive a diffusion equation in momentum space. Application is made to the reaction ¹⁸¹Ta(p,n)¹⁸¹W using the statistical multi-step compound formalism
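    The convolution structure mentioned above can be written schematically as follows; this rendering uses generic notation for illustration and is not a formula quoted from the paper.

```latex
% Schematic n-step direct cross-section as a chain of convolutions of one-step
% cross-sections over intermediate energies and angles (notation assumed here).
\frac{d^{2}\sigma^{(n)}}{dE\,d\Omega}
  \;=\;
  \int dE_{n-1}\,d\Omega_{n-1}\cdots\int dE_{1}\,d\Omega_{1}\;
  \frac{d^{2}\sigma^{(1)}}{dE\,d\Omega}\bigl(E,\Omega \leftarrow E_{n-1},\Omega_{n-1}\bigr)
  \cdots
  \frac{d^{2}\sigma^{(1)}}{dE_{1}\,d\Omega_{1}}\bigl(E_{1},\Omega_{1} \leftarrow E_{0},\Omega_{0}\bigr)
```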

  6. Application of multistep compound and multistep direct models for data evaluation

    International Nuclear Information System (INIS)

    Chadwick, M.B.; Young, P.G.

    1992-01-01

    We have implemented the quantum mechanical multistep compound (MSC) and multistep direct (MSD) theories of Feshbach, Kerman, and Koonin (FKK) for the calculation of nucleon-induced multistep reactions. Our code system, FKK-GNASH, uses the FKK theory for primary preequilibrium emission and describes subsequent equilibrium emission using the Hauser-Feshbach code GNASH. The MSC contribution yields emitted particles with angular distributions symmetric about ninety degrees, whereas the MSD contribution, calculated by averaging DWBA cross sections for particle-hole excitations in the continuum, results in forward-peaked preequilibrium emission. The original picture of the evolution of the reaction, as described by FKK, is modified to allow transitions from the MSD to the MSC chain. This modification is consistent with semiclassical preequilibrium descriptions, and allows a good description of preequilibrium spectra and angular distributions for a range of different reactions. As an example of our methodology, we calculate ⁹³Nb(n,n') reactions at 14, 20 and 26 MeV, and ⁹³Nb(n,p) at 14 MeV, comparing our results with experimental data. Use of the FKK theory has the advantage that preequilibrium calculations are based on rigorous physical principles, and (unlike semiclassical approaches) allow a good description of angular distributions. We argue that our implementation of the FKK theory maximizes the level of predictability, which is needed for data evaluations

  7. Strong-stability-preserving additive linear multistep methods

    KAUST Repository

    Hadjimichael, Yiannis

    2018-02-20

    The analysis of strong-stability-preserving (SSP) linear multistep methods is extended to semi-discretized problems for which different terms on the right-hand side satisfy different forward Euler (or circle) conditions. Optimal perturbed and additive monotonicity-preserving linear multistep methods are studied in the context of such problems. Optimal perturbed methods attain larger monotonicity-preserving step sizes when the different forward Euler conditions are taken into account. On the other hand, we show that optimal SSP additive methods achieve a monotonicity-preserving step-size restriction no better than that of the corresponding nonadditive SSP linear multistep methods.
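    For orientation, the setting described above can be sketched in standard SSP notation (the symbols below are generic and not taken from the paper): the semi-discretization u' = f(u) + g(u) is assumed to satisfy separate forward Euler conditions for f and g, and an additive linear multistep method applies separate coefficients to each term.

```latex
% Generic k-step additive linear multistep method for u' = f(u) + g(u), with
% separate forward Euler conditions for f and g (illustrative notation only).
u^{n} = \sum_{j=1}^{k} \alpha_{j}\, u^{n-j}
      + \Delta t \sum_{j=1}^{k} \bigl( \beta_{j}\, f(u^{n-j}) + \gamma_{j}\, g(u^{n-j}) \bigr),
\qquad
\|u + \Delta t\, f(u)\| \le \|u\| \;\; \text{for } \Delta t \le \Delta t_{\mathrm{FE},f},
\quad
\|u + \Delta t\, g(u)\| \le \|u\| \;\; \text{for } \Delta t \le \Delta t_{\mathrm{FE},g}.
```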

  8. Automation of diagnostic genetic testing: mutation detection by cyclic minisequencing.

    Science.gov (United States)

    Alagrund, Katariina; Orpana, Arto K

    2014-01-01

    The rising role of nucleic acid testing in clinical decision making is creating a need for efficient and automated diagnostic nucleic acid test platforms. Clinical use of nucleic acid testing sets demands for shorter turnaround times (TATs), lower production costs and robust, reliable methods that can easily adopt new test panels and is able to run rare tests in random access principle. Here we present a novel home-brew laboratory automation platform for diagnostic mutation testing. This platform is based on the cyclic minisequecing (cMS) and two color near-infrared (NIR) detection. Pipetting is automated using Tecan Freedom EVO pipetting robots and all assays are performed in 384-well micro plate format. The automation platform includes a data processing system, controlling all procedures, and automated patient result reporting to the hospital information system. We have found automated cMS a reliable, inexpensive and robust method for nucleic acid testing for a wide variety of diagnostic tests. The platform is currently in clinical use for over 80 mutations or polymorphisms. Additionally to tests performed from blood samples, the system performs also epigenetic test for the methylation of the MGMT gene promoter, and companion diagnostic tests for analysis of KRAS and BRAF gene mutations from formalin fixed and paraffin embedded tumor samples. Automation of genetic test reporting is found reliable and efficient decreasing the work load of academic personnel.

  9. Identification of genes associated with dissociation of cognitive performance and neuropathological burden: Multistep analysis of genetic, epigenetic, and transcriptional data.

    Directory of Open Access Journals (Sweden)

    Charles C White

    2017-04-01

    The molecular underpinnings of the dissociation of cognitive performance and neuropathological burden are poorly understood, and there are currently no known genetic or epigenetic determinants of the dissociation. "Residual cognition" was quantified by regressing out the effects of cerebral pathologies and demographic characteristics on global cognitive performance proximate to death. To identify genes influencing residual cognition, we leveraged neuropathological, genetic, epigenetic, and transcriptional data available for deceased participants of the Religious Orders Study (n = 492) and the Rush Memory and Aging Project (n = 487). Given that our sample size was underpowered to detect genome-wide significance, we applied a multistep approach to identify genes influencing residual cognition, based on our prior observation that independent genetic and epigenetic risk factors can converge on the same locus. In the first step (n = 979), we performed a genome-wide association study with a predefined suggestive threshold of p < 10⁻⁵, and nine independent loci met this threshold in eight distinct chromosomal regions. Three of the six genes within 100 kb of the lead SNP are expressed in the dorsolateral prefrontal cortex (DLPFC): UNC5C, ENC1, and TMEM106B. In the second step, in the subset of participants with DLPFC DNA methylation data (n = 648), we found that residual cognition was related to differential DNA methylation of UNC5C and ENC1 (false discovery rate < 0.05). In the third step, in the subset of participants with DLPFC RNA sequencing data (n = 469), brain transcription levels of UNC5C and ENC1 were evaluated for their association with residual cognition: RNA levels of both UNC5C (estimated effect = -0.40, 95% CI -0.69 to -0.10, p = 0.0089) and ENC1 (estimated effect = 0.0064, 95% CI 0.0033 to 0.0096, p = 5.7 × 10⁻⁵) were associated with residual cognition. In secondary analyses, we explored the mechanism of these associations and found that ENC1 may be related to
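    The "residual cognition" phenotype described above is essentially the residual of a regression of global cognition on pathology and demographic covariates. A minimal sketch of that step is given below; the file name and column names are hypothetical and the covariate list is only indicative, not the authors' exact model.

```python
# Sketch of the "residual cognition" idea: regress global cognition on
# neuropathology and demographics, keep the residuals as the phenotype for
# downstream genetic association. Columns and file are hypothetical stand-ins.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("rosmap_phenotypes.csv")  # hypothetical input file
covariates = ["amyloid", "tangles", "infarcts", "lewy_bodies",
              "age_at_death", "sex", "education"]

X = df[covariates].to_numpy()
y = df["global_cognition"].to_numpy()

model = LinearRegression().fit(X, y)
df["residual_cognition"] = y - model.predict(X)  # phenotype for GWAS / methylation / RNA steps
```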

  10. Multistep processes in nuclear reactions

    International Nuclear Information System (INIS)

    Hodgson, P.E.

    1988-01-01

    The theories of nuclear reactions are reviewed with particular attention to the recent work on multistep processes. The evidence for compound nucleus and direct interaction reactions is described together with the results of comparisons between theories and experimental data. These theories have now proved inadequate, and there is evidence for multistep processes that take place after the initial direct stage but long before the attainment of the statistical equilibrium characteristic of compound nucleus processes. The theories of these reactions are described and it is shown how they can account for the experimental data and thus give a comprehensive understanding of nuclear reactions. (author)

  11. Cross sections for multistep direct reactions

    International Nuclear Information System (INIS)

    Demetriou, Paraskevi; Marcinkowski, Andrzej; Marianski, Bohdan

    2002-01-01

    Inelastic scattering and charge-exchange reactions have been analysed at energies ranging from 14 to 27 MeV using the modified multistep direct reaction theory (MSD) of Feshbach, Kerman and Koonin. The modified theory considers the non-DWBA matrix elements in the MSD cross section formulae and includes both incoherent particle-hole excitations and coherent collective excitations in the continuum, according to the prescriptions. The results show important contributions from multistep processes at all energies considered. (author)

  12. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    Science.gov (United States)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  13. Statistical multistep direct and statistical multistep compound models for calculations of nuclear data for applications

    International Nuclear Information System (INIS)

    Seeliger, D.

    1993-01-01

    This contribution contains a brief presentation and comparison of the different Statistical Multistep Approaches, presently available for practical nuclear data calculations. (author). 46 refs, 5 figs

  14. Randomness in multi-step direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1991-01-01

    The authors propose a quantum-statistical framework that provides an integrated perspective on the differences and similarities between the many current models for multi-step direct reactions in the continuum. It is argued that to obtain a statistical theory two physically different approaches are conceivable to postulate randomness, respectively called leading-particle statistics and residual-system statistics. They present a new leading-particle statistics theory for multi-step direct reactions. It is shown that the model of Feshbach et al. can be derived as a simplification of this theory and thus can be founded solely upon leading-particle statistics. The models developed by Tamura et al. and Nishioka et al. are based upon residual-system statistics and hence fall into a physically different class of multi-step direct theories, although the resulting cross-section formulae for the important first step are shown to be the same. The widely used semi-classical models such as the generalized exciton model can be interpreted as further phenomenological simplification of the leading-particle statistics theory

  15. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…
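    As a rough illustration of how a genetic algorithm can assemble a fixed-length test from an item bank, the sketch below maximizes a placeholder per-item "information" score; the objective is a stand-in for the CDM-based criterion used in the article, and all data are synthetic.

```python
# Minimal genetic-algorithm sketch for automated test assembly: pick a
# fixed-length subset of items maximizing a placeholder information score.
import random

N_ITEMS, TEST_LEN, POP, GENS = 200, 30, 60, 100
random.seed(0)
item_info = [random.random() for _ in range(N_ITEMS)]  # placeholder item "information"

def fitness(test):
    return sum(item_info[i] for i in test)  # stand-in for a CDM-based objective

def make_individual():
    return random.sample(range(N_ITEMS), TEST_LEN)

def crossover(a, b):
    pool = list(set(a) | set(b))            # union of parent item sets
    return random.sample(pool, TEST_LEN)

def mutate(test, rate=0.1):
    test = list(test)
    for k in range(TEST_LEN):
        if random.random() < rate:
            replacement = random.randrange(N_ITEMS)
            if replacement not in test:      # keep items distinct
                test[k] = replacement
    return test

population = [make_individual() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]         # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best test fitness:", round(fitness(best), 3))
```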

  16. Contributions of dopamine-related genes and environmental factors to highly sensitive personality: a multi-step neuronal system-level approach.

    Directory of Open Access Journals (Sweden)

    Chunhui Chen

    Traditional behavioral genetic studies (e.g., twin and adoption studies) have shown that human personality has moderate to high heritability, but recent molecular behavioral genetic studies have failed to identify quantitative trait loci (QTL) with consistent effects. The current study adopted a multi-step approach (ANOVA followed by multiple regression and permutation) to assess the cumulative effects of multiple QTLs. Using a system-level (dopamine system) genetic approach, we investigated a personality trait deeply rooted in the nervous system (the Highly Sensitive Personality, HSP). 480 healthy Chinese college students were given the HSP scale and genotyped for 98 representative polymorphisms in all major dopamine neurotransmitter genes. In addition, two environmental factors (stressful life events and parental warmth) that have been implicated for their contributions to personality development were included to investigate their relative contributions as compared to genetic factors. In Step 1, using ANOVA, we identified 10 polymorphisms that made statistically significant contributions to HSP. In Step 2, these polymorphisms' main effects and interactions were assessed using multiple regression. This model accounted for 15% of the variance of HSP (p<0.001). Recent stressful life events accounted for an additional 2% of the variance. Finally, permutation analyses ascertained that the probability of obtaining these findings by chance was very low, with p ranging from 0.001 to 0.006. Dividing these loci by the subsystems of dopamine synthesis, degradation/transport, receptor and modulation, we found that the modulation and receptor subsystems made the most significant contribution to HSP. The results of this study demonstrate the utility of a multi-step neuronal system-level approach in assessing genetic contributions to individual differences in human behavior. It can potentially bridge the gap between the high heritability estimates based on traditional
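    A minimal sketch of that ANOVA → regression → permutation pipeline is given below, assuming a hypothetical data file with an HSP score column and SNP columns coded 0/1/2; the 0.05 screening threshold and the permutation of the model R² are illustrative choices, not the authors' exact procedure.

```python
# Multi-step sketch: (1) ANOVA screen of SNPs, (2) multiple regression on the
# retained SNPs, (3) permutation test of the model fit. Data are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.linear_model import LinearRegression

df = pd.read_csv("hsp_genotypes.csv")  # hypothetical: "hsp" score + SNP columns coded 0/1/2
snp_cols = [c for c in df.columns if c.startswith("rs")]

# Step 1: one-way ANOVA screen across genotype groups
kept = []
for snp in snp_cols:
    groups = [g["hsp"].to_numpy() for _, g in df.groupby(snp) if len(g) > 1]
    if len(groups) > 1 and stats.f_oneway(*groups).pvalue < 0.05:
        kept.append(snp)

# Step 2: multiple regression on retained SNPs
X, y = df[kept].to_numpy(), df["hsp"].to_numpy()
r2_obs = LinearRegression().fit(X, y).score(X, y)

# Step 3: permutation test of the observed R^2
rng = np.random.default_rng(0)
perm_r2 = []
for _ in range(1000):
    y_perm = rng.permutation(y)
    perm_r2.append(LinearRegression().fit(X, y_perm).score(X, y_perm))
p_value = (np.sum(np.array(perm_r2) >= r2_obs) + 1) / (len(perm_r2) + 1)
print(f"observed R^2 = {r2_obs:.3f}, permutation p = {p_value:.4f}")
```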

  17. The statistics of multi-step direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1991-01-01

    We propose a quantum-statistical framework that provides an integrated perspective on the differences and similarities between the many current models for multi-step direct reactions in the continuum. It is argued that to obtain a statistical theory two physically different approaches are conceivable to postulate randomness, respectively called leading-particle statistics and residual-system statistics. We present a new leading-particle statistics theory for multi-step direct reactions. It is shown that the model of Feshbach et al. can be derived as a simplification of this theory and thus can be founded solely upon leading-particle statistics. The models developed by Tamura et al. and Nishioka et al. are based upon residual-system statistics and hence fall into a physically different class of multi-step direct theories, although the resulting cross-section formulae for the important first step are shown to be the same. The widely used semi-classical models such as the generalized exciton model can be interpreted as further phenomenological simplifications of the leading-particle statistics theory. A more comprehensive exposition will appear before long. (author). 32 refs, 4 figs

  18. Comparison of boundedness and monotonicity properties of one-leg and linear multistep methods

    KAUST Repository

    Mozartova, A.; Savostianov, I.; Hundsdorfer, W.

    2015-01-01

    © 2014 Elsevier B.V. All rights reserved. One-leg multistep methods have some advantage over linear multistep methods with respect to storage of the past results. In this paper boundedness and monotonicity properties with arbitrary (semi-)norms or convex functionals are analyzed for such multistep methods. The maximal stepsize coefficient for boundedness and monotonicity of a one-leg method is the same as for the associated linear multistep method when arbitrary starting values are considered. It will be shown, however, that combinations of one-leg methods and Runge-Kutta starting procedures may give very different stepsize coefficients for monotonicity than the linear multistep methods with the same starting procedures. Detailed results are presented for explicit two-step methods.
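    In standard notation (assumed here, not quoted from the abstract), a k-step linear multistep method and its one-leg counterpart differ only in where the right-hand-side function is evaluated:

```latex
% Linear multistep method (left) versus its one-leg twin (right) for u' = F(u);
% the one-leg method evaluates F once, at a combination of past solution values.
\sum_{j=0}^{k} \alpha_{j}\, u^{n+j} \;=\; \Delta t \sum_{j=0}^{k} \beta_{j}\, F\!\left(u^{n+j}\right),
\qquad
\sum_{j=0}^{k} \alpha_{j}\, u^{n+j} \;=\; \Delta t\, F\!\left(\sum_{j=0}^{k} \beta_{j}\, u^{n+j}\right).
```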

  19. Comparison of boundedness and monotonicity properties of one-leg and linear multistep methods

    KAUST Repository

    Mozartova, A.

    2015-05-01

    © 2014 Elsevier B.V. All rights reserved. One-leg multistep methods have some advantage over linear multistep methods with respect to storage of the past results. In this paper boundedness and monotonicity properties with arbitrary (semi-)norms or convex functionals are analyzed for such multistep methods. The maximal stepsize coefficient for boundedness and monotonicity of a one-leg method is the same as for the associated linear multistep method when arbitrary starting values are considered. It will be shown, however, that combinations of one-leg methods and Runge-Kutta starting procedures may give very different stepsize coefficients for monotonicity than the linear multistep methods with the same starting procedures. Detailed results are presented for explicit two-step methods.

  20. The quantum theory of statistical multistep nucleus reactions

    CERN Document Server

    Zhivopistsev, F A

    2002-01-01

    The phenomenological models and quantum approaches to the description of statistical multistep nuclear reactions are discussed. The basic advantages and deficiencies of various modifications of the quantum theory of statistical multistep direct reactions, namely the Feshbach-Kerman-Koonin formalism and the generalized model of statistical multistep reactions (GMSMR), are considered in detail. The possibility of obtaining a consistent description of the experimental spectra for reactions with nucleons is shown through particular examples. Further improvement and development of the quantum formalism, including a more complete and consistent description of the various mechanisms of component-particle formation in the output channel and a correct treatment of the unbound-state densities of the intermediate and final nuclei, are needed for the analysis of inclusive reactions involving component particles (and taking into account the contributions of nuclear cluster and shell regions to the cross sections)...

  1. Deformation dependent TUL multi-step direct model

    International Nuclear Information System (INIS)

    Wienke, H.; Capote, R.; Herman, M.; Sin, M.

    2008-01-01

    The Multi-Step Direct (MSD) module TRISTAN in the nuclear reaction code EMPIRE has been extended to account for nuclear deformation. The new formalism was tested in calculations of neutron emission spectra emitted from the ²³²Th(n,xn) reaction. These calculations include vibration-rotational Coupled Channels (CC) for the inelastic scattering to low-lying collective levels, 'deformed' MSD with quadrupole deformation for inelastic scattering to the continuum, Multi-Step Compound (MSC) and Hauser-Feshbach with advanced treatment of the fission channel. Prompt fission neutrons were also calculated. The comparison with experimental data shows clear improvement over the 'spherical' MSD calculations and JEFF-3.1 and JENDL-3.3 evaluations. (authors)

  2. An automated microfluidic DNA microarray platform for genetic variant detection in inherited arrhythmic diseases.

    Science.gov (United States)

    Huang, Shu-Hong; Chang, Yu-Shin; Juang, Jyh-Ming Jimmy; Chang, Kai-Wei; Tsai, Mong-Hsun; Lu, Tzu-Pin; Lai, Liang-Chuan; Chuang, Eric Y; Huang, Nien-Tsu

    2018-03-12

    In this study, we developed an automated microfluidic DNA microarray (AMDM) platform for point mutation detection of genetic variants in inherited arrhythmic diseases. The platform allows for automated and programmable reagent sequencing under precise conditions of hybridization flow and temperature control. It is composed of a commercial microfluidic control system, a microfluidic microarray device, and a temperature control unit. The automated and rapid hybridization process can be performed in the AMDM platform using Cy3 labeled oligonucleotide exons of SCN5A genetic DNA, which produces proteins associated with sodium channels abundant in the heart (cardiac) muscle cells. We then introduce a graphene oxide (GO)-assisted DNA microarray hybridization protocol to enable point mutation detection. In this protocol, a GO solution is added after the staining step to quench dyes bound to single-stranded DNA or non-perfectly matched DNA, which can improve point mutation specificity. As proof-of-concept we extracted the wild-type and mutant of exon 12 and exon 17 of SCN5A genetic DNA from patients with long QT syndrome or Brugada syndrome by touchdown PCR and performed a successful point mutation discrimination in the AMDM platform. Overall, the AMDM platform can greatly reduce laborious and time-consuming hybridization steps and prevent potential contamination. Furthermore, by introducing the reciprocating flow into the microchannel during the hybridization process, the total assay time can be reduced to 3 hours, which is 6 times faster than the conventional DNA microarray. Given the automatic assay operation, shorter assay time, and high point mutation discrimination, we believe that the AMDM platform has potential for low-cost, rapid and sensitive genetic testing in a simple and user-friendly manner, which may benefit gene screening in medical practice.

  3. DEFORMATION DEPENDENT TUL MULTI-STEP DIRECT MODEL

    International Nuclear Information System (INIS)

    WIENKE, H.; CAPOTE, R.; HERMAN, M.; SIN, M.

    2007-01-01

    The Multi-Step Direct (MSD) module TRISTAN in the nuclear reaction code EMPIRE has been extended in order to account for nuclear deformation. The new formalism was tested in calculations of neutron emission spectra emitted from the ²³²Th(n,xn) reaction. These calculations include vibration-rotational Coupled Channels (CC) for the inelastic scattering to low-lying collective levels, "deformed" MSD with quadrupole deformation for inelastic scattering to the continuum, Multi-Step Compound (MSC) and Hauser-Feshbach with advanced treatment of the fission channel. Prompt fission neutrons were also calculated. The comparison with experimental data shows clear improvement over the "spherical" MSD calculations and JEFF-3.1 and JENDL-3.3 evaluations

  4. Research on the quantum multistep theory for pre-equilibrium nuclear reaction

    CERN Document Server

    Su Zong Di; Abdurixit, A; Wang Shu Nuan; Li Bao Xian; Huang Zhong; Liu Jian Feng; Zhang Benai; Zhu Yao Yin; Li Zhi Wen

    2002-01-01

    The Feshbach-Kerman-Koonin (FKK) quantum multistep theory of pre-equilibrium reactions is further improved and refined. A unified description of the multistep compound (MSC) process of the pre-equilibrium reaction and the compound nucleus (CN) process of the full-equilibrium reaction is presented. This formulation integrates the MSC and CN theories with the optical model and the Hauser-Feshbach formula, and yields a self-consistent expression. In the multistep direct (MSD) process of the pre-equilibrium reaction, the μ-step cross section can be expressed as a convolution of μ one-step cross sections, and the one-step cross section for the continuum can be written as the product of an averaged DWBA matrix element and the state density. For calculating the multistep direct reaction cross section, two methods, state densities and a fully microscopic model, are used and compared. Some typical experiments are analyzed using the work mentioned above. The calculated results are reasonable and in good agreement with the e...

  5. A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.

    Science.gov (United States)

    Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie

    2018-06-04

    Designing effective dispatching rules for production systems is a difficult and time-consuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
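    To make the notion of a dispatching rule concrete, the sketch below encodes one as a priority function over job attributes; the specific expression is hand-written for illustration and is not a rule evolved by the paper's algorithm.

```python
# A dispatching rule expressed as a priority function over job attributes.
# The expression is a hand-written example of the kind of function a GP tree
# could encode; attributes and weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Job:
    proc_time: float       # processing time of the next operation
    due_date: float
    work_remaining: float  # total remaining processing time
    arrival: float

def priority(job: Job, now: float) -> float:
    # Weighted blend of shortest-processing-time and slack.
    slack = job.due_date - now - job.work_remaining
    return -(job.proc_time + 0.5 * max(slack, 0.0))

queue = [Job(3, 50, 10, 0), Job(7, 20, 7, 1), Job(2, 80, 15, 2)]
next_job = max(queue, key=lambda j: priority(j, now=10.0))  # dispatch highest priority
print(next_job)
```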

  6. Two-component multistep direct reactions: A microscopic approach

    International Nuclear Information System (INIS)

    Koning, A.J.; Chadwick, M.B.

    1998-03-01

    The authors present two principal advances in multistep direct theory: (1) A two-component formulation of multistep direct reactions, where neutron and proton excitations are explicitly accounted for in the evolution of the reaction, for all orders of scattering. While this may at first seem to be a formidable task, especially for multistep processes where the number of possible reaction pathways becomes large in a two-component formalism, the authors show that this is not so -- a rather simple generalization of the FKK convolution expression [1] automatically generates these pathways. Such considerations are particularly relevant when simultaneously analyzing both neutron and proton emission spectra, which is always important since these processes represent competing decay channels. (2) A new, and fully microscopic, method for calculating MSD cross sections which does not make use of particle-hole state densities but instead directly calculates cross sections for all possible particle-hole excitations (again including an exact book-keeping of the neutron/proton type of the particle and hole at all stages of the reaction) determined from a simple non-interacting shell model. This is in contrast to all previous numerical approaches, which sample only a small number of such states to estimate the DWBA strength and utilize simple analytical formulae for the partial state density based on the equidistant spacing model. The new approach has been applied, along with theories for multistep compound, compound, and collective reactions, to analyze experimental emission spectra for a range of targets and energies. The authors show that the theory correctly accounts for double-differential nucleon spectra

  7. The Feshbach-Kerman-Koonin multistep reaction theory and its application to data evaluation

    International Nuclear Information System (INIS)

    Hodgson, P.E.; Chadwick, M.B.

    1994-01-01

    The development of the multistep nuclear reaction theory of Feshbach, Kerman, and Koonin (FKK) is reviewed, with particular attention to recent work. We present methodologies for unambiguously separating multistep direct and multistep compound contributions, and analyze neutron inelastic scattering by ⁹³Nb at 14 MeV to illustrate the approaches. Modification of the FKK theory to include transitions from the multistep direct to the multistep compound chain ("P→Q transitions") is discussed. We also describe developments to the theory in order to calculate inelastic cross sections for incident nucleon energies up to 200 MeV. At these high energies multiple preequilibrium emission processes must be included, and a theory for their determination is described and compared with experimental measurements of proton reactions on ⁹⁰Zr. The usefulness of the FKK theory for a range of nuclear data applications, including intermediate energy reactions of importance in Accelerator Transmutation of Waste, reactions on biologically important elements for cancer radiation treatment, and reactions of importance in fusion technology, is assessed

  8. On some properties of the block linear multi-step methods | Chollom ...

    African Journals Online (AJOL)

    The convergence, stability and order of block linear multistep methods have been determined in the past based on individual members of the block. In this paper, methods are proposed to examine the properties of the entire block. Some block linear multistep methods have been considered, their convergence, stability and ...

  9. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed with PCR. The PCR products were placed into the automated genetic analyser (ABI Prism 310, Perkin Elmer, USA) and were subjected to fluorescent scanning with argon ion laser beams. The fluorescent signal intensity measured by the genetic analyzer determined the product size in terms of base pairs. We found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides that alters the original normal locus size) in all the patients by using fluorescent microsatellite analysis and an automated analyzing system. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The study demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system. With our newly tested system, microsatellite analysis can be done more cheaply, quickly, and easily, and with higher scientific accuracy.

  10. Continuous Multistep Methods for Volterra Integro-Differential Equations of the Second Order

    African Journals Online (AJOL)

    Kamoh et al.

    Keywords: multistep methods, Volterra integro-differential equations, convergence.

  11. A Bias and Variance Analysis for Multistep-Ahead Time Series Forecasting.

    Science.gov (United States)

    Ben Taieb, Souhaib; Atiya, Amir F

    2016-01-01

    Multistep-ahead forecasts can either be produced recursively by iterating a one-step-ahead time series model or directly by estimating a separate model for each forecast horizon. In addition, there are other strategies; some of them combine aspects of both aforementioned concepts. In this paper, we present a comprehensive investigation into the bias and variance behavior of multistep-ahead forecasting strategies. We provide a detailed review of the different multistep-ahead strategies. Subsequently, we perform a theoretical study that derives the bias and variance for a number of forecasting strategies. Finally, we conduct a Monte Carlo experimental study that compares and evaluates the bias and variance performance of the different strategies. From the theoretical and the simulation studies, we analyze the effect of different factors, such as the forecast horizon and the time series length, on the bias and variance components, and on the different multistep-ahead strategies. Several lessons are learned, and recommendations are given concerning the advantages, disadvantages, and best conditions of use of each strategy.
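    For concreteness, the two basic strategies contrasted above can be sketched with a plain autoregressive model: the recursive strategy iterates a one-step model on its own predictions, while the direct strategy fits a separate model for each horizon. The toy series, lag order, and linear model below are illustrative assumptions only.

```python
# Recursive vs. direct multistep-ahead forecasting with a simple AR model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=300))  # toy time series
LAGS, H = 5, 10                           # lag order, forecast horizon

def embed(y, lags, lead):
    # Rows: [y_t, ..., y_{t+lags-1}]; target: y_{t+lags-1+lead}
    X = np.column_stack([y[i : len(y) - lags - lead + 1 + i] for i in range(lags)])
    return X, y[lags + lead - 1 :]

# Recursive: one-step model, iterated H times on its own output
X1, y1 = embed(series, LAGS, 1)
one_step = LinearRegression().fit(X1, y1)
window = list(series[-LAGS:])
recursive = []
for _ in range(H):
    pred = one_step.predict(np.array(window[-LAGS:])[None, :])[0]
    recursive.append(pred)
    window.append(pred)        # feed the prediction back in

# Direct: a separate model for each horizon h = 1..H
direct = []
for h in range(1, H + 1):
    Xh, yh = embed(series, LAGS, h)
    direct.append(LinearRegression().fit(Xh, yh).predict(series[-LAGS:][None, :])[0])

print("recursive:", np.round(recursive, 2))
print("direct:   ", np.round(direct, 2))
```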

  12. Explicit strong stability preserving multistep Runge–Kutta methods

    KAUST Repository

    Bresten, Christopher; Gottlieb, Sigal; Grant, Zachary; Higgs, Daniel; Ketcheson, David I.; Németh, Adrian

    2015-01-01

    High-order spatial discretizations of hyperbolic PDEs are often designed to have strong stability properties, such as monotonicity. We study explicit multistep Runge-Kutta strong stability preserving (SSP) time integration methods for use with such discretizations. We prove an upper bound on the SSP coefficient of explicit multistep Runge-Kutta methods of order two and above. Numerical optimization is used to find optimized explicit methods of up to five steps, eight stages, and tenth order. These methods are tested on the linear advection and nonlinear Buckley-Leverett equations, and the results for the observed total variation diminishing and/or positivity preserving time-step are presented.

  13. Explicit strong stability preserving multistep Runge–Kutta methods

    KAUST Repository

    Bresten, Christopher

    2015-10-15

    High-order spatial discretizations of hyperbolic PDEs are often designed to have strong stability properties, such as monotonicity. We study explicit multistep Runge-Kutta strong stability preserving (SSP) time integration methods for use with such discretizations. We prove an upper bound on the SSP coefficient of explicit multistep Runge-Kutta methods of order two and above. Numerical optimization is used to find optimized explicit methods of up to five steps, eight stages, and tenth order. These methods are tested on the linear advection and nonlinear Buckley-Leverett equations, and the results for the observed total variation diminishing and/or positivity preserving time-step are presented.

  14. Automated DNA extraction from genetically modified maize using aminosilane-modified bacterial magnetic particles.

    Science.gov (United States)

    Ota, Hiroyuki; Lim, Tae-Kyu; Tanaka, Tsuyoshi; Yoshino, Tomoko; Harada, Manabu; Matsunaga, Tadashi

    2006-09-18

    A novel, automated system, PNE-1080, equipped with eight automated pestle units and a spectrophotometer was developed for genomic DNA extraction from maize using aminosilane-modified bacterial magnetic particles (BMPs). The use of aminosilane-modified BMPs allowed highly accurate DNA recovery. The (A260 − A320):(A280 − A320) ratio of the extracted DNA was 1.9 ± 0.1. The DNA quality was sufficiently pure for PCR analysis. The PNE-1080 offered rapid assay completion (30 min) with high accuracy. Furthermore, the results of real-time PCR confirmed that our proposed method permitted the accurate determination of genetically modified DNA composition and correlated well with results obtained by conventional cetyltrimethylammonium bromide (CTAB)-based methods.

  15. Stepsize Restrictions for Boundedness and Monotonicity of Multistep Methods

    KAUST Repository

    Hundsdorfer, W.

    2011-04-29

    In this paper nonlinear monotonicity and boundedness properties are analyzed for linear multistep methods. We focus on methods which satisfy a weaker boundedness condition than strict monotonicity for arbitrary starting values. In this way, many linear multistep methods of practical interest are included in the theory. Moreover, it will be shown that for such methods monotonicity can still be valid with suitable Runge-Kutta starting procedures. Restrictions on the stepsizes are derived that are not only sufficient but also necessary for these boundedness and monotonicity properties. © 2011 Springer Science+Business Media, LLC.

  16. Strong-stability-preserving additive linear multistep methods

    KAUST Repository

    Hadjimichael, Yiannis; Ketcheson, David I.

    2018-01-01

    The analysis of strong-stability-preserving (SSP) linear multistep methods is extended to semi-discretized problems for which different terms on the right-hand side satisfy different forward Euler (or circle) conditions. Optimal perturbed

  17. Personalized multistep cognitive behavioral therapy for obesity

    Directory of Open Access Journals (Sweden)

    Dalle Grave R

    2017-06-01

    Riccardo Dalle Grave, Massimiliano Sartirana, Marwan El Ghoch, Simona Calugi; Department of Eating and Weight Disorders, Villa Garda Hospital, Verona, Italy. Abstract: Multistep cognitive behavioral therapy for obesity (CBT-OB) is a treatment that may be delivered at three levels of care (outpatient, day hospital, and residential). In a stepped-care approach, CBT-OB associates the traditional procedures of weight-loss lifestyle modification, i.e., physical activity and dietary recommendations, with specific cognitive behavioral strategies that have been indicated by recent research to influence weight loss and maintenance by addressing specific cognitive processes. The treatment program as a whole is delivered in six modules. These are introduced according to the individual patient's needs in a flexible and personalized fashion. A recent randomized controlled trial found that 88 patients suffering from morbid obesity treated with multistep residential CBT-OB achieved a mean weight loss of 15% after 12 months, with no tendency to regain weight between months 6 and 12. The treatment has also shown promising long-term results in the management of obesity associated with binge-eating disorder. If these encouraging findings are confirmed by the two ongoing outpatient studies (one delivered individually and one in a group setting), this will provide evidence-based support for the potential of multistep CBT-OB to provide a more effective alternative to standard weight-loss lifestyle-modification programs. Keywords: obesity, cognitive behavioral therapy, lifestyle modification, weight loss, weight maintenance, outcome

  18. Surrogate-Assisted Genetic Programming With Simplified Models for Automated Design of Dispatching Rules.

    Science.gov (United States)

    Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen

    2017-09-01

    Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy, and interpretability are still its limitations. This paper aims to develop a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.

  19. Strong Stability Preserving Explicit Linear Multistep Methods with Variable Step Size

    KAUST Repository

    Hadjimichael, Yiannis

    2016-09-08

    Strong stability preserving (SSP) methods are designed primarily for time integration of nonlinear hyperbolic PDEs, for which the permissible SSP step size varies from one step to the next. We develop the first SSP linear multistep methods (of order two and three) with variable step size, and prove their optimality, stability, and convergence. The choice of step size for multistep SSP methods is an interesting problem because the allowable step size depends on the SSP coefficient, which in turn depends on the chosen step sizes. The description of the methods includes an optimal step-size strategy. We prove sharp upper bounds on the allowable step size for explicit SSP linear multistep methods and show the existence of methods with arbitrarily high order of accuracy. The effectiveness of the methods is demonstrated through numerical examples.

  20. Effect of One-Step and Multi-Steps Polishing System on Enamel Roughness

    Directory of Open Access Journals (Sweden)

    Cynthia Sumali

    2013-07-01

    The final procedures of orthodontic treatment are bracket debonding and cleaning of the remaining adhesive. The multi-step polishing system is the most commonly used method. The disadvantage of that system is its long working time, because of the stages that must be carried out. Therefore, dental material manufacturers have improved the system by reducing several stages to a single one. This new system is known as the one-step polishing system. Objective: To compare the effect of one-step and multi-step polishing systems on enamel roughness after orthodontic bracket debonding. Methods: A randomized controlled trial was conducted on twenty-eight maxillary premolars randomized into two polishing systems: one-step OptraPol (Ivoclar, Vivadent) and multi-step AstroPol (Ivoclar, Vivadent). After bracket debonding, the remaining adhesive in each group was cleaned with the respective polishing system for ninety seconds using a low-speed handpiece. The enamel roughness was subjected to profilometry, registering two roughness parameters (Ra, Rz). An independent t-test was used to analyze the mean enamel roughness score in each group. Results: There was no significant difference in enamel roughness between the one-step and multi-step polishing systems (p>0.005). Conclusion: The one-step polishing system can produce enamel roughness similar to the multi-step polishing system after bracket debonding and adhesive cleaning. DOI: 10.14693/jdi.v19i3.136

  1. Online multistep-ahead inundation depth forecasts by recurrent NARX networks

    Directory of Open Access Journals (Sweden)

    H.-Y. Shen

    2013-03-01

    Various types of artificial neural networks (ANNs) have been successfully applied in hydrological fields, but relatively few address multistep-ahead flood inundation forecasting, which is very difficult to achieve, especially when dealing with forecasts without regular observed data. This study proposes a recurrent configuration of the nonlinear autoregressive with exogenous inputs (NARX) network, called R-NARX, to forecast multistep-ahead inundation depths in an inundation area. The proposed R-NARX is constructed based on the recurrent neural network (RNN), which is commonly used for modeling nonlinear dynamical systems. The models were trained and tested on a large number of inundation data generated by a well-validated two-dimensional simulation model at thirteen inundation-prone sites in Yilan County, Taiwan. We demonstrate that the R-NARX model can effectively inhibit error growth and accumulation when applied to online multistep-ahead inundation forecasts over a long-lasting forecast period. For comparison, a feedforward time-delay and an online feedback configuration of NARX networks (T-NARX and O-NARX) were performed. The results show that (1) T-NARX networks cannot make online forecasts due to unavailable inputs in the constructed networks, even though they provide the best performances for reference only; and (2) R-NARX networks consistently outperform O-NARX networks and can be adequately applied to online multistep-ahead forecasts of inundation depths in the study area during typhoon events.
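    A minimal sketch of the recurrent feedback idea is shown below: a nonlinear regressor is trained on lagged depths plus an exogenous input, and at forecast time its own outputs replace the unavailable observations. The toy data, lag orders, persistence assumption for future rainfall, and the small MLP are all illustrative assumptions, not the R-NARX architecture of the paper.

```python
# NARX-style recurrent multistep-ahead forecast: train on lagged depth and
# rainfall, then feed predictions back as pseudo-observations at forecast time.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
T = 500
rain = rng.gamma(2.0, 1.0, size=T)                       # toy exogenous input (rainfall)
depth = np.convolve(rain, np.ones(5) / 5, mode="same")   # toy inundation depth

DY, DX, H = 3, 3, 6          # output lags, input lags, forecast horizon
start = max(DY, DX)
X_train = np.array([np.r_[depth[t - DY : t], rain[t - DX : t]] for t in range(start, T)])
y_train = depth[start:]

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
net.fit(X_train, y_train)

y_hist, x_hist = list(depth), list(rain)
forecast = []
for _ in range(H):
    feat = np.r_[y_hist[-DY:], x_hist[-DX:]]
    pred = net.predict(feat[None, :])[0]
    forecast.append(pred)
    y_hist.append(pred)         # recurrent feedback of the prediction
    x_hist.append(x_hist[-1])   # persistence assumption for future rainfall

print("multistep-ahead forecast:", np.round(forecast, 3))
```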

  2. Multi-step laser spectroscopy in atomic uranium

    International Nuclear Information System (INIS)

    Solarz, R.W.; Paisner, J.A.; Carlson, L.R.; May, C.A.; Johnson, S.A.

    1975-01-01

    The use of multistep photoionization techniques to measure lifetimes, cross sections, and branching ratios in uranium is described. The measurement of autoionization behavior, the assignment of new levels, and the observation of Rydberg states are also described. Experimental data are presented for all parameters

  3. Imaging-pathologic correlation of multi-step hepatocarcinogenesis

    International Nuclear Information System (INIS)

    Matsui, O.

    2012-01-01

    Full text: Approximately 80% of Japanese HCC cases are derived from HCV-associated liver cirrhosis and chronic hepatitis, and the remaining less than 20% of patients are HBV positive. Because of the introduction of a surveillance system based on periodic ultrasound in these high-risk patients, the size of HCCs first detected during 2004 to 2005 (n=16809) was less than 2 cm in 35% of all cases and 2.1-5.0 cm in 48%. However, various types of hepatocellular nodules, such as dysplastic nodules, are also detected during screening procedures. Pathologically, human HCC often develops in a multistep fashion from dysplastic nodule to classic hypervascular HCC. Therefore, for the early diagnosis of HCC, understanding the sequential changes of imaging findings in accordance with multistep hepatocarcinogenesis is important. In addition, understanding the imaging features of various types of HCC is also important for the precise characterization of HCCs. (1) Classification of hepatocellular nodules during multistep hepatocarcinogenesis: According to the International Consensus Group for Hepatocellular Neoplasia, these nodules are divided into large regenerative nodules, low-grade dysplastic nodules (L-DN), high-grade dysplastic nodules (H-DN), and HCC. In addition, small HCC (less than 2 cm) is divided into early HCC and progressed HCC. Early HCC has a vaguely nodular appearance and is highly well differentiated. (2) Imaging of multistep hepatocarcinogenesis: We revealed that the intranodular blood supply changes in accordance with the progression of human hepatocarcinogenesis from dysplastic nodule to overt HCC. The intranodular portal supply relative to the surrounding liver parenchyma, evaluated by CT during arterial portography (CTAP), is decreased, whereas the intranodular arterial supply, evaluated by CT during hepatic arteriography (CTHA), is first decreased during the early stage of hepatocarcinogenesis and then increased in parallel with increasing grade of

  4. Continuous track paths reveal additive evidence integration in multistep decision making.

    Science.gov (United States)

    Buc Calderon, Cristian; Dewulf, Myrtille; Gevers, Wim; Verguts, Tom

    2017-10-03

    Multistep decision making pervades daily life, but its underlying mechanisms remain obscure. We distinguish four prominent models of multistep decision making, namely serial stage, hierarchical evidence integration, hierarchical leaky competing accumulation (HLCA), and probabilistic evidence integration (PEI). To empirically disentangle these models, we design a two-step reward-based decision paradigm and implement it in a reaching task experiment. In a first step, participants choose between two potential upcoming choices, each associated with two rewards. In a second step, participants choose between the two rewards selected in the first step. Strikingly, as predicted by the HLCA and PEI models, the first-step decision dynamics were initially biased toward the choice representing the highest sum/mean before being redirected toward the choice representing the maximal reward (i.e., initial dip). Only HLCA and PEI predicted this initial dip, suggesting that first-step decision dynamics depend on additive integration of competing second-step choices. Our data suggest that potential future outcomes are progressively unraveled during multistep decision making.

  5. A generalized theory of chromatography and multistep liquid extraction

    Science.gov (United States)

    Chizhkov, V. P.; Boitsov, V. N.

    2017-03-01

    A generalized theory of chromatography and multistep liquid extraction is developed. The principles of highly efficient processes for fine preparative separation of binary mixture components on a fixed sorbent layer are discussed.

  6. Improved perovskite phototransistor prepared using multi-step annealing method

    Science.gov (United States)

    Cao, Mingxuan; Zhang, Yating; Yu, Yu; Yao, Jianquan

    2018-02-01

    Organic-inorganic hybrid perovskites with good intrinsic physical properties have received substantial interest for solar cell and optoelectronic applications. However, perovskite films often suffer from low carrier mobility due to structural imperfections, including sharp grain boundaries and pinholes, restricting their device performance and application potential. Here we demonstrate a straightforward strategy based on a multi-step annealing process to improve the performance of a perovskite photodetector. Annealing temperature and duration greatly affect the surface morphology and optoelectrical properties of the perovskites, which determine the device properties of the phototransistor. Perovskite films treated with the multi-step annealing method tend to be highly uniform, well crystallized, and of high surface coverage, and exhibit stronger ultraviolet-visible absorption and photoluminescence compared to perovskites prepared by the conventional one-step annealing process. The field-effect mobility of the perovskite photodetector treated by the one-step direct annealing method is 0.121 (0.062) cm²V⁻¹s⁻¹ for holes (electrons), which increases to 1.01 (0.54) cm²V⁻¹s⁻¹ for the device treated with the multi-step slow annealing method. Moreover, the perovskite phototransistors exhibit a fast photoresponse speed of 78 μs. In general, this work focuses on the influence of annealing methods on the perovskite phototransistor rather than on obtaining its best parameters. These findings show that multi-step annealing is a feasible route to high-performance perovskite-based photodetectors.

  7. Comparing Multi-Step IMAC and Multi-Step TiO2 Methods for Phosphopeptide Enrichment

    Science.gov (United States)

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B.

    2016-01-01

    Phosphopeptide enrichment from complicated peptide mixtures is an essential step for mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity and can interact with phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multi-step enrichment strategy from whole cell lysate, to evaluate their abilities to enrich for different types of phosphopeptides. The peptide-to-bead ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides that are unique to IMAC enrichment, unique to TiO2 enrichment, and identified with both IMAC and TiO2 enrichment were analyzed for their characteristics. Both IMAC and TiO2 enriched similar amounts of phosphopeptides with comparable enrichment efficiency. However, phosphopeptides that are unique to IMAC enrichment showed a higher percentage of multi-phosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. Also, the IMAC and TiO2 procedures clearly enriched phosphopeptides with different motifs. Finally, further enriching with two rounds of TiO2 from the supernatant after IMAC enrichment, or further enriching with two rounds of IMAC from the supernatant after TiO2 enrichment, does not fully recover the phosphopeptides that are not identified with the corresponding multi-step enrichment. PMID:26237447

  8. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    Science.gov (United States)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
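
    The master-slave scheme described above maps naturally onto a worker pool: the master applies the genetic operators while slave processes evaluate fitness in parallel. The following is a minimal, hypothetical sketch of that division of labor in Python; the bit-string encoding, fitness function and GA parameters are placeholders for illustration, not the circuit-design setup used by the authors.

```python
import random
from multiprocessing import Pool

GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 32, 40, 50, 0.02

def fitness(genome):
    # Placeholder objective (count of 1-bits); in the application described
    # above, this is where a candidate circuit would be simulated and scored.
    return sum(genome)

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)          # single-point recombination
    return a[:cut] + b[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUT_RATE else bit for bit in genome]

def select(population, scores):
    # Tournament selection: probabilistically favors high-quality parents.
    i, j = random.randrange(POP_SIZE), random.randrange(POP_SIZE)
    return population[i] if scores[i] >= scores[j] else population[j]

if __name__ == "__main__":
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    with Pool() as workers:                         # the "slave" processes
        for _ in range(GENERATIONS):
            scores = workers.map(fitness, population)    # parallel fitness evaluation
            population = [mutate(crossover(select(population, scores),
                                           select(population, scores)))
                          for _ in range(POP_SIZE)]       # master's genetic operators
        print("best fitness:", max(workers.map(fitness, population)))
```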

  9. Stepsize Restrictions for Boundedness and Monotonicity of Multistep Methods

    KAUST Repository

    Hundsdorfer, W.; Mozartova, A.; Spijker, M. N.

    2011-01-01

    In this paper nonlinear monotonicity and boundedness properties are analyzed for linear multistep methods. We focus on methods which satisfy a weaker boundedness condition than strict monotonicity for arbitrary starting values. In this way, many

  10. 48 CFR 15.202 - Advisory multi-step process.

    Science.gov (United States)

    2010-10-01

    ... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Solicitation and Receipt of Proposals and Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204) that provides a general description of the scope or purpose of the acquisition and invites potential...

  11. Flexural Free Vibrations of Multistep Nonuniform Beams

    Directory of Open Access Journals (Sweden)

    Guojin Tan

    2016-01-01

    Full Text Available This paper presents an exact approach to investigate the flexural free vibrations of multistep nonuniform beams. Firstly, a one-step beam with moment of inertia and mass per unit length varying as I(x) = α1(1+βx)^(r+4) and m(x) = α2(1+βx)^r was studied. By using appropriate transformations, the differential equation for flexural free vibration of a one-step beam with variable cross section is reduced to a fourth-order differential equation with constant coefficients. According to the different types of roots of the characteristic equation of this fourth-order differential equation, two kinds of modal shape functions are obtained, and the general solutions for flexural free vibration of a one-step beam with variable cross section are presented. An exact approach to solve the natural frequencies and modal shapes of multistep beams with variable cross section is presented by using the transfer matrix method, the exact general solutions of the one-step beam, and an iterative method. Numerical examples reveal that the calculated frequencies and modal shapes are in good agreement with the finite element method (FEM), which demonstrates that the solutions of the present method are exact.

  12. Performance Optimization of a Solar-Driven Multi-Step Irreversible Brayton Cycle Based on a Multi-Objective Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Ahmadi Mohammad Hosein

    2016-01-01

    Full Text Available An applicable approach for a multi-step regenerative irreversible Brayton cycle, based on thermodynamics and on optimization of thermal efficiency and normalized output power, is presented in this work. In the present study, thermodynamic analysis and an NSGA-II algorithm are coupled to determine the optimum values of thermal efficiency and normalized power output for a Brayton cycle system. Moreover, three well-known decision-making methods are employed to select definitive answers from the outputs of the aforementioned approach. Finally, for error analysis, the average and maximum errors of the results are also calculated.
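
    Since two conflicting objectives (thermal efficiency and normalized output power) are optimized, the core of any NSGA-II-style approach is extracting the non-dominated (Pareto) set before a decision-making method selects a single design. The snippet below is a generic sketch of that filtering step only; the candidate (efficiency, power) pairs are made up for illustration and are not results from the cycle model.

```python
def dominates(a, b):
    """True if design a is at least as good as b in both maximized objectives
    and strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(designs):
    # Keep only designs that no other design dominates.
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other is not d)]

# Illustrative (thermal efficiency, normalized power) pairs -- not real cycle data.
candidates = [(0.42, 0.55), (0.38, 0.70), (0.45, 0.40), (0.40, 0.60), (0.36, 0.65)]
print(pareto_front(candidates))  # a decision-making method then picks one design from this set
```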

  13. Multistep Methods for Integrating the Solar System

    Science.gov (United States)

    1988-07-01

    Technical Report 1055: Multistep Methods for Integrating the Solar System, Panayotis A. Skordos, MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139. This report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, supported by the Advanced Research Projects Agency.

  14. Multi-step wrought processing of TiAl-based alloys

    International Nuclear Information System (INIS)

    Fuchs, G.E.

    1997-04-01

    Wrought processing will likely be needed for fabrication of a variety of TiAl-based alloy structural components. Laboratory and development work has usually relied on one-step forging to produce test material. Attempts to scale up TiAl-based alloy processing have indicated that multi-step wrought processing is necessary. The purpose of this study was to examine potential multi-step processing routes, such as two-step isothermal forging and extrusion + isothermal forging. The effects of processing (I/M versus P/M), intermediate recrystallization heat treatments and processing route on the tensile and creep properties of Ti-48Al-2Nb-2Cr alloys were examined. The results of the testing were then compared to samples from the same heats of materials processed by one-step routes. Finally, by evaluating the effect of processing on microstructure and properties, optimized and potentially lower cost processing routes could be identified

  15. Overcoming the hurdles of multi-step targeting (MST) for effective radioimmunotherapy of solid tumors

    International Nuclear Information System (INIS)

    Larson, Steven M.; Cheung, Nai-Kong

    2009-01-01

    The 4 specific aims of this project are: (1) Optimization of MST to increase tumor uptake; (2) Antigen heterogeneity; (3) Characterization and reduction of renal uptake; and (4) Validation in vivo of optimized MST targeted therapy. This proposal focussed upon optimizing multistep immune targeting strategies for the treatment of cancer. Two multi-step targeting constructs were explored during this funding period: (1) anti-Tag-72 and (2) anti-GD2.

  16. Angular momentum in multi-step photoionization

    International Nuclear Information System (INIS)

    Yoshida, Tadashi; Adachi, Hajime; Kuwako, Akira; Nittoh, Koichi; Araki, Yoshio; Watanabe, Takashi; Yoguchi, Itaru.

    1995-01-01

    The effect of the angular momenta on the multi-step laser-ionization efficiency was investigated numerically for cases with and without hyperfine interactions. In either case the ionization efficiency proved to depend appreciably on the values of J in the excitation ladder. In this respect, we elaborated a simple and efficient method of determining J, based on the laser-polarization dependence of the excitation rate. Application of this method to a couple of real excitation ladders proved its usefulness and reliability. (author)

  17. Development of the multistep compound process calculation code

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, Toshihiko [Kyushu Univ., Fukuoka (Japan)

    1998-03-01

    A program `cmc` has been developed to calculate the multistep compound (MSC) process of Feshbach, Kerman and Koonin. A radial overlap integral in the transition matrix element is calculated microscopically, and comparisons are made for neutron-induced 93Nb reactions. Strengths of the two-body interaction V0 are estimated from the total MSC cross sections. (author)

  18. Photon Production through Multi-step Processes Important in Nuclear Fluorescence Experiments

    International Nuclear Information System (INIS)

    Hagmann, C; Pruet, J

    2006-01-01

    The authors present calculations describing the production of photons through multi-step processes occurring when a beam of gamma rays interacts with a macroscopic material. These processes involve the creation of energetic electrons through Compton scattering, photo-absorption and pair production, the subsequent scattering of these electrons, and the creation of energetic photons occurring as these electrons are slowed through Bremsstrahlung emission. Unlike single Compton collisions, during which an energetic photon that is scattered through a large angle loses most of its energy, these multi-step processes result in a sizable flux of energetic photons traveling at large angles relative to an incident photon beam. These multi-step processes are also a key background in experiments that measure nuclear resonance fluorescence by shining photons on a thin foil and observing the spectrum of back-scattered photons. Effective cross sections describing the production of backscattered photons are presented in a tabular form that allows simple estimates of backgrounds expected in a variety of experiments. Incident photons with energies between 0.5 MeV and 8 MeV are considered. These calculations of effective cross sections may be useful for those designing NRF experiments or systems that detect specific isotopes in well-shielded environments through observation of resonance fluorescence

  19. Accurate Multisteps Traffic Flow Prediction Based on SVM

    Directory of Open Access Journals (Sweden)

    Zhang Mingheng

    2013-01-01

    Full Text Available Accurate traffic flow prediction is a prerequisite for realizing intelligent traffic control and guidance, and it is also an objective requirement for intelligent traffic management. Due to the strongly nonlinear, stochastic, time-varying characteristics of urban transport systems, artificial intelligence methods such as the support vector machine (SVM) are now receiving more and more attention in this research field. Compared with the traditional single-step prediction method, multi-step prediction can forecast traffic state trends over a certain period in the future. From the perspective of dynamic decision-making, this is far more important than the current traffic condition alone. Thus, in this paper, an accurate multi-step traffic flow prediction model based on SVM is proposed, in which the input vectors comprise actual traffic volume; four different types of input vectors were compared to verify their prediction performance against each other. Finally, the model was verified with actual data in the empirical analysis phase, and the test results showed that the proposed SVM model had a good ability for traffic flow prediction and that the SVM-HPT model outperformed the other three models.
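
    One common way to obtain multi-step forecasts from an SVM regressor is the iterated strategy: each one-step prediction is fed back as an input for the next step. The sketch below illustrates that idea with scikit-learn's SVR on a synthetic series; it is only an assumed, generic illustration and does not reproduce the paper's input-vector designs or the SVM-HPT variant.

```python
import numpy as np
from sklearn.svm import SVR

LAGS, HORIZON = 6, 4                      # use 6 past values, forecast 4 steps ahead

# Synthetic "traffic volume" series standing in for real detector data.
rng = np.random.default_rng(0)
series = 100 + 20 * np.sin(np.arange(300) / 10.0) + rng.normal(0, 2, 300)

# Build (lag window -> next value) pairs and fit a one-step-ahead SVR.
X = np.array([series[i:i + LAGS] for i in range(len(series) - LAGS)])
y = series[LAGS:]
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)

# Iterated multi-step prediction: each forecast extends the input window.
window = list(series[-LAGS:])
forecasts = []
for _ in range(HORIZON):
    next_val = model.predict(np.array(window[-LAGS:]).reshape(1, -1))[0]
    forecasts.append(next_val)
    window.append(next_val)

print("multi-step forecasts:", np.round(forecasts, 1))
```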

  20. A one-pot multistep cyclization yielding thiadiazoloimidazole derivatives

    Directory of Open Access Journals (Sweden)

    Debabrata Samanta

    2014-12-01

    Full Text Available A versatile synthetic procedure is described to prepare the benzimidazole-fused 1,2,4-thiadiazoles 2a–c via a methanesulfonyl chloride initiated multistep cyclization involving the intramolecular reaction of an in-situ generated carbodiimide with a thiourea unit. The structure of the intricate heterocycle 2a was confirmed by single-crystal X-ray analysis and its mechanism of formation supported by DFT computations.

  1. Stochastic multistep polarization switching in ferroelectrics

    Science.gov (United States)

    Genenko, Y. A.; Khachaturyan, R.; Schultheiß, J.; Ossipov, A.; Daniels, J. E.; Koruza, J.

    2018-04-01

    Consecutive stochastic 90° polarization switching events, clearly resolved in recent experiments, are described by a nucleation and growth multistep model. It extends the classical Kolmogorov-Avrami-Ishibashi approach and includes possible consecutive 90°- and parallel 180° switching events. The model predicts the results of simultaneous time-resolved macroscopic measurements of polarization and strain, performed on a tetragonal Pb(Zr,Ti)O3 ceramic in a wide range of electric fields over a time domain of seven orders of magnitude. It allows the determination of the fractions of individual switching processes, their characteristic switching times, activation fields, and respective Avrami indices.

  2. A Capping Step During Automated Glycan Assembly Enables Access to Complex Glycans in High Yield.

    Science.gov (United States)

    Yu, Yang; Kononov, Andrew; Delbianco, Martina; Seeberger, Peter H

    2018-04-20

    The products of multi-step automated solid phase syntheses are purified after release from the resin. Capping of unreacted nucleophiles is commonplace in automated oligonucleotide synthesis to minimize accumulation of deletion sequences. To date, capping was not used routinely during automated glycan assembly (AGA) since previous capping protocols suffered from long reaction times and conditions incompatible with some protective groups. Here, a method using methanesulfonic acid and acetic anhydride for the fast and quantitative capping of hydroxyl groups that failed to be glycosylated is reported. Commonly used protective groups in AGA are stable under these capping conditions. The introduction of a capping step into the coupling cycle drastically improved overall yields by decreasing side-products and simplifying purification, while reducing building block consumption. To illustrate the method, the biologically important tetrasaccharide Lc4, as well as a 50-mer polymannoside were prepared. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. The nested-doorway model of multistep compound processes

    International Nuclear Information System (INIS)

    Hussein, M.S.

    1982-05-01

    The multistep compound contribution to preequilibrium reactions is discussed within the nested-doorway model. Emphasis is placed on the generalized cross-section autocorrelation function. Several of the more widely used concepts in the conventional, one-class statistical analysis are discussed and generalized to the multiclass case. A summary of the formal results of the nested-doorway model, obtained within Feshbach's projection operator theory, is given. (Author) [pt

  4. Densities of accessible final states for multi-step compound reactions

    International Nuclear Information System (INIS)

    Maoming De; Guo Hua

    1993-01-01

    The densities of accessible final states for calculations of multi-step compound reactions are derived. The Pauli exclusion principle is taken into account in the calculations. The results are compared with a previous author's results and the effect of the Pauli exclusion principle is investigated. (Author)

  5. Significantly enhanced memory effect in metallic glass by multistep training

    Science.gov (United States)

    Li, M. X.; Luo, P.; Sun, Y. T.; Wen, P.; Bai, H. Y.; Liu, Y. H.; Wang, W. H.

    2017-11-01

    The state of metastable equilibrium glass can carry an imprint of the past and exhibit memory effect. As a hallmark of glassy dynamics, memory effect can affect glassy behavior as it evolves further upon time. Even though the physical picture of the memory effect has been well studied, it is unclear whether a glass can recall as many pieces of information as possible, and if so, how the glass will accordingly behave. We report that by fractionizing temperature interval, inserting multistep aging protocols, and optimizing the time of each temperature step, i.e., by imposing a multistep "training" on a prototypical Pd40Ni10Cu30P20 metallic glass, the memory of the trained glass can be significantly strengthened, marked by a pronounced augment in potential energy. These findings provide a new guide for regulating the energy state of glass by enhancing the nonequilibrium behaviors of the memory effect and offer an opportunity to develop a clearer physical picture of glassy dynamics.

  6. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    Science.gov (United States)

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and is continually under research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages compared with the two currently dominating strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.
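
    For readers unfamiliar with the optimizer itself, the outer loop of such a strategy is a standard particle swarm search over candidate configurations. Below is a minimal, generic PSO minimizer in Python; the sphere objective is a placeholder, whereas an actual PSO-MISMO implementation would instead score a candidate partition of the prediction horizon by the validation error of the resulting neural sub-models.

```python
import numpy as np

def pso_minimize(objective, dim, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()               # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Placeholder objective (sphere function) used only to show the search loop.
best, best_val = pso_minimize(lambda p: float(np.sum(p ** 2)), dim=3)
print(best, best_val)
```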

  7. Multistep Hybrid Iterations for Systems of Generalized Equilibria with Constraints of Several Problems

    Directory of Open Access Journals (Sweden)

    Lu-Chuan Ceng

    2014-01-01

    Full Text Available We first introduce and analyze one multistep iterative algorithm by hybrid shrinking projection method for finding a solution of the system of generalized equilibria with constraints of several problems: the generalized mixed equilibrium problem, finitely many variational inclusions, the minimization problem for a convex and continuously Fréchet differentiable functional, and the fixed-point problem of an asymptotically strict pseudocontractive mapping in the intermediate sense in a real Hilbert space. We prove strong convergence theorem for the iterative algorithm under suitable conditions. On the other hand, we also propose another multistep iterative algorithm involving no shrinking projection method and derive its weak convergence under mild assumptions.

  8. Strong Stability Preserving Explicit Linear Multistep Methods with Variable Step Size

    KAUST Repository

    Hadjimichael, Yiannis; Ketcheson, David I.; Loczi, Lajos; Németh, Adrián

    2016-01-01

    Strong stability preserving (SSP) methods are designed primarily for time integration of nonlinear hyperbolic PDEs, for which the permissible SSP step size varies from one step to the next. We develop the first SSP linear multistep methods (of order

  9. Baking Powder Actuated Centrifugo-Pneumatic Valving for Automation of Multi-Step Bioassays

    Directory of Open Access Journals (Sweden)

    David J. Kinahan

    2016-10-01

    Full Text Available We report a new flow control method for centrifugal microfluidic systems; CO2 is released from on-board stored baking powder upon contact with an ancillary liquid. The elevated pressure generated drives the sample into a dead-end pneumatic chamber sealed by a dissolvable film (DF. This liquid incursion wets and dissolves the DF, thus opening the valve. The activation pressure of the DF valve can be tuned by the geometry of the channel upstream of the DF membrane. Through pneumatic coupling with properly dimensioned disc architecture, we established serial cascading of valves, even at a constant spin rate. Similarly, we demonstrate sequential actuation of valves by dividing the disc into a number of distinct pneumatic chambers (separated by DF membranes. Opening these DFs, typically through arrival of a liquid to that location on a disc, permits pressurization of these chambers. This barrier-based scheme provides robust and strictly ordered valve actuation, which is demonstrated by the automation of a multi-step/multi-reagent DNA-based hybridization assay.

  10. Direct observation of multistep energy transfer in LHCII with fifth-order 3D electronic spectroscopy.

    Science.gov (United States)

    Zhang, Zhengyang; Lambrev, Petar H; Wells, Kym L; Garab, Győző; Tan, Howe-Siang

    2015-07-31

    During photosynthesis, sunlight is efficiently captured by light-harvesting complexes, and the excitation energy is then funneled towards the reaction centre. These photosynthetic excitation energy transfer (EET) pathways are complex and proceed in a multistep fashion. Ultrafast two-dimensional electronic spectroscopy (2DES) is an important tool to study EET processes in photosynthetic complexes. However, the multistep EET processes can only be indirectly inferred by correlating different cross peaks from a series of 2DES spectra. Here we directly observe multistep EET processes in LHCII using ultrafast fifth-order three-dimensional electronic spectroscopy (3DES). We measure cross peaks in 3DES spectra of LHCII that directly indicate energy transfer from excitons in the chlorophyll b (Chl b) manifold to the low-energy level chlorophyll a (Chl a) via mid-level Chl a energy states. This new spectroscopic technique allows scientists to move a step towards mapping the complete complex EET processes in photosynthetic systems.

  11. Description of bioremediation of soils using the model of a multistep system of microorganisms

    Science.gov (United States)

    Lubysheva, A. I.; Potashev, K. A.; Sofinskaya, O. A.

    2018-01-01

    The paper deals with the development of a mathematical model describing the interaction of a multi-step system of microorganisms in soil polluted with oil products. Each step in this system feeds on the metabolic products of the previous step. Six different models of the multi-step system are considered. The model coefficients were determined by minimizing the residual between calculated and experimental data, using an original algorithm based on the Levenberg-Marquardt method in combination with a Monte Carlo method for finding the initial approximation.
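
    The fitting procedure described (Levenberg-Marquardt refinement seeded by Monte Carlo sampling of initial guesses) can be sketched generically with SciPy. The model below is a made-up one-pollutant decay curve, not the authors' multi-step microbial model; it only illustrates the restart-plus-LM pattern under those stated assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical observations of a pollutant decaying as c(t) = c0 * exp(-k * t).
t_obs = np.linspace(0, 10, 20)
c_obs = 5.0 * np.exp(-0.4 * t_obs) + np.random.default_rng(1).normal(0, 0.05, t_obs.size)

def residuals(params):
    c0, k = params
    return c0 * np.exp(-k * t_obs) - c_obs

# Monte Carlo search for starting points, each refined by Levenberg-Marquardt.
rng = np.random.default_rng(2)
best = None
for _ in range(200):
    guess = rng.uniform([0.1, 0.01], [10.0, 2.0])        # random initial approximation
    fit = least_squares(residuals, guess, method="lm")   # LM local refinement
    if best is None or fit.cost < best.cost:
        best = fit

print("fitted c0, k:", best.x, "residual cost:", best.cost)
```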

  12. Multigeometry Nanoparticle Engineering via Kinetic Control through Multistep assembly

    Science.gov (United States)

    Chen, Yingchao; Wang, Xiaojun; Zhang, Ke; Zhang, Fuwu; Mays, Jimmy; Wooley, Karen; Pochan, Darrin

    2014-03-01

    Organization of block copolymers into complicated multicompartment (MCM) and multigeometry (MGM) nanostructures is of increasing interest. Multistep co-assembly methods providing kinetic control were used to produce complex nanoparticles that are not obtained via other assembly methods. Vesicle-cylinder, separate vesicle and cylinder, disk-cylinder, and mixed vesicle nanoparticles were constructed from binary blends of distinct diblock copolymers. Initially, the vesicle former polyacrylic acid-polyisoprene and the cylinder former polyacrylic acid-polystyrene, which share the same hydrophilic domain but have immiscible hydrophobic domains, were blended in THF. Secondly, diamine molecules were added to associate with the common hydrophilic PAA. Importantly, and lastly, by tuning the rate of addition of the selective, miscible solvent water, the unlike hydrophobic blocks are kinetically trapped into one particle and eventually nanophase separate to form multiple compartments and multigeometries. These effective bottom-up multistep assembly strategies can be applied to other binary/ternary blends, in which new vesicle-sphere, disk-disk and cylinder-cylinder MCM/MGM nanoparticles were produced. We are grateful for financial support from the National Science Foundation DMR-0906815 (D.J.P. and K.L.W.) and NIST METROLOGY POCHAN 2012.

  13. Multi-step rearrangement mechanism for acetyl cedrene to the hydrocarbon follower

    DEFF Research Database (Denmark)

    Paknikar, Shashikumar Keshav; Kamounah, Fadhil S.; Hansen, Poul Erik

    2017-01-01

    Conversion of acetyl cedrene (2) to its follower (3) using acetic anhydride and polyphosphoric acid involves a multi-step cationic molecular rearrangement, which is consistent with deuteriation and 1-13C labeling studies of acetyl cedrene. The key step involves cyclopropylcarbinyl cation-cyclopro...

  14. Automated segmentation and reconstruction of patient-specific cardiac anatomy and pathology from in vivo MRI

    International Nuclear Information System (INIS)

    Ringenberg, Jordan; Deo, Makarand; Devabhaktuni, Vijay; Filgueiras-Rama, David; Pizarro, Gonzalo; Ibañez, Borja; Berenfeld, Omer; Boyers, Pamela; Gold, Jeffrey

    2012-01-01

    This paper presents an automated method to segment left ventricle (LV) tissues from functional and delayed-enhancement (DE) cardiac magnetic resonance imaging (MRI) scans using a sequential multi-step approach. First, a region of interest (ROI) is computed to create a subvolume around the LV using morphological operations and image arithmetic. From the subvolume, the myocardial contours are automatically delineated using difference of Gaussians (DoG) filters and GSV snakes. These contours are used as a mask to identify pathological tissues, such as fibrosis or scar, within the DE-MRI. The presented automated technique is able to accurately delineate the myocardium and identify the pathological tissue in patient sets. The results were validated by two expert cardiologists, and in one set the automated results are quantitatively and qualitatively compared with expert manual delineation. Furthermore, the method is patient-specific, performed on an entire patient MRI series. Thus, in addition to providing a quick analysis of individual MRI scans, the fully automated segmentation method is used for effectively tagging regions in order to reconstruct computerized patient-specific 3D cardiac models. These models can then be used in electrophysiological studies and surgical strategy planning. (paper)
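
    Of the sequential steps listed above, the difference-of-Gaussians stage is the easiest to illustrate in isolation. The fragment below is a heavily simplified, assumed sketch: it computes a DoG response and a crude threshold mask on a synthetic ring image, and does not reproduce the morphological ROI construction, GSV snakes or DE-MRI scar tagging of the actual method.

```python
import numpy as np
from scipy import ndimage

# Synthetic stand-in for a short-axis cardiac slice: a bright ring on noise.
yy, xx = np.mgrid[0:128, 0:128]
r = np.hypot(yy - 64, xx - 64)
image = np.where((r > 20) & (r < 30), 1.0, 0.0)
image = image + np.random.default_rng(0).normal(0, 0.1, image.shape)

# Difference of Gaussians: a band-pass response emphasizing ring-like structure.
dog = ndimage.gaussian_filter(image, sigma=2) - ndimage.gaussian_filter(image, sigma=6)

# Crude mask from the DoG response; a real pipeline would refine this with snakes.
mask = dog > 0.5 * dog.max()
mask = ndimage.binary_closing(mask, structure=np.ones((3, 3)))  # simple morphological cleanup
print("candidate myocardium pixels:", int(mask.sum()))
```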

  15. Incremental Learning of Medical Data for Multi-Step Patient Health Classification

    DEFF Research Database (Denmark)

    Kranen, Philipp; Müller, Emmanuel; Assent, Ira

    2010-01-01

    of textile sensors, body sensors and preprocessing techniques as well as the integration and merging of sensor data in electronic health record systems. Emergency detection on multiple levels will show the benefits of multi-step classification and further enhance the scalability of emergency detection...

  16. A Fifth Order Hybrid Linear Multistep method For the Direct Solution ...

    African Journals Online (AJOL)

    A linear multistep hybrid method (LMHM) with continuous coefficients is considered and directly applied to solve third order initial and boundary value problems (IBVPs). The continuous method is used to obtain Multiple Finite Difference Methods (MFDMs) (each of order 5) which are combined as simultaneous numerical ...

  17. Bayesian emulation for optimization in multi-step portfolio decisions

    OpenAIRE

    Irie, Kaoru; West, Mike

    2016-01-01

    We discuss the Bayesian emulation approach to computational solution of multi-step portfolio studies in financial time series. "Bayesian emulation for decisions" involves mapping the technical structure of a decision analysis problem to that of Bayesian inference in a purely synthetic "emulating" statistical model. This provides access to standard posterior analytic, simulation and optimization methods that yield indirect solutions of the decision problem. We develop this in time series portf...

  18. KAPSIES: A program for the calculation of multi-step direct reaction cross sections

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1994-09-01

    We present a program for the calculation of continuum cross sections, spectra, angular distributions and analyzing powers according to various quantum-mechanical theories for statistical multi-step direct nuclear reactions. (orig.)

  19. A Multistep Synthesis Incorporating a Green Bromination of an Aromatic Ring

    Science.gov (United States)

    Cardinal, Pascal; Greer, Brandon; Luong, Horace; Tyagunova, Yevgeniya

    2012-01-01

    Electrophilic aromatic substitution is a fundamental topic taught in the undergraduate organic chemistry curriculum. A multistep synthesis that includes a safer and greener method for the bromination of an aromatic ring than traditional bromination methods is described. This experiment is multifaceted and can be used to teach students about…

  20. Multistep Synthesis of a Terphenyl Derivative Showcasing the Diels-Alder Reaction

    Science.gov (United States)

    Davie, Elizabeth A. Colby

    2015-01-01

    An adaptable multistep synthesis project designed for the culmination of a second-year organic chemistry laboratory course is described. The target compound is a terphenyl derivative that is an intermediate in the synthesis of compounds used in organic light-emitting devices. Students react a conjugated diene with dimethylacetylene dicarboxylate…

  1. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
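
    The central ingredient described above is a fitness function built from weighted, normalized discrepancies over several system response quantities (SRQs). The sketch below is a generic, hypothetical illustration of that idea with a toy GA loop; the simulator stand-in, SRQ names, weights and parameter ranges are invented placeholders, not the RELAP5 setup of this work.

```python
import random

# Hypothetical experimental SRQs and weights (placeholders, not measured values).
EXPERIMENT = {"max_flow_rate": 0.85, "oscillation_period": 12.0}
WEIGHTS = {"max_flow_rate": 1.0, "oscillation_period": 1.0}
PARAM_RANGES = [(0.5, 2.0), (0.0, 1.0), (10.0, 100.0)]   # uncertain input parameters

def run_simulation(params):
    # Stand-in for a system code run: returns predicted SRQs for a parameter set.
    a, b, c = params
    return {"max_flow_rate": 0.5 * a + 0.3 * b, "oscillation_period": 0.15 * c}

def fitness(params):
    # Weighted sum of normalized discrepancies (smaller is better).
    pred = run_simulation(params)
    return sum(WEIGHTS[k] * abs(pred[k] - EXPERIMENT[k]) / abs(EXPERIMENT[k])
               for k in EXPERIMENT)

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in PARAM_RANGES]

def mutate(ind, rate=0.3):
    return [random.uniform(lo, hi) if random.random() < rate else x
            for x, (lo, hi) in zip(ind, PARAM_RANGES)]

population = [random_individual() for _ in range(30)]
for _ in range(60):
    population.sort(key=fitness)
    parents = population[:10]                              # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]

print("calibrated parameters:", [round(x, 3) for x in min(population, key=fitness)])
```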

  2. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  3. Implementing a Structured Reporting Initiative Using a Collaborative Multistep Approach.

    Science.gov (United States)

    Goldberg-Stein, Shlomit; Walter, William R; Amis, E Stephen; Scheinfeld, Meir H

    To describe the successful implementation of a structured reporting initiative in a large urban academic radiology department. We describe our process, compromises, and top 10 lessons learned in overhauling traditional reporting practices and comprehensively implementing structured reporting at our institution. To achieve our goals, we took deliberate steps toward consensus building, undertook multistep template refinement, and achieved close collaboration with the technical staff, department coders, and hospital information technologists. Following institutional review board exemption, we audited radiologist compliance by evaluating 100 consecutive cases of 12 common examination types. Fisher exact test was applied to determine significance of association between trainee initial report drafting and template compliance. We produced and implemented structured reporting templates for 95% of all departmental computed tomography, magnetic resonance, and ultrasound examinations. Structured templates include specialized reports adhering to the American College of Radiology's Reporting and Data Systems (ACR's RADS) recommendations (eg, Lung-RADS and Li-RADS). We attained 94% radiologist compliance within 2 years, without any financial incentives. We provide a blueprint of how to successfully achieve structured reporting using a collaborative multistep approach. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. An Automated Defect Prediction Framework using Genetic Algorithms: A Validation of Empirical Studies

    Directory of Open Access Journals (Sweden)

    Juan Murillo-Morera

    2016-05-01

    Full Text Available Today, it is common for software projects to collect measurement data through development processes. With these data, defect prediction software can try to estimate the defect proneness of a software module, with the objective of assisting and guiding software practitioners. With timely and accurate defect predictions, practitioners can focus their limited testing resources on higher risk areas. This paper reports the results of three empirical studies that use an automated genetic defect prediction framework. This framework generates and compares different learning schemes (preprocessing + attribute selection + learning algorithm) and selects the best one using a genetic algorithm, with the objective of estimating the defect proneness of a software module. The first empirical study is a performance comparison of our framework with the most important framework in the literature. The second empirical study is a performance and runtime comparison between our framework and an exhaustive framework. The third empirical study is a sensitivity analysis, and is our main contribution in this paper. Performance of the software development defect prediction models (using AUC, Area Under the Curve) was validated using the NASA-MDP and PROMISE data sets. Seventeen data sets from NASA-MDP (13) and PROMISE (4) projects were analyzed running an NxM-fold cross-validation. A genetic algorithm was used to select the components of the learning schemes automatically, and to assess and report the results. Our results reported similar performance between frameworks. Our framework reported better runtime than the exhaustive framework. Finally, we reported the best configuration according to the sensitivity analysis.
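
    In this framework the genome is essentially a choice of (preprocessing, attribute selection, learning algorithm), scored by cross-validated AUC. The snippet below sketches that idea with scikit-learn components, synthetic data and a toy selection/mutation loop; the component lists, loop settings and data are illustrative assumptions, not the framework's actual configuration.

```python
import random
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=20, random_state=0)

PREPROCESSORS = [StandardScaler(), MinMaxScaler()]
SELECTORS = [SelectKBest(f_classif, k=5), SelectKBest(f_classif, k=10)]
LEARNERS = [LogisticRegression(max_iter=1000), GaussianNB(), DecisionTreeClassifier()]

def auc_of(scheme):
    # A learning scheme = preprocessing + attribute selection + learning algorithm.
    p, s, l = scheme
    pipe = Pipeline([("prep", PREPROCESSORS[p]), ("sel", SELECTORS[s]), ("clf", LEARNERS[l])])
    return cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()

def mutate(scheme):
    limits = (len(PREPROCESSORS), len(SELECTORS), len(LEARNERS))
    i = random.randrange(3)
    new = list(scheme)
    new[i] = random.randrange(limits[i])
    return tuple(new)

# Tiny evolutionary loop over scheme indices (a stand-in for the full GA).
population = [tuple(random.randrange(n) for n in (2, 2, 3)) for _ in range(6)]
for _ in range(5):
    ranked = sorted(population, key=auc_of, reverse=True)
    population = ranked[:3] + [mutate(random.choice(ranked[:3])) for _ in range(3)]

best = max(population, key=auc_of)
print("best scheme (prep, selector, learner):", best, "AUC:", round(auc_of(best), 3))
```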

  5. Progress in applying the FKK multistep reaction theory to intermediate-energy data evaluation

    International Nuclear Information System (INIS)

    Chadwick, M.B.; Young, P.G.

    1994-01-01

    Recent developments to the physics modeling in the FKK-GNASH code system are reviewed. We describe modifications to include a linking of multistep direct and multistep compound processes, which are important when the incident energy is less than about 30 MeV. A model for multiple preequilibrium emission is given, and compared with experimental measurements of proton reactions on 90 Zr at 160 MeV. We also give some preliminary observations concerning FKK calculations which use both normal and non-normal DWBA matrix elements. We describe the application of the FKK-GNASH code to a range of nuclear data applications, including intermediate energy reactions of importance in the accelerator transmutation of waste, and fast neutron and proton cancer radiation treatment. We outline areas where further work is needed for the accurate modeling of nuclear reactions using the FKK theory

  6. Controlled growth of silica-titania hybrid functional nanoparticles through a multistep microfluidic approach.

    Science.gov (United States)

    Shiba, K; Sugiyama, T; Takei, T; Yoshikawa, G

    2015-11-11

    Silica/titania-based functional nanoparticles were prepared through controlled nucleation of titania and subsequent encapsulation by silica through a multistep microfluidic approach, which was successfully applied to obtaining aminopropyl-functionalized silica/titania nanoparticles for a highly sensitive humidity sensor.

  7. Particle-hole state densities for statistical multi-step compound reactions

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1986-01-01

    An analytical relation is derived for the density of particle-hole bound states applying the equidistant-spacing approximation and the Darwin-Fowler statistical method. The Pauli exclusion principle as well as the finite depth of the potential well are taken into account. The set of densities needed for calculations of multi-step compound reactions is completed by deriving the densities of accessible final states for escape and damping. (orig.)

  8. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    Science.gov (United States)

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

    Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently for one-step prediction, that is, predicting one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and to uncertainty or error accumulation. The main existing approaches, iterative and independent, either use a one-step model recursively or treat the multi-step task as a set of independent models. They generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, the varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented to various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in their component AR models of various predicting horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over the existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
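
    To contrast with the iterated strategy, the independent (direct) strategy fits one model per forecasting horizon. The sketch below illustrates that strategy with plain linear regressors on a synthetic series; it is a generic illustration of the baseline the paper extends, not an implementation of the VLM mixture models themselves.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

LAGS, HORIZON = 8, 5
rng = np.random.default_rng(0)
series = np.sin(np.arange(400) / 15.0) + 0.1 * rng.standard_normal(400)

# Direct strategy: train a separate model for each step h = 1..HORIZON.
X = np.array([series[i:i + LAGS] for i in range(len(series) - LAGS - HORIZON)])
models = []
for h in range(1, HORIZON + 1):
    y_h = series[LAGS + h - 1 : len(series) - HORIZON + h - 1]   # target shifted by h
    models.append(LinearRegression().fit(X, y_h))

last_window = series[-LAGS:].reshape(1, -1)
forecasts = [m.predict(last_window)[0] for m in models]
print("direct multi-step forecasts:", np.round(forecasts, 3))
```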

  9. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    Science.gov (United States)

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  10. Roles of multi-step transfer in fusion process induced by heavy-ion reactions

    International Nuclear Information System (INIS)

    Imanishi, B.; Oertzen, W. von.

    1993-06-01

    In nucleus-nucleus collisions of the systems 12C+13C and 13C+16O-12C+17O, the effects of the multi-step transfers and inelastic excitations on the fusion cross sections are investigated in the framework of the coupled-reaction-channel (CRC) method. Strong CRC effects of the multi-step processes are observed. Namely, the valence neutron in 13C or 17O plays an important role in the enhancement of the fusion. The potential barrier is effectively lowered with the formation of a covalent molecule of the configuration 12C+n+12C or 12C+n+16O. In the analyses of the 12C+13C system, however, it is still required to introduce a core-core optical potential of lower barrier height in the state of positive total parity. This could be due to neck formation with the nucleons contained in the two core nuclei. (author)

  11. Impact of user influence on information multi-step communication in a micro-blog

    International Nuclear Information System (INIS)

    Wu Yue; Hu Yong; He Xiao-Hai; Deng Ken

    2014-01-01

    User influence is generally considered as one of the most critical factors that affect information cascading spreading. Based on this common assumption, this paper proposes a theoretical model to examine user influence on the information multi-step communication in a micro-blog. The multi-steps of information communication are divided into first-step and non-first-step, and user influence is classified into five dimensions. Actual data from the Sina micro-blog is collected to construct the model by means of an approach based on structural equations that uses the Partial Least Squares (PLS) technique. Our experimental results indicate that the dimensions of the number of fans and their authority significantly impact the information of first-step communication. Leader rank has a positive impact on both first-step and non-first-step communication. Moreover, global centrality and weight of friends are positively related to the information non-first-step communication, but authority is found to have much less relation to it

  12. Optimal Solution for VLSI Physical Design Automation Using Hybrid Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    I. Hameem Shanavas

    2014-01-01

    Full Text Available In the optimization of VLSI physical design, area minimization and interconnect length minimization are important objectives in the physical design automation of very large scale integration chips. Minimizing the area and interconnect length scales down the size of integrated chips. To meet this objective, it is necessary to find an optimal solution for physical design components like partitioning, floorplanning, placement, and routing. This work performs the optimization of benchmark circuits across these physical design components using a hierarchical approach based on evolutionary algorithms. The goals of minimizing delay in partitioning, silicon area in floorplanning, layout area in placement, and wirelength in routing also influence other criteria such as power, clock, speed, cost, and so forth. A hybrid evolutionary algorithm, which includes one or more local search steps within its evolutionary cycles, is applied to each of these phases to minimize area and interconnect length. This approach combines a genetic algorithm with simulated annealing in a hierarchical design to attain the objective. The hybrid approach can quickly produce optimal solutions for the popular benchmarks.
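
    The hybrid scheme described, a genetic algorithm whose cycles include a local search step, is often called a memetic algorithm. Below is a toy, hypothetical sketch of that structure on a generic permutation-cost problem standing in for a placement/wirelength objective; it is not the benchmark flow or cost model used in the paper.

```python
import math
import random

N = 12                                    # toy problem size, e.g. cells to place in a row
random.seed(0)
WEIGHTS = [[random.random() for _ in range(N)] for _ in range(N)]   # synthetic connectivity

def cost(perm):
    # Toy wirelength-like objective: connection weight times distance between slots.
    pos = {cell: slot for slot, cell in enumerate(perm)}
    return sum(WEIGHTS[i][j] * abs(pos[i] - pos[j]) for i in range(N) for j in range(i + 1, N))

def local_search(perm, temp=1.0, cooling=0.95, steps=60):
    # Simulated-annealing-style refinement applied inside each GA generation.
    best = cur = perm[:]
    for _ in range(steps):
        a, b = random.sample(range(N), 2)
        cand = cur[:]
        cand[a], cand[b] = cand[b], cand[a]            # swap two cells
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
            cur = cand
            if cost(cur) < cost(best):
                best = cur[:]
        temp *= cooling
    return best

def crossover(p1, p2):
    cut = random.randrange(1, N)                       # keeps each cell exactly once
    head = p1[:cut]
    return head + [c for c in p2 if c not in head]

population = [random.sample(range(N), N) for _ in range(16)]
for _ in range(20):
    population.sort(key=cost)
    parents = population[:6]
    children = [crossover(random.choice(parents), random.choice(parents)) for _ in range(10)]
    population = parents + [local_search(child) for child in children]   # hybrid (memetic) step

print("best toy placement cost:", round(cost(min(population, key=cost)), 3))
```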

  13. A Multistep Synthesis Featuring Classic Carbonyl Chemistry for the Advanced Organic Chemistry Laboratory

    Science.gov (United States)

    Duff, David B.; Abbe, Tyler G.; Goess, Brian C.

    2012-01-01

    A multistep synthesis of 5-isopropyl-1,3-cyclohexanedione is carried out from three commodity chemicals. The sequence involves an aldol condensation, Dieckmann-type annulation, ester hydrolysis, and decarboxylation. No purification is required until after the final step, at which point gravity column chromatography provides the desired product in…

  14. The value of some Corsican sub-populations for genetic association studies

    Directory of Open Access Journals (Sweden)

    Vona Giuseppe

    2008-07-01

    Full Text Available Background: Genetic isolates with a history of a small founder population, long-lasting isolation and population bottlenecks represent exceptional resources for the identification of disease genes. In these populations the disease allele reveals linkage disequilibrium (LD) with markers over significant genetic intervals, therefore facilitating disease locus identification. In a previous study we examined the LD extension in the Xq13 region in three Corsican sub-populations from the inner mountainous region of the island. On the basis of those previous results we have proposed a multistep procedure for carrying out studies aimed at the identification of genes involved in complex diseases in Corsica. A prerequisite for the proposed multi-step procedure was the presence of different degrees of LD on the island and a common genetic derivation of the different Corsican sub-populations. In order to evaluate the existence of these conditions, in the present paper we extended the analysis to the Corsican coastal populations. Methods: Samples were analyzed using seven dinucleotide microsatellite markers on chromosome Xq13-21: DXS983, DXS986, DXS8092, DXS8082, DXS1225, DXS8037 and DXS995, spanning approximately 4.0 cM (13.3 Mb). We have also investigated the distribution of the DXS1225-DXS8082 haplotype, which has recently been proposed as a good marker of population genetic history due to its low recombination rate. Results: The results obtained indicate a decrease of LD on the island from the central mountainous toward the coastal sub-populations. In addition, the analysis of the DXS1225-DXS8082 haplotype revealed: (1) the presence of a particular haplotype with high frequency; (2) the derivation from a common genetic pool of the sub-populations examined in the present study. Conclusion: These results indicate that the Corsican sub-populations are useful for the fine mapping of genes contributing to complex diseases.

  15. Comparison study of multistep forging and injection forging of automobile fasteners

    OpenAIRE

    Chen Senyong; Qin Yi

    2015-01-01

    In order to improve production efficiency, injection forging as a feasible approach was introduced to automobile fasteners production. In the study reported in this paper, two forging approaches, traditional multistep forging and injection forging, were analysed by using a finite element method. Using ABAQUS and DEFORM, some significant factors, namely, forging force, energy consumption, component accuracy and stress distribution in the die, were compared to explore the potential and challeng...

  16. Feshbach-Kerman-Koonin analysis of 93Nb reactions: P→Q transitions and reduced importance of multistep compound emission

    International Nuclear Information System (INIS)

    Chadwick, M.B.; Young, P.G.

    1993-01-01

    We have implemented multistep compound (MSC) and multistep direct (MSD) preequilibrium theories of Feshbach, Kerman, and Koonin (FKK) for the calculation of nucleon-induced reactions. Unlike most previous analyses, which have concentrated on just one of these multistep mechanisms, we consider both mechanisms as well as subsequent Hauser-Feshbach equilibrium emission, and describe the complete nucleon emission spectra and angular distributions quantum mechanically. We compare theoretical calculations of (n,n') and (n,p) reactions on 93Nb at energies of 14, 20, and 25.7 MeV with experimental data. Our analysis suggests that the FKK theory should be modified to allow transitions from the MSD to MSC preequilibrium chains, and shows MSC processes to be less important than previously thought. We find that the MSD mechanism dominates preequilibrium emission even for incident neutron energies as low as 14 MeV. A model to account for preequilibrium flux cascading from the MSD to MSC chain is presented, and we check its validity with a least-squares fit to data which establishes the experimentally observed partitioning between MSD and MSC

  17. Contractivity properties of a class of linear multistep methods for nonlinear neutral delay differential equations

    International Nuclear Information System (INIS)

    Wang Wansheng; Li Shoufu; Wang Wenqiang

    2009-01-01

    In this paper, we show that under identical conditions which guarantee the contractivity of the theoretical solutions of general nonlinear NDDEs, the numerical solutions obtained by a class of linear multistep methods are also contractive.
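
    For context (not part of the record), the class of methods referred to here is the standard family of k-step linear multistep methods, which for a problem y'(t) = f(t, y(t)) take the textbook form below; the contractivity question is whether the numerical solution inherits the non-expansive behaviour of the exact solution.

```latex
% General k-step linear multistep method with step size h (alpha_k != 0);
% the method is explicit when beta_k = 0 and implicit otherwise.
\sum_{j=0}^{k} \alpha_j \, y_{n+j} \;=\; h \sum_{j=0}^{k} \beta_j \, f\bigl(t_{n+j},\, y_{n+j}\bigr)
```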

  18. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO...... handler leading to the reduction of manual work, and increased quality and throughput....

  19. The average angular distribution of emitted particles in multi-step compound processes

    International Nuclear Information System (INIS)

    Bonetti, R.; Carlson, B.V.; Hussein, M.S.; Toledo, A.S. de

    1983-05-01

    A simple model for the differential cross-section that describes the angular distribution of emitted particles in heavy-ion induced multi-step compound reactions is constructed. It is suggested that a careful analysis of the deviations of the experimental data from the pure Hauser-Feshbach behaviour may shed light on the physical nature of the pre-compound, heavy-ion configuration. (Author) [pt

  20. Diffusion coefficients for periodically induced multi-step persistent walks on regular lattices

    International Nuclear Information System (INIS)

    Gilbert, Thomas; Sanders, David P

    2012-01-01

    We present a generalization of our formalism for the computation of diffusion coefficients of multi-step persistent random walks on regular lattices to walks which include zero-displacement states. This situation is especially relevant to systems where tracer particles move across potential barriers as a result of the action of a periodic forcing whose period sets the timescale between transitions. (paper)
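
    A diffusion coefficient like the ones obtained analytically here can also be estimated numerically from the mean-squared displacement of simulated walks. The snippet below does this for a simple persistent walk on a square lattice that may also remain in place with some probability; it is a generic toy model chosen for illustration, not the specific periodically forced dynamics treated in the paper.

```python
import random

STEPS, WALKERS = 2000, 500
P_STAY, P_PERSIST = 0.2, 0.6          # probability to pause, or to repeat the last move
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def squared_displacement():
    x = y = 0
    last = random.choice(MOVES)
    for _ in range(STEPS):
        u = random.random()
        if u < P_STAY:
            continue                   # zero-displacement state
        if u < P_STAY + P_PERSIST:
            dx, dy = last              # persistent step: repeat the previous direction
        else:
            dx, dy = random.choice(MOVES)
        x, y = x + dx, y + dy
        last = (dx, dy)
    return x * x + y * y

msd = sum(squared_displacement() for _ in range(WALKERS)) / WALKERS
# On a 2D lattice, MSD ~ 4 D t for large t, so D is estimated as MSD / (4 t).
print("estimated diffusion coefficient:", msd / (4 * STEPS))
```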

  1. Multi-step carboxymethylation of kappa-Carrageenan

    International Nuclear Information System (INIS)

    Aranilla, Charito Tranquilan; Nagasawa, Naotsugu; Bayquen, Aristea V.

    2008-01-01

    Many polysaccharide derivatives have been prepared by carboxymethylation reactions in order to increase the range of potential applications of these natural polymers in the chemical, food, pharmaceutical and cosmetic industries. Carboxymethylation of kappa-carrageenan was attempted for the first time to synthesize derivatives with various degrees of substitution. A multistep carboxymethylation was performed under heterogeneous reaction conditions, in an isopropyl alcohol/water slurry medium, with aqueous sodium hydroxide solution for activation and monochloroacetic acid for etherification. The derivatives obtained had average degrees of substitution from 1.20 to 1.92, as determined by potentiometric back-titration. Chemical and structural characterization were accomplished by gel permeation chromatography, elemental analysis, FT-IR spectroscopy, and 1H and 13C NMR spectroscopy. The relative reactivity of the hydroxyl groups in the κ-carrageenan dimer unit proceeded in the order O-C2 G4S > O-C6 G4S > O-C2 AG at a ratio of 1:0.6:0.4. (author)

  2. A multistep continuous-flow system for rapid on-demand synthesis of receptor ligands

    DEFF Research Database (Denmark)

    Petersen, Trine P; Ritzén, Andreas; Ulven, Trond

    2009-01-01

    A multistep continuous-flow system for synthesis of receptor ligands by assembly of three variable building blocks in a single unbroken flow is described. The sequence consists of three reactions and two scavenger steps, where a Cbz-protected diamine is reacted with an isocyanate, deprotected, and reacted further with an alkylating agent.

  3. Observation of new levels for isotope separation in atomic uranium by multistep ionization

    International Nuclear Information System (INIS)

    Carlson, L.R.; Solarz, R.W.; Paisne, J.A.; Worden, E.F.; May, C.A.; Johnson, S.A.

    1975-01-01

Over 100 new odd parity levels useful for isotope separation of U(I) were observed between 32,660 and 34,165 cm⁻¹ using multistep photoionization. A tabulation of typical cross sections and radiative lifetimes for these states is given. The extension of this technique to mapping the spectra of other heavy atoms is discussed

  4. Formation of Stone-Wales edge: Multistep reconstruction and growth mechanisms of zigzag nanographene.

    Science.gov (United States)

    Dang, Jing-Shuang; Wang, Wei-Wei; Zheng, Jia-Jia; Nagase, Shigeru; Zhao, Xiang

    2017-10-05

Although the existence of the Stone-Wales (5-7) defect at the graphene edge has been clarified experimentally, theoretical study of the formation mechanism is still imperfect. In particular, the regioselectivity of multistep reactions at the edge (self-reconstruction and growth with foreign carbon feedstock) is essential to understand the kinetic behavior of reactive boundaries, but investigations are still lacking. Herein, by using finite-sized models, multistep reconstructions and carbon dimer additions of a bared zigzag edge are introduced using density functional theory calculations. The zigzag to 5-7 transformation is proved to be a site-selective process that generates alternating 5-7 pairs sequentially, and the first step, with the largest barrier, is suggested as the rate-determining step. Conversely, successive C₂ insertions on the active edge are calculated to elucidate the formation of the 5-7 edge during graphene growth. A metastable intermediate with a triple sequentially fused pentagon fragment is proved to be the key structure for 5-7 edge formation. © 2017 Wiley Periodicals, Inc.

  5. Methods and Tools for the Analysis, Verification and Synthesis of Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan

    2017-01-01

This usually requires simulating the mathematical models of these genetic circuits and perceiving whether or not the circuit behaves appropriately. Furthermore, synthetic biology utilizes the concepts from electronic design automation (EDA) of abstraction and automated construction to generate genetic circuits...... that the proposed approach is effective in determining the variation in the behavior of genetic circuits when the circuit's parameters are changed. In addition, the thesis also proposes a synthesis and technology mapping tool, called GeneTech, for genetic circuits. It allows users to construct a genetic...... important design characteristics. This thesis also introduces an automated approach to analyze the behavior of genetic logic circuits from simulation data. With this capability, the boolean logic of complex genetic circuits can be analyzed and/or verified automatically. It is also shown in this thesis...

  6. A novel multistep method for chondroitin sulphate immobilization and its interaction with fibroblast cells

    Energy Technology Data Exchange (ETDEWEB)

    Ozaltin, Kadir; Lehocký, Marián, E-mail: lehocky@post.cz; Kuceková, Zdenka; Humpolíček, Petr; Sáha, Petr

    2017-01-01

Polymeric biomaterials are widely used in medical applications owing to their low cost, processability and sufficient toughness. Surface modification by creating a thin film of bioactive agents is a promising technique to enhance cellular interactions, regulate protein adsorption and/or avoid bacterial infections. Polyethylene is one of the most widely used polymeric biomaterials, but its hydrophobic nature impedes further chemical modification. Plasma treatment is a unique method to increase its hydrophilicity by incorporating hydrophilic oxidative functional groups and tailoring the surface by physical etching. Furthermore, grafting polymer brushes of amine-group-containing monomers onto the functionalized surface leads to strongly immobilized bioactive agents in the final step. Chondroitin sulphate is a natural polysaccharide mainly found in connective cartilage tissue, and in this study it was used as a bioactive agent immobilized onto the polyethylene surface by a multistep method. - Highlights: • Attachment of chondroitin sulfate to polyethylene. • A robust way to modify surfaces using a multistep approach. • The modified surfaces showed improved proliferation of mouse primary fibroblast cells.

  7. In core monitor having multi-step seals

    International Nuclear Information System (INIS)

    Kasai, Makoto; Ono, Susumu.

    1976-01-01

Purpose: To completely prevent a sensor gas sealed in a pipe from leaking in an in-core neutron detector for use with a BWR type reactor. Constitution: In an in-core monitor fabricated by disposing inner and outer electrodes in a housing, forming a layer of neutron conversion material on the outer electrode, filling an ionizing gas within the space between the layer and the inner electrode and, thereafter, attaching an insulation cable and an exhaust pipe respectively by way of insulators to both ends of the housing, the exhaust pipe is sealed in two steps through pressure bonding using a multi-stepped pincher tool having two pressure bonding bits of a step shape, and the outer sealing portion is further welded. The sensor gas sealed in the pipe can thus be prevented from leaking upon pressure bonding and welding. (Horiuchi, T.)

  8. In core monitor having multi-step seals

    Energy Technology Data Exchange (ETDEWEB)

    Kasai, M; Ono, S

    1976-12-09

A method to completely prevent a sensor gas sealed in a pipe from leaking in an in-core neutron detector for use with a BWR type reactor is described. In an in-core monitor fabricated by disposing inner and outer electrodes in a housing, forming a layer of neutron conversion material on the outer electrode, filling an ionizing gas within the space between the layer and the inner electrode and, thereafter, attaching an insulation cable and an exhaust pipe respectively by way of insulators to both ends of the housing, the exhaust pipe is sealed in two steps through pressure bonding using a multi-stepped pincher tool having two pressure bonding bits of a step shape, and the outer sealing portion is further welded. The sensor gas sealed in the pipe can thus be prevented from leaking upon pressure bonding and welding.

  9. Round-off error in long-term orbital integrations using multistep methods

    Science.gov (United States)

    Quinlan, Gerald D.

    1994-01-01

Techniques for reducing round-off error are compared by testing them on high-order Stormer and symmetric multistep methods. The best technique for most applications is to write the equation in summed, function-evaluation form and to store the coefficients as rational numbers. A larger error reduction can be achieved by writing the equation in backward-difference form and performing some of the additions in extended precision, but this entails a larger central processing unit (CPU) cost.
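    To make the "summed, function-evaluation form with rational coefficients" recommendation concrete, here is a minimal sketch using a fourth-order Adams-Bashforth step (a stand-in for the high-order Störmer and symmetric multistep methods tested in the paper); the coefficients are stored exactly as fractions and the step size multiplies the accumulated sum only once.

```python
from fractions import Fraction
import math

# Exact fourth-order Adams-Bashforth coefficients, stored as rational numbers.
AB4 = [Fraction(55, 24), Fraction(-59, 24), Fraction(37, 24), Fraction(-9, 24)]

def ab4_step(y, f_hist, h):
    """One AB4 step in summed, function-evaluation form.

    f_hist holds f(t_n), f(t_{n-1}), f(t_{n-2}), f(t_{n-3}), newest first.
    The weighted sum of function values is accumulated first and multiplied
    by the step size once at the end, which removes one source of rounding.
    """
    acc = sum(float(c) * fk for c, fk in zip(AB4, f_hist))
    return y + h * acc

# Toy check on dy/dt = -y with exact starting values (solution: exp(-t)).
h = 0.01
f_hist = [-math.exp(-t) for t in (3 * h, 2 * h, h, 0.0)]
y_next = ab4_step(math.exp(-3 * h), f_hist, h)
print(y_next, math.exp(-4 * h))
```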

  10. The multi-step prompt particle emission from fission fragments

    International Nuclear Information System (INIS)

    Zhivopistsev, A.; Oprea, C.; Oprea, I.

    2003-01-01

The purpose of this work is the study of non-equilibrium high-energy gamma emission from ²⁵²Cf within the framework of the formalism of statistical multi-step compound processes in nuclear reactions. A relation was found between the shape of the high-energy part of the gamma spectrum and different mechanisms of excitation of the fission fragments. Agreement with experimental data for different groups of fission fragments was obtained. The analysis of the experimental high-energy part of the gamma spectra yields information about the mechanism of excitation of fission fragments. The influence of dissipation of the deformation excess on intrinsic excitation of fission fragments was studied. (authors)

  11. Performance assessment of laboratory and field-scale multi-step passive treatment of iron-rich acid mine drainage for design improvement.

    Science.gov (United States)

    Rakotonimaro, Tsiverihasina V; Neculita, Carmen Mihaela; Bussière, Bruno; Genty, Thomas; Zagury, Gérald J

    2018-04-17

Multi-step passive systems for the treatment of iron-rich acid mine drainage (Fe-rich AMD) perform satisfactorily at the laboratory scale. However, their field-scale application has revealed dissimilarities in performance, particularly with respect to hydraulic parameters. In this study, the assessment of factors potentially responsible for the variations in performance of laboratory and field-scale multi-step systems was undertaken. Three laboratory multi-step treatment scenarios, involving a combination of dispersed alkaline substrate (DAS) units, anoxic dolomitic drains, and passive biochemical reactors (PBRs), were set up in 10.7-L columns. The field-scale treatment consisted of two PBRs separated by a wood ash (WA) reactor. The parameters identified as possibly influencing the performances of the laboratory and field-scale experiments were the following: AMD chemistry (electrical conductivity and Fe and SO₄²⁻ concentrations), flow rate (Q), and saturated hydraulic conductivity (k_sat). Based on these findings, the design of an efficient passive multi-step treatment system is suggested to consider the following: (1) Fe pretreatment, using materials with high k_sat and low HRT (if a PBR is to be used, the Fe load should be ...); (2) PBR/DAS filled with a mixture with at least 20% of neutralizing agent; (3) inclusion of Q and k_sat (> 10⁻³ cm/s) in the long-term prediction. Finally, mesocosm testing is strongly recommended prior to construction of full-scale systems for the treatment of Fe-rich AMD.

  12. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33...... of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping......

  13. Evaluation of accuracy in implant site preparation performed in single- or multi-step drilling procedures.

    Science.gov (United States)

    Marheineke, Nadine; Scherer, Uta; Rücker, Martin; von See, Constantin; Rahlf, Björn; Gellrich, Nils-Claudius; Stoetzer, Marcus

    2018-06-01

Dental implant failure and insufficient osseointegration are proven results of mechanical and thermal damage during the surgery process. We herein performed a comparative study of a less invasive single-step drilling preparation protocol and a conventional multiple drilling sequence. The accuracy of the drilling holes was precisely analyzed, and the influence of different levels of operator expertise and of additional drill template guidance was evaluated. Six experimental groups, deployed in an osseous study model, represented template-guided and freehand drilling actions in a stepwise drilling procedure in comparison to a single-drill protocol. Each experimental condition was studied through the drilling actions of three persons without surgical knowledge as well as three highly experienced oral surgeons. Drilling actions were performed and diameters were recorded with a precision measuring instrument. Less experienced operators were able to significantly increase drilling accuracy using a guiding template, especially when multi-step preparations were performed. Improved accuracy without template guidance was observed when experienced operators were executing the single-step versus the multi-step technique. Single-step drilling protocols have been shown to produce more accurate results than multi-step procedures. The outcome of either protocol can be further improved by the use of guiding templates. Operator experience can be a contributing factor. Single-step preparations are less invasive and promote osseointegration. Even highly experienced surgeons achieve higher levels of accuracy by combining this technique with template guidance. Template guidance hereby enables a reduction of hands-on time and side effects during surgery and leads to a more predictable clinical diameter.

  14. Multi-step direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.

    1992-07-01

In recent years a variety of statistical theories has been developed concerning multistep direct (MSD) nuclear reactions. In addition, dominant in applications is a whole class of semiclassical models that may be subsumed under the heading of 'generalized exciton models'; these are basically MSD-type extensions on top of compound-like concepts. In this report the relation between their underlying statistical MSD postulates is highlighted. A common framework is sketched that makes it possible to generate the various MSD theories by assigning statistical properties to different parts of the nuclear Hamiltonian. It is then shown that distinct forms of nuclear randomness are embodied in the mentioned theories. All these theories appear to be very similar at a qualitative level. In order to explain the high-energy tails and forward-peaked angular distributions typical for particles emitted in MSD reactions, it is imagined that the incident continuum particle stepwise loses its energy and direction in a sequence of collisions, thereby creating new particle-hole pairs in the target system. At each step emission may take place. The statistical aspect comes in because many continuum states are involved in the process. These are supposed to display chaotic behavior, the associated randomness assumption giving rise to important simplifications in the expression for MSD emission cross sections. This picture suggests that the mentioned MSD models can be interpreted as variants of essentially one and the same theory. 113 refs.; 25 figs.; 9 tabs

  15. Testing of multistep soil washing for radiocesium-contaminated soil containing plant matter

    International Nuclear Information System (INIS)

    Funakawa, Masafumi; Tagawa, Akihiro; Okuda, Nobuyasu

    2012-01-01

Decontamination work following radiocesium exposure requires a vast reduction in the amount of contaminated soil generated. The current study subjected 4 types of contaminated soil with different properties to multistep soil washing under the same conditions. This study also determined the effectiveness of radiocesium decontamination and the extent to which the amount of contaminated soil was reduced. In addition, the effectiveness of plant matter separation, adsorbent addition, and grinding as part of multistep soil washing was determined using the same contaminated soil. Results of testing indicated that the rate of radiocesium decontamination ranged from 73.6 to 89.2% and the recovery rate ranged from 51.5 to 84.2% for twice-treated soil, regardless of the soil properties or cesium level. Plant matter in soil had a high radiocesium level. However, there was little plant matter in our soil sample; therefore, plant matter separation had little effect on improving the rate of radiocesium decontamination of twice-treated soil. Soil surface grinding improved the rate of radiocesium decontamination of twice-treated soil. However, radiocesium in soil is tightly bound to soil minerals; thus, the addition of an adsorbent also failed to improve the rate of radiocesium decontamination. (author)

  16. Multistep Model of Cervical Cancer: Participation of miRNAs and Coding Genes

    Directory of Open Access Journals (Sweden)

    Angelica Judith Granados López

    2014-09-01

Full Text Available Aberrant miRNA expression is well recognized as an important step in the development of cancer. Close to 70 microRNAs (miRNAs) have been implicated in cervical cancer up to now; nevertheless, it is unknown if aberrant miRNA expression causes the onset of cervical cancer. One of the best ways to address this issue is through a multistep model of carcinogenesis. In the progression of cervical cancer there are three well-established steps to reach cancer that we used in the model proposed here. The first step of the model comprises the gene changes that occur in normal cells to be transformed into immortal cells (CIN 1), the second comprises immortal cell changes to tumorigenic cells (CIN 2), the third step includes cell changes to increase tumorigenic capacity (CIN 3), and the final step covers tumorigenic changes to carcinogenic cells. Altered miRNAs and their target genes are located in each one of the four steps of the multistep model of carcinogenesis. miRNA expression has shown discrepancies in different works; therefore, in this model we include miRNAs recording similar results in at least two studies. The present model is a useful insight into studying potential prognostic, diagnostic, and therapeutic miRNAs.

  17. Automated extraction of DNA from clothing

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin Benn; Nøhr Hansen, Thomas

    2011-01-01

    Presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. We have compared three automated DNA extraction methods based on magnetic beads with a manual method with the aim of reducing...

  18. Automated tetraploid genotype calling by hierarchical clustering

    Science.gov (United States)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
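    The abstract does not give the algorithm's details, so the following is only a toy illustration of dosage calling by hierarchical clustering of one marker's intensity ratios (simulated data, SciPy's Ward linkage); the published method is more sophisticated.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Simulated B/(A+B) intensity ratios for one marker: five dosage classes
# should sit near 0, 0.25, 0.5, 0.75 and 1 (data invented for the demo).
rng = np.random.default_rng(1)
true_dosage = rng.integers(0, 5, size=300)
theta = np.clip(true_dosage / 4 + rng.normal(0, 0.03, size=300), 0, 1)

# Hierarchical (Ward) clustering of the 1-D ratios, cut into at most 5 groups.
Z = linkage(theta.reshape(-1, 1), method="ward")
labels = fcluster(Z, t=5, criterion="maxclust")

# Map clusters to dosages 0..4 by ranking their mean ratios.
ids = np.unique(labels)
order = np.argsort([theta[labels == k].mean() for k in ids])
dosage_of = {k: d for d, k in enumerate(ids[order])}
called = np.array([dosage_of[k] for k in labels])
print("agreement with simulated dosage:", (called == true_dosage).mean())
```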

  19. A transition from using multi-step procedures to a fully integrated system for performing extracorporeal photopheresis: A comparison of costs and efficiencies.

    Science.gov (United States)

    Azar, Nabih; Leblond, Veronique; Ouzegdouh, Maya; Button, Paul

    2017-12-01

The Pitié Salpêtrière Hospital Hemobiotherapy Department, Paris, France, has been providing extracorporeal photopheresis (ECP) since November 2011, and started using the Therakos® CELLEX® fully integrated system in 2012. This report summarizes our single-center experience of transitioning from the use of multi-step ECP procedures to the fully integrated ECP system, considering the capacity and cost implications. The total number of ECP procedures performed 2011-2015 was derived from department records. The time taken to complete a single ECP treatment using a multi-step technique and the fully integrated system at our department was assessed. Resource costs (2014€) were obtained for materials and calculated for personnel time required. Time-driven activity-based costing methods were applied to provide a cost comparison. The number of ECP treatments per year increased from 225 (2012) to 727 (2015). The single multi-step procedure took 270 min compared to 120 min for the fully integrated system. The total calculated per-session cost of performing ECP using the multi-step procedure was greater than with the CELLEX® system (€1,429.37 and €1,264.70 per treatment, respectively). For hospitals considering a transition from multi-step procedures to fully integrated methods for ECP where cost may be a barrier, time-driven activity-based costing should be utilized to gain a more comprehensive understanding of the full benefit that such a transition offers. The example from our department confirmed that there were not just cost and time savings, but that the time efficiencies gained with CELLEX® allow for more patient treatments per year. © 2017 The Authors Journal of Clinical Apheresis Published by Wiley Periodicals, Inc.
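    Taking only the figures quoted in the abstract, a quick illustrative calculation of what the transition implies per year (not part of the study itself):

```python
# Figures reported in the abstract.
cost_multistep = 1429.37   # EUR per treatment, multi-step procedure
cost_integrated = 1264.70  # EUR per treatment, fully integrated system
time_multistep = 270       # minutes per treatment
time_integrated = 120      # minutes per treatment
treatments_2015 = 727

# Illustrative annual savings if all 2015 treatments used the integrated system
# instead of the multi-step procedure.
cost_saving = (cost_multistep - cost_integrated) * treatments_2015
time_saving_h = (time_multistep - time_integrated) * treatments_2015 / 60
print(f"~EUR {cost_saving:,.0f} and ~{time_saving_h:,.0f} hours saved per year")
```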

  20. Sequential and Multistep Substrate Interrogation Provides the Scaffold for Specificity in Human Flap Endonuclease 1

    KAUST Repository

    Sobhy, M.; Joudeh, L.; Huang, X.; Takahashi, Masateru; Hamdan, S.

    2013-01-01

Human flap endonuclease 1 (FEN1), one of the structure-specific 5' nucleases, is integral in replication, repair, and recombination of cellular DNA. The 5' nucleases share significant unifying features yet cleave diverse substrates at similar positions relative to 5' end junctions. Using single-molecule Förster resonance energy transfer, we find a multistep mechanism that verifies all substrate features before inducing the intermediary-DNA bending step that is believed to unify 5' nuclease mechanisms. This is achieved by coordinating threading of the 5' flap of a nick junction into the conserved capped-helical gateway, overseeing the active site, and bending by binding at the base of the junction. We propose that this sequential and multistep substrate recognition process allows different 5' nucleases to recognize different substrates and restrict the induction of DNA bending to the last common step. Such mechanisms would also ensure the protection of DNA junctions from nonspecific bending and cleavage. 2013 The Authors.

  2. Study of influence of gas mixture composition on the multistep avalanche chambers characteristics

    International Nuclear Information System (INIS)

    Abdushukurov, D.A.; Zanevskij, Yu.V.; Peshekhonov, V.D.

    1987-01-01

The influence of the concentration of organic quenchers on the operation of multistep avalanche chambers (MSAC) has been studied. An empirical dependence of the gas amplification factor of MSAC on the quencher concentration has been derived. Measures to increase the stability of MSAC operation are considered. To improve MSAC operation, argon + n-heptane, neon + methane and neon + argon + methane mixtures are suggested

  3. CGMIM: Automated text-mining of Online Mendelian Inheritance in Man (OMIM) to identify genetically-associated cancers and candidate genes

    Directory of Open Access Journals (Sweden)

    Jones Steven

    2005-03-01

Full Text Available Abstract. Background: Online Mendelian Inheritance in Man (OMIM) is a computerized database of information about genes and heritable traits in human populations, based on information reported in the scientific literature. Our objective was to establish an automated text-mining system for OMIM that will identify genetically-related cancers and cancer-related genes. We developed the computer program CGMIM to search for entries in OMIM that are related to one or more cancer types. We performed manual searches of OMIM to verify the program results. Results: In the OMIM database on September 30, 2004, CGMIM identified 1943 genes related to cancer. BRCA2 (OMIM *164757), BRAF (OMIM *164757) and CDKN2A (OMIM *600160) were each related to 14 types of cancer. There were 45 genes related to cancer of the esophagus, 121 genes related to cancer of the stomach, and 21 genes related to both. Analysis of CGMIM results indicates that fewer than three gene entries in OMIM should mention both, and the more than seven-fold discrepancy suggests cancers of the esophagus and stomach are more genetically related than the current literature suggests. Conclusion: CGMIM identifies genetically-related cancers and cancer-related genes. In several ways, cancers with shared genetic etiology are anticipated to lead to further etiologic hypotheses and advances regarding environmental agents. CGMIM results are posted monthly and the source code can be obtained free of charge from the BC Cancer Research Centre website http://www.bccrc.ca/ccr/CGMIM.
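    As a cartoon of the kind of keyword matching such a text-mining tool performs, the sketch below tallies which toy gene records mention which cancer terms; the records, gene names and vocabulary are invented, and CGMIM's real lexicon and parsing are far richer.

```python
import re
from collections import defaultdict

# Toy OMIM-like records: gene symbol -> free-text entry (invented examples).
records = {
    "GENE_A": "Somatic mutations reported in carcinoma of the stomach and esophagus.",
    "GENE_B": "Germline variants predispose to breast carcinoma.",
    "GENE_C": "No tumour association reported.",
}
# Hypothetical cancer-type vocabulary; a real lexicon would be much larger.
cancer_terms = {
    "stomach": r"\bstomach\b",
    "esophagus": r"\b(esophag|oesophag)\w*",
    "breast": r"\bbreast\b",
}

genes_by_cancer = defaultdict(set)
for gene, text in records.items():
    for cancer, pattern in cancer_terms.items():
        if re.search(pattern, text, flags=re.IGNORECASE):
            genes_by_cancer[cancer].add(gene)

for cancer, genes in sorted(genes_by_cancer.items()):
    print(cancer, sorted(genes))
```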

  4. Design Automation in Synthetic Biology.

    Science.gov (United States)

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

Design automation refers to a category of software tools for designing systems that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  5. Multi-step wind speed forecasting based on a hybrid forecasting architecture and an improved bat algorithm

    International Nuclear Information System (INIS)

    Xiao, Liye; Qian, Feng; Shao, Wei

    2017-01-01

    Highlights: • Propose a hybrid architecture based on a modified bat algorithm for multi-step wind speed forecasting. • Improve the accuracy of multi-step wind speed forecasting. • Modify bat algorithm with CG to improve optimized performance. - Abstract: As one of the most promising sustainable energy sources, wind energy plays an important role in energy development because of its cleanliness without causing pollution. Generally, wind speed forecasting, which has an essential influence on wind power systems, is regarded as a challenging task. Analyses based on single-step wind speed forecasting have been widely used, but their results are insufficient in ensuring the reliability and controllability of wind power systems. In this paper, a new forecasting architecture based on decomposing algorithms and modified neural networks is successfully developed for multi-step wind speed forecasting. Four different hybrid models are contained in this architecture, and to further improve the forecasting performance, a modified bat algorithm (BA) with the conjugate gradient (CG) method is developed to optimize the initial weights between layers and thresholds of the hidden layer of neural networks. To investigate the forecasting abilities of the four models, the wind speed data collected from four different wind power stations in Penglai, China, were used as a case study. The numerical experiments showed that the hybrid model including the singular spectrum analysis and general regression neural network with CG-BA (SSA-CG-BA-GRNN) achieved the most accurate forecasting results in one-step to three-step wind speed forecasting.
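    The SSA/GRNN/CG-BA hybrid itself is too large for a snippet, but the multi-step part — recursively feeding one-step predictions back as inputs until the three-step horizon is reached — can be sketched with any one-step regressor. The linear model, lag length and synthetic series below are placeholders, not the paper's setup.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def recursive_multistep_forecast(model, history, n_lags, horizon):
    """Roll a one-step model forward: each prediction is appended to the lag
    window and reused as an input, giving a 1- to `horizon`-step forecast."""
    window = list(history[-n_lags:])
    preds = []
    for _ in range(horizon):
        x = np.array(window[-n_lags:]).reshape(1, -1)
        y_hat = float(model.predict(x)[0])
        preds.append(y_hat)
        window.append(y_hat)
    return preds

# Toy wind-speed-like series and a placeholder one-step regressor.
rng = np.random.default_rng(2)
series = 8 + np.sin(np.arange(500) / 10) + rng.normal(0, 0.3, 500)
n_lags = 6
X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
y = series[n_lags:]
model = LinearRegression().fit(X, y)

print(recursive_multistep_forecast(model, series, n_lags, horizon=3))
```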

  6. Differential genetic regulation of motor activity and anxiety-related behaviors in mice using an automated home cage task.

    Science.gov (United States)

    Kas, Martien J H; de Mooij-van Malsen, Annetrude J G; Olivier, Berend; Spruijt, Berry M; van Ree, Jan M

    2008-08-01

    Traditional behavioral tests, such as the open field test, measure an animal's responsiveness to a novel environment. However, it is generally difficult to assess whether the behavioral response obtained from these tests relates to the expression level of motor activity and/or to avoidance of anxiogenic areas. Here, an automated home cage environment for mice was designed to obtain independent measures of motor activity levels and of sheltered feeding preference during three consecutive days. Chronic treatment with the anxiolytic drug chlordiazepoxide (5 and 10 mg/kg/day) in C57BL/6J mice reduced sheltered feeding preference without altering motor activity levels. Furthermore, two distinct chromosome substitution strains, derived from C57BL/6J (host strain) and A/J (donor strain) inbred strains, expressed either increased sheltering preference in females (chromosome 15) or reduced motor activity levels in females and males (chromosome 1) when compared to C57BL/6J. Longitudinal behavioral monitoring revealed that these phenotypic differences maintained after adaptation to the home cage. Thus, by using new automated behavioral phenotyping approaches, behavior can be dissociated into distinct behavioral domains (e.g., anxiety-related and motor activity domains) with different underlying genetic origin and pharmacological responsiveness.

  7. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)ₙ repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
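    One way to picture the deconvolution idea is as a small nonnegative linear inverse problem: observed band intensities are modeled as true allele amounts convolved with a stutter pattern, and the pattern is inverted. The stutter proportions and data below are invented, and the solver (nonnegative least squares) is only a stand-in for the authors' method.

```python
import numpy as np
from scipy.optimize import nnls

# Invented stutter pattern: each true allele contributes 100% at its own length,
# 12% one repeat shorter, 2% two repeats shorter.
stutter = np.array([0.02, 0.12, 1.00])

n_bins = 12
# Convolution matrix: column j is the stutter pattern of an allele in bin j.
A = np.zeros((n_bins, n_bins))
for j in range(n_bins):
    for k, s in enumerate(stutter):
        i = j - (len(stutter) - 1 - k)   # bands land at bins j, j-1, j-2
        if 0 <= i < n_bins:
            A[i, j] = s

# Two closely spaced heterozygous alleles (bins 6 and 7) with a little noise.
truth = np.zeros(n_bins)
truth[6] = truth[7] = 1.0
observed = A @ truth + np.random.default_rng(3).normal(0, 0.01, n_bins)

estimate, _ = nnls(A, observed)
print(np.round(estimate, 2))   # peaks should reappear at bins 6 and 7
```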

  8. Design of H2/H∞ RMPC for Boiler Superheated Steam Temperature Based on Memoryless Feedback Multistep Strategy

    Directory of Open Access Journals (Sweden)

    Pu Han

    2017-01-01

Full Text Available The collection of superheated steam temperature models of a thermal power plant under different loads can be approximated as a "multimodel" linear uncertain system. After transformation, the tracking system was obtained from the "multimodel" linear uncertain systems. For this uncertain tracking system, a mixed H2/H∞ robust model predictive control (HRMPC) based on a memoryless feedback multistep strategy is proposed. The multistep control strategy combines the advantages of predictive control rolling optimization with the ideas of memoryless feedback control. It could effectively reduce the number of controller optimization parameters and ensure closed-loop system stability while achieving acceptable control performance. Successful application to the superheated steam temperature system of a 300 MW thermal power plant verified the feasibility and effectiveness of the proposed HRMPC-P cascade controller design scheme.

  9. Optimization of a Multi-Step Procedure for Isolation of Chicken Bone Collagen

    OpenAIRE

    Cansu, Ümran; Boran, Gökhan

    2015-01-01

    Chicken bone is not adequately utilized despite its high nutritional value and protein content. Although not a common raw material, chicken bone can be used in many different ways besides manufacturing of collagen products. In this study, a multi-step procedure was optimized to isolate chicken bone collagen for higher yield and quality for manufacture of collagen products. The chemical composition of chicken bone was 2.9% nitrogen corresponding to about 15.6% protein, 9.5% fat, 14.7% mineral ...

  10. A multi-step electrochemical etching process for a three-dimensional micro probe array

    International Nuclear Information System (INIS)

    Kim, Yoonji; Youn, Sechan; Cho, Young-Ho; Park, HoJoon; Chang, Byeung Gyu; Oh, Yong Soo

    2011-01-01

We present a simple, fast, and cost-effective process for three-dimensional (3D) micro probe array fabrication using multi-step electrochemical metal foil etching. Compared to the previous electroplating (add-on) process, the present electrochemical (subtractive) process results in well-controlled material properties of the metallic microstructures. In the experimental study, we describe the single-step and multi-step electrochemical aluminum foil etching processes. In the single-step process, the depth etch rate and the bias etch rate of an aluminum foil have been measured as 1.50 ± 0.10 and 0.77 ± 0.03 µm min⁻¹, respectively. On the basis of the single-step process results, we have designed and performed the two-step electrochemical etching process for the 3D micro probe array fabrication. The fabricated 3D micro probe array shows vertical and lateral fabrication errors of 15.5 ± 5.8% and 3.3 ± 0.9%, respectively, with a surface roughness of 37.4 ± 9.6 nm. The contact force and the contact resistance of the 3D micro probe array have been measured to be 24.30 ± 0.98 mN and 2.27 ± 0.11 Ω, respectively, for an overdrive of 49.12 ± 1.25 µm.
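    The reported etch rates translate directly into first-order process estimates; for example, for a hypothetical 100 µm target depth (a number chosen only for illustration):

```python
depth_rate = 1.50   # um/min, depth etch rate from the abstract
bias_rate = 0.77    # um/min, lateral (bias) etch rate from the abstract

target_depth = 100.0                     # um, hypothetical feature depth
etch_time = target_depth / depth_rate    # minutes of single-step etching
undercut = bias_rate * etch_time         # lateral loss per side over that time

print(f"{etch_time:.1f} min of etching, ~{undercut:.1f} um undercut per side")
```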

  11. Automatic scanning of Cerenkov light photograms from a multistep avalanche chamber using a television digitizer

    International Nuclear Information System (INIS)

    Vascon, M.; Zanella, G.

    1980-01-01

A television digitizer and its application to the automatic scanning of Cerenkov images obtained using the multistep avalanche chamber in front of an optical spark chamber are described. The results are of interest for the automatic scanning of photographic plates of these events or for the on-line application of the television digitizer itself. (orig.)

  12. Genetic Learning of Fuzzy Expert Systems for Decision Support in the Automated Process of Wooden Boards Cutting

    Directory of Open Access Journals (Sweden)

    Yaroslav MATSYSHYN

    2014-03-01

Full Text Available Sawing solid wood (lumber, wooden boards) into blanks is an important technological operation, which has a significant influence on the efficiency of the woodworking industry as a whole. Selecting a rational variant of lumber cutting is a complex multicriteria problem with many stochastic factors, characterized by incomplete information and fuzzy attributes. Because of these properties, currently used automatic optimizing cross-cut saws do not always make rational use of the wood raw material. And since the optimization algorithms of these saws function as a "black box", their improvement is not possible. Therefore, the task of developing a new approach to optimal cross-cutting that takes into account the stochastic properties of wood as a material of biological origin is topical. Here we propose a new approach to the problem of optimal lumber cutting under uncertainty in lumber quantity and fuzziness in the lengths of defect-free areas. To account for these conditions, we applied methods of fuzzy set theory and used a genetic algorithm to simulate the process of human learning in the implementation of the technological operation. Thus, the rules for handling each successive defect-free area are defined in a fuzzy expert system that can be configured to perform specific production tasks using a genetic algorithm. The author's implementation of the genetic algorithm is used to set up the parameters of the fuzzy expert system. The performance of the developed system was verified on simulated and real-world data. Implementation of this approach will make it suitable for the control of automated or fully automatic optimizing cross-cutting of solid wood.
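    A heavily compressed sketch of the general idea — a genetic algorithm tuning a cutting-rule parameter against simulated defect-free segment lengths — is given below; the encoding, fitness function and data are placeholders rather than the author's fuzzy expert system.

```python
import random

random.seed(4)

# Invented "defect-free segment" lengths (mm) and order lengths to cut from them.
segments = [random.randint(200, 2000) for _ in range(200)]
order_lengths = [300, 450, 600]

def material_yield(threshold):
    """Toy decision rule standing in for one fuzzy-rule parameter: segments
    shorter than `threshold` are discarded, longer ones are cut greedily."""
    used = total = 0
    for seg in segments:
        total += seg
        if seg < threshold:
            continue
        rest = seg
        for length in sorted(order_lengths, reverse=True):
            used += (rest // length) * length
            rest %= length
    return used / total

# Minimal genetic algorithm over the single threshold parameter.
population = [random.uniform(100, 1000) for _ in range(20)]
for _ in range(30):
    population.sort(key=material_yield, reverse=True)
    parents = population[:10]
    children = [
        min(max((random.choice(parents) + random.choice(parents)) / 2
                + random.gauss(0, 20), 0), 2000)        # crossover + mutation
        for _ in range(10)
    ]
    population = parents + children

best = max(population, key=material_yield)
print("best threshold:", round(best, 1), "yield:", round(material_yield(best), 3))
```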

  13. Effect of administration route and dose of streptavidin or biotin on the tumor uptake of radioactivity in intraperitoneal tumor with multistep targeting

    International Nuclear Information System (INIS)

    Zhang Meili; Yao Zhengsheng; Sakahara, Harumi; Saga, Tsuneo; Nakamoto, Yuji; Sato, Noriko; Zhao Songji; Nakada, Hiroshi; Yamashina, Ikuo; Konishi, Junji

    1998-01-01

    The effect of the administration route and dose of streptavidin or biotin on the biodistribution of radioactivity in multistep targeting was studied in nude mice bearing intraperitoneal (IP) colon cancer xenograft. The multistep targeting included a two-step method using biotinylated antibody and radiolabeled streptavidin and a three-step method with radiolabeled biotin based on the two-step method. A monoclonal antibody, MLS128, which recognizes Tn antigen on mucin, was biotinylated and injected intravenously (IV) or IP in nude mice bearing human colon cancer LS180 IP xenografts for pretargeting. In the two-step method, IP-injected streptavidin showed a higher tumor uptake and tumor-to-nontumor ratios than IV-injected streptavidin regardless of administration route of pretargeting. The tumor uptake of radiolabeled streptavidin was increased with a high dose of biotinylated antibody pretargeting, but decreased with an increasing dose of streptavidin. In the three-step targeting, IP injection also gave a higher tumor uptake of radiolabeled biotin than IV injection. In conclusion, IP administration of radiolabeled streptavidin or biotin resulted in more efficient IP tumor targeting with the multistep methods

  14. Genetic algorithms in teaching artificial intelligence (automated generation of specific algebras)

    Science.gov (United States)

    Habiballa, Hashim; Jendryscik, Radek

    2017-11-01

The problem of teaching essential Artificial Intelligence (AI) methods is an important task for an educator in the branch of soft computing. The key focus is often given to a proper understanding of the principle of AI methods in two essential points: why we use soft-computing methods at all, and how we apply these methods to generate reasonable results in sensible time. We present one interesting problem solved in non-educational research concerning the automated generation of specific algebras in a huge search space. We emphasize the above-mentioned points through an educational case study of an interesting problem in the automated generation of specific algebras.

  15. Effective interactions for description of multistep processes

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Stetcu, I.; Avrigeanu, V.; Antonov, A.N.; Lenske, H.

    2000-01-01

The reliability of realistic M3Y effective NN interactions for describing multistep direct (MSD) processes is proved by analysing the corresponding real optical potentials. This is done in order to overcome the uncertainties of the effective NN-interaction strength V₀ obtained by direct fit to the experimental data. The microscopic potential for nucleon-nucleus scattering at energies lower than 100 MeV has been calculated by using nucleonic and mesonic form factors. It has been analysed through (i) a comparison with phenomenological optical potentials, and (ii) its use for the description of nucleon elastic scattering angular distributions. It follows that the strongly simplified model interactions usually involved within MSD reaction theory, e.g. the 1 fm range Yukawa (1Y) term, neglect important dynamical details of such processes. A 1Y-equivalent V₀ strength of a realistic effective NN interaction is determined from the corresponding optical-potential volume integrals and used within Feshbach-Kerman-Koonin theory calculations, with the final goal of MSD studies without any V₀ free parameter. (authors)

  16. Low-loss ultracompact optical power splitter using a multistep structure.

    Science.gov (United States)

    Huang, Zhe; Chan, Hau Ping; Afsar Uddin, Mohammad

    2010-04-01

    We propose a low-loss ultracompact optical power splitter for broadband passive optical network applications. The design is based on a multistep structure involving a two-material (core/cladding) system. The performance of the proposed device was evaluated through the three-dimensional finite-difference beam propagation method. By using the proposed design, an excess loss of 0.4 dB was achieved at a full branching angle of 24 degrees. The wavelength-dependent loss was found to be less than 0.3 dB, and the polarization-dependent loss was less than 0.05 dB from O to L bands. The device offers the potential of being mass-produced using low-cost polymer-based embossing techniques.

  17. Quantummechanical multi-step direct models for nuclear data applications

    International Nuclear Information System (INIS)

    Koning, A.J.

    1992-10-01

    Various multi-step direct models have been derived and compared on a theoretical level. Subsequently, these models have been implemented in the computer code system KAPSIES, enabling a consistent comparison on the basis of the same set of nuclear parameters and same set of numerical techniques. Continuum cross sections in the energy region between 10 and several hundreds of MeV have successfully been analysed. Both angular distributions and energy spectra can be predicted in an essentially parameter-free manner. It is demonstrated that the quantum-mechanical MSD models (in particular the FKK model) give an improved prediction of pre-equilibrium angular distributions as compared to the experiment-based systematics of Kalbach. This makes KAPSIES a reliable tool for nuclear data applications in the afore-mentioned energy region. (author). 10 refs., 2 figs

  18. Multi-Step Deep Reactive Ion Etching Fabrication Process for Silicon-Based Terahertz Components

    Science.gov (United States)

    Jung-Kubiak, Cecile (Inventor); Reck, Theodore (Inventor); Chattopadhyay, Goutam (Inventor); Perez, Jose Vicente Siles (Inventor); Lin, Robert H. (Inventor); Mehdi, Imran (Inventor); Lee, Choonsup (Inventor); Cooper, Ken B. (Inventor); Peralta, Alejandro (Inventor)

    2016-01-01

A multi-step silicon etching process has been developed to fabricate silicon-based terahertz (THz) waveguide components. This technique provides precise dimensional control across multiple etch depths with batch processing capabilities. Nonlinear and passive components such as mixers and multipliers, waveguides, hybrids, OMTs and twists have been fabricated and integrated into a small silicon package. This fabrication technique enables a wafer-stacking architecture to provide ultra-compact multi-pixel receiver front-ends in the THz range.

  19. Multistep translation and cultural adaptation of the Penn acoustic neuroma quality-of-life scale for German-speaking patients.

    Science.gov (United States)

    Kristin, Julia; Glaas, Marcel Fabian; Stenin, Igor; Albrecht, Angelika; Klenzner, Thomas; Schipper, Jörg; Eysel-Gosepath, Katrin

    2017-11-01

    Monitoring the health-related quality of life (HRQOL) for patients with vestibular schwannoma (VS) has garnered increasing interest. In German-speaking countries, there is no disease-specific questionnaire available similar to the "Penn Acoustic Neuroma Quality-of-life Scale" (PANQOL). We translated the PANQOL for German-speaking patients based on a multistep protocol that included not only a forward-backward translation but also linguistic and sociocultural adaptations. The process consists of translation, synthesis, back translation, review by an expert committee, administration of the prefinal version to our patients, submission and appraisal of all written documents by our research team. The required multidisciplinary team for translation comprised head and neck surgeons, language professionals (German and English), a professional translator, and bilingual participants. A total of 123 patients with VS underwent microsurgical procedures via different approaches at our clinic between January 2007 and January 2017. Among these, 72 patients who underwent the translabyrinthine approach participated in the testing of the German-translated PANQOL. The first German version of the PANQOL questionnaire was created by a multistep translation process. The responses indicate that the questionnaire is simple to administer and applicable to our patients. The use of a multistep process to translate quality-of-life questionnaires is complex and time-consuming. However, this process was performed properly and resulted in a version of the PANQOL for assessing the quality of life of German-speaking patients with VS.

  20. Global description of (n,p) - and (n,2n) - activation cross sections within statistical multistep theory

    International Nuclear Information System (INIS)

    Kalka, H.; Torjman, M.; Seeliger, D.; Lopez, R.

    1989-07-01

    A unique description of (n,p) and (n,2n) activation cross sections as well as emission spectra is proposed within a pure multistep approach. Calculations are presented for 8 nuclei (A=47...65) in the incident energy range from zero up to 20 MeV. (author). 42 refs, 5 figs, 1 tab

  1. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.
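    A typical batch-processing pattern is to loop one analysis-options (.mao) file, exported with MEGA-Proto, over many alignments. The sketch below assumes the megacc executable is on the PATH and that its flags are -a (analysis options), -d (data) and -o (output); these flag names are quoted from memory and should be verified against the current MEGA documentation.

```python
import subprocess
from pathlib import Path

mao_file = "distance_estimation.mao"      # analysis settings exported from MEGA-Proto
out_dir = Path("results")
out_dir.mkdir(exist_ok=True)

# Run the same analysis on every FASTA alignment in ./alignments
for fasta in sorted(Path("alignments").glob("*.fasta")):
    out_prefix = out_dir / fasta.stem
    # Flag names are an assumption -- check them with the megacc help output.
    subprocess.run(
        ["megacc", "-a", mao_file, "-d", str(fasta), "-o", str(out_prefix)],
        check=True,
    )
```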

  2. A Droplet Microfluidic Platform for Automating Genetic Engineering.

    Science.gov (United States)

    Gach, Philip C; Shih, Steve C C; Sustarich, Jess; Keasling, Jay D; Hillson, Nathan J; Adams, Paul D; Singh, Anup K

    2016-05-20

    We present a water-in-oil droplet microfluidic platform for transformation, culture and expression of recombinant proteins in multiple host organisms including bacteria, yeast and fungi. The platform consists of a hybrid digital microfluidic/channel-based droplet chip with integrated temperature control to allow complete automation and integration of plasmid addition, heat-shock transformation, addition of selection medium, culture, and protein expression. The microfluidic format permitted significant reduction in consumption (100-fold) of expensive reagents such as DNA and enzymes compared to the benchtop method. The chip contains a channel to continuously replenish oil to the culture chamber to provide a fresh supply of oxygen to the cells for long-term (∼5 days) cell culture. The flow channel also replenished oil lost to evaporation and increased the number of droplets that could be processed and cultured. The platform was validated by transforming several plasmids into Escherichia coli including plasmids containing genes for fluorescent proteins GFP, BFP and RFP; plasmids with selectable markers for ampicillin or kanamycin resistance; and a Golden Gate DNA assembly reaction. We also demonstrate the applicability of this platform for transformation in widely used eukaryotic organisms such as Saccharomyces cerevisiae and Aspergillus niger. Duration and temperatures of the microfluidic heat-shock procedures were optimized to yield transformation efficiencies comparable to those obtained by benchtop methods with a throughput up to 6 droplets/min. The proposed platform offers potential for automation of molecular biology experiments significantly reducing cost, time and variability while improving throughput.

  3. Application of multi-step direct reaction theory to 14 MeV neutron reaction, 3: (n,α)

    Energy Technology Data Exchange (ETDEWEB)

    Kumabe, I.; Matoba, M.; Fukuda, K. [Kyushu Univ., Fukuoka (Japan). Faculty of Engineering; Ikegami, H.; Muraoka, M [eds.

    1980-01-01

Multi-step direct-reaction theory proposed by Tamura et al. has been applied to continuous spectra of the 14 MeV (n,α) reaction with some modifications. Calculated results reproduce well the experimental energy and angular distributions of the 14 MeV (n,α) reactions.

  4. Theoretical studies of multistep processes. Isospin effects in nuclear scattering, and meson and baryon interactions in nuclear physics. Progress report

    International Nuclear Information System (INIS)

    Madsen, V.A.; Landau, R.H.

    1985-12-01

    Research on microscopic optical potentials, multistep processes, neutron-proton differences in nuclear vibrations, and exact calculations of Coulomb plus nuclear bound states of exotic systems is reported. 21 refs

  5. Journal of Genetics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Journal of Genetics. K. Kosalai. Articles written in Journal of Genetics. Volume 80, Issue 1, April 2001, pp. 9-16. An automated annotation tool for genomic DNA sequences using GeneScan and BLAST · Andrew M. Lynn, Chakresh Kumar Jain, K. Kosalai, Pranjan Barman, Nupur Thakur, Harish Batra, Alok ...

  6. Synthesis of 10-Ethyl Flavin: A Multistep Synthesis Organic Chemistry Laboratory Experiment for Upper-Division Undergraduate Students

    Science.gov (United States)

    Sichula, Vincent A.

    2015-01-01

    A multistep synthesis of 10-ethyl flavin was developed as an organic chemistry laboratory experiment for upper-division undergraduate students. Students synthesize 10-ethyl flavin as a bright yellow solid via a five-step sequence. The experiment introduces students to various hands-on experimental organic synthetic techniques, such as column…

  7. Self-Regulated Strategy Development Instruction for Teaching Multi-Step Equations to Middle School Students Struggling in Math

    Science.gov (United States)

    Cuenca-Carlino, Yojanna; Freeman-Green, Shaqwana; Stephenson, Grant W.; Hauth, Clara

    2016-01-01

    Six middle school students identified as having a specific learning disability or at risk for mathematical difficulties were taught how to solve multi-step equations by using the self-regulated strategy development (SRSD) model of instruction. A multiple-probe-across-pairs design was used to evaluate instructional effects. Instruction was provided…

  8. A Multistep Extending Truncation Method towards Model Construction of Infinite-State Markov Chains

    Directory of Open Access Journals (Sweden)

    Kemin Wang

    2014-01-01

Full Text Available The model checking of infinite-state continuous-time Markov chains will inevitably encounter the state explosion problem when constructing the CTMC model; our method is to obtain a truncated model of the infinite one. To obtain a truncation sufficient for model checking system properties expressed in Continuous Stochastic Logic, we propose a multistep extending advanced truncation method for model construction of CTMCs and implement it in the INFAMY model checker. The experimental results show that our method is effective.
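    The extending-truncation idea can be pictured on a simple infinite birth-death chain: keep enlarging the truncated generator until the probability mass that escapes past the boundary within the time horizon falls below a tolerance. The rates, tolerance and dense matrix-exponential check below are toy choices, not the INFAMY algorithm.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 2.0, 3.0     # toy birth and death rates of an infinite birth-death CTMC
t, tol = 1.0, 1e-6     # time horizon and acceptable truncation error

def leaked_mass(n_states, t):
    """Probability mass reaching the absorbing 'beyond the truncation' state
    by time t when starting in state 0 (dense matrix-exponential check)."""
    Q = np.zeros((n_states + 1, n_states + 1))   # last index = overflow state
    for i in range(n_states):
        Q[i, i + 1] += lam                       # birth; leaks to overflow at the edge
        if i > 0:
            Q[i, i - 1] += mu                    # death
        Q[i, i] = -Q[i].sum()
    p0 = np.zeros(n_states + 1)
    p0[0] = 1.0
    return (p0 @ expm(Q * t))[-1]

# Multi-step extension: enlarge the truncation until the leaked mass is small.
n = 8
while leaked_mass(n, t) > tol:
    n *= 2
print("truncation size:", n, "leaked mass:", leaked_mass(n, t))
```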

  9. Propagators for the Time-Dependent Kohn-Sham Equations: Multistep, Runge-Kutta, Exponential Runge-Kutta, and Commutator Free Magnus Methods.

    Science.gov (United States)

    Gómez Pueyo, Adrián; Marques, Miguel A L; Rubio, Angel; Castro, Alberto

    2018-05-09

We examine various integration schemes for the time-dependent Kohn-Sham equations. Contrary to the time-dependent Schrödinger equation, this set of equations is nonlinear, due to the dependence of the Hamiltonian on the electronic density. We discuss some of their exact properties, and in particular their symplectic structure. Four different families of propagators are considered, specifically the linear multistep, Runge-Kutta, exponential Runge-Kutta, and commutator-free Magnus schemes. These have been chosen because they have been largely ignored in the past for time-dependent electronic structure calculations. The performance is analyzed in terms of cost versus accuracy. The clear winner, in terms of robustness, simplicity, and efficiency, is a simplified version of a fourth-order commutator-free Magnus integrator. However, in some specific cases, other propagators, such as some implicit versions of the multistep methods, may be useful.
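    For orientation, one standard fourth-order commutator-free Magnus scheme (quoted from memory in its generic two-exponential form; the simplified variant preferred by the authors may differ in detail) advances the propagator over a step $\Delta t$ as

    $$ U(t+\Delta t, t) \approx \exp\!\big[-i\,\Delta t\,(\alpha_1 H_1 + \alpha_2 H_2)\big]\;\exp\!\big[-i\,\Delta t\,(\alpha_2 H_1 + \alpha_1 H_2)\big], \qquad \alpha_{1,2} = \tfrac{1}{4} \mp \tfrac{\sqrt{3}}{6}, $$

    where $H_{1,2}$ denote the Hamiltonian evaluated (in practice, extrapolated, since it depends on the density) at the Gauss-Legendre points $t + \big(\tfrac{1}{2} \mp \tfrac{\sqrt{3}}{6}\big)\Delta t$.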

  10. Automation of route identification and optimisation based on data-mining and chemical intuition.

    Science.gov (United States)

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
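    The network step of such an approach can be prototyped with an ordinary directed graph of compounds linked by literature reactions. The miniature graph below is an invented stand-in for the Reaxys-derived network (compound names and step yields are placeholders), used only to show how candidate routes can be enumerated and ranked.

```python
import networkx as nx

# Invented mini reaction network: nodes are compounds, edges are literature
# transformations annotated with a crude per-step score (e.g. reported yield).
G = nx.DiGraph()
G.add_edge("limonene", "intermediate_A", step_yield=0.80)
G.add_edge("intermediate_A", "intermediate_B", step_yield=0.65)
G.add_edge("intermediate_B", "paracetamol", step_yield=0.70)
G.add_edge("limonene", "intermediate_C", step_yield=0.90)
G.add_edge("intermediate_C", "paracetamol", step_yield=0.50)

# Rank candidate routes by overall (multiplicative) yield.
routes = []
for path in nx.all_simple_paths(G, "limonene", "paracetamol"):
    overall = 1.0
    for u, v in zip(path, path[1:]):
        overall *= G[u][v]["step_yield"]
    routes.append((overall, path))

for overall, path in sorted(routes, reverse=True):
    print(f"{overall:.2f}  " + " -> ".join(path))
```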

  11. Computerized tools in psychology: cross cultural and genetically informative studies of memory

    Directory of Open Access Journals (Sweden)

    Ismatullina V.

    2016-01-01

Full Text Available In this article we present computerized tools for psychological studies of memory. The importance of implementing computerized automated tools for psychological studies is discussed. It has been shown that these tools can be used both for cross-cultural and genetically informative studies. The validity of these tools for cross-cultural and genetically informative studies of memory can be seen as a first step toward using automated computerized tools for big data collection in psychology.

  12. Teaching children with autism spectrum disorder and other developmental disabilities to perform multistep requesting using an iPad.

    Science.gov (United States)

    Alzrayer, Nouf M; Banda, Devender R; Koul, Rajinder

    2017-06-01

    Many children with autism spectrum disorders (ASD) and/or developmental disabilities are unable to meet their daily communication needs with speech alone. These individuals are considered potential candidates for speech-generating devices (SGDs) and mobile technologies with AAC-specific applications. The purpose of this study was to determine the effectiveness of systematic instruction on teaching multistep requesting skills using an iPad loaded with Proloquo2Go to children with ASD and other developmental disabilities. The participants in this study were four children between the ages of 8 and 10 years diagnosed with ASD and/or other developmental disabilities. The results indicated that for these participants, the intervention was effective in increasing multistep requesting using the iPad. All participants were successful to varying degrees in navigating across pages and combining symbols to request preferred items. Additionally, the participants demonstrated generalization of newly acquired skills by requesting different preferred items and activities during the generalization probes. Results are discussed and implications for research and practice are presented.

  13. Multivariate statistical analysis of multi-step industrial processes

    DEFF Research Database (Denmark)

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized efficiently, even if this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying the process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analyse multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...
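
    As a rough illustration of the step-wise idea described above (not the Priority PLS Regression algorithm itself), the sketch below fits a PLS model to the variables from step 1, then deflates both the later-step block and the response by the resulting score vectors before fitting the next step, so each step is evaluated in the light of what earlier steps already explain. The data, block sizes, and component counts are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 100
X_step1 = rng.normal(size=(n, 5))   # variables measured in process step 1
X_step2 = rng.normal(size=(n, 4))   # variables measured in process step 2
y = (X_step1[:, 0] + 0.5 * X_step2[:, 1] + rng.normal(scale=0.1, size=n)).reshape(-1, 1)

def deflate(M, scores):
    """Remove from M the part explained by the given score vectors."""
    beta, *_ = np.linalg.lstsq(scores, M, rcond=None)
    return M - scores @ beta

# Step 1: ordinary PLS on the first block.
pls1 = PLSRegression(n_components=2).fit(X_step1, y)
t1 = pls1.transform(X_step1)        # score vectors of the step-1 model

# Step 2: adjust the later block and the response by the step-1 scores,
# then fit PLS on what remains unexplained.
X2_adj, y_adj = deflate(X_step2, t1), deflate(y, t1)
pls2 = PLSRegression(n_components=2).fit(X2_adj, y_adj)

print("step-1 R^2:", pls1.score(X_step1, y))
print("step-2 incremental R^2:", pls2.score(X2_adj, y_adj))
```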

  14. Fabrication of different pore shapes by multi-step etching technique in ion-irradiated PET membranes

    Science.gov (United States)

    Mo, D.; Liu, J. D.; Duan, J. L.; Yao, H. J.; Latif, H.; Cao, D. L.; Chen, Y. H.; Zhang, S. X.; Zhai, P. F.; Liu, J.

    2014-08-01

    A method for the fabrication of different pore shapes in polyethylene terephthalate (PET)-based track etched membranes (TEMs) is reported. A multi-step etching technique involving etchant variation and track annealing was applied to fabricate different pore shapes in PET membranes. PET foils of 12-μm thickness were irradiated with Bi ions (kinetic energy 9.5 MeV/u, fluence 10⁶ ions/cm²) at the Heavy Ion Research Facility (HIRFL, Lanzhou). The cross-sections of fundamental pore shapes (cylinder, cone, and double cone) were analyzed. Funnel-shaped and pencil-shaped pores were obtained using a two-step etching process. Track annealing was carried out in air at 180 °C for 120 min. After track annealing, the selectivity of the etching process decreased, which resulted in isotropic etching in subsequent etching steps. Rounded cylinder and rounded cone shapes were obtained by introducing a track-annealing step in the etching process. Cup and spherical funnel-shaped pores were fabricated using a three- and four-step etching process, respectively. The described multi-step etching technique provides a controllable method to fabricate new pore shapes in TEMs. Introduction of a variety of pore shapes may improve the separation properties of TEMs and enrich the series of TEM products.

  15. Purification of crude glycerol from transesterification reaction of palm oil using direct method and multistep method

    Science.gov (United States)

    Nasir, N. F.; Mirus, M. F.; Ismail, M.

    2017-09-01

    Crude glycerol produced from the transesterification reaction has limited usage if it does not undergo a purification process. It also contains excess methanol, catalyst and soap. Conventionally, purification of the crude glycerol involves high cost and complex processes. This study aimed to determine the effects of using different purification methods, namely a direct method (comprising ion exchange and methanol removal steps) and a multistep method (comprising neutralization, filtration, ion exchange and methanol removal steps). Two crude glycerol samples were investigated: the self-produced sample from the transesterification of palm oil and the sample obtained from a biodiesel plant. Samples were analysed using Fourier Transform Infrared Spectroscopy, Gas Chromatography and High Performance Liquid Chromatography. The results for both samples after purification showed that pure glycerol was successfully produced and fatty acid salts were eliminated. The results also indicated the absence of methanol in both samples after the purification process. In short, the combination of four purification steps contributed to a higher quality of glycerol. The multistep purification method gave a better result than the direct method, as the neutralization and filtration steps helped remove most of the excess salt, fatty acids and catalyst.

  16. Genetic instability model for cancer risk in A-bomb survivors

    International Nuclear Information System (INIS)

    Niwa, Ohtsura

    1998-01-01

    This review was written largely in opposition to Mendelsohn's reductionist model for cancer risk in A-bomb survivors, in the following chapters. Assumptions for the carcinogenic process: mutation of a cell into a cancer cell and its proliferation. Multi-step theory of carcinogenesis and the age of crisis: induction of cancer by the accumulation of cancer-related gene mutations, which is linear with time (age). Effect of an exogenous hit in the multi-step theory: radiation as an exogenous hit that damages DNA. Dose-effect relationship for cancer risk in the survivors and the problem of the latent period: for solid tumors, the dose-effect relationship is linear and no shortening of the latent period is observed. Considerations on cancer data for adulthood exposure / indirect effect model in radiation carcinogenesis: solid cancer data supporting the indirect effect model. Possible mechanism for the radiation-induced long-term increase of the natural mutation frequency: genetic instability remaining in the irradiated cells, which forms the basis of the indirect effect model. Notes on considerations of carcinogenicity in exposed people / differences in carcinogenic mechanisms due to age. The author concludes that radiation-induced carcinogenesis is deeply related to natural carcinogenesis and that, particularly for solid cancers, it cannot be explained by the classic reductionist model. (K.H.)

  17. Automated scheduling and planning from theory to practice

    CERN Document Server

    Ozcan, Ender; Urquhart, Neil

    2013-01-01

    Solving scheduling problems has long presented a challenge for computer scientists and operations researchers. The field continues to expand as researchers and practitioners examine ever more challenging problems and develop automated methods capable of solving them. This book provides 11 case studies in automated scheduling, submitted by leading researchers from across the world. Each case study examines a challenging real-world problem by analysing the problem in detail before investigating how the problem may be solved using state of the art techniques. The areas covered include aircraft scheduling, microprocessor instruction scheduling, sports fixture scheduling, exam scheduling, personnel scheduling and production scheduling. Problem solving methodologies covered include exact as well as (meta)heuristic approaches, such as local search techniques, linear programming, genetic algorithms and ant colony optimisation. The field of automated scheduling has the potential to impact many aspects of our lives...

  18. Influence of multi-step heat treatments in creep age forming of 7075 aluminum alloy: Optimization for springback, strength and exfoliation corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Arabi Jeshvaghani, R.; Zohdi, H. [Department of Materials Engineering, Tarbiat Modares University, P.O. Box 14115-143, Tehran (Iran, Islamic Republic of); Shahverdi, H.R., E-mail: shahverdi@modares.ac.ir [Department of Materials Engineering, Tarbiat Modares University, P.O. Box 14115-143, Tehran (Iran, Islamic Republic of); Bozorg, M. [Department of Materials Engineering, Tarbiat Modares University, P.O. Box 14115-143, Tehran (Iran, Islamic Republic of); Hadavi, S.M.M. [School of Materials Science and Engineering, MA University of Technology, P.O. Box 16765-3197, Tehran (Iran, Islamic Republic of)

    2012-11-15

    A multi-step heat treatment comprising high-temperature forming (150 °C/24 h plus 190 °C for several minutes) and subsequent low-temperature forming (120 °C for 24 h) is developed in creep age forming of 7075 aluminum alloy to decrease springback and exfoliation corrosion susceptibility without reduction in tensile properties. The results show that the multi-step heat treatment gives low springback and the best combination of exfoliation corrosion resistance and tensile strength. The lower springback is attributed to dislocation recovery and greater stress relaxation at the higher temperature. Transmission electron microscopy observations show that corrosion resistance is improved due to the enlargement in the size and the inter-particle distance of the grain boundary precipitates. Furthermore, the achievement of high strength is related to the uniform distribution of ultrafine η′ precipitates within grains. - Highlights: • Creep age forming developed for manufacturing of aircraft wing panels from aluminum alloy. • A good combination of properties with minimal springback is required in this component. • This requirement can be improved through appropriate heat treatments. • Multi-step cycles developed in creep age forming of AA7075 for improving springback and properties. • Results indicate simultaneous enhancement of properties and shape accuracy (lower springback).

  19. MONTEBURNS 2.0: An Automated, Multi-Step Monte Carlo Burnup Code System

    International Nuclear Information System (INIS)

    2007-01-01

    A - Description of program or function: MONTEBURNS Version 2 calculates coupled neutronic/isotopic results for nuclear systems and produces a large number of criticality and burnup results based on various material feed/removal specifications, power(s), and time intervals. MONTEBURNS is a fully automated tool that links the LANL MCNP Monte Carlo transport code with a radioactive decay and burnup code. Highlights on changes to Version 2 are listed in the transmittal letter. Along with other minor improvements in MONTEBURNS Version 2, the option was added to use CINDER90 instead of ORIGEN2 as the depletion/decay part of the system. CINDER90 is a multi-group depletion code developed at LANL and is not currently available from RSICC, nor from the NEA Databank. This MONTEBURNS release was tested with various combinations of CCC-715/MCNPX 2.4.0, CCC-710/MCNP5, CCC-700/MCNP4C, CCC-371/ORIGEN2.2, ORIGEN2.1 and CINDER90. Perl is required software and is not included in this distribution. MCNP, ORIGEN2, and CINDER90 are not included. The following changes have been made: 1) An increase in the number of removal group information that must be provided for each material in each step in the feed input file. 2) The capability to use CINDER90 instead of ORIGEN2.1 as the depletion/decay part of the code. 3) ORIGEN2.2 can also be used instead of ORIGEN2.1 in Monteburns. 4) The correction of including the capture cross sections to metastable as well as ground states if applicable for an isotope (i.e. Am-241 and Am-243 in particular). 5) The ability to use a MCNP input file that has a title card starting with 'm' (this was a bug in the first version of Monteburns). 6) A decrease in run time for cases involving decay-only steps (power of 0.0). Monteburns does not run MCNP to calculate cross sections for a step unless it is an irradiation step. 7) The ability to change the cross section libraries used each step. If different cross section libraries are desired for multiple steps. 8

  20. Comparison of manual and automated AmpliSeq™ workflows in the typing of a Somali population with the Precision ID Identity Panel

    DEFF Research Database (Denmark)

    van der Heijden, Suzanne; de Oliveira, Susanne Juel; Kampmann, Marie-Louise

    2017-01-01

    …to compare different AmpliSeq™ workflows to investigate the possibility of using automated library building in forensic genetic case work. In order to do so, the SNP typing of the Somalis was performed using three different workflows: 1) manual library building and sequencing on the Ion PGM™, 2) automated library building using the Biomek(®)3000 and sequencing on the Ion PGM™, and 3) automated library building using the Ion Chef™ and sequencing on the Ion S5™. AmpliSeq™ workflows were compared based on coverage, locus balance, noise, and heterozygote balance. Overall, the Ion Chef™/Ion S5™ workflow… The Biomek(®)3000/Ion PGM™ workflow was found to perform similarly to the manual/Ion PGM™ workflow. This argues for the use of automated library building in forensic genetic case work. Automated library building decreases the workload of the laboratory staff and decreases the risk of pipetting…

  1. Natural genetic transformation in Acinetobacter sp. BD413 Biofilms: introducing natural genetic transformation as a tool for bioenhancement of biofilm reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickx, L

    2002-07-01

    This study focussed on the localization and quantification of natural genetic transformation using neutral and disadvantageous genes in monoculture biofilms to investigate gene transfer and expression of the transferred genes in the absence of a selective advantage. Data obtained by this investigation were regarded as initial steps for evaluating the applicability of adding catabolic traits into the indigenous bacterial community of biofilm reactors by in situ natural genetic transformation. Because Acinetobacter spp. strains are readily found in waste water treatment plants and because Acinetobacter sp. BD413 possesses a high level of competence, natural genetic transformation was investigated in monoculture Acinetobacter sp. BD413 biofilms. The genes used for transformation encoded the green fluorescent protein (GFP) and its variants. Monitoring of transformation events was performed with the use of automated confocal laser scanning microscopy (CLSM) and semi-automated digital image processing and analysis. (orig.)

  2. Multistep Hybrid Extragradient Method for Triple Hierarchical Variational Inequalities

    Directory of Open Access Journals (Sweden)

    Zhao-Rong Kong

    2013-01-01

    Full Text Available We consider a triple hierarchical variational inequality problem (THVIP, that is, a variational inequality problem defined over the set of solutions of another variational inequality problem which is defined over the intersection of the fixed point set of a strict pseudocontractive mapping and the solution set of the classical variational inequality problem. Moreover, we propose a multistep hybrid extragradient method to compute the approximate solutions of the THVIP and present the convergence analysis of the sequence generated by the proposed method. We also derive a solution method for solving a system of hierarchical variational inequalities (SHVI, that is, a system of variational inequalities defined over the intersection of the fixed point set of a strict pseudocontractive mapping and the solution set of the classical variational inequality problem. Under very mild conditions, it is proven that the sequence generated by the proposed method converges strongly to a unique solution of the SHVI.
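
    A compact way to read the nested structure described above is the following LaTeX sketch; the operator symbols and the exact monotonicity hypotheses are generic placeholders rather than the precise assumptions of the paper.

```latex
% Sketch of the nested structure (generic operators A, A_1, A_2 and a strict
% pseudocontraction S; here VI(C,A) also denotes the solution set of the VI).
\begin{align*}
\mathrm{VI}(C,A)&:\quad \text{find } x^{\ast}\in C \text{ such that }
  \langle A x^{\ast},\, x-x^{\ast}\rangle \ge 0 \ \ \forall x\in C,\\
\Omega &:= \operatorname{Fix}(S)\cap \mathrm{VI}(C,A),\\
\Xi &:= \bigl\{\bar{x}\in\Omega :\
  \langle A_{1}\bar{x},\, x-\bar{x}\rangle \ge 0 \ \ \forall x\in\Omega\bigr\},\\
\text{THVIP}&:\quad \text{find } \hat{x}\in\Xi \text{ such that }
  \langle A_{2}\hat{x},\, x-\hat{x}\rangle \ge 0 \ \ \forall x\in\Xi.
\end{align*}
```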

  3. Automated Image Analysis of HER2 Fluorescence In Situ Hybridization to Refine Definitions of Genetic Heterogeneity in Breast Cancer Tissue.

    Science.gov (United States)

    Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard; Laurinavicius, Arvydas

    2017-01-01

    Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions.
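
    The statistical core of the described pipeline can be sketched roughly as follows: per-cell HER2 and CEP17 signal counts (simulated here, not the study's data) are summarised into per-tumour indicators, latent factors are extracted with factor analysis, and tumours are clustered on the factor scores. The feature definitions, thresholds, and cluster count are illustrative assumptions, not the authors' exact indicators.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

def tumour_features(her2, cep17):
    """Summarise per-cell FISH counts of one tumour into simple indicators."""
    ratio = her2 / np.maximum(cep17, 1)
    return [her2.mean(), cep17.mean(), ratio.mean(),
            (ratio >= 2.0).mean(),          # fraction of amplified cells
            (cep17 >= 3).mean(),            # fraction of polysomic cells
            her2.std()]                     # spread, a crude bimodality proxy

# Simulate 50 tumours with 300 analysed cells each (placeholder data).
X = np.array([
    tumour_features(rng.poisson(lam=rng.choice([2, 4, 8]), size=300),
                    rng.poisson(lam=2, size=300))
    for _ in range(50)
])

scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))  # tumours per cluster (e.g. amplified / equivocal / non-amplified)
```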

  4. Microarc oxidation coating covered Ti implants with micro-scale gouges formed by a multi-step treatment for improving osseointegration.

    Science.gov (United States)

    Bai, Yixin; Zhou, Rui; Cao, Jianyun; Wei, Daqing; Du, Qing; Li, Baoqiang; Wang, Yaming; Jia, Dechang; Zhou, Yu

    2017-07-01

    The sub-microporous microarc oxidation (MAO) coating covered Ti implant with micro-scale gouges has been fabricated via a multi-step MAO process to overcome the compromised bone-implant integration. The as-prepared implant has been further mediated by post-heat treatment to compare the effects of -OH functional group and the nano-scale orange peel-like morphology on osseointegration. The bone regeneration, bone-implant contact interface, and biomechanical push-out force of the modified Ti implant have been discussed thoroughly in this work. The greatly improved push-out force for the MAO coated Ti implants with micro-scale gouges could be attributed to the excellent mechanical interlocking effect between implants and biologically meshed bone tissues. Attributed to the -OH functional group which promotes synostosis between the biologically meshed bone and the gouge surface of implant, the multi-step MAO process could be an effective strategy to improve the osseointegration of Ti implant. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    Science.gov (United States)

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-02-26

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be

  6. Bistable responses in bacterial genetic networks: Designs and dynamical consequences

    Science.gov (United States)

    Tiwari, Abhinav; Ray, J. Christian J.; Narula, Jatin; Igoshin, Oleg A.

    2011-01-01

    A key property of living cells is their ability to react to stimuli with specific biochemical responses. These responses can be understood through the dynamics of underlying biochemical and genetic networks. Evolutionary design principles have been well studied in networks that display graded responses, with a continuous relationship between input signal and system output. Alternatively, biochemical networks can exhibit bistable responses so that over a range of signals the network possesses two stable steady states. In this review, we discuss several conceptual examples illustrating network designs that can result in a bistable response of the biochemical network. Next, we examine manifestations of these designs in bacterial master-regulatory genetic circuits. In particular, we discuss mechanisms and dynamic consequences of bistability in three circuits: two-component systems, sigma-factor networks, and a multistep phosphorelay. Analyzing these examples allows us to expand our knowledge of evolutionary design principles for networks with bistable responses. PMID:21385588
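
    A minimal, generic illustration of the bistable-response idea (not any specific circuit from the review): a protein that activates its own production through a cooperative Hill term has one or two stable steady states depending on the input signal, which is what produces hysteresis. Parameter values below are arbitrary.

```python
import numpy as np

def dxdt(x, s, n=4, K=1.0, beta=1.5, gamma=1.0):
    """Positive autoregulation: basal input s plus cooperative self-activation."""
    return s + beta * x**n / (K**n + x**n) - gamma * x

# Locate steady states via sign changes of dx/dt on a grid, for a range of signals.
xs = np.linspace(0.0, 5.0, 5001)
for s in [0.05, 0.30, 0.80]:
    f = dxdt(xs, s)
    roots = xs[:-1][np.sign(f[:-1]) != np.sign(f[1:])]
    print(f"signal s = {s:.2f}: steady states near {np.round(roots, 2)}")
# For the intermediate signal the printout shows three crossings (two stable
# states separated by an unstable one); for the low and high signals only one
# steady state remains.
```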

  7. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    Science.gov (United States)

    2014-01-01

    Background: Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting, including translation, of existing instruments is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods: The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn’s recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin’s back-translation approach was utilized, followed by committee review to address any discrepancies that emerged from translation. Results: Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions: Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population

  8. Effects of the multi-step activation process on the carrier concentration of p-type GaN

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae-Kwan [Department of Materials Science and Metallurgical Engineering, Sunchon National University, Sunchon, Chonnam 540-742 (Korea, Republic of); Jeon, Seong-Ran [LED Research and Business Division, Korea Photonics Technology Institute, Gwanju 500-779 (Korea, Republic of); Lee, Ji-Myon, E-mail: jimlee@sunchon.ac.kr [Department of Printed Electronics Engineering, Sunchon National University, Sunchon, Chonnam 540-742 (Korea, Republic of)

    2014-06-25

    Highlights: • The hole concentration of p-GaN was enhanced by the multi-step activation process. • The enhanced hole concentration of p-GaN is attributed to the O₂ plasma treatment. • The PL peak intensity was also enhanced by the multi-step activation process. - Abstract: A multi-step activation method, which includes an oxygen plasma treatment, a chemical treatment, and post-annealing in N₂, was proposed to enhance the hole concentration of a p-type GaN epitaxial layer. This process was found to effectively activate p-GaN by increasing the hole concentration compared to that of the conventionally annealed sample. After the optimal oxygen plasma treatment (10 min at a source and table power of 500 W and 100 W, respectively), followed by an HCl and buffered oxide etchant treatment, and then by a post-RTA process in an N₂ environment, the hole concentration increased from 4.0 × 10¹⁷ to 2.0 × 10¹⁸ cm⁻³. The oxygen plasma was found to effectively remove the remaining H atoms, and the subsequent wet treatment effectively removes the GaOₓ that had formed during the O plasma treatment, resulting in higher photoluminescence intensity.

  9. Automation of the Analysis of Moessbauer Spectra

    International Nuclear Information System (INIS)

    Souza, Paulo A. de Jr.; Garg, R.; Garg, V. K.

    1998-01-01

    In the present report we propose the automation of the least-squares fitting of Moessbauer spectra, the identification of the substance, its crystal structure, and access to the references with the help of a genetic algorithm, fuzzy logic, and an artificial neural network associated with a databank of Moessbauer parameters and references. This system could be useful for specialists and non-specialists, in industry as well as in research laboratories.

  10. [The genetic fingerprints file in France: between security and freedom].

    Science.gov (United States)

    Manaouil, C; Gignon, M; Werbrouck, A; Jarde, O

    2008-01-01

    In France, the national automated genetic fingerprint database (FNAEG) is an automated databank of genetic data used in the penal domain. It facilitates the search for the perpetrators of offences and for missing persons. Since 1998, it has made it possible to solve numerous criminal cases. An extension of its field of application has been observed. It is a confidential register that is subject to numerous controls. Nevertheless, the private character of the data and the way the file operates (the criminal penalty attached to refusing a sample, response times, and the problem of data retention) explain the strong opposition from associations concerned with the protection of personal freedoms.

  11. A NEW FAMILY OF MULTISTEP METHODS WITH IMPROVED PHASE-LAG CHARACTERISTICS FOR THE INTEGRATION OF ORBITAL PROBLEMS

    International Nuclear Information System (INIS)

    Vlachos, D. S.; Anastassi, Z. A.; Simos, T. E.

    2009-01-01

    In this paper, we introduce a new family of 10-step linear multistep methods for the integration of orbital problems. The new methods are constructed by adopting a new methodology which improves the phase-lag characteristics by vanishing both the phase-lag function and its first derivatives at a specific frequency. The efficiency of the new family of methods is proved via error analysis and numerical applications.
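
    For readers unfamiliar with the terminology, the LaTeX sketch below gives the generic form of a symmetric 2k-step method for second-order orbital-type equations and the usual phase-lag construction on the scalar test equation; the specific coefficients of the new family are not reproduced here, and the exact form of the underlying ODE is an assumption on my part.

```latex
% Generic symmetric 2k-step method for y'' = f(t,y) (here 2k = 10), with
% symmetric coefficients alpha_i, beta_i:
\sum_{i=0}^{2k}\alpha_i\,y_{n+i} \;=\; h^{2}\sum_{i=0}^{2k}\beta_i\,f(t_{n+i},y_{n+i}),
\qquad \alpha_i=\alpha_{2k-i},\quad \beta_i=\beta_{2k-i}.

% On the test equation y'' = -\omega^{2}y with s = \omega h, the principal roots
% of the resulting characteristic equation are e^{\pm i\theta(s)}, and the
% phase lag is (the leading term of)
\phi(s) \;=\; s-\theta(s).

% The new family is constructed so that, at a chosen frequency s_0,
\phi(s_0)=0, \qquad \phi'(s_0)=0.
```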

  12. Multi-step capacitor discharges as an RF generator

    International Nuclear Information System (INIS)

    Hotta, Eiki; Yamamoto, Shunji; Ishii, Shozo; Hayashi, Izumi

    1979-01-01

    A variety of methods have been developed for large output radio frequency (RF) generators to heat and stabilize high temperature plasma. As the generators for this purpose, capacitor discharge, cable discharge, and oscillation with electronic tubes are considered. Here, a new RF generator is reported, which utilizes capacitor discharge to extract heavy current, and solves the difficulty of short duration by employing multistep discharges. The authors solved the problem of frequency decrease in capacitor discharge by cutting off the unnecessary capacitors from the load circuit, using an additional circuit for shunting current and vacuum gap switches. The vacuum gap switches and the trigger system are described together with the RF generator manufactured. The generator was fabricated to be rather compact for its large output and simple in circuitry as compared with conventional oscillator systems. The shortcomings are frequency variation and, when the load changes, instability caused by switching in the next step at an improper phase. It would be difficult to operate the generator in an RF range of more than about 10 MHz due to jitter of the vacuum gap switches and other factors. (Wakatsuki, Y.)

  13. Effects of Stroke on Ipsilesional End-Effector Kinematics in a Multi-Step Activity of Daily Living

    OpenAIRE

    Gulde, Philipp; Hughes, Charmayne Mary Lee; Hermsdörfer, Joachim

    2017-01-01

    Background: Stroke frequently impairs activities of daily living (ADL) and deteriorates the function of the contra- as well as the ipsilesional limbs. In order to analyze alterations of higher motor control unaffected by paresis or sensory loss, the kinematics of ipsilesional upper limb movements in patients with stroke has previously been analyzed during prehensile movements and simple tool use actions. By contrast, motion recording of multi-step ADL is rare and patient-control comparisons f...

  14. Arduino-based automation of a DNA extraction system.

    Science.gov (United States)

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies on detecting infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing of the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.
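
    The host-controller split described above can be illustrated with a short, hypothetical sketch of the PC side: a plain-text sequence of G-code-like commands is streamed to the Arduino over a serial port, with each line waiting for an acknowledgement. The command names, port name, and 'ok' handshake are assumptions for illustration, not the actual protocol of this system.

```python
import serial  # pyserial
import time

# Hypothetical extraction sequence: home, move the syringe axis, run the mixer.
SEQUENCE = [
    "G28",            # home all axes to the reference position
    "G1 X12.5 F300",  # move syringe axis to 12.5 mm at feed rate 300
    "M3",             # switch mixing actuator on
    "G4 P5000",       # dwell 5 s
    "M5",             # switch mixing actuator off
    "G1 X0 F300",     # return to start
]

def run_sequence(port="/dev/ttyACM0", baud=115200):
    with serial.Serial(port, baud, timeout=2) as link:
        time.sleep(2)                      # give the Arduino time to reset
        for line in SEQUENCE:
            link.write((line + "\n").encode())
            reply = link.readline().decode().strip()
            if reply != "ok":              # controller acknowledges each command
                raise RuntimeError(f"unexpected reply to '{line}': {reply!r}")
            print("done:", line)

if __name__ == "__main__":
    run_sequence()
```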

  15. Automation and Optimization of Multipulse Laser Zona Drilling of Mouse Embryos During Embryo Biopsy.

    Science.gov (United States)

    Wong, Christopher Yee; Mills, James K

    2017-03-01

    Laser zona drilling (LZD) is a required step in many embryonic surgical procedures, for example, assisted hatching and preimplantation genetic diagnosis. LZD involves the ablation of the zona pellucida (ZP) using a laser while minimizing potentially harmful thermal effects on critical internal cell structures. The objective was to develop a method for the automation and optimization of multipulse LZD, applied to cleavage-stage embryos. A two-stage optimization is used. The first stage uses computer vision algorithms to identify embryonic structures and determines the optimal ablation zone farthest away from critical structures such as blastomeres. The second stage combines a genetic algorithm with a previously reported thermal analysis of LZD to optimize the combination of laser pulse locations and pulse durations. The goal is to minimize the peak temperature experienced by the blastomeres while creating the desired opening in the ZP. A proof of concept of the proposed LZD automation and optimization method is demonstrated through experiments on mouse embryos with positive results, as adequately sized openings are created. Automation of LZD is feasible and is a viable step toward the automation of embryo biopsy procedures. LZD is a common but delicate procedure performed by human operators using subjective methods to gauge proper LZD procedure. Automation of LZD removes human error to increase the success rate of LZD. Although the proposed methods are developed for cleavage-stage embryos, the same methods may be applied to most types of LZD procedures, embryos at different developmental stages, or nonembryonic cells.
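
    The first, computer-vision stage can be sketched in simplified form as follows: given binary masks of the zona pellucida ring and the blastomeres (the segmentation itself is omitted and the masks below are synthetic), a Euclidean distance transform away from the blastomeres is computed and the candidate ablation site is taken as the ZP pixel farthest from any blastomere. This illustrates the stated idea only; it is not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

h = w = 200
yy, xx = np.mgrid[0:h, 0:w]
r = np.hypot(yy - 100, xx - 100)

# Synthetic masks: the ZP is a thin ring, the blastomeres an off-centre blob.
zona = (r > 80) & (r < 90)
blastomeres = np.hypot(yy - 110, xx - 85) < 45

# Distance (in pixels) from every pixel to the nearest blastomere pixel.
dist_to_blastomeres = distance_transform_edt(~blastomeres)

# Candidate ablation site: the ZP pixel with the largest clearance.
masked = np.where(zona, dist_to_blastomeres, -1.0)
iy, ix = np.unravel_index(np.argmax(masked), masked.shape)
print(f"ablation site at (row={iy}, col={ix}), "
      f"clearance ≈ {masked[iy, ix]:.1f} px")
```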

  16. Multi-step processes in the (d, t) and (d, ³He) reactions on ¹¹⁶Sn and ²⁰⁸Pb targets at E_d = 200 MeV

    International Nuclear Information System (INIS)

    Langevin-Joliot, H.; Van de Wiele, J.; Guillot, J.; Koning, A.J.

    2000-01-01

    The role of multi-step processes in the reactions ¹¹⁶Sn(d,t), ²⁰⁸Pb(d,t) and ¹¹⁶Sn(d,³He), previously studied at E_d = 200 MeV at forward angles and for relatively low energy transfers, has been investigated. We have performed for the first time multi-step calculations taking into account systematically collective excitations in the second and higher order step inelastic transitions. A calculation code based on the Feshbach, Kerman and Koonin model has been modified to handle explicitly these collective excitations, most important in the forward angle domain. One step double differential pick-up cross sections were built from finite range distorted wave results spread in energy using known or estimated hole state characteristics. It is shown that two-step cross sections calculated using the above method compare rather well with those deduced via coupled channel calculations for the same collective excitations. The multi-step calculations performed up to 6 steps reproduce reasonably well the ¹¹⁵Sn, ²⁰⁷Pb and ¹¹⁵In experimental spectra measured up to E_x ∼ 40 MeV and 15 deg. The relative contributions of steps of increasing order to pick-up cross sections at E_d = 200 MeV and 150 MeV are discussed. (authors)

  17. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  18. Interpretation of some (p,n), (n,p), and (³He, p) reactions by means of the statistical multistep compound emission theory

    International Nuclear Information System (INIS)

    Bonetti, R.; Milazzo, L.C.; Melanotte, M.

    1983-01-01

    A number of (p,n), (n,p), and (³He, p) reactions have been interpreted on the basis of the statistical multistep compound emission mechanism. Good agreement with experiment is found both in spectrum shape and in the value of the coherence widths.

  19. Preparation of the metabotropic glutamate receptor 5 (mGluR5) PET tracer [18F]FPEB for human use: An automated radiosynthesis and a novel one-pot synthesis of its radiolabeling precursor

    International Nuclear Information System (INIS)

    Lim, Keunpoong; Labaree, David; Li, Songye; Huang, Yiyun

    2014-01-01

    The radiotracer 3-[18F]fluoro-5-(2-pyridinylethynyl)benzonitrile, or [18F]FPEB, is a promising PET imaging agent for the metabotropic glutamate subtype 5 receptor (mGluR5). In an effort to develop a routine production method of this radiotracer for use in clinical research we adapted its radiosynthesis to an automated chemistry module. In the meanwhile, we also developed a simplified “one-pot” method for the preparation of the nitrobenzonitrile radiolabeling precursor for [18F]FPEB and its reference standard to replace the existing multi-step synthetic approach. - Highlights: • Radiosynthesis of [18F]FPEB was performed in a Tracerlab FX-FN automated module. • The radiolabeling precursor was prepared from a “one-pot” Suzuki coupling method. • Total synthesis time from EOB to a final injectable dose was about 90 min. • The procedure was applied in the routine preparation of [18F]FPEB for human use

  20. Continuous multistep synthesis of perillic acid from limonene by catalytic biofilms under segmented flow.

    Science.gov (United States)

    Willrodt, Christian; Halan, Babu; Karthaus, Lisa; Rehdorf, Jessica; Julsing, Mattijs K; Buehler, Katja; Schmid, Andreas

    2017-02-01

    The efficiency of biocatalytic reactions involving industrially interesting reactants is often constrained by toxification of the applied biocatalyst. Here, we evaluated the combination of biologically and technologically inspired strategies to overcome toxicity-related issues during the multistep oxyfunctionalization of (R)-(+)-limonene to (R)-(+)-perillic acid. Pseudomonas putida GS1, catalyzing selective limonene oxidation via the p-cymene degradation pathway, and recombinant Pseudomonas taiwanensis VLB120 were evaluated for continuous perillic acid production. A tubular segmented-flow biofilm reactor was used in order to relieve oxygen limitations and to enable membrane-mediated substrate supply as well as efficient in situ product removal. Both P. putida GS1 and P. taiwanensis VLB120 developed a catalytic biofilm in this system. The productivity of wild-type P. putida GS1 encoding the enzymes for limonene bioconversion was highly dependent on the carbon source and reached 34 g L_tube⁻¹ day⁻¹ when glycerol was supplied. More than 10-fold lower productivities were reached, irrespective of the applied carbon source, when the recombinant P. taiwanensis VLB120 harboring p-cymene monooxygenase and p-cumic alcohol dehydrogenase was used as biocatalyst. The technical applicability for preparative perillic acid synthesis in the applied system was verified by purification of perillic acid from the outlet stream using an anion exchanger resin. This concept enabled the multistep production of perillic acid and might be transferred to other reactions involving volatile reactants and toxic end-products. Biotechnol. Bioeng. 2017;114: 281-290. © 2016 Wiley Periodicals, Inc.

  1. High speed quantitative digital beta autoradiography using a multi-step avalanche detector and an Apple-II microcomputer

    International Nuclear Information System (INIS)

    Bateman, J.E.; Connolly, J.F.; Stephenson, R.

    1985-04-01

    The development of an electronic, digital beta autoradiography system is described. Using a Multi-Step Avalanche/Multi-Wire Proportional Counter (MSA/MWPC) detector system fitted with delay line readout, high speed digital imaging is demonstrated with sub-millimeter spatial resolution. Good proportionality of observed counting rate relative to the known tritium activity is demonstrated. The application of the system to autoradiography in immunoelectrophoresis, histopathology and DNA sequencing is described. (author)

  2. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview…

  3. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Burks, M.B. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Hoop, R.C.; Hoffman, E.P. [Univ. of Pittsburgh School of Medicine, PA (United States)

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.
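
    As a toy illustration of the stutter problem mentioned above (this is not the authors' deconvolution algorithm): a CA-repeat allele typically throws a minor "stutter" peak one repeat shorter, so observed peak heights can be corrected by walking from the longest allele downwards and subtracting a fixed stutter fraction of the already-corrected neighbour. The stutter ratio and calling threshold below are invented.

```python
STUTTER_RATIO = 0.12   # assumed fraction of a peak that appears one repeat shorter
CALL_THRESHOLD = 150   # assumed minimum corrected height to call an allele

def deconvolute(peaks):
    """peaks: {repeat_number: observed_height}; returns corrected heights."""
    corrected = {}
    for repeat in sorted(peaks, reverse=True):           # longest allele first
        stutter = STUTTER_RATIO * corrected.get(repeat + 1, 0.0)
        corrected[repeat] = max(peaks[repeat] - stutter, 0.0)
    return corrected

# Heterozygote with closely spaced alleles 17 and 18; peak 16 is mostly stutter.
observed = {16: 170, 17: 1250, 18: 1400}
corrected = deconvolute(observed)
calls = [r for r, h in corrected.items() if h >= CALL_THRESHOLD]
print(corrected)                 # 16 drops below threshold once stutter is removed
print("called alleles:", sorted(calls))
```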

  4. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  5. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, which is tedious and time-consuming. This paper proposes an automation model for planning optimal sewerage rehabilitation strategies for the sewer system by integrating image processing, clustering technology, optimization, and visualization display. Firstly, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections with a risk of malfunction and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information of the sewer system and sewerage rehabilitation plans are graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.

  6. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis

    Directory of Open Access Journals (Sweden)

    Huanhuan Li

    2017-08-01

    Full Text Available The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety. Thus, the capacities of navigation safety and maritime traffic monitoring could be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex compared with traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, the Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, as a widely-used dimensional reduction method, Principal Component Analysis (PCA) is exploited to decompose the obtained distance matrix. In particular, the top k principal components with above 95% accumulative contribution rate are extracted by PCA, and the number of the centers k is chosen. The k centers are found by the improved center automatically selection algorithm. In the last step, the improved center clustering algorithm with k clusters is implemented on the distance matrix to achieve the final AIS trajectory clustering results. In order to improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our

  7. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis.

    Science.gov (United States)

    Li, Huanhuan; Liu, Jingxian; Liu, Ryan Wen; Xiong, Naixue; Wu, Kefeng; Kim, Tai-Hoon

    2017-08-04

    The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety. Thus, the capacities of navigation safety and maritime traffic monitoring could be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex compared with traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, the Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, as a widely-used dimensional reduction method, Principal Component Analysis (PCA) is exploited to decompose the obtained distance matrix. In particular, the top k principal components with above 95% accumulative contribution rate are extracted by PCA, and the number of the centers k is chosen. The k centers are found by the improved center automatically selection algorithm. In the last step, the improved center clustering algorithm with k clusters is implemented on the distance matrix to achieve the final AIS trajectory clustering results. In order to improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our proposed method with
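
    The multi-step pipeline described above can be approximated with the toy sketch below: a plain dynamic-programming DTW gives a pairwise distance matrix for synthetic 2-D trajectories, the matrix is reduced with PCA (keeping components up to 95% explained variance), and k-means is run on the leading components. The paper's improved automatic centre-selection step is replaced here by a fixed k, and the trajectories are simulated rather than real AIS data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def dtw(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping between 2-D trajectories."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Synthetic AIS-like trajectories: two customary routes plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
routes = [np.c_[t, 0.2 * np.sin(2 * np.pi * t)], np.c_[t, 0.5 + 0.1 * t]]
trajs = [routes[i % 2] + rng.normal(scale=0.02, size=(50, 2)) for i in range(20)]

# Steps 1-2: pairwise DTW distances -> distance matrix.
N = len(trajs)
dist = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1, N):
        dist[i, j] = dist[j, i] = dtw(trajs[i], trajs[j])

# Step 3: dimensionality reduction of the distance matrix with PCA.
components = PCA(n_components=0.95).fit_transform(dist)

# Step 4: clustering on the reduced representation (k fixed here).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(components)
print(labels)
```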

  8. Continuous Video Modeling to Assist with Completion of Multi-Step Home Living Tasks by Young Adults with Moderate Intellectual Disability

    Science.gov (United States)

    Mechling, Linda C.; Ayres, Kevin M.; Bryant, Kathryn J.; Foster, Ashley L.

    2014-01-01

    The current study evaluated a relatively new video-based procedure, continuous video modeling (CVM), to teach multi-step cleaning tasks to high school students with moderate intellectual disability. CVM in contrast to video modeling and video prompting allows repetition of the video model (looping) as many times as needed while the user completes…

  9. Controllable 3D architectures of aligned carbon nanotube arrays by multi-step processes

    Science.gov (United States)

    Huang, Shaoming

    2003-06-01

    An effective way to fabricate large-area three-dimensional (3D) aligned CNT patterns based on pyrolysis of iron(II) phthalocyanine (FePc) by two-step processes is reported. The controllable generation of different lengths and the selective growth of the aligned CNT arrays on metal-patterned (e.g., Ag and Au) substrates are the bases for generating such 3D aligned CNT architectures. By controlling the experimental conditions, 3D aligned CNT arrays with different lengths/densities and morphologies/structures as well as multi-layered architectures can be fabricated on a large scale by multi-step pyrolysis of FePc. These 3D architectures could have interesting properties and be applied for developing novel nanotube-based devices.

  10. Admixture mapping of end stage kidney disease genetic susceptibility using estimated mutual information ancestry informative markers

    Directory of Open Access Journals (Sweden)

    Geiger Dan

    2010-10-01

    Full Text Available Background: The question of a genetic contribution to the higher prevalence and incidence of end stage kidney disease (ESKD) among African Americans (AA) remained unresolved, until recent findings using admixture mapping pointed to the association of a genomic locus on chromosome 22 with this disease phenotype. In the current study we utilize this example to demonstrate the utility of applying a multi-step admixture mapping approach. Methods: A multi-step case-only admixture mapping study was designed, consisting of the following steps: 1) assembly of the sample dataset (ESKD AA); 2) design of the estimated mutual information ancestry informative marker screening panel (n = 2016); 3) genotyping of the sample set, whose size (n = 576) was determined by a power analysis appropriate for the initial screening panel; 4) inference of local ancestry for each individual and identification of regions with increased AA ancestry using two different ancestry inference statistical approaches; 5) enrichment of the initial screening panel; 6) power analysis of the enriched panel; 7) genotyping of additional samples; 8) re-analysis of the genotyping results to identify a genetic risk locus. Results: The initial screening phase yielded a significant peak using the ADMIXMAP ancestry inference program applying case-only statistics. Subgroup analysis of 299 ESKD patients with no history of diabetes yielded peaks using both the ANCESTRYMAP and ADMIXMAP ancestry inference programs. The significant peak was found on chromosome 22. Genotyping of additional ancestry informative markers on chromosome 22 that took into account linkage disequilibrium in the ancestral populations, and the addition of samples, increased the statistical significance of the finding. Conclusions: A multi-step admixture mapping analysis of AA ESKD patients replicated the finding of a candidate risk locus on chromosome 22, contributing to the heightened susceptibility of African Americans to develop non
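
    The marker-selection idea in step 2 can be illustrated with a small, self-contained calculation: the mutual information between one allele draw at a biallelic marker and the ancestral population label is computed from the two ancestral allele frequencies, and candidate markers are ranked by it. The allele frequencies, marker names, and equal prior ancestry proportions are placeholders, and this is an illustration of the general principle rather than the study's exact estimator.

```python
import numpy as np

def aim_mutual_information(p_afr, p_eur, w_afr=0.5):
    """Mutual information (in bits) between one allele draw and the ancestry label."""
    w = np.array([w_afr, 1.0 - w_afr])            # prior ancestry proportions
    p = np.array([[p_afr, 1 - p_afr],             # allele frequencies per population
                  [p_eur, 1 - p_eur]])
    joint = w[:, None] * p                        # P(ancestry, allele)
    marg = joint.sum(axis=0)                      # P(allele)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (w[:, None] * marg))
    return np.nansum(terms)

# Rank a few hypothetical candidate markers by informativeness for ancestry.
candidates = {"rs_A": (0.90, 0.10), "rs_B": (0.55, 0.45), "rs_C": (0.70, 0.20)}
ranked = sorted(candidates.items(),
                key=lambda kv: aim_mutual_information(*kv[1]), reverse=True)
for name, (pa, pe) in ranked:
    print(f"{name}: I = {aim_mutual_information(pa, pe):.3f} bits")
```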

  11. Automated extraction of DNA and PCR setup using a Tecan Freedom EVO® liquid handler

    DEFF Research Database (Denmark)

    Frøslev, Tobias Guldberg; Hansen, Anders Johannes; Stangegaard, Michael

    2009-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO® liquid handler mounted with the TeMagS magnetic separation device. The methods were validated for accredited, forensic genetic work according to ISO 17025 using the Qiagen Mag...... genetic DNA typing can be implemented on a simple robot leading to the reduction of manual work as well as increased quality and throughput....

  12. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  13. Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy.

    Science.gov (United States)

    Welikala, R A; Fraz, M M; Dehmeshki, J; Hoppe, A; Tah, V; Mann, S; Williamson, T H; Barman, S A

    2015-07-01

    Proliferative diabetic retinopathy (PDR) is a condition that carries a high risk of severe visual impairment. The hallmark of PDR is the growth of abnormal new vessels. In this paper, an automated method for the detection of new vessels from retinal images is presented. This method is based on a dual classification approach. Two vessel segmentation approaches are applied to create two separate binary vessel maps, each of which holds vital information. Local morphology features are measured from each binary vessel map to produce two separate 4-D feature vectors. Independent classification is performed for each feature vector using a support vector machine (SVM) classifier. The system then combines these individual outcomes to produce a final decision. This is followed by the creation of additional features to generate 21-D feature vectors, which feed into a genetic algorithm based feature selection approach with the objective of finding feature subsets that improve the performance of the classification. Sensitivity and specificity results using a dataset of 60 images are 0.9138 and 0.9600, respectively, on a per patch basis and 1.000 and 0.975, respectively, on a per image basis. Copyright © 2015 Elsevier Ltd. All rights reserved.
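
    As a rough illustration of genetic-algorithm-driven feature selection of the kind described above (not the authors' implementation; the synthetic data, population size and GA settings are placeholders), a feature subset can be encoded as a bit mask and evolved against cross-validated SVM accuracy:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 21))                      # placeholder 21-D feature vectors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

def fitness(mask: np.ndarray) -> float:
    """Cross-validated SVM accuracy on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]))     # population of bit-mask chromosomes
for _ in range(15):                                 # generations
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]         # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, X.shape[1])           # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.05        # bit-flip mutation
        child[flip] ^= 1
        children.append(child)
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```

    The fitness function here is the classifier's cross-validated accuracy; in practice a term penalizing large subsets is often added so the search prefers compact feature sets.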

  14. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods.

    Science.gov (United States)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels

    2013-05-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%) while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR-profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%) while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. Genetics of Fertility Indicators Based on Behaviour and Progesterone in Milk

    DEFF Research Database (Denmark)

    Løvendahl, P; Chagunda, M; O'Connell, J

    2009-01-01

    Genetic selection for female fertility is difficult because current records based on AI information are biased by farmers' decisions and have low heritability (h2 = 0.03). Automated on-farm progesterone-based heat detection methods are becoming commercially available, as has equipment based on behaviour...... oestrus and would enable improved genetic selection for fertility traits....

  16. Molecular Analysis of a Multistep Lung Cancer Model Induced by Chronic Inflammation Reveals Epigenetic Regulation of p16, Activation of the DNA Damage Response Pathway

    Directory of Open Access Journals (Sweden)

    David Blanco

    2007-10-01

    Full Text Available The molecular hallmarks of inflammation-mediated lung carcinogenesis have not been fully clarified, mainly due to the scarcity of appropriate animal models. We have used a silica-induced multistep lung carcinogenesis model driven by chronic inflammation to study the evolution of molecular markers and genetic alterations. We analyzed markers of DNA damage response (DDR), proliferative stress and telomeric stress: γ-H2AX, p16, p53 and TERT. Lung cancer-related epigenetic and genetic alterations, including the promoter hypermethylation status of p16(CDKN2A), APC, CDH13, Rassf1 and Nore1A, as well as mutations of Tp53, epidermal growth factor receptor, K-ras, N-ras and c-H-ras, have also been studied. Our results showed DDR pathway activation in preneoplastic lesions, in association with inducible nitric oxide synthase and p53 induction. p16 was also induced in early tumorigenic progression and was inactivated in bronchiolar dysplasias and tumors. Remarkably, the lack of mutations of Ras and epidermal growth factor receptor, and a very low frequency of Tp53 mutations, suggest that they are not required for tumorigenesis in this model. In contrast, epigenetic alterations in p16(CDKN2A), CDH13 and APC, but not in Rassf1 and Nore1A, were clearly observed. These data suggest the existence of a specific molecular signature of inflammation-driven lung carcinogenesis that shares some, but not all, of the molecular landmarks of chemically induced lung cancer.

  17. Multi-Step Ka/Ka Dichroic Plate with Rounded Corners for NASA's 34m Beam Waveguide Antenna

    Science.gov (United States)

    Veruttipong, Watt; Khayatian, Behrouz; Hoppe, Daniel; Long, Ezra

    2013-01-01

    A multi-step Ka/Ka dichroic plate Frequency Selective Surface (FSS) structure is designed, manufactured and tested for use in NASA's Deep Space Network (DSN) 34m Beam Waveguide (BWG) antennas. The proposed design allows ease of manufacturing and the ability to handle the increase in transmit power (reflected off the FSS) of the DSN BWG antennas from 20 kW to 100 kW. The dichroic is designed using HFSS, and the results agree well with measured data considering the manufacturing tolerances that could be achieved on the dichroic.

  18. Coping Strategies Applied to Comprehend Multistep Arithmetic Word Problems by Students with Above-Average Numeracy Skills and Below-Average Reading Skills

    Science.gov (United States)

    Nortvedt, Guri A.

    2011-01-01

    This article discusses how 13-year-old students with above-average numeracy skills and below-average reading skills cope with comprehending word problems. Compared to other students who are proficient in numeracy and are skilled readers, these students are more disadvantaged when solving single-step and multistep arithmetic word problems. The…

  19. Data-based control of a multi-step forming process

    Science.gov (United States)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. In the field of forming technology, however, the fourth industrial revolution has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multistage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how an interface between the different forming machines can be designed in a practical way, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes, which are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved process control of the subsequent process. On the basis of the scientific knowledge gained, it is possible to make forming operations more robust and at the same time more flexible, and thus create the foundation for linking various production processes in an efficient way.
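
    The central idea, choosing the setting of the subsequent forming step from measurements taken in the first step, can be sketched as a simple data-driven correction model. All variable names, the linear form and the numbers below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical historical data: sheet thickness and press force measured in step 1,
# and the step-2 stroke that produced a good part (synthetic, for illustration only).
X_hist = rng.normal(loc=[2.0, 150.0], scale=[0.05, 5.0], size=(200, 2))
stroke_hist = (10.0 + 1.5 * (X_hist[:, 0] - 2.0)
               - 0.02 * (X_hist[:, 1] - 150.0)
               + rng.normal(scale=0.05, size=200))

# Fit a linear feed-forward model from step-1 measurements to the step-2 setting.
A = np.column_stack([np.ones(len(X_hist)), X_hist])
coef, *_ = np.linalg.lstsq(A, stroke_hist, rcond=None)

def step2_setting(thickness: float, force: float) -> float:
    """Feed-forward control: choose the step-2 stroke from step-1 measurements."""
    return float(coef @ np.array([1.0, thickness, force]))

print(step2_setting(2.04, 148.0))
```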

  20. Abundance and composition of indigenous bacterial communities in a multi-step biofiltration-based drinking water treatment plant.

    Science.gov (United States)

    Lautenschlager, Karin; Hwang, Chiachi; Ling, Fangqiong; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Egli, Thomas; Hammes, Frederik

    2014-10-01

    Indigenous bacterial communities are essential for biofiltration processes in drinking water treatment systems. In this study, we examined the microbial community composition and abundance of three different biofilter types (rapid sand, granular activated carbon, and slow sand filters) and their respective effluents in a full-scale, multi-step treatment plant (Zürich, CH). Detailed analysis of organic carbon degradation underpinned biodegradation as the primary function of the biofilter biomass. The biomass was present in concentrations ranging between 2-5 × 10^15 cells/m^3 in all filters but was phylogenetically, enzymatically and metabolically diverse. Based on 16S rRNA gene-based 454 pyrosequencing analysis for microbial community composition, similar microbial taxa (predominantly Proteobacteria, Planctomycetes, Acidobacteria, Bacteroidetes, Nitrospira and Chloroflexi) were present in all biofilters and in their respective effluents, but the ratio of microbial taxa was different in each filter type. This change was also reflected in the cluster analysis, which revealed a change of 50-60% in microbial community composition between the different filter types. This study documents the direct influence of the filter biomass on the microbial community composition of the final drinking water, particularly when the water is distributed without post-disinfection. The results provide new insights on the complexity of indigenous bacteria colonizing drinking water systems, especially in different biofilters of a multi-step treatment plant. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Variation of nanopore diameter along porous anodic alumina channels by multi-step anodization.

    Science.gov (United States)

    Lee, Kwang Hong; Lim, Xin Yuan; Wai, Kah Wing; Romanato, Filippo; Wong, Chee Cheong

    2011-02-01

    In order to form tapered nanocapillaries, we investigated a method to vary the nanopore diameter along the porous anodic alumina (PAA) channels using multi-step anodization. By anodizing the aluminum in either single acid (H3PO4) or multi-acid (H2SO4, oxalic acid and H3PO4) with increasing or decreasing voltage, the diameter of the nanopore along the PAA channel can be varied systematically corresponding to the applied voltages. The pore size along the channel can be enlarged or shrunken in the range of 20 nm to 200 nm. Structural engineering of the template along the film growth direction can be achieved by deliberately designing a suitable voltage and electrolyte together with anodization time.

  2. A Family of Symmetric Linear Multistep Methods for the Numerical Solution of the Schroedinger Equation and Related Problems

    International Nuclear Information System (INIS)

    Anastassi, Z. A.; Simos, T. E.

    2010-01-01

    We develop a new family of explicit symmetric linear multistep methods for the efficient numerical solution of the Schroedinger equation and related problems with oscillatory solution. The new methods are trigonometrically fitted and have improved intervals of periodicity as compared to the corresponding classical method with constant coefficients and other methods from the literature. We also apply the methods along with other known methods to real periodic problems, in order to measure their efficiency.
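
    For orientation only, the sketch below shows the classical symmetric two-step (Numerov-type) scheme for equations of the form y'' = f(x)·y, the simplest member of the class of methods generalized in this work; the trigonometric fitting and improved periodicity intervals of the new family are not reproduced here.

```python
import numpy as np

def numerov(f, x0, x1, y0, y1, n):
    """Symmetric two-step (Numerov) integration of y'' = f(x) * y.

    f: callable returning f(x); y0, y1: the two starting values on a grid of
    n + 1 equally spaced points between x0 and x1.
    """
    x = np.linspace(x0, x1, n + 1)
    h = x[1] - x[0]
    c = h * h / 12.0
    y = np.empty(n + 1)
    y[0], y[1] = y0, y1
    for i in range(1, n):
        y[i + 1] = (2.0 * y[i] * (1.0 + 5.0 * c * f(x[i]))
                    - y[i - 1] * (1.0 - c * f(x[i - 1]))) / (1.0 - c * f(x[i + 1]))
    return x, y

# Oscillatory test problem y'' = -y with exact solution sin(x).
x, y = numerov(lambda x: -1.0, 0.0, 20.0, 0.0, np.sin(20.0 / 2000), 2000)
print("max error:", np.max(np.abs(y - np.sin(x))))
```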

  3. Solvent recyclability in a multistep direct liquefaction process

    Energy Technology Data Exchange (ETDEWEB)

    Hetland, M.D.; Rindt, J.R. [Univ. of North Dakota, Grand Forks, ND (United States)]

    1995-12-31

    Direct liquefaction research at the Energy & Environmental Research Center (EERC) has, for a number of years, concentrated on developing a direct liquefaction process specifically for low-rank coals (LRCs) through the use of hydrogen-donating solvents and solvents similar to coal-derived liquids, the water/gas shift reaction, and lower-severity reaction conditions. The underlying assumption of all of the research was that advantage could be taken of the reactivity and specific qualities of LRCs to produce a tetrahydrofuran (THF)-soluble material that might be easier to upgrade than the soluble residuum produced during direct liquefaction of high-rank coals. A multistep approach was taken to produce the THF-soluble material, consisting of (1) preconversion treatment to prepare the coal for solubilization, (2) solubilization of the coal in the solvent, and (3) polishing to complete solubilization of the remaining material. The product of these three steps can then be upgraded during a traditional hydrotreatment step. The results of the EERC's research indicated that additional studies to develop this process more fully were justified. Two areas were targeted for further research: (1) determination of the recyclability of the solvent used during solubilization and (2) determination of the minimum severity required for hydrotreatment of the liquid product. The current project was funded to investigate these two areas.

  4. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers how to build an automation plan and design automation facilities; the automation of cutting and chip processes, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling, tapping, boring, milling and slide units; the application of oil pressure, covering its characteristics and basic oil-pressure circuits; the application of pneumatics; and the kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  5. Theoretical intercomparison of multi-step direct reaction models and computational intercomparison of multi-step direct reaction models

    International Nuclear Information System (INIS)

    Koning, A.J.

    1992-08-01

    In recent years several statistical theories have been developed concerning multistep direct (MSD) nuclear reactions. In addition, dominant in applications is a whole class of semiclassical models that may be subsumed under the heading of 'generalized exciton models'. These are basically MSD-type extensions on top of compound-like concepts. In this report the relationship between their underlying statistical MSD postulates is highlighted. A common framework is outlined that makes it possible to generate the various MSD theories by assigning statistical properties to different parts of the nuclear Hamiltonian. It is then shown that distinct forms of nuclear randomness are embodied in the mentioned theories. All these theories appear to be very similar at a qualitative level. In order to explain the high-energy tails and forward-peaked angular distributions typical of particles emitted in MSD reactions, it is imagined that the incident continuum particle stepwise loses its energy and direction in a sequence of collisions, thereby creating new particle-hole pairs in the target system. At each step emission may take place. The statistical aspect comes in because many continuum states are involved in the process. These are supposed to display chaotic behavior, the associated randomness assumption giving rise to important simplifications in the expression for MSD emission cross sections. This picture suggests that the mentioned MSD models can be interpreted as variants of essentially one and the same theory. However, this appears not to be the case. To show this, the usual MSD distinction within the composite reacting nucleus between the fast continuum particle and the residual system has to be sharpened: the residual interactions among the nucleons of the residual core are to be distinguished from those of the leading particle with the residual system. This distinction will turn out to be crucial to the present analysis. 27 refs.; 5 figs.; 1 tab

  6. Designing mixed metal halide ammines for ammonia storage using density functional theory and genetic algorithms

    DEFF Research Database (Denmark)

    Jensen, Peter Bjerre; Lysgaard, Steen; Quaade, Ulrich J.

    2014-01-01

    electrolyte membrane fuel cells (PEMFC). We use genetic algorithms (GAs) to search for materials containing up to three different metals (alkaline-earth, 3d and 4d) and two different halides (Cl, Br and I) – almost 27000 combinations, and have identified novel mixtures, with significantly improved storage......Metal halide ammines have great potential as a future, high-density energy carrier in vehicles. So far known materials, e.g. Mg(NH3)6Cl2 and Sr(NH3)8Cl2, are not suitable for automotive fuel cell applications, because the release of ammonia is a multi-step reaction, requiring too much heat

  7. An enzyme-catalyzed multistep DNA refolding mechanism in hairpin telomere formation.

    Directory of Open Access Journals (Sweden)

    Ke Shi

    Full Text Available Hairpin telomeres of bacterial linear chromosomes are generated by a DNA cutting-rejoining enzyme protelomerase. Protelomerase resolves a concatenated dimer of chromosomes as the last step of chromosome replication, converting a palindromic DNA sequence at the junctions between chromosomes into covalently closed hairpins. The mechanism by which protelomerase transforms a duplex DNA substrate into the hairpin telomeres remains largely unknown. We report here a series of crystal structures of the protelomerase TelA bound to DNA that represent distinct stages along the reaction pathway. The structures suggest that TelA converts a linear duplex substrate into hairpin turns via a transient strand-refolding intermediate that involves DNA-base flipping and wobble base-pairs. The extremely compact di-nucleotide hairpin structure of the product is fully stabilized by TelA prior to strand ligation, which drives the reaction to completion. The enzyme-catalyzed, multistep strand refolding is a novel mechanism in DNA rearrangement reactions.

  8. The route from problem to solution in multistep continuous flow synthesis of pharmaceutical compounds.

    Science.gov (United States)

    Bana, Péter; Örkényi, Róbert; Lövei, Klára; Lakó, Ágnes; Túrós, György István; Éles, János; Faigl, Ferenc; Greiner, István

    2017-12-01

    Recent advances in the field of continuous flow chemistry allow the multistep preparation of complex molecules such as APIs (Active Pharmaceutical Ingredients) in a telescoped manner. Numerous examples of laboratory-scale applications are described, which point towards novel manufacturing processes of pharmaceutical compounds, in accordance with recent regulatory, economic and quality guidance. The chemical and technical knowledge gained during these studies is considerable; nevertheless, connecting several individual chemical transformations and the attached analytics and purification holds hidden traps. In this review, we summarize innovative solutions for these challenges, in order to benefit chemists aiming to exploit flow chemistry systems for the synthesis of biologically active molecules. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Multistep Cylindrical Structure Analysis at Normal Incidence Based on Water-Substrate Broadband Metamaterial Absorbers

    Science.gov (United States)

    Fang, Chonghua

    2018-01-01

    A new multistep cylindrical structure based on water-substrate broadband metamaterial absorbers is designed to reduce the radar cross section (RCS) of a conventional rod-shaped object. The proposed configuration consists of two distinct parts. One of these components is formed by a four-step cylindrical metal structure, whereas the other is formed by a new water-substrate broadband metamaterial absorber. The designed structure can reduce the RCS by more than 10 dB from 4.58 to 18.42 GHz, covering 86.5% of the band from C-band to 20 GHz. The measured results show reasonably good agreement with the simulated ones, which verifies the effectiveness of the proposed design.

  10. Optimization of the Production of Extracellular Polysaccharide from the Shiitake Medicinal Mushroom Lentinus edodes (Agaricomycetes) Using Mutation and a Genetic Algorithm-Coupled Artificial Neural Network (GA-ANN).

    Science.gov (United States)

    Adeeyo, Adeyemi Ojutalayo; Lateef, Agbaje; Gueguim-Kana, Evariste Bosco

    2016-01-01

    Exopolysaccharide (EPS) production by a strain of Lentinus edodes was studied via the effects of treatments with ultraviolet (UV) irradiation and acridine orange. Furthermore, optimization of EPS production was studied using a genetic algorithm coupled with an artificial neural network in submerged fermentation. Exposure to irradiation and acridine orange resulted in improved EPS production (2.783 and 5.548 g/L, respectively) when compared with the wild strain (1.044 g/L), whereas optimization led to improved productivity (23.21 g/L). The EPS produced by various strains also demonstrated good DPPH scavenging activities of 45.40-88.90%, and inhibited the growth of Escherichia coli and Klebsiella pneumoniae. This study shows that multistep optimization schemes involving physical-chemical mutation and media optimization can be an attractive strategy for improving the yield of bioactives from medicinal mushrooms. To the best of our knowledge, this report presents the first account of a multistep approach to optimizing EPS production in L. edodes.

  11. The use of microwaves for the automated production of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Stone-Elander, S.A.

    1993-01-01

    Microwaves have been increasingly used over the last decade to speed up chemical transformations. Nowhere are the dramatic time gains observed so obviously important as in applications with short-lived radioisotopes where every minute gained in multi-step procedures may be important for success. Commercially-available laboratory microwave ovens adapted for remote-control or robot-assisted techniques have been used to speed up primarily incorporations of [18F]fluoride. New applications with microwave cavities not only provide a more controllable microwave field for the transformations, but also lend themselves well to use in multi-step procedures under remote-control. Applications of microwave techniques in radiolabelling procedures are reviewed with respect to the chemical transformations, microwave parameters and apparatus requirements.

  12. Synthesis of InGaZnO4 nanoparticles using low temperature multistep co-precipitation method

    International Nuclear Information System (INIS)

    Wu, Ming-Chung; Hsiao, Kai-Chi; Lu, Hsin-Chun

    2015-01-01

    Indium gallium zinc oxide (InGaZnO4, IGZO) has attracted explosive growth in investigations over the last decades as an important material in thin-film transistors. In this study, the various nitrate precursors, including indium nitrate, gallium nitrate, and zinc nitrate, were prepared from the corresponding metals dissolved in nitric acid. Then, we used these nitrate precursors to synthesize the IGZO precursor powder by the multistep co-precipitation method. The synthesis parameters of the co-precipitation method, such as reaction temperature, pH value and reaction time, were controlled precisely to prepare high-quality IGZO precursor powder. Finally, the IGZO precursor powder was calcined at 900 °C. The microstructure, crystalline structure, particle size distribution and specific surface area of the calcined IGZO precursor powder were then characterized by transmission electron microscopy, the X-ray diffraction technique, the dynamic light scattering method and a surface area and porosimetry analyzer, respectively. The relative density of an IGZO tablet sintered at 1200 °C for 12 h is as high as 97.30%, and it showed a highly crystalline InGaZnO4 structure and large grain size. The IGZO nanoparticles developed in our study have the potential to serve as high-quality target materials for the application of electronic devices. - Highlights: • InGaZnO4 (IGZO) nanoparticles were synthesized by a multistep co-precipitation method. • The synthesis parameters were controlled precisely to prepare high-quality powder. • The relative density of the highly crystalline IGZO tablet is as high as 97.30%. • The IGZO tablet exhibited a highly crystalline structure and large grain size.

  13. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  14. Natural transformation of Vibrio parahaemolyticus: A rapid method to create genetic deletions.

    Science.gov (United States)

    Chimalapati, Suneeta; de Souza Santos, Marcela; Servage, Kelly; De Nisco, Nicole J; Dalia, Ankur B; Orth, Kim

    2018-03-19

    The Gram-negative bacterium Vibrio parahaemolyticus is an opportunistic human pathogen and the leading cause of seafood-borne acute gastroenteritis worldwide. Recently, this bacterium was implicated as the etiologic agent of a severe shrimp disease with consequent devastating outcomes to shrimp farming. In both cases, acquisition of genetic material via horizontal transfer provided V. parahaemolyticus with new virulence tools to cause disease. Dissecting the molecular mechanisms of V. parahaemolyticus pathogenesis often requires manipulating its genome. Classically, genetic deletions in V. parahaemolyticus are performed using a laborious, lengthy, multi-step process. Herein, we describe a fast and efficient method to edit this bacterium's genome based on V. parahaemolyticus natural competence. Although this method is similar to one previously described, V. parahaemolyticus requires counter selection for curing of acquired plasmids due to its recalcitrant nature of retaining extrachromosomal DNA. We believe this approach will be of use to the Vibrio community. Importance: Spreading of Vibrios throughout the world correlates with increased global temperatures. As they spread, they find new niches to survive, proliferate and invade. Therefore, genetic manipulation of Vibrios is of utmost importance for studying these species. Herein, we have delineated and validated a rapid method to create genetic deletions in Vibrio parahaemolyticus. This study provides insightful methodology for studies with other Vibrio species. Copyright © 2018 American Society for Microbiology.

  15. Intelligent DNA-based molecular diagnostics using linked genetic markers

    Energy Technology Data Exchange (ETDEWEB)

    Pathak, D.K.; Perlin, M.W.; Hoffman, E.P.

    1994-12-31

    This paper describes a knowledge-based system for molecular diagnostics, and its application to fully automated diagnosis of X-linked genetic disorders. Molecular diagnostic information is used in clinical practice for determining genetic risks, such as carrier determination and prenatal diagnosis. Initially, blood samples are obtained from related individuals, and PCR amplification is performed. Linkage-based molecular diagnosis then entails three data analysis steps. First, for every individual, the alleles (i.e., DNA composition) are determined at specified chromosomal locations. Second, the flow of genetic material among the individuals is established. Third, the probability that a given individual is either a carrier of the disease or affected by the disease is determined. The current practice is to perform each of these three steps manually, which is costly, time consuming, labor-intensive, and error-prone. As such, the knowledge-intensive data analysis and interpretation supersede the actual experimentation effort as the major bottleneck in molecular diagnostics. By examining the human problem solving for the task, we have designed and implemented a prototype knowledge-based system capable of fully automating linkage-based molecular diagnostics in X-linked genetic disorders, including Duchenne Muscular Dystrophy (DMD). Our system uses knowledge-based interpretation of gel electrophoresis images to determine individual DNA marker labels, a constraint satisfaction search for consistent genetic flow among individuals, and a blackboard-style problem solver for risk assessment. We describe the system's successful diagnosis of DMD carrier and affected individuals from raw clinical data.

  16. Automated genotyping of dinucleotide repeat markers

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Hoffman, E.P. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Univ. of Pittsburgh, PA (United States)]

    1994-09-01

    The dinucleotide repeats (i.e., microsatellites) such as CA-repeats are a highly polymorphic, highly abundant class of PCR-amplifiable markers that have greatly streamlined genetic mapping experimentation. It is expected that over 30,000 such markers (including tri- and tetranucleotide repeats) will be characterized for routine use in the next few years. Since only size determination, and not sequencing, is required to determine alleles, in principle, dinucleotide repeat genotyping is easily performed on electrophoretic gels, and can be automated using DNA sequencers. Unfortunately, PCR stuttering with these markers generates not one band for each allele, but a pattern of bands. Since closely spaced alleles must be disambiguated by human scoring, this poses a key obstacle to full automation. We have developed methods that overcome this obstacle. Our model is that the observed data is generated by arithmetic superposition (i.e., convolution) of multiple allele patterns. By quantitatively measuring the size of each component band, and exploiting the unique stutter pattern associated with each marker, closely spaced alleles can be deconvolved; this unambiguously reconstructs the "true" allele bands, with stutter artifact removed. We used this approach in a system for automated diagnosis of (X-linked) Duchenne muscular dystrophy; four multiplexed CA-repeats within the dystrophin gene were assayed on a DNA sequencer. Our method accurately detected small variations in gel migration that shifted the allele size estimate. In 167 nonmutated alleles, 89% (149/167) showed no size variation, 9% (15/167) showed 1 bp variation, and 2% (3/167) showed 2 bp variation. We are currently developing a library of dinucleotide repeat patterns; together with our deconvolution methods, this library will enable fully automated genotyping of dinucleotide repeats from sizing data.
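
    The deconvolution idea, treating the observed lane profile as a superposition of a known per-marker stutter pattern shifted to each candidate allele position, can be illustrated with non-negative least squares. The stutter weights and band intensities below are synthetic placeholders, not values from the authors' calibrated pattern library:

```python
import numpy as np
from scipy.optimize import nnls

# Assumed stutter pattern: relative band intensity at the true allele size and at
# one and two repeat units shorter (illustrative values only).
stutter = np.array([1.0, 0.35, 0.10])

def stutter_matrix(n_bins: int, pattern: np.ndarray) -> np.ndarray:
    """Column k holds the band pattern an allele in size-bin k would produce."""
    A = np.zeros((n_bins, n_bins))
    for k in range(n_bins):
        for j, w in enumerate(pattern):
            if k - j >= 0:
                A[k - j, k] = w
    return A

# Synthetic observed profile: heterozygote with closely spaced alleles in bins 6 and 7.
n_bins = 12
true_alleles = np.zeros(n_bins)
true_alleles[[6, 7]] = [1.0, 0.8]
A = stutter_matrix(n_bins, stutter)
observed = A @ true_alleles + np.random.default_rng(1).normal(scale=0.02, size=n_bins)

# Deconvolve: recover allele weights whose superposed stutter patterns best fit the data.
alleles, _ = nnls(A, observed)
print("recovered allele bins:", np.flatnonzero(alleles > 0.3))
```

    With a calibrated pattern per marker, the same least-squares step separates closely spaced alleles that simple peak picking would blur together.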

  17. Multi-step magnetization of the Ising model on a Shastry-Sutherland lattice: a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Huang, W C; Huo, L; Tian, G; Qian, H R; Gao, X S; Qin, M H; Liu, J-M

    2012-01-01

    The magnetization behaviors and spin configurations of the classical Ising model on a Shastry-Sutherland lattice are investigated using Monte Carlo simulations, in order to understand the fascinating magnetization plateaus observed in TmB4 and other rare-earth tetraborides. The simulations reproduce the 1/2 magnetization plateau by taking into account the dipole-dipole interaction. In addition, a narrow 2/3 magnetization step at low temperature is predicted in our simulation. The multi-step magnetization can be understood as the consequence of the competitions among the spin-exchange interaction, the dipole-dipole interaction, and the static magnetic energy.
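
    For readers unfamiliar with the simulation technique, a minimal Metropolis sketch for an Ising model in an applied field is shown below. It deliberately uses a plain square-lattice ferromagnet with nearest-neighbor exchange only, so the Shastry-Sutherland geometry and the dipole-dipole term responsible for the reported plateaus are omitted:

```python
import numpy as np

def metropolis_sweep(spins, J, h, T, rng):
    """One Metropolis sweep of a 2D Ising model in a field (periodic boundaries)."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * (J * nb + h)   # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

rng = np.random.default_rng(0)
L, J, T = 16, 1.0, 0.8
spins = rng.choice([-1, 1], size=(L, L))
for h in np.linspace(0.0, 4.0, 9):              # field sweep to trace the magnetization
    for _ in range(300):
        metropolis_sweep(spins, J, h, T, rng)
    print(f"h={h:.1f}  m={spins.mean():+.3f}")
```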

  18. Enhancement of the neutral-beam stopping cross section in fusion plasmas due to multistep collision processes

    International Nuclear Information System (INIS)

    Boley, C.D.; Janev, R.K.; Post, D.E.

    1983-10-01

    Multistep processes involving excited atomic states are found to produce a substantial increase in the stopping cross section for a neutral hydrogen beam injected into a plasma, and thus to reduce the beam penetration. For typical plasma and beam parameters of current large tokamak experiments, the stopping cross-sectional enhancement is found to vary from 25% to 50% depending on the beam energy, plasma density, and impurity level. For neutral hydrogen beams with energies greater than or equal to 500 keV, envisioned in tokamak and mirror reactor designs, the enhancement can be as large as 80 to 90%.

  19. The genetic basis of addictive disorders.

    Science.gov (United States)

    Ducci, Francesca; Goldman, David

    2012-06-01

    Addictions are common, chronic, and relapsing diseases that develop through a multistep process. The impact of addictions on morbidity and mortality is high worldwide. Twin studies have shown that the heritability of addictions ranges from 0.39 (hallucinogens) to 0.72 (cocaine). Twin studies indicate that genes influence each stage from initiation to addiction, although the genetic determinants may differ. Addictions are by definition the result of gene × environment interaction. These disorders, which are in part volitional, in part inborn, and in part determined by environmental experience, pose the full range of medical, genetic, policy, and moral challenges. Gene discovery is being facilitated by a variety of powerful approaches, but is in its infancy. It is not surprising that the genes discovered so far act in a variety of ways: via altered metabolism of drug (the alcohol and nicotine metabolic gene variants), via altered function of a drug receptor (the nicotinic receptor, which may alter affinity for nicotine but as discussed may also alter circuitry of reward), and via general mechanisms of addiction (genes such as monoamine oxidase A and the serotonin transporter that modulate stress response, emotion, and behavioral control). Addiction medicine today benefits from genetic studies that buttress the case for a neurobiologic origin of addictive behavior, and some general information on familially transmitted propensity that can be used to guide prevention. A few well-validated, specific predictors such as OPRM1, ADH1B, ALDH2, CHRNA5, and CYP2A6 have been identified and can provide some specific guidance, for example, to understand alcohol-related flushing and upper GI cancer risk (ADH1B and ALDH2), variation in nicotine metabolism (CYP2A6), and, potentially, naltrexone treatment response (OPRM1). However, the genetic predictors available are few in number and account for only a small portion of the genetic variance in liability, and have not been integrated

  20. The light spot test: Measuring anxiety in mice in an automated home-cage environment.

    Science.gov (United States)

    Aarts, Emmeke; Maroteaux, Gregoire; Loos, Maarten; Koopmans, Bastijn; Kovačević, Jovana; Smit, August B; Verhage, Matthijs; Sluis, Sophie van der

    2015-11-01

    Behavioral tests of animals in a controlled experimental setting provide a valuable tool to advance understanding of genotype-phenotype relations, and to study the effects of genetic and environmental manipulations. To optimally benefit from the increasing numbers of genetically engineered mice, reliable high-throughput methods for comprehensive behavioral phenotyping of mice lines have become a necessity. Here, we describe the development and validation of an anxiety test, the light spot test, that allows for unsupervised, automated, high-throughput testing of mice in a home-cage system. This automated behavioral test circumvents bias introduced by pretest handling, and enables recording both baseline behavior and the behavioral test response over a prolonged period of time. We demonstrate that the light spot test induces a behavioral response in C57BL/6J mice. This behavior reverts to baseline when the aversive stimulus is switched off, and is blunted by treatment with the anxiolytic drug Diazepam, demonstrating predictive validity of the assay, and indicating that the observed behavioral response has a significant anxiety component. Also, we investigated the effectiveness of the light spot test as part of sequential testing for different behavioral aspects in the home-cage. Two learning tests, administered prior to the light spot test, affected the light spot test parameters. The light spot test is a novel, automated assay for anxiety-related high-throughput testing of mice in an automated home-cage environment, allowing for both comprehensive behavioral phenotyping of mice, and rapid screening of pharmacological compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Systemic cancer multistep therapy; Systemische Krebs-Mehrschritt-Therapie

    Energy Technology Data Exchange (ETDEWEB)

    Weigang-Koehler, K. [Medizinische Klinik 5, Arbeitsgruppe Biologische Krebstherapie, Staedtisches Klinikum Nuernberg (Germany); Kaiser, G. [Medizinische Klinik 5, Arbeitsgruppe Biologische Krebstherapie, Staedtisches Klinikum Nuernberg (Germany); Gallmeier, W.M. [Medizinische Klinik 5, Arbeitsgruppe Biologische Krebstherapie, Staedtisches Klinikum Nuernberg (Germany)

    1997-04-11

    To get an insight into the claimed efficacy of "systemic cancer multistep therapy" (sKMT) with hyperglycemia, whole-body hyperthermia and hyperoxemia, we conducted a best-case analysis of 20 patients who had received sKMT alone (9 patients) or in combination with chemo- or radiotherapy (11 patients). There was no complete remission or unequivocal partial remission when sKMT was used alone. When sKMT was combined with classical effective therapies such as chemo- and radiotherapy, 1 patient had a questionable complete remission and 3 patients had partial remissions. In these three patients, sKMT had been combined with a newly applied chemotherapy; therefore, it remains unclear which method was effective in causing the remission. (orig.) [German original, translated: To gain an overview of the claimed efficacy of systemic cancer multistep therapy (sKMT) according to von Ardenne, we conducted a best-case analysis of 20 patients who had received sKMT without chemotherapy (9 patients) or in combination with chemo- or radiotherapy (11 patients). sKMT alone had not led to any complete remission or definite partial remission. When sKMT was combined with classical therapeutic modalities such as chemotherapy and irradiation, a questionable complete remission occurred in one patient and a partial remission in 3 patients. In the latter cases, a chemotherapy that was new to the patient had in each case been combined with the sKMT, so it remains unclear what brought about the improvement. (orig.)]

  2. Validation of shortened 2-day sterility testing of mesenchymal stem cell-based therapeutic preparation on an automated culture system.

    Science.gov (United States)

    Lysák, Daniel; Holubová, Monika; Bergerová, Tamara; Vávrová, Monika; Cangemi, Giuseppina Cristina; Ciccocioppo, Rachele; Kruzliak, Peter; Jindra, Pavel

    2016-03-01

    Cell therapy products represent a new trend of treatment in the field of immunotherapy and regenerative medicine. Their biological nature and multistep preparation procedure require the application of complex release criteria and quality control. Microbial contamination of cell therapy products is a potential source of morbidity in recipients. Automated blood culture systems are widely used for the detection of microorganisms in cell therapy products. However, the standard 2-week cultivation period is too long for some cell-based treatments, and alternative methods have to be devised. We tried to verify whether a shortened cultivation of the supernatant from the mesenchymal stem cell (MSC) culture, obtained 2 days before the cell harvest, could sufficiently detect microbial growth and allow the release of MSC for clinical application. We compared the standard Ph. Eur. cultivation method and the automated blood culture system BACTEC (Becton Dickinson). The time to detection (TTD) and the detection limit were analyzed for three bacterial and two fungal strains. Staphylococcus aureus and Pseudomonas aeruginosa were recognized within 24 h with both methods (detection limit ~10 CFU). The time required for the detection of Bacillus subtilis was shorter with the automated method (TTD 10.3 vs. 60 h for 10-100 CFU). The BACTEC system reached significantly shorter times to detection of Candida albicans and Aspergillus brasiliensis growth compared with the classical method (15.5 vs. 48 and 31.5 vs. 48 h, respectively; 10-100 CFU). Positivity was demonstrated within 48 h in all bottles, regardless of the size of the inoculum. This study validated the automated cultivation system as a method able to detect all tested microorganisms within a 48-h period with a detection limit of ~10 CFU. Only in the case of B. subtilis was the lowest inoculum (~10 CFU) not recognized. The 2-day cultivation technique is then capable of confirming the microbiological safety of MSC and

  3. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  4. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    International Nuclear Information System (INIS)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR

  5. Computational comparison of quantum-mechanical models for multistep direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1993-01-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmueller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90Zr(p,p') at 80 MeV, 209Bi(p,p') at 62 MeV, and 93Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data.

  6. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method for the automation rate that takes the advantages of automation as the estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate, the greater the reduction in working time. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressing the inclusion proportion of automation is straightforward, as is expressing the degree of enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and cognitive automation rates were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  7. The fusion of biology, computer science, and engineering: towards efficient and successful synthetic biology.

    Science.gov (United States)

    Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J

    2012-01-01

    Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.

  8. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    Science.gov (United States)

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Rapid discrimination of sea buckthorn berries from different H. rhamnoides subspecies by multi-step IR spectroscopy coupled with multivariate data analysis

    Science.gov (United States)

    Liu, Yue; Zhang, Ying; Zhang, Jing; Fan, Gang; Tu, Ya; Sun, Suqin; Shen, Xudong; Li, Qingzhu; Zhang, Yi

    2018-03-01

    As an important ethnic medicine, sea buckthorn has been widely used to prevent and treat various diseases due to its nutritional and medicinal properties. According to the Chinese Pharmacopoeia, sea buckthorn originates from H. rhamnoides, which includes five subspecies distributed in China. Confusion and misidentification often occur because of their similar morphology, especially in dried and powdered forms. Additionally, these five subspecies show important differences in quality and physiological efficacy. This paper focuses on a quick classification and identification method for sea buckthorn berry powders from five H. rhamnoides subspecies using multi-step IR spectroscopy coupled with multivariate data analysis. The holistic chemical compositions revealed by the FT-IR spectra demonstrated that flavonoids, fatty acids and sugars were the main chemical components. Further, the differences in the FT-IR spectra regarding their peaks, positions and intensities were used to identify samples of the H. rhamnoides subspecies. The discrimination was achieved using principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA). The results showed that the combination of multi-step IR spectroscopy and chemometric analysis offers a simple, fast and reliable method for the classification and identification of sea buckthorn berry powders from different H. rhamnoides subspecies.
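
    A bare-bones sketch of the chemometric step is given below. The synthetic "spectra", class sizes and component counts are placeholders rather than the paper's FT-IR data; it simply pairs PCA for exploratory clustering with a PLS-regression-based discriminant analysis (PLS-DA):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_class, n_wavenumbers, n_classes = 20, 600, 5     # placeholder dimensions
X = np.vstack([rng.normal(loc=c * 0.2, scale=1.0, size=(n_per_class, n_wavenumbers))
               for c in range(n_classes)])             # synthetic stand-in spectra
y = np.repeat(np.arange(n_classes), n_per_class)

Xs = StandardScaler().fit_transform(X)

# PCA: unsupervised look at how the five classes cluster in the first components.
scores = PCA(n_components=2).fit_transform(Xs)

# PLS-DA: regress a one-hot class matrix on the spectra, predict by largest response.
Y = np.eye(n_classes)[y]
pls = PLSRegression(n_components=5).fit(Xs, Y)
pred = pls.predict(Xs).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```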

  10. PEGASUS: a preequilibrium and multi-step evaporation code for neutron cross section calculation

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo; Sugi, Teruo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]; Iijima, Shungo; Nishigori, Takeo

    1999-06-01

    The computer code PEGASUS was developed to calculate neutron-induced reaction cross sections on the basis of the closed-form exciton model preequilibrium theory and the multi-step evaporation theory. The cross sections and emitted particle spectra are calculated for the compound elastic scattering, (n,γ), (n,n'), (n,p), (n,α), (n,d), (n,t), (n,3He), (n,2n), (n,n'p), (n,n'α), (n,n'd), (n,n't), (n,2p) and (n,3n) reactions. The double differential cross sections of emitted particles are also calculated. The calculated results are written on a magnetic disk in the ENDF format. Parameter files and/or systematics formulas are provided for level densities, mass excess, radiation widths and inverse cross sections so that the input data to the code are kept to a minimum. (author)

  11. Low-pressure, multistep, multiwire proportional counter for the time-of-flight isochronous spectrometer

    International Nuclear Information System (INIS)

    Vieira, D.J.

    1985-01-01

    A low-pressure, multistep, multiwire proportional counter (MSMWPC) has been developed for the characterization and testing of the time-of-flight isochronous (TOFI) spectrometer and its associated secondary-beam transport line. This type of counter was selected because of its high sensitivity, large dynamic range, and good position (0.2 mm FWHM) and timing (180 ps FWHM) resolution. Furthermore, because the counter operates at low gas pressures (1-10 torr) and high electric-field strengths, which enable short collection times, it can be used as a transmission counter with thin gas-isolation windows and it can operate at high counting rates. Here the authors discuss the basic operating principle of the MSMWPC, describe the technical details of the detector and signal processing, and report on the performance they have measured for alpha particles and fission fragments

  12. Automated detection of macular drusen using geometric background leveling and threshold selection.

    Science.gov (United States)

    Smith, R Theodore; Chan, Jackie K; Nagasaki, Takayuki; Ahmad, Umer F; Barbazetto, Irene; Sparrow, Janet; Figueroa, Marta; Merriam, Joanna

    2005-02-01

    Age-related macular degeneration (ARMD) is the most prevalent cause of visual loss in patients older than 60 years in the United States. Observation of drusen is the hallmark finding in the clinical evaluation of ARMD. To segment and quantify drusen found in patients with ARMD using image analysis and to compare the efficacy of image analysis segmentation with that of stereoscopic manual grading of drusen. Retrospective study. University referral center. Patients: Photographs were randomly selected from an available database of patients with known ARMD in the ongoing Columbia University Macular Genetics Study. All patients were white and older than 60 years. Twenty images from 17 patients were selected as representative of common manifestations of drusen. Image preprocessing included automated color balancing and, where necessary, manual segmentation of confounding lesions such as geographic atrophy (3 images). The operator then chose among 3 automated processing options suggested by predominant drusen type. Automated processing consisted of elimination of background variability by a mathematical model and subsequent histogram-based threshold selection. A retinal specialist using a graphic tablet while viewing stereo pairs constructed digital drusen drawings for each image. The sensitivity and specificity of drusen segmentation using the automated method with respect to manual stereoscopic drusen drawings were calculated on a rigorous pixel-by-pixel basis. The median sensitivity and specificity of automated segmentation were 70% and 81%, respectively. After preprocessing and option choice, reproducibility of automated drusen segmentation was necessarily 100%. Automated drusen segmentation can be reliably performed on digital fundus photographs and result in successful quantification of drusen in a more precise manner than is traditionally possible with manual stereoscopic grading of drusen. With only minor preprocessing requirements, this automated detection
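
    The two automated stages described above (levelling out background variability with a mathematical model, then picking a threshold from the histogram) can be illustrated with a minimal, generic sketch. This is not the authors' implementation: the quadratic background surface, the Otsu-style threshold criterion, and the synthetic test image below are stand-in choices.

```python
# Minimal sketch (not the authors' implementation) of the two automated stages:
# geometric background levelling followed by histogram-based threshold selection.
# The quadratic background surface and Otsu-style criterion are stand-in choices.
import numpy as np

def level_background(image: np.ndarray) -> np.ndarray:
    """Fit a smooth quadratic intensity surface and subtract it from the image."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = xx.ravel().astype(float), yy.ravel().astype(float)
    z = image.ravel().astype(float)
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return image.astype(float) - (A @ coeffs).reshape(h, w)

def histogram_threshold(residual: np.ndarray, bins: int = 256) -> float:
    """Pick the threshold that maximises between-class variance (Otsu's criterion)."""
    hist, edges = np.histogram(residual, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = p[:i].sum(), p[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:i] * centers[:i]).sum() / w0
        m1 = (p[i:] * centers[i:]).sum() / w1
        between_var = w0 * w1 * (m0 - m1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, centers[i]
    return best_t

def segment_drusen(image: np.ndarray) -> np.ndarray:
    residual = level_background(image)
    return residual > histogram_threshold(residual)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.normal(100.0, 5.0, size=(64, 64))   # synthetic fundus-like background
    img[20:30, 20:30] += 40.0                     # one bright druse-like patch
    print("drusen pixels found:", int(segment_drusen(img).sum()))
```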

  13. Automated preparation of the dopamine D{sub 2/3} receptor agonist ligand [{sup 11}C]-(+)-PHNO for human PET imaging studies

    Energy Technology Data Exchange (ETDEWEB)

    Plisson, Christophe, E-mail: Christophe.2.plisson@gsk.com [GlaxoSmithKline, Clinical Imaging Centre, Hammersmith Hospital, London W12 0NN (United Kingdom); Huiban, Mickael; Pampols-Maso, Sabina; Singleton, Goerkem; Hill, Samuel P.; Passchier, Jan [GlaxoSmithKline, Clinical Imaging Centre, Hammersmith Hospital, London W12 0NN (United Kingdom)

    2012-02-15

    Carbon-11 labelled (+)-4-Propyl-3,4,4a,5,6,10b-hexahydro-2H-naphtho[1,2-b][1,4]oxazin-9-ol ([{sup 11}C]-(+)-PHNO) is used as a high-affinity state, dopamine D{sub 2/3} receptor ligand in clinical PET studies. To facilitate its use, robust, rapid, efficient and GMP compliant methods are required for the manufacturing and QC testing processes. Additionally, to allow for full quantification of the resulting signal in the CNS, a reliable method is required to establish the parent plasma concentration over the course of the scan. This paper provides high-quality methods to support clinical application of [{sup 11}C]-(+)-PHNO. - Highlights: • Fully automated synthesis of [{sup 11}C]-(+)-PHNO. • Rapid multi-step synthesis and QC analysis. • Reproducible synthesis process typically yielding more than 3 GBq of [{sup 11}C]-(+)-PHNO. • Very low failure rate.

  14. Advances in vacuum ultraviolet detection with multistep gaseous detectors and application to Cherenkov ring imaging

    CERN Document Server

    Breskin, Amos; Charpak, Georges; Peisert, Anna; Policarpo, Armando; Sauli, Fabio

    1981-01-01

    The multistep avalanche chamber permits an efficient detection of VUV photons. In a two-step proportional mode charges higher than 1 pC are obtained from single electrons. By using as the final localization step a spark chamber viewed by a TV digitizer it is easy to have imaging of complex patterns. This is applied to Cherenkov ring imaging and (K, pi ) separation with 3 sigma up to 320 GeV is envisaged. The properties of various photoionizable vapours have been studied. By combining a scintillation xenon chamber with a photoionization wire chamber, a resolution of 8.3% (FWHM) has been obtained for 5.9 keV X- rays using tetrakis(dimethylamine)-ethylene vapour. (15 refs).

  15. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  16. Design and rationale for examining neuroimaging genetics in ischemic stroke

    Science.gov (United States)

    Giese, Anne-Katrin; Schirmer, Markus D.; Donahue, Kathleen L.; Cloonan, Lisa; Irie, Robert; Winzeck, Stefan; Bouts, Mark J.R.J.; McIntosh, Elissa C.; Mocking, Steven J.; Dalca, Adrian V.; Sridharan, Ramesh; Xu, Huichun; Frid, Petrea; Giralt-Steinhauer, Eva; Holmegaard, Lukas; Roquer, Jaume; Wasselius, Johan; Cole, John W.; McArdle, Patrick F.; Broderick, Joseph P.; Jimenez-Conde, Jordi; Jern, Christina; Kissela, Brett M.; Kleindorfer, Dawn O.; Lemmens, Robin; Lindgren, Arne; Meschia, James F.; Rundek, Tatjana; Sacco, Ralph L.; Schmidt, Reinhold; Sharma, Pankaj; Slowik, Agnieszka; Thijs, Vincent; Woo, Daniel; Worrall, Bradford B.; Kittner, Steven J.; Mitchell, Braxton D.; Rosand, Jonathan; Golland, Polina; Wu, Ona

    2017-01-01

    Objective: To describe the design and rationale for the genetic analysis of acute and chronic cerebrovascular neuroimaging phenotypes detected on clinical MRI in patients with acute ischemic stroke (AIS) within the scope of the MRI–GENetics Interface Exploration (MRI-GENIE) study. Methods: MRI-GENIE capitalizes on the existing infrastructure of the Stroke Genetics Network (SiGN). In total, 12 international SiGN sites contributed MRIs of 3,301 patients with AIS. Detailed clinical phenotyping with the web-based Causative Classification of Stroke (CCS) system and genome-wide genotyping data were available for all participants. Neuroimaging analyses include the manual and automated assessments of established MRI markers. A high-throughput MRI analysis pipeline for the automated assessment of cerebrovascular lesions on clinical scans will be developed in a subset of scans for both acute and chronic lesions, validated against gold standard, and applied to all available scans. The extracted neuroimaging phenotypes will improve characterization of acute and chronic cerebrovascular lesions in ischemic stroke, including CCS subtypes, and their effect on functional outcomes after stroke. Moreover, genetic testing will uncover variants associated with acute and chronic MRI manifestations of cerebrovascular disease. Conclusions: The MRI-GENIE study aims to develop, validate, and distribute the MRI analysis platform for scans acquired as part of clinical care for patients with AIS, which will lead to (1) novel genetic discoveries in ischemic stroke, (2) strategies for personalized stroke risk assessment, and (3) personalized stroke outcome assessment. PMID:28852707

  17. Concrete Plant Operations Optimization Using Combined Simulation and Genetic Algorithms

    NARCIS (Netherlands)

    Cao, Ming; Lu, Ming; Zhang, Jian-Ping

    2004-01-01

    This work presents a new approach for concrete plant operations optimization by combining a ready mixed concrete (RMC) production simulation tool (called HKCONSIM) with a genetic algorithm (GA) based optimization procedure. A revamped HKCONSIM computer system can be used to automate the simulation

  18. Facilitating Students' Review of the Chemistry of Nitrogen-Containing Heterocyclic Compounds and Their Characterization through Multistep Synthesis of Thieno[2,3-"b"]Pyridine Derivatives

    Science.gov (United States)

    Liu, Hanlin; Zaplishnyy, Vladimir; Mikhaylichenko, Lana

    2016-01-01

    A multistep synthesis of thieno[2,3-"b"]pyridine derivatives is described that is suitable for the upper-level undergraduate organic laboratory. This experiment exposes students to various hands-on experimental techniques as well as methods of product characterization such as IR and [superscript 1]H NMR spectroscopy, and…

  19. Single- and multistep resistance selection studies on the activity of retapamulin compared to other agents against Staphylococcus aureus and Streptococcus pyogenes.

    Science.gov (United States)

    Kosowska-Shick, Klaudia; Clark, Catherine; Credito, Kim; McGhee, Pamela; Dewasse, Bonifacio; Bogdanovich, Tatiana; Appelbaum, Peter C

    2006-02-01

    Retapamulin had the lowest rate of spontaneous mutations by single-step passaging and the lowest parent and selected mutant MICs by multistep passaging among all drugs tested for all Staphylococcus aureus strains and three Streptococcus pyogenes strains which yielded resistant clones. Retapamulin has a low potential for resistance selection in S. pyogenes, with a slow and gradual propensity for resistance development in S. aureus.

  20. Synthesis of InGaZnO{sub 4} nanoparticles using low temperature multistep co-precipitation method

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Ming-Chung, E-mail: mingchungwu@mail.cgu.edu.tw; Hsiao, Kai-Chi; Lu, Hsin-Chun

    2015-07-15

    Indium gallium zinc oxide (InGaZnO{sub 4}, IGZO) has attracted intense research interest over the last decades as an important material for thin-film transistors. In this study, nitrate precursors, including indium nitrate, gallium nitrate, and zinc nitrate, were prepared by dissolving the respective metals in nitric acid. These nitrate precursors were then used to synthesize the IGZO precursor powder by a multistep co-precipitation method. The synthesis parameters of the co-precipitation method, such as reaction temperature, pH value and reaction time, were controlled precisely to prepare high-quality IGZO precursor powder. Finally, the IGZO precursor powder was calcined at 900 °C. The microstructure, crystalline structure, particle size distribution and specific surface area of the calcined IGZO precursor powder were characterized by transmission electron microscopy, X-ray diffraction, dynamic light scattering and a surface area and porosimetry analyzer, respectively. The relative density of an IGZO tablet sintered at 1200 °C for 12 h is as high as 97.30%, and it shows a highly crystalline InGaZnO{sub 4} structure with large grain size. The IGZO nanoparticles developed in this study have potential as high-quality target materials for electronic device applications. - Highlights: • InGaZnO{sub 4} (IGZO) nanoparticles were synthesized by a multistep co-precipitation method. • The synthesis parameters were controlled precisely to prepare high-quality powder. • The relative density of the highly crystalline IGZO tablet is as high as 97.30%. • The IGZO tablet exhibited a highly crystalline structure and large grain size.

  1. Reaction and catalyst engineering to exploit kinetically controlled whole-cell multistep biocatalysis for terminal FAME oxyfunctionalization.

    Science.gov (United States)

    Schrewe, Manfred; Julsing, Mattijs K; Lange, Kerstin; Czarnotta, Eik; Schmid, Andreas; Bühler, Bruno

    2014-09-01

    The oxyfunctionalization of unactivated C−H bonds can selectively and efficiently be catalyzed by oxygenase-containing whole-cell biocatalysts. Recombinant Escherichia coli W3110 containing the alkane monooxygenase AlkBGT and the outer membrane protein AlkL from Pseudomonas putida GPo1 have been shown to efficiently catalyze the terminal oxyfunctionalization of renewable fatty acid methyl esters yielding bifunctional products of interest for polymer synthesis. In this study, AlkBGTL-containing E. coli W3110 is shown to catalyze the multistep conversion of dodecanoic acid methyl ester (DAME) via terminal alcohol and aldehyde to the acid, exhibiting Michaelis-Menten-type kinetics for each reaction step. In two-liquid phase biotransformations, the product formation pattern was found to be controlled by DAME availability. Supplying DAME as bulk organic phase led to accumulation of the terminal alcohol as the predominant product. Limiting DAME availability via application of bis(2-ethylhexyl)phthalate (BEHP) as organic carrier solvent enabled almost exclusive acid accumulation. Furthermore, utilization of BEHP enhanced catalyst stability by reducing toxic effects of substrate and products. A further shift towards the overoxidized products was achieved by co-expression of the gene encoding the alcohol dehydrogenase AlkJ, which was shown to catalyze efficient and irreversible alcohol to aldehyde oxidation in vivo. With DAME as organic phase, the aldehyde accumulated as main product using resting cells containing AlkBGT, AlkL, as well as AlkJ. This study highlights the versatility of whole-cell biocatalysis for synthesis of industrially relevant bifunctional building blocks and demonstrates how integrated reaction and catalyst engineering can be implemented to control product formation patterns in biocatalytic multistep reactions. © 2014 Wiley Periodicals, Inc.
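
    The kinetic picture described above lends itself to a small numerical illustration: a three-step Michaelis-Menten cascade (DAME → alcohol → aldehyde → acid) in which the aqueous DAME supply controls which product accumulates. The sketch below is not the authors' model; all rate constants and feed rates are illustrative placeholders chosen only to reproduce the qualitative trend (alcohol accumulates under substrate excess, acid under limited supply).

```python
# Minimal sketch of the kinetic picture described above: a three-step
# Michaelis-Menten cascade DAME -> alcohol -> aldehyde -> acid, where the
# DAME supply rate (set by the organic carrier phase) controls which product
# accumulates. All parameters are illustrative placeholders, not study values.
from scipy.integrate import solve_ivp

def mm_rate(s, vmax, km):
    return vmax * s / (km + s)

def cascade(t, y, feed, vmax, km):
    dame, alcohol, aldehyde, acid = y
    r1 = mm_rate(dame, vmax[0], km[0])      # DAME     -> alcohol
    r2 = mm_rate(alcohol, vmax[1], km[1])   # alcohol  -> aldehyde
    r3 = mm_rate(aldehyde, vmax[2], km[2])  # aldehyde -> acid
    return [feed - r1, r1 - r2, r2 - r3, r3]

if __name__ == "__main__":
    # High DAME availability (bulk organic phase) vs. limited supply (BEHP carrier)
    for label, feed in [("bulk DAME", 5.0), ("BEHP-limited", 0.5)]:
        sol = solve_ivp(cascade, (0.0, 10.0), [1.0, 0.0, 0.0, 0.0],
                        args=(feed, (4.0, 2.0, 3.0), (1.0, 1.0, 1.0)))
        dame, alcohol, aldehyde, acid = sol.y[:, -1]
        print(f"{label:>12}: alcohol={alcohol:.2f}  aldehyde={aldehyde:.2f}  acid={acid:.2f}")
```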

  2. A multi-step approach for testing non-toxic amphiphilic antifouling coatings against marine microfouling at different levels of biological complexity.

    Science.gov (United States)

    Zecher, Karsten; Aitha, Vishwa Prasad; Heuer, Kirsten; Ahlers, Herbert; Roland, Katrin; Fiedel, Michael; Philipp, Bodo

    2018-03-01

    Marine biofouling on artificial surfaces such as ship hulls or fish farming nets causes enormous economic damage. The development time of antifouling coatings can be shortened by reliable laboratory assays. For designing such test systems, it is important that toxic effects can be excluded, that multiple parameters can be addressed simultaneously and that mechanistic aspects can be included. In this study, a multi-step approach for testing antifouling coatings was established employing photoautotrophic biofilm formation of marine microorganisms in micro- and mesocosms. The degree and pattern of biofilm formation were determined by quantification of chlorophyll fluorescence. For the microcosms, co-cultures of diatoms and a heterotrophic bacterium were exposed to fouling-release coatings. For the mesocosms, a novel device was developed that permits parallel quantification of a multitude of coatings under defined conditions with varying degrees of shear stress. Additionally, the antifouling coatings were tested for leaching of potential compounds and finally tested in sea trials. This multi-step approach revealed that the individual steps led to consistent results regarding the antifouling activity of the coatings. Furthermore, the novel mesocosm system can be employed for advanced antifouling analysis, including metagenomic approaches for determining the microbial diversity attaching to different coatings under changing shear forces. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  4. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    Science.gov (United States)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
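
    As a concrete illustration of the evolutionary search described above, the sketch below evolves component values for a fixed-topology RC low-pass filter toward a target cutoff frequency, with the population's fitness evaluated in parallel. It is not the authors' circuit-construction language or system; the filter, fitness function and GA settings are illustrative assumptions.

```python
# A minimal parallel genetic-algorithm sketch in the spirit of the system
# described above (not the authors' code): it evolves component values for a
# fixed two-element RC low-pass filter toward a target cutoff frequency,
# evaluating the population's fitness in parallel worker processes.
import random
from math import pi
from multiprocessing import Pool

TARGET_HZ = 1_000.0  # desired -3 dB cutoff (illustrative)

def fitness(genome):
    r, c = genome                          # ohms, farads
    cutoff = 1.0 / (2.0 * pi * r * c)
    return -abs(cutoff - TARGET_HZ)        # closer to target = higher fitness

def mutate(genome, rate=0.3):
    return tuple(g * random.uniform(0.8, 1.25) if random.random() < rate else g
                 for g in genome)

def crossover(a, b):
    return tuple(random.choice(pair) for pair in zip(a, b))

def evolve(pop_size=60, generations=80):
    pop = [(random.uniform(1e2, 1e5), random.uniform(1e-9, 1e-5))
           for _ in range(pop_size)]
    with Pool() as pool:
        for _ in range(generations):
            scores = pool.map(fitness, pop)              # parallel evaluation
            ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
            elite = ranked[: pop_size // 4]
            pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                           for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best_r, best_c = evolve()
    print(f"best R = {best_r:.1f} ohm, C = {best_c:.2e} F")
```

    In the published system, circuit size and topology are evolved as well; this sketch only covers the device-value part of the search space.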

  5. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Full Text Available. Background: Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings: Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions: ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  6. ASAP: an environment for automated preprocessing of sequencing data.

    Science.gov (United States)

    Torstenson, Eric S; Li, Bingshan; Li, Chun

    2013-01-04

    Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.
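
    The core automation ideas mentioned above (a configurable sequence of preprocessing steps, job tracking, and resuming failed jobs) can be sketched generically. The snippet below is not ASAP's code and does not use its interface; the step names and commands are placeholders standing in for real alignment, duplicate-marking, variant-calling and annotation tools.

```python
# A schematic sketch (not ASAP's actual code) of the pipeline ideas described
# above: run a fixed sequence of preprocessing steps per sample, record the
# completed steps so failed jobs can be resumed, and keep the step commands
# configurable. The command strings are placeholders, not real tool invocations.
import json
import subprocess
from pathlib import Path

STEPS = [  # (step name, placeholder command template)
    ("align",    "echo aligning {sample}"),
    ("dedup",    "echo marking duplicates for {sample}"),
    ("call",     "echo calling variants for {sample}"),
    ("annotate", "echo annotating variants for {sample}"),
]

def run_sample(sample: str, workdir: Path) -> None:
    state_file = workdir / f"{sample}.state.json"
    done = set(json.loads(state_file.read_text())) if state_file.exists() else set()
    for name, template in STEPS:
        if name in done:            # resume: skip steps that already finished
            continue
        result = subprocess.run(template.format(sample=sample), shell=True)
        if result.returncode != 0:  # stop here; a rerun picks up at this step
            raise RuntimeError(f"{sample}: step '{name}' failed")
        done.add(name)
        state_file.write_text(json.dumps(sorted(done)))

if __name__ == "__main__":
    work = Path("pipeline_state")
    work.mkdir(exist_ok=True)
    for s in ["sampleA", "sampleB"]:
        run_sample(s, work)
```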

  7. ASAP: an environment for automated preprocessing of sequencing data

    Science.gov (United States)

    2013-01-01

    Background: Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings: Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions: ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP. PMID:23289815

  8. Automated addition of Chelex solution to tubes containing trace items

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Thomas Møller; Hansen, Anders Johannes

    2011-01-01

    Extraction of DNA from trace items for forensic genetic DNA typing using a manual Chelex-based extraction protocol requires the addition of Chelex solution to sample tubes containing the trace items. Automated addition of Chelex solution may be hampered by the high viscosity of the solution and the fast sedimentation rate of the Chelex beads. Here, we present a simple method for the Eppendorf epMotion liquid handler that resolves these issues.

  9. Simulation study of multi-step model algorithmic control of the nuclear reactor thermal power tracking system

    International Nuclear Information System (INIS)

    Shi Xiaoping; Xu Tianshu

    2001-01-01

    The classical control approach can hardly ensure thermal power tracking accuracy, because the nuclear reactor system is a complex nonlinear system with uncertain parameters and disturbances. A non-parametric model is constructed from the open-loop impulse response of the system, and a digital thermal power tracking control law is then derived using the multi-step model algorithmic control principle. The proposed control method has good tracking performance and robustness, and it works despite the existence of unmeasurable disturbances. Simulation experiments confirm the correctness and effectiveness of the method: high-accuracy matching between the thermal power and the reference load is achieved.
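
    A minimal sketch of the control idea described above is given below: the plant is represented only by its open-loop impulse response (the non-parametric model), future outputs are predicted over a multi-step horizon, and each control move is chosen by least squares to track the reference power. The impulse response, horizons and set-point profile are illustrative assumptions, not values from the study.

```python
# Minimal sketch of multi-step model algorithmic control: the plant is known
# only through its open-loop impulse-response coefficients, future outputs are
# predicted over a horizon, and the next input is a least-squares control move.
# Plant, horizons and set-points below are illustrative placeholders.
import numpy as np

def simulate(h, reference, n_pred=10, n_steps=120):
    """h: open-loop impulse-response coefficients (the non-parametric model)."""
    N = len(h)
    u = np.zeros(N + n_steps)          # u[N + k] is the input applied at time k
    y = np.zeros(n_steps)
    for k in range(n_steps):
        t = N + k
        # Free response over the horizon: effect of inputs already applied
        free = np.array([
            sum(h[i] * u[t + j - i] for i in range(j + 1, N))
            for j in range(1, n_pred + 1)
        ])
        # Effect of holding the next input constant over the horizon
        g = np.array([h[: j + 1].sum() for j in range(n_pred)])
        target = reference[k + 1 : k + 1 + n_pred] - free
        u[t] = float(g @ target) / float(g @ g)     # least-squares control move
        y[k] = sum(h[i] * u[t - i] for i in range(N))
    return y

if __name__ == "__main__":
    h = 0.12 * 0.75 ** np.arange(25)                            # impulse response
    ref = np.concatenate([np.full(70, 0.8), np.full(70, 1.0)])  # power set-points
    y = simulate(h, ref)
    print("final output vs. reference:", round(float(y[-1]), 4), "vs", ref[len(y) - 1])
```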

  10. Characterization of olive oil volatiles by multi-step direct thermal desorption-comprehensive gas chromatography-time-of-flight mass spectrometry using a programmed temperature vaporizing injector

    NARCIS (Netherlands)

    de Koning, S.; Kaal, E.; Janssen, H.-G.; van Platerink, C.; Brinkman, U.A.Th.

    2008-01-01

    The feasibility of a versatile system for multi-step direct thermal desorption (DTD) coupled to comprehensive gas chromatography (GC × GC) with time-of-flight mass spectrometric (TOF-MS) detection is studied. As an application, the system is used for the characterization of fresh versus aged olive oil.

  11. Improved genetic algorithm in optimization of beam orientation in intensity modulated radiotherapy

    International Nuclear Information System (INIS)

    Ni Xinye; Yang Jianhua; Sun Suping; Yao Yi

    2009-01-01

    Objective: At present, beam orientation selection in intensity-modulated radiotherapy (IMRT) is based mainly on empirical knowledge. This study evaluates the feasibility of automated beam angle selection. Methods: A genetic algorithm technique based on beam's-eye-view dose measurement (BEVD-GA) was tested on two clinical cases, a spinal column cancer and a lung cancer. Three plans were obtained under the following beam configurations: five equiangular-spaced beams, five GA-selected beams, and five BEVD-GA-selected beams. The dose distributions were then compared among the three plans. Results: The method, which restricts the search range of the genetic algorithm before carrying out the genetic operations, not only shortened the optimization time but also improved the optimization result. For both the spinal column cancer and the lung cancer, the best IMRT plans were obtained with the BEVD-GA-selected beams, i.e., with automated beam orientation selection. Conclusions: Compared with conventional manual beam orientation selection, beam orientation optimization is feasible in IMRT planning and may significantly improve its efficiency and results. (authors)

  12. Numerical multistep methods for the efficient solution of quantum mechanics and related problems

    International Nuclear Information System (INIS)

    Anastassi, Z.A.; Simos, T.E.

    2009-01-01

    In this paper we present the recent development in the numerical integration of the Schroedinger equation and related systems of ordinary differential equations with oscillatory solutions, such as the N-body problem. We examine several types of multistep methods (explicit, implicit, predictor-corrector, hybrid) and several properties (P-stability, trigonometric fitting of various orders, phase fitting, high phase-lag order, algebraic order). We analyze the local truncation error and the stability of the methods. The error for the Schroedinger equation is also presented, which reveals the relation of the error to the energy. The efficiency of the methods is evaluated through the integration of five problems. Figures are presented and analyzed and some general conclusions are made. Code written in Maple is given for the development of all methods analyzed in this paper. Also the subroutines written in Matlab, that concern the integration of the methods, are presented.
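
    As a worked example of the kind of method surveyed above (it is not taken from the paper itself), consider the symmetric two-step scheme for y'' = f(x, y) and its trigonometrically fitted variant, which removes the phase lag for the test equation y'' = -ω²y:

```latex
% Worked example (not from the paper itself): a symmetric two-step method for
% y'' = f(x, y), written with a free coefficient b_1.
\[
  y_{n+1} - 2 y_n + y_{n-1}
  = h^2 \Bigl[ b_1 \bigl( f_{n+1} + f_{n-1} \bigr) + (1 - 2 b_1)\, f_n \Bigr],
  \qquad b_1 = \tfrac{1}{12} \ \text{(classical Numerov method)}.
\]
% Phase fitting: demand that the scheme solve the test equation
% y'' = -\omega^2 y exactly, with v = \omega h. This fixes b_1 as a function
% of v and makes the phase lag vanish at that frequency:
\[
  b_1(v) = \frac{1}{2 v^2} \left( \frac{v^2}{1 - \cos v} - 2 \right),
  \qquad b_1(v) \;\longrightarrow\; \tfrac{1}{12} \quad (v \to 0).
\]
```

    In the Schrödinger case the fitting frequency is tied to the local value of E − V(x), which is consistent with the energy dependence of the error noted above.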

  13. Automating quantum dot barcode assays using microfluidics and magnetism for the development of a point-of-care device.

    Science.gov (United States)

    Gao, Yali; Lam, Albert W Y; Chan, Warren C W

    2013-04-24

    The impact of detecting multiple infectious diseases simultaneously at point-of-care with good sensitivity, specificity, and reproducibility would be enormous for containing the spread of diseases in both resource-limited and rich countries. Many barcoding technologies have been introduced for addressing this need as barcodes can be applied to detecting thousands of genetic and protein biomarkers simultaneously. However, the assay process is not automated and is tedious and requires skilled technicians. Barcoding technology is currently limited to use in resource-rich settings. Here we used magnetism and microfluidics technology to automate the multiple steps in a quantum dot barcode assay. The quantum dot-barcoded microbeads are sequentially (a) introduced into the chip, (b) magnetically moved to a stream containing target molecules, (c) moved back to the original stream containing secondary probes, (d) washed, and (e) finally aligned for detection. The assay requires 20 min, has a limit of detection of 1.2 nM, and can detect genetic targets for HIV, hepatitis B, and syphilis. This study provides a simple strategy to automate the entire barcode assay process and moves barcoding technologies one step closer to point-of-care applications.

  14. MSBIS: A Multi-Step Biomedical Informatics Screening Approach for Identifying Medications that Mitigate the Risks of Metoclopramide-Induced Tardive Dyskinesia

    OpenAIRE

    Dong Xu; Alexandrea G. Ham; Rickey D. Tivis; Matthew L. Caylor; Aoxiang Tao; Steve T. Flynn; Peter J. Economen; Hung K. Dang; Royal W. Johnson; Vaughn L. Culbertson

    2017-01-01

    In 2009 the U.S. Food and Drug Administration (FDA) placed a black box warning on metoclopramide (MCP) due to the increased risks and prevalence of tardive dyskinesia (TD). In this study, we developed a multi-step biomedical informatics screening (MSBIS) approach leveraging publicly available bioactivity and drug safety data to identify concomitant drugs that mitigate the risks of MCP-induced TD. MSBIS includes (1) TargetSearch (http://dxulab.org/software) bioinformatics scoring for drug anti...

  15. Multi-step process for concentrating magnetic particles in waste sludges

    Science.gov (United States)

    Watson, John L.

    1990-01-01

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed.

  16. Preparation of molecularly imprinted polymers for strychnine by precipitation polymerization and multistep swelling and polymerization and their application for the selective extraction of strychnine from nux-vomica extract powder.

    Science.gov (United States)

    Nakamura, Yukari; Matsunaga, Hisami; Haginaka, Jun

    2016-04-01

    Monodisperse molecularly imprinted polymers for strychnine were prepared by precipitation polymerization and multistep swelling and polymerization, respectively. In precipitation polymerization, methacrylic acid and divinylbenzene were used as a functional monomer and crosslinker, respectively, while in multistep swelling and polymerization, methacrylic acid and ethylene glycol dimethacrylate were used as a functional monomer and crosslinker, respectively. The retention and molecular recognition properties of the molecularly imprinted polymers prepared by both methods for strychnine were evaluated using a mixture of sodium phosphate buffer and acetonitrile as a mobile phase by liquid chromatography. In addition to shape recognition, ionic and hydrophobic interactions could affect the retention of strychnine in low acetonitrile content. Furthermore, molecularly imprinted polymers prepared by both methods could selectively recognize strychnine among solutes tested. The retention factors and imprinting factors of strychnine on the molecularly imprinted polymer prepared by precipitation polymerization were 220 and 58, respectively, using 20 mM sodium phosphate buffer (pH 6.0)/acetonitrile (50:50, v/v) as a mobile phase, and those on the molecularly imprinted polymer prepared by multistep swelling and polymerization were 73 and 4.5. These results indicate that precipitation polymerization is suitable for the preparation of a molecularly imprinted polymer for strychnine. Furthermore, the molecularly imprinted polymer could be successfully applied for selective extraction of strychnine in nux-vomica extract powder. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Multi-step processes in the (d, t) and (d, {sup 3}He) reactions on {sup 116}Sn and {sup 208}Pb targets at E{sub d} = 200 MeV

    Energy Technology Data Exchange (ETDEWEB)

    Langevin-Joliot, H.; Van de Wiele, J.; Guillot, J. [Institut de Physique Nucleaire, IN2P3/CNRS, 91 - Orsay (France); Koning, A.J. [Nuclear Research and Consultancy Group NRG, NL (Netherlands)

    2000-07-01

    The role of multi-step processes in the reactions {sup 116}Sn(d,t), {sup 208}Pb(d,t) and {sup 116}Sn(d,{sup 3}He), previously studied at E{sub d} = 200 MeV at forward angles and for relatively low energy transfers, has been investigated. We have performed, for the first time, multi-step calculations that systematically take into account collective excitations in the second- and higher-order inelastic transition steps. A calculation code based on the Feshbach, Kerman and Koonin model has been modified to handle these collective excitations explicitly, since they are most important in the forward-angle domain. One-step double differential pick-up cross sections were built from finite-range distorted-wave results spread in energy using known or estimated hole-state characteristics. It is shown that two-step cross sections calculated using the above method compare rather well with those deduced via coupled-channel calculations for the same collective excitations. The multi-step calculations, performed up to 6 steps, reproduce reasonably well the {sup 115}Sn, {sup 207}Pb and {sup 115}In experimental spectra measured up to E{sub x} {approx} 40 MeV and up to 15 deg. The relative contributions of steps of increasing order to pick-up cross sections at E{sub d} = 200 MeV and 150 MeV are discussed. (authors)

  18. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  19. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  20. Optimal Installation Locations for Automated External Defibrillators in Taipei 7-Eleven Stores: Using GIS and a Genetic Algorithm with a New Stirring Operator

    Directory of Open Access Journals (Sweden)

    Chung-Yuan Huang

    2014-01-01

    Full Text Available Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.
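
    The spatial scoring that such a genetic algorithm needs can be sketched simply. The snippet below is not the GANSO code and omits the stirring operator entirely; it only shows how candidate store placements could be scored by the fraction of historical OHCA incidents falling within the walking (100 m) and driving (300 m) service distances, using assumed metric coordinates and made-up sample points.

```python
# A small sketch of the kind of spatial scoring such a GA would need (not the
# GANSO code; the stirring operator is omitted): given candidate 7-Eleven
# locations and historical OHCA incident points, compute the fraction of
# incidents within the walking (100 m) or driving (300 m) service distance of
# at least one AED-equipped store. Coordinates are assumed metric; data are made up.
from math import hypot

WALK_M, DRIVE_M = 100.0, 300.0

def coverage(stores, incidents, radius):
    """Fraction of incidents within `radius` metres of any selected store."""
    covered = sum(
        1 for ix, iy in incidents
        if any(hypot(ix - sx, iy - sy) <= radius for sx, sy in stores)
    )
    return covered / len(incidents) if incidents else 0.0

def fitness(selected_stores, incidents, w_walk=0.7, w_drive=0.3):
    # Weighted blend of the two conveyance modes (weights are placeholders).
    return (w_walk * coverage(selected_stores, incidents, WALK_M)
            + w_drive * coverage(selected_stores, incidents, DRIVE_M))

if __name__ == "__main__":
    stores = [(0.0, 0.0), (250.0, 0.0)]
    incidents = [(30.0, 40.0), (260.0, 90.0), (800.0, 800.0)]
    print(f"walking coverage: {coverage(stores, incidents, WALK_M):.2f}")
    print(f"fitness of this placement: {fitness(stores, incidents):.2f}")
```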

  1. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  2. Expression Patterns of Cancer Stem Cell Markers During Specific Celecoxib Therapy in Multistep Rat Colon Carcinogenesis Bioassays.

    Science.gov (United States)

    Salim, Elsayed I; Hegazi, Mona M; Kang, Jin Seok; Helmy, Hager M

    2016-01-01

    The purpose of this study was to investigate the role of colon cancer stem cells (CSCs) during chemically induced rat multi-step colon carcinogenesis with or without treatment with a specific cyclooxygenase-2 inhibitor drug (celecoxib). Two experiments were performed: the first was a short-term 12-week colon carcinogenesis bioassay in which only surrogate markers for colon cancer, aberrant crypt foci (ACF) lesions, were formed. The other experiment was a medium-term colon cancer rat assay in which tumors had developed after 32 weeks. Treatment with celecoxib lowered the numbers of ACF, as well as the tumor volumes and multiplicities after 32 weeks. Immunohistochemical proliferating cell nuclear antigen (PCNA) labeling indexes (LI, %) were downregulated after treatment with celecoxib. In addition, different cell surface antigens known to associate with CSCs, such as the epithelial cell adhesion molecule (EpCAM), CD44 and CD133, were compared between the two experiments and showed differential expression patterns depending on the stage of carcinogenesis and treatment with celecoxib. Flow cytometric analysis demonstrated that the numbers of CD133 cells were increased in the colonic epithelium after 12 weeks, while those of CD44 but not CD133 cells were increased after 32 weeks. Moreover, aldehyde dehydrogenase-1 activity levels in the colonic epithelium (a known CSC marker) detected by ELISA were found to be down-regulated after 12 weeks, but up-regulated after 32 weeks. The data also show that the protective effect of celecoxib on these specific markers and populations of CSCs, and on other molecular processes such as apoptosis targeted by this drug, may vary depending on the genetic and phenotypic stages of carcinogenesis. Therefore, uncovering these distinct roles of CSCs during different phases of carcinogenesis and during specific treatment could be useful for targeted therapy.

  3. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be performed manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also, included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  4. Multi-step prediction for influenza outbreak by an adjusted long short-term memory.

    Science.gov (United States)

    Zhang, J; Nawata, K

    2018-05-01

    Influenza results in approximately 3-5 million annual cases of severe illness and 250 000-500 000 deaths. We urgently need an accurate multi-step-ahead time-series forecasting model to help hospitals perform dynamic assignment of beds to influenza patients for the annually varying influenza season, and to aid pharmaceutical companies in formulating a flexible plan for manufacturing the yearly different influenza vaccine. In this study, we utilised four different multi-step prediction algorithms in the long short-term memory (LSTM) framework. The results showed that implementing multiple single-output prediction in a six-layer LSTM structure achieved the best accuracy, and the mean absolute percentage errors from two- to 13-step-ahead prediction of the US influenza-like illness rates were all low. The LSTM has thus been applied and refined to perform multi-step-ahead prediction for influenza outbreaks. Hopefully, this modelling methodology can be applied in other countries and thereby help prevent and control influenza worldwide.
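
    The "multiple single-output" strategy mentioned above can be sketched in a few lines: one single-output LSTM regressor is trained per forecast horizon, so each h-step-ahead value is predicted directly rather than recursively. The sketch below uses PyTorch and is not the authors' six-layer architecture; the window length, layer sizes and the synthetic series are placeholders.

```python
# Schematic PyTorch sketch of the "multiple single-output" multi-step strategy:
# one single-output LSTM regressor per forecast horizon, predicting the
# h-step-ahead value directly. Architecture and data are illustrative only.
import torch
import torch.nn as nn

class SingleStepLSTM(nn.Module):
    def __init__(self, hidden=32, layers=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            num_layers=layers, batch_first=True)
        self.head = nn.Linear(hidden, 1)           # single output per model

    def forward(self, x):                          # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)

def make_windows(series, window, horizon):
    xs, ys = [], []
    for i in range(len(series) - window - horizon + 1):
        xs.append(series[i:i + window])
        ys.append(series[i + window + horizon - 1])
    x = torch.tensor(xs, dtype=torch.float32).unsqueeze(-1)
    return x, torch.tensor(ys, dtype=torch.float32)

def train_direct_models(series, window=12, max_horizon=4, epochs=200):
    models = {}
    for h in range(1, max_horizon + 1):            # one model per horizon
        x, y = make_windows(series, window, h)
        model = SingleStepLSTM()
        opt = torch.optim.Adam(model.parameters(), lr=1e-2)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
        models[h] = model
    return models

if __name__ == "__main__":
    ili = torch.sin(torch.linspace(0, 12, 160)) * 0.5 + 1.0   # toy ILI-like series
    models = train_direct_models(ili.tolist())
    last_window = ili[-12:].reshape(1, 12, 1)
    forecast = {h: float(m(last_window)) for h, m in models.items()}
    print(forecast)
```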

  5. Influence of multi-step washing using Na2EDTA, oxalic acid and phosphoric acid on metal fractionation and spectroscopy characteristics from contaminated soil.

    Science.gov (United States)

    Wei, Meng; Chen, Jiajun

    2016-11-01

    A multi-step soil washing test using a typical chelating agent (Na2EDTA), organic acid (oxalic acid), and inorganic weak acid (phosphoric acid) was conducted to remediate soil contaminated with heavy metals near an arsenic mining area. The aim of the test was to improve the heavy metal removal efficiency and investigate its influence on metal fractionation and the spectroscopic characteristics of the contaminated soil. The results indicated that the order of the multi-step washing was critical for the removal efficiencies of the metal fractions, bioavailability, and potential mobility, owing to the different dissolution levels of the mineral fractions and the inter-transformation of metal fractions shown by XRD and FT-IR spectral analyses. The optimal soil washing options were identified as the Na2EDTA-phosphoric-oxalic acid (EPO) and phosphoric-oxalic acid-Na2EDTA (POE) sequences because of their high removal efficiencies (approximately 45 % for arsenic and 88 % for cadmium) and the minimal harmful effects, as determined from the mobility and bioavailability of the remaining heavy metals based on the metal stability (IR) and modified redistribution index.

  6. Robotic Automation of In Vivo Two-Photon Targeted Whole-Cell Patch-Clamp Electrophysiology.

    Science.gov (United States)

    Annecchino, Luca A; Morris, Alexander R; Copeland, Caroline S; Agabi, Oshiorenoya E; Chadderton, Paul; Schultz, Simon R

    2017-08-30

    Whole-cell patch-clamp electrophysiological recording is a powerful technique for studying cellular function. While in vivo patch-clamp recording has recently benefited from automation, it is normally performed "blind," meaning that throughput for sampling some genetically or morphologically defined cell types is unacceptably low. One solution to this problem is to use two-photon microscopy to target fluorescently labeled neurons. Combining this with robotic automation is difficult, however, as micropipette penetration induces tissue deformation, moving target cells from their initial location. Here we describe a platform for automated two-photon targeted patch-clamp recording, which solves this problem by making use of a closed loop visual servo algorithm. Our system keeps the target cell in focus while iteratively adjusting the pipette approach trajectory to compensate for tissue motion. We demonstrate platform validation with patch-clamp recordings from a variety of cells in the mouse neocortex and cerebellum. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
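
    Conceptually, the closed-loop visual servo described above amounts to re-localising the target cell after every small pipette advance and recomputing the remaining approach vector. The sketch below is only a schematic of that loop, not the authors' platform: locate_cell_um and move_pipette_um stand in for real two-photon imaging and micromanipulator interfaces and are hypothetical.

```python
# Conceptual sketch of a closed-loop visual servo approach (not the authors'
# platform): after each small pipette advance, the target cell is re-localised
# and the remaining approach vector is recomputed, so displacement of the cell
# by tissue deformation is compensated iteratively. The hardware interfaces
# `locate_cell_um` and `move_pipette_um` are hypothetical stand-ins.
import numpy as np

def visual_servo_approach(locate_cell_um, move_pipette_um, pipette_um,
                          step_um=2.0, tol_um=1.0, max_iter=500):
    """Drive the pipette tip toward the (moving) cell centroid, re-imaging each step."""
    pipette = np.asarray(pipette_um, dtype=float)
    for _ in range(max_iter):
        target = np.asarray(locate_cell_um(), dtype=float)   # re-localise the cell
        error = target - pipette
        distance = np.linalg.norm(error)
        if distance <= tol_um:
            return pipette                       # within patching tolerance
        step = error / distance * min(step_um, distance)
        pipette = pipette + step
        move_pipette_um(step)                    # command a relative move
    raise RuntimeError("cell not reached within the iteration budget")

if __name__ == "__main__":
    # Toy demonstration: the "cell" drifts slightly as the pipette advances.
    state = {"cell": np.array([50.0, 30.0, -120.0]), "pipette": np.zeros(3)}

    def locate_cell_um():
        state["cell"] += np.random.normal(scale=0.3, size=3)   # tissue motion
        return state["cell"]

    def move_pipette_um(step):
        state["pipette"] += step

    tip = visual_servo_approach(locate_cell_um, move_pipette_um, state["pipette"])
    print("final tip position (um):", np.round(tip, 1))
```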

  7. Classical and molecular genetics of malignant melanoma and dysplastic naevi

    International Nuclear Information System (INIS)

    Traupe, H.; Macher, E.

    1988-01-01

    The authors conclude that the prevailing concept of monogenic autosomal-dominant inheritance of dysplastic naevi and familial melanoma is not compatible with the principles of formal (Mendelian) genetics. The concept of polygenic inheritance offers instead a sound basis to explain familial aggregation of dysplastic naevi and melanoma. The various genes involved have not yet been identified at the molecular level. The recent advances made possible by modern DNA technology have given us a new view of carcinogenesis. In human malignant melanoma, chromosomes 1, 6 and 7 are of particular interest, and oncogenes located on these chromosomes may be involved in the initiation, promotion and progression of melanoma. Carcinogenesis is viewed as a multistep process, and even tumour initiation requires the input of at least two independent oncogenes. Molecular genetics thus adds an important argument for the existence of a polygenic predisposition to melanoma. The concept of polygenic inheritance is not restricted to familial melanoma, but implies that all melanomas basically share the same predisposition and are due to similar genetic mechanisms. In some patients an inherited genetic predisposition is of great importance, whereas in others (the majority) environmental factors (e.g. UV-light-induced mutations) will be the cause of initial steps in the malignant transformation. The concept of polygenic inheritance has consequences for the management of our patients. In contrast to simple Mendelian inheritance, the risk for dysplastic naevi and melanoma is not constantly 50%, but increases with the number of family members already affected. Persons belonging to families with more than 2 affected close relatives should be considered at high risk regardless of the dysplastic naevus status. Strict surveillance of this patient group is warranted for melanoma prevention.

  8. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  9. Comparison of manual and automated AmpliSeq™ workflows in the typing of a Somali population with the Precision ID Identity Panel.

    Science.gov (United States)

    van der Heijden, Suzanne; de Oliveira, Susanne Juel; Kampmann, Marie-Louise; Børsting, Claus; Morling, Niels

    2017-11-01

    The Precision ID Identity Panel was used to type 109 Somali individuals in order to obtain allele frequencies for the Somali population. These frequencies were used to establish a Somali HID-SNP database, which will be used for the biostatistic calculations in family and immigration cases. Genotypes obtained with the Precision ID Identity Panel were found to be almost in complete concordance with genotypes obtained with the SNPforID PCR-SBE-CE assay. In seven SNP loci, silent alleles were identified, of which most were previously described in the literature. The project also set out to compare different AmpliSeq™ workflows to investigate the possibility of using automated library building in forensic genetic case work. In order to do so, the SNP typing of the Somalis was performed using three different workflows: 1) manual library building and sequencing on the Ion PGM™, 2) automated library building using the Biomek ® 3000 and sequencing on the Ion PGM™, and 3) automated library building using the Ion Chef™ and sequencing on the Ion S5™. AmpliSeq™ workflows were compared based on coverage, locus balance, noise, and heterozygote balance. Overall, the Ion Chef™/Ion S5™ workflow was found to give the best results and required least hands-on time in the laboratory. However, the Ion Chef™/Ion S5™ workflow was also the most expensive. The number of libraries that may be constructed in one Ion Chef™ library building run was limited to eight, which is too little for high throughput workflows. The Biomek ® 3000/Ion PGM™ workflow was found to perform similarly to the manual/Ion PGM™ workflow. This argues for the use of automated library building in forensic genetic case work. Automated library building decreases the workload of the laboratory staff, decreases the risk of pipetting errors, and simplifies the daily workflow in forensic genetic laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  11. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    Science.gov (United States)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become

  12. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors while enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to the side effects of automation, referred to as the Out-of-the-Loop (OOTL) problem, and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation that assures the best human operator performance, a quantitative method of optimizing the automation is proposed in this paper. In order to propose the optimization method for determining appropriate automation levels that enable the best human performance, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration was conducted in order to derive the shortest working time by considering the concept of situation awareness recovery (SAR), which states that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed through redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  13. EpiJen: a server for multistep T cell epitope prediction

    Directory of Open Access Journals (Sweden)

    Guan Pingping

    2006-03-01

    Full Text Available Abstract Background The main processing pathway for MHC class I ligands involves degradation of proteins by the proteasome, followed by transport of products by the transporter associated with antigen processing (TAP) to the endoplasmic reticulum (ER), where peptides are bound by MHC class I molecules, and then presented on the cell surface by MHCs. The whole process is modeled here using an integrated approach, which we call EpiJen. EpiJen is based on quantitative matrices, derived by the additive method, and applied successively to select epitopes. EpiJen is available free online. Results To identify epitopes, a source protein is passed through four steps: proteasome cleavage, TAP transport, MHC binding and epitope selection. At each stage, different proportions of non-epitopes are eliminated. The final set of peptides represents no more than 5% of the whole protein sequence and will contain 85% of the true epitopes, as indicated by external validation. Compared to other integrated methods (NetCTL, WAPP and SMM), EpiJen performs best, predicting 61 of the 99 HIV epitopes used in this study. Conclusion EpiJen is a reliable multi-step algorithm for T cell epitope prediction, which belongs to the next generation of in silico T cell epitope identification methods. These methods aim to reduce subsequent experimental work by improving the success rate of epitope prediction.
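
    The four-step selection described above lends itself to a simple illustration: score all overlapping 9-mers of a source protein at each stage and keep only a fraction of them, so that the surviving set shrinks step by step. The sketch below is only a schematic of that pipeline structure, assuming placeholder additive scorers and arbitrary retention fractions; it does not reproduce the published EpiJen matrices or their thresholds.

```python
# Schematic multi-step epitope filtering pipeline: each stage scores 9-mer
# peptides and keeps only a fraction, so successive stages (proteasome cleavage,
# TAP transport, MHC binding) eliminate different proportions of non-epitopes.
# The scorers and retention fractions are placeholders, not EpiJen's matrices.
from typing import Callable, List, Tuple

def nonamers(protein: str) -> List[str]:
    """Enumerate all overlapping 9-mer peptides of a protein sequence."""
    return [protein[i:i + 9] for i in range(len(protein) - 8)]

def run_pipeline(protein: str,
                 stages: List[Tuple[str, Callable[[str], float], float]]) -> List[str]:
    """Apply each (name, scorer, keep_fraction) stage in turn."""
    peptides = nonamers(protein)
    for name, scorer, keep_fraction in stages:
        peptides.sort(key=scorer, reverse=True)
        peptides = peptides[:max(1, int(len(peptides) * keep_fraction))]
        print(f"{name:20s} kept {len(peptides):3d} peptides")
    return peptides

# Placeholder additive scorer: sum of arbitrary per-residue weights.
WEIGHTS = {aa: i / 20.0 for i, aa in enumerate("ACDEFGHIKLMNPQRSTVWY")}
def additive_score(peptide: str) -> float:
    return sum(WEIGHTS.get(aa, 0.0) for aa in peptide)

stages = [
    ("proteasome cleavage", additive_score, 0.5),
    ("TAP transport",       additive_score, 0.4),
    ("MHC binding",         additive_score, 0.25),  # 0.5 * 0.4 * 0.25 = 5% retained
]

if __name__ == "__main__":
    toy_protein = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
    print(run_pipeline(toy_protein, stages))
```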

  14. A data acquisition system for measuring ionization cross section in laser multi-step resonant ionization experiment

    International Nuclear Information System (INIS)

    Qian Dongbin; Guo Yuhui; Zhang Dacheng; Chinese Academy of Sciences, Beijing; Ma Xinwen; Zhao Zhizheng; Wang Yanyu; Zu Kailing

    2006-01-01

    A CAMAC data acquisition system for measuring ionization cross sections in laser multi-step resonant ionization experiments is described. The number of scalers in the front-end CAMAC can be adjusted by changing the data read-out table files. Both continuous and manual acquisition modes are available, and the acquisition time unit is adjustable over a wide range, from 1 ms to 800 s. The long-term stability, Δt/t, of the data acquisition system with an acquisition time unit of 100 s was measured to be better than ±0.01%, validating its reliability in long-term online experimental data acquisition. The time response curves of three electrothermal power meters were also measured with this DAQ system. (authors)

  15. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of the system

  16. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, typical examples of the various levels of automation are given. One example is an automated production line for ceramic fuel pellets. (M.G.B.)

  17. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  18. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  19. Assessment of an automated capillary system for Plasmodium vivax microsatellite genotyping.

    Science.gov (United States)

    Manrique, Paulo; Hoshi, Mari; Fasabi, Manuel; Nolasco, Oscar; Yori, Pablo; Calderón, Martiza; Gilman, Robert H; Kosek, Margaret N; Vinetz, Joseph M; Gamboa, Dionicia

    2015-08-21

    Several platforms have been used to generate the primary data for microsatellite analysis of malaria parasite genotypes. Each has relative advantages but all share the limitation of being time- and cost-intensive. A commercially available automated capillary gel cartridge system was assessed for the microsatellite analysis of Plasmodium vivax diversity in the Peruvian Amazon. The reproducibility and accuracy of this commercially available automated capillary system, the QIAxcel, were assessed using a sequenced PCR product of 227 base pairs. This product was measured 42 times; then 27 P. vivax samples from Peruvian Amazon subjects were analyzed with this instrument using five informative microsatellites. Results from the QIAxcel system were compared with those from a Sanger-type sequencing machine, the ABI PRISM® 3100 Genetic Analyzer. Significant differences were seen between the sequenced amplicons and the results from the QIAxcel instrument. Different runs, plates and cartridges yielded significantly different results. Additionally, allele size decreased with each run by 0.045 bp, or 1 bp every three plates. The QIAxcel system gave different values than those obtained with the ABI PRISM, and too many (i.e., inaccurate) alleles per locus were also seen with the automated instrument. While P. vivax diversity could generally be estimated using an automated capillary gel cartridge system, the data demonstrate that this system is not sufficiently precise for reliably identifying parasite strains via microsatellite analysis. This conclusion, reached after systematic analysis, was due both to inadequate precision and to poor reproducibility in measuring PCR product size.

  20. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving a software development process by shortening the testing phase in the software development life cycle. In this project an existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  1. Grade Distribution Modeling within the Bauxite Seams of the Wachangping Mine, China, Using a Multi-Step Interpolation Algorithm

    Directory of Open Access Journals (Sweden)

    Shaofeng Wang

    2017-05-01

    Full Text Available Mineral reserve estimation and mining design depend on a precise modeling of the mineralized deposit. A multi-step interpolation algorithm, including 1D biharmonic spline estimation for interpolating floor altitudes, 2D nearest neighbor, linear, natural neighbor, cubic, biharmonic spline, inverse distance weighted, simple kriging, and ordinary kriging interpolations for grade distribution on the two vertical sections at roadways, and 3D linear interpolation for grade distribution between sections, was proposed to build a 3D grade distribution model of the mineralized seam in a longwall mining panel with a U-shaped layout having two roadways at both sides. Compared to field data from exploratory boreholes, this multi-step interpolation using a natural neighbor method shows an optimal stability and a minimal difference between interpolation and field data. Using this method, 97,576 m3 of bauxite, in which the mass fraction of Al2O3 (Wa) and the mass ratio of Al2O3 to SiO2 (Wa/s) are 61.68% and 27.72, respectively, was delimited from the 189,260 m3 mineralized deposit in the 1102 longwall mining panel in the Wachangping mine, Southwest China. The mean absolute errors, the root mean squared errors and the relative standard deviations of errors between interpolated data and exploratory grade data at six boreholes are 2.544, 2.674, and 32.37% for Wa; and 1.761, 1.974, and 67.37% for Wa/s, respectively. The proposed method can be used for characterizing the grade distribution in a mineralized seam between two roadways at both sides of a longwall mining panel.
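
    The comparison step, checking each candidate interpolator against held-out exploratory borehole data, can be sketched in a few lines. The example below uses SciPy's griddata, which offers nearest, linear, and cubic scattered-data interpolation; the natural-neighbor and kriging variants used in the paper would require other libraries (e.g. PyKrige), and the synthetic sample points and grade model here are purely illustrative.

```python
# Sketch of comparing 2-D interpolation methods for a grade attribute (e.g. the
# Al2O3 mass fraction Wa) against held-out "borehole" control points. Synthetic
# data stand in for the roadway samples; the error metric is the mean absolute
# error at the boreholes, analogous to the paper's comparison of methods.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Synthetic roadway samples on a vertical section: (x, z) positions and grades.
sample_xz = rng.uniform(0, 100, size=(60, 2))
sample_wa = 55 + 10 * np.sin(sample_xz[:, 0] / 15.0) + rng.normal(0, 1, 60)

# Held-out borehole control points with "true" grades from the same model.
bore_xz = rng.uniform(10, 90, size=(6, 2))
bore_wa = 55 + 10 * np.sin(bore_xz[:, 0] / 15.0)

for method in ("nearest", "linear", "cubic"):
    est = griddata(sample_xz, sample_wa, bore_xz, method=method)
    ok = ~np.isnan(est)            # linear/cubic return NaN outside the convex hull
    mae = np.mean(np.abs(est[ok] - bore_wa[ok]))
    print(f"{method:8s} MAE at boreholes: {mae:.3f} (on {ok.sum()} points)")
```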

  2. Fabrication and characterization of functionally graded Al–SiC nanocomposite by using a novel multistep friction stir processing

    International Nuclear Information System (INIS)

    Salehi, Mojtaba; Farnoush, Hamidreza; Mohandesi, Jamshid Aghazadeh

    2014-01-01

    Highlights: • A novel multistep FSP to fabricate functionally-graded bulk nanocomposites. • The composition of FG sample was changed from 18 to 0 wt% SiC along five layers. • The 3.2 times higher hardness than the base metal in FG sample along the thickness. • The 15% enhancement in hardness of FG sample with respect to homogenous sample. • Microhardness and inverse of interparticle spacing was correlated linearly. - Abstract: Functionally graded materials are one of the most promising candidates among advanced materials. However, some challenges still exist in its fabrication methods. The current study aims to produce functionally-graded bulk Al–SiC nanocomposites by a novel multistep friction stir processing (FSP) for the first time. The SiC nanoparticles were packed into a groove on the 6061 aluminum plate and FSP was performed by using a tool with pin length of 6 mm. Subsequently, FSP was reapplied on another groove by using a tool with a shorter pin length of 3.2 mm. The desirable distribution of SiC nanoparticles in the matrix was confirmed by scanning electron and atomic force microscopes. The composition of graded sample was changed continuously from 18 to 0 wt% SiC along the thickness. Accordingly, the microhardness profile showed a maximum of 160 Hv in the enriched zone which is 3.2 times higher than the hardness of the particle-depleted zone. However, a constant hardness value of 135 Hv was obtained along the thickness of homogenous sample which is 15% lower than that of superficial layer in graded sample. Moreover, the hardness values were linearly correlated with the inverse of interparticle spacing

  3. Alternative statistics in multi-step direct reaction theory

    International Nuclear Information System (INIS)

    Koning, A.J.

    1990-06-01

    In recent years a variety of statistical theories have been developed concerning multistep direct (MSD) nuclear reactions. In addition, dominant in applications is a whole class of semiclassical models that may be subsumed under the heading of 'generalized exciton model': these are basically MSD-type extensions on top of compound-like concepts. In this report the relationship between their underlying statistical MSD postulates is highlighted. A common framework is sketched that makes it possible to generate the various MSD theories by assigning statistical properties to different parts of the nuclear Hamiltonian. It is then shown that distinct forms of nuclear randomness are embodied in the mentioned theories. All these theories appear to be very similar at a qualitative level. In order to explain the high-energy tails and forward-peaked angular distributions typical for particles emitted in MSD reactions, it is imagined that the incident continuum particle stepwise loses its energy and direction in a sequence of collisions, thereby creating new particle-hole pairs in the target system. At each step emission may take place. The statistical aspect comes in because many continuum states are involved in the process. These are supposed to display chaotic behavior, the associated randomness assumption giving rise to important simplifications in the expressions for the MSD emission cross sections. This picture suggests that the mentioned MSD models can be interpreted as variants of essentially one and the same theory. However, this appears not to be the case. To show this, the usual MSD distinction within the composite reacting nucleus between the fast continuum particle and the residual system is introduced. One implication is that the mutual residual interactions of the nucleons of the residual core are to be distinguished from those of the leading particle with the residual system. This distinction will turn out to be central to the present analysis. (author). 14 refs.; 4

  4. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  5. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    Science.gov (United States)

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample are unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a nearly complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and the Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the

  6. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully-automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  7. Simulation-Based Optimization for Storage Allocation Problem of Outbound Containers in Automated Container Terminals

    Directory of Open Access Journals (Sweden)

    Ning Zhao

    2015-01-01

    Full Text Available Storage allocation of outbound containers is a key factor in the performance of the container handling system in automated container terminals. Improper storage plans for outbound containers make quay crane (QC) waiting inevitable; hence, the vessel handling time is lengthened. A simulation-based optimization method is proposed in this paper for the storage allocation problem of outbound containers in automated container terminals (SAPOBA). A simulation model is built with a Timed Colored Petri Net (TCPN) and used to evaluate the QC waiting time of storage plans. Two optimization approaches, based on Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA), are proposed to form the complete simulation-based optimization method. The effectiveness of this method is verified by experiments comparing the two optimization approaches.
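
    The division of labor in such a method is easy to sketch: an evolutionary search proposes storage plans, and a simulator scores each plan by the QC waiting time it causes. The toy GA below stands in for one of the two optimization approaches; the TCPN simulation is replaced by a crude congestion penalty, and all sizes, rates, and names are illustrative assumptions.

```python
# Toy GA for outbound-container storage allocation: a chromosome assigns each
# container group to a yard block; fitness is the QC waiting time reported by a
# simulator. Here the TCPN model is replaced by a quadratic congestion penalty.
import random

N_GROUPS, N_BLOCKS = 20, 6
POP, GENS, MUT = 40, 80, 0.1

def simulated_qc_waiting(assignment):
    """Placeholder for the simulation model: penalize congested blocks."""
    load = [0] * N_BLOCKS
    for block in assignment:
        load[block] += 1
    return sum(l * l for l in load)

def crossover(a, b):
    cut = random.randint(1, N_GROUPS - 1)
    return a[:cut] + b[cut:]

def mutate(chrom):
    return [random.randrange(N_BLOCKS) if random.random() < MUT else g for g in chrom]

population = [[random.randrange(N_BLOCKS) for _ in range(N_GROUPS)] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=simulated_qc_waiting)          # rank by simulated waiting time
    parents = population[:POP // 2]                    # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = min(population, key=simulated_qc_waiting)
print("best assignment:", best, "waiting proxy:", simulated_qc_waiting(best))
```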

  8. Improving the performance of the MWPC X-ray imaging detector by means of the Multi-Step Avalanche technique

    International Nuclear Information System (INIS)

    Bateman, J.E.; Connolly, J.F.; Stephenson, R.

    1984-01-01

    X-ray imaging systems based on conventional MWPC technology and artificial delay line readout techniques have been developed at RAL for several applications over a period of some eight years. It is perceived that very limited scope exists for the further improvement of the imaging capability of the standard MWPC design. Attention has therefore been turned to the possibility of exploiting the Multi-Step Avalanche (MSA) system of electron multiplication in this context. Results from a prototype system are presented which show spatial resolution better than that achieved in the MWPC systems. The facility for controlling the effective depth of the detector electronically is also demonstrated. (author)

  9. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  10. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  11. Routine human-competitive machine intelligence by means of genetic programming

    Science.gov (United States)

    Koza, John R.; Streeter, Matthew J.; Keane, Martin

    2004-01-01

    Genetic programming is a systematic method for getting computers to automatically solve a problem. Genetic programming starts from a high-level statement of what needs to be done and automatically creates a computer program to solve the problem. The paper demonstrates that genetic programming (1) now routinely delivers high-return human-competitive machine intelligence; (2) is an automated invention machine; (3) can automatically create a general solution to a problem in the form of a parameterized topology; and (4) has delivered a progression of qualitatively more substantial results in synchrony with five approximately order-of-magnitude increases in the expenditure of computer time. Recent results involving the automatic synthesis of the topology and sizing of analog electrical circuits and controllers demonstrate these points.
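
    The core loop that "starts from a high-level statement of what needs to be done and automatically creates a computer program" can be illustrated with a toy symbolic-regression run: programs are expression trees, the high-level statement is a fitness function (here, squared error against a target curve), and selection plus mutation evolve better programs. This is a minimal sketch of tree-based genetic programming, not the authors' system; the operator set, target function, and parameters are arbitrary.

```python
# Toy tree-based genetic programming for symbolic regression: evolve an
# arithmetic expression tree that approximates the target x**2 + x + 1.
import operator, random

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMINALS = ["x", 1.0, 2.0]

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    xs = [i / 10.0 for i in range(-20, 21)]
    return sum((evaluate(tree, x) - (x * x + x + 1)) ** 2 for x in xs)

def mutate(tree):
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(2)                      # replace subtree with a random one
    op, left, right = tree
    return (op, mutate(left), right) if random.random() < 0.5 else (op, left, mutate(right))

population = [random_tree() for _ in range(200)]
for generation in range(40):
    population.sort(key=fitness)
    survivors = population[:100]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(100)]

best = min(population, key=fitness)
print("best program:", best, "error:", round(fitness(best), 3))
```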

  12. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly...) plan to both rename and modify the National Customs Automation Program (NCAP) test concerning the... data elements required to obtain release for cargo transported by air. The test will now be known as...

  13. Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning.

    Science.gov (United States)

    Hong, Weizhe; Kennedy, Ann; Burgos-Artizzu, Xavier P; Zelikowsky, Moriel; Navonne, Santiago G; Perona, Pietro; Anderson, David J

    2015-09-22

    A lack of automated, quantitative, and accurate assessment of social behaviors in mammalian animal models has limited progress toward understanding mechanisms underlying social interactions and their disorders such as autism. Here we present a new integrated hardware and software system that combines video tracking, depth sensing, and machine learning for automatic detection and quantification of social behaviors involving close and dynamic interactions between two mice of different coat colors in their home cage. We designed a hardware setup that integrates traditional video cameras with a depth camera, developed computer vision tools to extract the body "pose" of individual animals in a social context, and used a supervised learning algorithm to classify several well-described social behaviors. We validated the robustness of the automated classifiers in various experimental settings and used them to examine how genetic background, such as that of Black and Tan Brachyury (BTBR) mice (a previously reported autism model), influences social behavior. Our integrated approach allows for rapid, automated measurement of social behaviors across diverse experimental designs and also affords the ability to develop new, objective behavioral metrics.
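
    The supervised-learning step, mapping pose-derived features of the two animals onto a behavior label for each video frame, can be sketched with an off-the-shelf classifier. The features, labels, and synthetic data below are illustrative assumptions and do not reproduce the authors' feature set or classifier.

```python
# Sketch of frame-wise behavior classification from pose-derived features
# (e.g. inter-animal distance, relative speed, orientation angle) using a
# random forest. Synthetic data stand in for tracked-pose measurements.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_frames = 2000

# Columns: [inter-animal distance, relative speed, orientation angle].
X = rng.normal(size=(n_frames, 3))
# Toy labels loosely tied to the distance feature, for illustration only.
y = np.where(X[:, 0] < -0.5, "close_investigation",
             np.where(X[:, 0] > 1.0, "other", "attack"))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```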

  14. Fluorescence In Situ Hybridization (FISH Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Full Text Available Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, comprising stacks of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between the normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently visually detected by an observer. The Kappa coefficients for agreement between CAD and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in four testing samples. The study demonstrated the feasibility of automated FISH signal analysis by applying a CAD scheme to automatically generated 2-D projection images.
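
    Two of the steps named above, collapsing the multi-slice data into a 2-D projection image and identifying FISH-probed signals with a top-hat transform, can be sketched with scikit-image. The projection below is a maximum-intensity projection (one common choice; the paper's exact mapping is not reproduced), and the synthetic stack, structuring-element radius, and threshold are illustrative assumptions.

```python
# Sketch of FISH signal detection: max-project a 3-D stack to a 2-D image,
# apply a white top-hat transform to suppress background, threshold, and count
# connected components as candidate FISH signals.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label
from skimage.morphology import disk, white_tophat

rng = np.random.default_rng(2)

# Synthetic 3-D stack: dim background plus a few bright FISH-like spots.
stack = rng.normal(0.1, 0.02, size=(12, 128, 128))
for z, r, c in [(3, 30, 40), (7, 90, 100), (9, 60, 70)]:
    stack[z, r - 2:r + 2, c - 2:c + 2] += 0.8

projection = stack.max(axis=0)                 # 2-D maximum-intensity projection
residue = white_tophat(projection, disk(5))    # remove slowly varying background
binary = residue > threshold_otsu(residue)     # global threshold on the residue
n_signals = label(binary).max()                # connected components = signal spots
print("detected FISH signals:", n_signals)
```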

  15. View-Invariant Gait Recognition Through Genetic Template Segmentation

    Science.gov (United States)

    Isaac, Ebenezer R. H. P.; Elias, Susan; Rajagopalan, Srinivasan; Easwarakumar, K. S.

    2017-08-01

    The template-based, model-free approach provides by far the most successful solution to the gait recognition problem in the literature. Recent work discusses how isolating the head and leg portions of the template increases the performance of a gait recognition system, making it robust against covariates such as clothing and carrying conditions. However, most approaches involve a manual definition of the boundaries. The method we propose, genetic template segmentation (GTS), employs the genetic algorithm to automate the boundary selection process. This method was tested on the GEI, GEnI and AEI templates. GEI seems to exhibit the best result when segmented with our approach. Experimental results show that our approach significantly outperforms existing implementations of view-invariant gait recognition.

  16. Automated High-Throughput Genotyping for Study of Global Epidemiology of Mycobacterium tuberculosis Based on Mycobacterial Interspersed Repetitive Units

    Science.gov (United States)

    Supply, Philip; Lesjean, Sarah; Savine, Evgueni; Kremer, Kristin; van Soolingen, Dick; Locht, Camille

    2001-01-01

    Large-scale genotyping of Mycobacterium tuberculosis is especially challenging, as the current typing methods are labor-intensive and the results are difficult to compare among laboratories. Here, automated typing based on variable-number tandem repeats (VNTRs) of genetic elements named mycobacterial interspersed repetitive units (MIRUs) in 12 mammalian minisatellite-like loci of M. tuberculosis is presented. This system combines analysis of multiplex PCRs on a fluorescence-based DNA analyzer with computerized automation of the genotyping. Analysis of a blinded reference set of 90 strains from 38 countries (K. Kremer et al., J. Clin. Microbiol. 37:2607–2618, 1999) demonstrated that it is 100% reproducible, sensitive, and specific for M. tuberculosis complex isolates, a performance that has not been achieved by any other typing method tested in the same conditions. MIRU-VNTRs can be used for analysis of the global genetic diversity of M. tuberculosis complex strains at different levels of evolutionary divergence. To fully exploit the portability of this typing system, a website was set up for the analysis of M. tuberculosis MIRU-VNTR genotypes via the Internet. This opens the way for global epidemiological surveillance of tuberculosis and should lead to novel insights into the evolutionary and population genetics of this major pathogen. PMID:11574573

  17. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These measures express how much the automation supports human operators, but they cannot express whether the human operators' workload is increased or decreased. Before automation rates are considered, whether the adopted automation is beneficial or detrimental should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task loads index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system can affect the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, which means there is no change in the human operators' task load

  18. A next-generation dual-recombinase system for time and host specific targeting of pancreatic cancer

    Science.gov (United States)

    Schachtler, Christina; Zukowska, Magdalena; Eser, Stefan; Feyerabend, Thorsten B.; Paul, Mariel C.; Eser, Philipp; Klein, Sabine; Lowy, Andrew M.; Banerjee, Ruby; Yang, Fangtang; Lee, Chang-Lung; Moding, Everett J.; Kirsch, David G.; Scheideler, Angelika; Alessi, Dario R.; Varela, Ignacio; Bradley, Allan; Kind, Alexander; Schnieke, Angelika E.; Rodewald, Hans-Reimer; Rad, Roland; Schmid, Roland M.; Schneider, Günter; Saur, Dieter

    2014-01-01

    Genetically engineered mouse models (GEMMs) have dramatically improved our understanding of tumor evolution and therapeutic resistance. However, sequential genetic manipulation of gene expression and targeting of the host is almost impossible using conventional Cre-loxP–based models. We have developed an inducible dual-recombinase system by combining flippase-FRT (Flp-FRT) and Cre-loxP recombination technologies to improve GEMMs of pancreatic cancer. This enables investigation of multistep carcinogenesis, genetic manipulation of tumor subpopulations (such as cancer stem cells), selective targeting of the tumor microenvironment and genetic validation of therapeutic targets in autochthonous tumors on a genome-wide scale. As a proof of concept, we performed tumor cell–autonomous and nonautonomous targeting, recapitulated hallmarks of human multistep carcinogenesis, validated genetic therapy by 3-phosphoinositide-dependent protein kinase inactivation as well as cancer cell depletion and show that mast cells in the tumor microenvironment, which had been thought to be key oncogenic players, are dispensable for tumor formation. PMID:25326799

  19. A next-generation dual-recombinase system for time- and host-specific targeting of pancreatic cancer.

    Science.gov (United States)

    Schönhuber, Nina; Seidler, Barbara; Schuck, Kathleen; Veltkamp, Christian; Schachtler, Christina; Zukowska, Magdalena; Eser, Stefan; Feyerabend, Thorsten B; Paul, Mariel C; Eser, Philipp; Klein, Sabine; Lowy, Andrew M; Banerjee, Ruby; Yang, Fangtang; Lee, Chang-Lung; Moding, Everett J; Kirsch, David G; Scheideler, Angelika; Alessi, Dario R; Varela, Ignacio; Bradley, Allan; Kind, Alexander; Schnieke, Angelika E; Rodewald, Hans-Reimer; Rad, Roland; Schmid, Roland M; Schneider, Günter; Saur, Dieter

    2014-11-01

    Genetically engineered mouse models (GEMMs) have dramatically improved our understanding of tumor evolution and therapeutic resistance. However, sequential genetic manipulation of gene expression and targeting of the host is almost impossible using conventional Cre-loxP-based models. We have developed an inducible dual-recombinase system by combining flippase-FRT (Flp-FRT) and Cre-loxP recombination technologies to improve GEMMs of pancreatic cancer. This enables investigation of multistep carcinogenesis, genetic manipulation of tumor subpopulations (such as cancer stem cells), selective targeting of the tumor microenvironment and genetic validation of therapeutic targets in autochthonous tumors on a genome-wide scale. As a proof of concept, we performed tumor cell-autonomous and nonautonomous targeting, recapitulated hallmarks of human multistep carcinogenesis, validated genetic therapy by 3-phosphoinositide-dependent protein kinase inactivation as well as cancer cell depletion and show that mast cells in the tumor microenvironment, which had been thought to be key oncogenic players, are dispensable for tumor formation.

  20. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    Science.gov (United States)

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving with automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  1. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design, and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation for this is that the operators monitored the automated procedure execution conscientiously, something which may have prevented the OOTL problems from having negative effects on situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but said that automation with breaks could be suitable for some tasks. The main reason why the operators did not like the full automation was that they did not feel in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Other factors such as teamwork and operator tendencies were also of importance. Several design implications were drawn.

  2. Design and rationale for examining neuroimaging genetics in ischemic stroke: The MRI-GENIE study.

    Science.gov (United States)

    Giese, Anne-Katrin; Schirmer, Markus D; Donahue, Kathleen L; Cloonan, Lisa; Irie, Robert; Winzeck, Stefan; Bouts, Mark J R J; McIntosh, Elissa C; Mocking, Steven J; Dalca, Adrian V; Sridharan, Ramesh; Xu, Huichun; Frid, Petrea; Giralt-Steinhauer, Eva; Holmegaard, Lukas; Roquer, Jaume; Wasselius, Johan; Cole, John W; McArdle, Patrick F; Broderick, Joseph P; Jimenez-Conde, Jordi; Jern, Christina; Kissela, Brett M; Kleindorfer, Dawn O; Lemmens, Robin; Lindgren, Arne; Meschia, James F; Rundek, Tatjana; Sacco, Ralph L; Schmidt, Reinhold; Sharma, Pankaj; Slowik, Agnieszka; Thijs, Vincent; Woo, Daniel; Worrall, Bradford B; Kittner, Steven J; Mitchell, Braxton D; Rosand, Jonathan; Golland, Polina; Wu, Ona; Rost, Natalia S

    2017-10-01

    To describe the design and rationale for the genetic analysis of acute and chronic cerebrovascular neuroimaging phenotypes detected on clinical MRI in patients with acute ischemic stroke (AIS) within the scope of the MRI-GENetics Interface Exploration (MRI-GENIE) study. MRI-GENIE capitalizes on the existing infrastructure of the Stroke Genetics Network (SiGN). In total, 12 international SiGN sites contributed MRIs of 3,301 patients with AIS. Detailed clinical phenotyping with the web-based Causative Classification of Stroke (CCS) system and genome-wide genotyping data were available for all participants. Neuroimaging analyses include the manual and automated assessments of established MRI markers. A high-throughput MRI analysis pipeline for the automated assessment of cerebrovascular lesions on clinical scans will be developed in a subset of scans for both acute and chronic lesions, validated against gold standard, and applied to all available scans. The extracted neuroimaging phenotypes will improve characterization of acute and chronic cerebrovascular lesions in ischemic stroke, including CCS subtypes, and their effect on functional outcomes after stroke. Moreover, genetic testing will uncover variants associated with acute and chronic MRI manifestations of cerebrovascular disease. The MRI-GENIE study aims to develop, validate, and distribute the MRI analysis platform for scans acquired as part of clinical care for patients with AIS, which will lead to (1) novel genetic discoveries in ischemic stroke, (2) strategies for personalized stroke risk assessment, and (3) personalized stroke outcome assessment.

  3. Application of genetic algorithm for extraction of the parameters from powder EPR spectra

    International Nuclear Information System (INIS)

    Spalek, T.; Pietrzyk, P.; Sojka, Z.

    2005-01-01

    The application of a stochastic genetic algorithm in tandem with the deterministic Powell method to the automated extraction of magnetic parameters from powder EPR spectra is described. The efficiency and robustness of such a hybrid approach were investigated as a function of the uncertainty range of the parameters, using simulated data sets. The discussed results demonstrate the superior performance of the hybrid genetic algorithm in fitting complex spectra in comparison to the common Monte Carlo method combined with Powell refinement. (author)
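
    The hybrid strategy, a genetic algorithm for the global search over the parameter uncertainty ranges followed by a deterministic Powell refinement of the best candidate, can be sketched as follows. The "spectrum" is a toy two-Gaussian model rather than a real powder EPR simulation, and the population size, mutation width, and bounds are illustrative assumptions; only the use of scipy.optimize.minimize with method="Powell" reflects an actual library call.

```python
# Sketch of a hybrid fit: GA global search, then deterministic Powell refinement.
import numpy as np
from scipy.optimize import minimize

x = np.linspace(0.0, 10.0, 400)

def model(params):
    p1, w1, p2, w2 = params
    return np.exp(-((x - p1) / w1) ** 2) + 0.6 * np.exp(-((x - p2) / w2) ** 2)

target = model([3.0, 0.5, 7.0, 0.8])            # stand-in "experimental" spectrum

def rmsd(params):
    return np.sqrt(np.mean((model(params) - target) ** 2))

rng = np.random.default_rng(3)
bounds = np.array([[0, 10], [0.1, 2], [0, 10], [0.1, 2]], dtype=float)

# Genetic algorithm: global search over the uncertainty ranges.
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 4))
for _ in range(100):
    order = np.argsort([rmsd(p) for p in pop])
    parents = pop[order[:30]]
    mates = parents[rng.integers(0, 30, size=30)]
    children = 0.5 * (parents + mates)                  # blend crossover
    children += rng.normal(0.0, 0.1, children.shape)    # Gaussian mutation
    children = np.clip(children, bounds[:, 0], bounds[:, 1])
    pop = np.vstack([parents, children])

best = min(pop, key=rmsd)

# Deterministic refinement of the GA's best individual with Powell's method.
result = minimize(rmsd, best, method="Powell")
print("GA best RMSD:", round(rmsd(best), 5), "-> after Powell:", round(result.fun, 5))
```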

  4. Thermally Stable Ni-rich Austenite Formed Utilizing Multistep Intercritical Heat Treatment in a Low-Carbon 10 Wt Pct Ni Martensitic Steel

    Science.gov (United States)

    Jain, Divya; Isheim, Dieter; Zhang, Xian J.; Ghosh, Gautam; Seidman, David N.

    2017-08-01

    Austenite reversion and its thermal stability attained during the transformation is key to enhanced toughness and blast resistance in transformation-induced-plasticity martensitic steels. We demonstrate that the thermal stability of Ni-stabilized austenite and the kinetics of the transformation can be controlled by forming Ni-rich regions in proximity to pre-existing (retained) austenite. Atom probe tomography (APT) in conjunction with thermodynamic and kinetic modeling elucidates the role of Ni-rich regions in enhancing the growth kinetics of thermally stable austenite, formed utilizing a multistep intercritical (Quench-Lamellarization-Tempering (QLT)-type) heat treatment for a low-carbon 10 wt pct Ni steel. Direct evidence of austenite formation is provided by dilatometry, and the volume fraction is quantified by synchrotron X-ray diffraction. The results indicate the growth of nm-thick austenite layers during the second intercritical tempering treatment (T-step) at 863 K (590 °C), with austenite retained from the first intercritical treatment (L-step) at 923 K (650 °C) acting as a nucleation template. For the first time, the thermal stability of austenite is quantified with respect to its compositional evolution during the multistep intercritical treatment of these steels. Austenite compositions measured by APT are used in combination with the thermodynamic and kinetic approach formulated by Ghosh and Olson to assess thermal stability and predict the martensite-start temperature. This approach is particularly useful as empirical relations cannot be extrapolated for the highly Ni-enriched austenite investigated in the present study.

  5. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These measures express how much the automation supports human operators, but they cannot express whether the human operators' workload is increased or decreased. Before automation rates are considered, whether the adopted automation is beneficial or detrimental should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task loads index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system can affect the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, which means there is no change in the human operators' task load.

  6. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australian antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum being dispensed by an automated sampler into test tubes and then incubated under controlled time and temperature; the first counting being omitted; labelled antibody being dispensed to the serum after washing; samples being incubated and then centrifuged; radioactivities in the precipitate being counted by an auto-well counter; measurements being tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  7. Rapid determination and chemical change tracking of benzoyl peroxide in wheat flour by multi-step IR macro-fingerprinting

    Science.gov (United States)

    Guo, Xiao-Xi; Hu, Wei; Liu, Yuan; Sun, Su-Qin; Gu, Dong-Chen; He, Helen; Xu, Chang-Hua; Wang, Xi-Chang

    2016-02-01

    BPO is often added to wheat flour as a flour improver, but its excessive use and its edibility are of increasing concern. A multi-step IR macro-fingerprinting approach was employed to identify BPO in wheat flour and unveil its changes during storage. By tracking the changes of BPO in wheat flour based on 2DCOS-IR and spectral subtraction analysis, it was found that BPO in wheat flour not only decomposed into benzoic acid and benzoate, but also produced other deleterious substances, e.g., benzene. This study offers a promising, time-saving method with minimal pretreatment to identify BPO in wheat flour and its chemical products during storage in a holistic manner.

  8. Multi-step direct reactions at low energies

    International Nuclear Information System (INIS)

    Marcinkowski, A.; Marianski, B.

    2001-01-01

    Full text: The theory of the multistep direct (MSD) reactions of Feshbach, Kerman and Koonin has for quite some time been a subject of controversy due to the biorthogonal distorted waves involved in the transition amplitudes describing the MSD cross sections. The biorthogonal wave functions result in non-normal DWBA matrix elements, which can be expressed in terms of normal DWBA matrix elements multiplied by the inverse elastic scattering S-matrix. It has been argued that the enhancing inverse S-factors are washed out by averaging over energy in the continuum. As a result, normal DWBA matrix elements are commonly used in practical calculations. Almost all analyses of inelastic scattering and charge-exchange reactions using the DWBA matrix elements have concluded that nucleon emission at low energies can be described mainly as a one-step reaction. On the other hand, it has been shown that the limits imposed by the energy-weighted sum rules (EWSRs) on transitions of given angular momentum transfer lead to a significant reduction of the one-step cross section that can be compensated by the enhanced MSD cross sections obtained with the use of the non-normal DWBA matrix elements. Very recently the MSD theory of FKK was modified to include collective excitations and the non-normal DWBA matrix elements, and a prescription for calculating the cross sections for the MSD reactions was given. In the present paper we present the results of the modified theory used for describing the 93Nb(n,xn)93Nb reaction at an incident energy of 20 MeV and the 65Cu(p,xn)65Zn reaction at 27 MeV. The results show enhanced contributions from two-, three- and four-step reactions. We investigate the importance of the multi-phonon, multi-particle-hole and the mixed particle-hole-phonon excitations in neutron scattering to the continuum. We also show the importance of the different sequences of collisions of the leading continuum nucleon that contribute to the MSD (p,n) reaction. When all

  9. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... DEPARTMENT OF HOMELAND SECURITY, U.S. Customs and Border Protection: National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of the test concerning the simplified entry functionality in the...

  10. Application of genetic algorithm with genetic modification and quality map in production strategy optimization; Aplicacao de algoritmo genetico com modificacao genetica e mapa de qualidade na otimizacao de estrategia de producao

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, Lincoln; Maschio, Celio; Schiozer, Denis J. [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Mecanica. Dept. de Engenharia de Petroleo

    2008-07-01

    The definition of the position and number of wells is the most important stage in production strategy selection, since it affects the reservoir behavior, which in turn influences future decisions. However, this process is time-consuming and often relies on a trial-and-error approach. Many studies have sought to reduce the engineer's effort at this stage, either by minimizing the number of simulation runs through proxy models or by automating the whole process using an optimization algorithm. This work proposes a methodology that integrates a genetic algorithm and a quality map to automate production strategy optimization. The concept of genetic modification is also introduced, namely the procedure of updating the quality map according to the well production of each evaluated strategy. The objective is to improve the evolutionary process, allowing the evaluation of more promising alternatives and improving the chance of obtaining better solutions without a substantial increase in the number of simulations. (author)

  11. Multi-step resistive switching behavior of Li-doped ZnO resistance random access memory device controlled by compliance current

    International Nuclear Information System (INIS)

    Lin, Chun-Cheng; Tang, Jian-Fu; Su, Hsiu-Hsien; Hong, Cheng-Shong; Huang, Chih-Yu; Chu, Sheng-Yuan

    2016-01-01

    The multi-step resistive switching (RS) behavior of a unipolar Pt/Li0.06Zn0.94O/Pt resistive random access memory (RRAM) device is investigated. It is found that the RRAM device exhibits normal, 2-, 3-, and 4-step RESET behaviors under different compliance currents. The transport mechanism within the device is investigated by means of current-voltage curves, in-situ transmission electron microscopy, and electrochemical impedance spectroscopy. It is shown that the ion transport mechanism is dominated by Ohmic behavior under low electric fields and by the Poole-Frenkel emission effect (normal RS behavior) or Li+ ion diffusion (2-, 3-, and 4-step RESET behaviors) under high electric fields.

  12. Semi-automated quantitative Drosophila wings measurements.

    Science.gov (United States)

    Loh, Sheng Yang Michael; Ogawa, Yoshitaka; Kawana, Sara; Tamura, Koichiro; Lee, Hwee Kuan

    2017-06-28

    Drosophila melanogaster is an important organism used in many fields of biological research such as genetics and developmental biology. Drosophila wings have been widely used to study the genetics of development, morphometrics and evolution. There is therefore much interest in quantifying wing structures of Drosophila. Advances in technology have increased the ease with which images of Drosophila can be acquired. However, such studies have been limited by the slow and tedious process of acquiring phenotypic data. We have developed a system that automatically detects and measures key points and vein segments on a Drosophila wing. Key points are detected by performing image transformations and template matching on Drosophila wing images, while vein segments are detected using an Active Contour algorithm. The accuracy of our key point detection was compared against key point annotations provided by users. We also performed key point detection using different training data sets of Drosophila wing images. We compared our software with an existing automated image analysis system for Drosophila wings and showed that our system performs better than the state of the art. Vein segments were manually measured and compared against the measurements obtained from our system. Our system was able to detect specific key points and vein segments from Drosophila wing images with high accuracy.
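
    The key-point stage described here can be approximated with standard template matching. The snippet below is a generic OpenCV sketch, not the authors' pipeline: a synthetic "wing" image and a landmark template (here a crop around a simulated vein crossing) are assumptions purely so the example is self-contained.

```python
import cv2
import numpy as np

# Synthetic stand-in for a grayscale wing image: dark background with two bright
# "veins" crossing at (x=250, y=180). A real pipeline would load scanned wings.
wing = np.full((400, 600), 30, dtype=np.uint8)
cv2.line(wing, (180, 100), (320, 260), 200, 3)
cv2.line(wing, (180, 260), (320, 100), 200, 3)

# Template = small patch cut around the landmark of a reference wing.
template = wing[180 - 20:180 + 20, 250 - 20:250 + 20].copy()

# Normalised cross-correlation is robust to moderate brightness differences.
response = cv2.matchTemplate(wing, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(response)

h, w = template.shape
key_point = (max_loc[0] + w // 2, max_loc[1] + h // 2)   # centre of best match

if max_val > 0.6:                                        # assumed acceptance threshold
    print(f"key point detected at {key_point} (score {max_val:.2f})")
else:
    print("no confident match; flag this wing for manual annotation")
```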

  13. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although total laboratory automation has not yet been implemented, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented, and some of the reasons for the success or failure of the various examples cited are explored. Finally, some comments on future automation needs are offered.

  15. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should therefore be considered together when determining the appropriate level of automation to introduce. In this paper we suggest an estimation method that weighs both effects at once. Earlier concepts are limited in that they do not consider the effects of automation on human operators; thus, a new estimation method for the automation rate is suggested to overcome this problem

  16. 78 FR 44142 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-07-23

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... (CBP's) plan to modify the National Customs Automation Program (NCAP) tests concerning document imaging... entry process by reducing the number of data elements required to obtain release for cargo transported...

  17. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
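
    The unsupervised activity-discovery variant mentioned above, DBSCAN over a DTW distance, can be sketched compactly. The toy sensor traces, window length, eps and min_samples below are illustrative assumptions, not the parameters or data used for CARL.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def dtw(a, b):
    """Classic O(len(a) * len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Toy daily activity traces (e.g. hourly motion-sensor counts); assumed data.
series = [
    np.array([0, 0, 1, 5, 6, 5, 1, 0]),
    np.array([0, 1, 1, 6, 6, 4, 1, 0]),
    np.array([3, 4, 4, 0, 0, 1, 4, 3]),
    np.array([4, 4, 3, 0, 1, 1, 4, 4]),
]

# Precompute the pairwise DTW distance matrix and cluster on it.
n = len(series)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = dtw(series[i], series[j])

labels = DBSCAN(eps=4.0, metric="precomputed", min_samples=2).fit_predict(dist)
print("discovered activity clusters:", labels)
```

    Because the metric is passed as a precomputed matrix, any sequence distance (DTW here) can be swapped in without changing the clustering code.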

  18. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  19. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first traces the evolution of automation architectures and their associated environments over the past few decades, and then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technology in the development of automation components. We have attempted to adhere to open standards and technology for the development of automation components at the various layers. The paper also highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)

  20. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects

    Directory of Open Access Journals (Sweden)

    Zhenyu Shi

    2016-12-01

    Full Text Available Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com. Keywords: BioCAD, Genetic engineering software, Molecular cloning software, Synthetic biology, Workflow simulation and management

  1. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    Full Text Available We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
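
    Grammar-based test generation of the kind TAO performs can be illustrated with a tiny random sentence generator over a context-free grammar. The grammar below (a toy query-string language for a web form) and the expansion strategy are illustrative assumptions, not TAO's actual engine.

```python
import random

# Toy context-free grammar for generating web-form test inputs.
# Nonterminals are the upper-case keys; everything else is a terminal.
GRAMMAR = {
    "QUERY": [["FIELD", "=", "VALUE"], ["FIELD", "=", "VALUE", "&", "QUERY"]],
    "FIELD": [["zip"], ["date"], ["duration"]],
    "VALUE": [["DIGIT"], ["DIGIT", "DIGIT"]],
    "DIGIT": [[c] for c in "0123456789"],
}

def generate(symbol="QUERY", depth=0, max_depth=8):
    """Randomly expand a nonterminal; bias toward the shortest rule when deep."""
    if symbol not in GRAMMAR:
        return symbol
    rules = GRAMMAR[symbol]
    rule = min(rules, key=len) if depth >= max_depth else random.choice(rules)
    return "".join(generate(s, depth + 1, max_depth) for s in rule)

# Produce a handful of test inputs; in a full system each would be fed to a
# Selenium script and its result checked by a semantics-based oracle.
for _ in range(5):
    print(generate())
```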

  2. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  3. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  4. Suresh C. Jhanwar

    Indian Academy of Sciences (India)

    Articles written in Journal of Genetics. Volume 79, Issue 3, 2000, pp. 113-123. Characterization of newly established colorectal cancer cell lines: correlation between cytogenetic abnormalities and allelic deletions associated with multistep tumorigenesis · Hans Gerdes ...

  5. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  6. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  7. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  8. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Science.gov (United States)

    2011-06-13

    ... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In... Customs Automation Program (NCAP) test relating to highway movements of commercial goods that are transported in-bond through the United States from one point in Canada to another point in Canada. The NCAP...

  9. Automated cloning methods.

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones using a robotic system designed for procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 °C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  10. Exact free vibration of multi-step Timoshenko beam system with several attachments

    Science.gov (United States)

    Farghaly, S. H.; El-Sayed, T. A.

    2016-05-01

    This paper deals with the analysis of the natural frequencies and mode shapes of an axially loaded multi-step Timoshenko beam combined system carrying several attachments. The influence of the system design and of the proposed sub-system non-dimensional parameters on the combined system characteristics is the major part of this investigation. The effects of material properties, rotary inertia and shear deformation of the beam system for each span are included. The end masses are elastically supported against rotation and translation at an offset point from the point of attachment. A sub-system having two degrees of freedom is located at the beam ends and at any of the intermediate stations and acts as a support and/or a suspension. The boundary conditions of the ordinary differential equation governing the lateral deflections and slope due to bending of the beam system, including the shear force term due to the sub-system, have been formulated. Exact global coefficient matrices for the combined modal frequencies, the modal shapes and the discrete sub-system have been derived. Based on these formulae, detailed parametric studies of the combined system are carried out. The applied mathematical model is valid for a wide range of applications, especially in the mechanical, naval and structural engineering fields.

  11. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade-induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005-2010. In particular, we establish a causal effect whereby firms that have specialized in product types for which Chinese exports to the world market have risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in the scale and scope of automation have faster productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation.

  12. Automated processing pipeline for neonatal diffusion MRI in the developing Human Connectome Project.

    Science.gov (United States)

    Bastiani, Matteo; Andersson, Jesper L R; Cordero-Grande, Lucilio; Murgasova, Maria; Hutter, Jana; Price, Anthony N; Makropoulos, Antonios; Fitzgibbon, Sean P; Hughes, Emer; Rueckert, Daniel; Victor, Suresh; Rutherford, Mary; Edwards, A David; Smith, Stephen M; Tournier, Jacques-Donald; Hajnal, Joseph V; Jbabdi, Saad; Sotiropoulos, Stamatios N

    2018-05-28

    The developing Human Connectome Project is set to create and make available to the scientific community a 4-dimensional map of functional and structural cerebral connectivity from 20 to 44 weeks post-menstrual age, to allow exploration of the genetic and environmental influences on brain development, and the relation between connectivity and neurocognitive function. A large set of multi-modal MRI data from fetuses and newborn infants is currently being acquired, along with genetic, clinical and developmental information. In this overview, we describe the neonatal diffusion MRI (dMRI) image processing pipeline and the structural connectivity aspect of the project. Neonatal dMRI data poses specific challenges, and standard analysis techniques used for adult data are not directly applicable. We have developed a processing pipeline that deals directly with neonatal-specific issues, such as severe motion and motion-related artefacts, small brain sizes, high brain water content and reduced anisotropy. This pipeline allows automated analysis of in-vivo dMRI data, probes tissue microstructure, reconstructs a number of major white matter tracts, and includes an automated quality control framework that identifies processing issues or inconsistencies. We here describe the pipeline and present an exemplar analysis of data from 140 infants imaged at 38-44 weeks post-menstrual age. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  14. Synthesis of Well-Defined Copper "N"-Heterocyclic Carbene Complexes and Their Use as Catalysts for a "Click Reaction": A Multistep Experiment that Emphasizes the Role of Catalysis in Green Chemistry

    Science.gov (United States)

    Ison, Elon A.; Ison, Ana

    2012-01-01

    A multistep experiment for an advanced synthesis lab course that incorporates topics in organic-inorganic synthesis and catalysis and highlights green chemistry principles was developed. Students synthesized two "N"-heterocyclic carbene ligands, used them to prepare two well-defined copper(I) complexes and subsequently utilized the complexes as…

  15. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  16. Mechanical properties of molybdenum-titanium alloys micro-structurally controlled by multi-step internal nitriding

    International Nuclear Information System (INIS)

    Nagae, M.; Yoshio, T.; Takemoto, Y.; Takada, J.; Hiraoka, Y.

    2001-01-01

    Internally nitrided dilute Mo-Ti alloys having a heavily deformed microstructure near the specimen surface were prepared by a novel two-step nitriding process at 1173 to 1773 K in N2 gas. For the nitrided specimens, three-point bend tests were performed at temperatures from 77 to 298 K in order to investigate the effect of microstructure control by internal nitriding on the ductile-to-brittle transition temperature (DBTT) of the alloy. The yield strength at 243 K of the specimen that retained the deformed microstructure after the two-step nitriding was about 1.7 times that of the recrystallized specimen. The specimen subjected to the two-step nitriding was bent more than 90 degrees at 243 K, whereas the recrystallized specimen fractured after showing only slight ductility at 243 K. The DBTT of the two-step-nitrided specimen and of the recrystallized specimen was about 153 K and 203 K, respectively. These results indicate that multi-step internal nitriding is very effective in mitigating the embrittlement caused by the recrystallization of molybdenum alloys. (author)

  17. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as an improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements

  18. A Software Module for High-Accuracy Calibration of Rings and Cylinders on CMM using Multi-Orientation Techniques (Multi-Step and Reversal methods)

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free-form measurements, in collaboration with DIMEG, University of Padova, Italy. The present report describes a software module, ROUNDCAL, to be used for high-accuracy calibration of rings and cylinders. The purpose of the software is to calculate the form error and the least-squares circle of rings and cylinders by averaging the pointwise measuring results obtained from so-called multi-orientation techniques (both reversal and multi-step methods) in order to eliminate systematic errors of the CMM.
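
    The least-squares circle computation at the heart of such a module can be written compactly. The sketch below is not ROUNDCAL itself: it fits a circle to probe points (assumed here to be already averaged over the multi-step / reversal orientations) with the algebraic Kasa least-squares method and reports the form error as the radial peak-to-valley deviation.

```python
import numpy as np

def least_squares_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit: returns centre (a, b) and radius r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

# Synthetic probe points: a nominal 40 mm ring with a small three-lobe form
# error, measured at 72 angular positions (assumed data).
theta = np.linspace(0, 2 * np.pi, 72, endpoint=False)
radius = 20.0 + 0.002 * np.cos(3 * theta)          # 2 um lobing error
x = radius * np.cos(theta) + 5.0                   # arbitrary centre offset
y = radius * np.sin(theta) - 3.0

cx, cy, r = least_squares_circle(x, y)
radial_dev = np.hypot(x - cx, y - cy) - r
form_error = radial_dev.max() - radial_dev.min()   # peak-to-valley deviation

print(f"centre = ({cx:.4f}, {cy:.4f}) mm, radius = {r:.4f} mm")
print(f"form error (P-V) = {form_error * 1000:.2f} um")
```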

  19. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  20. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    The purpose of the programme activity on human-centred automation at the HRP is to develop knowledge (in the form of models and theories) and tools (in the form of techniques and simulators) to support design of automation that ensures effective human performance and comprehension. This report presents the work done on both the analytical and experimental side of this project. The analytical work has surveyed common definitions of automation and traditional design principles. A general finding is that human-centred automation usually is defined in terms of what it is not. This is partly due to a lack of adequate models and of human-automation interaction. Another result is a clarification of the consequences of automation, in particular with regard to situation awareness and workload. The experimental work has taken place as an explorative experiment in HAMMLAB in collaboration with IPSN (France). The purpose of this experiment was to increase the understanding of how automation influences operator performance in NPP control rooms. Two different types of automation (extensive and limited) were considered in scenarios having two different degrees of complexity (high and low), and involving diagnostic and procedural tasks. Six licensed NPP crews from the NPP at Loviisa, Finland, participated in the experiment. The dependent variables applied were plant performance, operator performance, self-rated crew performance, situation awareness, workload, and operator trust in the automation. The results from the diagnostic scenarios indicated that operators' judgement of crew efficiency was related to their level of trust in the automation, and further that operators trusted automation least and rated crew performance lowest in situations where crew performance was efficient and vice versa. The results from procedural scenarios indicated that extensive automation efficiently supported operators' performance, and further that operator' judgement of crew performance efficiency

  1. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  2. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King's College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for the information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  3. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for the information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  4. Multi-step resistive switching behavior of Li-doped ZnO resistance random access memory device controlled by compliance current

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Chun-Cheng [Department of Electrical Engineering, National Cheng Kung University, Tainan 701, Taiwan (China); Department of Mathematic and Physical Sciences, R.O.C. Air Force Academy, Kaohsiung 820, Taiwan (China); Tang, Jian-Fu; Su, Hsiu-Hsien [Department of Electrical Engineering, National Cheng Kung University, Tainan 701, Taiwan (China); Hong, Cheng-Shong; Huang, Chih-Yu [Department of Electronic Engineering, National Kaohsiung Normal University, Kaohsiung 802, Taiwan (China); Chu, Sheng-Yuan, E-mail: chusy@mail.ncku.edu.tw [Department of Electrical Engineering, National Cheng Kung University, Tainan 701, Taiwan (China); Center for Micro/Nano Science and Technology, National Cheng Kung University, Tainan 701, Taiwan (China)

    2016-06-28

    The multi-step resistive switching (RS) behavior of a unipolar Pt/Li0.06Zn0.94O/Pt resistive random access memory (RRAM) device is investigated. It is found that the RRAM device exhibits normal, 2-, 3-, and 4-step RESET behaviors under different compliance currents. The transport mechanism within the device is investigated by means of current-voltage curves, in-situ transmission electron microscopy, and electrochemical impedance spectroscopy. It is shown that the ion transport mechanism is dominated by Ohmic behavior under low electric fields and the Poole-Frenkel emission effect (normal RS behavior) or Li+ ion diffusion (2-, 3-, and 4-step RESET behaviors) under high electric fields.

  5. The improved syntheses of 5-substituted 2'-[18F]fluoro-2'-deoxy-arabinofuranosyluracil derivatives ([18F]FAU, [18F]FEAU, [18F]FFAU, [18F]FCAU, [18F]FBAU and [18F]FIAU) using a multistep one-pot strategy

    International Nuclear Information System (INIS)

    Cai Hancheng; Li Zibo; Conti, Peter S.

    2011-01-01

    Introduction: We and others have previously reported a four-step radiosynthesis of a series of 2'-deoxy-2'-[18F]fluoro-5-substituted-1-β-D-arabinofuranosyluracil derivatives including [18F]FAU, [18F]FEAU, [18F]FFAU, [18F]FCAU, [18F]FBAU and [18F]FIAU as thymidine derivatives for tumor proliferation and/or reporter gene expression imaging with positron emission tomography (PET). Although the radiosynthesis has proven to be reproducible and efficient, this complicated multistep reaction is difficult to incorporate into an automated cGMP-compliant radiosynthesis module for routine production. Recently, we developed a simple and efficient one-pot method for routine production of [18F]FMAU. In this study, we examined the feasibility of radiosynthesizing [18F]FAU, [18F]FEAU, [18F]FFAU, [18F]FCAU, [18F]FBAU and [18F]FIAU using this newly developed method. Methods: Similar to the radiosynthesis of [18F]FMAU, 5-substituted 2'-[18F]fluoro-2'-deoxy-arabinofuranosyluracil derivatives ([18F]FAU, [18F]FEAU, [18F]FFAU, [18F]FCAU, [18F]FBAU and [18F]FIAU) were synthesized in a one-pot radiosynthesis module in the presence of the Friedel-Crafts catalyst TMSOTf and HMDS. Results: This one-pot radiosynthesis method could be used to produce [18F]FAU, [18F]FEAU, [18F]FFAU, [18F]FCAU, [18F]FBAU and [18F]FIAU. The overall radiochemical yields of these tracers varied from 4.1%±0.8% to 10.1%±1.9% (decay-corrected, n=4). The overall reaction time was reduced from 210 min to 150 min from the end of bombardment, and the radiochemical purity was >99%. Conclusions: The improved radiosyntheses of [18F]FAU, [18F]FEAU, [18F]FFAU, [18F]FCAU, [18F]FBAU and [18F]FIAU have been achieved with reasonable yields and high purity using a multistep one-pot method. The synthetic time has been reduced, and the reaction procedures have been significantly simplified. The success of this approach may make PET tracers [18F]FAU, [18F

  6. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  7. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  8. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  9. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
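
    A Bayesian Belief Network of the kind described can be prototyped with an off-the-shelf library. The structure and probability values below are invented placeholders purely to show the mechanics; they are not the FLAP model's actual nodes or its SME-elicited data.

```python
# Minimal Bayesian Belief Network sketch (hypothetical structure and numbers).
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Two causal factors feeding one automation-error node.
model = BayesianNetwork([("Complacency", "AutomationError"),
                         ("SystemFailure", "AutomationError")])

cpd_comp = TabularCPD("Complacency", 2, [[0.7], [0.3]])       # P(no), P(yes)
cpd_fail = TabularCPD("SystemFailure", 2, [[0.9], [0.1]])
cpd_err = TabularCPD(
    "AutomationError", 2,
    # Columns: (Complacency, SystemFailure) = (0,0), (0,1), (1,0), (1,1)
    [[0.98, 0.70, 0.80, 0.30],   # P(error = no | parents)
     [0.02, 0.30, 0.20, 0.70]],  # P(error = yes | parents)
    evidence=["Complacency", "SystemFailure"], evidence_card=[2, 2])

model.add_cpds(cpd_comp, cpd_fail, cpd_err)
assert model.check_model()

# Query: risk of an automation-related error given observed complacency.
posterior = VariableElimination(model).query(
    variables=["AutomationError"], evidence={"Complacency": 1})
print(posterior)
```

    Inserting a mitigation (a technology or training product) would amount to changing the relevant conditional probability tables and re-running the query to see the relative risk reduction.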

  10. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation, a semi-automatic pipette and a fully automated pipette station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  11. Mobile Genome Express (MGE): A comprehensive automatic genetic analyses pipeline with a mobile device.

    Directory of Open Access Journals (Sweden)

    Jun-Hee Yoon

    Full Text Available The development of next-generation sequencing (NGS technology allows to sequence whole exomes or genome. However, data analysis is still the biggest bottleneck for its wide implementation. Most laboratories still depend on manual procedures for data handling and analyses, which translates into a delay and decreased efficiency in the delivery of NGS results to doctors and patients. Thus, there is high demand for developing an automatic and an easy-to-use NGS data analyses system. We developed comprehensive, automatic genetic analyses controller named Mobile Genome Express (MGE that works in smartphones or other mobile devices. MGE can handle all the steps for genetic analyses, such as: sample information submission, sequencing run quality check from the sequencer, secured data transfer and results review. We sequenced an Actrometrix control DNA containing multiple proven human mutations using a targeted sequencing panel, and the whole analysis was managed by MGE, and its data reviewing program called ELECTRO. All steps were processed automatically except for the final sequencing review procedure with ELECTRO to confirm mutations. The data analysis process was completed within several hours. We confirmed the mutations that we have identified were consistent with our previous results obtained by using multi-step, manual pipelines.

  12. Mobile Genome Express (MGE): A comprehensive automatic genetic analyses pipeline with a mobile device.

    Science.gov (United States)

    Yoon, Jun-Hee; Kim, Thomas W; Mendez, Pedro; Jablons, David M; Kim, Il-Jin

    2017-01-01

    The development of next-generation sequencing (NGS) technology makes it possible to sequence whole exomes or genomes. However, data analysis is still the biggest bottleneck for its wide implementation. Most laboratories still depend on manual procedures for data handling and analyses, which translates into delays and decreased efficiency in the delivery of NGS results to doctors and patients. Thus, there is high demand for an automatic and easy-to-use NGS data analysis system. We developed a comprehensive, automatic genetic analysis controller named Mobile Genome Express (MGE) that works on smartphones or other mobile devices. MGE can handle all the steps of a genetic analysis, such as sample information submission, sequencing run quality check from the sequencer, secured data transfer and results review. We sequenced an Actrometrix control DNA containing multiple proven human mutations using a targeted sequencing panel, and the whole analysis was managed by MGE and its data reviewing program called ELECTRO. All steps were processed automatically except for the final sequencing review procedure with ELECTRO to confirm mutations. The data analysis process was completed within several hours. We confirmed that the mutations we identified were consistent with our previous results obtained using multi-step, manual pipelines.
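
    A pipeline controller of this kind boils down to running an ordered list of steps, gating on quality checks, and pausing at the manual review stage. The sketch below is a generic illustration only; the step names, checks and thresholds are assumptions, not MGE internals.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    name: str
    run: Callable[[dict], bool]      # returns True when the step's QC passes
    manual: bool = False             # manual steps pause the pipeline

# Placeholder step implementations; real ones would call the sequencer API,
# transfer files securely, launch variant calling, and so on.
def submit_sample(ctx):      ctx["sample"] = "S001"; return True
def check_run_quality(ctx):  ctx["q30"] = 0.92;      return ctx["q30"] > 0.8
def transfer_data(ctx):      ctx["bam"] = "S001.bam"; return True
def call_variants(ctx):      ctx["vcf"] = "S001.vcf"; return True
def review_results(ctx):     return True             # done by a human reviewer

PIPELINE = [
    Step("sample submission", submit_sample),
    Step("sequencing run QC", check_run_quality),
    Step("secured data transfer", transfer_data),
    Step("variant calling", call_variants),
    Step("results review", review_results, manual=True),
]

def run_pipeline(steps):
    ctx = {}
    for step in steps:
        if step.manual:
            print(f"pausing: '{step.name}' requires manual review")
            break
        if not step.run(ctx):
            print(f"stopping: QC failed at '{step.name}'")
            break
        print(f"completed: {step.name}")
    return ctx

run_pipeline(PIPELINE)
```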

  13. Genetic Constructor: An Online DNA Design Platform.

    Science.gov (United States)

    Bates, Maxwell; Lachoff, Joe; Meech, Duncan; Zulkower, Valentin; Moisy, Anaïs; Luo, Yisha; Tekotte, Hille; Franziska Scheitz, Cornelia Johanna; Khilari, Rupal; Mazzoldi, Florencio; Chandran, Deepak; Groban, Eli

    2017-12-15

    Genetic Constructor is a cloud Computer Aided Design (CAD) application developed to support synthetic biologists from design intent through DNA fabrication and experiment iteration. The platform allows users to design, manage, and navigate complex DNA constructs and libraries, using a new visual language that focuses on functional parts abstracted from sequence. Features like combinatorial libraries and automated primer design allow the user to separate design from construction by focusing on functional intent, and design constraints aid iterative refinement of designs. A plugin architecture enables contributions from scientists and coders to leverage existing powerful software and connect to DNA foundries. The software is easily accessible and platform agnostic, free for academics, and available in an open-source community edition. Genetic Constructor seeks to democratize DNA design, manufacture, and access to tools and services from the synthetic biology community.

  14. A multistep damage recognition mechanism for global genomic nucleotide excision repair.

    Science.gov (United States)

    Sugasawa, K; Okamoto, T; Shimizu, Y; Masutani, C; Iwai, S; Hanaoka, F

    2001-03-01

    A mammalian nucleotide excision repair (NER) factor, the XPC-HR23B complex, can specifically bind to certain DNA lesions and initiate the cell-free repair reaction. Here we describe a detailed analysis of its binding specificity using various DNA substrates, each containing a single defined lesion. A highly sensitive gel mobility shift assay revealed that XPC-HR23B specifically binds a small bubble structure with or without damaged bases, whereas dual incision takes place only when damage is present in the bubble. This is evidence that damage recognition for NER is accomplished through at least two steps; XPC-HR23B first binds to a site that has a DNA helix distortion, and then the presence of injured bases is verified prior to dual incision. Cyclobutane pyrimidine dimers (CPDs) were hardly recognized by XPC-HR23B, suggesting that additional factors may be required for CPD recognition. Although the presence of mismatched bases opposite a CPD potentiated XPC-HR23B binding, probably due to enhancement of the helix distortion, cell-free excision of such compound lesions was much more efficient than expected from the observed affinity for XPC-HR23B. This also suggests that additional factors and steps are required for the recognition of some types of lesions. A multistep mechanism of this sort may provide a molecular basis for ensuring the high level of damage discrimination that is required for global genomic NER.

  15. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: either an interpretation of the physical model of human interactions, or behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
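
    The binary human-versus-bot classification step can be prototyped on a handful of session-level features. The feature set and the synthetic data below are illustrative assumptions, not the paper's features or datasets.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

def synth_sessions(n, bot):
    """Synthetic per-session features: [queries/min, click rate, night-time fraction]."""
    if bot:
        return np.column_stack([rng.gamma(9.0, 4.0, n),     # very high query rate
                                rng.beta(1.0, 8.0, n),      # rarely clicks results
                                rng.beta(5.0, 2.0, n)])     # mostly active at night
    return np.column_stack([rng.gamma(2.0, 1.5, n),
                            rng.beta(5.0, 3.0, n),
                            rng.beta(2.0, 5.0, n)])

X = np.vstack([synth_sessions(1000, bot=False), synth_sessions(1000, bot=True)])
y = np.array([0] * 1000 + [1] * 1000)                # 0 = human, 1 = automated

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["human", "bot"]))
```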

  16. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  17. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  18. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  19. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  20. Identification of five B-type response regulators as members of a multistep phosphorelay system interacting with histidine-containing phosphotransfer partners of Populus osmosensor

    Directory of Open Access Journals (Sweden)

    Bertheau Lucie

    2012-12-01

    Full Text Available Abstract Background In plants, the multistep phosphorelay signaling pathway mediates responses to environmental factors and plant hormones. This system is composed of three successive partners: hybrid Histidine-aspartate Kinases (HKs), Histidine-containing Phosphotransfer proteins (HPts), and Response Regulators (RRs). Among the third partners, B-type RR family members are the final output elements of the pathway; they act as transcription factors and clearly play a pivotal role in the early response to cytokinin in Arabidopsis. While interaction studies between partners belonging to the multistep phosphorelay system have mainly focused on protagonists involved in cytokinin or ethylene pathways, very few reports are available concerning partners of the osmotic stress signaling pathway. Results In Populus, we identified eight B-type RR proteins, RR12-16, 19, 21 and 22, in the Dorskamp genotype. To assess HPt/B-type RR interactions and consequently determine potential third partners in the osmosensing multistep phosphorelay system, we performed global yeast two-hybrid (Y2H) assays in combination with Bimolecular Fluorescence Complementation (BiFC) assays in plant cells. We found that all B-type RRs are able to interact with the predominant HPt partners (HPt2, 7 and 9) of HK1, which is putatively involved in the osmosensing pathway. However, different profiles of interaction are observed depending on the studied HPt. HPt/RR interactions displayed a nuclear localization, while the nuclear and cytosolic localization of HPt and nuclear localization of RR proteins were validated. Although the nuclear localization of the HPt/RR interaction was expected, this work constitutes the first evidence of such an interaction in plants. Furthermore, the pertinence of this partnership is reinforced by highlighting a co-expression of B-type RR transcripts and the other partners (HK1 and HPts) belonging to a potential osmosensing pathway. Conclusion Based on the interaction studies

  1. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  2. Work Planing Automation at Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

    Work planning automation, installation possibilities and future outlook at the mechanical subdivision. The aim is to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  3. Automated solid-phase subcloning based on beads brought into proximity by magnetic force.

    Science.gov (United States)

    Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescence-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors, and more than 95% correct clones were obtained in a number of applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high-throughput applications.

  4. Evaluation of BRCA1 and BRCA2 mutation prevalence, risk prediction models and a multistep testing approach in French‐Canadian families with high risk of breast and ovarian cancer

    Science.gov (United States)

    Simard, Jacques; Dumont, Martine; Moisan, Anne‐Marie; Gaborieau, Valérie; Vézina, Hélène; Durocher, Francine; Chiquette, Jocelyne; Plante, Marie; Avard, Denise; Bessette, Paul; Brousseau, Claire; Dorval, Michel; Godard, Béatrice; Houde, Louis; Joly, Yann; Lajoie, Marie‐Andrée; Leblanc, Gilles; Lépine, Jean; Lespérance, Bernard; Malouin, Hélène; Parboosingh, Jillian; Pichette, Roxane; Provencher, Louise; Rhéaume, Josée; Sinnett, Daniel; Samson, Carolle; Simard, Jean‐Claude; Tranchant, Martine; Voyer, Patricia; BRCAs, INHERIT; Easton, Douglas; Tavtigian, Sean V; Knoppers, Bartha‐Maria; Laframboise, Rachel; Bridge, Peter; Goldgar, David

    2007-01-01

    Background and objective In clinical settings with fixed resources allocated to predictive genetic testing for high‐risk cancer predisposition genes, optimal strategies for mutation screening programmes are critically important. These depend on the mutation spectrum found in the population under consideration and the frequency of mutations detected as a function of the personal and family history of cancer, which are both affected by the presence of founder mutations and demographic characteristics of the underlying population. The results of multistep genetic testing for mutations in BRCA1 or BRCA2 in a large series of families with breast cancer in the French‐Canadian population of Quebec, Canada are reported. Methods A total of 256 high‐risk families were ascertained from regional familial cancer clinics throughout the province of Quebec. Initially, families were tested for a panel of specific mutations known to occur in this population. Families in which no mutation was identified were then comprehensively tested. Three algorithms to predict the presence of mutations were evaluated, including the prevalence tables provided by Myriad Genetics Laboratories, the Manchester Scoring System and a logistic regression approach based on the data from this study. Results 8 of the 15 distinct mutations found in 62 BRCA1/BRCA2‐positive families had never been previously reported in this population, whereas 82% carried 1 of the 4 mutations currently observed in ⩾2 families. In the subset of 191 families in which at least 1 affected individual was tested, 29% carried a mutation. Of these 27 BRCA1‐positive and 29 BRCA2‐positive families, 48 (86%) were found to harbour a mutation detected by the initial test. Among the remaining 143 inconclusive families, all 8 families found to have a mutation after complete sequencing had Manchester Scores ⩾18. The logistic regression and Manchester Scores provided equal predictive power, and both were significantly better
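
    The logistic-regression approach evaluated in this record can be sketched generically: family-history covariates are combined into a predicted probability that a BRCA1/BRCA2 mutation will be found. The covariates and coefficients below are placeholders for illustration and are not the model fitted in the study.

```python
# Hedged sketch of a logistic-regression mutation-prediction model.
# Covariates and coefficients are hypothetical; the study fitted its own model
# to French-Canadian family data.
import math

def carrier_probability(n_breast_lt50: int, n_ovarian: int, male_breast: bool) -> float:
    """Predicted probability that a family carries a BRCA1/BRCA2 mutation."""
    intercept = -2.5          # illustrative values only
    coeffs = (0.6, 0.9, 1.2)  # per breast case <50, per ovarian case, male breast cancer
    z = (intercept
         + coeffs[0] * n_breast_lt50
         + coeffs[1] * n_ovarian
         + coeffs[2] * (1 if male_breast else 0))
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    print(round(carrier_probability(3, 1, False), 3))
```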

  5. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    Science.gov (United States)

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  6. Multi-AGV path planning with double-path constraints by using an improved genetic algorithm.

    Directory of Open Access Journals (Sweden)

    Zengliang Han

    Full Text Available This paper investigates an improved genetic algorithm for multiple automated guided vehicle (multi-AGV) path planning. The innovations lie in two aspects. First, three-exchange crossover heuristic operators are used in place of the traditional two-exchange crossover heuristic operators to produce better offspring and exploit more information in the improved genetic algorithm. Second, double-path constraints are imposed, minimizing both the total path distance of all AGVs and the single path distance of each AGV, yielding the optimal shortest total path distance. The simulation results show that the total path distance of all AGVs and the longest single AGV path distance are shortened by using the improved genetic algorithm.
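
    The double-path idea described above can be illustrated with a small sketch in which fitness combines the total distance of all AGVs with the longest single-AGV route. The encoding, operators and weights are simplified assumptions; in particular, the paper's three-exchange crossover operator is not reproduced here.

```python
# Illustrative GA for multi-AGV routing with a "double-path" objective: fitness
# combines the total distance of all AGVs with the longest single-AGV route.
# Tasks, encoding and operators are simplified assumptions, not the paper's method.
import random

TASKS = [(2, 3), (5, 1), (8, 7), (1, 9), (6, 6), (9, 2)]  # hypothetical task points
DEPOT = (0, 0)
N_AGV = 2

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def route_length(points):
    path = [DEPOT] + points + [DEPOT]
    return sum(dist(path[i], path[i + 1]) for i in range(len(path) - 1))

def fitness(assignment):
    # assignment[i] = AGV index serving task i; lower fitness is better
    routes = [[TASKS[i] for i, a in enumerate(assignment) if a == k] for k in range(N_AGV)]
    lengths = [route_length(r) for r in routes]
    return sum(lengths) + max(lengths)   # double-path objective, equal weights assumed

def evolve(pop_size=30, generations=200, p_mut=0.2):
    pop = [[random.randrange(N_AGV) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(TASKS))
            child = a[:cut] + b[cut:]                          # one-point crossover
            if random.random() < p_mut:
                child[random.randrange(len(TASKS))] = random.randrange(N_AGV)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(best, round(fitness(best), 2))
```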

  7. Airway Clearance Techniques (ACTs)

    Medline Plus

    Full Text Available ... decisions about your health care. CF Genetics: The ... Diagnosing CF is a multistep process. A complete diagnostic evaluation should include a newborn screening, a sweat chloride ...

  8. Physiological Self-Regulation and Adaptive Automation

    Science.gov (United States)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.

  9. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging.

    Science.gov (United States)

    Patel, Tapan P; Man, Karen; Firestein, Bonnie L; Meaney, David F

    2015-03-30

    Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s-1000+neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high speed fluorescence imaging data is lacking. Here we introduce FluoroSNNAP, Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single-cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule associated protein tau and wild-type tau. We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. Copyright © 2015. Published by Elsevier B.V.
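
    The template-based transient detection mentioned in this abstract can be illustrated in a few lines: a fluorescence trace is compared against a stereotyped transient template by sliding normalized correlation, and local maxima above a correlation threshold are reported as events. This is a generic sketch, not FluoroSNNAP's implementation; the template shape, sampling assumptions and threshold are illustrative.

```python
# Generic sketch of template-based event detection on a dF/F calcium trace.
# Not FluoroSNNAP's code: template shape, threshold and sampling rate are assumed.
import numpy as np

def make_template(rise=2, decay=20):
    """Stereotyped calcium transient: fast rise, exponential decay (in samples)."""
    t = np.arange(rise + decay, dtype=float)
    tpl = np.where(t < rise, t / rise, np.exp(-(t - rise) / decay))
    return (tpl - tpl.mean()) / tpl.std()

def detect_events(trace, threshold=0.85):
    """Return sample indices where the sliding correlation exceeds the threshold."""
    tpl = make_template()
    n = len(tpl)
    corr = np.zeros(len(trace) - n)
    for i in range(len(corr)):
        win = trace[i:i + n]
        win = (win - win.mean()) / (win.std() + 1e-12)
        corr[i] = float(np.dot(win, tpl)) / n          # Pearson-style correlation
    # keep local maxima above the threshold
    return [i for i in range(1, len(corr) - 1)
            if corr[i] > threshold and corr[i] >= corr[i - 1] and corr[i] >= corr[i + 1]]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trace = rng.normal(0, 0.02, 500)
    trace[100:122] += make_template() * 0.5 + 0.5      # inject one synthetic transient
    print(detect_events(trace))
```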

  10. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchases system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  11. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, with stimulus size modulation during testing, based on stimulus intensity and conventional standard automated perimetry, with that of the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values between size modulation standard automated perimetry and conventional standard automated perimetry were moderately to strongly correlated (p 33.40, p modulation standard automated perimetry than with conventional standard automated perimetry, but the visual-field defect size was smaller (p modulation-standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry (p modulation standard automated perimetry than with conventional standard automated perimetry (p = 0.02). Global indices and the point-wise threshold value of the two testing modalities correlated well. However, the potential of a large stimulus presented at an area with a decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  12. Extent and Area of Swidden in Montane Mainland Southeast Asia: Estimation by Multi-Step Thresholds with Landsat-8 OLI Data

    Directory of Open Access Journals (Sweden)

    Peng Li

    2016-01-01

    Full Text Available Information on the distribution, area and extent of the swidden agriculture landscape is necessary for implementing the program of Reducing Emissions from Deforestation and Forest Degradation (REDD), biodiversity conservation and local livelihood improvement. To our knowledge, explicit spatial maps and accurate area data on swidden agriculture remain surprisingly lacking. However, this traditional farming practice has been transforming into other profit-driven land use, like tree plantations and permanent cash agriculture. Swidden agriculture is characterized by a rotational and dynamic nature of agroforestry, with land cover changing from natural forests and newly-cleared swiddens to different-aged fallows. The Operational Land Imager (OLI) onboard the Landsat-8 satellite has visible, near-infrared and shortwave infrared bands, which are sensitive to changes in vegetation cover, land surface moisture content and soil exposure; therefore, four vegetation indices (VIs) were calculated, including the Normalized Difference Vegetation Index (NDVI), the Normalized Difference Moisture Index (NDMI), the Normalized Burn Ratio (NBR) and the Soil Adjusted Vegetation Index (SAVI). In this study, we developed a multi-step threshold approach that uses a combination of thresholds of the four VIs and local elevation range (LER) and applied it to detect and map newly-opened swiddens and different-aged fallows using OLI imagery acquired between 2013 and 2015. The resultant Landsat-derived swidden agriculture maps are highly accurate, with an overall accuracy of 86.9% and a Kappa coefficient of 0.864. The results of this study indicated that the Landsat-based multi-step threshold algorithms could potentially be applied to monitor the long-term change pattern of swidden agriculture in montane mainland Southeast Asia since the late 1980s and also in other tropical regions, like insular Southeast Asia, South Asia, Latin America and Central Africa, where swidden agriculture is
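
    The four vegetation indices are simple band ratios of the OLI reflectance bands, and the multi-step classification reduces to a sequence of threshold tests on them. The sketch below uses the standard OLI band assignments (red = band 4, NIR = band 5, SWIR1 = band 6, SWIR2 = band 7); the threshold values are placeholders, since the study derived its cut-offs empirically.

```python
# Vegetation indices used in the multi-step threshold approach, computed from
# Landsat-8 OLI surface-reflectance bands. Thresholds below are placeholders;
# the study derived its own cut-offs empirically.
import numpy as np

def indices(red, nir, swir1, swir2, L=0.5):
    eps = 1e-10
    ndvi = (nir - red) / (nir + red + eps)
    ndmi = (nir - swir1) / (nir + swir1 + eps)
    nbr = (nir - swir2) / (nir + swir2 + eps)
    savi = (1 + L) * (nir - red) / (nir + red + L + eps)
    return ndvi, ndmi, nbr, savi

def newly_opened_swidden(red, nir, swir1, swir2):
    """Example of one threshold step: bare, dry, recently cleared pixels."""
    ndvi, ndmi, nbr, _ = indices(red, nir, swir1, swir2)
    return (ndvi < 0.35) & (ndmi < 0.0) & (nbr < 0.1)   # illustrative cut-offs

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    shape = (4, 4)
    red, nir = rng.uniform(0.05, 0.3, shape), rng.uniform(0.1, 0.5, shape)
    swir1, swir2 = rng.uniform(0.1, 0.4, shape), rng.uniform(0.05, 0.35, shape)
    print(newly_opened_swidden(red, nir, swir1, swir2))
```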

  13. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  14. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  15. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10⁹ bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and in some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer aided interpretation). The system called FDRS (for Film Digital Radiography System) is moving toward 50 micron (approx. 16 lines/mm) resolution, which is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system. (author) [pt

  16. Genetic predisposition and implications for radioprotection

    Energy Technology Data Exchange (ETDEWEB)

    Streffer, Christian [University Clinics, Essen, Essen (Germany)

    2000-05-01

    Treatments of cancer patients with ionizing radiation have shown in some cases severe acute radiation effects after radiation doses which are very well tolerated by most patients. Skin fibroblasts of these patients, studied after in vitro irradiation, also frequently showed a high radiosensitivity. It was found that these effects are based on genetic predisposition, usually inherited from the patients' parents. During recent years quite a number of these syndromes have been described in humans and often the responsible genes have been characterized: Ataxia telangiectasia, Bloom's syndrome, Fanconi anemia, Li Fraumeni syndrome, Nevoid basal cell carcinoma syndrome, Neurofibromatosis, Nijmegen breakage syndrome, Retinoblastoma. In most cases it was found that the regulation of DNA repair and of the cell cycle for cell proliferation is disturbed; frequently these processes cannot be separated from each other. Quite a number of these syndromes also show genomic instability, which can also be induced by radiation exposures. These phenomena have mainly been studied by determining the rate of chromosomal aberrations many cell generations after the exposure took place. Genomic instability apparently plays an important role in the development of stochastic late effects, for which multistep events are necessary; this is especially the case for carcinogenesis. In mice it has been shown that radiation-induced genomic instability can be transmitted to the next mouse generation. In mouse models and also in radiotherapy patients it has been shown that genetic predisposition not only increases radiosensitivity with respect to cell survival and chromosomal damage but also with respect to carcinogenesis. This has been observed, for example, with p53-knockout mice and with children after radiotherapy, e.g. for the treatment of retinoblastoma. In children with a genetic predisposition for retinoblastoma, secondary tumours occurred at a much higher rate than in those children with

  17. Genetic predisposition and implications for radioprotection

    International Nuclear Information System (INIS)

    Streffer, Christian

    2000-01-01

    Treatments of cancer patients with ionizing radiation have shown in some cases severe acute radiation effects after radiation doses which are very well tolerated by most patients. Skin fibroblasts of these patients, studied after in vitro irradiation, also frequently showed a high radiosensitivity. It was found that these effects are based on genetic predisposition, usually inherited from the patients' parents. During recent years quite a number of these syndromes have been described in humans and often the responsible genes have been characterized: Ataxia telangiectasia, Bloom's syndrome, Fanconi anemia, Li Fraumeni syndrome, Nevoid basal cell carcinoma syndrome, Neurofibromatosis, Nijmegen breakage syndrome, Retinoblastoma. In most cases it was found that the regulation of DNA repair and of the cell cycle for cell proliferation is disturbed; frequently these processes cannot be separated from each other. Quite a number of these syndromes also show genomic instability, which can also be induced by radiation exposures. These phenomena have mainly been studied by determining the rate of chromosomal aberrations many cell generations after the exposure took place. Genomic instability apparently plays an important role in the development of stochastic late effects, for which multistep events are necessary; this is especially the case for carcinogenesis. In mice it has been shown that radiation-induced genomic instability can be transmitted to the next mouse generation. In mouse models and also in radiotherapy patients it has been shown that genetic predisposition not only increases radiosensitivity with respect to cell survival and chromosomal damage but also with respect to carcinogenesis. This has been observed, for example, with p53-knockout mice and with children after radiotherapy, e.g. for the treatment of retinoblastoma. In children with a genetic predisposition for retinoblastoma, secondary tumours occurred at a much higher rate than in those children with

  18. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  19. Multistep change in epidermal growth factor receptors during spontaneous neoplastic progression in Chinese hamster embryo fibroblasts

    International Nuclear Information System (INIS)

    Wakshull, E.; Kraemer, P.M.; Wharton, W.

    1985-01-01

    Whole Chinese hamster embryo lineages have been shown to undergo multistep spontaneous neoplastic progression during serial passage in culture. The authors have studied the binding, internalization, and degradation of ¹²⁵I-labeled epidermal growth factor at four different stages of transformation. The whole Chinese hamster embryo cells lost cell surface epidermal growth factor receptors gradually during the course of neoplastic progression until only 10% of the receptor number present in the early-passage cells (precrisis) were retained in the late-passage cells (tumorigenic). No differences in internalization rates, chloroquine sensitivity, or ability to degrade hormone between the various passage levels were seen. No evidence for the presence in conditioned medium of transforming growth factors which might mask or down-regulate epidermal growth factor receptor was obtained. These results suggest that a reduction in cell surface epidermal growth factor receptor might be an early event during spontaneous transformation in whole Chinese hamster embryo cells

  20. A New Multi-Step Iterative Algorithm for Approximating Common Fixed Points of a Finite Family of Multi-Valued Bregman Relatively Nonexpansive Mappings

    Directory of Open Access Journals (Sweden)

    Wiyada Kumam

    2016-05-01

    Full Text Available In this article, we introduce a new multi-step iteration for approximating a common fixed point of a finite class of multi-valued Bregman relatively nonexpansive mappings in the setting of reflexive Banach spaces. We prove a strong convergence theorem for the proposed iterative algorithm under certain hypotheses. Additionally, we also use our results for the solution of variational inequality problems and to find the zero points of maximal monotone operators. The theorems furnished in this work are new and well-established and generalize many well-known recent research works in this field.
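
    For orientation, a generic multi-step iteration of the kind studied in such work can be written as below. This is a schematic template rather than the authors' exact algorithm; in the Bregman setting the convex combinations are taken through the gradient of the chosen convex function and a Bregman projection, which is omitted here.

```latex
% Schematic multi-step iteration for mappings T_1,...,T_N (not the paper's exact scheme)
\begin{align*}
  y_n^{(1)} &= \alpha_{n,1}\, x_n + (1-\alpha_{n,1})\, T_1 x_n,\\
  y_n^{(i)} &= \alpha_{n,i}\, x_n + (1-\alpha_{n,i})\, T_i\, y_n^{(i-1)},
              \qquad i = 2,\dots,N,\\
  x_{n+1}  &= y_n^{(N)},
\end{align*}
where $\{\alpha_{n,i}\}\subset(0,1)$ are control sequences.
```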

  1. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system to meet the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system based on computer vision, followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  2. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation
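
    The final merging step, combining the morphological and gray-level features of each nodule candidate through a linear discriminant, can be sketched with scikit-learn. The feature names and data below are synthetic stand-ins, not the study's screening-CT database.

```python
# Sketch of the linear-discriminant step: candidate features in, malignancy score out.
# Features and labels are synthetic; the study used 393 screening CT scans.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 200
# hypothetical candidate features: [volume_mm3, mean_HU, sphericity]
benign = np.column_stack([rng.normal(150, 40, n), rng.normal(-650, 80, n), rng.normal(0.8, 0.05, n)])
malign = np.column_stack([rng.normal(400, 120, n), rng.normal(-300, 100, n), rng.normal(0.6, 0.08, n)])
X = np.vstack([benign, malign])
y = np.array([0] * n + [1] * n)

clf = LinearDiscriminantAnalysis().fit(X, y)
scores = clf.decision_function(X)            # discriminant score per candidate
print("ROC AUC (training data):", round(roc_auc_score(y, scores), 3))
```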

  3. Automation of Electrical Cable Harnesses Testing

    Directory of Open Access Journals (Sweden)

    Zhuming Bi

    2017-12-01

    Full Text Available Traditional automated systems, such as industrial robots, are applied in well-structured environments, and many automated systems have a limited adaptability to deal with complexity and uncertainty; therefore, the applications of industrial robots in small- and medium-sized enterprises (SMEs are very limited. The majority of manual operations in SMEs are too complicated for automation. The rapidly developed information technologies (IT has brought new opportunities for the automation of manufacturing and assembly processes in the ill-structured environments. Note that an automation solution should be designed to meet the given requirements of the specified application, and it differs from one application to another. In this paper, we look into the feasibility of automated testing for electric cable harnesses, and our focus is on some of the generic strategies for the improvement of the adaptability of automation solutions. Especially, the concept of modularization is adopted in developing hardware and software to maximize system adaptability in testing a wide scope of products. A proposed system has been implemented, and the system performances have been evaluated by executing tests on actual products. The testing experiments have shown that the automated system outperformed manual operations greatly in terms of cost-saving, productivity and reliability. Due to the potential of increasing system adaptability and cost reduction, the presented work has its theoretical and practical significance for an extension for other automation solutions in SMEs.

  4. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  5. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In an advanced MCR, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities
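
    The quantification idea, scoring an automation design by the failure probability of the cognitive activities left to the operator, can be illustrated with a toy calculation. The activity list, error probabilities and independence assumption are all illustrative, not the authors' model.

```python
# Toy illustration of scoring automation effectiveness via human failure probability.
# Activities, probabilities and the independence assumption are illustrative only.

# Per-activity human error probabilities left to the operator at a given level
# of automation (LOA); a higher LOA removes activities from the operator.
ACTIVITIES = {
    "monitoring": 0.003,
    "situation_assessment": 0.010,
    "decision": 0.008,
    "action_execution": 0.002,
}

def human_failure_probability(operator_tasks):
    """P(at least one human error), assuming independent activities."""
    p_ok = 1.0
    for task in operator_tasks:
        p_ok *= 1.0 - ACTIVITIES[task]
    return 1.0 - p_ok

if __name__ == "__main__":
    manual = human_failure_probability(ACTIVITIES.keys())
    assisted = human_failure_probability(["monitoring", "decision"])
    print(f"manual: {manual:.4f}, decision support: {assisted:.4f}")
```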

  6. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  7. Ask the experts: automation: part I.

    Science.gov (United States)

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  8. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  9. Demands on digital automation; Anforderungen an die Digitale Automation

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, P.

    1995-12-31

    In chapter 12 of the anthology on building control, the demands on digital automation are presented. The following aspects are discussed: the range of company philosophies, the demands of the customer/investor, the demands of building/room use, the users and operators, and the point of view of those who construct the building services installations. (BWI)

  10. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  11. Automations influence on nuclear power plants: a look at three accidents and how automation played a role.

    Science.gov (United States)

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient, sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPPs). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper looks at the effects of automation within three NPP accidents and incidents and considers why automation failed to prevent these accidents from occurring. It also reviews the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  12. Genetic Algorithm Calibration of Probabilistic Cellular Automata for Modeling Mining Permit Activity

    Science.gov (United States)

    Louis, S.J.; Raines, G.L.

    2003-01-01

    We use a genetic algorithm to calibrate a spatially and temporally resolved cellular automaton to model mining activity on public land in Idaho and western Montana. The genetic algorithm searches through a space of transition-rule parameters of a two-dimensional cellular automaton model to find rule parameters that fit observed mining activity data. Previous work by one of the authors in calibrating the cellular automaton took weeks - the genetic algorithm takes a day and produces rules leading to about the same (or better) fit to observed data. These preliminary results indicate that genetic algorithms are a viable tool for calibrating cellular automata for this application. Experience gained during the calibration of this cellular automaton suggests that mineral resource information is a critical factor in the quality of the results. With automated calibration, further refinements of how the mineral-resource information is provided to the cellular automaton will probably improve our model.
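
    The calibration loop pairs a forward run of the cellular automaton with a genetic search over its transition-rule parameters, scoring each candidate by agreement with the observed activity map. The sketch below uses a toy probabilistic cellular automaton and synthetic "observed" data purely to show the loop structure; it is not the authors' mining-activity model.

```python
# Structure of the calibration loop: a GA searches transition-rule parameters of a
# probabilistic cellular automaton so that simulated activity matches an observed map.
# The CA rule, grid and "observed" data are toy stand-ins for the mining-activity model.
import numpy as np

rng = np.random.default_rng(0)
SIZE, STEPS = 20, 5
resource = rng.uniform(0, 1, (SIZE, SIZE))                        # stand-in resource layer
observed = (resource + rng.normal(0, 0.2, (SIZE, SIZE))) > 0.7    # synthetic "observed" activity

def simulate(params):
    a, b, c = params
    state = np.zeros((SIZE, SIZE), dtype=bool)
    for _ in range(STEPS):
        nbrs = sum(np.roll(np.roll(state, dy, 0), dx, 1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - state
        p = 1.0 / (1.0 + np.exp(-(a * nbrs + b * resource + c)))  # activation probability
        state = state | (rng.uniform(size=state.shape) < p)
    return state

def fitness(params):
    return float((simulate(params) == observed).mean())          # agreement with observation

def genetic_search(pop_size=20, generations=30):
    pop = rng.uniform(-3, 3, (pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # keep best half
        children = parents + rng.normal(0, 0.3, parents.shape)    # mutation-only variation
        pop = np.vstack([parents, children])
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = genetic_search()
    print("best rule parameters:", np.round(best, 2), "fit:", round(fitness(best), 3))
```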

  13. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  14. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  15. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  16. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  17. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  18. Future Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011). 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011) has been held in Zhuhai, China, November 19-20, 2011. This volume topics covered include wireless communications, advances in wireless video, wireless sensors networking, security in wireless networks, network measurement and management, hybrid and discrete-event systems, internet analytics and automation, robotic system and applications, reconfigurable automation systems, machine vision in automation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  19. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  20. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    Science.gov (United States)

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust promoting (Trust promoted group) and trust lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. The odds of participants of the Trust promoted group to overrule the automated driving system in the non-critical situations were 3.65 times (Situation 1) to 5 times (Situation 3) higher. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group compared to no participant of the Trust lowered group collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and if a critical take-over situation can be

  1. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  2. GUI test automation for Qt application

    OpenAIRE

    Wang, Lei

    2015-01-01

    GUI test automation is a popular and interesting subject in the testing industry. Many companies plan to start test automation projects in order to implement efficient, less expensive software testing. However, there are challenges for testing teams that lack experience in performing GUI test automation. Many GUI test automation projects have ended in failure due to mistakes made during the early stages of the project. The major work of this thesis is to find a solution to the challenges of e...

  3. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of ⁶⁸Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived ⁶⁸Ge/⁶⁸Ga generator has been at the basis of the development of ⁶⁸Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for ⁶⁸Ga radiochemistry, different engineering and software strategies and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry have boosted the wide diffusion of ⁶⁸Ga radiopharmaceuticals. ► Different technological approaches for ⁶⁸Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered.

  4. 76 FR 69755 - National Customs Automation Program Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation... announces U.S. Customs and Border Protection's (CBP's) plan to conduct a National Customs Automation Program... conveyance transporting the cargo to the United States. This data will fulfill merchandise entry requirements...

  5. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    Science.gov (United States)

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  6. A map of directional genetic interactions in a metazoan cell.

    Science.gov (United States)

    Fischer, Bernd; Sandmann, Thomas; Horn, Thomas; Billmann, Maximilian; Chaudhary, Varun; Huber, Wolfgang; Boutros, Michael

    2015-03-06

    Gene-gene interactions shape complex phenotypes and modify the effects of mutations during development and disease. The effects of statistical gene-gene interactions on phenotypes have been used to assign genes to functional modules. However, directional, epistatic interactions, which reflect regulatory relationships between genes, have been challenging to map at large-scale. Here, we used combinatorial RNA interference and automated single-cell phenotyping to generate a large genetic interaction map for 21 phenotypic features of Drosophila cells. We devised a method that combines genetic interactions on multiple phenotypes to reveal directional relationships. This network reconstructed the sequence of protein activities in mitosis. Moreover, it revealed that the Ras pathway interacts with the SWI/SNF chromatin-remodelling complex, an interaction that we show is conserved in human cancer cells. Our study presents a powerful approach for reconstructing directional regulatory networks and provides a resource for the interpretation of functional consequences of genetic alterations.

  7. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  8. Evaluation of an Automated Keywording System.

    Science.gov (United States)

    Malone, Linda C.; And Others

    1990-01-01

    Discussion of automated indexing techniques focuses on ways to statistically document improvements in the development of an automated keywording system over time. The system developed by the Joint Chiefs of Staff to automate the storage, categorization, and retrieval of information from military exercises is explained, and performance measures are…

  9. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. Based on the classification of the manuscripts considered, the volume is divided into six sessions: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, and Automation and Mechatronics.

  10. Automation for a base station stability testing

    OpenAIRE

    Punnek, Elvis

    2016-01-01

    This Bachelor's thesis was commissioned by Oy LM Ericsson Ab Oulu. Its aim was to investigate and create a test automation solution for the stability testing of an LTE base station. The main objective was to create test automation for a predefined test set. This test automation solution had to be created for specific environments and equipment. The work included creating the automation for the test cases and adding them to daily test automation jobs. The key factor...

  11. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Science.gov (United States)

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results show improved assessment accuracy over GP and ANN, as well as good generalization capability with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel skeletal age assessment strategies. According to the experimental results, the presented method can learn many hundreds of times faster than traditional learning methods and shows sufficient overall performance in many respects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age.
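
    For readers unfamiliar with extreme learning machines, the sketch below illustrates the general idea credited with the speed advantage: a single hidden layer with fixed random weights, so that only the output weights are fitted by a closed-form least-squares solve. It uses synthetic data and is not the paper's bone-age model.

    ```python
    import numpy as np

    # Minimal extreme learning machine (ELM) regression sketch on synthetic data.
    # Hidden-layer weights are random and fixed; only the output weights beta are
    # fitted, via a least-squares solve -- which is why ELM training is fast.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 5))              # synthetic features
    y = np.sin(X.sum(axis=1)) + 0.05 * rng.standard_normal(200)

    n_hidden = 50
    W = rng.standard_normal((X.shape[1], n_hidden))    # random input weights
    b = rng.standard_normal(n_hidden)                  # random biases
    H = np.tanh(X @ W + b)                             # hidden-layer activations

    beta, *_ = np.linalg.lstsq(H, y, rcond=None)       # closed-form output weights
    y_hat = H @ beta
    print("training RMSE:", float(np.sqrt(np.mean((y - y_hat) ** 2))))
    ```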

  13. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Management…

  14. Modeling genetic imprinting effects of DNA sequences with multilocus polymorphism data

    Directory of Open Access Journals (Sweden)

    Staud Roland

    2009-08-01

    Full Text Available Single nucleotide polymorphisms (SNPs) represent the most widespread type of DNA sequence variation in the human genome and they have recently emerged as valuable genetic markers for revealing the genetic architecture of complex traits in terms of nucleotide combination and sequence. Here, we extend an algorithmic model for the haplotype analysis of SNPs to estimate the effects of genetic imprinting expressed at the DNA sequence level. The model provides a general procedure for identifying the number and types of optimal DNA sequence variants that are expressed differently due to their parental origin. The model is used to analyze a genetic data set collected from a pain genetics project. We find that DNA haplotype GAC from three SNPs, OPRKG36T (with two alleles G and T), OPRKA843G (with alleles A and G), and OPRKC846T (with alleles C and T), at the kappa-opioid receptor, triggers a significant effect on pain sensitivity, but with expression significantly depending on the parent from which it is inherited (p = 0.008). With a tremendous advance in SNP identification and automated screening, the model founded on haplotype discovery and statistical inference may provide a useful tool for genetic analysis of any quantitative trait with complex inheritance.
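
    The toy regression below is not the paper's haplotyping algorithm; it merely illustrates, on simulated data, what a parent-of-origin (imprinting) effect looks like statistically: the trait depends not only on carrying a haplotype but also on which parent transmitted it.

    ```python
    import numpy as np

    # Toy illustration of a parent-of-origin (imprinting) effect, not the paper's
    # haplotyping algorithm: regress a quantitative trait on carrier status of a
    # haplotype plus an indicator of whether the copy was maternally inherited.
    rng = np.random.default_rng(1)
    n = 300
    carrier = rng.integers(0, 2, n)               # carries haplotype GAC? (0/1)
    maternal = carrier * rng.integers(0, 2, n)    # copy inherited from mother? (0/1)
    trait = 1.0 * carrier + 1.5 * maternal + rng.standard_normal(n)  # simulated score

    X = np.column_stack([np.ones(n), carrier, maternal])
    coef, *_ = np.linalg.lstsq(X, trait, rcond=None)
    print("carrier effect: %.2f, extra effect when maternally inherited: %.2f"
          % (coef[1], coef[2]))
    ```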

  15. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades, automation systems have become more common and today range from large industrial solutions to the homes of private customers. With the growing need for ecological and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating, etc. Auto...

  16. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

    The purpose of this Master's thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers. Sarlin focuses on small and minor automation projects on domestic markets. The thesis presents issues related to project execution, starting from project theory and continuing to kick-off and termination. Site work is one importan...

  17. Organizational changes and automation: Towards a customer-oriented automation: Part 3

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize towards more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on this. The restructuring principle is the same everywhere, but the way it is implemented differs widely. In this article attention is paid to the necessity of realizing an integrated computerized system, which, however, is not feasible at the moment. The second-best alternative is to use various computerized systems capable of two-way data exchange. Two viable approaches are discussed: (1) one operating system on which all automated systems within a company should run, or (2) selective system linking on the basis of the required speed of information exchange. Option (2) offers more freedom in selecting the system. 2 figs

  18. You're a What? Automation Technician

    Science.gov (United States)

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  19. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  20. System reliability, performance and trust in adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  1. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flow detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of stretched continuity defects in samples with a plane surface in magnetographic defectoscopy are discussed. Two models of automated magnetographic flow detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs

  2. Multistep continuous-flow synthesis in medicinal chemistry: discovery and preliminary structure-activity relationships of CCR8 ligands.

    Science.gov (United States)

    Petersen, Trine P; Mirsharghi, Sahar; Rummel, Pia C; Thiele, Stefanie; Rosenkilde, Mette M; Ritzén, Andreas; Ulven, Trond

    2013-07-08

    A three-step continuous-flow synthesis system and its application to the assembly of a new series of chemokine receptor ligands directly from commercial building blocks is reported. No scavenger columns or solvent switches are necessary to recover the desired test compounds, which were obtained in overall yields of 49-94%. The system is modular and flexible, and the individual steps of the sequence can be interchanged with similar outcome, extending the scope of the chemistry. Biological evaluation confirmed activity on the chemokine CCR8 receptor and provided initial structure-activity-relationship (SAR) information for this new ligand series, with the most potent member displaying full agonist activity with single-digit nanomolar potency. To the best of our knowledge, this represents the first published example of efficient use of multistep flow synthesis combined with biological testing and SAR studies in medicinal chemistry. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Automated detection of analyzable metaphase chromosome cells depicted on scanned digital microscopic images

    Science.gov (United States)

    Qiu, Yuchen; Wang, Xingwei; Chen, Xiaodong; Li, Yuhua; Liu, Hong; Li, Shibo; Zheng, Bin

    2010-02-01

    applying this CAD-guided high-resolution microscopic image scanning system to prescreen and select ROIs that may contain analyzable metaphase chromosome cells. The success and the further improvement of this automated scanning system may have great impact on the future clinical practice in genetic laboratories to detect and diagnose diseases.

  4. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  5. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Full Text Available Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought of a knowledgeable reader would probably be: why this old topic again, and what is new to discuss? Yet everyone agrees that test automation today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications now has a complex architecture and hybrid frameworks that facilitate testing of applications developed with various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it, and the change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's views and experience on how the original myths have been transformed into new versions and how these are derived, and offers his thoughts on the new generation of myths.

  7. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production involve some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  8. Active Mycobacterium Infection Due to Intramuscular BCG Administration Following Multi-Steps Medication Errors

    Directory of Open Access Journals (Sweden)

    MohammadReza Rafati

    2015-10-01

    Full Text Available Bacillus Calmette-Guérin (BCG) is indicated for the treatment of primary or relapsing flat urothelial cell carcinoma in situ (CIS) of the urinary bladder. Disseminated infectious complications occasionally occur when BCG is used as a vaccine or as intravesical therapy. Intramuscular (IM) or intravenous (IV) administration of BCG is a rare medication error that is more likely to produce systemic infection. This report presents the case of a 13-year-old patient in whom a sequence of medication errors occurred, involving the physician's handwriting, pharmacy dispensing, nursing administration, and the patient's family. The physician wrote βHCG instead of HCG in the prescription; βHCG was read as BCG by the pharmacy staff, and 6 vials of intravesical BCG were administered IM twice a week for 3 consecutive weeks. The patient experienced fever and chills after each injection, but he was admitted only 2 months after the first IM administration of BCG, with fever and pancytopenia. The medication error was recognized by the physicians only four months after the drug was given, during a second admission due to cellulitis at the BCG injection sites. Use of a handwritten prescription with inappropriate abbreviations, inadequate time spent taking a medical history at the pharmacy, failure to verify the drug name, dose, and route before administration, and failure to consider medication error as an important differential diagnosis all contributed to this multi-step medication error.

  10. A Genome-Wide Screen for Genetic Variants That Modify the Recruitment of REST to Its Target Genes

    Science.gov (United States)

    Johnson, Rory; Richter, Nadine; Bogu, Gireesh K.; Bhinge, Akshay; Teng, Siaw Wei; Choo, Siew Hua; Andrieux, Lise O.; de Benedictis, Cinzia; Jauch, Ralf; Stanton, Lawrence W.

    2012-01-01

    Increasing numbers of human diseases are being linked to genetic variants, but our understanding of the mechanistic links leading from DNA sequence to disease phenotype is limited. The majority of disease-causing nucleotide variants fall within the non-protein-coding portion of the genome, making it likely that they act by altering gene regulatory sequences. We hypothesised that SNPs within the binding sites of the transcriptional repressor REST alter the degree of repression of target genes. Given that changes in the effective concentration of REST contribute to several pathologies—various cancers, Huntington's disease, cardiac hypertrophy, vascular smooth muscle proliferation—these SNPs should alter disease-susceptibility in carriers. We devised a strategy to identify SNPs that affect the recruitment of REST to target genes through the alteration of its DNA recognition element, the RE1. A multi-step screen combining genetic, genomic, and experimental filters yielded 56 polymorphic RE1 sequences with robust and statistically significant differences of affinity between alleles. These SNPs have a considerable effect on the functional recruitment of REST to DNA in a range of in vitro, reporter gene, and in vivo analyses. Furthermore, we observe allele-specific biases in deeply sequenced chromatin immunoprecipitation data, consistent with predicted differences in RE1 affinity. Amongst the targets of polymorphic RE1 elements are important disease genes including NPPA, PTPRT, and CDH4. Thus, considerable genetic variation exists in the DNA motifs that connect gene regulatory networks. Recently available ChIP–seq data allow the annotation of human genetic polymorphisms with regulatory information to generate prior hypotheses about their disease-causing mechanism. PMID:22496669
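
    As a rough illustration of the allele-specific bias analysis mentioned above, the sketch below applies an exact two-sided binomial test to hypothetical ChIP-seq read counts at a heterozygous RE1 SNP; the actual study's statistics are not reproduced here.

    ```python
    from math import comb

    # Minimal allele-specific binding check at a heterozygous SNP: two-sided exact
    # binomial test of the read counts against a 50:50 null. Counts are hypothetical.
    ref_reads, alt_reads = 72, 38
    n = ref_reads + alt_reads
    k = max(ref_reads, alt_reads)

    # two-sided p-value: probability of an imbalance at least this extreme
    p_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    p_value = min(1.0, 2 * p_tail)
    print(f"reads {ref_reads}:{alt_reads}, two-sided binomial p = {p_value:.4f}")
    ```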

  11. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
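
    The toy example below mirrors the 0/1 selection model behind automated test assembly: maximize total item information subject to a test-length and content constraint. A real ATA problem would be handed to a mixed-integer-programming solver; the brute-force search and the item data here are purely illustrative.

    ```python
    from itertools import combinations

    # Tiny illustration of the 0/1 selection behind automated test assembly:
    # choose a fixed number of items maximizing total information subject to a
    # content constraint. Real ATA uses a mixed-integer-programming solver; this
    # brute force over a hypothetical 8-item bank just shows the model structure.
    items = [  # (item_id, information_at_theta0, content_area)
        ("i1", 0.42, "algebra"), ("i2", 0.35, "algebra"), ("i3", 0.50, "geometry"),
        ("i4", 0.28, "geometry"), ("i5", 0.61, "algebra"), ("i6", 0.33, "geometry"),
        ("i7", 0.47, "algebra"), ("i8", 0.39, "geometry"),
    ]
    form_length, min_geometry = 4, 2

    best = None
    for form in combinations(items, form_length):
        if sum(1 for _, _, area in form if area == "geometry") < min_geometry:
            continue                              # violates content constraint
        info = sum(i for _, i, _ in form)         # objective: total information
        if best is None or info > best[0]:
            best = (info, [name for name, _, _ in form])

    print("selected form:", best[1], "total information: %.2f" % best[0])
    ```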

  12. Context-Aware user interfaces in automation

    DEFF Research Database (Denmark)

    Olsen, Mikkel Holm

    2007-01-01

    Automation is deployed in a great range of different domains such as the chemical industry, the production of consumer goods, the production of energy (both in terms of power plants and in the petrochemical industry), transportation and several others. Through several decades the complexity...... of automation systems and the level of automation have been rising. This has caused problems regarding the operator's ability to comprehend the overall situation and state of the automation system, in particular in abnormal situations. The amount of data available to the operator results in information overload....... Since context-aware applications have been developed in other research areas it seems natural to analyze the findings of this research and examine how this can be applied to the domain of automation systems. By evaluating existing architectures for the development of context-aware applications we find...

  13. Automated transit planning, operation, and applications

    CERN Document Server

    Liu, Rongfang

    2016-01-01

    This book analyzes the successful implementations of automated transit in various international locations, such as Paris, Toronto, London, and Kuala Lumpur, and investigates the apparent lack of automated transit applications in the urban environment in the United States. The book begins with a brief definition of automated transit and its historical development. After a thorough description of the technical specifications, the author highlights a few applications from each sub-group of the automated transit spectrum. International case studies display various technologies and their applications, and identify vital factors that affect each system and performance evaluations of existing applications. The book then discusses the planning and operation of automated transit applications at both macro and micro levels. Finally, the book covers a number of less successful concepts, as well as the lessons learned, allowing readers to gain a comprehensive understanding of the topic.

  14. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.
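
    The per-floor-area figures quoted above follow from a simple division of shed load by floor area, as the short check below shows (values rounded as in the abstract).

    ```python
    # Reproduce the demand-reduction intensity quoted in the abstract:
    # watts of shed load divided by conditioned floor area.
    floor_area_ft2 = 2_000_000          # ~two million square feet across 12 sites
    max_shed_w = 2_000_000              # ~2 MW if all sites shed simultaneously
    avg_shed_w = 1_000_000              # ~1 MW observed on average

    print("max intensity: %.1f W/ft2" % (max_shed_w / floor_area_ft2))   # 1.0
    print("avg intensity: %.1f W/ft2" % (avg_shed_w / floor_area_ft2))   # 0.5
    ```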

  15. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  16. Poly(vinylpyrrolidone)-Free Multistep Synthesis of Silver Nanoplates with Plasmon Resonance in the Near Infrared Range.

    Science.gov (United States)

    Khan, Assad U; Zhou, Zhengping; Krause, Joseph; Liu, Guoliang

    2017-11-01

    Herein, a poly(vinylpyrrolidone) (PVP)-free method is described for synthesizing Ag nanoplates that have localized surface plasmon resonance in the near-infrared (NIR) range. Citrate-capped Ag spherical nanoparticles are first grown into small Ag nanoplates that resonate in the range of 500-800 nm. The small Ag nanoplates are used as seeds to further grow into large Ag nanoplates with a lateral dimension of 100-600 nm and a plasmon resonance wavelength of 800-1660 nm and above. The number of growth steps can be increased as desired. Without introducing additional citrate into the solutions of small Ag nanoplate seeds, large Ag nanoplates can be synthesized within minutes. The entire synthesis is completely PVP free, which promotes the nanoparticle growth along the lateral direction to form large Ag nanoplates. The multistep growth and the minimum usage of citrate are essential for the fast growth of high-aspect-ratio Ag nanoplates resonating in the NIR range. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. LIBRARY AUTOMATION IN NIGERIAN UNIVERSITIES

    African Journals Online (AJOL)

    facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the. Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  18. Automated 741 document preparation: Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS)

    International Nuclear Information System (INIS)

    Austin, H.C.; Gray, L.M.

    1982-01-01

    OASIS has been providing for Oak Ridge National Laboratory's total safeguards needs since being placed on line in April 1980. The system supports near real-time nuclear materials safeguards and accountability control. The original design of OASIS called for an automated facsimile of a 741 document to be prepared as a functional by-product of updating the inventory. An attempt was made to utilize, intact, DOE-Albuquerque's automated 741 system to generate the facsimile; however, the five-page document produced proved too cumbersome. Albuquerque's programs were modified to print an original 741 document utilizing standard DOE/NRC 741 forms. It is felt that the best features of both the automated and manually generated 741 documents have been incorporated. Automation of the source data for 741 shipping documents produces greater efficiency while reducing possible errors. Through utilization of the standard DOE/NRC form, continuity within the NMMSS system is maintained, thus minimizing the confusion and redundancy associated with facsimiles. OASIS now fulfills the original concept of near real-time accountability by furnishing a viable 741 document as a function of updating the inventory.

  19. Evaluation of glycodendron and synthetically-modified dextran clearing agents for multi-step targeting of radioisotopes for molecular imaging and radioimmunotherapy

    Science.gov (United States)

    Cheal, Sarah M.; Yoo, Barney; Boughdad, Sarah; Punzalan, Blesida; Yang, Guangbin; Dilhas, Anna; Torchon, Geralda; Pu, Jun; Axworthy, Don B.; Zanzonico, Pat; Ouerfelli, Ouathek; Larson, Steven M.

    2014-01-01

    A series of N-acetylgalactosamine-dendrons (NAG-dendrons) and dextrans bearing biotin moieties were compared for their ability to complex with and sequester circulating bispecific anti-tumor antibody (scFv4) streptavidin (SA) fusion protein (scFv4-SA) in vivo, to improve tumor to normal tissue concentration ratios for targeted radioimmunotherapy and diagnosis. Specifically, a total of five NAG-dendrons employing a common synthetic scaffold structure containing 4, 8, 16, or 32 carbohydrate residues and a single biotin moiety were prepared (NAGB), and for comparative purposes, a biotinylated-dextran with average molecular weight (MW) of 500 kD was synthesized from amino-dextran (DEXB). One of the NAGB compounds, CA16, has been investigated in humans; our aim was to determine if other NAGB analogs (e.g. CA8 or CA4) were bioequivalent to CA16 and/or better suited as MST reagents. In vivo studies included dynamic positron-emission tomography (PET) imaging of 124I-labelled-scFv4-SA clearance and dual-label biodistribution studies following multi-step targeting (MST) directed at subcutaneous (s.c.) human colon adenocarcinoma xenografts in mice. The MST protocol consists of three injections: first, a bispecific antibody specific for an anti-tumor associated glycoprotein (TAG-72) single chain genetically-fused with SA (scFv4-SA); second, CA16 or other clearing agent; and third, radiolabeled biotin. We observed using PET imaging of 124I-labelled-scFv4-SA clearance that the spatial arrangement of ligands conjugated to NAG (i.e. biotin) can impact the binding to antibody in circulation and subsequent liver uptake of the NAG-antibody complex. Also, NAGB CA32-LC or CA16-LC can be utilized during MST to achieve comparable tumor-to-blood ratios and absolute tumor uptake seen previously with CA16. Finally, DEXB was equally effective as NAGB CA32-LC at lowering scFv4-SA in circulation, but at the expense of reducing absolute tumor uptake of radiolabeled biotin. PMID:24219178

  20. Individual differences in the calibration of trust in automation.

    Science.gov (United States)

    Pop, Vlad L; Shrewsbury, Alex; Durso, Francis T

    2015-06-01

    The objective was to determine whether operators with an expectancy that automation is trustworthy are better at calibrating their trust to changes in the capabilities of automation, and if so, why. Studies suggest that individual differences in automation expectancy may be able to account for why changes in the capabilities of automation lead to a substantial change in trust for some, yet only a small change for others. In a baggage screening task, 225 participants searched for weapons in 200 X-ray images of luggage. Participants were assisted by an automated decision aid exhibiting different levels of reliability. Measures of expectancy that automation is trustworthy were used in conjunction with subjective measures of trust and perceived reliability to identify individual differences in trust calibration. Operators with high expectancy that automation is trustworthy were more sensitive to changes (both increases and decreases) in automation reliability. This difference was eliminated by manipulating the causal attribution of automation errors. Attributing the cause of automation errors to factors external to the automation fosters an understanding of tasks and situations in which automation differs in reliability and may lead to more appropriate trust. The development of interventions can lead to calibrated trust in automation. © 2014, Human Factors and Ergonomics Society.

  1. Comparison of microbial community shifts in two parallel multi-step drinking water treatment processes.

    Science.gov (United States)

    Xu, Jiajiong; Tang, Wei; Ma, Jun; Wang, Hong

    2017-07-01

    Drinking water treatment processes remove undesirable chemicals and microorganisms from source water, which is vital to public health protection. The purpose of this study was to investigate the effects of treatment processes and configuration on the microbiome by comparing microbial community shifts in two series of different treatment processes operated in parallel within a full-scale drinking water treatment plant (DWTP) in Southeast China. Illumina sequencing of 16S rRNA genes of water samples demonstrated little effect of coagulation/sedimentation and pre-oxidation steps on bacterial communities, in contrast to dramatic and concurrent microbial community shifts during ozonation, granular activated carbon treatment, sand filtration, and disinfection for both series. A large number of unique operational taxonomic units (OTUs) at these four treatment steps further illustrated their strong shaping power towards the drinking water microbial communities. Interestingly, multidimensional scaling analysis revealed tight clustering of biofilm samples collected from different treatment steps, with Nitrospira, the nitrite-oxidizing bacteria, noted at higher relative abundances in biofilm compared to water samples. Overall, this study provides a snapshot of step-to-step microbial evolvement in multi-step drinking water treatment systems, and the results provide insight to control and manipulation of the drinking water microbiome via optimization of DWTP design and operation.

  2. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    Full Text Available The article focuses on the possibility of automating taxiing, the part of a flight which, under adverse weather conditions, greatly reduces the operational usability of an airport, and the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on visual perception. The article primarily deals with possible ways of obtaining navigational information and transferring it automatically to the controls. Currently available technologies useful for navigation, such as computer vision, Light Detection and Ranging, and Global Navigation Satellite Systems, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were also identified. The result is a proposed combination of systems, along with their installation into the airplane's systems, that makes automated taxiing possible.

  3. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view of safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) in the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility of evaluating failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework for applying expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)
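
    As background to the quantification problem discussed above, the sketch below shows the basic fault-tree arithmetic: basic-event probabilities (here hypothetical, e.g. elicited from expert judgement) are combined through AND/OR gates under an independence assumption.

    ```python
    # Minimal fault-tree quantification sketch: combine basic-event probabilities
    # through AND/OR gates assuming independence. The probabilities below are
    # hypothetical placeholders, e.g. elicited from expert judgement.
    def p_and(*ps):                       # all inputs must fail
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):                        # at least one input fails
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    p_hw = 1e-4                           # hardware failure per demand
    p_sw = 3e-5                           # software (programmable logic) failure
    p_operator = 1e-2                     # operator fails to back up the system

    # Top event: automated protection fails AND operator backup fails.
    p_top = p_and(p_or(p_hw, p_sw), p_operator)
    print(f"top event probability per demand: {p_top:.2e}")
    ```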

  4. Development and investigation of a neutron radiography imaging system with a low-pressure multistep chamber

    International Nuclear Information System (INIS)

    Anisimov, Yu.S.; Chernenko, S.P.; Ivanov, A.B.; Netusil, T.; Peshekhonov, V.D.; Smykov, L.P.; Zanevsky, Yu.V.; Cisar, M.; Horacek, J.; Knourek, J.; Moucka, L.; Nezmar, L.; Pellar, L.; Pochman, J.; Schneider, Z.; Sidak, Z.; Vrba, I.; Bizek, V.; Zavadil, Z.; Beran, P.; Cerny, K.

    1988-01-01

    An imaging system of thermal neutrons for an investigation of digital neutron radiography has been developed and tested. Some characteristics obtained on a neutron radiography beam of an experimental reactor are reported. The coordinates of each event are determined in this system. After processing in an LSI 11/23 computer, a radiograph, accumulated in a histogramming memory of 64 K 16-bit words, is presented on a colour display. A 230×180 mm² low-pressure multistep chamber is used as a detector. Neutron conversion takes place in a 6 μm boron layer enriched to 86% in ¹⁰B. The detection efficiency of thermal neutrons is no less than 3%. The count rate of the system reaches up to 2×10⁵ events per second. A radiograph can be obtained within 10 minutes. The sensitivity of this system to gamma-background is low. One event/s is detected for a background of 1 R/h. The spatial resolution is found to be 0.7 mm (FWHM) using a cadmium knife edge. The integral nonlinearity is less than 0.4%. The possibility of using a hydrogenous converter in this system for neutron radiography is discussed. (orig.)

  5. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system...... that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system......, (sustainability) specifications move top-down, which helps avoiding sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  6. Analysis of large brain MRI databases for investigating the relationships between brain, cognitive, and genetic polymorphisms

    International Nuclear Information System (INIS)

    Mazoyer, B.

    2006-01-01

    A major challenge for the years to come is the understanding of brain-behaviour relationships, and in particular the investigation and quantification of the impact of genetic polymorphism on these relationships. In this framework, a promising experimental approach, which we will refer to as neuro-epidemiologic imaging, consists in acquiring multimodal data (brain images, psychometric and sociological data, genotypes) in large (several hundreds or thousands) cohorts of subjects. Processing of such large databases requires in the first place the conception and implementation of automated 'pipelines', including image registration, spatial normalisation, tissue segmentation, and multivariate statistical analysis. Given the number of images and data to be processed, such pipelines must be both fully automated and robust enough to handle multi-center MRI data, e.g. data with inhomogeneous characteristics in terms of resolution and contrast. This approach will be illustrated using two databases collected in aged healthy subjects, searching for the impact of genetic and environmental factors on two markers of brain aging, namely white matter hyper-signals and grey matter atrophy. (author)

  7. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used in the working face include the shearer and the self-advancing frame. The shearer has been changed from a radio-controlled model to a microcomputer-operated machine, with various functions automated. In addition, a system for comprehensively examining operating and natural conditions in the working face has been introduced for further automation. The self-advancing frame has been modified from a sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller and more reliable. In the future the system will be controlled from above ground, provided that the machines in the working face are remotely controlled at the gate while relevant data are transmitted above ground. Thus, an automated working face will be realized. (2 figs, 1 photo)

  8. Controls and automation in the SPIRAL project

    International Nuclear Information System (INIS)

    Bothner, U.; Boulot, A.; Maherault, J.; Martial, L.

    1999-01-01

    Within the framework of the SPIRAL collaboration, the control and automation team of the Accelerator-Exotic Beam R and D Department has had the following tasks: 1. automation of the resonator high-frequency equipment of the CIME cyclotron; 2. automation of the vacuum equipment, i.e. the low-energy line (TBE), the CIME cyclotron, and the low-energy line (BE); 3. automation of load safety for the power supply; 4. for each of these tasks a circuitry file based on the SCHEMA software has been worked out. The programs required for the automation of load safety for the power supply (STEP5, PROTOOL, DESIGNER 4.1) were developed and implemented on a PC.

  9. Automated controlled-potential coulometric determination of uranium

    International Nuclear Information System (INIS)

    Knight, C.H.; Clegg, D.E.; Wright, K.D.; Cassidy, R.M.

    1982-06-01

    A controlled-potential coulometer has been automated in our laboratory for routine determination of uranium in solution. The CRNL-designed automated system controls degassing, prereduction, and reduction of the sample. The final result is displayed on a digital coulometer readout. Manual and automated modes of operation are compared to show the precision and accuracy of the automated system. Results are also shown for the coulometric titration of typical uranium-aluminum alloy samples
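
    Controlled-potential coulometry is an absolute method: the analyte amount follows from the integrated charge via Faraday's law, m = QM/(nF). The sketch below shows that arithmetic for a hypothetical charge, assuming the two-electron U(VI) to U(IV) reduction.

    ```python
    # Controlled-potential coulometry quantifies the analyte from the integrated
    # charge via Faraday's law: m = Q * M / (n * F). The charge value is hypothetical.
    F = 96485.33          # Faraday constant, C/mol
    M_U = 238.03          # molar mass of uranium, g/mol
    n = 2                 # electrons transferred, U(VI) -> U(IV)

    charge_C = 8.10       # integrated current over the reduction, coulombs (example)
    mass_g = charge_C * M_U / (n * F)
    print(f"uranium found: {mass_g * 1000:.2f} mg")
    ```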

  10. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited. Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  11. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  12. Automated thermometric enzyme immunoassay of human proinsulin produced by Escherichia coli.

    Science.gov (United States)

    Birnbaum, S; Bülow, L; Hardy, K; Danielsson, B; Mosbach, K

    1986-10-01

    We have determined and monitored the production and release of human proinsulin by genetically engineered Escherichia coli cells. Several M9 media samples were analyzed sequentially after centrifugation with the aid of a rapid automated flow-through thermometric enzyme-linked immunosorbent assay (TELISA) system. The response time was 7 min after sample injection and a single assay was complete after 13 min. Insulin concentrations in the range of 0.1-50 micrograms/ml could be determined. The TELISA method correlated well with conventional radioimmunoassay determinations. Standard curves were reproducible over a period of several days even when the immobilized antibody column was stored at 25 degrees C in the enzyme thermistor unit. Thus, immediate assay start up was possible.
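
    The concentration range quoted above implies a calibration against standards; the sketch below shows one common way to do this, a log-linear standard curve inverted for an unknown sample. The response values are invented and the curve model is an assumption, not taken from the paper.

    ```python
    import numpy as np

    # Minimal standard-curve sketch for a flow-injection immunoassay such as
    # TELISA: fit the thermometric response of known standards against
    # log(concentration) and invert the fit for an unknown. Data are hypothetical.
    std_conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])     # micrograms/ml
    std_signal = np.array([0.8, 2.1, 2.9, 4.8, 5.6, 7.4])     # peak height (made up)

    slope, intercept = np.polyfit(np.log10(std_conc), std_signal, 1)

    unknown_signal = 4.1
    unknown_conc = 10 ** ((unknown_signal - intercept) / slope)
    print(f"estimated proinsulin concentration: {unknown_conc:.2f} ug/ml")
    ```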

  13. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since there are a lot of factors that characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I, Agent-based Complex Automated Negotiations, and Part II, Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper after peer review by three PC members. Part II covers ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  14. Automated process flowsheet synthesis for membrane processes using genetic algorithm: role of crossover operators

    KAUST Repository

    Shafiee, Alireza; Arab, Mobin; Lai, Zhiping; Liu, Zongwen; Abbas, Ali

    2016-01-01

    In optimization-based process flowsheet synthesis, optimization methods, including genetic algorithms (GA), are used as advantageous tools to select a high performance flowsheet by ‘screening’ large numbers of possible flowsheets. In this study, we
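
    Since the abstract highlights the role of crossover operators, the sketch below shows two generic operators (one-point and uniform) acting on a simple list encoding of a flowsheet. Both the encoding and the operators are illustrative assumptions, not the paper's scheme.

    ```python
    import random

    # Illustrative crossover operators for a genetic algorithm over a simple
    # list encoding of a membrane flowsheet (each gene = a stage choice). The
    # encoding and operators are generic examples, not the paper's exact scheme.
    random.seed(42)
    STAGES = ["MEM_A", "MEM_B", "RECYCLE", "BYPASS"]

    parent1 = [random.choice(STAGES) for _ in range(6)]
    parent2 = [random.choice(STAGES) for _ in range(6)]

    def one_point_crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:], b[:cut] + a[cut:]

    def uniform_crossover(a, b):
        pairs = [(x, y) if random.random() < 0.5 else (y, x) for x, y in zip(a, b)]
        return [p[0] for p in pairs], [p[1] for p in pairs]

    child1, child2 = one_point_crossover(parent1, parent2)
    child3, child4 = uniform_crossover(parent1, parent2)
    print(child1, child3, sep="\n")
    ```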

  15. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    . Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  16. Human-centered automation: Development of a philosophy

    Science.gov (United States)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant the human management and control function in civil air transport.

  17. The future is now: Technology's impact on the practice of genetic counseling.

    Science.gov (United States)

    Gordon, Erynn S; Babu, Deepti; Laney, Dawn A

    2018-03-01

    Smartphones, artificial intelligence, automation, digital communication, and other types of technology are playing an increasingly important role in our daily lives. It is no surprise that technology is also shaping the practice of medicine, and more specifically the practice of genetic counseling. While digital tools have been part of the practice of medical genetics for decades, such as internet- or CD-ROM-based tools like Online Mendelian Inheritance in Man and Pictures of Standard Syndromes and Undiagnosed Malformations in the 1980s, the potential for emerging tools to change how we practice and the way patients consume information is startling. Technology has the potential to aid in at-risk patient identification, assist in generating a differential diagnosis, improve efficiency in medical history collection and risk assessment, provide educational support for patients, and streamline follow-up. Here we review the historic and current uses of technology in genetic counseling, identify challenges to integration, and propose future applications of technology that can shape the practice of genetic counseling. © 2018 Wiley Periodicals, Inc.

  18. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  19. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw on 2-4 March 2016. It contains the research results presented by top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem, usually followed by numerical analysis, simulation, and a description of the results of implementing the solution to a real-world problem. The presented theoretical results, practical solutions and guidelines will be valuable both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  20. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents the analysis of modern tools for automated testing of various web based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution when compared to manual testing and overall cost deduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  1. Wireless Android Based Home Automation System

    Directory of Open Access Journals (Sweden)

    Muhammad Tanveer Riaz

    2017-01-01

    Full Text Available This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as the network infrastructure connecting its parts. The proposed system consists of two main components. The first is the server, which constitutes the system core and manages and controls the user's home; users and the system administrator can manage and control the system locally (over the Local Area Network) or remotely (over the internet). The second is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable: one server can manage many hardware interface modules as long as they are within network coverage. The system supports a wide range of home automation devices such as appliances, power management components, and security components. The proposed system offers better flexibility and scalability than commercially available home automation systems.

  2. Intranodular signal intensity analysis of hypovascular high-risk borderline lesions of HCC that illustrate multi-step hepatocarcinogenesis within the nodule on Gd–EOB–DTPA-enhanced MRI

    International Nuclear Information System (INIS)

    Kobayashi, Satoshi; Matsui, Osamu; Gabata, Toshifumi; Koda, Wataru; Minami, Tetsuya; Ryu, Yasuji; Kozaka, Kazuto; Kitao, Azusa

    2012-01-01

    Objectives: To analyze intranodular signal intensity pattern of hypovascular high-risk borderline lesions of HCC that illustrate multi-step hepatocarcinogenesis within the nodule on Gd–EOB–DTPA-enhanced MRI. Methods: A total of 73 nodules showing hypervascular foci in hypovascular high-risk borderline lesions identified by angiography-assisted CT were included in this study. The intranodular signal intensities of both the hypervascular foci and the hypovascular high-risk borderline lesions were evaluated on hepatobiliary-phase EOB-enhanced MRI obtained 20 min after intravenous injection of contrast media. Results: Among 59 hypervascular foci within hypointense hypovascular high-risk borderline lesions, 6 showed more hypointensity, 32 isointensity, and 21 hyperintensity compared to the surrounding hypointense, hypovascular portion of the nodules. Among 14 hypervascular foci within isointense hypovascular high-risk borderline lesions, 5 showed isointensity, and 9 hypointensity compared to the surrounding isointense hypovascular high-risk borderline lesions. No hypervascular foci showed hyperintensity compared to the surrounding isointense hypovascular high-risk borderline lesions. Conclusions: In most of the hypovascular high-risk borderline lesions containing hypervascular foci within the nodule, the signal intensity was decreased in hypervascular foci as compared with hypovascular high-risk borderline lesions and the surrounding background liver parenchyma. This supports the concept of signal intensity decrease during the dedifferentiation process in multistep hepatocarcinogenesis. However, around 30% of the nodules did not follow this rule, and hypervascular foci showed hyperintensity relative to the hypovascular high-risk borderline lesions.

  3. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  4. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  5. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
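
    The record above describes continuous monitoring built from a pattern scanning language (NAWK) on a UNIX workstation. Purely as an illustration of that style of monitoring, the hedged Python sketch below tails a log file and raises an alert when a line matches a suspicious pattern; the log path, log format and alert patterns are invented and are not taken from the Lewis system.

```python
# Hedged sketch of continuous log pattern scanning, loosely analogous to the
# NAWK-based monitoring described above. The log path, log format and alert
# patterns are hypothetical illustrations, not taken from the Lewis system.
import re
import time

SUSPICIOUS = [
    re.compile(r"LOGIN FAILED"),           # e.g. repeated failed log-ins
    re.compile(r"TRUNK \S+ UNREACHABLE"),  # e.g. a trunk line dropping out
]

def is_suspicious(line):
    """Return True if a log line matches any alert pattern."""
    return any(pattern.search(line) for pattern in SUSPICIOUS)

def follow(path, poll_seconds=1.0):
    """Yield lines appended to a log file, like a minimal 'tail -f'."""
    with open(path, "r") as handle:
        handle.seek(0, 2)                  # start at the current end of file
        while True:
            line = handle.readline()
            if line:
                yield line
            else:
                time.sleep(poll_seconds)

def monitor(path):
    for line in follow(path):
        if is_suspicious(line):
            print("ALERT:", line.strip())  # a real system would notify staff

if __name__ == "__main__":
    # Offline demonstration on two sample lines (monitor() would run forever).
    for sample in ["12:01 LOGIN FAILED user=guest", "12:02 call completed"]:
        print(sample, "->", "alert" if is_suspicious(sample) else "ok")
```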

  6. Multistep triaxial strength tests: investigating strength parameters and pore pressure effects on Opalinus Clay

    International Nuclear Information System (INIS)

    Graesle, W.

    2010-01-01

    Document available in extended abstract form only. The impact of natural variability between rock samples from a single formation is a common problem for the characterisation of THM properties of rocks. Data variation arising from heterogeneity between samples often obscures details of material behaviour. Besides efforts to reduce this statistical noise by careful selection of samples, there are essentially two approaches to overcome this problem: to generate very large data sets for better statistics, or to avoid the impact of natural variability by obtaining an extensive data set from a single sample. The multistep strength test follows the latter approach to characterise the mechanical behaviour of Opalinus Clay from Mont Terri and the possible impact of pore pressure effects. The concept of the multistep strength test comprises three test sections, each focused on the investigation of one mechanical characteristic of Opalinus Clay. Each section is composed of a series of strain-controlled load cycles at various levels of confining pressure. 1) The linear elastic limit, i.e. the onset of nonlinearity in the stress-strain relationship σ_dev(ε_1) during strain-controlled triaxial loading, is determined in section 1. It defines a lower limit for the onset of damage. Avoiding sample damage is essential during this test section to ensure that all measurements reflect the behaviour of undisturbed material. Therefore, a rather strict and well-detectable criterion for the onset of nonlinearity is required to enable a timely termination of any load phase. 2) Section 2 is focused on shear strength. Each load cycle is stopped as soon as peak stress is detected. As progressive damage of the sample is unavoidable during this process, it must be expected that only very few measured peak stresses approximately represent properties of the undamaged material. 3) Test section 3 is a conventional test of residual strength. Tests are carried out on cylindrical samples (100 mm
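
    One way to make the "onset of nonlinearity" criterion from test section 1 concrete is to fit the initial, clearly linear part of the stress-strain curve and flag the first point whose deviation from that fit exceeds a tolerance. The sketch below is only an illustration of that idea, not the criterion actually used in the multistep strength test; the fit window, tolerance and synthetic data are arbitrary choices.

```python
# Illustrative detector for the onset of nonlinearity in a stress-strain curve
# (sigma_dev versus epsilon_1). The fit window, tolerance and synthetic data are
# arbitrary illustration values, not the criterion used in the multistep test.
import numpy as np

def onset_of_nonlinearity(strain, stress, fit_fraction=0.3, tol=0.02):
    """Index of the first point deviating from the initial linear fit, or None."""
    n_fit = max(2, int(len(strain) * fit_fraction))
    slope, intercept = np.polyfit(strain[:n_fit], stress[:n_fit], 1)
    predicted = slope * strain + intercept
    # Deviation normalised by the overall stress scale of the curve
    deviation = np.abs(stress - predicted) / np.max(np.abs(stress))
    exceeded = np.where(deviation > tol)[0]
    return int(exceeded[0]) if exceeded.size else None

if __name__ == "__main__":
    eps = np.linspace(0.0, 0.01, 200)
    sigma = 5.0e3 * eps - 2.0e5 * eps ** 2     # synthetic curve that softens
    idx = onset_of_nonlinearity(eps, sigma)
    print("nonlinearity first detected at strain",
          None if idx is None else eps[idx])
```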

  7. Airway Clearance Techniques (ACTs)

    Medline Plus

    Full Text Available ... CF is a multistep process. A complete diagnostic evaluation should include a newborn screening, a sweat chloride ... a genetic or carrier test, and a clinical evaluation at a CF Foundation-accredited care center. Carrier ...

  8. Automation in airport security X-ray screening of cabin baggage: Examining benefits and possible implementations of automated explosives detection.

    Science.gov (United States)

    Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian

    2018-10-01

    Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB; EDSCB increased only their detection of bare explosives. In contrast, for screeners with less experience (shorter tenure), automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates at the human-machine system level, which would still be acceptable from an operational point of view. Results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. 21 CFR 864.5600 - Automated hematocrit instrument.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hematocrit instrument. 864.5600 Section 864.5600 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  10. 21 CFR 862.2900 - Automated urinalysis system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated urinalysis system. 862.2900 Section 862.2900 Automated urinalysis system. (a) Identification. An automated urinalysis system is a device... that duplicate manual urinalysis systems. This device is used in conjunction with certain materials to...

  11. Resins production: batch plant automation

    International Nuclear Information System (INIS)

    Banti, M.; Mauri, G.

    1996-01-01

    Companies that pursue plant automation without external resources have at their disposal flexible, custom and easy-to-use DCSs that are open towards PLCs. This article explains why Hoechst has followed this approach in automating its new resins production plants.

  12. Small cities face greater impact from automation

    Science.gov (United States)

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  13. Small cities face greater impact from automation.

    Science.gov (United States)

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  14. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  15. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  16. 21 CFR 864.5620 - Automated hemoglobin system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  17. 21 CFR 864.5200 - Automated cell counter.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  18. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  19. 21 CFR 864.5850 - Automated slide spinner.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  20. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. This book consists of two parts: I: Agent-Based Complex Automated Negotiations, and II: Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers negotiate automatically with one another in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  1. Integration of enabling methods for the automated flow preparation of piperazine-2-carboxamide.

    Science.gov (United States)

    Ingham, Richard J; Battilocchio, Claudio; Hawkins, Joel M; Ley, Steven V

    2014-01-01

    Here we describe the use of a new open-source software package and a Raspberry Pi(®) computer for the simultaneous control of multiple flow chemistry devices and its application to a machine-assisted, multi-step flow preparation of pyrazine-2-carboxamide - a component of Rifater(®), used in the treatment of tuberculosis - and its reduced derivative piperazine-2-carboxamide.
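
    The record describes a single small computer coordinating several flow-chemistry devices through open-source software. The sketch below is a generic illustration of that idea only: the device names, set-points and timings are invented, and it does not reproduce the API of the open-source package or hardware used in the paper.

```python
# Generic sketch of coordinating several flow devices (pumps, a heated reactor)
# for a two-step sequence. Device names, set-points and timings are invented;
# this is not the API of the open-source package used in the paper.
import time

class FlowDevice:
    def __init__(self, name):
        self.name = name

    def set(self, **parameters):
        # A real driver would write to a serial/USB/network interface here.
        settings = ", ".join(f"{k}={v}" for k, v in parameters.items())
        print(f"[{self.name}] {settings}")

def run_two_step_sequence():
    pump_a = FlowDevice("pump A (substrate)")
    pump_b = FlowDevice("pump B (reagent)")
    reactor = FlowDevice("heated coil reactor")

    # Step 1: establish flows and temperature for the first transformation.
    pump_a.set(flow_rate_ml_min=0.5)
    pump_b.set(flow_rate_ml_min=0.5)
    reactor.set(temperature_c=120)
    time.sleep(1)          # stand-in for waiting out the residence time

    # Step 2: switch conditions for the downstream transformation.
    pump_b.set(flow_rate_ml_min=0.25)
    reactor.set(temperature_c=80)
    time.sleep(1)

if __name__ == "__main__":
    run_two_step_sequence()
```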

  2. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

    Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP proj...

  3. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  4. Automation Revolutionize the Business Service Industry

    OpenAIRE

    Marciniak, Róbert

    2017-01-01

    In the last decades, significant disruptive changes began with the extended use of automation. Many jobs have changed or disappeared, and others have been created entirely by automation. With the progress of technology, automation first spread in the industrial sector, mostly on production and assembly lines. The growth may continue further in the future; researchers expect more than 35 million industrial robots globally by 2018. But it shades the situati...

  5. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

    The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following describe key areas that the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  6. Levels of automation and user control - evaluation of a turbine automation interface

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Jonas (Chalmers Univ. of Technology (Sweden))

    2008-10-15

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was made to assess how the operators are affected by use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty of dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty of following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step-mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (author)

  7. Levels of automation and user control - evaluation of a turbine automation interface

    International Nuclear Information System (INIS)

    Andersson, Jonas

    2008-10-01

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was made to assess how the operators are affected by use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty of dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty of following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step-mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (au)

  8. Detection of Heterogeneous Small Inclusions by a Multi-Step MUSIC Method

    Science.gov (United States)

    Solimene, Raffaele; Dell'Aversano, Angela; Leone, Giovanni

    2014-05-01

    In this contribution the problem of detecting and localizing scatterers with small (in terms of wavelength) cross sections by collecting their scattered field is addressed. The problem is dealt with for a two-dimensional and scalar configuration where the background is given as a two-layered cylindrical medium. More in detail, while scattered field data are taken in the outermost layer, inclusions are embedded within the inner layer. Moreover, the case of heterogeneous inclusions (i.e., having different scattering coefficients) is addressed. As a pertinent application context we identify the problem of diagnosing concrete pillars in order to detect and locate rebars, ducts and other small inhomogeneities that can populate the interior of the pillar. The nature of the inclusions influences the scattering coefficients. For example, the field scattered by rebars is stronger than the one due to ducts. Accordingly, it is expected that the more weakly scattering inclusions can be difficult to detect, as their scattered fields tend to be overwhelmed by those of strong scatterers. In order to circumvent this problem, in this contribution a multi-step MUltiple SIgnal Classification (MUSIC) detection algorithm is adopted [1]. In particular, the first stage aims at detecting rebars. Once rebars have been detected, their positions are exploited to update the Green's function and to subtract the scattered field due to their presence. The procedure is repeated until all the inclusions are detected. The analysis is conducted by numerical experiments for a multi-view/multi-static single-frequency configuration, and the synthetic data are generated by an FDTD forward solver. Acknowledgement: This work benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar." [1] R. Solimene, A. Dell'Aversano and G. Leone, "MUSIC algorithms for rebar detection," J. of Geophysics and Engineering, vol. 10, pp. 1
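
    The iterative detect-then-subtract loop described above is the core of the multi-step approach. The sketch below illustrates it under simplifying assumptions that differ from the paper: a single-frequency, free-space 2-D Born model replaces the two-layer cylindrical background, and the array geometry, frequency and scattering strengths are arbitrary illustration values. It detects the strongest scatterer first, subtracts its estimated contribution from the multistatic data matrix, and then re-runs MUSIC to find the weaker one.

```python
# Simplified multi-step MUSIC sketch: detect the strongest scatterer, subtract its
# (Born-model) contribution from the multistatic data, and repeat. Free-space 2-D
# Green's functions replace the paper's two-layer background; geometry, frequency
# and scattering strengths are arbitrary illustration values.
import numpy as np
from scipy.special import hankel1

WAVELENGTH = 1.0
K0 = 2 * np.pi / WAVELENGTH

def greens(points, targets):
    """2-D free-space Green's function between each array point and each target."""
    d = np.linalg.norm(points[:, None, :] - targets[None, :, :], axis=-1)
    return 0.25j * hankel1(0, K0 * d)

def music_map(K, array, grid, n_signal):
    """MUSIC pseudospectrum of multistatic matrix K over a grid of test points."""
    U, _, _ = np.linalg.svd(K)
    noise = U[:, n_signal:]                     # noise subspace
    G = greens(array, grid)                     # steering vectors, one per grid point
    proj = noise.conj().T @ G
    return 1.0 / np.sum(np.abs(proj) ** 2, axis=0)

# Array of transceivers on a circle of radius 5 wavelengths.
angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
array = 5.0 * np.c_[np.cos(angles), np.sin(angles)]

# One strong ("rebar-like") and one weak ("duct-like") point scatterer.
scatterers = np.array([[1.0, 0.5], [-1.2, -0.8]])
strengths = np.array([10.0, 1.0])
G_s = greens(array, scatterers)
K = G_s @ np.diag(strengths) @ G_s.T            # Born multistatic data matrix

# Search grid covering the interior region.
xs = np.linspace(-2, 2, 81)
grid = np.array([[x, y] for y in xs for x in xs])

for step in range(2):
    spectrum = music_map(K, array, grid, n_signal=1)
    peak = grid[np.argmax(spectrum)]
    print(f"step {step + 1}: detected scatterer near {np.round(peak, 2)}")
    # Subtract the estimated scatterer's contribution before the next step.
    g = greens(array, peak[None, :])
    amp = (g.conj().T @ K @ g.conj()) / (np.linalg.norm(g) ** 4)
    K = K - amp * (g @ g.T)
```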

  9. Future Autonomous and Automated Systems Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Trust is the greatest obstacle to implementing greater autonomy and automation (A&A) in the human spaceflight program. The Future Autonomous and Automated...

  10. Fatigue and voluntary utilization of automation in simulated driving.

    Science.gov (United States)

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  11. Aprendizaje automático

    OpenAIRE

    Moreno, Antonio

    1994-01-01

    This book introduces the basic concepts of one of the most actively studied branches of artificial intelligence: machine learning. It covers topics such as inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning and theoretical approaches to machine learning.

  12. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.
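
    For readers unfamiliar with the tool, a Sikuli script drives a GUI by matching screenshots of on-screen elements. The snippet below is a minimal sketch in Sikuli's Python-style scripting language; it assumes it is run inside the Sikuli IDE / SikuliX environment (not plain CPython), and the .png file names are placeholder screenshots you would capture yourself.

```python
# Minimal Sikuli-style script (runs inside the Sikuli IDE / SikuliX environment,
# not as plain CPython). The .png files are placeholder screenshots you would
# capture yourself; they are not shipped with the book.
wait("app_icon.png", 10)        # wait up to 10 s for the application icon
click("app_icon.png")           # launch the application by clicking its icon
wait("search_box.png", 15)      # wait for the main window's search box
click("search_box.png")
type("automation report")       # type a query into the focused field
click("search_button.png")
if exists("no_results.png"):
    print("Search returned no results")
```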

  13. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
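
    Taxi has its own API, which is not reproduced here. Purely to illustrate what a minimal workflow manager for lattice data generation has to do, the hedged sketch below (all names invented, unrelated to Taxi) runs tasks in dependency order, so that an ensemble-generation task completes before the measurement task that consumes its configurations.

```python
# Toy workflow-runner sketch (names invented; this is not Taxi's API): each task
# declares dependencies, and the runner executes tasks in dependency order, the way
# an ensemble-generation job must finish before measurements on its configurations.
class Task:
    def __init__(self, name, action, depends_on=()):
        self.name = name
        self.action = action
        self.depends_on = list(depends_on)

def run_workflow(tasks):
    done = set()
    pending = list(tasks)
    while pending:
        progressed = False
        for task in list(pending):
            if all(dep in done for dep in task.depends_on):
                print(f"running {task.name}")
                task.action()
                done.add(task.name)
                pending.remove(task)
                progressed = True
        if not progressed:
            raise RuntimeError("circular or unsatisfiable dependencies")

if __name__ == "__main__":
    run_workflow([
        Task("measure_plaquette", lambda: None, depends_on=["generate_ensemble"]),
        Task("generate_ensemble", lambda: None),
    ])
```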

  14. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and design techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  15. Automated data collection in single particle electron microscopy

    Science.gov (United States)

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  16. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial in improvement of a robot workcell. Design automation of multi-function fingers is highly demanded by robot industries to overcome the current iterative, time consuming and complex manual design process. However, the existing approaches for the multi-function finger design automation are unable to entirely meet the robot industries’ need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked with existing approaches.

  17. Identifying Requirements for Effective Human-Automation Teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; John O' Hara; Heather D. Medema; Johanna H. Oxstrand

    2014-06-01

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as: lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  18. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  19. Automated Assessment in Massive Open Online Courses

    Science.gov (United States)

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as a learning and solution validation tool. This approach allows the use of automated individual variant generation and automated solution checks without involving the course…
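
    As a schematic of what "individual variant generation plus automated solution checks" can mean, independent of the Open edX/Scilab implementation described above, the sketch below derives per-student parameters from the student identifier and grades a numeric answer against the recomputed reference; the seeding scheme and tolerance are illustrative assumptions only.

```python
# Schematic of automated variant generation and solution checking, independent of
# the Open edX / Scilab implementation described above. The seeding scheme and
# tolerance are illustrative choices only.
import random

def generate_variant(student_id):
    """Deterministically derive per-student task parameters from the student id."""
    rng = random.Random(student_id)
    a = rng.randint(2, 9)
    b = rng.randint(10, 99)
    return {"prompt": f"Compute a * b for a={a}, b={b}", "a": a, "b": b}

def check_solution(student_id, submitted_answer, tolerance=1e-9):
    """Regenerate the student's variant and compare the answer to the reference."""
    variant = generate_variant(student_id)
    reference = variant["a"] * variant["b"]
    return abs(submitted_answer - reference) <= tolerance

if __name__ == "__main__":
    variant = generate_variant("student-42")
    print(variant["prompt"])
    correct = check_solution("student-42", variant["a"] * variant["b"])
    print("correct" if correct else "wrong")
```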

  20. Automation trust and attention allocation in multitasking workspace.

    Science.gov (United States)

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems. Copyright © 2018. Published by Elsevier Ltd.