WorldWideScience

Sample records for cellular automation model

  1. Cellular-automaton fluids: A model for flow in porous media

    International Nuclear Information System (INIS)

    Rothman, D.H.

    1987-01-01

    Because the intrinsic inhomogeneity of porous media makes the application of proper boundary conditions difficult, fluid flow through microgeometric models has typically been achieved with idealized arrays of geometrically simple pores, throats, and cracks. The author proposes here an attractive alternative, capable of freely and accurately modeling fluid flow in grossly irregular geometries. This new method numerically solves the Navier-Stokes equations using the cellular-automaton fluid model introduced by Frisch, Hasslacher, and Pomeau. The cellular-automaton fluid is extraordinarily simple - particles of unit mass traveling with unit velocity reside on a triangular lattice and obey elementary collision rules - but capable of modeling much of the rich complexity of real fluid flow. The author shows how cellular-automaton fluids can be applied to the study of porous media. In particular, he discusses issues of scale on the cellular-automaton lattice and presents the results of 2-D simulations, including numerical estimation of permeability and verification of Darcy's law
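    As a rough illustration of the lattice-gas idea described above (not Rothman's implementation), the sketch below assumes six boolean occupation channels per site on a hexagonal lattice in axial coordinates with periodic boundaries, and implements only head-on two-body collisions; the full FHP-I rule set also includes three-body collisions, which are needed to recover Navier-Stokes behaviour.

```python
import numpy as np

# Toy FHP-style lattice gas: six unit-velocity channels per site on a
# hexagonal lattice (axial coordinates), one particle of unit mass per
# channel. Only head-on two-body collisions are handled here.
OFFSETS = [(1, 0), (0, 1), (-1, 1), (-1, 0), (0, -1), (1, -1)]  # axial neighbours

def step(n: np.ndarray) -> np.ndarray:
    """One collision + streaming step; n has shape (6, H, W), dtype bool."""
    out = n.copy()
    # Collision: an isolated head-on pair (d, d+3) is rotated by 60 degrees.
    for d in range(3):
        opp = d + 3
        others = [k for k in range(6) if k not in (d, opp)]
        head_on = n[d] & n[opp] & ~np.any(n[others], axis=0)
        out[d] &= ~head_on
        out[opp] &= ~head_on
        out[(d + 1) % 6] |= head_on
        out[(opp + 1) % 6] |= head_on
    # Streaming: each channel moves one site along its lattice direction.
    streamed = np.empty_like(out)
    for d, (dq, dr) in enumerate(OFFSETS):
        streamed[d] = np.roll(np.roll(out[d], dq, axis=0), dr, axis=1)
    return streamed

rng = np.random.default_rng(0)
state = rng.random((6, 64, 64)) < 0.2        # ~20% density initial condition
for _ in range(100):
    state = step(state)
print("particles conserved:", int(state.sum()))
```

    Mass is conserved by construction in both the collision and streaming steps, which is the property that lets such automata reproduce hydrodynamic behaviour on scales much larger than the lattice spacing.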

  2. Computational Complexity of Some Problems on Generalized Cellular Automata

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2012-03-01

    Full Text Available We prove that the preimage problem of a generalized cellular automaton is NP-hard. The results of this work are important for supporting the security of ciphers based on cellular automata.

  3. Track filter on the basis of a cellular automaton

    International Nuclear Information System (INIS)

    Glazov, A.A.; Kisel', I.V.; Konotopskaya, E.V.; Ososkov, G.A.

    1991-01-01

    The filtering method for tracks in discrete detectors based on a cellular automaton is described. Results of the application of this method to experimental data (the ARES spectrometer) are quite successful: a threefold reduction of input information, with the data grouped according to their belonging to separate tracks. This raises the percentage of useful events, which considerably simplifies and accelerates their subsequent recognition. The described cellular automaton for track filtering can also be applied on parallel computers and in on-line mode if a hardware implementation is used. 21 refs.; 11 figs
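    The ARES filter itself is not reproduced in the record; purely as an illustration of the idea of a cellular-automaton hit filter, the sketch below lets a hit on a (plane, wire) grid survive a generation only if it has a neighbouring hit in an adjacent detector plane within a small wire window, so isolated noise hits die out over a few generations while hits lying on roughly straight tracks persist. All names and parameters are illustrative assumptions.

```python
import numpy as np

# Toy CA-style hit filter, not the ARES implementation from the record.
def ca_filter(hits: np.ndarray, window: int = 1, generations: int = 3) -> np.ndarray:
    """hits: bool array of shape (n_planes, n_wires)."""
    alive = hits.copy()
    for _ in range(generations):
        padded = np.pad(alive, ((1, 1), (window, window)), constant_values=False)
        nxt = np.zeros_like(alive)
        for p in range(alive.shape[0]):
            for w in range(alive.shape[1]):
                if not alive[p, w]:
                    continue
                # does the hit continue into the previous or the next plane?
                prev_plane = padded[p, w:w + 2 * window + 1].any()
                next_plane = padded[p + 2, w:w + 2 * window + 1].any()
                nxt[p, w] = prev_plane or next_plane
        alive = nxt
    return alive

rng = np.random.default_rng(1)
hits = rng.random((8, 64)) < 0.02            # sparse random noise
for p in range(8):
    hits[p, 20 + p] = True                   # one diagonal track
print("hits before:", int(hits.sum()), "after:", int(ca_filter(hits).sum()))
```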

  4. A cellular automaton model for the change of public attitude regarding nuclear energy

    International Nuclear Information System (INIS)

    Ohnishi, Teruaki

    1991-01-01

    A cellular automaton model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and personal communication between people. From simulations with this model, the following became clear: (i) society is a highly non-linear system with a self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion changes substantially only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion and advertisement, exceeds a certain threshold; and (iii) if the amount of information on nuclear risk released by the news media is continuously reduced from now on, the acceptability of nuclear energy improves significantly, provided the extent of the reduction exceeds a certain threshold. (author)
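    Ohnishi's model itself is only described qualitatively here; as a loosely related toy example, the sketch below evolves a grid of attitudes in [-1, 1] (negative = opposed, positive = accepting), each cell being pulled towards its neighbourhood mean plus an external "information environment" term standing in for media coverage and persuasion campaigns. All parameter names and values are illustrative assumptions.

```python
import numpy as np

# Toy attitude-diffusion CA (not Ohnishi's model).
def step(att, media, coupling=0.3, media_weight=0.05):
    neigh_sum = sum(np.roll(np.roll(att, dy, 0), dx, 1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    new = att + coupling * (neigh_sum / 8.0 - att) + media_weight * media
    return np.clip(new, -1.0, 1.0)

rng = np.random.default_rng(2)
attitude = rng.uniform(-1, 1, size=(50, 50))
for t in range(200):
    media_signal = -0.2 if t < 100 else 0.2   # e.g. risk coverage reduced halfway
    attitude = step(attitude, media_signal)
print("mean attitude:", round(float(attitude.mean()), 3))
```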

  5. A cellular automaton model for the change of public attitude regarding nuclear energy

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki (CRC Research Inst., Chiba (Japan))

    1991-01-01

    A cellular automaton model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and personal communication between people. From simulations with this model, the following became clear: (i) society is a highly non-linear system with a self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion changes substantially only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion and advertisement, exceeds a certain threshold; and (iii) if the amount of information on nuclear risk released by the news media is continuously reduced from now on, the acceptability of nuclear energy improves significantly, provided the extent of the reduction exceeds a certain threshold. (author).

  6. Genetic Algorithm Calibration of Probabilistic Cellular Automata for Modeling Mining Permit Activity

    Science.gov (United States)

    Louis, S.J.; Raines, G.L.

    2003-01-01

    We use a genetic algorithm to calibrate a spatially and temporally resolved cellular automaton to model mining activity on public land in Idaho and western Montana. The genetic algorithm searches through a space of transition rule parameters of a two-dimensional cellular automaton model to find rule parameters that fit observed mining activity data. Previous work by one of the authors in calibrating the cellular automaton took weeks - the genetic algorithm takes a day and produces rules leading to about the same (or better) fit to observed data. These preliminary results indicate that genetic algorithms are a viable tool for calibrating cellular automata for this application. Experience gained during the calibration of this cellular automaton suggests that mineral resource information is a critical factor in the quality of the results. With automated calibration, further refinements of how the mineral-resource information is provided to the cellular automaton will probably improve our model.
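    The record does not give the transition rules or the GA settings; the sketch below only illustrates the overall calibration loop, with a stand-in probabilistic CA (activation probability rising with the number of active neighbours) and a synthetic "observed" map. run_ca(), the parameterisation and the fitness measure are all assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(3)

def run_ca(params, steps=20, shape=(40, 40)):
    """Stand-in probabilistic CA controlled by two rule parameters."""
    base, neigh_w = params
    state = np.zeros(shape, dtype=bool)
    state[shape[0] // 2, shape[1] // 2] = True
    for _ in range(steps):
        neigh = sum(np.roll(np.roll(state, dy, 0), dx, 1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
        p = np.clip(base + neigh_w * neigh, 0.0, 1.0)
        state = state | (rng.random(shape) < p)
    return state

def fitness(params, observed):
    # negative number of mismatched cells: higher is better
    return -int(np.abs(run_ca(params).astype(int) - observed.astype(int)).sum())

observed = run_ca((0.001, 0.04))                      # synthetic "observed" activity map
pop = rng.uniform([0.0, 0.0], [0.01, 0.1], size=(30, 2))
for _ in range(25):
    scores = np.array([fitness(ind, observed) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]           # keep the 10 fittest rule sets
    children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.005, (20, 2))
    pop = np.vstack([parents, np.clip(children, 0.0, None)])
best = pop[np.argmax([fitness(ind, observed) for ind in pop])]
print("calibrated rule parameters:", best.round(4))
```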

  7. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  8. Automated cellular sample preparation using a Centrifuge-on-a-Chip.

    Science.gov (United States)

    Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino

    2011-09-07

    The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL min(-1) scale, followed by fluorescent labeling of intra- and extra-cellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource poor settings.

  9. Simulation of Regionally Ecological Land Based on a Cellular Automation Model: A Case Study of Beijing, China

    Directory of Open Access Journals (Sweden)

    Xiubin Li

    2012-08-01

    Full Text Available Ecological land is like the “liver” of a city and is very useful to public health. Ecological land change is a spatially dynamic non-linear process under the interaction between natural and anthropogenic factors at different scales. In this study, by setting up a natural development scenario, an object orientation scenario and an ecosystem priority scenario, a Cellular Automation (CA) model has been established to simulate the evolution pattern of ecological land in Beijing in the year 2020. Under the natural development scenario, most of the ecological land will be replaced by construction land and crop land. But under the scenarios of object orientation and ecosystem priority, the ecological land area will increase, especially under the scenario of ecosystem priority. When considering factors such as the total area of ecological land, the loss of key ecological land and the spatial patterns of land use, the scenarios rank, from best to worst, as ecosystem priority, object orientation and natural development, so future land management policies in Beijing should focus on conversion of cropland to forest, wetland protection and prohibition of exploitation of natural protection zones, water source areas and forest parks to maintain the safety of the regional ecosystem.
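    As a schematic of how scenario weights can enter such a land-change CA (not the Beijing model itself), the sketch below converts ecological cells to construction with a probability that grows with built-up neighbours and shrinks under a scenario-dependent protection factor; the classes, rates and factor values are illustrative assumptions.

```python
import numpy as np

ECO, CROP, BUILT = 0, 1, 2       # toy land-use classes

def step(grid, protection, rng):
    built_neigh = sum(np.roll(np.roll(grid == BUILT, dy, 0), dx, 1)
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    p_convert = 0.05 * built_neigh / protection
    convert = (grid == ECO) & (rng.random(grid.shape) < p_convert)
    out = grid.copy()
    out[convert] = BUILT
    return out

rng = np.random.default_rng(4)
grid = rng.choice([ECO, CROP, BUILT], size=(100, 100), p=[0.5, 0.3, 0.2])
for scenario, protection in [("natural development", 1.0), ("ecosystem priority", 5.0)]:
    g = grid.copy()
    for _ in range(20):
        g = step(g, protection, rng)
    print(scenario, "- ecological cells remaining:", int((g == ECO).sum()))
```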

  10. Simulation of regionally ecological land based on a cellular automation model: a case study of Beijing, China.

    Science.gov (United States)

    Xie, Hualin; Kung, Chih-Chun; Zhang, Yanting; Li, Xiubin

    2012-08-01

    Ecological land is like the "liver" of a city and is very useful to public health. Ecological land change is a spatially dynamic non-linear process under the interaction between natural and anthropogenic factors at different scales. In this study, by setting up a natural development scenario, an object orientation scenario and an ecosystem priority scenario, a Cellular Automation (CA) model has been established to simulate the evolution pattern of ecological land in Beijing in the year 2020. Under the natural development scenario, most of the ecological land will be replaced by construction land and crop land. But under the scenarios of object orientation and ecosystem priority, the ecological land area will increase, especially under the scenario of ecosystem priority. When considering factors such as the total area of ecological land, the loss of key ecological land and the spatial patterns of land use, the scenarios rank, from best to worst, as ecosystem priority, object orientation and natural development, so future land management policies in Beijing should focus on conversion of cropland to forest, wetland protection and prohibition of exploitation of natural protection zones, water source areas and forest parks to maintain the safety of the regional ecosystem.

  11. Automated and Adaptable Quantification of Cellular Alignment from Microscopic Images for Tissue Engineering Applications

    Science.gov (United States)

    Xu, Feng; Beyazoglu, Turker; Hefner, Evan; Gurkan, Umut Atakan

    2011-01-01

    Cellular alignment plays a critical role in functional, physical, and biological characteristics of many tissue types, such as muscle, tendon, nerve, and cornea. Current efforts toward regeneration of these tissues include replicating the cellular microenvironment by developing biomaterials that facilitate cellular alignment. To assess the functional effectiveness of the engineered microenvironments, one essential criterion is quantification of cellular alignment. Therefore, there is a need for rapid, accurate, and adaptable methodologies to quantify cellular alignment for tissue engineering applications. To address this need, we developed an automated method, binarization-based extraction of alignment score (BEAS), to determine cell orientation distribution in a wide variety of microscopic images. This method combines a sequenced application of median and band-pass filters, locally adaptive thresholding approaches and image processing techniques. Cellular alignment score is obtained by applying a robust scoring algorithm to the orientation distribution. We validated the BEAS method by comparing the results with the existing approaches reported in literature (i.e., manual, radial fast Fourier transform-radial sum, and gradient based approaches). Validation results indicated that the BEAS method resulted in statistically comparable alignment scores with the manual method (coefficient of determination R2=0.92). Therefore, the BEAS method introduced in this study could enable accurate, convenient, and adaptable evaluation of engineered tissue constructs and biomaterials in terms of cellular alignment and organization. PMID:21370940
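    The BEAS scoring algorithm itself is not given in the record; as a much simpler stand-in for the general idea of an orientation-distribution alignment score, the sketch below takes gradient orientations of the strongest edges in a grey-scale image and reports the circular resultant length of the doubled angles (1.0 for perfectly parallel structures, near 0 for isotropic ones). Function and parameter names are assumptions.

```python
import numpy as np

def alignment_score(image: np.ndarray, edge_quantile: float = 0.9) -> float:
    """Crude orientation-alignment index; not the BEAS algorithm."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    strong = magnitude > np.quantile(magnitude, edge_quantile)
    theta = np.arctan2(gy, gx)[strong]        # orientation of each strong edge pixel
    # double the angles so 0 and 180 degrees count as the same direction
    return float(np.hypot(np.cos(2 * theta).mean(), np.sin(2 * theta).mean()))

x = np.arange(256)
stripes = np.tile(np.sin(x / 4.0), (256, 1))            # strongly aligned pattern
noise = np.random.default_rng(5).random((256, 256))     # isotropic pattern
print("stripes:", round(alignment_score(stripes), 2),
      "noise:", round(alignment_score(noise), 2))
```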

  12. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  13. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  14. Modeling cellular systems

    CERN Document Server

    Matthäus, Franziska; Pahle, Jürgen

    2017-01-01

    This contributed volume comprises research articles and reviews on topics connected to the mathematical modeling of cellular systems. These contributions cover signaling pathways, stochastic effects, cell motility and mechanics, pattern formation processes, as well as multi-scale approaches. All authors attended the workshop on "Modeling Cellular Systems" which took place in Heidelberg in October 2014. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  15. Cosserat modeling of cellular solids

    NARCIS (Netherlands)

    Onck, P.R.

    Cellular solids inherit their macroscopic mechanical properties directly from the cellular microstructure. However, the characteristic material length scale is often not small compared to macroscopic dimensions, which limits the applicability of classical continuum-type constitutive models. Cosserat

  16. Cellular modeling of fault-tolerant multicomputers

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, G

    1987-01-01

    The work described was concerned with a novel method for the investigation of fault tolerance in large regular networks of computers. The motivation was to provide a technique useful in the rapid evaluation of highly reliable systems that exploit the low cost and ease of volume production of simple microcomputer components. First, a system model and simulator based upon cellular automata are developed. This model is characterized by its simplicity and ease of modification when adapting to new types of network. Second, in order to test and verify the predictive capabilities of the cellular system, a more detailed simulation is performed based upon an existing computational model, that of the Transputer. An example application is used to exercise various systems designed using the cellular model. Using this simulator, experimental results are obtained both for existing well-understood configurations and for more novel types also developed here. In all cases it was found that the cellular model and simulator successfully predicted the ranking in reliability improvement of the systems studied.

  17. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

    Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that does attempt to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself will be presented along with a description of how its automated functionality was implemented to facilitate model testing.

  18. Recent advances in automated system model extraction (SME)

    International Nuclear Information System (INIS)

    Narayanan, Nithin; Bloomsburgh, John; He Yie; Mao Jianhua; Patil, Mahesh B; Akkaraju, Sandeep

    2006-01-01

    In this paper we present two different techniques for automated extraction of system models from FEA models. We discuss two different algorithms: for (i) automated N-DOF SME for electrostatically actuated MEMS and (ii) automated N-DOF SME for MEMS inertial sensors. We will present case studies for the two different algorithms presented

  19. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  20. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  1. Precision automation of cell type classification and sub-cellular fluorescence quantification from laser scanning confocal images

    Directory of Open Access Journals (Sweden)

    Hardy Craig Hall

    2016-02-01

    Full Text Available While novel whole-plant phenotyping technologies have been successfully implemented into functional genomics and breeding programs, the potential of automated phenotyping with cellular resolution is largely unexploited. Laser scanning confocal microscopy has the potential to close this gap by providing spatially highly resolved images containing anatomic as well as chemical information on a subcellular basis. However, in the absence of automated methods, the assessment of the spatial patterns and abundance of fluorescent markers with subcellular resolution is still largely qualitative and time-consuming. Recent advances in image acquisition and analysis, coupled with improvements in microprocessor performance, have brought such automated methods within reach, so that information from thousands of cells per image for hundreds of images may be derived in an experimentally convenient time-frame. Here, we present a MATLAB-based analytical pipeline to (1) segment radial plant organs into individual cells, (2) classify cells into cell type categories based upon random forest classification, (3) divide each cell into sub-regions, and (4) quantify fluorescence intensity to a subcellular degree of precision for a separate fluorescence channel. In this research advance, we demonstrate the precision of this analytical process for the relatively complex tissues of Arabidopsis hypocotyls at various stages of development. High speed and robustness make our approach suitable for phenotyping of large collections of stem-like material and other tissue types.

  2. Dynamic behavior of cellular materials and cellular structures: Experiments and modeling

    Science.gov (United States)

    Gao, Ziyang

    Cellular solids, including cellular materials and cellular structures (CMS), have attracted great interest because of their low densities and novel physical, mechanical, thermal, electrical and acoustic properties. They offer potential for lightweight structures, energy absorption, thermal management, etc. The study of cellular solids has therefore become a very active research field. From an energy absorption point of view, any plastically deformed structure can be divided into two types (called type I and type II), and the basic cells of the CMS may take the configurations of these two types of structures. Accordingly, separate discussions are presented in this thesis. First, a modified 1-D model is proposed and numerically solved for a typical type II structure. Good agreement is achieved with the previous experimental data, and the model is then used to simulate the dynamic behavior of a type II chain. Resulting from different load speeds, interesting collapse modes are observed, and the parameters which govern the cell's post-collapse behavior are identified through a comprehensive non-dimensional analysis of general cellular chains. Secondly, the MHS specimens are chosen as an example of type I foam materials because of the good uniformity of their cell geometry. An extensive experimental study was carried out, where more attention was paid to their responses to dynamic loadings. Great enhancement of the stress-strain curve was observed in dynamic cases, and the energy absorption capacity is found to be several times higher than that of commercial metal foams. Based on the experimental study, finite element simulations and theoretical modeling are also conducted, achieving good agreement and demonstrating the validity of those models. It is believed that the experimental, numerical and analytical results obtained in the present study will certainly deepen the understanding of the unsolved fundamental issues on the mechanical behavior of

  3. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into driving mechanisms of diffusion of innovations. This study aims to introduce automation to make identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which

  4. Cellular automata model for traffic flow with safe driving conditions

    International Nuclear Information System (INIS)

    Lárraga María Elena; Alvarez-Icaza Luis

    2014-01-01

    In this paper, a recently introduced cellular automata (CA) model is used for a statistical analysis of the inner microscopic structure of synchronized traffic flow. The analysis focuses on the formation and dissolution of clusters or platoons of vehicles, as the mechanism that causes the presence of this synchronized traffic state with a high flow. This platoon formation is one of the most interesting phenomena observed in traffic flows and plays an important role both in manual and automated highway systems (AHS). Simulation results, obtained from a single-lane system under periodic boundary conditions indicate that in the density region where the synchronized state is observed, most vehicles travel together in platoons with approximately the same speed and small spatial distances. The examination of velocity variations and individual vehicle gaps shows that the flow corresponding to the synchronized state is stable, safe and highly correlated. Moreover, results indicate that the observed platoon formation in real traffic is reproduced in simulations by the relation between vehicle headway and velocity that is embedded in the dynamics definition of the CA model. (general)
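    The safe-driving rules of the record's CA are not reproduced here; the sketch below is the classic Nagel-Schreckenberg single-lane CA with periodic boundaries, shown only to illustrate the basic mechanics (acceleration, gap-limited braking, random slowdown, movement) that such traffic models build on.

```python
import numpy as np

def nasch_step(pos, vel, road_len, rng, v_max=5, p_slow=0.3):
    """One Nagel-Schreckenberg update; pos is sorted along the ring."""
    gaps = (np.roll(pos, -1) - pos - 1) % road_len       # empty cells to the car ahead
    vel = np.minimum(vel + 1, v_max)                      # acceleration
    vel = np.minimum(vel, gaps)                           # braking (no collisions)
    slow = rng.random(len(vel)) < p_slow
    vel = np.where(slow, np.maximum(vel - 1, 0), vel)     # random slowdown
    pos = (pos + vel) % road_len                          # movement
    order = np.argsort(pos)                               # keep ring order after wrap-around
    return pos[order], vel[order]

road_len, n_cars = 200, 60
rng = np.random.default_rng(6)
pos = np.sort(rng.choice(road_len, n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)
for _ in range(500):
    pos, vel = nasch_step(pos, vel, road_len, rng)
print("mean speed:", round(float(vel.mean()), 2),
      "flow:", round(n_cars * float(vel.mean()) / road_len, 3))
```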

  5. Cellular potts models multiscale extensions and biological applications

    CERN Document Server

    Scianna, Marco

    2013-01-01

    A flexible, cell-level, and lattice-based technique, the cellular Potts model accurately describes the phenomenological mechanisms involved in many biological processes. Cellular Potts Models: Multiscale Extensions and Biological Applications gives an interdisciplinary, accessible treatment of these models, from the original methodologies to the latest developments. The book first explains the biophysical bases, main merits, and limitations of the cellular Potts model. It then proposes several innovative extensions, focusing on ways to integrate and interface the basic cellular Potts model at the mesoscopic scale with approaches that accurately model microscopic dynamics. These extensions are designed to create a nested and hybrid environment, where the evolution of a biological system is realistically driven by the constant interplay and flux of information between the different levels of description. Through several biological examples, the authors demonstrate a qualitative and quantitative agreement with t...

  6. Modelling for near-surface interaction of lithium ceramics and sweep-gas by use of cellular automaton

    International Nuclear Information System (INIS)

    Shimura, K.; Terai, T.; Yamawaki, M.; Yamaguchi, K.

    2003-01-01

    Tritium release from lithium ceramics as a fusion reactor breeder material is strongly affected by the composition of the sweep-gas as a result of its interactions with the material's surface. The typical surface processes which play important roles are adsorption, desorption and the interaction between vacancy sites and the constituents of the sweep-gas. Despite a large number of studies and models, it still seems difficult to model the overall behaviour of those processes due to their complex time-transient nature. In the present work a coarse-grained atomic simulation based on the Cellular Automaton (CA) is used to model the dynamics of the near-surface interaction between the Li2O surface and a sweep-gas consisting of a noble gas, hydrogen gas and water vapour. (author)
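    The coarse-grained CA of the record is not specified in detail here; as a deliberately crude stand-in, the sketch below treats each surface site of the breeder ceramic as either empty or occupied by an adsorbed species, adsorbing from the sweep gas with a probability tied to its water-vapour content and desorbing at a fixed rate. The species, rates and coverage measure are illustrative assumptions, not the authors' model.

```python
import numpy as np

def surface_step(occupied, p_adsorb, p_desorb, rng):
    """One adsorption/desorption sweep over a 2-D lattice of surface sites."""
    adsorb = (~occupied) & (rng.random(occupied.shape) < p_adsorb)
    desorb = occupied & (rng.random(occupied.shape) < p_desorb)
    return (occupied | adsorb) & ~desorb

rng = np.random.default_rng(7)
for h2o_fraction in (0.001, 0.01):                 # water vapour content of the sweep gas
    occ = np.zeros((64, 64), dtype=bool)
    for _ in range(500):
        occ = surface_step(occ, p_adsorb=5.0 * h2o_fraction, p_desorb=0.02, rng=rng)
    print(f"H2O fraction {h2o_fraction}: steady-state coverage ~{occ.mean():.2f}")
```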

  7. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    The use of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as improving passenger comfort, since their introduction in the late 1980s. However, the original automation benefits, including reductions in flight crew workload, human errors and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed, and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  8. Task-focused modeling in automated agriculture

    Science.gov (United States)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  9. Extended Cellular Automata Models of Particles and Space-Time

    Science.gov (United States)

    Beedle, Michael

    2005-04-01

    Models of particles and space-time are explored through simulations and theoretical models that use extended cellular automata. These extended cellular automata go beyond simple scalar binary cell fields to discrete multi-level group representations such as SO(2), SU(2), SU(3) and Spin(3,1). The propagation and evolution of these extended cellular automata are then compared to quantum field theories based on the "harmonic paradigm", i.e. built from an infinite number of harmonic oscillators, and to gravitational models.
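    None of the group-valued automata of the record are reconstructed here; the sketch below only illustrates what a "multi-level" cell state can look like, using the cyclic group Z_n (a crude discrete stand-in for an SO(2)-like phase) with the group operation applied over a von Neumann neighbourhood. It is a toy, not one of the models compared to field theories in the record.

```python
import numpy as np

def group_ca_step(state: np.ndarray, n: int) -> np.ndarray:
    """Each cell becomes the Z_n group sum of itself and its four neighbours."""
    neigh = (np.roll(state, 1, 0) + np.roll(state, -1, 0)
             + np.roll(state, 1, 1) + np.roll(state, -1, 1))
    return (state + neigh) % n            # group operation: addition modulo n

n_levels = 12                              # |Z_n|, i.e. 12 discrete "phase" values
state = np.random.default_rng(8).integers(0, n_levels, size=(32, 32))
for _ in range(10):
    state = group_ca_step(state, n_levels)
print("cell values now span", int(state.min()), "to", int(state.max()))
```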

  10. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  11. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  12. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems

  13. Automated data acquisition technology development:Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because a need was identified by the Metal Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  14. Cellular-automata supercomputers for fluid-dynamics modeling

    International Nuclear Information System (INIS)

    Margolus, N.; Toffoli, T.; Vichniac, G.

    1986-01-01

    We report recent developments in the modeling of fluid dynamics, and give experimental results (including dynamical exponents) obtained using cellular automata machines. Because of their locality and uniformity, cellular automata lend themselves to an extremely efficient physical realization; with a suitable architecture, an amount of hardware resources comparable to that of a home computer can achieve (in the simulation of cellular automata) the performance of a conventional supercomputer

  15. Models of Automation Surprise: Results of a Field Survey in Aviation

    Directory of Open Access Journals (Sweden)

    Robert De Boer

    2017-09-01

    Full Text Available Automation surprises in aviation continue to be a significant safety concern and the community's search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration with automation. In this paper, we report the results of a field study that empirically compared and contrasted two models of automation surprises: a normative individual-cognition model and a sensemaking model based on distributed cognition. Our data prove a good fit for the sensemaking model. This finding is relevant for aviation safety, since our understanding of the cognitive processes that govern human interaction with automation drives what we need to do to reduce the frequency of automation-induced events.

  16. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  17. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with a professional background, which is tedious and time-consuming. This paper proposes an automation model for planning optimal sewerage rehabilitation strategies for the sewer system by integrating image processing, clustering technology, optimization, and visualization display. Firstly, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections with a risk of malfunction and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information about the sewer system and sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.

  18. Automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

    Science.gov (United States)

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of applications. Since the usefulness of a model for specific application is determined by its accuracy, model quality estimation is an essential component of protein structure prediction. Comparative protein modeling has become a routine approach in many areas of life science research since fully automated modeling systems allow also nonexperts to build reliable models. In this chapter, we describe practical approaches for automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

  19. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  20. Simulasi Tumbukan Partikel Gas Ideal Dengan Model Cellular Automata Dua Dimensi

    OpenAIRE

    Abdul Basid, Annisa Mujriati

    2010-01-01

    A simulation of ideal gas particle collisions has been carried out using a two-dimensional cellular automata model in order to visualize the collisions of ideal gas particles. The particle collisions were simulated using a two-dimensional cellular automata model. In the cellular automata, particle movement is governed by a rule, namely the eight-neighbour rule, which is a random rule. The results of the simulation program of ideal gas particle collisions with the two-dimensional cellular automata model using...

  1. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  2. Agent-Based Modeling of Mitochondria Links Sub-Cellular Dynamics to Cellular Homeostasis and Heterogeneity.

    Directory of Open Access Journals (Sweden)

    Giovanni Dalmasso

    Full Text Available Mitochondria are semi-autonomous organelles that supply energy for cellular biochemistry through oxidative phosphorylation. Within a cell, hundreds of mobile mitochondria undergo fusion and fission events to form a dynamic network. These morphological and mobility dynamics are essential for maintaining mitochondrial functional homeostasis, and alterations both impact and reflect cellular stress states. Mitochondrial homeostasis is further dependent on production (biogenesis) and the removal of damaged mitochondria by selective autophagy (mitophagy). While mitochondrial function, dynamics, biogenesis and mitophagy are highly-integrated processes, it is not fully understood how systemic control in the cell is established to maintain homeostasis, or respond to bioenergetic demands. Here we used agent-based modeling (ABM) to integrate molecular and imaging knowledge sets, and simulate population dynamics of mitochondria and their response to environmental energy demand. Using high-dimensional parameter searches we integrated experimentally-measured rates of mitochondrial biogenesis and mitophagy, and using sensitivity analysis we identified parameter influences on population homeostasis. By studying the dynamics of cellular subpopulations with distinct mitochondrial masses, our approach uncovered system properties of mitochondrial populations: (1) mitochondrial fusion and fission activities rapidly establish mitochondrial sub-population homeostasis, and total cellular levels of mitochondria alter fusion and fission activities and subpopulation distributions; (2) restricting the directionality of mitochondrial mobility does not alter morphology subpopulation distributions, but increases network transmission dynamics; and (3) maintaining mitochondrial mass homeostasis and responding to bioenergetic stress requires the integration of mitochondrial dynamics with the cellular bioenergetic state. Finally, (4) our model suggests sources of, and stress conditions
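    The published agent-based model is far richer than can be restated here; as a minimal sketch of the event types it integrates, the toy below tracks a list of mitochondrial "agents" each carrying a mass, and applies fission, fusion, biogenesis and mitophagy with fixed per-step probabilities. All rates and the size-based mitophagy rule are illustrative placeholders, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(9)

def simulate(steps=2000, k_fis=0.02, k_fus=0.02, k_bio=0.01, k_mito=0.005):
    sizes = [1.0] * 50                                  # initial population of agents
    for _ in range(steps):
        if sizes and rng.random() < k_fis:              # fission: split one agent in two
            i = int(rng.integers(len(sizes)))
            if sizes[i] >= 2.0:
                sizes[i] /= 2.0
                sizes.append(sizes[i])
        if len(sizes) >= 2 and rng.random() < k_fus:    # fusion: merge two random agents
            i, j = rng.choice(len(sizes), size=2, replace=False)
            sizes[int(i)] += sizes[int(j)]
            sizes.pop(int(j))
        if rng.random() < k_bio:                        # biogenesis: add a unit-mass agent
            sizes.append(1.0)
        if sizes and rng.random() < k_mito:             # mitophagy: remove the smallest agent
            sizes.pop(int(np.argmin(sizes)))
    return sizes

pop = simulate()
print("mitochondria:", len(pop), "total mass:", round(sum(pop), 1))
```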

  3. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.

  4. DNA index determination with Automated Cellular Imaging System (ACIS) in Barrett's esophagus: Comparison with CAS 200

    Directory of Open Access Journals (Sweden)

    Klein Michael

    2005-08-01

    Full Text Available Background: For solid tumors, image cytometry has been shown to be more sensitive for diagnosing DNA content abnormalities (aneuploidy) than flow cytometry. Image cytometry has often been performed using the semi-automated CAS 200 system. Recently, an Automated Cellular Imaging System (ACIS) was introduced to determine DNA content (DNA index), but it has not been validated. Methods: Using the CAS 200 system and ACIS, we compared the DNA index (DI) obtained from the same archived formalin-fixed and paraffin-embedded tissue samples from Barrett's esophagus related lesions, including samples with specialized intestinal metaplasia without dysplasia, low-grade dysplasia, high-grade dysplasia and adenocarcinoma. Results: Although there was a very good correlation between the DI values determined by ACIS and CAS 200, the former was 25% more sensitive in detecting aneuploidy. ACIS yielded a mean DI value 18% higher than that obtained by CAS 200 (t test). In addition, the average time required to perform a DNA ploidy analysis was shorter with the ACIS (30–40 min) than with the CAS 200 (40–70 min). Results obtained by ACIS gave excellent inter- and intra-observer reproducibility (coefficient of correlation >0.9 for both). Conclusion: Compared with the CAS 200, the ACIS is a more sensitive and less time consuming technique for determining DNA ploidy. Results obtained by ACIS are also highly reproducible.

  5. A survey on the modeling and applications of cellular automata theory

    Science.gov (United States)

    Gong, Yimin

    2017-09-01

    Cellular automata theory describes a discrete model which is now widely used in scientific research and simulations. The model comprises cells that change state according to a specific rule over time. This paper provides a survey of the modeling and applications of cellular automata theory, focusing on the program realization of cellular automata and their applications in fields such as road traffic, land use, and cutting machines. Each application is further explained, and several related main models are briefly introduced. This research aims to help decision-makers formulate appropriate development plans.
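    A concrete minimal instance of the discrete model the survey describes is a one-dimensional elementary cellular automaton: each cell is 0 or 1 and is rewritten from its (left, self, right) triple according to a fixed rule table. The sketch below runs Wolfram's rule 30 with periodic boundaries and prints the familiar triangular pattern.

```python
import numpy as np

def eca_step(cells: np.ndarray, rule: int = 30) -> np.ndarray:
    """One step of an elementary CA given by its Wolfram rule number."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right          # neighbourhood encoded as a 3-bit number
    table = (rule >> np.arange(8)) & 1          # rule number unpacked into a lookup table
    return table[idx]

cells = np.zeros(81, dtype=int)
cells[40] = 1                                    # single seed cell in the middle
for _ in range(30):
    print("".join(" #"[c] for c in cells))
    cells = eca_step(cells)
```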

  6. Modeling integrated cellular machinery using hybrid Petri-Boolean networks.

    Directory of Open Access Journals (Sweden)

    Natalie Berestovsky

    Full Text Available The behavior and phenotypic changes of cells are governed by a cellular circuitry that represents a set of biochemical reactions. Based on biological functions, this circuitry is divided into three types of networks, each encoding for a major biological process: signal transduction, transcription regulation, and metabolism. This division has generally made it possible to tame the computational complexity of dealing with the entire system, allowed the use of modeling techniques that are specific to each of the components, and achieved separation of the different time scales at which reactions in each of the three networks occur. Nonetheless, with this division comes loss of information and power needed to elucidate certain cellular phenomena. Within the cell, these three types of networks work in tandem, and each produces signals and/or substances that are used by the others to process information and operate normally. Therefore, computational techniques for modeling integrated cellular machinery are needed. In this work, we propose an integrated hybrid model (IHM) that combines Petri nets and Boolean networks to model integrated cellular networks. Coupled with a stochastic simulation mechanism, the model simulates the dynamics of the integrated network, and can be perturbed to generate testable hypotheses. Our model is qualitative and is mostly built upon knowledge from the literature and requires fine-tuning of very few parameters. We validated our model on two systems: the transcriptional regulation of glucose metabolism in human cells, and cellular osmoregulation in S. cerevisiae. The model produced results that are in very good agreement with experimental data, and produces valid hypotheses. The abstract nature of our model and the ease of its construction makes it a very good candidate for modeling integrated networks from qualitative data. The results it produces can guide the practitioner to zoom into components and interconnections and investigate them

  7. Modeling and cellular studies

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    Testing the applicability of mathematical models with carefully designed experiments is a powerful tool in the investigations of the effects of ionizing radiation on cells. The modeling and cellular studies complement each other, for modeling provides guidance for designing critical experiments which must provide definitive results, while the experiments themselves provide new input to the model. Based on previous experimental results the model for the accumulation of damage in Chlamydomonas reinhardi has been extended to include various multiple two-event combinations. Split dose survival experiments have shown that models tested to date predict most but not all the observed behavior. Stationary-phase mammalian cells, required for tests of other aspects of the model, have been shown to be at different points in the cell cycle depending on how they were forced to stop proliferating. These cultures also demonstrate different capacities for repair of sublethal radiation damage

  8. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  9. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of which are proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model-based technology to achieve rich functionality and usability was implemented. (orig.)

  10. Modeling evolution and immune system by cellular automata

    International Nuclear Information System (INIS)

    Bezzi, M.

    2001-01-01

    In this review the behavior of two different biological systems is investigated using cellular automata. Starting from this spatially extended approach, the complexity of the system is, in some cases, also reduced by introducing a mean-field approximation and solving (or attempting to solve) the simplified systems. The biological meaning of the results, the comparison with experimental data (where available), and the differences between the spatially extended and mean-field versions are discussed. The biological systems considered in this review are Darwinian evolution in simple ecosystems and the immune system response. In the first section the main features of molecular evolution are introduced, giving a short survey of genetics for physicists and discussing some models for prebiotic systems and simple ecosystems. A cellular automaton model for studying a set of evolving individuals in a general fitness landscape is also introduced, taking into account the effects of co-evolution; in particular, the process of species formation (speciation) is described in sect. 5. The second part deals with immune system modeling. The biological features of the immune response are discussed, and the concepts of shape space and of the idiotypic network are introduced. More detailed reviews dealing with immune system models (mainly focused on idiotypic network models) can be found elsewhere. Other themes discussed here are the applications of CA to immune system modeling and two complex cellular automata for humoral and cellular immune response. Finally, the biological data are discussed and general conclusions are drawn in the last section
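
    As a purely illustrative companion to this abstract (not the specific automaton analysed in the review), a toy lattice model of evolving bit-string genotypes on an additive fitness landscape can be written in a few lines; all parameters below are arbitrary.

        import random

        L, GENES, MUT = 20, 8, 0.01          # lattice size, genome length, mutation rate

        def fitness(genotype):
            # toy additive landscape: fitness is the fraction of 1-bits
            return sum(genotype) / GENES

        def mutate(genotype):
            return tuple(b ^ (random.random() < MUT) for b in genotype)

        grid = [[tuple(random.randint(0, 1) for _ in range(GENES))
                 for _ in range(L)] for _ in range(L)]

        for step in range(5000):
            i, j = random.randrange(L), random.randrange(L)
            di, dj = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
            ni, nj = (i + di) % L, (j + dj) % L
            # the fitter of two neighbouring individuals replaces the other
            if fitness(grid[i][j]) >= fitness(grid[ni][nj]):
                grid[ni][nj] = mutate(grid[i][j])
            else:
                grid[i][j] = mutate(grid[ni][nj])

        print("mean fitness:", sum(fitness(g) for row in grid for g in row) / L ** 2)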

  11. Cellular automaton modeling of biological pattern formation characterization, examples, and analysis

    CERN Document Server

    Deutsch, Andreas

    2017-01-01

    This text explores the use of cellular automata in modeling pattern formation in biological systems. It describes several mathematical modeling approaches utilizing cellular automata that can be used to study the dynamics of interacting cell systems both in simulation and in practice. New in this edition are chapters covering cell migration, tissue development, and cancer dynamics, as well as updated references and new research topic suggestions that reflect the rapid development of the field. The book begins with an introduction to pattern-forming principles in biology and the various mathematical modeling techniques that can be used to analyze them. Cellular automaton models are then discussed in detail for different types of cellular processes and interactions, including random movement, cell migration, adhesive cell interaction, alignment and cellular swarming, growth processes, pigment cell pattern formation, tissue development, tumor growth and invasion, and Turing-type patterns and excitable media. In ...

  12. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The ongoing transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  13. A generalized cellular automata approach to modeling first order ...

    Indian Academy of Sciences (India)

    Only a fragmentary snippet is indexed: the model is described as a system consisting of space, time and state, structured with simple local rules; the snippet also references a sensitivity analysis of a stochastic cellular automata model and cites Baetens J M and De Baets B 2011, Design and parameterization of a stochastic cellular automaton.

  14. Discrete dynamic modeling of cellular signaling networks.

    Science.gov (United States)

    Albert, Réka; Wang, Rui-Sheng

    2009-01-01

    Understanding signal transduction in cellular systems is a central issue in systems biology. Numerous experiments from different laboratories generate an abundance of individual components and causal interactions mediating environmental and developmental signals. However, for many signal transduction systems there is insufficient information on the overall structure and the molecular mechanisms involved in the signaling network. Moreover, lack of kinetic and temporal information makes it difficult to construct quantitative models of signal transduction pathways. Discrete dynamic modeling, combined with network analysis, provides an effective way to integrate fragmentary knowledge of regulatory interactions into a predictive mathematical model which is able to describe the time evolution of the system without the requirement for kinetic parameters. This chapter introduces the fundamental concepts of discrete dynamic modeling, particularly focusing on Boolean dynamic models. We describe this method step-by-step in the context of cellular signaling networks. Several variants of Boolean dynamic models including threshold Boolean networks and piecewise linear systems are also covered, followed by two examples of successful application of discrete dynamic modeling in cell biology.
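
    A minimal synchronous Boolean model, of the kind the chapter introduces, can be written directly as update rules over node states; the three-node cascade below is a hypothetical example, not a network taken from the chapter.

        # Toy synchronous Boolean model of a signalling cascade with negative
        # feedback (hypothetical nodes; no kinetic parameters are needed).
        rules = {
            "Ligand":      lambda s: s["Ligand"],                  # held fixed as input
            "Receptor":    lambda s: s["Ligand"],
            "Kinase":      lambda s: s["Receptor"] and not s["Phosphatase"],
            "Phosphatase": lambda s: s["Kinase"],                  # negative feedback
        }

        state = {"Ligand": True, "Receptor": False, "Kinase": False, "Phosphatase": False}

        for t in range(8):
            print(t, state)
            state = {node: rules[node](state) for node in state}   # synchronous update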

  15. Automation of electroweak NLO corrections in general models

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Jean-Nicolas [Universitaet Wuerzburg (Germany)

    2016-07-01

    I discuss the automation of generation of scattering amplitudes in general quantum field theories at next-to-leading order in perturbation theory. The work is based on Recola, a highly efficient one-loop amplitude generator for the Standard Model, which I have extended so that it can deal with general quantum field theories. Internally, Recola computes off-shell currents and for new models new rules for off-shell currents emerge which are derived from the Feynman rules. My work relies on the UFO format which can be obtained by a suited model builder, e.g. FeynRules. I have developed tools to derive the necessary counterterm structures and to perform the renormalization within Recola in an automated way. I describe the procedure using the example of the two-Higgs-doublet model.

  16. Cellular modelling of river catchments and reaches: Advantages, limitations and prospects

    Science.gov (United States)

    Coulthard, T. J.; Hicks, D. M.; Van De Wiel, M. J.

    2007-10-01

    The last decade has witnessed the development of a series of cellular models that simulate the processes operating within river channels and drive their geomorphic evolution. Their proliferation can be partly attributed to the relative simplicity of cellular models and their ability to address some of the shortcomings of other numerical models. By using relaxed interpretations of the equations determining fluid flow, cellular models allow rapid solutions of water depths and velocities. These can then be used to drive (usually) conventional sediment transport relations to determine erosion and deposition and alter the channel form. The key advance of using these physically based yet simplified approaches is that they allow us to apply models to a range of spatial scales (1-100 km 2) and time periods (1-100 years) that are especially relevant to contemporary management and fluvial studies. However, these approaches are not without their limitations and technical problems. This paper reviews the findings of nearly 10 years of research into modelling fluvial systems with cellular techniques, principally focusing on improvements in routing water and how fluvial erosion and deposition (including lateral erosion) are represented. These ideas are illustrated using sample simulations of the River Teifi, Wales. A detailed case study is then presented, demonstrating how cellular models can explore the interactions between vegetation and the morphological dynamics of the braided Waitaki River, New Zealand. Finally, difficulties associated with model validation and the problems, prospects and future issues important to the further development and application of these cellular fluvial models are outlined.

  17. Cellular automata a parallel model

    CERN Document Server

    Mazoyer, J

    1999-01-01

    Cellular automata can be viewed both as computational models and modelling systems of real processes. This volume emphasises the first aspect. In articles written by leading researchers, sophisticated massive parallel algorithms (firing squad, life, Fischer's primes recognition) are treated. Their computational power and the specific complexity classes they determine are surveyed, while some recent results in relation to chaos from a new dynamic systems point of view are also presented. Audience: This book will be of interest to specialists of theoretical computer science and the parallelism challenge.

  18. PASSENGER TRAFFIC MOVEMENT MODELLING BY THE CELLULAR-AUTOMAT APPROACH

    Directory of Open Access Journals (Sweden)

    T. Mikhaylovskaya

    2009-01-01

    Full Text Available The mathematical model of passenger traffic movement developed on the basis of the cellular-automata approach is considered. The program realization of the cellular-automata model of pedestrian stream movement in pedestrian subways in the presence of obstacles and at subway structure narrowings is presented. The optimum distances between the obstacles and the angle of subway structure narrowing that provide safe pedestrian stream movement and avoid traffic congestion occurrence are determined.

  19. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out in an integrated and fully automated way i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis ii) a prediction of the reconstructed quantities, according to some models and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detecting interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author) [fr

  20. Overview of cellular automaton models for corrosion

    International Nuclear Information System (INIS)

    Perez-Brokate, Cristian Felipe; De Lamare, Jacques; Dung di Caprio; Feron, Damien; Chausse, Annie

    2014-01-01

    A review of corrosion process modeling using cellular automata methods is presented. This relatively new and growing approach takes into account the stochastic nature of the phenomena and uses physico-chemical rules to make predictions at a mesoscopic scale. Milestone models are analyzed and perspectives are established. (authors)

  1. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However...... (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges....

  2. Automated microscopy for high-content RNAi screening

    Science.gov (United States)

    2010-01-01

    Fluorescence microscopy is one of the most powerful tools to investigate complex cellular processes such as cell division, cell motility, or intracellular trafficking. The availability of RNA interference (RNAi) technology and automated microscopy has opened the possibility to perform cellular imaging in functional genomics and other large-scale applications. Although imaging often dramatically increases the content of a screening assay, it poses new challenges to achieve accurate quantitative annotation and therefore needs to be carefully adjusted to the specific needs of individual screening applications. In this review, we discuss principles of assay design, large-scale RNAi, microscope automation, and computational data analysis. We highlight strategies for imaging-based RNAi screening adapted to different library and assay designs. PMID:20176920

  3. Feasibility evaluation of 3 automated cellular drug screening assays on a robotic workstation.

    Science.gov (United States)

    Soikkeli, Anne; Sempio, Cristina; Kaukonen, Ann Marie; Urtti, Arto; Hirvonen, Jouni; Yliperttula, Marjo

    2010-01-01

    This study presents the implementation and optimization of 3 cell-based assays on a TECAN Genesis workstation-the Caspase-Glo 3/7 and sulforhodamine B (SRB) screening assays and the mechanistic Caco-2 permeability protocol-and evaluates their feasibility for automation. During implementation, the dispensing speed to add drug solutions and fixative trichloroacetic acid and the aspiration speed to remove the supernatant immediately after fixation were optimized. Decontamination steps for cleaning the tips and pipetting tubing were also added. The automated Caspase-Glo 3/7 screen was successfully optimized with Caco-2 cells (Z' 0.7, signal-to-base ratio [S/B] 1.7) but not with DU-145 cells. In contrast, the automated SRB screen was successfully optimized with the DU-145 cells (Z' 0.8, S/B 2.4) but not with the Caco-2 cells (Z' -0.8, S/B 1.4). The automated bidirectional Caco-2 permeability experiments separated successfully low- and high-permeability compounds (Z' 0.8, S/B 84.2) and passive drug permeation from efflux-mediated transport (Z' 0.5, S/B 8.6). Of the assays, the homogeneous Caspase-Glo 3/7 assay benefits the most from automation, but also the heterogeneous SRB assay and Caco-2 permeability experiments gain advantages from automation.

  4. A Mathematical Model for Cisplatin Cellular Pharmacodynamics

    Directory of Open Access Journals (Sweden)

    Ardith W. El-Kareh

    2003-03-01

    Full Text Available A simple theoretical model for the cellular pharmacodynamics of cisplatin is presented. The model, which takes into account the kinetics of cisplatin uptake by cells and the intracellular binding of the drug, can be used to predict the dependence of survival (relative to controls) on the time course of extracellular exposure. Cellular pharmacokinetic parameters are derived from uptake data for human ovarian and head and neck cancer cell lines. Survival relative to controls is assumed to depend on the peak concentration of DNA-bound intracellular platinum. Model predictions agree well with published data on cisplatin cytotoxicity for three different cancer cell lines, over a wide range of exposure times. In comparison with previously published mathematical models for anticancer drug pharmacodynamics, the present model provides a better fit to experimental data sets including long exposure times (∼100 hours). The model provides a possible explanation for the fact that cell kill correlates well with area under the extracellular concentration-time curve in some data sets, but not in others. The model may be useful for optimizing delivery schedules and for the dosing of cisplatin for cancer therapy.
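
    The qualitative structure described above (cellular uptake followed by intracellular binding, with survival tied to peak DNA-bound platinum) can be sketched as a small simulation; the rate constants and survival slope below are invented placeholders, not the fitted parameters of the published model.

        import math

        # Rough sketch: extracellular drug is taken up, effluxed, and bound to DNA;
        # survival is assumed to decay exponentially with peak bound platinum.
        k_in, k_out, k_bind, alpha = 0.05, 0.02, 0.10, 2.0     # illustrative values

        def survival(c_ext_of_t, t_end=100.0, dt=0.01):
            c_in = bound = peak_bound = 0.0
            for step in range(int(t_end / dt)):
                c_ext = c_ext_of_t(step * dt)
                dc_in = k_in * c_ext - (k_out + k_bind) * c_in
                bound += k_bind * c_in * dt
                c_in += dc_in * dt
                peak_bound = max(peak_bound, bound)
            return math.exp(-alpha * peak_bound)

        # short high-concentration pulse versus long low-concentration exposure
        print(survival(lambda t: 10.0 if t < 1.0 else 0.0))
        print(survival(lambda t: 1.0 if t < 10.0 else 0.0))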

  5. Modeling the mechanics of cancer: effect of changes in cellular and extra-cellular mechanical properties.

    Science.gov (United States)

    Katira, Parag; Bonnecaze, Roger T; Zaman, Muhammad H

    2013-01-01

    Malignant transformation, though primarily driven by genetic mutations in cells, is also accompanied by specific changes in cellular and extra-cellular mechanical properties such as stiffness and adhesivity. As the transformed cells grow into tumors, they interact with their surroundings via physical contacts and the application of forces. These forces can lead to changes in the mechanical regulation of cell fate based on the mechanical properties of the cells and their surrounding environment. A comprehensive understanding of cancer progression requires the study of how specific changes in mechanical properties influences collective cell behavior during tumor growth and metastasis. Here we review some key results from computational models describing the effect of changes in cellular and extra-cellular mechanical properties and identify mechanistic pathways for cancer progression that can be targeted for the prediction, treatment, and prevention of cancer.

  6. Automated MRI segmentation for individualized modeling of current flow in the human head.

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible

  7. Simulation Model of Automated Peat Briquetting Press Drive

    Directory of Open Access Journals (Sweden)

    A. A. Marozka

    2012-01-01

    Full Text Available The paper presents the developed fully functional simulation model of an automated peat briquetting press drive. The given model makes it possible to reduce financial and time costs while developing, designing and operating a double-stamp peat briquetting press drive.

  8. Automated side-chain model building and sequence assignment by template matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2002-01-01

    A method for automated macromolecular side-chain model building and for aligning the sequence to the map is described. An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer
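
    The sequence-alignment step can be sketched as follows (a simplified stand-in, assuming a per-position probability matrix over the 20 amino acids is already available from template matching; the matrix and sequence below are arbitrary): each offset of the segment along the sequence is scored by the summed log-probability and converted to a posterior under a flat prior.

        import numpy as np

        AA = "ACDEFGHIKLMNPQRSTVWY"
        rng = np.random.default_rng(0)

        segment_len = 6
        prob = rng.dirichlet(np.ones(20), size=segment_len)   # stand-in for template scores
        sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"

        def alignment_posterior(prob, sequence):
            log_scores = []
            for offset in range(len(sequence) - len(prob) + 1):
                s = sum(np.log(prob[i, AA.index(sequence[offset + i])])
                        for i in range(len(prob)))
                log_scores.append(s)
            log_scores = np.array(log_scores)
            post = np.exp(log_scores - log_scores.max())
            return post / post.sum()                           # flat prior over offsets

        post = alignment_posterior(prob, sequence)
        print("most probable offset:", int(post.argmax()), "posterior:", float(post.max()))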

  9. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered the research of automated analysis methods, for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; then, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  10. Point process models for localization and interdependence of punctate cellular structures.

    Science.gov (United States)

    Li, Ying; Majarian, Timothy D; Naik, Armaghan W; Johnson, Gregory R; Murphy, Robert F

    2016-07-01

    Accurate representations of cellular organization for multiple eukaryotic cell types are required for creating predictive models of dynamic cellular function. To this end, we have previously developed the CellOrganizer platform, an open source system for generative modeling of cellular components from microscopy images. CellOrganizer models capture the inherent heterogeneity in the spatial distribution, size, and quantity of different components among a cell population. Furthermore, CellOrganizer can generate quantitatively realistic synthetic images that reflect the underlying cell population. A current focus of the project is to model the complex, interdependent nature of organelle localization. We built upon previous work on developing multiple non-parametric models of organelles or structures that show punctate patterns. The previous models described the relationships between the subcellular localization of puncta and the positions of cell and nuclear membranes and microtubules. We extend these models to consider the relationship to the endoplasmic reticulum (ER), and to consider the relationship between the positions of different puncta of the same type. Our results do not suggest that the punctate patterns we examined are dependent on ER position or inter- and intra-class proximity. With these results, we built classifiers to update previous assignments of proteins to one of 11 patterns in three distinct cell lines. Our generative models demonstrate the ability to construct statistically accurate representations of puncta localization from simple cellular markers in distinct cell types, capturing the complex phenomena of cellular structure interaction with little human input. This protocol represents a novel approach to vesicular protein annotation, a field that is often neglected in high-throughput microscopy. These results suggest that spatial point process models provide useful insight with respect to the spatial dependence between cellular structures.

  11. Deterministic one-way simulation of two-way, real-time cellular automata and its related problems

    Energy Technology Data Exchange (ETDEWEB)

    Umeo, H; Morita, K; Sugata, K

    1982-06-13

    The authors show that for any deterministic two-way, real-time cellular automaton, m, there exists a deterministic one-way cellular automaton which can simulate m in twice real-time. Moreover, the authors present a new type of deterministic one-way cellular automata, called circular cellular automata, which are computationally equivalent to deterministic two-way cellular automata. 7 references.

  12. Quantum field theoretic behavior of a deterministic cellular automaton

    Energy Technology Data Exchange (ETDEWEB)

    Hooft, G ' t; Isler, K; Kalitzin, S [Inst. for Theoretical Physics, Utrecht (Netherlands)

    1992-11-16

    A certain class of cellular automata in 1 space + 1 time dimension is shown to be closely related to quantum field theories containing Dirac fermions. In the massless case this relation can be studied analytically, while the introduction of Dirac mass requires numerical simulations. We show that in the latter case the cellular automaton describes the corresponding field theory only approximately. (orig.).

  13. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681. General Article by Madhavan Mukund.

  14. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing Materials and American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and, to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability
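
    The central calculation (estimating systematic bias and precision from repeated analyses of control standards with known values) can be sketched in a few lines; the numbers below are invented for illustration, and the linear bias form is an assumption rather than the RAL procedure.

        import numpy as np

        known    = np.array([1.0, 1.0, 5.0, 5.0, 10.0, 10.0, 20.0, 20.0])
        measured = np.array([1.04, 0.98, 5.2, 5.1, 10.3, 10.1, 20.6, 20.2])

        # linear bias model: measured = a * known + b
        a, b = np.polyfit(known, measured, 1)
        residuals = measured - (a * known + b)
        precision = residuals.std(ddof=2)      # scatter about the fitted bias model

        print(f"bias model: measured = {a:.3f} * known + {b:.3f}")
        print(f"precision (1 sigma): {precision:.3f}")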

  15. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that, we consider, may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.

  16. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  17. Cellular Phone-Based Image Acquisition and Quantitative Ratiometric Method for Detecting Cocaine and Benzoylecgonine for Biological and Forensic Applications

    OpenAIRE

    Cadle, Brian A.; Rasmus, Kristin C.; Varela, Juan A.; Leverich, Leah S.; O’Neill, Casey E.; Bachtell, Ryan K.; Cooper, Donald C.

    2010-01-01

    Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA) is an automated method requiring end-users to utilize inexpensive (~ $1 USD/each) immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is descri...

  18. Generic framework for mining cellular automata models on protein-folding simulations.

    Science.gov (United States)

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed by a methodology based on design patterns that allow an improved experience for new algorithms development. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.

  19. Empirical study on entropy models of cellular manufacturing systems

    Institute of Scientific and Technical Information of China (English)

    Zhifeng Zhang; Renbin Xiao

    2009-01-01

    From the theoretical point of view, the states of manufacturing resources can be monitored and assessed through the amount of information needed to describe their technological structure and operational state. The amount of information needed to describe cellular manufacturing systems is investigated by two measures: the structural entropy and the operational entropy. Based on the Shannon entropy, the models of the structural entropy and the operational entropy of cellular manufacturing systems are developed, and the cognizance of the states of manufacturing resources is also illustrated. Scheduling is introduced to measure the entropy models of cellular manufacturing systems, and the feasible concepts of maximum schedule horizon and schedule adherence are advanced to quantitatively evaluate the effectiveness of schedules. Finally, an example is used to demonstrate the validity of the proposed methodology.
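
    The Shannon-entropy measure underlying both models can be illustrated with a short calculation; the resource-state probabilities below are invented for illustration.

        import math

        # H = -sum(p_i * log2(p_i)) over the states a manufacturing resource can occupy
        state_probs = {"processing": 0.55, "setup": 0.15, "idle": 0.20, "breakdown": 0.10}

        H = -sum(p * math.log2(p) for p in state_probs.values() if p > 0)
        print(f"operational entropy: {H:.3f} bits")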

  20. A cellular automata model for ant trails

    Indian Academy of Sciences (India)

    In this study, the unidirectional ant traffic flow with U-turn in an ant trail was investigated using one-dimensional cellular automata model. It is known that ants communicate with each other by dropping a chemical, called pheromone, on the substrate. Apart from the studies in the literature, it was considered in the model that ...
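
    A minimal unidirectional ant-trail automaton in this spirit can be sketched as follows (illustrative assumptions, not the published rules): an ant hops to the empty cell ahead with a higher probability when that cell carries pheromone, drops pheromone on the cell it occupies, and pheromone evaporates with a fixed probability per step.

        import random

        L, N_ANTS, Q, q, f = 50, 10, 0.9, 0.3, 0.05   # track length, ants, hop probs, evaporation

        ants = random.sample(range(L), N_ANTS)        # occupied cells on a ring
        pheromone = [False] * L

        for step in range(200):
            occupied = set(ants)
            moved = []
            for x in sorted(ants):
                ahead = (x + 1) % L
                p_hop = Q if pheromone[ahead] else q   # pheromone ahead raises hop probability
                if ahead not in occupied and random.random() < p_hop:
                    occupied.discard(x)
                    occupied.add(ahead)
                    moved.append(ahead)
                else:
                    moved.append(x)
                pheromone[x] = True                    # drop pheromone on the occupied cell
            ants = moved
            pheromone = [ph and random.random() > f for ph in pheromone]   # evaporation

        print("final ant positions:", sorted(ants))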

  1. Fuzzy cellular automata models in immunology

    International Nuclear Information System (INIS)

    Ahmed, E.

    1996-01-01

    The self-nonself character of antigens is considered to be fuzzy. The Chowdhury et al. cellular automata model is generalized accordingly. New steady states are found. The first corresponds to a below-normal help and suppression and is proposed to be related to autoimmune diseases. The second corresponds to a below-normal B-cell level

  2. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284, JAN 2018, US Army Research Laboratory. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each

  3. Automating an integrated spatial data-mining model for landfill site selection

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. 22 criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high-performance accuracy percentage of 98.2% in the testing dataset using 10-fold cross validation. The automated spatial data mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.

  4. Automated model-based testing of hybrid systems

    NARCIS (Netherlands)

    Osch, van M.P.W.J.

    2009-01-01

    In automated model-based input-output conformance testing, tests are automatically generated from a specification and automatically executed on an implementation. Input is applied to the implementation and output is observed from the implementation. If the observed output is allowed according to

  5. Cellular automata and statistical mechanical models

    International Nuclear Information System (INIS)

    Rujan, P.

    1987-01-01

    The authors elaborate on the analogy between the transfer matrix of usual lattice models and the master equation describing the time development of cellular automata. Transient and stationary properties of probabilistic automata are linked to surface and bulk properties, respectively, of restricted statistical mechanical systems. It is demonstrated that methods of statistical physics can be successfully used to describe the dynamic and the stationary behavior of such automata. Some exact results are derived, including duality transformations, exact mappings, disorder, and linear solutions. Many examples are worked out in detail to demonstrate how to use statistical physics in order to construct cellular automata with desired properties. This approach is considered to be a first step toward the design of fully parallel, probabilistic systems whose computational abilities rely on the cooperative behavior of their components

  6. An Immune-inspired Adaptive Automated Intrusion Response System Model

    Directory of Open Access Journals (Sweden)

    Ling-xi Peng

    2012-09-01

    Full Text Available An immune-inspired adaptive automated intrusion response system model is proposed. The descriptions of self, non-self, immunocyte, memory detector, mature detector and immature detector of the network transactions, and the real-time network danger evaluation equations are given. The automated response policies are then adaptively performed or adjusted according to the real-time network danger. Thus, the proposed model not only accurately evaluates network attacks, but also greatly reduces response times and response costs.

  7. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  8. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  9. Station Model for Rail Transit System Using Cellular Automata

    International Nuclear Information System (INIS)

    Xun Jing; Ning Bin; Li Keping

    2009-01-01

    In this paper, we propose a new cellular automata model to simulate the railway traffic at station. Based on NaSch model, the proposed station model is composed of the main track and the siding track. Two different schemes for trains passing through station are considered. One is the scheme of 'pass by the main track, start and stop by the siding track'. The other is the scheme of 'two tracks play the same role'. We simulate the train movement using the proposed model and analyze the traffic flow at station. The simulation results demonstrate that the proposed cellular automata model can be successfully used for the simulations of railway traffic. Some characteristic behaviors of railway traffic flow can be reproduced. Moreover, the simulation values of the minimum headway are close to the theoretical values. This result demonstrates the dependability and availability of the proposed model. (general)
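
    Since the station model builds on the NaSch (Nagel-Schreckenberg) automaton, the single-track building block can be sketched as follows; the two-track station logic itself is not reproduced here, and the parameter values are arbitrary.

        import random

        # Standard NaSch update on a circular single-lane track: accelerate, brake
        # to the available gap, randomly slow down, then move.
        L, N, V_MAX, P_SLOW = 100, 20, 5, 0.3

        road = [-1] * L                                   # -1 = empty, otherwise velocity
        for x in random.sample(range(L), N):
            road[x] = 0

        def step(road):
            new_road = [-1] * L
            for x, v in enumerate(road):
                if v < 0:
                    continue
                gap = next(d - 1 for d in range(1, L + 1) if road[(x + d) % L] >= 0)
                v = min(v + 1, V_MAX)                     # 1. accelerate
                v = min(v, gap)                           # 2. brake to keep the gap
                if v > 0 and random.random() < P_SLOW:    # 3. random slowdown
                    v -= 1
                new_road[(x + v) % L] = v                 # 4. move
            return new_road

        for _ in range(100):
            road = step(road)

        print("mean speed:", sum(v for v in road if v >= 0) / N)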

  10. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  11. Multi-scale modeling in morphogenesis: a critical analysis of the cellular Potts model.

    Directory of Open Access Journals (Sweden)

    Anja Voss-Böhme

    Full Text Available Cellular Potts models (CPMs) are used as a modeling framework to elucidate mechanisms of biological development. They allow a spatial resolution below the cellular scale and are applied particularly when problems are studied where multiple spatial and temporal scales are involved. Despite the increasing usage of CPMs in theoretical biology, this model class has received little attention from mathematical theory. To narrow this gap, the CPMs are subjected to a theoretical study here. It is asked to what extent the updating rules establish an appropriate dynamical model of intercellular interactions and what characterizes the principal behavior at different time scales. It is shown that the long-time behavior of a CPM is degenerate in the sense that the cells consecutively die out, independent of the specific interdependence structure that characterizes the model. While CPMs are naturally defined on finite, spatially bounded lattices, possible extensions to spatially unbounded systems are explored to assess to what extent spatio-temporal limit procedures can be applied to describe the emergent behavior at the tissue scale. To elucidate the mechanistic structure of CPMs, the model class is integrated into a general multiscale framework. It is shown that the central role of the surface fluctuations, which subsume several cellular and intercellular factors, entails substantial limitations for a CPM's exploitation both as a mechanistic and as a phenomenological model.
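
    For readers unfamiliar with the model class, a minimal cellular Potts update can be sketched as below (illustrative parameters, two cells and no medium; not the configurations analysed in the paper): the Hamiltonian combines a contact-energy term between unlike cell identities with a quadratic volume constraint, and spin copies are accepted with the Metropolis rule.

        import math
        import random

        L, T, J, LAM = 30, 10.0, 2.0, 1.0
        V_TARGET = L * L // 2                       # each of the two cells targets half the lattice
        lattice = [[1 if i < L // 2 else 2 for j in range(L)] for i in range(L)]

        def neighbours(i, j):
            return [((i + di) % L, (j + dj) % L)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

        def volume(cell_id):
            return sum(row.count(cell_id) for row in lattice)

        def delta_H(i, j, new_id):
            old_id = lattice[i][j]
            dH = 0.0
            for ni, nj in neighbours(i, j):         # contact energy with the four neighbours
                nid = lattice[ni][nj]
                dH += J * ((nid != new_id) - (nid != old_id))
            for cid, dv in ((old_id, -1), (new_id, +1)):   # volume-constraint change
                v = volume(cid)
                dH += LAM * ((v + dv - V_TARGET) ** 2 - (v - V_TARGET) ** 2)
            return dH

        for _ in range(20000):
            i, j = random.randrange(L), random.randrange(L)
            ni, nj = random.choice(neighbours(i, j))
            new_id = lattice[ni][nj]
            if new_id == lattice[i][j]:
                continue
            dH = delta_H(i, j, new_id)
            if dH <= 0 or random.random() < math.exp(-dH / T):
                lattice[i][j] = new_id              # accept the spin copy

        print("volumes:", volume(1), volume(2))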

  12. The Use of AMET & Automated Scripts for Model Evaluation

    Science.gov (United States)

    Brief overview of EPA’s new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  13. Context based mixture model for cell phase identification in automated fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Zhou Xiaobo

    2007-01-01

    Full Text Available Background Automated identification of cell cycle phases of individual live cells in a large population captured via automated fluorescence microscopy technique is important for cancer drug discovery and cell cycle studies. Time-lapse fluorescence microscopy images provide an important method to study the cell cycle process under different conditions of perturbation. Existing methods are limited in dealing with such time-lapse data sets while manual analysis is not feasible. This paper presents statistical data analysis and statistical pattern recognition to perform this task. Results The data is generated from HeLa H2B GFP cells imaged during a 2-day period with images acquired 15 minutes apart using automated time-lapse fluorescence microscopy. The patterns are described with four kinds of features, including twelve general features, Haralick texture features, Zernike moment features, and wavelet features. To generate a new set of features with more discriminative power, the commonly used feature reduction techniques are used, which include Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Maximum Margin Criterion (MMC), Stepwise Discriminant Analysis based Feature Selection (SDAFS), and Genetic Algorithm based Feature Selection (GAFS). Then, we propose a Context Based Mixture Model (CBMM) for dealing with the time-series cell sequence information and compare it to other traditional classifiers: Support Vector Machine (SVM), Neural Network (NN), and K-Nearest Neighbor (KNN). Being a standard practice in machine learning, we systematically compare the performance of a number of common feature reduction techniques and classifiers to select an optimal combination of a feature reduction technique and a classifier. A cellular database containing 100 manually labelled subsequences is built for evaluating the performance of the classifiers. The generalization error is estimated using the cross validation technique. The
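
    One of the conventional baselines mentioned above (feature reduction followed by a classical classifier) can be sketched with scikit-learn; the feature matrix below is random stand-in data, not the HeLa H2B-GFP measurements, and this is not the proposed CBMM.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 60))            # 300 cells x 60 image features (stand-in)
        y = rng.integers(0, 4, size=300)          # 4 cell-cycle phase labels (stand-in)

        clf = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
        scores = cross_val_score(clf, X, y, cv=5) # cross-validated estimate of generalization
        print("accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))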

  14. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  15. Multi-scale modeling with cellular automata: The complex automata approach

    NARCIS (Netherlands)

    Hoekstra, A.G.; Falcone, J.-L.; Caiazzo, A.; Chopard, B.

    2008-01-01

    Cellular Automata are commonly used to describe complex natural phenomena. In many cases it is required to capture the multi-scale nature of these phenomena. A single Cellular Automata model may not be able to efficiently simulate a wide range of spatial and temporal scales. It is our goal to

  16. Modeling and Analysis of Cellular CDMA Forward Channel

    National Research Council Canada - National Science Library

    Tighe, Jan

    2001-01-01

    In this thesis, we develop the forward channel model for a DS-CDMA cellular system operating in a slow-flat Rayleigh fading and log normal shadowing environment, which incorporates the extended Hata...

  17. Simulating Urban Growth Using a Random Forest-Cellular Automata (RF-CA Model

    Directory of Open Access Journals (Sweden)

    Courage Kamusoko

    2015-04-01

    Full Text Available Sustainable urban planning and management require reliable land change models, which can be used to improve decision making. The objective of this study was to test a random forest-cellular automata (RF-CA) model, which combines random forest (RF) and cellular automata (CA) models. The Kappa simulation (KSimulation), figure of merit, and components of agreement and disagreement statistics were used to validate the RF-CA model. Furthermore, the RF-CA model was compared with support vector machine cellular automata (SVM-CA) and logistic regression cellular automata (LR-CA) models. Results show that the RF-CA model outperformed the SVM-CA and LR-CA models. The RF-CA model had a Kappa simulation (KSimulation) accuracy of 0.51 (with a figure of merit statistic of 47%), while SVM-CA and LR-CA models had a KSimulation accuracy of 0.39 and −0.22 (with figure of merit statistics of 39% and 6%, respectively). Generally, the RF-CA model was relatively accurate at allocating “non-built-up to built-up” changes as reflected by the correct “non-built-up to built-up” components of agreement of 15%. The performance of the RF-CA model was attributed to the relatively accurate RF transition potential maps. Therefore, this study highlights the potential of the RF-CA model for simulating urban growth.
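
    The two-stage idea behind an RF-CA model can be sketched as follows (random stand-in rasters, not the study's datasets): a random forest learns a transition-potential surface from driver variables, and a simple cellular-automata allocation then converts the highest-potential cells that already touch built-up land.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        H = W = 50
        drivers = rng.normal(size=(H, W, 4))               # e.g. slope, road distance, density
        built = rng.random((H, W)) < 0.15                  # initial built-up raster (stand-in)

        X = drivers.reshape(-1, 4)
        y = built.reshape(-1).astype(int)
        rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        potential = rf.predict_proba(X)[:, 1].reshape(H, W)   # transition-potential map

        def touches_built(b, i, j):
            window = b[max(0, i - 1):i + 2, max(0, j - 1):j + 2]
            return window.sum() - b[i, j] > 0

        demand = 100                                        # cells to convert in this CA step
        candidates = [(potential[i, j], i, j) for i in range(H) for j in range(W)
                      if not built[i, j] and touches_built(built, i, j)]
        for _, i, j in sorted(candidates, reverse=True)[:demand]:
            built[i, j] = True

        print("built-up cells after one allocation step:", int(built.sum()))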

  18. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    Science.gov (United States)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    The interaction between humans and their environment has recently become one of the most important challenges in the world. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practical application of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and they have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents fuzzy cellular automata, combined with geospatial information systems and remote sensing, to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. A fuzzy-inference-guided cellular automata approach is adopted: semantic or linguistic knowledge on land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential for each pixel. The model integrates an ABM (agent-based model) and FCA (fuzzy cellular automata) to investigate a complex decision-making process and future urban dynamic processes. Based on this model, rapid development and green-land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents and non-resident agents, and their interactions, have been used to predict the future development patterns of the Erbil metropolitan region.

  19. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is

  20. Automated model fit method for diesel engine control development

    NARCIS (Netherlands)

    Seykens, X.L.J.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.J.H.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is

  1. A Compact Synchronous Cellular Model of Nonlinear Calcium Dynamics: Simulation and FPGA Synthesis Results.

    Science.gov (United States)

    Soleimani, Hamid; Drakakis, Emmanuel M

    2017-06-01

    Recent studies have demonstrated that calcium is a widespread intracellular ion that controls a wide range of temporal dynamics in the mammalian body. The simulation and validation of such studies using experimental data would benefit from a fast large scale simulation and modelling tool. This paper presents a compact and fully reconfigurable cellular calcium model capable of mimicking Hopf bifurcation phenomenon and various nonlinear responses of the biological calcium dynamics. The proposed cellular model is synthesized on a digital platform for a single unit and a network model. Hardware synthesis, physical implementation on FPGA, and theoretical analysis confirm that the proposed cellular model can mimic the biological calcium behaviors with considerably low hardware overhead. The approach has the potential to speed up large-scale simulations of slow intracellular dynamics by sharing more cellular units in real-time. To this end, various networks constructed by pipelining 10 k to 40 k cellular calcium units are compared with an equivalent simulation run on a standard PC workstation. Results show that the cellular hardware model is, on average, 83 times faster than the CPU version.

  2. Cellular High-Energy Cavitation Trauma - Description of a Novel In Vitro Trauma Model in Three Different Cell Types.

    Science.gov (United States)

    Cao, Yuli; Risling, Mårten; Malm, Elisabeth; Sondén, Anders; Bolling, Magnus Frödin; Sköld, Mattias K

    2016-01-01

    The mechanisms involved in traumatic brain injury have yet to be fully characterized. One mechanism that, especially in high-energy trauma, could be of importance is cavitation. Cavitation can be described as a process of vaporization, bubble generation, and bubble implosion as a result of a decrease and subsequent increase in pressure. Cavitation as an injury mechanism is difficult to visualize and model due to its short duration and limited spatial distribution. One strategy to analyze the cellular response of cavitation is to employ suitable in vitro models. The flyer-plate model is an in vitro high-energy trauma model that includes cavitation as a trauma mechanism. A copper fragment is accelerated by means of a laser and hits the bottom of a cell culture well, causing cavitation and shock waves inside the well and cell medium. We have found the flyer-plate model to be efficient, reproducible, and easy to control. In this study, we have used the model to analyze the cellular response to microcavitation in SH-SY5Y neuroblastoma, Caco-2, and C6 glioma cell lines. Mitotic activity in neuroblastoma and glioma was investigated with BrdU staining, and cell numbers were calculated using automated time-lapse imaging. We found variations between cell types and between different zones surrounding the lesion with these methods. It was also shown that the injured cell cultures released S-100B in a dose-dependent manner. Using gene expression microarray, a number of gene families of potential interest were found to be strongly, but differently regulated in neuroblastoma and glioma at 24 h post trauma. The data from the gene expression arrays may be used to identify new candidates for biomarkers in cavitation trauma. We conclude that our model is useful for studies of trauma in vitro and that it could be applied in future treatment studies.

  3. Automated side-chain model building and sequence assignment by template matching.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.
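
    The sequence-assignment step can be illustrated with a toy calculation. In the Python sketch below (not the RESOLVE code; the sequence, segment length and probability matrix are invented), a matrix P[i, a] of per-position amino-acid probabilities is scored against every alignment offset of a built segment, assuming independent positions and a flat prior over offsets.

    import numpy as np

    rng = np.random.default_rng(1)
    amino = "ACDEFGHIKLMNPQRSTVWY"
    aa_index = {a: i for i, a in enumerate(amino)}

    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"          # hypothetical protein sequence
    seg_len = 8
    P = rng.dirichlet(np.ones(20), size=seg_len)        # stand-in for density-derived probabilities

    # Log-likelihood of each alignment offset of the segment within the sequence.
    offsets = range(len(seq) - seg_len + 1)
    loglik = np.array([sum(np.log(P[i, aa_index[seq[k + i]]]) for i in range(seg_len))
                       for k in offsets])
    post = np.exp(loglik - loglik.max())
    post /= post.sum()                                   # posterior over offsets (flat prior)
    best = int(np.argmax(post))
    print(f"most probable offset: {best}, posterior: {post[best]:.2f}")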

  4. A computational and cellular solids approach to the stiffness-based design of bone scaffolds.

    Science.gov (United States)

    Norato, J A; Wagoner Johnson, A J

    2011-09-01

    We derive a cellular solids approach to the design of bone scaffolds for stiffness and pore size. Specifically, we focus on scaffolds made of stacked, alternating, orthogonal layers of hydroxyapatite rods, such as those obtained via micro-robotic deposition, and aim to determine the rod diameter, spacing and overlap required to obtain specified elastic moduli and pore size. To validate and calibrate the cellular solids model, we employ a finite element model and determine the effective scaffold moduli via numerical homogenization. In order to perform an efficient, automated execution of the numerical studies, we employ a geometry projection method so that analyses corresponding to different scaffold dimensions can be performed on a fixed, non-conforming mesh. Based on the developed model, we provide design charts to aid in the selection of rod diameter, spacing and overlap to be used in the robotic deposition to attain desired elastic moduli and pore size.

  5. Automation life-cycle cost model

    Science.gov (United States)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

    The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.

  6. A Model of How Different Biology Experts Explain Molecular and Cellular Mechanisms

    Science.gov (United States)

    Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.

    2015-01-01

    Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do…

  7. A Stepwise Fitting Procedure for automated fitting of Ecopath with Ecosim models

    Directory of Open Access Journals (Sweden)

    Erin Scott

    2016-01-01

    Full Text Available The Stepwise Fitting Procedure automates testing of alternative hypotheses used for fitting Ecopath with Ecosim (EwE) models to observation reference data (Mackinson et al. 2009). The calibration of EwE model predictions to observed data is important to evaluate any model that will be used for ecosystem-based management. Thus far, the model fitting procedure in EwE has been carried out manually: a repetitive task involving setting >1000 specific individual searches to find the statistically ‘best fit’ model. The novel fitting procedure automates the manual procedure, thereby producing accurate results and letting the modeller concentrate on investigating the ‘best fit’ model for ecological accuracy.

  8. Automated imaging of cellular spheroids with selective plane illumination microscopy on a chip (Conference Presentation)

    Science.gov (United States)

    Paiè, Petra; Bassi, Andrea; Bragheri, Francesca; Osellame, Roberto

    2017-02-01

    Selective plane illumination microscopy (SPIM) is an optical sectioning technique that allows imaging of biological samples at high spatio-temporal resolution. Standard SPIM devices require dedicated set-ups, complex sample preparation and accurate system alignment, thus limiting the automation of the technique, its accessibility and throughput. We present a millimeter-scaled optofluidic device that incorporates selective plane illumination and fully automatic sample delivery and scanning. To this end an integrated cylindrical lens and a three-dimensional fluidic network were fabricated by femtosecond laser micromachining into a single glass chip. This device can upgrade any standard fluorescence microscope to a SPIM system. We used SPIM on a CHIP to automatically scan biological samples under a conventional microscope, without the need of any motorized stage: tissue spheroids expressing fluorescent proteins were flowed in the microchannel at constant speed and their sections were acquired while passing through the light sheet. We demonstrate high-throughput imaging of the entire sample volume (with a rate of 30 samples/min), segmentation and quantification in thick (100-300 μm diameter) cellular spheroids. This optofluidic device gives access to SPIM analyses to non-expert end-users, opening the way to automatic and fast screening of a high number of samples at subcellular resolution.

  9. Simulations of living cell origins using a cellular automata model.

    Science.gov (United States)

    Ishida, Takeshi

    2014-04-01

    Understanding the generalized mechanisms of cell self-assembly is fundamental for applications in various fields, such as mass producing molecular machines in nanotechnology. Thus, the details of real cellular reaction networks and the necessary conditions for self-organized cells must be elucidated. We constructed a 2-dimensional cellular automata model to investigate the emergence of biological cell formation, which incorporated a looped membrane and a membrane-bound information system (akin to a genetic code and gene expression system). In particular, with an artificial reaction system coupled with a thermal system, the simultaneous formation of a looped membrane and an inner reaction process resulted in a more stable structure. These double structures are reminiscent of the formation of primitive biological cells from the chemical evolution stage. With this 2-dimensional cellular automata model of cellular self-organization, 3 phenomena could be realized: (1) an inner reaction system developed as an information carrier precursor (akin to DNA); (2) a cell border emerged (akin to a cell membrane); and (3) these cell structures could divide into 2. This double-structured cell was considered to be a primary biological cell. The outer loop evolved toward a lipid bilayer membrane, and inner polymeric particles evolved toward precursor information carriers (evolved toward DNA). This model did not completely clarify all the necessary and sufficient conditions for biological cell self-organization. Further, our virtual cells remained unstable and fragile. However, the "garbage bag model" of Dyson proposed that the first living cells were deficient; thus, it would be reasonable that the earliest cells were more unstable and fragile than the simplest current unicellular organisms.

  10. An attempt of modelling debris flows characterised by strong inertial effects through Cellular Automata

    Science.gov (United States)

    Iovine, G.; D'Ambrosio, D.

    2003-04-01

    Cellular Automata models do represent a valid method for the simulation of complex phenomena, when these latter can be described in "a-centric" terms - i.e. through local interactions within a discrete time-space. In particular, flow-type landslides (such as debris flows) can be viewed as a-centric dynamical systems. SCIDDICA S4b, the latest release of a family of two-dimensional hexagonal Cellular Automata models, has recently been developed for simulating debris flows characterised by strong inertial effects. It has been derived by progressively enriching an initial simplified CA model, originally derived for simulating very simple cases of slow-moving flow-type landslides. In S4b, by applying an empirical strategy, the inertial characters of the flowing mass have been translated into CA terms. In the transition function of the model, the distribution of landslide debris among the cells is computed by considering the momentum of the debris which moves among the cells of the neighbourhood, and privileging the flow direction. By properly setting the value of one of the global parameters of the model (the "inertial factor"), the mechanism of distribution of the landslide debris among the cells can be influenced in order to emphasise the inertial effects, according to the energy of the flowing mass. Moreover, the high complexity of both the model and the phenomena to be simulated (e.g. debris flows characterised by severe erosion along their path, and by strong inertial effects) suggested employing an automated evaluation technique for the determination of the best set of global parameters. Accordingly, the calibration of the model has been performed through Genetic Algorithms, by considering several real case studies: these latter have been selected among the population of landslides triggered in Campania (Southern Italy) in May 1998 and December 1999. Obtained results are satisfying: errors computed by comparing the simulations with the map of the real

  11. Quantitative proteomic assessment of very early cellular signaling events

    DEFF Research Database (Denmark)

    Dengjel, Joern; Akimov, Vyacheslav; Olsen, Jesper V

    2007-01-01

    Technical limitations have prevented proteomic analyses of events occurring less than 30 s after signal initiation. We developed an automated, continuous quench-flow system allowing quantitative proteomic assessment of very early cellular signaling events (qPACE) with a time resolution of 1 s...

  12. An automated digital imaging system for environmental monitoring applications

    Science.gov (United States)

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  13. A cellular automata model of bone formation.

    Science.gov (United States)

    Van Scoy, Gabrielle K; George, Estee L; Opoku Asantewaa, Flora; Kerns, Lucy; Saunders, Marnie M; Prieto-Langarica, Alicia

    2017-04-01

    Bone remodeling is an elegantly orchestrated process by which osteocytes, osteoblasts and osteoclasts function as a syncytium to maintain or modify bone. On the microscopic level, bone consists of cells that create, destroy and monitor the bone matrix. These cells interact in a coordinated manner to maintain a tightly regulated homeostasis. It is this regulation that is responsible for the observed increase in bone gain in the dominant arm of a tennis player and the observed increase in bone loss associated with spaceflight and osteoporosis. The manner in which these cells interact to bring about a change in bone quality and quantity has yet to be fully elucidated. But efforts to understand the multicellular complexity can ultimately lead to eradication of metabolic bone diseases such as osteoporosis and improved implant longevity. Experimentally validated mathematical models that simulate functional activity and offer eventual predictive capabilities offer tremendous potential in understanding multicellular bone remodeling. Here we undertake the initial challenge to develop a mathematical model of bone formation validated with in vitro data obtained from osteoblastic bone cells induced to mineralize and quantified at 26 days of culture. A cellular automata model was constructed to simulate the in vitro characterization. Permutation tests were performed to compare the distribution of the mineralization in the cultures and the distribution of the mineralization in the mathematical models. The results of the permutation test show that the distributions of mineralization from the characterization and from the mathematical model come from the same probability distribution, thereby validating the cellular automata model. Copyright © 2017 Elsevier Inc. All rights reserved.
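
    The validation step can be sketched in a few lines. The Python toy below (deposition rules, rates and the summary statistic are assumptions, not the published model) grows mineralization on a grid with a simple cellular-automaton rule and then runs a permutation test comparing a simulated map against a stand-in for the observed culture image.

    import numpy as np
    from scipy.ndimage import convolve

    rng = np.random.default_rng(2)
    n, steps = 64, 40
    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])

    def simulate(deposit_p=0.25, seed_p=0.02):
        # Toy CA: mineral is deposited next to already-mineralized sites.
        mineral = (rng.random((n, n)) < seed_p).astype(int)
        for _ in range(steps):
            neighbours = convolve(mineral, kernel, mode="constant")
            grow = (mineral == 0) & (rng.random((n, n)) < deposit_p * neighbours / 8)
            mineral[grow] = 1
        return mineral

    model = simulate()
    observed = simulate()            # stand-in for the quantified day-26 culture image

    # Permutation test on an assumed summary statistic (mean mineralized fraction per row).
    a, b = model.mean(axis=1), observed.mean(axis=1)
    obs_diff = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    exceed = 0
    for _ in range(5000):
        rng.shuffle(pooled)
        exceed += abs(pooled[:n].mean() - pooled[n:].mean()) >= obs_diff
    print("permutation p-value:", exceed / 5000)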

  14. Phenomenological model of an electron flow with a virtual cathode

    International Nuclear Information System (INIS)

    Koronovskij, A.A.; Khramov, A.E.; Anfinogenov, V.G.

    1999-01-01

    A phenomenological model of an electron flow with a virtual cathode in diode space, which is a modification of a cellular automaton, is suggested. This type of model, called a cellular conveyer, permits making allowance for distribution and delay in a beam with a virtual cathode. Good agreement between the results of a numerical study of electron flow dynamics and the results obtained using the described phenomenological model has been achieved [ru

  15. A cellular automata model of Ebola virus dynamics

    Science.gov (United States)

    Burkhead, Emily; Hawkins, Jane

    2015-11-01

    We construct a stochastic cellular automaton (SCA) model for the spread of the Ebola virus (EBOV). We make substantial modifications to an existing SCA model used for HIV, introduced by others and studied by the authors. We give a rigorous analysis of the similarities between models due to the spread of virus and the typical immune response to it, and the differences which reflect the drastically different timing of the course of EBOV. We demonstrate output from the model and compare it with clinical data.
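
    A drastically simplified stochastic cellular automaton in the same spirit is sketched below in Python (three cell states and per-step probabilities that are placeholders, not the published parameters or timing): infection spreads from Moore neighbours, and infected cells either die or are cleared.

    import numpy as np
    from scipy.ndimage import convolve

    rng = np.random.default_rng(3)
    n, steps = 100, 60
    p_infect, p_die, p_clear = 0.05, 0.10, 0.02      # per-step probabilities (placeholders)
    HEALTHY, INFECTED, DEAD = 0, 1, 2
    state = np.zeros((n, n), dtype=int)
    state[rng.random((n, n)) < 0.001] = INFECTED     # a few initially infected cells
    kernel = np.ones((3, 3)); kernel[1, 1] = 0       # Moore neighbourhood

    for _ in range(steps):
        infected_neighbours = convolve((state == INFECTED).astype(int), kernel, mode="wrap")
        r = rng.random((n, n))
        becomes_infected = (state == HEALTHY) & (r < 1 - (1 - p_infect) ** infected_neighbours)
        dies = (state == INFECTED) & (r < p_die)
        clears = (state == INFECTED) & ~dies & (r < p_die + p_clear)
        state[becomes_infected] = INFECTED
        state[dies] = DEAD
        state[clears] = HEALTHY
    print("dead fraction:", (state == DEAD).mean(), "infected fraction:", (state == INFECTED).mean())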

  16. Modeling of prepregs during automated draping sequences

    Science.gov (United States)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

    The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, and at assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently being approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.

  17. Cellular automata and integrodifferential equation models for cell renewal in mosaic tissues

    Science.gov (United States)

    Bloomfield, J. M.; Sherratt, J. A.; Painter, K. J.; Landini, G.

    2010-01-01

    Mosaic tissues are composed of two or more genetically distinct cell types. They occur naturally, and are also a useful experimental method for exploring tissue growth and maintenance. By marking the different cell types, one can study the patterns formed by proliferation, renewal and migration. Here, we present mathematical modelling suggesting that small changes in the type of interaction that cells have with their local cellular environment can lead to very different outcomes for the composition of mosaics. In cell renewal, proliferation of each cell type may depend linearly or nonlinearly on the local proportion of cells of that type, and these two possibilities produce very different patterns. We study two variations of a cellular automaton model based on simple rules for renewal. We then propose an integrodifferential equation model, and again consider two different forms of cellular interaction. The results of the continuous and cellular automata models are qualitatively the same, and we observe that changes in local environment interaction affect the dynamics for both. Furthermore, we demonstrate that the models reproduce some of the patterns seen in actual mosaic tissues. In particular, our results suggest that the differing patterns seen in organ parenchymas may be driven purely by the process of cell replacement under different interaction scenarios. PMID:20375040
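
    The renewal rule at the heart of this comparison is easy to caricature. The Python sketch below (an illustration only, not the paper's model; grid size, rates and the particular nonlinear form are assumptions) replaces a small fraction of cells each step, with the probability of adopting type A depending either linearly or nonlinearly on the local proportion of type-A neighbours, and reports how the final composition differs.

    import numpy as np
    from scipy.ndimage import convolve

    rng = np.random.default_rng(4)
    n, steps, replace_rate = 80, 200, 0.05
    kernel = np.ones((3, 3)); kernel[1, 1] = 0

    def run(nonlinear):
        grid = (rng.random((n, n)) < 0.5).astype(float)       # 1 = type A, 0 = type B
        for _ in range(steps):
            p = convolve(grid, kernel, mode="wrap") / 8.0      # local proportion of type A
            prob_a = p**2 / (p**2 + (1 - p)**2 + 1e-12) if nonlinear else p
            replaced = rng.random((n, n)) < replace_rate       # cells renewed this step
            grid[replaced] = (rng.random((n, n)) < prob_a)[replaced]
        return grid

    for nonlinear in (False, True):
        final = run(nonlinear)
        label = "nonlinear" if nonlinear else "linear"
        print(label, "renewal, final type-A fraction:", round(float(final.mean()), 2))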

  18. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, one that has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, underlining the need for designers to ensure they are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate their control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.

  19. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated in comparison to an established manual method. Firstly, regions of interest (membrane fragments) are identified in confocal microscopy images. Further, densitometric intensity profiles are extracted orthogonally to membrane fragments, following the direction from the plasma membrane to cytoplasm. Finally, several different quantitative descriptors were derived from the densitometric profiles and were compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for the translocation quantification. The slow manual calculation can be substituted by the fast and unbiased automated method.

  20. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review.

    Science.gov (United States)

    Savio, Gianpaolo; Rosso, Stefano; Meneghello, Roberto; Concheri, Gianmaria

    2018-01-01

    Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed.

  1. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad; Elsawy, Hesham; Bader, Ahmed; Alouini, Mohamed-Slim

    2017-01-01

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.

  2. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad

    2017-05-02

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.
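
    A toy Monte Carlo in the stochastic-geometry spirit of this model is sketched below in Python (densities, path-loss exponent and SINR threshold are assumptions, and the queueing/retransmission part of the paper is omitted): base stations and IoT devices are drawn as Poisson point processes, each device attaches to its nearest base station, and uplink SINR coverage is estimated at a tagged base station under full interference.

    import numpy as np

    rng = np.random.default_rng(5)
    area, lam_bs, lam_dev = 4.0, 5.0, 50.0          # km^2 and densities per km^2 (assumed)
    alpha, noise, p_tx, theta = 4.0, 1e-9, 1.0, 1.0 # path-loss exponent, noise, power, SINR threshold
    trials, covered = 2000, 0
    for _ in range(trials):
        n_bs, n_dev = rng.poisson(lam_bs * area), rng.poisson(lam_dev * area)
        if n_bs == 0 or n_dev == 0:
            continue
        bs = rng.uniform(-1.0, 1.0, (n_bs, 2))       # 2 km x 2 km window
        dev = rng.uniform(-1.0, 1.0, (n_dev, 2))
        dist = np.linalg.norm(dev[:, None, :] - bs[None, :, :], axis=2)
        serving = dist.argmin(axis=1)                # each device attaches to its nearest BS
        tagged_users = np.where(serving == 0)[0]     # BS 0 is the tagged receiver
        if tagged_users.size == 0:
            continue
        u = tagged_users[0]
        h = rng.exponential(1.0, n_dev)              # Rayleigh fading power gains
        signal = p_tx * h[u] * dist[u, 0] ** (-alpha)
        others = np.setdiff1d(np.arange(n_dev), [u])
        interference = np.sum(p_tx * h[others] * dist[others, 0] ** (-alpha))
        covered += signal / (interference + noise) > theta
    print("uplink SINR coverage estimate:", covered / trials)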

  3. Simulation Modeling by Classification of Problems: A Case of Cellular Manufacturing

    International Nuclear Information System (INIS)

    Afiqah, K N; Mahayuddin, Z R

    2016-01-01

    Cellular manufacturing provides a good solution approach to the manufacturing area by applying the Group Technology concept. The evolution of cellular manufacturing can enhance the performance of the cell and increase the quality of the manufactured product, but it also triggers other problems. Generally, this paper highlights factors and problems which emerge commonly in cellular manufacturing. The aim of the research is to develop a thorough understanding of common problems in cellular manufacturing. Apart from that, in order to find a solution to the existing problems using simulation techniques, this classification framework is very useful to adopt during model building. A biological evolution tool was used in the research in order to classify the emerging problems. The result reveals 22 problems and 25 factors using the cladistic technique. In this research, the expected result is the cladogram established based on the problems gathered in cellular manufacturing. (paper)

  4. Predictive model to describe water migration in cellular solid foods during storage

    NARCIS (Netherlands)

    Voogt, J.A.; Hirte, A.; Meinders, M.B.J.

    2011-01-01

    BACKGROUND: Water migration in cellular solid foods during storage causes loss of crispness. To improve crispness retention, physical understanding of this process is needed. Mathematical models are suitable tools to gain this physical knowledge. RESULTS: Water migration in cellular solid foods

  5. Predictive model to describe water migration in cellular solid foods during storage

    NARCIS (Netherlands)

    Voogt, J.A.; Hirte, A.; Meinders, M.B.J.

    2011-01-01

    Background: Water migration in cellular solid foods during storage causes loss of crispness. To improve crispness retention, physical understanding of this process is needed. Mathematical models are suitable tools to gain this physical knowledge. Results: Water migration in cellular solid foods

  6. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedure both air and ground in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  7. Models and automation technologies for the curriculum development

    Directory of Open Access Journals (Sweden)

    V. N. Volkova

    2016-01-01

    Full Text Available The aim of the research was to determine the sequence of the curriculum development stages on the basis of system analysis, as well as to create models and information technologies for the implementation of these stages. The methods and models of systems theory and system analysis were used, including methods and automated procedures for structuring organizational aims, and models and automated procedures for organizing complex expertise. On the basis of the analysis of existing studies in the field of curriculum modeling using formal mathematical language, including optimization models that help distribute disciplines over years and semesters in accordance with the relevant restrictions, it is shown that the complexity and dimension of these tasks require the development of special software; the problem of defining the input data and restrictions requires a large time investment, which seems difficult to provide in the real conditions of curriculum development, so it is almost impossible to verify the objectivity of the input data and the restrictions in such models. For a complete analysis of the process of curriculum development it is proposed to use a system definition based on the system-targeted approach. On the basis of this definition a reasonable sequence of integrated stages for the development of the curriculum was justified: (1) definition (specification) of the requirements for the educational content; (2) determining the number of subjects included in the curriculum; (3) definition of the sequence of the subjects; (4) distribution of subjects by semesters. The models and technologies for the implementation of these stages of curriculum development were given in the article: (1) models based on the information approach of A. Denisov and the modified degree of compliance with objectives based on Denisov's evaluation index (in the article the idea of evaluating the degree of the impact of disciplines for realization

  8. Lattice gas cellular automata and lattice Boltzmann models an introduction

    CERN Document Server

    Wolf-Gladrow, Dieter A

    2000-01-01

    Lattice-gas cellular automata (LGCA) and lattice Boltzmann models (LBM) are relatively new and promising methods for the numerical solution of nonlinear partial differential equations. The book provides an introduction for graduate students and researchers. Working knowledge of calculus is required and experience in PDEs and fluid dynamics is recommended. Some peculiarities of cellular automata are outlined in Chapter 2. The properties of various LGCA and special coding techniques are discussed in Chapter 3. Concepts from statistical mechanics (Chapter 4) provide the necessary theoretical background for LGCA and LBM. The properties of lattice Boltzmann models and a method for their construction are presented in Chapter 5.
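
    A minimal example of the kind of model the book introduces is a D2Q9 lattice Boltzmann stream-and-collide loop; the Python sketch below uses the single-relaxation-time (BGK) collision on a periodic grid with an illustrative shear-wave initial condition (all parameters are arbitrary).

    import numpy as np

    nx, ny, tau, steps = 64, 64, 0.8, 100
    # D2Q9 lattice velocities and weights
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

    def equilibrium(rho, ux, uy):
        cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
        usq = ux**2 + uy**2
        return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

    rho = np.ones((nx, ny))
    ux = 0.05 * np.sin(2 * np.pi * np.arange(ny) / ny) * np.ones((nx, ny))   # decaying shear wave
    uy = np.zeros((nx, ny))
    f = equilibrium(rho, ux, uy)

    for _ in range(steps):
        rho = f.sum(axis=0)
        ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
        f += (equilibrium(rho, ux, uy) - f) / tau                  # BGK collision
        for i in range(9):                                         # streaming, periodic wrap
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    print("mean speed after", steps, "steps:", float(np.sqrt(ux**2 + uy**2).mean()))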

  9. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since there are a lot of factors that characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts, which are Part I: Agent-based Complex Automated Negotiations and Part II: Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of ACAN 2011 papers after peer reviews by three PC members. Part II includes ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  10. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review

    Directory of Open Access Journals (Sweden)

    Gianpaolo Savio

    2018-01-01

    Full Text Available Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed.

  11. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review

    Science.gov (United States)

    Rosso, Stefano; Meneghello, Roberto; Concheri, Gianmaria

    2018-01-01

    Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed. PMID:29487626

  12. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  13. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the important criteria for an automated line, as well as for industry in general, since it directly reflects outputs and profits. Productivity in industry must be forecast accurately in order to meet customer demand, and the forecast result is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this mathematical model of productivity with availability cannot achieve a close enough productivity compared to the actual one, due to a lack of parameter consideration, enhancement of the mathematical model is required to consider and add the loss parameters that are not considered in the current model. This paper presents the investigation of productivity loss parameters, investigated by using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for further improvement of the mathematical model of productivity with availability in order to develop a robust mathematical model of productivity for automated lines.

  14. Integrating Cellular Metabolism into a Multiscale Whole-Body Model

    Science.gov (United States)

    Krauss, Markus; Schaller, Stephan; Borchers, Steffen; Findeisen, Rolf; Lippert, Jörg; Kuepfer, Lars

    2012-01-01

    Cellular metabolism continuously processes an enormous range of external compounds into endogenous metabolites and is as such a key element in human physiology. The multifaceted physiological role of the metabolic network fulfilling the catalytic conversions can only be fully understood from a whole-body perspective, where the causal interplay of the metabolic states of individual cells, the surrounding tissue and the whole organism are simultaneously considered. We here present an approach relying on dynamic flux balance analysis that allows the integration of metabolic networks at the cellular scale into standardized physiologically-based pharmacokinetic models at the whole-body level. To evaluate our approach we integrated a genome-scale network reconstruction of a human hepatocyte into the liver tissue of a physiologically-based pharmacokinetic model of a human adult. The resulting multiscale model was used to investigate hyperuricemia therapy, ammonia detoxification and paracetamol-induced toxication at a systems level. The specific models simultaneously integrate multiple layers of biological organization and offer mechanistic insights into pathology and medication. The approach presented may in future support a mechanistic understanding in diagnostics and drug development. PMID:23133351
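
    The coupling idea can be caricatured with a toy dynamic flux balance analysis loop. The Python sketch below (a two-reaction stand-in, not the genome-scale hepatocyte network or the PBPK model of the paper; kinetic constants and yields are invented) solves a small linear program for the optimal growth flux at each time step, with the uptake bound set by the external substrate, and then advances substrate and biomass by an Euler step.

    from scipy.optimize import linprog

    vmax, km, yield_x = 10.0, 0.5, 0.1    # uptake kinetics and biomass yield (assumed)
    S, X, dt = 10.0, 0.05, 0.05           # substrate (mM), biomass (gDW/L), time step (h)
    for _ in range(200):
        v_up_max = vmax * S / (km + S)    # Michaelis-Menten bound on uptake
        # Fluxes v = [v_uptake, v_growth]; internal steady state: v_uptake - v_growth/yield = 0.
        res = linprog(c=[0.0, -1.0],
                      A_eq=[[1.0, -1.0 / yield_x]], b_eq=[0.0],
                      bounds=[(0.0, v_up_max), (0.0, None)], method="highs")
        v_up, mu = res.x
        S = max(S - v_up * X * dt, 0.0)   # substrate consumed by the whole population
        X = X * (1.0 + mu * dt)           # biomass grows at the optimal rate
    print(f"final substrate: {S:.3f} mM, final biomass: {X:.3f} gDW/L")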

  15. Simulation of root forms using cellular automata model

    International Nuclear Information System (INIS)

    Winarno, Nanang; Prima, Eka Cahya; Afifah, Ratih Mega Ayu

    2016-01-01

    This research aims to produce a simulation program for root forms using a cellular automata model. Stephen Wolfram, in his book entitled “A New Kind of Science”, discusses formation rules based on statistical analysis. In accordance with Stephen Wolfram’s investigation, the research develops a basic computer program using the Delphi 7 programming language. To the best of our knowledge, there is no previous research developing a simulation describing root forms using the cellular automata model and comparing it to the natural root form in the presence of added stones as a disturbance. The results show that (1) the simulation used four rules, comparing the output of the program to natural photographs, and each rule produced different root forms; and (2) the stone disturbances prevent root growth, and the multiplication of root forms has been successfully modeled. Therefore, this research added some stones, with a size of 120 cells, placed randomly in the soil. As in nature, stones cannot be penetrated by plant roots. The results also show that it is very feasible to further develop the program to simulate root forms with 50 variations.

  16. Simulation of root forms using cellular automata model

    Energy Technology Data Exchange (ETDEWEB)

    Winarno, Nanang, E-mail: nanang-winarno@upi.edu; Prima, Eka Cahya [International Program on Science Education, Universitas Pendidikan Indonesia, Jl. Dr. Setiabudi no 229, Bandung40154 (Indonesia); Afifah, Ratih Mega Ayu [Department of Physics Education, Post Graduate School, Universitas Pendidikan Indonesia, Jl. Dr. Setiabudi no 229, Bandung40154 (Indonesia)

    2016-02-08

    This research aims to produce a simulation program for root forms using a cellular automata model. Stephen Wolfram, in his book entitled “A New Kind of Science”, discusses formation rules based on statistical analysis. In accordance with Stephen Wolfram’s investigation, the research develops a basic computer program using the Delphi 7 programming language. To the best of our knowledge, there is no previous research developing a simulation describing root forms using the cellular automata model and comparing it to the natural root form in the presence of added stones as a disturbance. The results show that (1) the simulation used four rules, comparing the output of the program to natural photographs, and each rule produced different root forms; and (2) the stone disturbances prevent root growth, and the multiplication of root forms has been successfully modeled. Therefore, this research added some stones, with a size of 120 cells, placed randomly in the soil. As in nature, stones cannot be penetrated by plant roots. The results also show that it is very feasible to further develop the program to simulate root forms with 50 variations.
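
    A much-reduced version of such a root-growth automaton is sketched below in Python (the growth probabilities and stone placement are illustrative and are not the four rules used in the study): the root grows downward from a seed cell, occasionally branches sideways, and never enters stone cells.

    import numpy as np

    rng = np.random.default_rng(6)
    n = 60
    SOIL, ROOT, STONE = 0, 1, 2
    grid = np.full((n, n), SOIL)
    stones = rng.integers(0, n, size=(120, 2))       # 120 stone cells placed randomly
    grid[stones[:, 0], stones[:, 1]] = STONE
    grid[0, n // 2] = ROOT                           # seed cell at the soil surface

    p_down, p_side = 0.5, 0.05                       # growth probabilities (assumed)
    for _ in range(120):
        for r, c in np.argwhere(grid == ROOT):
            for dr, dc, p in ((1, 0, p_down), (0, -1, p_side), (0, 1, p_side)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < n and 0 <= cc < n and grid[rr, cc] == SOIL and rng.random() < p:
                    grid[rr, cc] = ROOT              # stone cells are never overwritten
    root_rows = np.argwhere(grid == ROOT)[:, 0]
    print("root cells:", int((grid == ROOT).sum()), "max depth reached:", int(root_rows.max()))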

  17. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A generalized cellular automata approach to modeling first order ...

    Indian Academy of Sciences (India)

    ... inhibitors deforming the allosteric site or inhibitors changing the structure of active ... Cell-based models with discrete state variables, such as Cellular Automata ... capture the essential features of a discrete real system, consisting of space, ...

  19. Modeling cell adhesion and proliferation: a cellular-automata based approach.

    Science.gov (United States)

    Vivas, J; Garzón-Alvarado, D; Cerrolaza, M

    Cell adhesion is a process that involves the interaction between the cell membrane and another surface, either a cell or a substrate. Unlike experimental tests, computer models can simulate processes and study the result of experiments in a shorter time and at lower cost. One of the tools used to simulate biological processes is the cellular automaton, which is a dynamic system that is discrete in both space and time. This work describes a computer model based on cellular automata for the adhesion process and cell proliferation to predict the behavior of a cell population in suspension and adhered to a substrate. The values of the simulated system were obtained through experimental tests on fibroblast monolayer cultures. The results allow us to estimate the cells' settling time in culture as well as the adhesion and proliferation times. The change in cell morphology as adhesion over the contact surface progresses was also observed. The formation of the initial link between the cell and the substrate was observed after 100 min, during which the cell on the substrate retains its spherical morphology in the simulation. The cellular automata model developed is, however, a simplified representation of the steps in the adhesion process and the subsequent proliferation. A combined framework of experimental and computational simulation based on cellular automata was proposed to represent fibroblast adhesion on substrates and the macro-scale changes observed in the cell during the adhesion process. The approach proved to be simple and efficient.
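
    A compact caricature of the adhesion-proliferation rules is sketched below in Python (states, probabilities and time scales are placeholders, not the values fitted to the fibroblast cultures): suspended cells settle onto empty lattice sites, adhered cells spread after a while, and spread cells divide into free neighbouring sites.

    import numpy as np

    rng = np.random.default_rng(7)
    n, steps = 50, 300
    EMPTY, ADHERED, SPREAD = 0, 1, 2
    substrate = np.zeros((n, n), dtype=int)
    suspended = 400                                  # cells initially in suspension
    p_settle, p_spread, p_divide = 0.02, 0.05, 0.01  # per-step probabilities (placeholders)

    for _ in range(steps):
        # Settling: each suspended cell may attach to a random empty substrate site.
        for _ in range(int((rng.random(suspended) < p_settle).sum())):
            r, c = rng.integers(0, n, 2)
            if substrate[r, c] == EMPTY:
                substrate[r, c] = ADHERED
                suspended -= 1
        # Spreading: adhered (still rounded) cells flatten onto the surface.
        spreading = (substrate == ADHERED) & (rng.random((n, n)) < p_spread)
        substrate[spreading] = SPREAD
        # Proliferation: spread cells divide into a randomly chosen free neighbour.
        dividing = (substrate == SPREAD) & (rng.random((n, n)) < p_divide)
        for r, c in np.argwhere(dividing):
            dr, dc = rng.integers(-1, 2, 2)
            rr, cc = (r + dr) % n, (c + dc) % n
            if substrate[rr, cc] == EMPTY:
                substrate[rr, cc] = ADHERED
    print("suspended:", suspended,
          "adhered:", int((substrate == ADHERED).sum()),
          "spread:", int((substrate == SPREAD).sum()))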

  20. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition
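
    The FFT-based template-matching step can be illustrated on a synthetic one-dimensional trace. The Python sketch below (not the RESOLVE implementation; the template, noise level and peak-picking thresholds are invented) cross-correlates a zero-mean template with the trace via FFT convolution and reports the correlation peaks as candidate locations.

    import numpy as np
    from scipy.signal import fftconvolve, find_peaks

    rng = np.random.default_rng(8)
    template = np.exp(-0.5 * ((np.arange(21) - 10) / 3.0) ** 2)   # idealized helix-like blob
    density = rng.normal(0.0, 0.2, 500)                           # synthetic 1-D "density" trace
    true_positions = [120, 310, 402]
    for p in true_positions:
        density[p - 10:p + 11] += template                        # plant three copies of the motif

    # Cross-correlation via FFT: convolve the trace with the reversed, zero-mean template.
    t = (template - template.mean()) / template.std()
    corr = fftconvolve(density - density.mean(), t[::-1], mode="same")
    peaks, _ = find_peaks(corr, height=0.5 * corr.max(), distance=15)
    print("recovered positions:", peaks.tolist(), "true positions:", true_positions)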

  1. Computer Modeling of the Earliest Cellular Structures and Functions

    Science.gov (United States)

    Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl

    2000-01-01

    In the absence of an extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining the self-organization of simple molecules into ordered structures and developing designs for molecules that perform proto-cellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment, are carried out by proteins bound to membranes. The simulations address (a) how peptides organize into ordered structures at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g. channels), and (c) by what mechanisms such aggregates perform essential proto-cellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, corresponding to 10^6-10^8 time steps.
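
    A minimal molecular-dynamics integrator of the kind described, written as a sketch rather than a production code, is shown below in Python (a one-dimensional Lennard-Jones chain in reduced units; the system and parameters are illustrative): Newton's equations are advanced iteratively with the velocity Verlet scheme, and total energy is reported as a sanity check.

    import numpy as np

    n, dt, steps = 10, 1e-3, 5000
    x = np.arange(n) * 1.12                  # near the Lennard-Jones minimum spacing
    v = np.zeros(n)

    def forces(x):
        f = np.zeros_like(x)
        for i in range(n - 1):               # nearest-neighbour Lennard-Jones forces
            r = x[i + 1] - x[i]
            fmag = 24 * (2 * r**-13 - r**-7) # -dU/dr for U(r) = 4 (r^-12 - r^-6)
            f[i] -= fmag                     # equal and opposite forces on the pair
            f[i + 1] += fmag
        return f

    f = forces(x)
    for _ in range(steps):                   # velocity Verlet: half kick, drift, half kick
        v += 0.5 * dt * f
        x += dt * v
        f = forces(x)
        v += 0.5 * dt * f

    potential = sum(4 * ((x[i+1] - x[i])**-12 - (x[i+1] - x[i])**-6) for i in range(n - 1))
    print("total energy (approximately conserved):", round(float(0.5 * (v**2).sum() + potential), 4))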

  2. Virtual model of an automated system for the storage of collected waste

    Directory of Open Access Journals (Sweden)

    Enciu George

    2017-01-01

    Full Text Available One of the problems identified in waste collection integrated systems is the storage space. The design process of an automated system for the storage of collected waste includes finding solutions for the optimal exploitation of the limited storage space, given that the equipment for the loading, identification, transport and transfer of the waste covers most of the available space inside the integrated collection system. In the present paper a three-dimensional model of an automated storage system designed by the authors for a business partner is presented. The storage system can be used for the following types of waste: plastic and glass containers, aluminium cans, paper, cardboard and WEEE (waste electrical and electronic equipment). Special attention has been given to the transfer subsystem, specific to the storage system, which should be able to transfer different types and shapes of waste. The described virtual model of the automated system for the storage of collected waste will be part of the virtual model of the entire integrated waste collection system, as requested by the beneficiary.

  3. Unified Stochastic Geometry Model for MIMO Cellular Networks with Retransmissions

    KAUST Repository

    Afify, Laila H.

    2016-10-11

    This paper presents a unified mathematical paradigm, based on stochastic geometry, for downlink cellular networks with multiple-input-multiple-output (MIMO) base stations (BSs). The developed paradigm accounts for signal retransmission upon decoding errors, in which the temporal correlation among the signal-to-interference-plus-noise-ratio (SINR) of the original and retransmitted signals is captured. In addition to modeling the effect of retransmission on the network performance, the developed mathematical model presents twofold analysis unification for MIMO cellular networks literature. First, it integrates the tangible decoding error probability and the abstracted (i.e., modulation scheme and receiver type agnostic) outage probability analysis, which are largely disjoint in the literature. Second, it unifies the analysis for different MIMO configurations. The unified MIMO analysis is achieved by abstracting unnecessary information conveyed within the interfering signals by Gaussian signaling approximation along with an equivalent SISO representation for the per-data stream SINR in MIMO cellular networks. We show that the proposed unification simplifies the analysis without sacrificing the model accuracy. To this end, we discuss the diversity-multiplexing tradeoff imposed by different MIMO schemes and shed light on the diversity loss due to the temporal correlation among the SINRs of the original and retransmitted signals. Finally, several design insights are highlighted.

  4. Unified Stochastic Geometry Model for MIMO Cellular Networks with Retransmissions

    KAUST Repository

    Afify, Laila H.; Elsawy, Hesham; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    This paper presents a unified mathematical paradigm, based on stochastic geometry, for downlink cellular networks with multiple-input-multiple-output (MIMO) base stations (BSs). The developed paradigm accounts for signal retransmission upon decoding errors, in which the temporal correlation among the signal-to-interference-plus-noise-ratio (SINR) of the original and retransmitted signals is captured. In addition to modeling the effect of retransmission on the network performance, the developed mathematical model presents twofold analysis unification for MIMO cellular networks literature. First, it integrates the tangible decoding error probability and the abstracted (i.e., modulation scheme and receiver type agnostic) outage probability analysis, which are largely disjoint in the literature. Second, it unifies the analysis for different MIMO configurations. The unified MIMO analysis is achieved by abstracting unnecessary information conveyed within the interfering signals by Gaussian signaling approximation along with an equivalent SISO representation for the per-data stream SINR in MIMO cellular networks. We show that the proposed unification simplifies the analysis without sacrificing the model accuracy. To this end, we discuss the diversity-multiplexing tradeoff imposed by different MIMO schemes and shed light on the diversity loss due to the temporal correlation among the SINRs of the original and retransmitted signals. Finally, several design insights are highlighted.

  5. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Mahnken, A.H.; Kohnen, M.; Steinberg, S.; Wein, B.B.; Guenther, R.W.

    2001-01-01

    Development of software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after the filtering of digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, which were sorted by image quality ranging from one (best) to three (worst), with 10 images for each quality level. Results: Image recognition strongly depended on image quality. In group one 52, and in group two 51, out of 60 vertebral bodies including the sacrum were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; therefore, standardized image quality and enlargement of the training data set are required. (orig.) [de

  6. Referent 3D tumor model at cellular level in radionuclide therapy

    International Nuclear Information System (INIS)

    Spaic, R.; Ilic, R.D.; Petrovic, B.J.

    2002-01-01

    Aim: Conventional internal dosimetry has many limitations because of tumor dose nonuniformity. The best approach for calculating the absorbed dose at the cellular level for different tumors in radionuclide therapy is the Monte Carlo method. The purpose of this study is to introduce a referent 3D tumor model at the cellular level for Monte Carlo simulation studies in radionuclide therapy. Material and Methods: The referent 3D tumor model at the cellular level was defined for the time period when a tumor becomes detectable and therapy can start. In accordance with the tumor growth rate at that moment, it was a sphere with a radius of 10 000 μm. In that tumor there are cells, or clusters of cells, which are randomly distributed spheres. The distribution of cells/clusters of cells can be calculated from histology data, but it was assumed that this distribution is normal, with mean value and standard deviation of 100±50 mm. The second parameter selected to define the referent tumor is the volume density of cells (30%). In this referent tumor there are no necroses. Stroma is defined as the space between spheres, with the same concentration of materials as in the spheres. Results: The referent tumor defined in this way has about 2.2×10^5 randomly distributed cells or clusters of cells. Using this referent 3D tumor model, and for the same concentration of radionuclides (1:100) and energy of beta emitters (1000 keV) homogeneously distributed in labeled cells, the absorbed dose for all cells was calculated. Simulations were done using the FOTELP Monte Carlo code, which was modified for this purpose. Results for the absorbed dose in cells are given as numerical values (1D distribution) and as images (2D or 3D distributions). Conclusion: A geometrical module for Monte Carlo simulation studies can be standardized by introducing a referent 3D tumor model at the cellular level. This referent 3D tumor model gives the most realistic presentation of different tumors at the moment of their detectability. Referent 3D tumor model at

  7. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

    Full Text Available Abstract Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods

  8. A time-use model for the automated vehicle-era

    NARCIS (Netherlands)

    Pudāne, Baiba; Molin, Eric J.E.; Arentze, Theo A.; Maknoon, Yousef; Chorus, Caspar G.

    2018-01-01

    Automated Vehicles (AVs) offer their users a possibility to perform new non-driving activities while being on the way. The effects of this opportunity on travel choices and travel demand have mostly been conceptualised and modelled via a reduced penalty associated with (in-vehicle) travel time. This

  9. A Time-use Model for the Automated Vehicle-era

    NARCIS (Netherlands)

    Pudane, B.; Molin, E.J.E.; Arentze, TA; Maknoon, M.Y.; Chorus, C.G.

    2018-01-01

    Automated Vehicles (AVs) offer their users a possibility to perform new non-driving activities while being on the way. The effects of this opportunity on travel choices and travel demand have mostly been conceptualised and modelled via a reduced penalty associated with (in-vehicle) travel time. This

  10. Modeling chemical systems using cellular automata a textbook and laboratory manual

    CERN Document Server

    Kier, Lemont B; Cheng, Chao-Kun

    2005-01-01

    There are few publications on cellular automata and certainly no direct competition. The book offers insight into a new modeling paradigm and a novel approach to the use of in silico modeling to replace or supplement costly labs.

  11. Effect of Water Flows on Ship Traffic in Narrow Water Channels Based on Cellular Automata

    Directory of Open Access Journals (Sweden)

    Hu Hongtao

    2017-11-01

    Full Text Available In narrow water channels, ship traffic may be affected by water flows and ship interactions. Studying their effects can help maritime authorities to establish appropriate management strategies. In this study, a two-lane cellular automaton model is proposed. Further, the behavior of ship traffic is analyzed by setting different water flow velocities and considering ship interactions. Numerical experiment results show that the ship traffic density-flux relation is significantly different from the results obtained by classical models. Furthermore, due to ship interactions, the ship lane-change rate is influenced by the water flow to a certain degree.
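
    As a rough illustration of the kind of two-lane update such a model uses, the sketch below implements a Nagel-Schreckenberg-style step with a simple lane-change criterion and a water-flow speed offset; the rule details, parameter values and names are assumptions made for illustration, not the rules of the cited model.

```python
import random

# Illustrative two-lane cellular-automaton step for ship traffic on a ring of L cells.
# The lane-change criterion and the water-flow offset are assumed, not from the paper.
L, V_MAX, P_SLOW, FLOW = 100, 4, 0.2, 1   # cells, max speed, slowdown prob., flow speed (cells/step)

def gap_ahead(lane, i):
    """Free cells in front of position i in `lane` (periodic boundary)."""
    for d in range(1, L):
        if lane[(i + d) % L] is not None:
            return d - 1
    return L - 1

def step(lanes):
    # 1) Lane changes, decided on a snapshot of the current configuration:
    #    switch if the current gap is too small and the other lane offers more room.
    snap = [lane[:] for lane in lanes]
    for k in (0, 1):
        for i, v in enumerate(snap[k]):
            if v is None:
                continue
            if (gap_ahead(snap[k], i) < v and snap[1 - k][i] is None
                    and gap_ahead(snap[1 - k], i) > gap_ahead(snap[k], i)):
                lanes[1 - k][i], lanes[k][i] = v, None
    # 2) NaSch-style motion in each lane, advected downstream by the water flow.
    new = [[None] * L for _ in lanes]
    for k, lane in enumerate(lanes):
        for i, v in enumerate(lane):
            if v is None:
                continue
            v = min(v + 1, V_MAX, gap_ahead(lane, i))   # accelerate but keep a safe gap
            if random.random() < P_SLOW:
                v = max(v - 1, 0)                        # random slowdown
            new[k][(i + v + FLOW) % L] = v               # move, shifted by the flow
    return new
```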

  12. Modeling of coupled differential equations for cellular chemical signaling pathways: Implications for assay protocols utilized in cellular engineering.

    Science.gov (United States)

    O'Clock, George D

    2016-08-01

    Cellular engineering involves modification and control of cell properties, and requires an understanding of fundamentals and mechanisms of action for cellular-derived product development. One of the keys to success in cellular engineering involves the quality and validity of results obtained from cell chemical signaling pathway assays. The accuracy of the assay data cannot be verified or assured if the effects of positive feedback, nonlinearities, and interrelationships between cell chemical signaling pathway elements are not understood, modeled, and simulated. Nonlinearities and positive feedback in the cell chemical signaling pathway can produce significant aberrations in assay data collection. Simulating the pathway can reveal potential instability problems that will affect assay results. A simulation, using an electrical analog for the coupled differential equations representing each segment of the pathway, provides an excellent tool for assay validation purposes. With this approach, voltages represent pathway enzyme concentrations, and operational amplifier feedback resistance and input resistance values determine pathway gain and rate constants. The understanding provided by pathway modeling and simulation is strategically important in order to establish experimental controls for assay protocol structure, time frames specified between assays, and assay concentration variation limits, and to ensure accuracy and reproducibility of results.
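
    The sketch below illustrates the kind of coupled differential equations with positive feedback discussed above, integrated numerically with SciPy rather than through the paper's electrical analog; the rate constants and the Hill-type feedback term are illustrative placeholders, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two coupled pathway segments: the downstream enzyme e2 feeds back positively
# onto the production of the upstream enzyme e1. All constants are illustrative.
k1, k2, d1, d2, fb = 1.0, 0.8, 0.5, 0.4, 2.0

def pathway(t, y):
    e1, e2 = y
    de1 = k1 * (1 + fb * e2**2 / (1 + e2**2)) - d1 * e1   # production boosted by e2 (positive feedback)
    de2 = k2 * e1 - d2 * e2                                # e2 driven by e1, first-order decay
    return [de1, de2]

sol = solve_ivp(pathway, (0.0, 50.0), [0.1, 0.0], dense_output=True)
print(sol.y[:, -1])   # late-time concentrations; instability would show up as divergence
```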

  13. Automation of program model developing for complex structure control objects

    International Nuclear Information System (INIS)

    Ivanov, A.P.; Sizova, T.B.; Mikhejkina, N.D.; Sankovskij, G.A.; Tyufyagin, A.N.

    1991-01-01

    A brief description is given of software for the automated development of models: an integrating modular programming system, a program module generator, and a program module library providing thermal-hydraulic calculation of process dynamics in power unit equipment components and simulation of on-line control system operation. Technical recommendations for model development are based on experience in creating concrete models of NPP power units. 8 refs., 1 tab., 4 figs

  14. Mathematical Modeling and Experimental Validation of Nanoemulsion-Based Drug Transport across Cellular Barriers.

    Science.gov (United States)

    Kadakia, Ekta; Shah, Lipa; Amiji, Mansoor M

    2017-07-01

    Nanoemulsions have shown potential in delivering drugs across epithelial and endothelial cell barriers, which express efflux transporters. However, their transport mechanisms are not entirely understood. Our goal was to investigate the cellular permeability of nanoemulsion-encapsulated drugs and apply mathematical modeling to elucidate transport mechanisms and sensitive nanoemulsion attributes. Transport studies were performed in Caco-2 cells, using fish oil nanoemulsions and a model substrate, rhodamine-123. Permeability data was modeled using a semi-mechanistic approach, capturing the following cellular processes: endocytotic uptake of the nanoemulsion, release of rhodamine-123 from the nanoemulsion, efflux and passive permeability of rhodamine-123 in aqueous solution. Nanoemulsions not only improved the permeability of rhodamine-123, but were also less sensitive to efflux transporters. The model captured bidirectional permeability results and identified sensitive processes, such as the release of the nanoemulsion-encapsulated drug and cellular uptake of the nanoemulsion. Mathematical description of cellular processes improved our understanding of transport mechanisms; for example, nanoemulsions do not inhibit efflux to improve drug permeability. Instead, their endocytotic uptake results in higher intracellular drug concentrations, thereby increasing the concentration gradient and transcellular permeability across biological barriers. Modeling results indicated that optimizing nanoemulsion attributes such as the droplet size and intracellular drug release rate may further improve drug permeability.

  15. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    OpenAIRE

    K. Kitamura; N. Kochi; S. Kaneko

    2012-01-01

    In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, the individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature-based registration procedure. Our proposed method does not re...

  16. A model for automation of radioactive dose control

    International Nuclear Information System (INIS)

    Ribeiro, Carlos Henrique Calazans; Zambon, Jose Waldir; Bitelli, Ricardo; Honaiser, Eduardo Henrique Rangel

    2009-01-01

    The paper presents a proposal for the automation of the personnel dose control system to be used in nuclear medicine environments. The model has considered the standards and rules of the National Commission of Nuclear Energy (CNEN) and of the Health Ministry. The advantage of the model is robust management of the integrated dose and of technicians' qualification status. The software platform selected was Lotus Notes, and an analysis of the advantages and disadvantages of using this platform is also presented. (author)

  17. A model for automation of radioactive dose control

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Carlos Henrique Calazans; Zambon, Jose Waldir; Bitelli, Ricardo; Honaiser, Eduardo Henrique Rangel [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Sao Paulo, SP (Brazil)], e-mail: calazans@ctmsp.mar.mil.br, e-mail: zambon@ctmsp.mar.mil.br, e-mail: bitelli@ctmsp.mar.mil.br, e-mail: honaiser@ctmsp.mar.mil.br

    2009-07-01

    The paper presents a proposal for the automation of the personnel dose control system to be used in nuclear medicine environments. The model has considered the standards and rules of the National Commission of Nuclear Energy (CNEN) and of the Health Ministry. The advantage of the model is robust management of the integrated dose and of technicians' qualification status. The software platform selected was Lotus Notes, and an analysis of the advantages and disadvantages of using this platform is also presented. (author)

  18. A Fluid Model for Performance Analysis in Cellular Networks

    Directory of Open Access Journals (Sweden)

    Coupechoux Marceau

    2010-01-01

    Full Text Available We propose a new framework to study the performance of cellular networks using a fluid model and we derive from this model analytical formulas for interference, outage probability, and spatial outage probability. The key idea of the fluid model is to consider the discrete base station (BS entities as a continuum of transmitters that are spatially distributed in the network. This model allows us to obtain simple analytical expressions to reveal main characteristics of the network. In this paper, we focus on the downlink other-cell interference factor (OCIF, which is defined for a given user as the ratio of its outer cell received power to its inner cell received power. A closed-form formula of the OCIF is provided in this paper. From this formula, we are able to obtain the global outage probability as well as the spatial outage probability, which depends on the location of a mobile station (MS initiating a new call. Our analytical results are compared to Monte Carlo simulations performed in a traditional hexagonal network. Furthermore, we demonstrate an application of the outage probability related to cell breathing and densification of cellular networks.
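
    Taking the quoted definition of the OCIF at face value, its relation to a user's SINR can be written directly; this is a hedged restatement of the definition (ignoring intra-cell orthogonality details), not the paper's closed-form fluid-model formula:

```latex
% OCIF (other-cell interference factor) for a given user, as defined in the abstract,
% and its direct relation to that user's SINR (noise power N, inner-cell power P_inner):
\[
  f \;=\; \frac{P_{\mathrm{outer}}}{P_{\mathrm{inner}}},
  \qquad
  \mathrm{SINR}
  \;=\; \frac{P_{\mathrm{inner}}}{P_{\mathrm{outer}} + N}
  \;=\; \frac{1}{f + N / P_{\mathrm{inner}}}.
\]
```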

  19. Integrating cellular metabolism into a multiscale whole-body model.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

    Full Text Available Cellular metabolism continuously processes an enormous range of external compounds into endogenous metabolites and is as such a key element in human physiology. The multifaceted physiological role of the metabolic network fulfilling the catalytic conversions can only be fully understood from a whole-body perspective where the causal interplay of the metabolic states of individual cells, the surrounding tissue and the whole organism are simultaneously considered. We here present an approach relying on dynamic flux balance analysis that allows the integration of metabolic networks at the cellular scale into standardized physiologically-based pharmacokinetic models at the whole-body level. To evaluate our approach we integrated a genome-scale network reconstruction of a human hepatocyte into the liver tissue of a physiologically-based pharmacokinetic model of a human adult. The resulting multiscale model was used to investigate hyperuricemia therapy, ammonia detoxification and paracetamol-induced toxication at a systems level. The specific models simultaneously integrate multiple layers of biological organization and offer mechanistic insights into pathology and medication. The approach presented may in future support a mechanistic understanding in diagnostics and drug development.

  20. Advances in automated valuation modeling AVM after the non-agency mortgage crisis

    CERN Document Server

    Kauko, Tom

    2017-01-01

    This book addresses several problems related to automated valuation methodologies (AVM). Following the non-agency mortgage crisis, it offers a variety of approaches to improve the efficiency and quality of an automated valuation methodology (AVM), dealing with emerging problems and different contexts. Spatial issues, the evolution of AVM standards, multilevel models, fuzzy and rough set applications, and quantitative methods to define comparables are just some of the topics discussed.

  1. Automated cost modeling for coal combustion systems

    International Nuclear Information System (INIS)

    Rowe, R.M.; Anast, K.R.

    1991-01-01

    This paper reports on cost information developed at the AMAX R and D Center for coal-water slurry production, implemented in an automated spreadsheet (Lotus 123) for personal computer use. The spreadsheet format allows the user to evaluate the impacts of various process options, coal feedstock characteristics, fuel characteristics, plant location sites, and plant sizes on fuel cost. Model flexibility reduces the time and labor required to determine fuel costs and provides a basis to compare fuels manufactured by different processes. The model input includes coal characteristics, plant flowsheet definition, plant size, and market location. Based on these inputs, selected unit operations are chosen for coal processing

  2. Towards Massive Machine Type Cellular Communications

    OpenAIRE

    Dawy, Zaher; Saad, Walid; Ghosh, Arunabha; Andrews, Jeffrey G.; Yaacoub, Elias

    2015-01-01

    Cellular networks have been engineered and optimized to carry ever-increasing amounts of mobile data, but over the last few years, a new class of applications based on machine-centric communications has begun to emerge. Automated devices such as sensors, tracking devices, and meters - often referred to as machine-to-machine (M2M) or machine-type communications (MTC) - introduce an attractive revenue stream for mobile network operators, if a massive number of them can be efficiently support...

  3. Modeling cellular networks in fading environments with dominant specular components

    KAUST Repository

    AlAmmouri, Ahmad

    2016-07-26

    Stochastic geometry (SG) has been widely accepted as a fundamental tool for modeling and analyzing cellular networks. However, the fading models used with SG analysis are mainly confined to the simplistic Rayleigh fading, which is extended to the Nakagami-m fading in some special cases. However, neither the Rayleigh nor the Nakagami-m accounts for dominant specular components (DSCs) which may appear in realistic fading channels. In this paper, we present a tractable model for cellular networks with generalized two-ray (GTR) fading channel. The GTR fading explicitly accounts for two DSCs in addition to the diffuse components and offers high flexibility to capture diverse fading channels that appear in realistic outdoor/indoor wireless communication scenarios. It also encompasses the famous Rayleigh and Rician fading as special cases. To this end, the prominent effect of DSCs is highlighted in terms of average spectral efficiency. © 2016 IEEE.

  4. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.
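
    For readers unfamiliar with the statistical approach mentioned above, the sketch below shows a quantile (median) regression of a take-over metric on the two manipulated factors using statsmodels; the data frame, column names and formula are assumptions made for illustration, not the study's data or model specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative median regression of time-to-collision on the two varied factors.
# The data are synthetic stand-ins, not the experiment's measurements.
df = pd.DataFrame({
    "ttc":         [2.1, 3.4, 1.8, 4.0, 2.9, 3.6, 1.5, 4.2],
    "uncertainty": [0,   1,   0,   1,   0,   1,   0,   1],   # 0 = no info shown, 1 = uncertainty shown
    "reliability": [0,   0,   1,   1,   0,   0,   1,   1],   # 0 = low, 1 = high
})
model = smf.quantreg("ttc ~ uncertainty + reliability", df)
print(model.fit(q=0.5).summary())   # repeat with other q values for other quantiles
```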

  5. An automation model of Effluent Treatment Plant

    Directory of Open Access Journals (Sweden)

    Luiz Alberto Oliveira Lima Roque

    2012-07-01

    on the conservation of water resources, this paper aims to propose an automation model of an Effluent Treatment Plant, using Ladder programming language and supervisory systems.

  6. Modeling and Analysis of Cellular Networks using Stochastic Geometry: A Tutorial

    KAUST Repository

    Elsawy, Hesham; Salem, Ahmed Sultan; Alouini, Mohamed-Slim; Win, Moe Z.

    2016-01-01

    This paper presents a tutorial on stochastic geometry (SG) based analysis for cellular networks. This tutorial is distinguished by its depth with respect to wireless communication details and its focus on cellular networks. The paper starts by modeling and analyzing the baseband interference in a baseline single-tier downlink cellular network with single antenna base stations and universal frequency reuse. Then, it characterizes signal-to-interference-plus-noise-ratio (SINR) and its related performance metrics. In particular, a unified approach to conduct error probability, outage probability, and transmission rate analysis is presented. Although the main focus of the paper is on cellular networks, the presented unified approach applies for other types of wireless networks that impose interference protection around receivers. The paper then extends the unified approach to capture cellular network characteristics (e.g., frequency reuse, multiple antenna, power control, etc.). It also presents numerical examples associated with demonstrations and discussions. To this end, the paper highlights the state-of-the-art research and points out future research directions.
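
    A minimal Monte Carlo sketch of the baseline setup described above (single-tier Poisson point process of base stations, nearest-BS association, Rayleigh fading, universal frequency reuse) is given below; the density, path-loss exponent, noise power and SINR threshold are illustrative values, not those used in the tutorial.

```python
import numpy as np

rng = np.random.default_rng(0)
LAM, ALPHA, THETA, NOISE, RUNS = 1e-5, 4.0, 1.0, 1e-13, 2000   # BS/m^2, path-loss exp., SINR threshold, noise

def one_run(radius=2000.0):
    """Drop a PPP of BSs in a disc around the typical user at the origin and test coverage."""
    n = rng.poisson(LAM * np.pi * radius**2)          # Poisson number of BSs
    if n == 0:
        return False
    r = radius * np.sqrt(rng.random(n))               # uniform points in the disc (distances to origin)
    h = rng.exponential(1.0, n)                        # Rayleigh fading -> exponential power gains
    p = h * r**(-ALPHA)                                # received powers (unit transmit power)
    serve = np.argmin(r)                               # nearest-BS association
    sinr = p[serve] / (p.sum() - p[serve] + NOISE)
    return sinr > THETA

coverage = np.mean([one_run() for _ in range(RUNS)])
print(f"coverage probability ~ {coverage:.3f}")
```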

  7. Modeling and Analysis of Cellular Networks using Stochastic Geometry: A Tutorial

    KAUST Repository

    Elsawy, Hesham

    2016-11-03

    This paper presents a tutorial on stochastic geometry (SG) based analysis for cellular networks. This tutorial is distinguished by its depth with respect to wireless communication details and its focus on cellular networks. The paper starts by modeling and analyzing the baseband interference in a baseline single-tier downlink cellular network with single antenna base stations and universal frequency reuse. Then, it characterizes signal-to-interference-plus-noise-ratio (SINR) and its related performance metrics. In particular, a unified approach to conduct error probability, outage probability, and transmission rate analysis is presented. Although the main focus of the paper is on cellular networks, the presented unified approach applies for other types of wireless networks that impose interference protection around receivers. The paper then extends the unified approach to capture cellular network characteristics (e.g., frequency reuse, multiple antenna, power control, etc.). It also presents numerical examples associated with demonstrations and discussions. To this end, the paper highlights the state-of-the-art research and points out future research directions.

  8. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  9. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  10. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  11. An agent-based model of cellular dynamics and circadian variability in human endotoxemia.

    Directory of Open Access Journals (Sweden)

    Tung T Nguyen

    Full Text Available As cellular variability and circadian rhythmicity play critical roles in immune and inflammatory responses, we present in this study an agent-based model of human endotoxemia to examine the interplay between circadian controls, cellular variability and stochastic dynamics of inflammatory cytokines. The model is qualitatively validated by its ability to reproduce circadian dynamics of inflammatory mediators and critical inflammatory responses after endotoxin administration in vivo. Novel computational concepts are proposed to characterize the cellular variability and synchronization of inflammatory cytokines in a population of heterogeneous leukocytes. Our results suggest that there is a decrease in cell-to-cell variability of inflammatory cytokines while their synchronization is increased after endotoxin challenge. Model parameters that are responsible for IκB production stimulated by NFκB activation and for the production of anti-inflammatory cytokines have large impacts on system behaviors. Additionally, examining time-dependent systemic responses revealed that the system is least vulnerable to endotoxin in the early morning and most vulnerable around midnight. Although much remains to be explored, proposed computational concepts and the model we have pioneered will provide important insights for future investigations and extensions, especially for single-cell studies to discover how cellular variability contributes to clinical implications.

  12. The adaptive cruise control vehicles in the cellular automata model

    International Nuclear Information System (INIS)

    Jiang Rui; Wu Qingsong

    2006-01-01

    This Letter presents a cellular automata model in which adaptive cruise control (ACC) vehicles are modelled. In this model, the constant time headway policy is adopted. The fundamental diagram is presented. The simulation results are in good agreement with the analytical ones. The mixture of ACC vehicles with manually driven vehicles is investigated. It is shown that with the introduction of ACC vehicles, traffic jams can be suppressed

  13. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of the system.

  14. An Overview of the Automated Dispatch Controller Algorithms in the System Advisor Model (SAM)

    Energy Technology Data Exchange (ETDEWEB)

    DiOrio, Nicholas A [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-11-22

    Three automatic dispatch modes have been added to the battery model within the System Advisor Model. These controllers have been developed to perform peak shaving in an automated fashion, providing users with a way to see the benefit of reduced demand charges without manually programming a complicated dispatch control. A flexible input option allows more advanced interaction with the automated controller. This document describes the algorithms in detail and presents brief results on their use and limitations.
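
    To illustrate what an automated peak-shaving dispatch does, the sketch below implements a simple threshold heuristic; this is not SAM's actual controller algorithm, and the target demand, battery limits and efficiency are assumed values.

```python
# Illustrative peak-shaving heuristic: discharge the battery whenever the load exceeds
# a target demand, recharge when it is below. NOT SAM's dispatch algorithm; all values assumed.
def peak_shave(load_kw, target_kw=80.0, capacity_kwh=100.0, power_kw=50.0, dt_h=1.0, eff=0.95):
    soc = capacity_kwh / 2.0            # start half-full
    grid = []
    for load in load_kw:
        if load > target_kw:                                        # discharge to shave the peak
            p = min(load - target_kw, power_kw, soc * eff / dt_h)
            soc -= p * dt_h / eff
        else:                                                       # recharge using the headroom below target
            p = -min(target_kw - load, power_kw, (capacity_kwh - soc) / (eff * dt_h))
            soc += -p * eff * dt_h
        grid.append(load - p)                                       # power drawn from the grid
    return grid

print(peak_shave([60, 70, 95, 120, 90, 50]))
```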

  15. Kinetic theory approach to modeling of cellular repair mechanisms under genome stress.

    Directory of Open Access Journals (Sweden)

    Jinpeng Qi

    Full Text Available Under acute perturbations from the outer environment, a normal cell can trigger a cellular self-defense mechanism in response to genome stress. To further investigate the kinetics of the cellular self-repair process at the single-cell level, a model of DNA damage generation and repair under acute ion radiation (IR) is proposed, using the mathematical framework of the kinetic theory of active particles (KTAP). Firstly, we focus on illustrating the profile of the Cellular Repair System (CRS), instituted by two sub-populations, each of which is made up of active particles with different discrete states. Then, we implement the mathematical framework of the cellular self-repair mechanism, and illustrate the dynamic processes of Double Strand Break (DSB) and Repair Protein (RP) generation, DSB-protein complex (DSBC) synthesis, and toxin accumulation. Finally, we roughly analyze the capability of the cellular self-repair mechanism, the cellular activity of transferring DNA damage, and genome stability, especially the different fates of a certain cell before and after the time thresholds of IR perturbations that a cell can tolerate maximally under different IR perturbation circumstances.

  16. Kinetic theory approach to modeling of cellular repair mechanisms under genome stress.

    Science.gov (United States)

    Qi, Jinpeng; Ding, Yongsheng; Zhu, Ying; Wu, Yizhi

    2011-01-01

    Under acute perturbations from the outer environment, a normal cell can trigger a cellular self-defense mechanism in response to genome stress. To further investigate the kinetics of the cellular self-repair process at the single-cell level, a model of DNA damage generation and repair under acute ion radiation (IR) is proposed, using the mathematical framework of the kinetic theory of active particles (KTAP). Firstly, we focus on illustrating the profile of the Cellular Repair System (CRS), instituted by two sub-populations, each of which is made up of active particles with different discrete states. Then, we implement the mathematical framework of the cellular self-repair mechanism, and illustrate the dynamic processes of Double Strand Break (DSB) and Repair Protein (RP) generation, DSB-protein complex (DSBC) synthesis, and toxin accumulation. Finally, we roughly analyze the capability of the cellular self-repair mechanism, the cellular activity of transferring DNA damage, and genome stability, especially the different fates of a certain cell before and after the time thresholds of IR perturbations that a cell can tolerate maximally under different IR perturbation circumstances.

  17. A cellular automata model of traffic flow with variable probability of randomization

    International Nuclear Information System (INIS)

    Zheng Wei-Fan; Zhang Ji-Ye

    2015-01-01

    Research on the stochastic behavior of traffic flow is important for understanding the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with a variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of the vehicles ahead, and his decision-making process is related to this potential. Compared with the traditional cellular automata model, this modeling is more suitable for the driver's random decision-making process, which is based on the vehicle and traffic situations ahead in actual traffic. From the improved model, the fundamental diagram (flow–density relationship) is obtained, and detailed high-density traffic phenomena are reproduced through numerical simulation. (paper)
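
    The sketch below shows a single-lane Nagel-Schreckenberg-type update in which the randomization probability varies with the headway, as a stand-in for the interactional potential described above; the functional form of p(gap) and all parameter values are assumptions made for illustration.

```python
import random

# Illustrative NaSch-type update with a headway-dependent randomization probability.
L, V_MAX = 200, 5
P_MIN, P_MAX = 0.1, 0.5

def p_random(gap):
    """Assumed stand-in for the interactional potential: short headways raise
    the slowdown probability, long headways lower it."""
    return P_MAX - (P_MAX - P_MIN) * min(gap, V_MAX) / V_MAX

def step(pos, vel):
    """One update of all vehicles; `pos` holds distinct cell indices on a ring of L cells."""
    order = sorted(range(len(pos)), key=lambda k: pos[k])
    new_pos, new_vel = pos[:], vel[:]
    for idx, k in enumerate(order):
        ahead = pos[order[(idx + 1) % len(order)]]
        gap = (ahead - pos[k] - 1) % L
        v = min(vel[k] + 1, V_MAX, gap)          # accelerate, respect the gap
        if random.random() < p_random(gap):      # variable randomization step
            v = max(v - 1, 0)
        new_vel[k], new_pos[k] = v, (pos[k] + v) % L
    return new_pos, new_vel
```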

  18. Radio Channel Modelling for UAV Communication over Cellular Networks

    DEFF Research Database (Denmark)

    Amorim, Rafhael Medeiros de; Nguyen, Huan Cong; Mogensen, Preben Elgaard

    2017-01-01

    The main goal of this paper is to obtain models for path loss exponents and shadowing for the radio channel between airborne Unmanned Aerial Vehicles (UAVs) and cellular networks. In this pursuit, field measurements were conducted in live LTE networks at the 800 MHz frequency band, using a commer...

  19. Modeling take-over performance in level 3 conditionally automated vehicles.

    Science.gov (United States)

    Gold, Christian; Happee, Riender; Bengler, Klaus

    2017-11-28

    Taking over vehicle control from a Level 3 conditionally automated vehicle can be a demanding task for a driver. The take-over determines the controllability of automated vehicle functions and thereby also traffic safety. This paper presents models predicting the main take-over performance variables take-over time, minimum time-to-collision, brake application and crash probability. These variables are considered in relation to the situational and driver-related factors time-budget, traffic density, non-driving-related task, repetition, the current lane and driver's age. Regression models were developed using 753 take-over situations recorded in a series of driving simulator experiments. The models were validated with data from five other driving simulator experiments of mostly unrelated authors with another 729 take-over situations. The models accurately captured take-over time, time-to-collision and crash probability, and moderately predicted the brake application. Especially the time-budget, traffic density and the repetition strongly influenced the take-over performance, while the non-driving-related tasks, the lane and drivers' age explained a minor portion of the variance in the take-over performances. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters will review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins that take part in biochemical reactions driving cellular processes. When trying to describe such biological processes, the traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depend upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...

  1. Three-dimensional cellular automata as a model of a seismic fault

    International Nuclear Information System (INIS)

    Gálvez, G; Muñoz, A

    2017-01-01

    The Earth's crust is broken into a series of plates whose borders are the seismic fault lines, and it is there that most earthquakes occur. This plate system can in principle be described by a set of nonlinear coupled equations describing the motion of the plates, their stresses, strains and other characteristics. Such a system of equations is very difficult to solve, and the nonlinear parts lead to chaotic behavior, which is not predictable. In 1989, Bak and Tang presented an earthquake model based on the sandpile cellular automaton. The model, though simple, provides results similar to those observed in actual earthquakes. In this work a cellular automaton in three dimensions is proposed as a better model to approximate a seismic fault. It is noted that the three-dimensional model reproduces properties similar to those observed in real seismicity, especially the Gutenberg-Richter law. (paper)
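
    A minimal sketch of a Bak-Tang-style sandpile automaton extended to three dimensions is given below; the lattice size and drive are illustrative, and the heavy-tailed avalanche-size distribution such rules produce is what motivates the Gutenberg-Richter analogy.

```python
import numpy as np

rng = np.random.default_rng(1)
N, Z_CRIT = 16, 6                      # lattice size and toppling threshold (2 * dimensions)
grid = rng.integers(0, Z_CRIT, size=(N, N, N))

def drive_and_relax(grid):
    """Add one grain at a random site, then topple until stable.
    Returns the avalanche size (number of topplings)."""
    i, j, k = rng.integers(0, N, size=3)
    grid[i, j, k] += 1
    topplings = 0
    while True:
        over = np.argwhere(grid >= Z_CRIT)
        if len(over) == 0:
            return topplings
        for i, j, k in over:
            grid[i, j, k] -= Z_CRIT
            topplings += 1
            for di, dj, dk in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
                ni, nj, nk = i + di, j + dj, k + dk
                if 0 <= ni < N and 0 <= nj < N and 0 <= nk < N:
                    grid[ni, nj, nk] += 1    # grains leaving the lattice are lost (open boundaries)

sizes = [drive_and_relax(grid) for _ in range(2000)]
print(max(sizes), np.mean(sizes))   # a heavy-tailed avalanche-size distribution is expected
```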

  2. A coarse-grained model for the simulations of biomolecular interactions in cellular environments

    International Nuclear Information System (INIS)

    Xie, Zhong-Ru; Chen, Jiawen; Wu, Yinghao

    2014-01-01

    The interactions of bio-molecules constitute the key steps of cellular functions. However, in vivo binding properties differ significantly from their in vitro measurements due to the heterogeneity of cellular environments. Here we introduce a coarse-grained model based on rigid-body representation to study how factors such as cellular crowding and membrane confinement affect molecular binding. The macroscopic parameters such as the equilibrium constant and the kinetic rate constant are calibrated by adjusting the microscopic coefficients used in the numerical simulations. By changing these model parameters that are experimentally approachable, we are able to study the kinetic and thermodynamic properties of molecular binding, as well as the effects caused by specific cellular environments. We investigate the volumetric effects of crowded intracellular space on bio-molecular diffusion and diffusion-limited reactions. Furthermore, the binding constants of membrane proteins are currently difficult to measure. We provide quantitative estimations about how the binding of membrane proteins deviates from soluble proteins under different degrees of membrane confinements. The simulation results provide biological insights to the functions of membrane receptors on cell surfaces. Overall, our studies establish a connection between the details of molecular interactions and the heterogeneity of cellular environments

  3. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    KAUST Repository

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, Christian; Jourdain, P.; Magistretti, Pierre J.

    2016-01-01

    parallelization and automation processes, represents an appealing imaging modality to both identify original cellular biomarkers of diseases as well to explore the underlying pathophysiological processes.

  4. On the derivation of approximations to cellular automata models and the assumption of independence.

    Science.gov (United States)

    Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V

    2014-07-01

    Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes, however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model and the assumption of independence between the state of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is completely based on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Fire and Heat Spreading Model Based on Cellular Automata Theory

    Science.gov (United States)

    Samartsev, A. A.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.

    2018-05-01

    The distinctive feature of the proposed model of fire and heat spreading in premises is its reduced computational complexity, achieved by using the theory of cellular automata with probabilistic rules of behavior. The possibilities and prospects of using this model in practice are noted. The proposed model has a simple mechanism of integration with agent-based evacuation models. The joint use of these models could improve floor plans and reduce the time of evacuation from premises during fires.
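
    The sketch below illustrates a probabilistic cellular-automaton fire spread of the general kind described; the cell states, spread probability and neighbourhood are illustrative assumptions, not the rules of the cited model.

```python
import numpy as np

rng = np.random.default_rng(2)
EMPTY, FUEL, BURNING, BURNT = 0, 1, 2, 3
P_SPREAD = 0.45                      # assumed probability that fire jumps to a fuel neighbour

def step(room):
    """One synchronous update: burning cells ignite 4-neighbours with
    probability P_SPREAD, then burn out."""
    new = room.copy()
    for i, j in np.argwhere(room == BURNING):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < room.shape[0] and 0 <= nj < room.shape[1]:
                if room[ni, nj] == FUEL and rng.random() < P_SPREAD:
                    new[ni, nj] = BURNING
        new[i, j] = BURNT
    return new

room = np.full((40, 40), FUEL)
room[20, 20] = BURNING               # ignition point
for _ in range(60):
    room = step(room)
print((room == BURNT).sum(), "cells burnt")
```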

  6. A cellular automata intraurban model with prices and income-differentiated actors

    NARCIS (Netherlands)

    Furtado, B.A.; Ettema, D.F.; Ruiz, R.M.; Hurkens, J.; Delden, H. van

    2012-01-01

    This paper presents an intraurban cellular automata model that is an extension to White and Engelen's pioneering model. The paper's main contribution is to distinguish between agglomerative effects, determined by the attraction of the neighbourhood, and disagglomerative effects, driven by land

  7. A Cellular Automata-based Model for Simulating Restitution Property in a Single Heart Cell.

    Science.gov (United States)

    Sabzpoushan, Seyed Hojjat; Pourhasanzade, Fateme

    2011-01-01

    Ventricular fibrillation is the cause of most sudden mortalities. Restitution is one of the specific properties of the ventricular cell. Recent findings have clearly demonstrated the correlation between the slope of the restitution curve and ventricular fibrillation; modeling cellular restitution therefore gains high importance. A cellular automaton is a powerful tool for simulating complex phenomena in a simple language. A cellular automaton is a lattice of cells where the behavior of each cell is determined by the behavior of its neighboring cells as well as the automaton rule. In this paper, a simple model is depicted for simulating the property of restitution in a single cardiac cell using cellular automata. First, two state variables, action potential and recovery, are introduced in the automaton model. Second, the automaton rule is determined, and the recovery variable is then defined in such a way that restitution is developed. In order to evaluate the proposed model, the restitution curve generated in our study is compared with restitution curves from the experimental findings of valid sources. Our findings indicate that the presented model is not only capable of simulating restitution in a cardiac cell, but also possesses the capability of regulating the restitution curve.
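
    As a point of comparison for the automaton's behaviour, the sketch below iterates a standard exponential APD-restitution mapping to trace a restitution curve; the constants are illustrative, and the cited model realises this behaviour through automaton rules rather than an explicit formula.

```python
import numpy as np

# Iterate a common exponential restitution mapping APD_{n+1} = f(DI_n),
# with DI_n = BCL - APD_n. Constants are illustrative placeholders.
A, B, TAU = 300.0, 150.0, 60.0        # ms

def next_apd(di):
    return A - B * np.exp(-di / TAU)

def restitution_point(bcl, n_beats=200):
    """Pace at cycle length `bcl` until (approximately) steady state."""
    apd = 200.0
    for _ in range(n_beats):
        di = max(bcl - apd, 1.0)
        apd = next_apd(di)
    return di, apd

for bcl in (1000, 600, 400, 300):
    di, apd = restitution_point(bcl)
    print(f"BCL={bcl} ms  DI={di:6.1f} ms  APD={apd:6.1f} ms")
```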

  8. Modeling virtualized downlink cellular networks with ultra-dense small cells

    KAUST Repository

    Ibrahim, Hazem

    2015-09-11

    The unrelenting increase in the mobile users' populations and traffic demand drives cellular network operators to densify their infrastructure. Network densification increases the spatial frequency reuse efficiency while maintaining the signal-to-interference-plus-noise-ratio (SINR) performance, hence increases the spatial spectral efficiency and improves the overall network performance. However, control signaling in such dense networks consumes considerable bandwidth and limits the densification gain. Radio access network (RAN) virtualization via control plane (C-plane) and user plane (U-plane) splitting has been recently proposed to lighten the control signaling burden and improve the network throughput. In this paper, we present a tractable analytical model for virtualized downlink cellular networks, using tools from stochastic geometry. We then apply the developed modeling framework to obtain design insights for virtualized RANs and quantify the associated performance improvement. © 2015 IEEE.

  9. Complex Automata: Multi-scale Modeling with Coupled Cellular Automata

    NARCIS (Netherlands)

    Hoekstra, A.G.; Caiazzo, A.; Lorenz, E.; Falcone, J.-L.; Chopard, B.; Hoekstra, A.G.; Kroc, J.; Sloot, P.M.A.

    2010-01-01

    Cellular Automata (CA) are generally acknowledged to be a powerful way to describe and model natural phenomena [1-3]. There are even tempting claims that nature itself is one big (quantum) information processing system, e.g. [4], and that CA may actually be nature’s way to do this processing [5-7].

  10. Uniform and localized corrosion modelling by means of probabilistic cellular automata

    International Nuclear Information System (INIS)

    Perez-Brokate, Cristian

    2016-01-01

    Numerical modelling is a complementary tool for corrosion prediction. The objective of this work is to develop a corrosion model by means of a probabilistic cellular automata approach at a mesoscopic scale. In this work, we study the morphological evolution and kinetics of corrosion. This model couples electrochemical oxidation and reduction reactions. Regarding kinetics, cellular automata models are able to describe current as a function of the applied potential for a redox reaction on an inert electrode. The inclusion of probabilities allows the description of the stochastic nature of anodic and cathodic reactions. Corrosion morphology has been studied in different contexts: generalised corrosion, pitting corrosion and corrosion in an occluded environment. A general tendency of two regimes is found: a first regime of uniform corrosion, where the anodic and cathodic reactions occur homogeneously over the surface, and a second regime of localized corrosion, when there is a spatial separation of anodic and cathodic zones, with an increase of the anodic reaction rate. (author) [fr

  11. MODELING OF FUTURE LAND COVER LAND USE CHANGE IN NORTH CAROLINA USING MARKOV CHAIN AND CELLULAR AUTOMATA MODEL

    OpenAIRE

    Mohammad Sayemuzzaman; Manoj K. Jha

    2014-01-01

    Statewide variation in topographic features makes North Carolina attractive to hydro-climatologists, yet no modeling study has been found that predicts future Land Cover Land Use (LCLU) change for the whole of North Carolina. In this study, satellite-derived land cover maps of North Carolina for the years 1992, 2001 and 2006 were integrated within the framework of the Markov-Cellular Automata (Markov-CA) model, which combines the Markov chain and Cellular Automata (CA) techniques. A Multi-Criteria Evaluation (MCE) was ...
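
    The sketch below illustrates only the Markov-chain half of a Markov-CA workflow: class areas are projected forward with a transition-probability matrix, while the CA step (spatial allocation of the projected areas) is omitted. The classes, areas and probabilities are illustrative, not North Carolina values.

```python
import numpy as np

# Markov-chain projection of land-cover class areas. All numbers are illustrative.
classes = ["forest", "agriculture", "urban", "water"]
area_2006 = np.array([0.55, 0.25, 0.15, 0.05])          # fractions of the study area

P = np.array([                                           # row: from-class, column: to-class
    [0.92, 0.04, 0.04, 0.00],
    [0.03, 0.90, 0.07, 0.00],
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
])

area_2011 = area_2006 @ P                                # one transition interval
area_2016 = area_2011 @ P                                # a second interval
for c, a in zip(classes, area_2016):
    print(f"{c:12s} {a:.3f}")
```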

  12. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    Science.gov (United States)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that there is significant improvement in modeling urban land use dynamics at finer spatial resolutions. Geo-computational models such as cellular automata and agent-based models have provided clear evidence regarding the quantification of urban growth patterns within the urban boundary. In recent studies, socio-economic factors such as demography, education rate, household density, parcel price of the current year, and distance to roads, schools, hospitals, commercial centers and police stations are considered to be the major factors influencing the Land Use Land Cover (LULC) pattern of a city. These factors have a unidirectional relationship to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with a generic model known as an agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from a field survey, the Census of India, and the Directorate of Economic Census, Uttarakhand, India. A 3×3 simulation window is used to consider the impact on LULC. Cellular automata model results are examined to identify hot-spot areas within the urban area, while the agent-based model uses a logistic regression approach to identify the correlation of each factor with LULC and to classify the available area into low-density, medium-density or high-density residential areas or commercial areas. In the modeling phase, transition rules, the neighborhood effect and cell change factors are used to improve the representation of built-up classes. Significant improvement is observed in the built-up classes, from 84% to 89%. Furthermore, after incorporating the agent-based model with the cellular automata model, the accuracy improved from 89% to 94% for three urban classes, i.e. low density, medium density and commercial classes
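
    The sketch below illustrates the logistic-regression step described above, estimating how assumed socio-economic predictors relate to a cell becoming built-up, using scikit-learn; the feature names and synthetic data are assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 500
# Illustrative socio-economic predictors per cell (names assumed, data synthetic).
X = np.column_stack([
    rng.normal(0, 1, n),     # household density
    rng.normal(0, 1, n),     # parcel price
    rng.exponential(1, n),   # distance to road
])
# Synthetic target: 1 = cell becomes built-up, 0 = stays non-urban.
logit = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 1.5 * X[:, 2]
y = (logit + rng.logistic(0, 1, n) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
print(dict(zip(["hh_density", "parcel_price", "dist_road"], clf.coef_[0])))
```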

  13. Modeling cellular networks in fading environments with dominant specular components

    KAUST Repository

    Alammouri, Ahmad; Elsawy, Hesham; Salem, Ahmed Sultan; Di Renzo, Marco; Alouini, Mohamed-Slim

    2016-01-01

    to the Nakagami-m fading in some special cases. However, neither the Rayleigh nor the Nakagami-m accounts for dominant specular components (DSCs) which may appear in realistic fading channels. In this paper, we present a tractable model for cellular networks

  14. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  15. Automation in high-content flow cytometry screening.

    Science.gov (United States)

    Naumann, U; Wand, M P

    2009-09-01

    High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.

  16. Graph Cellular Automata with Relation-Based Neighbourhoods of Cells for Complex Systems Modelling: A Case of Traffic Simulation

    Directory of Open Access Journals (Sweden)

    Krzysztof Małecki

    2017-12-01

    Full Text Available A complex system is a set of mutually interacting elements for which it is possible to construct a mathematical model. This article focuses on cellular automata theory and graph theory in order to compare various types of cellular automata and to analyse applications of graph structures together with cellular automata. It proposes a graph cellular automaton with a variable configuration of cells and relation-based neighbourhoods (r–GCA). The developed mechanism enables the modelling of phenomena found in complex systems (e.g., transport networks, urban logistics, social networks), taking into account the interaction between the existing objects. As an implementation example, the modelling of moving vehicles has been carried out, and r–GCA was compared to other cellular automata models that simulate road traffic and are used in the computer simulation process.

  17. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  18. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, and cannot be used for large-scale, complex 3D simulation. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on a shared-memory SMP system using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving, and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).

  19. Load-aware modeling for uplink cellular networks in a multi-channel environment

    KAUST Repository

    Alammouri, Ahmad; Elsawy, Hesham; Alouini, Mohamed-Slim

    2014-01-01

    We exploit tools from stochastic geometry to develop a tractable analytical approach for modeling uplink cellular networks. The developed model is load aware and accounts for per-user power control as well as the limited transmit power constraint

  20. Model of informational system for freight insurance automation based on digital signature

    OpenAIRE

    Maxim E. SLOBODYANYUK

    2009-01-01

    The article considers a model of an information system for freight insurance automation based on digital signatures; it presents the architecture, a macro flowchart of the information flow in the model, and the components (modules) and their functions. It describes a method for calculating the costs of interactive cargo insurance via the proposed system and presents the main characteristics and options of existing transport management systems and conceptual cost models.

  1. Model of informational system for freight insurance automation based on digital signature

    Directory of Open Access Journals (Sweden)

    Maxim E. SLOBODYANYUK

    2009-01-01

    Full Text Available The article considers a model of an information system for freight insurance automation based on digital signatures; it presents the architecture, a macro flowchart of the information flow in the model, and the components (modules) and their functions. It describes a method for calculating the costs of interactive cargo insurance via the proposed system and presents the main characteristics and options of existing transport management systems and conceptual cost models.

  2. Initial Assessment and Modeling Framework Development for Automated Mobility Districts: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Yi [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Stanley E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Garikapati, Venu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-07

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This paper examines a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). It reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs without relying on private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed from the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of the mobility, energy, and emissions impacts anticipated with AMDs.

  3. Parallel Genetic Algorithms for calibrating Cellular Automata models: Application to lava flows

    International Nuclear Information System (INIS)

    D'Ambrosio, D.; Spataro, W.; Di Gregorio, S.; Calabria Univ., Cosenza; Crisci, G.M.; Rongo, R.; Calabria Univ., Cosenza

    2005-01-01

    Cellular Automata are highly nonlinear dynamical systems which are suitable for simulating natural phenomena whose behaviour may be specified in terms of local interactions. The Cellular Automata model SCIARA, developed for the simulation of lava flows, has been demonstrated to reproduce the behaviour of Etnean events. However, in order to apply the model for the prediction of future scenarios, a thorough calibration phase is required. This work presents the application of Genetic Algorithms, general-purpose search algorithms inspired by natural selection and genetics, for the parameter optimisation of the model SCIARA. Difficulties due to the elevated computational time suggested the adoption of a Master-Slave Parallel Genetic Algorithm for the calibration of the model with respect to the 2001 Mt. Etna eruption. Results demonstrated the usefulness of the approach, both in terms of computing time and quality of the performed simulations.
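
    The master-slave layout can be sketched as follows: the master process holds the population while worker processes evaluate fitness in parallel. The placeholder fitness function stands in for a full SCIARA lava-flow run compared against the mapped 2001 Etna event; the parameter ranges and GA settings below are assumptions made only for illustration.

    import random
    from multiprocessing import Pool

    N_PARAMS, POP_SIZE, GENERATIONS = 4, 20, 30

    def fitness(params):
        # Placeholder objective: in practice, run the CA simulation with `params`
        # and return a measure of overlap between simulated and observed flows.
        return -sum((p - 0.5) ** 2 for p in params)

    def mutate(ind, rate=0.1):
        return [min(1.0, max(0.0, p + random.gauss(0, rate))) if random.random() < 0.3 else p
                for p in ind]

    def crossover(a, b):
        cut = random.randrange(1, N_PARAMS)
        return a[:cut] + b[cut:]

    if __name__ == "__main__":
        population = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP_SIZE)]
        with Pool() as workers:                     # the "slaves" evaluating fitness
            for _ in range(GENERATIONS):
                scores = workers.map(fitness, population)
                ranked = [ind for _, ind in sorted(zip(scores, population), reverse=True)]
                elite = ranked[: POP_SIZE // 2]
                population = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                                      for _ in range(POP_SIZE - len(elite))]
        print("best parameter set:", ranked[0])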

  4. Platinum nanozymes recover cellular ROS homeostasis in an oxidative stress-mediated disease model

    Science.gov (United States)

    Moglianetti, Mauro; de Luca, Elisa; Pedone, Deborah; Marotta, Roberto; Catelani, Tiziano; Sartori, Barbara; Amenitsch, Heinz; Retta, Saverio Francesco; Pompa, Pier Paolo

    2016-02-01

    In recent years, the use of nanomaterials as biomimetic enzymes has attracted great interest. In this work, we show the potential of biocompatible platinum nanoparticles (Pt NPs) as antioxidant nanozymes, which combine abundant cellular internalization and efficient scavenging activity of cellular reactive oxygen species (ROS), thus simultaneously integrating the functions of nanocarriers and antioxidant drugs. Careful toxicity assessment and intracellular tracking of Pt NPs proved their cytocompatibility and high cellular uptake, with compartmentalization within the endo/lysosomal vesicles. We have demonstrated that Pt NPs possess strong and broad antioxidant properties, acting as superoxide dismutase, catalase, and peroxidase enzymes, with similar or even superior performance than natural enzymes, along with higher adaptability to the changes in environmental conditions. We then exploited their potent activity as radical scavenging materials in a cellular model of an oxidative stress-related disorder, namely human Cerebral Cavernous Malformation (CCM) disease, which is associated with a significant increase in intracellular ROS levels. Noteworthily, we found that Pt nanozymes can efficiently reduce ROS levels, completely restoring the cellular physiological homeostasis.

  5. A cardiac electrical activity model based on a cellular automata system in comparison with neural network model.

    Science.gov (United States)

    Khan, Muhammad Sadiq Ali; Yousuf, Sidrah

    2016-03-01

    Cardiac electrical activity is distributed through the three dimensions of cardiac tissue (myocardium) and evolves over time. Indicators of heart disease can occur randomly at any time of day, so heart rate, conduction, and the electrical activity of each cardiac cycle should be monitored non-invasively to assess "Action Potential" (regular) and "Arrhythmia" (irregular) rhythms. Many heart diseases can be examined through automata-based approaches such as cellular automata. This paper deals with the different states of cardiac rhythm using cellular automata, compared with a neural network model, and provides a fast and highly effective simulation of the electrical spark or wave that triggers contraction of the cardiac muscle in the atria. The formulated model, named the "States of Automaton Proposed Model for CEA (Cardiac Electrical Activity)" and built with a cellular automata methodology, represents three conduction states of cardiac tissue: (i) resting (relaxed and excitable), (ii) ARP (absolutely refractory phase, i.e. excited but not able to excite neighboring cells), and (iii) RRP (relatively refractory phase, i.e. excited and able to excite neighboring cells). The results indicate efficient modeling of the action potential during the pumping of blood in the cardiac cycle, with a small computational burden.
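
    A simplified excitable-media automaton in the spirit of the three conduction states is sketched below; the one-step timing of the refractory phases and the 4-neighbour excitation rule are illustrative assumptions, not the rules of the proposed CEA model.

    RESTING, EXCITED, REFRACTORY = 0, 1, 2
    SIZE = 20

    def step(grid):
        new = [[RESTING] * SIZE for _ in range(SIZE)]
        for i in range(SIZE):
            for j in range(SIZE):
                s = grid[i][j]
                if s == EXCITED:
                    new[i][j] = REFRACTORY          # ARP: cannot be re-excited
                elif s == REFRACTORY:
                    new[i][j] = RESTING             # RRP, then back to excitable
                else:
                    # A resting cell fires if any 4-neighbour is excited.
                    nbrs = [grid[(i + di) % SIZE][(j + dj) % SIZE]
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                    new[i][j] = EXCITED if EXCITED in nbrs else RESTING
        return new

    grid = [[RESTING] * SIZE for _ in range(SIZE)]
    grid[SIZE // 2][SIZE // 2] = EXCITED            # single stimulus at the centre
    for _ in range(10):
        grid = step(grid)
    print(sum(row.count(EXCITED) for row in grid), "cells excited after 10 steps")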

  6. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Sheng [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China); Qu, Xiaobo [Griffith School of Engineering, Griffith University, Gold Coast, 4222 Australia (Australia); Xu, Cheng [Department of Transportation Management Engineering, Zhejiang Police College, Hangzhou, 310053 China (China); College of Transportation, Jilin University, Changchun, 130022 China (China); Ma, Dongfang, E-mail: mdf2004@zju.edu.cn [Ocean College, Zhejiang University, Hangzhou, 310058 China (China); Wang, Dianhai [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China)

    2015-10-16

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. - Highlights: • We proposed an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speed 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated.
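
    The flavour of a multi-value update, with several bicycles allowed per cell and regular and electric bicycles limited to 2 and 3 cells per second, can be sketched as follows. The cell capacity, slowdown probability, and greedy movement rule are simplified stand-ins for the IEBCA update rules, chosen only to make the sketch self-contained.

    import random

    LENGTH, CAP, P_SLOW = 100, 2, 0.1
    # Each bicycle is [position, max_speed]; max speed 2 = regular, 3 = electric.
    bikes = [[i * 5, random.choice([2, 3])] for i in range(15)]

    def occupancy():
        occ = [0] * LENGTH
        for pos, _ in bikes:
            occ[pos] += 1
        return occ

    def step():
        occ = occupancy()
        for bike in sorted(bikes, key=lambda b: -b[0]):   # update front-to-back
            pos, vmax = bike
            v = vmax if random.random() > P_SLOW else max(vmax - 1, 0)
            new_pos = pos
            for d in range(1, v + 1):                     # advance while capacity allows
                cell = (pos + d) % LENGTH
                if occ[cell] < CAP:
                    new_pos = cell
                else:
                    break
            occ[pos] -= 1
            occ[new_pos] += 1
            bike[0] = new_pos

    moved = 0
    for _ in range(100):
        before = [b[0] for b in bikes]
        step()
        moved += sum((b[0] - p) % LENGTH for b, p in zip(bikes, before))
    print("mean speed:", moved / (100 * len(bikes)), "cells per step")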

  7. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    International Nuclear Information System (INIS)

    Jin, Sheng; Qu, Xiaobo; Xu, Cheng; Ma, Dongfang; Wang, Dianhai

    2015-01-01

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. - Highlights: • We proposed an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speed 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated

  8. Unified tractable model for downlink MIMO cellular networks using stochastic geometry

    KAUST Repository

    Afify, Laila H.

    2016-07-26

    Several research efforts have been invested in developing stochastic geometry models for cellular networks with multiple antenna transmission and reception (MIMO). On one hand, there are models that target abstract outage probability and ergodic rate for simplicity. On the other hand, there are models that sacrifice simplicity to target more tangible performance metrics such as the error probability. Both types of models are completely disjoint in terms of the analytic steps to obtain the performance measures, which makes it challenging to conduct studies that account for different performance metrics. This paper unifies both techniques and proposes a unified stochastic-geometry-based mathematical paradigm to account for error probability, outage probability, and ergodic rates in MIMO cellular networks. The proposed model is also unified in terms of the antenna configurations and leads to simpler error probability analysis compared to existing state-of-the-art models. The core part of the analysis is based on abstracting unnecessary information conveyed within the interfering signals by assuming Gaussian signaling. To this end, the accuracy of the proposed framework is verified against state-of-the-art models as well as system-level simulations. Through this unified study we provide insights on network design by reflecting the effect of system parameters on different performance metrics. © 2016 IEEE.

  9. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    Science.gov (United States)

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  10. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  11. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
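
    On synthetic data, the Gaussian-process approach might look like the scikit-learn sketch below, predicting an operator performance score from task load and working-memory capacity. The data, feature choices, and kernel are fabricated for illustration and are not the study's measurements or models.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    task_load = rng.uniform(1, 10, 50)          # e.g. number of supervised vehicles
    wm_capacity = rng.uniform(2, 6, 50)         # working-memory span score
    # Fabricated performance scores standing in for real experimental data.
    performance = 0.8 * wm_capacity - 0.4 * task_load + rng.normal(0, 0.3, 50)

    X = np.column_stack([task_load, wm_capacity])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0) + WhiteKernel(),
                                  normalize_y=True)
    gp.fit(X, performance)

    mean, std = gp.predict([[7.0, 4.5]], return_std=True)
    print(f"predicted performance: {mean[0]:.2f} +/- {std[0]:.2f}")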

  12. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H

    2017-01-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analy...

  13. Acanthamoeba and Dictyostelium as Cellular Models for Legionella Infection

    Science.gov (United States)

    Swart, A. Leoni; Harrison, Christopher F.; Eichinger, Ludwig; Steinert, Michael; Hilbi, Hubert

    2018-01-01

    Environmental bacteria of the genus Legionella naturally parasitize free-living amoebae. Upon inhalation of bacteria-laden aerosols, the opportunistic pathogens grow intracellularly in alveolar macrophages and can cause a life-threatening pneumonia termed Legionnaires' disease. Intracellular replication in amoebae and macrophages takes place in a unique membrane-bound compartment, the Legionella-containing vacuole (LCV). LCV formation requires the bacterial Icm/Dot type IV secretion system, which translocates literally hundreds of “effector” proteins into host cells, where they modulate crucial cellular processes for the pathogen's benefit. The mechanism of LCV formation appears to be evolutionarily conserved, and therefore, amoebae are not only ecologically significant niches for Legionella spp., but also useful cellular models for eukaryotic phagocytes. In particular, Acanthamoeba castellanii and Dictyostelium discoideum emerged over the last years as versatile and powerful models. Using genetic, biochemical and cell biological approaches, molecular interactions between amoebae and Legionella pneumophila have recently been investigated in detail with a focus on the role of phosphoinositide lipids, small and large GTPases, autophagy components and the retromer complex, as well as on bacterial effectors targeting these host factors. PMID:29552544

  14. Acanthamoeba and Dictyostelium as Cellular Models for Legionella Infection

    Directory of Open Access Journals (Sweden)

    A. Leoni Swart

    2018-03-01

    Full Text Available Environmental bacteria of the genus Legionella naturally parasitize free-living amoebae. Upon inhalation of bacteria-laden aerosols, the opportunistic pathogens grow intracellularly in alveolar macrophages and can cause a life-threatening pneumonia termed Legionnaires' disease. Intracellular replication in amoebae and macrophages takes place in a unique membrane-bound compartment, the Legionella-containing vacuole (LCV). LCV formation requires the bacterial Icm/Dot type IV secretion system, which translocates literally hundreds of “effector” proteins into host cells, where they modulate crucial cellular processes for the pathogen's benefit. The mechanism of LCV formation appears to be evolutionarily conserved, and therefore, amoebae are not only ecologically significant niches for Legionella spp., but also useful cellular models for eukaryotic phagocytes. In particular, Acanthamoeba castellanii and Dictyostelium discoideum have emerged over recent years as versatile and powerful models. Using genetic, biochemical and cell biological approaches, molecular interactions between amoebae and Legionella pneumophila have recently been investigated in detail with a focus on the role of phosphoinositide lipids, small and large GTPases, autophagy components and the retromer complex, as well as on bacterial effectors targeting these host factors.

  15. Illuminance-based slat angle selection model for automated control of split blinds

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jia; Olbina, Svetlana [Rinker School of Building Construction, University of Florida, Gainesville, FL 32611-5703 (United States)

    2011-03-15

    Venetian blinds play an important role in controlling daylight in buildings. Automated blinds overcome some limitations of manual blinds; however, the existing automated systems mainly control the direct solar radiation and glare and cannot be used for controlling innovative blind systems such as split blinds. This research developed an Illuminance-based Slat Angle Selection (ISAS) model that predicts the optimum slat angles of split blinds to achieve the designed indoor illuminance. The model was constructed based on a series of multi-layer feed-forward artificial neural networks (ANNs). The illuminance values at the sensor points used to develop the ANNs were obtained with the software EnergyPlus™. The weather determinants (such as horizontal illuminance and sun angles) were used as the input variables for the ANNs. The illuminance level at a sensor point was the output variable for the ANNs. The ISAS model was validated by evaluating the errors in the calculation of (1) the illuminance and (2) the optimum slat angles. The validation results showed that the power of the ISAS model to predict illuminance was 94.7% while its power to calculate the optimum slat angles was 98.5%. For about 90% of the time in the year, the illuminance percentage errors were less than 10%, and the percentage errors in calculating the optimum slat angles were less than 5%. This research offers a new approach for the automated control of split blinds and a guide for future research to utilize the adaptive nature of ANNs to develop a more practical and applicable blind control system. (author)
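
    A minimal feed-forward ANN in the spirit of the ISAS model is sketched below: it maps weather determinants and a slat angle to a sensor-point illuminance, then selects the candidate angle whose prediction is closest to a target. The training data are a synthetic stand-in for the EnergyPlus-generated values, and the network size, feature set, and target are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    n = 500
    horiz_illum = rng.uniform(5_000, 60_000, n)      # exterior horizontal illuminance (lux)
    sun_altitude = rng.uniform(5, 80, n)             # degrees
    slat_angle = rng.uniform(0, 90, n)               # degrees
    # Fabricated response surface standing in for simulated sensor illuminance.
    sensor_illum = 0.01 * horiz_illum * np.cos(np.radians(slat_angle)) + 5 * sun_altitude

    X = np.column_stack([horiz_illum, sun_altitude, slat_angle])
    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0))
    ann.fit(X, sensor_illum)

    # Pick the candidate slat angle whose predicted illuminance is closest to the target.
    candidates = np.arange(0.0, 91.0, 5.0)
    query = np.column_stack([np.full_like(candidates, 30_000.0),
                             np.full_like(candidates, 45.0),
                             candidates])
    preds = ann.predict(query)
    target = 500.0
    print("selected slat angle:", candidates[np.argmin(np.abs(preds - target))], "degrees")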

  16. Cellular and molecular modifier pathways in tauopathies: the big picture from screening invertebrate models.

    Science.gov (United States)

    Hannan, Shabab B; Dräger, Nina M; Rasse, Tobias M; Voigt, Aaron; Jahn, Thomas R

    2016-04-01

    Abnormal tau accumulations were observed and documented in post-mortem brains of patients affected by Alzheimer's disease (AD) long before the identification of mutations in the Microtubule-associated protein tau (MAPT) gene, encoding the tau protein, in a different neurodegenerative disease called Frontotemporal dementia and Parkinsonism linked to chromosome 17 (FTDP-17). The discovery of mutations in the MAPT gene associated with FTDP-17 highlighted that dysfunctions in tau alone are sufficient to cause neurodegeneration. Invertebrate models have been diligently utilized in investigating tauopathies, contributing to the understanding of cellular and molecular pathways involved in disease etiology. An important discovery came with the demonstration that over-expression of human tau in Drosophila leads to premature mortality and neuronal dysfunction including neurodegeneration, recapitulating some key neuropathological features of the human disease. The simplicity of handling invertebrate models combined with the availability of a diverse range of experimental resources make these models, in particular Drosophila a powerful invertebrate screening tool. Consequently, several large-scale screens have been performed using Drosophila, to identify modifiers of tau toxicity. The screens have revealed not only common cellular and molecular pathways, but in some instances the same modifier has been independently identified in two or more screens suggesting a possible role for these modifiers in regulating tau toxicity. The purpose of this review is to discuss the genetic modifier screens on tauopathies performed in Drosophila and C. elegans models, and to highlight the common cellular and molecular pathways that have emerged from these studies. Here, we summarize results of tau toxicity screens providing mechanistic insights into pathological alterations in tauopathies. Key pathways or modifiers that have been identified are associated with a broad range of processes

  17. Car Deceleration Considering Its Own Velocity in Cellular Automata Model

    International Nuclear Information System (INIS)

    Li Keping

    2006-01-01

    In this paper, we propose a new cellular automaton model based on the NaSch traffic model. In our method, when a car has a larger velocity and the gap between the car and its leading car is not large enough, its velocity will decrease. The aim is to give the following car a buffer space in which to decrease its velocity at the next time step, thus avoiding excessively sharp deceleration. The simulation results show that, using our model, the car deceleration is realistic and is closer to field measurements than that of the NaSch model.
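
    The idea can be illustrated with a standard Nagel-Schreckenberg update extended by a soft-braking step for fast cars whose gap is small; the exact rule and thresholds of the paper may differ from this sketch, and the speed threshold of 3 below is an assumption.

    import random

    ROAD, V_MAX, P = 200, 5, 0.2
    cars = sorted(random.sample(range(ROAD), 30))   # distinct positions on a ring road
    speeds = [0] * len(cars)

    def gap(i):
        # Empty cells between car i and the car ahead of it on the ring.
        return (cars[(i + 1) % len(cars)] - cars[i] - 1) % ROAD

    def step():
        for i in range(len(cars)):                  # speeds computed from old positions
            v = min(speeds[i] + 1, V_MAX)           # acceleration
            if v >= 3 and gap(i) < v:               # illustrative soft-braking rule
                v -= 1
            v = min(v, gap(i))                      # never drive into the car ahead
            if v > 0 and random.random() < P:       # random slowdown
                v -= 1
            speeds[i] = v
        for i in range(len(cars)):                  # move all cars in parallel
            cars[i] = (cars[i] + speeds[i]) % ROAD

    for _ in range(100):
        step()
    print("mean speed after 100 steps:", sum(speeds) / len(speeds))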

  18. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; still up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers support the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human driver's expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching driver's expectations.

  19. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  20. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  1. Contact assembly of cell-laden hollow microtubes through automated micromanipulator tip locating

    International Nuclear Information System (INIS)

    Wang, Huaping; Shi, Qing; Guo, Yanan; Li, Yanan; Sun, Tao; Huang, Qiang; Fukuda, Toshio

    2017-01-01

    This paper presents an automated contact assembly method to fabricate a cell-laden microtube based on accurate locating of the micromanipulator tip. Essential for delivering nutrients in thick engineered tissues, a vessel-mimetic microtube can be precisely assembled through microrobotic contact biomanipulation. Biomanipulation is a technique to spatially order and immobilize cellular targets with high precision. However, due to image occlusion during contact, it is challenging to locate the micromanipulator tip for fully automated assembly. To achieve pixel-wise tracking and locating of the tip in contact, a particle filter algorithm integrated with a determined level set model is employed here. The model ensures precise convergence of the micromanipulator’s contour during occlusion. With the converged active contour, the algorithm is able to separate the micromanipulator pixel-wise from the low-contrast background and precisely locate the tip with an error of around 1 pixel (2 µm at 4× magnification). As a result, the cell-laden microtube is automatically assembled at six layers/min, which is effective enough to fabricate vessel-mimetic constructs for vascularization in tissue engineering. (paper)

  2. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King' s College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  3. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  4. A Markov Process Inspired Cellular Automata Model of Road Traffic

    OpenAIRE

    Wang, Fa; Li, Li; Hu, Jianming; Ji, Yan; Yao, Danya; Zhang, Yi; Jin, Xuexiang; Su, Yuelong; Wei, Zheng

    2008-01-01

    To provide a more accurate description of the driving behaviors in vehicle queues, a model named the Markov-Gap cellular automata model is proposed in this paper. It views the variation of the gap between two consecutive vehicles as a Markov process whose stationary distribution corresponds to the observed distribution of practical gaps. The multiformity of this Markov process gives the model enough flexibility to describe various driving behaviors. Two examples are given to show how to specialize i...

  5. Modeling cellular effects of coal pollutants

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    The goal of this project is to develop and test models for the dose and dose-rate dependence of biological effects of coal pollutants on mammalian cells in tissue culture. Particular attention is given to the interaction of pollutants with the genetic material (deoxyribonucleic acid, or DNA) in the cell. Unlike radiation, which can interact directly with chromatin, chemical pollutants undergo numerous changes before the ultimate carcinogen becomes covalently bound to the DNA. Synthetic vesicles formed from a phospholipid bilayer are being used to investigate chemical transformations that may occur during the transport of pollutants across cellular membranes. The initial damage to DNA is rapidly modified by enzymatic repair systems in most living organisms. A model has been developed for predicting the effects of excision repair on the survival of human cells exposed to chemical carcinogens. In addition to the excision system, normal human cells also have tolerance mechanisms that permit continued growth and division of cells without removal of the damage. We are investigating the biological effect of damage passed to daughter cells by these tolerance mechanisms.

  6. Propagation Path Loss Models for 5G Urban Micro- and Macro-Cellular Scenarios

    DEFF Research Database (Denmark)

    Sun, Shu; Rappaport, Theodore S.; Rangan, Sundeep

    2016-01-01

    This paper presents and compares two candidate large-scale propagation path loss models, the alpha-beta-gamma (ABG) model and the close-in (CI) free space reference distance model, for the design of fifth generation (5G) wireless communication systems in urban micro- and macro-cellular scenarios....

  7. Hybrid disassembly system for cellular telephone end-of-life treatment

    Energy Technology Data Exchange (ETDEWEB)

    Kniebel, M.; Basdere, B.; Seliger, G. [Technical Univ. Berlin, Inst. for Machine Tools and Factory Management, Dept. of Assembly Technology and Factory Management, Berlin (Germany)

    2004-07-01

    Concern over the negative environmental impacts associated with the production, use, and end-of-life (EOL) of cellular telephones is particularly high due to large production volumes and characteristically short time scales of technological and stylistic obsolescence. Landfilled or incinerated cellular telephones create the potential for release of toxic substances. The European Union has passed the directive on Waste Electrical and Electronic Equipment (WEEE) to regulate their collection and appropriate end-of-life treatment. Manufacturers must conduct material recycling or remanufacturing processes to recover resources. While recovery rates can hardly be met economically by material recycling, remanufacturing and reusing cellular phones is developing into a reasonable alternative. Both end-of-life options require disassembly processes for WEEE-compliant treatment. Due to the high number of different cell phone variants and their typical design that fits components into tight enclosing spaces, cellular phone disassembly becomes a challenging task. These challenges, and the high numbers of phones expected to be returned under the WEEE directive, call for automated disassembly. A hybrid disassembly system has been developed to ensure the mass treatment of obsolete cellular phones. It has been integrated into a prototypical remanufacturing factory for cellular phones that has been planned based on market data. (orig.)

  8. Cellular self-assembly and biomaterials-based organoid models of development and diseases.

    Science.gov (United States)

    Shah, Shivem B; Singh, Ankur

    2017-04-15

    Organogenesis and morphogenesis have informed our understanding of physiology, pathophysiology, and avenues to create new curative and regenerative therapies. Thus far, this understanding has been hindered by the lack of a physiologically relevant yet accessible model that affords biological control. Recently, three-dimensional ex vivo cellular cultures created through cellular self-assembly under natural extracellular matrix cues or through biomaterial-based directed assembly have been shown to physically resemble and recapture some functionality of target organs. These "organoids" have garnered momentum for their applications in modeling human development and disease, drug screening, and future therapy design or even organ replacement. This review first discusses the self-organizing organoids as materials with emergent properties and their advantages and limitations. We subsequently describe biomaterials-based strategies used to afford more control of the organoid's microenvironment and ensuing cellular composition and organization. In this review, we also offer our perspective on how multifunctional biomaterials with precise spatial and temporal control could ultimately bridge the gap between in vitro organoid platforms and their in vivo counterparts. Several notable reviews have highlighted PSC-derived organoids and 3D aggregates, including embryoid bodies, from a development and cellular assembly perspective. The focus of this review is to highlight the materials-based approaches that cells, including PSCs and others, adopt for self-assembly and the controlled development of complex tissues, such as that of the brain, gut, and immune system. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  9. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    -erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 µL scale. The derived kinetic parameters were then verified in a second round of experiments where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original... Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost-effective generation of reliable kinetic models useful for bioconversion process...

  10. a Predator-Prey Model Based on the Fully Parallel Cellular Automata

    Science.gov (United States)

    He, Mingfeng; Ruan, Hongbo; Yu, Changliang

    We present a predator-prey lattice model containing moveable wolves and sheep, which are characterized by Penna double bit strings. Sexual reproduction and child-care strategies are considered. To implement this model efficiently, we build a fully parallel cellular automaton based on a new definition of the neighborhood. We show the roles played by the initial densities of the populations, the mutation rate, and the linear size of the lattice in the evolution of this model.

  11. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  12. Cellular automata models for diffusion of information and highway traffic flow

    Science.gov (United States)

    Fuks, Henryk

    In the first part of this work we study a family of deterministic models for highway traffic flow which generalize cellular automaton rule 184. This family is parameterized by the speed limit m and another parameter k that represents the degree of 'anticipatory driving'. We compare two driving strategies with identical maximum throughput: 'conservative' driving with a high speed limit and 'anticipatory' driving with a low speed limit. Those two strategies are evaluated in terms of accident probability. We also discuss fundamental diagrams of generalized traffic rules and examine limitations of maximum achievable throughput. Possible modifications of the model are considered. For rule 184, we present exact calculations of the order parameter in a transition from the moving phase to the jammed phase using the method of preimage counting, and use this result to construct a solution to the density classification problem. In the second part we propose a probabilistic cellular automaton model for the spread of innovations, rumors, news, etc., in a social system. We start from simple deterministic models, for which exact expressions for the density of adopters are derived. For a more realistic model, based on probabilistic cellular automata, we study the influence of the interaction range R on the shape of the adoption curve. When the probability of adoption is proportional to the local density of adopters, and individuals can drop the innovation with some probability p, the system exhibits a second-order phase transition. The critical line separating the region of parameter space in which the asymptotic density of adopters is positive from the region where it is equal to zero converges toward the mean-field line as the range of the interaction increases. In the region between the R=1 critical line and the mean-field line, the asymptotic density of adopters depends on R, becoming zero if R is too small (smaller than some critical value). This result demonstrates the importance of connectivity in
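
    The adoption dynamics described in the second part can be illustrated with a one-dimensional probabilistic cellular automaton: a non-adopter adopts with probability proportional to the local density of adopters within range R (the proportionality constant is set to 1 here for illustration), and an adopter drops the innovation with probability p. All parameter values below are assumptions, not those studied in the thesis.

    import random

    N, R, P_DROP, STEPS = 200, 3, 0.02, 100
    state = [1 if random.random() < 0.05 else 0 for _ in range(N)]   # initial adopters

    def step(state):
        new = state[:]
        for i in range(N):
            nbrs = [state[(i + d) % N] for d in range(-R, R + 1) if d != 0]
            density = sum(nbrs) / len(nbrs)
            if state[i] == 0 and random.random() < density:
                new[i] = 1                     # adoption driven by local density
            elif state[i] == 1 and random.random() < P_DROP:
                new[i] = 0                     # adopter drops the innovation
        return new

    for _ in range(STEPS):
        state = step(state)
    print("adopter density after", STEPS, "steps:", sum(state) / N)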

  13. Use of noncrystallographic symmetry for automated model building at medium to low resolution.

    Science.gov (United States)

    Wiegels, Tim; Lamzin, Victor S

    2012-04-01

    A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that throughout structure determination the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases higher completeness and greater accuracy of the obtained structures were achieved, specifically at a crystallographic resolution of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments.

  14. Automated and model-based assembly of an anamorphic telescope

    Science.gov (United States)

    Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    Since the first usage of optical glasses there has been an increasing demand for optical systems which are highly customized for a wide field of applications. To meet the challenge of the production of so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray-tracing is presented. This process runs autonomously and accounts for a wide range of functionality. It first identifies the sequence for an optimized assembly and then generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. This process also takes into account the generation of a digital twin of the optical system, by mapping key performance indicators such as the first and second moments of intensity into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.

  15. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems.

  16. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab
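
    The idea behind propagating derivatives alongside a model calculation can be illustrated with forward-mode automatic differentiation using dual numbers. GRESS instruments FORTRAN source with the direct and adjoint techniques instead, so the Python sketch below, including the stand-in model function, is only conceptual.

    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.value * other.deriv + self.deriv * other.value)

        __rmul__ = __mul__

    def model(k):
        # Stand-in "computer model": the result depends nonlinearly on parameter k.
        return 3.0 * k * k + 2.0 * k + 1.0

    k = Dual(2.0, 1.0)                 # seed derivative d(k)/d(k) = 1
    result = model(k)
    # Normalized first derivative (sensitivity) = (k / R) * dR/dk
    print("dR/dk =", result.deriv, " normalized:", 2.0 * result.deriv / result.value)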

  17. Sordaria macrospora, a model organism to study fungal cellular development.

    Science.gov (United States)

    Engh, Ines; Nowrousian, Minou; Kück, Ulrich

    2010-12-01

    During the development of multicellular eukaryotes, the processes of cellular growth and organogenesis are tightly coordinated. Since the 1940s, filamentous fungi have served as genetic model organisms to decipher basic mechanisms underlying eukaryotic cell differentiation. Here, we focus on Sordaria macrospora, a homothallic ascomycete and important model organism for developmental biology. During its sexual life cycle, S. macrospora forms three-dimensional fruiting bodies, a complex process involving the formation of different cell types. S. macrospora can be used for genetic, biochemical and cellular experimental approaches since diverse tools, including fluorescence microscopy, a marker recycling system and gene libraries, are available. Moreover, the genome of S. macrospora has been sequenced and allows functional genomics analyses. Over the past years, our group has generated and analysed a number of developmental mutants which has greatly enhanced our fundamental understanding about fungal morphogenesis. In addition, our recent research activities have established a link between developmental proteins and conserved signalling cascades, ultimately leading to a regulatory network controlling differentiation processes in a eukaryotic model organism. This review summarizes the results of our recent findings, thus advancing current knowledge of the general principles and paradigms underpinning eukaryotic cell differentiation and development. Copyright © 2010 Elsevier GmbH. All rights reserved.

  18. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    Science.gov (United States)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack the scalability and cost-effectiveness over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high resistance value resistors in a resistor mesh model. In this work, an automated damage detection strategy that works through placing high value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method is introduced. Here, high value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm3 reinforced cement paste plate doped with multi-walled carbon nanotubes exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.

  19. Programmable cellular arrays. Faults testing and correcting in cellular arrays

    International Nuclear Information System (INIS)

    Cercel, L.

    1978-03-01

    A review of recent research on programmable cellular arrays in computing and digital information-processing systems is presented, covering both combinational and sequential arrays with fully arbitrary behaviour as well as arrays that realize better implementations of specialized blocks such as arithmetic units, counters, comparators, control systems, and memory blocks. The paper also presents applications of cellular arrays in microprogramming, in the implementation of a specialized computer for matrix operations, and in the modeling of universal computing systems. The last section deals with problems of fault testing and correcting in cellular arrays. (author)

  20. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina Lundgaard; Login, Frédéric H.; Jensen, Helene Halkjær

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacteria...

  1. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  2. High-Dimensional Modeling for Cytometry: Building Rock Solid Models Using GemStone™ and Verity Cen-se'™ High-Definition t-SNE Mapping.

    Science.gov (United States)

    Bruce Bagwell, C

    2018-01-01

    This chapter outlines how to approach the complex tasks associated with designing models for high-dimensional cytometry data. Unlike gating approaches, modeling lends itself to automation and accounts for measurement overlap among cellular populations. Designing these models is now easier because of a new technique called high-definition t-SNE mapping. Nontrivial examples are provided that serve as a guide to create models that are consistent with data.
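
    As a generic stand-in for the Cen-se' high-definition mapping described in the chapter, a t-SNE embedding of synthetic cytometry-like events can be produced with scikit-learn as sketched below; the two fabricated populations, marker count, and perplexity value are assumptions made only for illustration.

    import numpy as np
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(42)
    pop_a = rng.normal(loc=2.0, scale=0.5, size=(300, 8))    # 8 measured markers
    pop_b = rng.normal(loc=4.0, scale=0.5, size=(300, 8))
    events = np.vstack([pop_a, pop_b])

    # Embed the 8-dimensional events into 2 dimensions for visual modelling.
    embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(events)
    print("embedded event matrix:", embedding.shape)          # (600, 2)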

  3. A dynamic cellular vertex model of growing epithelial tissues

    Science.gov (United States)

    Lin, Shao-Zhen; Li, Bo; Feng, Xi-Qiao

    2017-04-01

    Intercellular interactions play a significant role in a wide range of biological functions and processes at both the cellular and tissue scales, for example, embryogenesis, organogenesis, and cancer invasion. In this paper, a dynamic cellular vertex model is presented to study the morphomechanics of a growing epithelial monolayer. The regulating role of stresses in soft tissue growth is revealed. It is found that the cells originating from the same parent cell in the monolayer can orchestrate into clustering patterns as the tissue grows. Collective cell migration exhibits a feature of spatial correlation across multiple cells. Dynamic intercellular interactions can engender a variety of distinct tissue behaviors in a social context. Uniform cell proliferation may render high and heterogeneous residual compressive stresses, while stress-regulated proliferation can effectively release the stresses, reducing the stress heterogeneity in the tissue. The results highlight the critical role of mechanical factors in the growth and morphogenesis of epithelial tissues and help understand the development and invasion of epithelial tumors.
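
    The energetic core of a cellular vertex model is commonly written as a sum over cells of area-elasticity and perimeter (contractility) terms; the sketch below evaluates such an energy for polygonal cells with illustrative constants. The specific energy terms and the growth, division, and migration dynamics of the cited model are not reproduced here.

    import math

    K_A, A0 = 1.0, 1.0        # area elasticity and preferred area (illustrative)
    K_P, P0 = 0.1, 3.8        # perimeter contractility and preferred perimeter

    def polygon_area_perimeter(vertices):
        # Shoelace formula for area plus summed edge lengths for perimeter.
        area, perim = 0.0, 0.0
        n = len(vertices)
        for i in range(n):
            x1, y1 = vertices[i]
            x2, y2 = vertices[(i + 1) % n]
            area += x1 * y2 - x2 * y1
            perim += math.hypot(x2 - x1, y2 - y1)
        return abs(area) / 2.0, perim

    def tissue_energy(cells):
        e = 0.0
        for verts in cells:
            a, p = polygon_area_perimeter(verts)
            e += K_A * (a - A0) ** 2 + K_P * (p - P0) ** 2
        return e

    # Two unit-area square cells sharing an edge.
    cells = [[(0, 0), (1, 0), (1, 1), (0, 1)],
             [(1, 0), (2, 0), (2, 1), (1, 1)]]
    print("tissue energy:", tissue_energy(cells))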

  4. Geometry Based Design Automation : Applied to Aircraft Modelling and Optimization

    OpenAIRE

    Amadori, Kristian

    2012-01-01

    Product development processes are continuously challenged by demands for increased efficiency. As engineering products become more and more complex, efficient tools and methods for integrated and automated design are needed throughout the development process. Multidisciplinary Design Optimization (MDO) is one promising technique that has the potential to drastically improve concurrent design. MDO frameworks combine several disciplinary models with the aim of gaining a holistic perspective of ...

  5. Automated main-chain model building by template matching and iterative fragment extension.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and beta-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and beta-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more C(alpha) positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 A. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
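    The fragment-location step rests on FFT-based template matching. The sketch below is a generic one-dimensional illustration of that idea rather than the RESOLVE implementation: it cross-correlates a noisy density trace with a template via NumPy's FFT and reads off the best-matching position; real electron-density maps are of course three-dimensional.

```python
import numpy as np

def fft_match(trace, template):
    """Cross-correlate a 1-D density trace with a template via the FFT.
    Peaks in the returned array mark candidate template placements."""
    n = len(trace) + len(template) - 1
    size = 1 << (n - 1).bit_length()                    # pad to a power of two
    spec = np.fft.rfft(trace, size) * np.fft.rfft(template[::-1], size)
    return np.fft.irfft(spec, size)[:n]                 # linear cross-correlation

# Toy example: a Gaussian "helix-like" template hidden in a noisy trace at index 80
rng = np.random.default_rng(0)
template = np.exp(-0.5 * ((np.arange(21) - 10) / 3.0) ** 2)
trace = rng.normal(0.0, 0.1, 200)
trace[80:101] += template
scores = fft_match(trace, template)
start = int(np.argmax(scores)) - len(template) + 1      # convert correlation lag to start index
print("template located near index", start)
```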

  6. Efficient Analysis of Systems Biology Markup Language Models of Cellular Populations Using Arrays.

    Science.gov (United States)

    Watanabe, Leandro; Myers, Chris J

    2016-08-19

    The Systems Biology Markup Language (SBML) has been widely used for modeling biological systems. Although SBML has been successful in representing a wide variety of biochemical models, the core standard lacks the structure for representing large complex regular systems in a standard way, such as whole-cell and cellular population models. These models require a large number of variables to represent certain aspects of these types of models, such as the chromosome in the whole-cell model and the many identical cell models in a cellular population. While SBML core is not designed to handle these types of models efficiently, the proposed SBML arrays package can represent such regular structures more easily. However, in order to take full advantage of the package, analysis needs to be aware of the arrays structure. When expanding the array constructs within a model, some of the advantages of using arrays are lost. This paper describes a more efficient way to simulate arrayed models. To illustrate the proposed method, this paper uses a population of repressilator and genetic toggle switch circuits as examples. Results show that there are memory benefits using this approach with a modest cost in runtime.
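    To make the arrayed-population idea concrete, here is a hedged sketch, in plain NumPy/SciPy rather than SBML or the paper's simulator, that integrates a small population of identical repressilator circuits as one vectorised ODE system; the parameter values and initial conditions are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative repressilator parameters (not taken from the paper)
ALPHA, ALPHA0, BETA, N_HILL = 216.0, 0.2, 5.0, 2.0

def repressilator_population(t, y, n_cells):
    """Right-hand side for n_cells independent repressilator copies.
    State layout per cell: [m1, m2, m3, p1, p2, p3]."""
    y = y.reshape(n_cells, 6)
    m, p = y[:, :3], y[:, 3:]
    repressor = np.roll(p, 1, axis=1)            # p3 represses m1, p1 represses m2, ...
    dm = -m + ALPHA / (1.0 + repressor ** N_HILL) + ALPHA0
    dp = -BETA * (p - m)
    return np.concatenate([dm, dp], axis=1).ravel()

n_cells = 10
rng = np.random.default_rng(1)
y0 = rng.uniform(0.0, 5.0, size=n_cells * 6)     # slightly different initial conditions per cell
sol = solve_ivp(repressilator_population, (0.0, 100.0), y0, args=(n_cells,), max_step=0.1)
print("final protein p1 in each cell:", sol.y.reshape(n_cells, 6, -1)[:, 3, -1].round(2))
```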

  7. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by applying programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view of safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) in the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility to evaluate failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework to apply expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)

  8. Calibrating cellular automaton models for pedestrians walking through corners

    Science.gov (United States)

    Dias, Charitha; Lovreglio, Ruggiero

    2018-05-01

    Cellular Automata (CA) based pedestrian simulation models have gained remarkable popularity as they are simpler and easier to implement compared to other microscopic modeling approaches. However, incorporating traditional floor field representations in CA models to simulate pedestrian corner navigation behavior could result in unrealistic behaviors. Even though several previous studies have attempted to enhance CA models to realistically simulate pedestrian maneuvers around bends, such modifications have not been calibrated or validated against empirical data. In this study, two static floor field (SFF) representations, namely 'discrete representation' and 'continuous representation', are calibrated for CA-models to represent pedestrians' walking behavior around 90° bends. Trajectory data collected through a controlled experiment are used to calibrate these model representations. Calibration results indicate that although both floor field representations can represent pedestrians' corner navigation behavior, the 'continuous' representation fits the data better. Output of this study could be beneficial for enhancing the reliability of existing CA-based models by representing pedestrians' corner navigation behaviors more realistically.
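    For readers unfamiliar with floor-field CA models, the sketch below shows the standard textbook transition rule in which the probability of moving to a neighbouring cell is weighted by exp(k_S * S). It is a generic illustration, not either of the calibrated "discrete" or "continuous" representations from this study, and the sensitivity value and toy corridor are arbitrary.

```python
import numpy as np

K_S = 2.0  # sensitivity to the static floor field (illustrative value)

def step_probabilities(static_field, occupied, r, c):
    """Transition probabilities of a pedestrian at (r, c) over its Moore
    neighbourhood, using the common exp(K_S * S) floor-field weighting."""
    rows, cols = static_field.shape
    cells, weights = [], []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if not (0 <= rr < rows and 0 <= cc < cols):
                continue                      # outside the geometry
            if occupied[rr, cc] and (dr, dc) != (0, 0):
                continue                      # target cell already taken
            cells.append((rr, cc))
            weights.append(np.exp(K_S * static_field[rr, cc]))
    weights = np.asarray(weights)
    return cells, weights / weights.sum()

# Toy corridor: the static field decreases with distance to the exit at column 0
field = -np.tile(np.arange(8.0), (5, 1))
occupied = np.zeros_like(field, dtype=bool)
cells, probs = step_probabilities(field, occupied, r=2, c=4)
best = cells[int(np.argmax(probs))]
print("most likely move from (2, 4):", best)   # points towards the exit
```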

  9. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    Science.gov (United States)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.

  10. A Model of How Different Biology Experts Explain Molecular and Cellular Mechanisms

    Science.gov (United States)

    Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.

    2015-01-01

    Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do explanations made by experts from different biology subdisciplines at a university support the validity of this model? Guided by the modeling framework of R. S. Justi and J. K. Gilbert, the validity of an initial model was tested by asking seven biologists to explain a molecular mechanism of their choice. Data were collected from interviews, artifacts, and drawings, and then subjected to thematic analysis. We found that biologists explained the specific activities and organization of entities of the mechanism. In addition, they contextualized explanations according to their biological and social significance; integrated explanations with methods, instruments, and measurements; and used analogies and narrated stories. The derived Methods, Analogies, Context, and How themes informed the development of our final MACH model of mechanistic explanations. Future research will test the potential of the MACH model as a guiding framework for instruction to enhance the quality of student explanations. PMID:25999313

  11. Development of an automated core model for nuclear reactors

    International Nuclear Information System (INIS)

    Mosteller, R.D.

    1998-01-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop an automated package of computer codes that can model the steady-state behavior of nuclear-reactor cores of various designs. As an added benefit, data produced for steady-state analysis also can be used as input to the TRAC transient-analysis code for subsequent safety analysis of the reactor at any point in its operating lifetime. The basic capability to perform steady-state reactor-core analysis already existed in the combination of the HELIOS lattice-physics code and the NESTLE advanced nodal code. In this project, the automated package was completed by (1) obtaining cross-section libraries for HELIOS, (2) validating HELIOS by comparing its predictions to results from critical experiments and from the MCNP Monte Carlo code, (3) validating NESTLE by comparing its predictions to results from numerical benchmarks and to measured data from operating reactors, and (4) developing a linkage code to transform HELIOS output into NESTLE input

  12. The Development Of Mathematical Model For Automated Fingerprint Identification Systems Analysis

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2001-01-01

    A fingerprint has a strong oriented and periodic structure composed of dark lines of raised skin (ridges) and clear lines of lowered skin (furrows) that twist to form a distinct pattern. Although the manner in which the ridges flow is distinctive, other characteristics of the fingerprint, called minutiae, are what are most unique to the individual. These features are particular patterns consisting of terminations or bifurcations of the ridges. To assert whether two fingerprints are from the same finger or not, experts detect those minutiae. AFIS (Automated Fingerprint Identification Systems) extract and compare these features to determine a match. The classic methods of fingerprint recognition are not suitable for direct implementation in the form of computer algorithms, so the creation of a finger model was necessary for the development of new, better analysis algorithms. This paper presents a new numerical method for fingerprint simulation based on a mathematical model of the arrangement of dermatoglyphics and the creation of minutiae. The paper also describes the design and implementation of an automated fingerprint identification system which operates in two stages: minutiae extraction and minutiae matching

  13. A Geometrical-Based Model for Cochannel Interference Analysis and Capacity Estimation of CDMA Cellular Systems

    Directory of Open Access Journals (Sweden)

    Konstantinos B. Baltzis

    2008-10-01

    A common assumption in cellular communications is the circular-cell approximation. In this paper, an alternative analysis based on the hexagonal shape of the cells is presented. A geometrical-based stochastic model is proposed to describe the angle of arrival of the interfering signals in the reverse link of a cellular system. Explicit closed-form expressions are derived, and the simulations performed exhibit the characteristics and validate the accuracy of the proposed model. Applications in the capacity estimation of WCDMA cellular networks are presented. The dependence of system capacity on the sectorization of the cells and on the base station antenna radiation pattern is explored. Comparisons with data in the literature validate the accuracy of the proposed model. The degree of error of the hexagonal and the circular-cell approaches has been investigated, indicating the validity of the proposed model. Results have also shown that, in many cases, the two approaches give similar results when the radius of the circle equals the hexagon inradius. A brief discussion on how the proposed technique may be applied to broadband access networks is finally made.
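    The closing observation, that circular and hexagonal cells agree best when the circle radius equals the hexagon inradius, can be sanity-checked with elementary geometry. The snippet below is a simple area comparison under that assumption, not the paper's interference model, and merely quantifies how closely such a circle matches the hexagonal cell.

```python
import math

R = 1.0                                   # hexagon circumradius (cell "radius")
inradius = R * math.sqrt(3) / 2           # distance from centre to edge midpoint

hexagon_area = 3 * math.sqrt(3) / 2 * R ** 2
circle_in = math.pi * inradius ** 2       # circle with radius = inradius
circle_circ = math.pi * R ** 2            # circle with radius = circumradius

print(f"hexagon area:                   {hexagon_area:.4f}")
print(f"circle (radius = inradius):     {circle_in:.4f}  ({circle_in / hexagon_area:.1%} of hexagon)")
print(f"circle (radius = circumradius): {circle_circ:.4f}  ({circle_circ / hexagon_area:.1%} of hexagon)")
```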

  14. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  15. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  16. A Local Land Use Competition Cellular Automata Model and Its Application

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2016-06-01

    Cellular automaton (CA) is an important method in land use and cover change studies; however, the majority of research focuses on the discovery of macroscopic factors affecting LUCC, which results in ignoring the local effects within the neighborhoods. This paper introduces a Local Land Use Competition Cellular Automata (LLUC-CA) model, based on local land use competition, land suitability evaluation, demand analysis of the different land use types, and a multi-target land use competition allocation algorithm, to simulate land use change at a micro level. The model is applied to simulate land use changes at Jinshitan National Tourist Holiday Resort from 1988 to 2012. The results show that the simulation accuracies were 64.46%, 77.21%, 85.30% and 99.14% for agricultural land, construction land, forestland and water, respectively. In addition, comparing the simulation results of the LLUC-CA and CA-Markov models with the real land use data, their overall spatial accuracies were found to be 88.74% and 86.82%, respectively. In conclusion, the results from this study indicated that the model is an acceptable method for the simulation of large-scale land use changes, and the approach used here is applicable to analyzing land use change driving forces and assisting in decision-making.

  17. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    Science.gov (United States)

    Popa, L.; Popa, V.

    2017-08-01

    The article focuses on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm's operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic controller. The modeling and simulation procedures address specific problems in the development of a new type of technical product in the field of robotics. In this way, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions of varying, and often high, complexity.

  18. DMPy: a Python package for automated mathematical model construction of large-scale metabolic systems.

    Science.gov (United States)

    Smith, Robert W; van Rosmalen, Rik P; Martins Dos Santos, Vitor A P; Fleck, Christian

    2018-06-19

    Models of metabolism are often used in biotechnology and pharmaceutical research to identify drug targets or increase the direct production of valuable compounds. Due to the complexity of large metabolic systems, a number of conclusions have been drawn using mathematical methods with simplifying assumptions. For example, constraint-based models describe changes of internal concentrations that occur much quicker than alterations in cell physiology. Thus, metabolite concentrations and reaction fluxes are fixed to constant values. This greatly reduces the mathematical complexity, while providing a reasonably good description of the system in steady state. However, without a large number of constraints, many different flux sets can describe the optimal model and we obtain no information on how metabolite levels dynamically change. Thus, to accurately determine what is taking place within the cell, finer quality data and more detailed models need to be constructed. In this paper we present a computational framework, DMPy, that uses a network scheme as input to automatically search for kinetic rates and produce a mathematical model that describes temporal changes of metabolite fluxes. The parameter search utilises several online databases to find measured reaction parameters. From this, we take advantage of previous modelling efforts, such as Parameter Balancing, to produce an initial mathematical model of a metabolic pathway. We analyse the effect of parameter uncertainty on model dynamics and test how recent flux-based model reduction techniques alter system properties. To our knowledge this is the first time such analysis has been performed on large models of metabolism. Our results highlight that good estimates of at least 80% of the reaction rates are required to accurately model metabolic systems. Furthermore, reducing the size of the model by grouping reactions together based on fluxes alters the resulting system dynamics. The presented pipeline automates the
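    As a toy illustration of the kind of kinetic model such a pipeline produces, not DMPy's API, and with rate constants invented purely for the example, the following SciPy sketch integrates a two-step Michaelis-Menten pathway and reports the temporal change in metabolite concentrations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy pathway: substrate S -> intermediate I -> product P, Michaelis-Menten kinetics.
# All parameter values are invented for illustration only.
PARAMS = {"vmax1": 1.0, "km1": 0.5, "vmax2": 0.6, "km2": 0.3}

def pathway_rhs(t, y, p):
    s, i, prod = y
    v1 = p["vmax1"] * s / (p["km1"] + s)   # S -> I
    v2 = p["vmax2"] * i / (p["km2"] + i)   # I -> P
    return [-v1, v1 - v2, v2]

sol = solve_ivp(pathway_rhs, (0.0, 50.0), [2.0, 0.0, 0.0], args=(PARAMS,), max_step=0.1)
print("final concentrations (S, I, P):", np.round(sol.y[:, -1], 3))
```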

  19. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  20. Automated smoother for the numerical decoupling of dynamics models.

    Science.gov (United States)

    Vilela, Marco; Borges, Carlos C H; Vinga, Susana; Vasconcelos, Ana Tereza R; Santos, Helena; Voit, Eberhard O; Almeida, Jonas S

    2007-08-21

    Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients which directly identify the topology of the underlying network of processes. We have previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure has a general applicability to signal extraction. Our original implementation relied on artificial neural networks (ANN), which caused slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose here an adaptation of the Whittaker's smoother and demonstrate its role within a robust, fully automated structure identification procedure. In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. The Whittaker's smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in-vivo NMR experiments. The smoothed solution that is free of parametric bias permits differentiation, which is crucial for the numerical decoupling of systems of differential equations. The method is applicable in signal extraction from time series with nonstationary noise structure and can be applied in the numerical decoupling of system of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of mechanistic model descriptions from multivariate experimental
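    The classical Whittaker smoother at the heart of the method is a penalised least-squares problem: minimise ||y - z||^2 + lambda * ||D z||^2, giving z = (I + lambda * D'D)^(-1) y with D a difference operator. The sketch below implements only this basic smoother with dense linear algebra for clarity (sparse solvers scale better); the adaptive signal segmentation and information-theoretic tuning described in the abstract are not reproduced.

```python
import numpy as np

def whittaker_smooth(y, lam=100.0, order=2):
    """Classical Whittaker smoother: minimise ||y - z||^2 + lam * ||D z||^2,
    where D is the order-th difference operator; solves (I + lam * D'D) z = y."""
    n = len(y)
    D = np.diff(np.eye(n), n=order, axis=0)          # difference operator as a dense matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# Noisy toy time course
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0 * np.pi, 200)
noisy = np.sin(t) + rng.normal(0.0, 0.2, t.size)
smooth = whittaker_smooth(noisy, lam=200.0)
print("rms error of noisy data  :", round(float(np.sqrt(np.mean((noisy - np.sin(t)) ** 2))), 3))
print("rms error after smoothing:", round(float(np.sqrt(np.mean((smooth - np.sin(t)) ** 2))), 3))
```

    Because the smoothed estimate is free of parametric bias, it can be differentiated numerically (for example with finite differences), which is the property the decoupling procedure relies on.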

  1. Cellular-automata model of the dwarf shrubs populations and communities dynamics

    Directory of Open Access Journals (Sweden)

    A. S. Komarov

    2015-06-01

    A probabilistic cellular-automata model of the development and long-term dynamics of dwarf shrub populations and communities is developed. It is based on a discrete description of plant ontogenesis combined with modelling approaches in terms of probabilistic cellular automata and Lindenmayer L-systems. A compact formulation of the basic model allows the approach to be evaluated and implemented in software. The main variables of the model are the number of partial bushes in clones or the projective cover area. The model allows the conditions for self-maintenance and sustainability of a population to be investigated under different environmental conditions (inaccessibility of the territory for settlement, mosaic soil moisture and fertility). It also provides a forecast of the dynamics of total shrub biomass and its fractions (stems, leaves, roots, fine roots, fruits) on the basis of the data obtained in the discrete description of ontogenesis and further information on the productivity of the plant fractions. Coupling the joint dynamics of shrub and soil biomass with the EFIMOD models of the carbon and nitrogen cycles makes it possible to evaluate the role of shrubs in these cycles, especially under strong impacts such as forest fires and clear cutting, to forecast the dynamics of populations and of the ecosystem functions of shrubs (regulation of biogeochemical cycles, maintenance of biodiversity, contribution to non-wood products) under changing climatic conditions and strong damaging effects (logging, fires), and to apply the models developed to investigate the stability and productivity of shrubs and their participation in the carbon and nitrogen cycles under different climatic and edaphic conditions.

  2. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    International Nuclear Information System (INIS)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke

    2015-01-01

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  4. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery.

    Science.gov (United States)

    Yu, Victoria Y; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A; Sheng, Ke

    2015-11-01

    Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was attributed to phantom setup

  5. Modelling land-use effects of future urbanization using cellular automata

    DEFF Research Database (Denmark)

    Fuglsang, Morten; Münier, B.; Hansen, H.S.

    2013-01-01

    The modelling of land use change is a way to analyse future scenarios by modelling different pathways. Application of spatial data of different scales coupled with socio-economic data makes it possible to explore and test the understanding of land use change relations. In the EU-FP7 research project PASHMINA (Paradigm Shift modelling and innovative approaches), three storylines of future transportation paradigm shifts towards 2040 are created. These storylines are translated into spatial planning strategies and modelled using the cellular automata model LUCIA. For the modelling, an Eastern Danish case area was selected, comprising the Copenhagen metropolitan area and its hinterland. The different scenarios are described using a range of different descriptive GIS datasets. These include mapping of accessibility based on public and private transportation, urban density and structure...

  6. Collaborative Model-based Systems Engineering for Cyber-Physical Systems, with a Building Automation Case Study

    DEFF Research Database (Denmark)

    Fitzgerald, John; Gamble, Carl; Payne, Richard

    2016-01-01

    We describe an approach to the model-based engineering of cyber-physical systems that permits the coupling of diverse discrete-event and continuous-time models and their simulators. A case study in the building automation domain demonstrates how such co-models and co-simulation can promote early...

  7. Cellular automata analysis and applications

    CERN Document Server

    Hadeler, Karl-Peter

    2017-01-01

    This book focuses on a coherent representation of the main approaches to analyzing the dynamics of cellular automata. Cellular automata are an indispensable tool in mathematical modeling. In contrast to classical modeling approaches such as partial differential equations, cellular automata are straightforward to simulate but hard to analyze. In this book we present a review of approaches and theories that allow the reader to understand the behavior of cellular automata beyond simulations. The first part consists of an introduction to cellular automata on Cayley graphs, and their characterization via the fundamental Curtis-Hedlund-Lyndon theorems in the context of different topological concepts (Cantor, Besicovitch and Weyl topology). The second part focuses on classification results: what classification follows from topological concepts (Hurley classification), Lyapunov stability (Gilman classification), and the theory of formal languages and grammars (Kůrka classification). These classifications suggest to cluster cel...

  8. Toward Automated Inventory Modeling in Life Cycle Assessment: The Utility of Semantic Data Modeling to Predict Real-World Chemical Production

    Science.gov (United States)

    A set of coupled semantic data models, i.e., ontologies, are presented to advance a methodology towards automated inventory modeling of chemical manufacturing in life cycle assessment. The cradle-to-gate life cycle inventory for chemical manufacturing is a detailed collection of ...

  9. Cellular Particle Dynamics simulation of biomechanical relaxation processes of multi-cellular systems

    Science.gov (United States)

    McCune, Matthew; Kosztin, Ioan

    2013-03-01

    Cellular Particle Dynamics (CPD) is a theoretical-computational-experimental framework for describing and predicting the time evolution of biomechanical relaxation processes of multi-cellular systems, such as fusion, sorting and compression. In CPD, cells are modeled as an ensemble of cellular particles (CPs) that interact via short range contact interactions, characterized by an attractive (adhesive interaction) and a repulsive (excluded volume interaction) component. The time evolution of the spatial conformation of the multicellular system is determined by following the trajectories of all CPs through numerical integration of their equations of motion. Here we present CPD simulation results for the fusion of both spherical and cylindrical multi-cellular aggregates. First, we calibrate the relevant CPD model parameters for a given cell type by comparing the CPD simulation results for the fusion of two spherical aggregates to the corresponding experimental results. Next, CPD simulations are used to predict the time evolution of the fusion of cylindrical aggregates. The latter is relevant for the formation of tubular multi-cellular structures (i.e., primitive blood vessels) created by the novel bioprinting technology. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
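    A minimal sketch of the CPD ingredients is given below: cellular particles interact through a short-range pair force with a repulsive core (excluded volume) and an adhesive tail, and two small aggregates are relaxed with a simplified overdamped, noise-free update in place of the full equations of motion used in CPD. All parameter values are invented and two-dimensional positions are used; none of this is the calibrated model from the paper.

```python
import numpy as np

# Illustrative constants, not the calibrated CPD parameters from the paper
R_CORE, R_CUT, K_REP, K_ADH, DT, MOBILITY, STEPS = 1.0, 2.5, 25.0, 5.0, 1e-3, 1.0, 3000

def pair_forces(pos):
    """Short-range pair forces between cellular particles: a repulsive core
    below R_CORE (excluded volume) and an adhesive pull out to R_CUT."""
    rij = pos[None, :, :] - pos[:, None, :]              # vectors from particle i to j
    dist = np.linalg.norm(rij, axis=-1)
    np.fill_diagonal(dist, np.inf)                       # no self-interaction
    dist = np.maximum(dist, 1e-9)                        # guard against exact overlaps
    mag = np.where(dist < R_CORE, K_REP * (R_CORE - dist), -K_ADH * (dist - R_CORE))
    mag = np.where(dist > R_CUT, 0.0, mag)               # truncate beyond the cutoff
    return -(mag[..., None] * rij / dist[..., None]).sum(axis=1)

# Two round "aggregates" side by side; the overdamped update lets them merge
rng = np.random.default_rng(0)
pos = np.vstack([rng.normal([-2.0, 0.0], 0.5, size=(30, 2)),
                 rng.normal([+2.0, 0.0], 0.5, size=(30, 2))])
for _ in range(STEPS):
    pos += MOBILITY * pair_forces(pos) * DT              # simplified overdamped dynamics
print("x-extent of the fused aggregate:", round(float(pos[:, 0].max() - pos[:, 0].min()), 2))
```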

  10. Implementation and automated validation of the minimal Z' model in FeynRules

    International Nuclear Information System (INIS)

    Basso, L.; Christensen, N.D.; Duhr, C.; Fuks, B.; Speckner, C.

    2012-01-01

    We describe the implementation of a well-known class of U(1) gauge models, the 'minimal' Z' models, in FeynRules. We also describe a new automated validation tool for FeynRules models which is controlled by a web interface and allows the user to run a complete set of 2 → 2 processes on different matrix element generators, different gauges, and compare between them all. If existing, the comparison with independent implementations is also possible. This tool has been used to validate our implementation of the 'minimal' Z' models. (authors)

  11. Assessing user acceptance towards automated and conventional sink use for hand decontamination using the technology acceptance model.

    Science.gov (United States)

    Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca

    2017-12-01

    Hand hygiene (HH) prevents harmful contaminants spreading in settings including domestic, health care and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance towards a new automated sink, compared to a normal sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink, compared to the conventional sink (p technology. We provide recommendations for future HH technology development to contribute a positive user experience, relevant to technology developers, ergonomists and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.

  12. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used at the working face include the shearer and the self-advancing frame. The shearer has been changed from a radio-controlled model to a microcomputer-operated machine, with various functions automated. In addition, a system is being developed for comprehensively examining operating conditions and natural conditions at the working face for further automation. The self-advancing frame has been modified from a sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller and more reliable. In the future the system will be controlled from above ground, provided that the machines at the working face are remote-controlled at the gate while the relevant data are transmitted above ground from this system. Thus, an automated working face will be realized. (2 figs, 1 photo)

  13. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention preserving stochastic semantics able to model both probabilistic- and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.

  14. Automated adaptive inference of phenomenological dynamical models

    Science.gov (United States)

    Daniels, Bryan

    Understanding the dynamics of biochemical systems can seem impossibly complicated at the microscopic level: detailed properties of every molecular species, including those that have not yet been discovered, could be important for producing macroscopic behavior. The profusion of data in this area has raised the hope that microscopic dynamics might be recovered in an automated search over possible models, yet the combinatorial growth of this space has limited these techniques to systems that contain only a few interacting species. We take a different approach inspired by coarse-grained, phenomenological models in physics. Akin to a Taylor series producing Hooke's Law, forgoing microscopic accuracy allows us to constrain the search over dynamical models to a single dimension. This makes it feasible to infer dynamics with very limited data, including cases in which important dynamical variables are unobserved. We name our method Sir Isaac after its ability to infer the dynamical structure of the law of gravitation given simulated planetary motion data. Applying the method to output from a microscopically complicated but macroscopically simple biological signaling model, it is able to adapt the level of detail to the amount of available data. Finally, using nematode behavioral time series data, the method discovers an effective switch between behavioral attractors after the application of a painful stimulus.

  15. Concept of a cognitive-numeric plant and process modelizer

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1990-01-01

    To achieve automatic modeling of plant disturbances and failure-limitation procedures, the system's hardware and the media present (water, steam, coolant fluid) are first formalized into fully computable matrices, called topographies. Secondly, a microscopic cellular automaton model, using lattice gases and state transition rules, is combined with a semi-microscopic cellular process model and with a macroscopic model. At the semi-microscopic level, a cellular data compressor, a feature-detection device and the process dynamics of the Intelligent Physical Elements are active. At the macroscopic level, the Walking Process Elements, a process-evolving module, a test-and-manage device and an abstracting process net are involved. Additionally, a diagnosis-coordinating device and a countermeasure-coordinating device are used. In order to obtain process insights automatically, object transformations, elementary process functions and associative methods are used. Developments of optoelectronic hardware language components are under consideration.

  16. Fractal growth of tumors and other cellular populations: Linking the mechanistic to the phenomenological modeling and vice versa

    International Nuclear Information System (INIS)

    D'Onofrio, Alberto

    2009-01-01

    In this paper we study and extend the mechanistic mean field theory of growth of cellular populations proposed by Mombach et al. [Mombach JCM, Lemke N, Bodmann BEJ, Idiart MAP. A mean-field theory of cellular growth. Europhys Lett 2002;59:923-928] (MLBI model), and we demonstrate that the original model and our generalizations lead to inferences of biological interest. In the first part of this paper, we show that the model in study is widely general since it admits, as particular cases, the main phenomenological models of cellular growth. In the second part of this work, we generalize the MLBI model to a wider family of models by allowing the cells to have a generic unspecified biologically plausible interaction. Then, we derive a relationship between this generic microscopic interaction function and the growth rate of the corresponding macroscopic model. Finally, we propose to use this relationship in order to help the investigation of the biological plausibility of phenomenological models of cancer growth.

  17. Boundary Induced Phase Transition in Cellular Automata Models of Pedestrian Flow

    Czech Academy of Sciences Publication Activity Database

    Bukáček, M.; Hrabák, Pavel

    2016-01-01

    Roč. 11, č. 4 (2016), s. 327-338 ISSN 1557-5969 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Adaptive time-span * Cellular automata model * Floor-field * Pedestrian flow * Phase transition * Principle of bonds Subject RIV: BD - Theory of Information Impact factor: 0.696, year: 2016

  18. Detecting the Extent of Cellular Decomposition after Sub-Eutectoid Annealing in Rolled UMo Foils

    Energy Technology Data Exchange (ETDEWEB)

    Kautz, Elizabeth J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jana, Saumyadeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Devaraj, Arun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sweet, Lucas E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Joshi, Vineet V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-07-31

    This report presents an automated image processing approach to quantifying microstructure image data, specifically the extent of eutectoid (cellular) decomposition in rolled U-10Mo foils. An image processing approach is used here to be able to quantitatively describe microstructure image data in order to relate microstructure to processing parameters (time, temperature, deformation).
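    The simplest building block of such a pipeline is segmenting the micrograph and reporting an area fraction. The NumPy sketch below does this with a fixed grey-level threshold on a synthetic image; it is a generic illustration, and the report's actual processing chain is more involved and is not reproduced here.

```python
import numpy as np

def decomposed_area_fraction(image, threshold):
    """Fraction of pixels darker than `threshold`, taken as the transformed
    (cellular-decomposition) phase in a grayscale micrograph."""
    return float((image < threshold).mean())

# Synthetic stand-in for a micrograph: a dark circular "decomposed" region in a brighter matrix
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:256, 0:256]
micrograph = np.where((yy - 128) ** 2 + (xx - 128) ** 2 < 60 ** 2, 0.3, 0.8)
micrograph += rng.normal(0.0, 0.02, micrograph.shape)

print(f"decomposed area fraction: {decomposed_area_fraction(micrograph, threshold=0.55):.1%}")
print(f"true area fraction:       {np.pi * 60 ** 2 / 256 ** 2:.1%}")
```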

  19. Cellular automaton modelling of ductile iron microstructure in the thin wall casting

    International Nuclear Information System (INIS)

    Burbelko, A A; Gurgul, D; Kapturkiewicz, W; Górny, M

    2012-01-01

    A mathematical model of globular eutectic solidification in 2D was designed. The proposed model is based on the Cellular Automaton Finite Differences (CA-FD) calculation method. The model has been used to study the growth of primary austenite and globular eutectic grains during ductile iron solidification in a thin-wall casting. The model takes into account, among other things, the non-uniform temperature distribution over the casting wall cross-section, the nucleation kinetics of austenite and graphite grains, and the non-equilibrium nature of interphase boundary migration.

  20. Train flow chaos analysis based on an improved cellular automata model

    International Nuclear Information System (INIS)

    Meng, Xuelei; Xiang, Wanli; Jia, Limin; Xu, Jie

    2015-01-01

    To control chaos in railway traffic flow and offer valuable information to the dispatchers of the railway system, an improved cellular automata model is presented to detect and analyze chaos in the traffic flow. We first introduce the working mechanism of the moving-block system and analyze the characteristics of train flow movement. We then improve the evolution rules of the cellular model to adjust the train flow movement, and give the train operation steps for three cases: trains running on a railway section, a train arriving at a station, and a train departing from a station. We simulate four trains running on a high-speed section equipped with a moving-block system, record the distances between neighboring trains, and draw the Poincaré section to analyze the chaos in train operation. It is concluded that there is not only chaos but also order in a train operation system with a moving-block system, and that the two can interconvert. The findings have potential value for the construction of train dispatching systems and offer supporting information for daily dispatching work.

  1. Model-Based Control for Postal Automation and Baggage Handling

    NARCIS (Netherlands)

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades the volume of magazines, catalogs, and other plastic wrapped mail items that have to be processed by post sorting centers has increased

  2. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Consideration is given to the specific features and possible functions of equipment for the automated estimation of elongated continuity defects in samples with a plane surface in magnetographic defectoscopy. Two models of automated magnetographic flaw detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs.

  3. Cellular automaton model of coupled mass transport and chemical reactions

    International Nuclear Information System (INIS)

    Karapiperis, T.

    1994-01-01

    Mass transport, coupled with chemical reactions, is modelled as a cellular automaton in which solute molecules perform a random walk on a lattice and react according to a local probabilistic rule. Assuming molecular chaos and a smooth density function, we obtain the standard reaction-transport equations in the continuum limit. The model is applied to the reactions a + b ↔ c and a + b → c, where we observe interesting macroscopic effects resulting from microscopic fluctuations and spatial correlations between molecules. We also simulate autocatalytic reaction schemes displaying spontaneous formation of spatial concentration patterns. Finally, we propose and discuss the limitations of a simple model for mineral-solute interaction. (author) 5 figs., 20 refs.
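    A stripped-down, one-dimensional version of the scheme described above can be written in a few lines: solute molecules random-walk on a periodic lattice and a local probabilistic rule converts co-located A-B pairs into C. The lattice size, particle numbers and reaction probability below are arbitrary, and single-site occupancy is not enforced, so this is only an illustration of the general idea.

```python
import numpy as np

L, STEPS, P_REACT = 100, 500, 0.5
rng = np.random.default_rng(0)

# Particle lists: positions of A and B molecules on a 1-D periodic lattice
a = rng.integers(0, L, 200)
b = rng.integers(0, L, 200)
c_count = 0

for _ in range(STEPS):
    # Random walk: every molecule hops one site left or right
    a = (a + rng.choice([-1, 1], a.size)) % L
    b = (b + rng.choice([-1, 1], b.size)) % L
    # Local probabilistic reaction: one A-B pair on a shared site may form C
    for site in np.intersect1d(a, b):
        if rng.random() < P_REACT:
            a = np.delete(a, np.argmax(a == site))   # remove one A at this site
            b = np.delete(b, np.argmax(b == site))   # remove one B at this site
            c_count += 1

print("remaining A, B and produced C:", a.size, b.size, c_count)
```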

  4. Treatment Analysis in a Cancer Stem Cell Context Using a Tumor Growth Model Based on Cellular Automata.

    Science.gov (United States)

    Monteagudo, Ángel; Santos, José

    2015-01-01

    Cancer can be viewed as an emergent behavior in terms of complex system theory and artificial life, Cellular Automata (CA) being the tool most used for studying and characterizing the emergent behavior. Different approaches with CA models were used to model cancer growth. The use of the abstract model of acquired cancer hallmarks permits the direct modeling at cellular level, where a cellular automaton defines the mitotic and apoptotic behavior of cells, and allows for an analysis of different dynamics of the cellular system depending on the presence of the different hallmarks. A CA model based on the presence of hallmarks in the cells, which includes a simulation of the behavior of Cancer Stem Cells (CSC) and their implications for the resultant growth behavior of the multicellular system, was employed. This modeling of cancer growth, in the avascular phase, was employed to analyze the effect of cancer treatments in a cancer stem cell context. The model clearly explains why, after treatment against non-stem cancer cells, the regrowth capability of CSCs generates a faster regrowth of tumor behavior, and also shows that a continuous low-intensity treatment does not favor CSC proliferation and differentiation, thereby allowing an unproblematic control of future tumor regrowth. The analysis performed indicates that, contrary to the current attempts at CSC control, trying to make CSC proliferation more difficult is an important point to consider, especially in the immediate period after a standard treatment for controlling non-stem cancer cell proliferation.

  5. Application of a novel cellular automaton porosity prediction model to aluminium castings

    International Nuclear Information System (INIS)

    Atwood, R.C.; Chirazi, A.; Lee, P.D.

    2002-01-01

    A multiscale model was developed to predict the formation of porosity within a solidifying aluminium-silicon alloy. The diffusion of silicon and dissolved gas was simulated on a microscopic scale combined with cellular automaton models of gas porosity formation within the growing three-dimensional solidification microstructure. However, due to high computational cost, the modelled volume is limited to the millimetre range. This renders the application of direct modelling of complex shape castings unfeasible. Combining the microstructural modelling with a statistical response-surface prediction method allows application of the microstructural model results to industrial scale casts by incorporating them in commercial solidification software. (author)

  6. Simulation of a plane wavefront propagating in cardiac tissue using a cellular automata model

    International Nuclear Information System (INIS)

    Barbosa, Carlos R Hall

    2003-01-01

    We present a detailed description of a cellular automata model for the propagation of action potential in a planar cardiac tissue, which is very fast and easy to use. The model incorporates anisotropy in the electrical conductivity and a spatial variation of the refractory time. The transmembrane potential distribution is directly derived from the cell states, and the intracellular and extracellular potential distributions are calculated for the particular case of a plane wavefront. Once the potential distributions are known, the associated current densities are calculated by Ohm's law, and the magnetic field is determined at a plane parallel to the cardiac tissue by applying the law of Biot and Savart. The results obtained for propagation speed and for magnetic field amplitude with the cellular automata model are compared with values predicted by the bidomain formulation, for various angles between wavefront propagation and fibre direction, characterizing excellent agreement between the models
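    The final step described above, obtaining the magnetic field from the computed current densities, amounts to a numerical Biot-Savart sum. The sketch below evaluates that sum at a single point above a made-up strip of uniform current density standing in for the wavefront currents; the grid, current values and thickness are arbitrary, and this is a crude illustration rather than the cardiac cellular automata model itself.

```python
import numpy as np

MU0 = 4 * np.pi * 1e-7   # vacuum permeability (T*m/A)

def biot_savart(field_point, positions, currents, volume):
    """Crude Biot-Savart sum: B(r) = mu0/4pi * sum J(r') x (r - r') / |r - r'|^3 dV'."""
    r = field_point - positions                          # (n, 3) vectors source -> field point
    dist3 = np.linalg.norm(r, axis=1, keepdims=True) ** 3
    contrib = np.cross(currents, r) / dist3
    return MU0 / (4 * np.pi) * volume * contrib.sum(axis=0)

# Stand-in current sheet: a strip of uniform current density along +x in the z = 0 plane
dx = 1e-3                                                # 1 mm grid spacing
xs, ys = np.meshgrid(np.arange(-0.01, 0.01, dx), np.arange(-0.002, 0.002, dx))
positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
currents = np.tile([1.0, 0.0, 0.0], (positions.shape[0], 1))   # arbitrary J along x (A/m^2)
volume = dx * dx * 1e-3                                  # element volume, assuming 1 mm thickness

B = biot_savart(np.array([0.0, 0.0, 0.02]), positions, currents, volume)
print("B at 2 cm above the sheet centre (T):", B)
```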

  7. Kotai Antibody Builder: automated high-resolution structural modeling of antibodies.

    Science.gov (United States)

    Yamashita, Kazuo; Ikeda, Kazuyoshi; Amada, Karlou; Liang, Shide; Tsuchiya, Yuko; Nakamura, Haruki; Shirai, Hiroki; Standley, Daron M

    2014-11-15

    Kotai Antibody Builder is a Web service for tertiary structural modeling of antibody variable regions. It consists of three main steps: hybrid template selection by sequence alignment and canonical rules, 3D rendering of alignments and CDR-H3 loop modeling. For the last step, in addition to rule-based heuristics used to build the initial model, a refinement option is available that uses fragment assembly followed by knowledge-based scoring. Using targets from the Second Antibody Modeling Assessment, we demonstrate that Kotai Antibody Builder generates models with an overall accuracy equal to that of the best-performing semi-automated predictors using expert knowledge. Kotai Antibody Builder is available at http://kotaiab.org. Contact: standley@ifrec.osaka-u.ac.jp.

  8. Use of noncrystallographic symmetry for automated model building at medium to low resolution

    International Nuclear Information System (INIS)

    Wiegels, Tim; Lamzin, Victor S.

    2012-01-01

    Noncrystallographic symmetry is automatically detected and used to achieve higher completeness and greater accuracy of automatically built protein structures at resolutions of 2.3 Å or poorer. A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that throughout structure determination the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases higher completeness and greater accuracy of the obtained structures were achieved, specifically at a crystallographic resolution of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments

  9. Automated home cage assessment shows behavioral changes in a transgenic mouse model of spinocerebellar ataxia type 17.

    Science.gov (United States)

    Portal, Esteban; Riess, Olaf; Nguyen, Huu Phuc

    2013-08-01

    Spinocerebellar Ataxia type 17 (SCA17) is an autosomal dominantly inherited, neurodegenerative disease characterized by ataxia, involuntary movements, and dementia. A novel SCA17 mouse model having a 71 polyglutamine repeat expansion in the TATA-binding protein (TBP) has shown an age-related motor deficit using a classic motor test, yet concomitant weight increase might be a confounding factor for this measurement. In this study we used an automated home cage system to test several motor readouts for this same model to confirm pathological behavior results and evaluate the benefits of the automated home cage in behavior phenotyping. Our results confirm motor deficits in the Tbp/Q71 mice and present previously unrecognized behavioral characteristics obtained from the automated home cage, indicating its use for high-throughput screening and testing, e.g. of therapeutic compounds.

  10. Addressing population heterogeneity and distribution in epidemics models using a cellular automata approach.

    Science.gov (United States)

    López, Leonardo; Burguerner, Germán; Giovanini, Leonardo

    2014-04-12

    The spread of an infectious disease is determined by biological and social factors. Models based on cellular automata are adequate to describe such natural systems consisting of a massive collection of simple interacting objects. They characterize the time evolution of the global system as the emergent behaviour resulting from the interaction of the objects, whose behaviour is defined through a set of simple rules that encode the individual behaviour and the transmission dynamic. An epidemic is characterized through an individual-based model built upon cellular automata. In the proposed model, each individual of the population is represented by a cell of the automata. This way of modeling an epidemic situation makes it possible to define the characteristics of each individual, establish different scenarios and implement control strategies. A cellular automata model to study the time evolution of a heterogeneous population through the various stages of disease was proposed, allowing the inclusion of individual heterogeneity, geographical characteristics and social factors that determine the dynamics of the disease. Different assumptions made to build the classical model were evaluated, leading to the following results: i) for a low contact rate (as in a quarantine process or in low-density population areas) the number of infective individuals is lower than in other areas where the contact rate is higher, and ii) different initial spatial distributions of infected individuals produce different epidemic dynamics, owing to their influence on the transition rate and the reproductive ratio of the disease. The contact rate and the spatial distributions have a central role in the spread of a disease. For low-density populations the spread is very low and the number of infected individuals is lower than in highly populated areas. The spatial distribution of the population and the disease focus, as well as the geographical characteristics of the area, play a central role in the dynamics of the
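
    As a rough illustration of the kind of lattice model described (not the published implementation), the sketch below updates a susceptible-infective-recovered cellular automaton in which the per-contact infection probability varies in space, so that a low-contact region such as a quarantined or low-density area can be compared with the rest of the grid; all parameter values are made up.

    ```python
    import numpy as np

    S, I, R = 0, 1, 2  # cell states: susceptible, infective, recovered

    def step(grid, beta, gamma, rng):
        """One synchronous update of a lattice SIR cellular automaton.
        beta: per-neighbour infection probability (an array encodes spatially
        varying contact rates); gamma: recovery probability per step."""
        infected = (grid == I)
        n_inf = np.zeros(grid.shape, dtype=int)    # infective von Neumann neighbours
        n_inf[1:, :] += infected[:-1, :]
        n_inf[:-1, :] += infected[1:, :]
        n_inf[:, 1:] += infected[:, :-1]
        n_inf[:, :-1] += infected[:, 1:]
        p_inf = 1.0 - (1.0 - beta) ** n_inf        # at least one successful contact
        new = grid.copy()
        new[(grid == S) & (rng.random(grid.shape) < p_inf)] = I
        new[infected & (rng.random(grid.shape) < gamma)] = R
        return new

    rng = np.random.default_rng(0)
    grid = np.zeros((100, 100), dtype=int)
    grid[50, 50] = I                               # a single initial disease focus
    beta = np.full(grid.shape, 0.3)
    beta[:, :50] = 0.05                            # low-contact half (e.g. quarantine)
    for _ in range(200):
        grid = step(grid, beta, gamma=0.1, rng=rng)
    print("recovered:", (grid == R).sum(), "still infective:", (grid == I).sum())
    ```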

  11. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  12. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-08-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple
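
    The spot-detection step could, in principle, be approximated with a simple threshold-and-label routine of the kind sketched below (hypothetical threshold rule and size cutoff, SciPy's ndimage used for connected-component labelling; this is not the authors' analysis pipeline).

    ```python
    import numpy as np
    from scipy import ndimage

    def count_colonies(gfp, nuclei_count, threshold=None, min_px=4):
        """Toy spot detection: threshold a GFP image, label connected spots, and
        report colony number, size distribution and colonies per host cell."""
        if threshold is None:
            threshold = gfp.mean() + 3 * gfp.std()     # simple global threshold
        mask = gfp > threshold
        labels, n = ndimage.label(mask)                # connected-component labelling
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        sizes = sizes[sizes >= min_px]                 # discard single-pixel noise
        return len(sizes), sizes, len(sizes) / max(nuclei_count, 1)

    # synthetic example: a noisy field containing two bright colonies
    rng = np.random.default_rng(1)
    img = rng.normal(100, 5, (256, 256))
    img[40:48, 60:70] += 200
    img[120:130, 200:206] += 200
    n_col, sizes, per_cell = count_colonies(img, nuclei_count=30)
    print(n_col, "colonies,", per_cell, "per cell")
    ```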

  13. Bridging the gap: linking molecular simulations and systemic descriptions of cellular compartments.

    Directory of Open Access Journals (Sweden)

    Tihamér Geyer

    Full Text Available Metabolic processes in biological cells are commonly either characterized at the level of individual enzymes and metabolites or at the network level. Often these two paradigms are considered as mutually exclusive because concepts from neither side are suited to describe the complete range of scales. Additionally, when modeling metabolic or regulatory cellular systems, often a large fraction of the required kinetic parameters are unknown. This even applies to such simple and extensively studied systems like the photosynthetic apparatus of purple bacteria. Using the chromatophore vesicles of Rhodobacter sphaeroides as a model system, we show that a consistent kinetic model emerges when fitting the dynamics of a molecular stochastic simulation to a set of time dependent experiments even though about two thirds of the kinetic parameters in this system are not known from experiment. Those kinetic parameters that were previously known all came out in the expected range. The simulation model was built from independent protein units composed of elementary reactions processing single metabolites. This pools-and-proteins approach naturally compiles the wealth of available molecular biological data into a systemic model and can easily be extended to describe other systems by adding new protein or nucleic acid types. The automated parameter optimization, performed with an evolutionary algorithm, reveals the sensitivity of the model to the value of each parameter and the relative importances of the experiments used. Such an analysis identifies the crucial system parameters and guides the setup of new experiments that would add most knowledge for a systemic understanding of cellular compartments. The successful combination of the molecular model and the systemic parametrization presented here on the example of the simple machinery for bacterial photosynthesis shows that it is actually possible to combine molecular and systemic modeling. This framework can now

  14. Guided Inquiry and Consensus-Building Used to Construct Cellular Models

    Directory of Open Access Journals (Sweden)

    Joel I. Cohen

    2015-02-01

    Full Text Available Using models helps students learn from a “whole systems” perspective when studying the cell. This paper describes a model that employs guided inquiry and requires consensus building among students for its completion. The model is interactive, meaning that it expands upon a static model which, once completed, cannot be altered and additionally relates various levels of biological organization (molecular, organelle, and cellular) to define cell and organelle function and interaction. Learning goals are assessed using data summed from final grades and from images of the student’s final cell model (plant, bacteria, and yeast) taken from diverse seventh grade classes. Instructional figures showing consensus-building pathways and seating arrangements are discussed. Results suggest that the model leads to a high rate of participation, facilitates guided inquiry, and fosters group and individual exploration by challenging student understanding of the living cell.

  15. Optical scatter imaging of cellular and mitochondrial swelling in brain tissue models of stroke

    Science.gov (United States)

    Johnson, Lee James

    2001-08-01

    The severity of brain edema resulting from a stroke can determine a patient's survival and the extent of their recovery. Cellular swelling is the microscopic source of a significant part of brain edema. Mitochondrial swelling also appears to be a determining event in the death or survival of the cells that are injured during a stroke. Therapies for reducing brain edema are not effective in many cases and current treatments of stroke do not address mitochondrial swelling at all. This dissertation is motivated by the lack of a complete understanding of cellular swelling resulting from stroke and the lack of a good method to begin to study mitochondrial swelling resulting from stroke in living brain tissue. In this dissertation, a novel method of detecting mitochondrial and cellular swelling in living hippocampal slices is developed and validated. The system is used to obtain spatial and temporal information about cellular and mitochondrial swelling resulting from various models of stroke. The effect of changes in water content on light scatter and absorption is examined in two models of brain edema. The results of this study demonstrate that optical techniques can be used to detect changes in water content. Mie scatter theory, the theoretical basis of the dual-angle scatter ratio imaging system, is presented. Computer simulations based on Mie scatter theory are used to determine the optimal angles for imaging. A detailed account of the early systems is presented to explain the motivations for the system design, especially polarization, wavelength and light path. Mitochondrial-sized latex particles are used to determine the system response to changes in scattering particle size and concentration. The dual-angle scatter ratio imaging system is used to distinguish between osmotic and excitotoxic models of stroke injury. Such distinction cannot be achieved using the current techniques to study cellular swelling in hippocampal slices. The change in the scatter ratio is

  16. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.

  17. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    The purpose of the programme activity on human-centred automation at the HRP is to develop knowledge (in the form of models and theories) and tools (in the form of techniques and simulators) to support design of automation that ensures effective human performance and comprehension. This report presents the work done on both the analytical and experimental side of this project. The analytical work has surveyed common definitions of automation and traditional design principles. A general finding is that human-centred automation is usually defined in terms of what it is not. This is partly due to a lack of adequate models of human-automation interaction. Another result is a clarification of the consequences of automation, in particular with regard to situation awareness and workload. The experimental work has taken place as an explorative experiment in HAMMLAB in collaboration with IPSN (France). The purpose of this experiment was to increase the understanding of how automation influences operator performance in NPP control rooms. Two different types of automation (extensive and limited) were considered in scenarios having two different degrees of complexity (high and low), and involving diagnostic and procedural tasks. Six licensed NPP crews from the NPP at Loviisa, Finland, participated in the experiment. The dependent variables applied were plant performance, operator performance, self-rated crew performance, situation awareness, workload, and operator trust in the automation. The results from the diagnostic scenarios indicated that operators' judgement of crew efficiency was related to their level of trust in the automation, and further that operators trusted automation least and rated crew performance lowest in situations where crew performance was efficient and vice versa. The results from procedural scenarios indicated that extensive automation efficiently supported operators' performance, and further that operators' judgement of crew performance efficiency

  18. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    International Nuclear Information System (INIS)

    Roy, Gareth; Crooks, David; Mertens, Lena; Mitchell, Mark; Skipsey, Samuel Cadellin; Britton, David; Purdie, Stuart

    2014-01-01

    With the current trend towards 'On Demand Computing' in big data environments it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large scale data centre environments but these solutions can be too complex and heavyweight for smaller, resource constrained WLCG Tier-2 sites. Along with a greater desire for bespoke monitoring and collection of Grid related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on 'off the shelf' software components. As part of the research into an automation framework the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data sampling layer such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced downtime and better performance as services are recognised to be in a non-functional state by autonomous systems.

  19. Simulation of emotional contagion using modified SIR model: A cellular automaton approach

    Science.gov (United States)

    Fu, Libi; Song, Weiguo; Lv, Wei; Lo, Siuming

    2014-07-01

    Emotion plays an important role in the decision-making of individuals in some emergency situations. The contagion of emotion may induce either normal or abnormal consolidated crowd behavior. This paper aims to simulate the dynamics of emotional contagion among crowds by modifying the epidemiological SIR model to a cellular automaton approach. This new cellular automaton model, entitled the “CA-SIRS model”, captures the dynamic process 'susceptible-infected-recovered-susceptible', which is based on SIRS contagion in epidemiological theory. Moreover, in this new model, the process is integrated with individual movement. The simulation results of this model show that multiple waves and dynamical stability around a mean value will appear during emotion spreading. It was found that the proportion of initial infected individuals had little influence on the final stable proportion of infected population in a given system, and that infection frequency increased with an increase in the average crowd density. Our results further suggest that individual movement accelerates the spread speed of emotion and increases the stable proportion of infected population. Furthermore, decreasing the duration of an infection and the probability of reinfection can markedly reduce the number of infected individuals. It is hoped that this study will be helpful in crowd management and evacuation organization.
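
    A minimal sketch of a CA-SIRS-style update (not the published CA-SIRS model itself): cells become infected through neighbour contact, recover after a fixed infection duration, and lose immunity after a fixed immune period so that reinfection and the reported oscillations can occur. Individual movement is omitted here, and all parameter values are illustrative only.

    ```python
    import numpy as np

    S, I, R = 0, 1, 2

    def sirs_step(state, timer, p_inf, t_inf, t_imm, rng):
        """One step of a SIRS cellular automaton: S->I by neighbour contact,
        I->R after t_inf steps, R->S (reinfection possible) after t_imm steps."""
        inf = (state == I)
        n = np.zeros(state.shape, int)
        for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
            n += np.roll(inf, shift, axis)           # periodic von Neumann neighbourhood
        new = state.copy()
        catch = (state == S) & (rng.random(state.shape) < 1 - (1 - p_inf) ** n)
        new[catch] = I
        timer = np.where(catch, 0, timer + 1)        # age of the current state
        recover = (state == I) & (timer >= t_inf)
        new[recover] = R
        timer[recover] = 0
        new[(state == R) & (timer >= t_imm)] = S     # immunity wanes
        return new, timer

    rng = np.random.default_rng(0)
    state = np.zeros((80, 80), int)
    state[rng.random(state.shape) < 0.02] = I        # initial infected proportion
    timer = np.zeros(state.shape, int)
    frac = []
    for _ in range(300):
        state, timer = sirs_step(state, timer, p_inf=0.2, t_inf=6, t_imm=20, rng=rng)
        frac.append((state == I).mean())
    print("infected fraction settles near", round(float(np.mean(frac[-50:])), 3))
    ```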

  20. Cellular automata model for traffic flow at intersections in internet of vehicles

    Science.gov (United States)

    Zhao, Han-Tao; Liu, Xin-Ru; Chen, Xiao-Xu; Lu, Jian-Cheng

    2018-03-01

    Considering the effect of the front vehicle's speed, the influence of the brake light and the conflict of the traffic flow, we established a cellular automata model, called CE-NS, for traffic flow at intersections in a non-IoV (non-vehicle-networking) environment. Using the information interaction of the Internet of Vehicles (IoV), we introduced parameters describing congestion and the exact speed of the front vehicle into the CE-NS model, improved the rules of acceleration, deceleration and conflict, and finally established a cellular automata model for traffic flow at intersections under IoV. The relationship between traffic parameters such as vehicle speed, flow and average travel time is obtained by numerical simulation of the two models. On this basis, we compared the traffic situation of the non-IoV environment with that of the IoV environment and analyzed the influence of different degrees of IoV on the traffic flow. The results show that under the IoV environment the traffic speed is increased, the travel time is reduced, the flux of intersections is increased and the traffic flow is smoother. Once the proportion of IoV-equipped vehicles reaches a certain level, the operation of the traffic flow begins to improve noticeably.
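
    For orientation, the sketch below implements the four update rules of the basic Nagel-Schreckenberg (NS) cellular automaton on a ring road; the CE-NS and IoV variants described above layer brake-light, leader-speed and intersection-conflict rules on top of these steps. All numbers are illustrative.

    ```python
    import numpy as np

    def nasch_step(pos, vel, L, vmax=5, p_slow=0.3, rng=None):
        """One update of the basic Nagel-Schreckenberg cellular automaton on a
        ring road of L cells: acceleration, gap-based deceleration, random
        slowdown, movement."""
        rng = rng or np.random.default_rng()
        order = np.argsort(pos)                        # keep vehicles sorted along the ring
        pos, vel = pos[order], vel[order]
        gaps = (np.roll(pos, -1) - pos - 1) % L        # empty cells to the leader
        vel = np.minimum(vel + 1, vmax)                # 1. acceleration
        vel = np.minimum(vel, gaps)                    # 2. avoid collisions
        slow = rng.random(len(vel)) < p_slow
        vel[slow] = np.maximum(vel[slow] - 1, 0)       # 3. random slowdown
        pos = (pos + vel) % L                          # 4. movement
        return pos, vel

    rng = np.random.default_rng(0)
    L, N = 200, 40
    pos = np.sort(rng.choice(L, N, replace=False))
    vel = np.zeros(N, int)
    for _ in range(500):
        pos, vel = nasch_step(pos, vel, L, rng=rng)
    print("mean speed:", vel.mean(), "flow:", vel.mean() * N / L)
    ```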

  1. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  2. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    Science.gov (United States)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of A Quantum Interconnected Network Array Simulator (AQUINAS) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.

  3. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  4. Treatment Analysis in a Cancer Stem Cell Context Using a Tumor Growth Model Based on Cellular Automata.

    Directory of Open Access Journals (Sweden)

    Ángel Monteagudo

    Full Text Available Cancer can be viewed as an emergent behavior in terms of complex system theory and artificial life, Cellular Automata (CA) being the tool most used for studying and characterizing the emergent behavior. Different approaches with CA models were used to model cancer growth. The use of the abstract model of acquired cancer hallmarks permits the direct modeling at cellular level, where a cellular automaton defines the mitotic and apoptotic behavior of cells, and allows for an analysis of different dynamics of the cellular system depending on the presence of the different hallmarks. A CA model based on the presence of hallmarks in the cells, which includes a simulation of the behavior of Cancer Stem Cells (CSC) and their implications for the resultant growth behavior of the multicellular system, was employed. This modeling of cancer growth, in the avascular phase, was employed to analyze the effect of cancer treatments in a cancer stem cell context. The model clearly explains why, after treatment against non-stem cancer cells, the regrowth capability of CSCs generates a faster regrowth of tumor behavior, and also shows that a continuous low-intensity treatment does not favor CSC proliferation and differentiation, thereby allowing an unproblematic control of future tumor regrowth. The analysis performed indicates that, contrary to the current attempts at CSC control, trying to make CSC proliferation more difficult is an important point to consider, especially in the immediate period after a standard treatment for controlling non-stem cancer cell proliferation.

  5. Bacterial growth on surfaces: Automated image analysis for quantification of growth rate-related parameters

    DEFF Research Database (Denmark)

    Møller, S.; Sternberg, Claus; Poulsen, L. K.

    1995-01-01

    species-specific hybridizations with fluorescence-labelled ribosomal probes to estimate the single-cell concentration of RNA. By automated analysis of digitized images of stained cells, we determined four independent growth rate-related parameters: cellular RNA and DNA contents, cell volume, and the frequency of dividing cells in a cell population. These parameters were used to compare physiological states of liquid-suspended and surface-growing Pseudomonas putida KT2442 in chemostat cultures. The major finding is that the correlation between substrate availability and cellular growth rate found...

  6. Model for spatial synthesis of automated control system of the GCR type reactor; Model za prostornu sintezu sistema automatskog upravljanja reaktora GCR tipa

    Energy Technology Data Exchange (ETDEWEB)

    Lazarevic, B; Matausek, M [Institut za nuklearne nauke ' Boris Kidric' , Vinca, Belgrade (Yugoslavia)

    1966-07-01

    This paper describes a model developed for the synthesis of the spatial distribution of automated control elements in the reactor. It represents a general, reliable mathematical model for analyzing transient states and for the synthesis of the automated control and regulation systems of GCR-type reactors. A one-dimensional system was defined under the assumption that the time dependence of the parameters of the neutron diffusion equation is identical throughout the reactor volume and that the spatial distribution of neutrons is time independent. It is shown that this assumption is satisfactory for the short-term variations that are relevant to safety analysis.

  7. Automation of the Jarrell-Ash model 70-314 emission spectrometer

    International Nuclear Information System (INIS)

    Morris, W.F.; Fisher, E.R.; Taber, L.

    1978-01-01

    Automation of the Jarrell-Ash 3.4-Meter Ebert direct-reading emission spectrometer with digital scaler readout is described. The readout is interfaced to a Data General NOVA 840 minicomputer. The automation code consists of BASIC language programs for interactive routines, data processing, and report generation. Call statements within the BASIC programs invoke assembly language routines for real-time data acquisition and control. In addition, the automation objectives as well as the spectrometer-computer system functions, coding, and operating instructions are presented

  8. Cellular automaton modeling of ductile iron microstructure in the thin wall

    Directory of Open Access Journals (Sweden)

    A.A. Burbelko

    2011-10-01

    Full Text Available A mathematical model of globular eutectic solidification in 2D was designed. The proposed model is based on the Cellular Automaton Finite Differences (CA-FD) calculation method. The model has been used to study the growth of primary austenite and globular eutectic grains during the solidification of ductile iron with different carbon equivalents in a thin-wall casting. The model takes into account, among other things, the non-uniform temperature distribution in the casting wall cross-section, the kinetics of austenite and graphite grain nucleation, and the non-equilibrium nature of interphase boundary migration. Solidification of the DI with different carbon equivalents was analyzed. The obtained results were compared with the solidification path calculated by the CALPHAD method.

  9. Component-based modeling of systems for automated fault tree generation

    International Nuclear Information System (INIS)

    Majdara, Aref; Wakabayashi, Toshio

    2009-01-01

    One of the challenges in the field of automated fault tree construction is to find an efficient modeling approach that can support modeling of different types of systems without ignoring any necessary details. In this paper, we present a new system modeling approach for computer-aided fault tree generation. In this method, every system model is composed of some components and different types of flows propagating through them. Each component has a function table that describes its input-output relations. For the components having different operational states, there is also a state transition table. Each component can communicate with other components in the system only through its inputs and outputs. A trace-back algorithm is proposed that can be applied to the system model to generate the required fault trees. The system modeling approach and the fault tree construction algorithm are applied to a fire sprinkler system and the results are presented
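
    A toy version of such a trace-back over component function tables might look like the following (hypothetical component names, flows and failure events; an illustration of the idea, not the algorithm published in the paper).

    ```python
    # Each component maps a deviated output flow to the input deviations and
    # internal failures that can cause it; a recursive trace-back from the top
    # event assembles an AND/OR fault tree.
    from dataclasses import dataclass, field

    @dataclass
    class Component:
        name: str
        # function table: output event -> list of alternative causes (OR),
        # each cause being a set of simultaneously required events (AND)
        table: dict = field(default_factory=dict)

    def trace_back(event, components, feeds):
        """event: (component, flow deviation); feeds maps a component's input
        deviation to the upstream (component, output deviation) producing it."""
        comp_name, deviation = event
        causes = components[comp_name].table.get(deviation)
        if not causes:                          # basic event: no further development
            return {"event": event, "basic": True}
        gate = {"event": event, "OR": []}
        for conj in causes:
            branch = [trace_back(feeds.get((comp_name, c), (comp_name, c)),
                                 components, feeds) for c in conj]
            gate["OR"].append({"AND": branch} if len(branch) > 1 else branch[0])
        return gate

    # toy system: a pump feeds a sprinkler nozzle; "no water at nozzle" is the top event
    pump = Component("pump", {"no flow out": [{"pump failed"}, {"no power in"}]})
    nozzle = Component("nozzle", {"no water out": [{"nozzle blocked"}, {"no flow in"}]})
    components = {"pump": pump, "nozzle": nozzle}
    feeds = {("nozzle", "no flow in"): ("pump", "no flow out")}
    print(trace_back(("nozzle", "no water out"), components, feeds))
    ```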

  10. Probing Cellular Dynamics with Mesoscopic Simulations

    DEFF Research Database (Denmark)

    Shillcock, Julian C.

    2010-01-01

    Cellular processes span a huge range of length and time scales from the molecular to the near-macroscopic. Understanding how effects on one scale influence, and are themselves influenced by, those on lower and higher scales is a critical issue for the construction of models in Systems Biology....... Advances in computing hardware and software now allow explicit simulation of some aspects of cellular dynamics close to the molecular scale. Vesicle fusion is one example of such a process. Experiments, however, typically probe cellular behavior from the molecular scale up to microns. Standard particle...... soon be coupled to Mass Action models allowing the parameters in such models to be continuously tuned according to the finer resolution simulation. This will help realize the goal of a computational cellular simulation that is able to capture the dynamics of membrane-associated processes...

  11. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume is divided into six sessions on the basis of the classification of the manuscripts considered, listed as follows: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, Automation and Mechatronics.

  12. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  13. Cellular automaton model for hydrogen transport dynamics through metallic surface

    International Nuclear Information System (INIS)

    Shimura, K.; Yamaguchi, K.; Terai, T.; Yamawaki, M.

    2002-01-01

    Hydrogen re-emission and recombination at the surface of first wall materials are a crucial issue for the understanding of fuel recycling and for the tritium inventory in plasma-facing materials. It is known to be difficult to model the transient behaviour of those processes due to their complex, time-dependent nature. However, cellular automata (CA) are powerful tools to model such complex systems because of their discreteness in both dependent and independent variables; the system can then be represented by fully local interactions between cells. For that reason, complex physical and chemical systems can be described in a fairly simple manner. In this study, the kinetics of desorption of adsorbed hydrogen from an ideal metallic surface is modelled with CA. Thermal desorption is simulated with this model and a comparison with the theory of rate processes is performed to verify the validity of the model. The overall results show that this model reasonably expresses the desorption kinetics
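
    As a rough illustration of how a lattice model can reproduce second-order (recombinative) desorption during a temperature ramp, the sketch below lets neighbouring adsorbed atoms desorb pairwise with an Arrhenius probability; the rate constants, coverage and ramp are invented, and the update rule is far simpler than the CA described above.

    ```python
    import numpy as np

    KB = 8.617e-5                                   # Boltzmann constant (eV/K)

    def tds_trace(n=128, theta0=0.8, E_des=1.0, nu=1e13, dT=0.5, dt=1e-4, rng=None):
        """Toy lattice sketch of recombinative desorption during a linear
        temperature ramp (dT kelvin per step of duration dt). Every step, each
        occupied site attempts to desorb together with its right-hand neighbour
        with probability p = nu*dt*exp(-E_des/kT). Returns (T, desorption rate)."""
        rng = rng or np.random.default_rng(0)
        occ = rng.random((n, n)) < theta0           # adsorbed hydrogen atoms
        T, temps, rates = 300.0, [], []
        while occ.any() and T < 1500.0:
            p = nu * dt * np.exp(-E_des / (KB * T))
            right = np.roll(occ, -1, axis=1)        # periodic right-hand neighbour
            pairs = occ & right & (rng.random(occ.shape) < p)
            occ &= ~pairs                           # remove the left partner
            occ &= ~np.roll(pairs, 1, axis=1)       # remove the right partner
            temps.append(T); rates.append(pairs.sum() / dt)
            T += dT
        return np.array(temps), np.array(rates)

    T, r = tds_trace()
    print("peak desorption temperature (K):", T[np.argmax(r)])
    ```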

  14. A Computational model for compressed sensing RNAi cellular screening

    Directory of Open Access Journals (Sweden)

    Tan Hua

    2012-12-01

    Full Text Available Abstract Background RNA interference (RNAi) has become an increasingly important and effective genetic tool to study the function of target genes by suppressing specific genes of interest. This system approach helps identify signaling pathways and cellular phase types by tracking intensity and/or morphological changes of cells. The traditional RNAi screening scheme, in which one siRNA is designed to knock down one specific mRNA target, needs a large library of siRNAs and turns out to be time-consuming and expensive. Results In this paper, we propose a conceptual model, called compressed sensing RNAi (csRNAi), which employs a unique combination of groups of small interfering RNAs (siRNAs) to knock down a much larger set of genes. This strategy is based on the fact that one gene can be partially bound by several small interfering RNAs (siRNAs) and, conversely, one siRNA can bind to a few genes with distinct binding affinity. This model constructs a multi-to-multi correspondence between siRNAs and their targets, with far fewer siRNAs than mRNA targets compared with the conventional scheme. Mathematically this problem involves an underdetermined system of equations (linear or nonlinear), which is ill-posed in general. However, the recently developed compressed sensing (CS) theory can solve this problem. We present a mathematical model to describe the csRNAi system based on both CS theory and biological concerns. To build this model, we first search nucleotide motifs in a target gene set. Then we propose a machine learning based method to find the effective siRNAs with novel features, such as image features and speech features, to describe an siRNA sequence. Numerical simulations show that we can reduce the siRNA library to one third of that in the conventional scheme. In addition, the features to describe siRNAs outperform the existing ones substantially. Conclusions This csRNAi system is very promising in saving both time and cost for large-scale RNAi
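
    The recovery step that compressed sensing theory enables can be illustrated with a generic iterative shrinkage-thresholding (ISTA) solver for a sparse underdetermined linear system, as sketched below; the pool sizes, affinity matrix and sparsity level are hypothetical and unrelated to the paper's data.

    ```python
    import numpy as np

    def ista(A, y, lam=0.05, step=None, n_iter=500):
        """ISTA sketch for the underdetermined system y = A x with an L1
        (sparsity-promoting) penalty, the kind of recovery problem compressed
        sensing makes well posed when x is sparse."""
        m, n = A.shape
        if step is None:
            step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant of A^T A
        x = np.zeros(n)
        for _ in range(n_iter):
            x = x - step * (A.T @ (A @ x - y))         # gradient step on the data term
            x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0.0)  # soft threshold
        return x

    # toy csRNAi-style example (made-up numbers): 30 pooled measurements,
    # 100 genes, only 5 of which respond; A holds assumed siRNA-gene affinities
    rng = np.random.default_rng(0)
    m, n, k = 30, 100, 5
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(2.0, 0.5, k)
    y = A @ x_true
    x_hat = ista(A, y)
    print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.5))
    print("true support:     ", np.flatnonzero(x_true))
    ```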

  15. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. The normalized closed contour method applied to a hydrology-based relief model derived from a multiple-direction flow-routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized relief closed contour method may be the most capable method to date, but more development is required.

  16. Automated procedures for sizing aerospace vehicle structures /SAVES/

    Science.gov (United States)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  17. Virtualized cognitive network architecture for 5G cellular networks

    KAUST Repository

    Elsawy, Hesham

    2015-07-17

    Cellular networks have preserved an application agnostic and base station (BS) centric architecture for decades. Network functionalities (e.g. user association) are decided and performed regardless of the underlying application (e.g. automation, tactile Internet, online gaming, multimedia). Such an ossified architecture imposes several hurdles against achieving the ambitious metrics of next generation cellular systems. This article first highlights the features and drawbacks of such architectural ossification. Then the article proposes a virtualized and cognitive network architecture, wherein network functionalities are implemented via software instances in the cloud, and the underlying architecture can adapt to the application of interest as well as to changes in channels and traffic conditions. The adaptation is done in terms of the network topology by manipulating connectivities and steering traffic via different paths, so as to attain the applications' requirements and network design objectives. The article presents cognitive strategies to implement some of the classical network functionalities, along with their related implementation challenges. The article further presents a case study illustrating the performance improvement of the proposed architecture as compared to conventional cellular networks, both in terms of outage probability and handover rate.

  18. Comparison of manual and automated cell counts in EDTA preserved synovial fluids. Storage has little influence on the results.

    Science.gov (United States)

    Salinas, M; Rosas, J; Iborra, J; Manero, H; Pascual, E

    1997-10-01

    To determine the precision and agreement of synovial fluid (SF) cell counts done manually and with automated counters, and to determine the degree of variability of the counts in SF samples, kept in the tubes used for routine white blood cell (WBC) counts--which use liquid EDTA as anticoagulant--at 24 and 48 hours at 4 degrees C, and at room temperature. To determine precision, cell counts were repeated 10 times--both manually and by an automated counter--in a SF sample of low, medium, and high cellularity. The variances were calculated to determine the interobserver variation in two manual (M1,M2) and two automated cell counts (C1,C2). The agreement between the manual (M1) and automated counter (C1) results was analysed by the Bland and Altman method and the difference against the mean of the two methods was plotted. Then, the mean difference between the two methods was estimated and the standard deviation of the difference. To determine the effects of storage, SF samples were kept in a refrigerator at 4 degrees C, and at room temperature; cell counts were done manually (M1) and automatically (C1) at 24 and 48 hours and the changes analysed by the Bland and Altman method. The variances were compared using an F test. (1) Precision. With the manual technique, the coefficients of variation were 27.9%, 14%, and 10.7% when used for counting the SF with low (270), medium (6200), and high cellularities (25,000). With the automated technique the coefficients of variation were 20%, 3.4%, and 2.9% in the same SF samples. In the fluids of medium and high cellularity, the variances of the automated cell counts were significantly lower (F test, p automated counter. (4) Influence of storage. The Coulter counts of SF samples preserved at 4 degrees C showed less variance (F test, p Automated cell count of the SF offers advantages: it gives higher precision and consumes less time. The stability of the samples preserved in the EDTA tubes used for routine WBC counts is of
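
    The Bland and Altman analysis used above amounts to computing the mean difference (bias) and 95% limits of agreement between paired counts, as in the sketch below; the paired counts shown are invented for illustration only.

    ```python
    import numpy as np

    def bland_altman(manual, automated):
        """Bland-Altman agreement sketch: bias and 95% limits of agreement
        between paired manual and automated counts."""
        manual = np.asarray(manual, float)
        automated = np.asarray(automated, float)
        diff = automated - manual
        mean = (automated + manual) / 2.0
        bias = diff.mean()
        sd = diff.std(ddof=1)
        loa = (bias - 1.96 * sd, bias + 1.96 * sd)
        return mean, diff, bias, loa

    # hypothetical paired WBC counts (cells/mm^3) from the same synovial fluids
    manual    = [270, 1500, 6200, 9800, 25000, 41000]
    automated = [250, 1600, 6100, 10400, 24500, 42200]
    _, _, bias, loa = bland_altman(manual, automated)
    print(f"bias = {bias:.0f} cells/mm^3, limits of agreement = {loa[0]:.0f} to {loa[1]:.0f}")
    ```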

  19. Stochastic cellular automata model of cell migration, proliferation and differentiation: validation with in vitro cultures of muscle satellite cells.

    Science.gov (United States)

    Garijo, N; Manzano, R; Osta, R; Perez, M A

    2012-12-07

    Cell migration and proliferation have been modelled in the literature as a process similar to diffusion. However, using diffusion models to simulate the proliferation and migration of cells tends to create a homogeneous distribution in the cell density that does not correlate to empirical observations. In fact, the mechanism of cell dispersal is not diffusion. Cells disperse by crawling or proliferation, or are transported in a moving fluid. The use of cellular automata, particle models or cell-based models can overcome this limitation. This paper presents a stochastic cellular automata model to simulate the proliferation, migration and differentiation of cells. These processes are considered as completely stochastic as well as discrete. The model developed was applied to predict the behaviour of in vitro cell cultures performed with adult muscle satellite cells. Moreover, a non-homogeneous distribution of cells has been observed inside the culture well and, using the above-mentioned stochastic cellular automata model, we have been able to predict this heterogeneous cell distribution and compute accurate quantitative results. Differentiation was also incorporated into the computational simulation. The results predicted the myotube formation that typically occurs with adult muscle satellite cells. In conclusion, we have shown how a stochastic cellular automata model can be implemented and is capable of reproducing the in vitro behaviour of adult muscle satellite cells.
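
    A toy stochastic cellular automaton of this kind, with per-step migration, proliferation and differentiation probabilities (all values hypothetical, and far simpler than the validated model described above), could be written as follows.

    ```python
    import numpy as np

    EMPTY, CELL, FUSED = 0, 1, 2

    def ca_step(grid, p_move=0.5, p_prolif=0.05, p_diff=0.01, rng=None):
        """One sweep of a toy stochastic CA: each cell, in random order, may
        differentiate (collapsed to a single 'fused' state), divide into an
        empty neighbour, or migrate to an empty neighbour."""
        rng = rng or np.random.default_rng()
        n, m = grid.shape
        cells = list(zip(*np.nonzero(grid == CELL)))
        rng.shuffle(cells)
        for (i, j) in cells:
            if grid[i, j] != CELL:                 # state may have changed this sweep
                continue
            nbrs = [((i + di) % n, (j + dj) % m)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            empty = [q for q in nbrs if grid[q] == EMPTY]
            r = rng.random()
            if r < p_diff:
                grid[i, j] = FUSED                 # differentiation
            elif r < p_diff + p_prolif and empty:
                grid[empty[rng.integers(len(empty))]] = CELL   # daughter cell
            elif r < p_diff + p_prolif + p_move and empty:
                grid[empty[rng.integers(len(empty))]] = CELL   # migration
                grid[i, j] = EMPTY
        return grid

    rng = np.random.default_rng(0)
    grid = np.zeros((60, 60), dtype=int)
    grid[rng.random(grid.shape) < 0.01] = CELL      # sparse initial seeding
    for _ in range(100):
        grid = ca_step(grid, rng=rng)
    print("cells:", (grid == CELL).sum(), "fused/differentiated:", (grid == FUSED).sum())
    ```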

  20. Antiproliferative Activity and Cellular Uptake of Evodiamine and Rutaecarpine Based on 3D Tumor Models

    Directory of Open Access Journals (Sweden)

    Hui Guo

    2016-07-01

    Full Text Available Evodiamine (EVO) and rutaecarpine (RUT) are promising anti-tumor drug candidates. The evaluation of the anti-proliferative activity and cellular uptake of EVO and RUT in 3D multicellular spheroids of cancer cells would better recapitulate the native situation and thus better reflect an in vivo response to the treatment. Herein, we employed the 3D culture of MCF-7 and SMMC-7721 cells based on hanging drop method and evaluated the anti-proliferative activity and cellular uptake of EVO and RUT in 3D multicellular spheroids, and compared the results with those obtained from 2D monolayers. The drugs’ IC50 values were significantly increased from the range of 6.4–44.1 μM in 2D monolayers to 21.8–138.0 μM in 3D multicellular spheroids, which may be due to enhanced mass barrier and reduced drug penetration in 3D models. The fluorescence of EVO and RUT was measured via fluorescence spectroscopy and the cellular uptake of both drugs was characterized in 2D tumor models. The results showed that the cellular uptake concentrations of RUT increased with increasing drug concentrations. However, the EVO concentrations uptaken by the cells showed only a small change with increasing drug concentrations, which may be due to the different solubility of EVO and RUT in solvents. Overall, this study provided a new vision of the anti-tumor activity of EVO and RUT via 3D multicellular spheroids and cellular uptake through the fluorescence of compounds.
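
    IC50 values such as those quoted above are typically obtained by fitting a four-parameter logistic (Hill) curve to dose-response data; the sketch below shows such a fit on invented viability numbers using SciPy, and is not the analysis performed in the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(dose, bottom, top, ic50, slope):
        """Four-parameter logistic (Hill) dose-response curve."""
        return bottom + (top - bottom) / (1.0 + (dose / ic50) ** slope)

    # hypothetical viability data (% of control) for one compound in a spheroid assay
    dose = np.array([1, 3, 10, 30, 100, 300], float)        # micromolar
    viability = np.array([98, 95, 80, 55, 25, 12], float)
    p0 = [viability.min(), viability.max(), 30.0, 1.0]      # initial guesses
    popt, _ = curve_fit(hill, dose, viability, p0=p0, maxfev=10000)
    print(f"estimated IC50 = {popt[2]:.1f} uM")
    ```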

  1. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. All

  2. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme
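
    The automated network-reconstruction idea can be illustrated with a drastically simplified breadth-first expansion over toy enzyme rules, as below; the enzyme names, specificities and glycan representation are hypothetical and much cruder than the machine-readable enzyme definitions of the actual framework.

    ```python
    from collections import deque

    # toy rules: an enzyme adds one residue when its required residue sits at
    # the non-reducing end of the glycan (represented here as a list of residues)
    ENZYMES = {
        "GalT": {"requires": "GlcNAc", "adds": "Gal"},      # galactosyltransferase
        "SiaT": {"requires": "Gal",    "adds": "Neu5Ac"},   # sialyltransferase
        "FucT": {"requires": "GlcNAc", "adds": "Fuc"},      # fucosyltransferase
    }

    def apply_enzyme(glycan, rule):
        """Return the product if the rule matches the glycan's terminal residue."""
        if glycan and glycan[-1] == rule["requires"]:
            return glycan + [rule["adds"]]
        return None

    def build_network(seeds, enzymes, max_len=5):
        """Breadth-first expansion of all reachable glycans; the result is a graph
        whose edges are (substrate, enzyme, product) reactions."""
        edges, seen = [], set(map(tuple, seeds))
        queue = deque(seen)
        while queue:
            g = queue.popleft()
            if len(g) >= max_len:
                continue
            for name, rule in enzymes.items():
                prod = apply_enzyme(list(g), rule)
                if prod is None:
                    continue
                prod = tuple(prod)
                edges.append((g, name, prod))
                if prod not in seen:
                    seen.add(prod)
                    queue.append(prod)
        return edges

    for sub, enz, prod in build_network([["GlcNAc"]], ENZYMES):
        print("-".join(sub), f"--{enz}-->", "-".join(prod))
    ```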

  3. Magnetohydrodynamics cellular automata

    International Nuclear Information System (INIS)

    Hatori, Tadatsugu.

    1990-02-01

    There has been a renewal of interest in cellular automata, partly because they give an architecture for a special purpose computer with parallel processing optimized to solve a particular problem. The lattice gas cellular automata are briefly surveyed, which are recently developed to solve partial differential equations such as hydrodynamics or magnetohydrodynamics. A new model is given in the present paper to implement the magnetic Lorentz force in a more deterministic and local procedure than the previous one. (author)

  4. Magnetohydrodynamic cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hatori, Tadatsugu [National Inst. for Fusion Science, Nagoya (Japan)

    1990-03-01

    There has been a renewal of interest in cellular automata, partly because they give an architecture for a special purpose computer with parallel processing optimized to solve a particular problem. The lattice gas cellular automata are briefly surveyed, which are recently developed to solve partial differential equations such as hydrodynamics or magnetohydrodynamics. A new model is given in the present paper to implement the magnetic Lorentz force in a more deterministic and local procedure than the previous one. (author).

  5. Magnetohydrodynamic cellular automata

    International Nuclear Information System (INIS)

    Hatori, Tadatsugu

    1990-01-01

    There has been a renewal of interest in cellular automata, partly because they give an architecture for a special purpose computer with parallel processing optimized to solve a particular problem. The lattice gas cellular automata are briefly surveyed, which are recently developed to solve partial differential equations such as hydrodynamics or magnetohydrodynamics. A new model is given in the present paper to implement the magnetic Lorentz force in a more deterministic and local procedure than the previous one. (author)
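
    For readers unfamiliar with lattice gas cellular automata, the sketch below implements the classical HPP collision-and-streaming update on a square lattice (hydrodynamics only; the magnetic Lorentz force discussed in these records is not included, and a triangular-lattice FHP model would be needed for correct fluid behaviour). All sizes and densities are arbitrary.

    ```python
    import numpy as np

    # Each lattice site holds four boolean occupation numbers for the
    # velocities east, west, north, south.
    E, W, N, S = 0, 1, 2, 3

    def hpp_step(f):
        """One collision + streaming update; f has shape (4, ny, nx) and dtype bool."""
        # collision: an exact head-on pair rotates by 90 degrees
        ew = f[E] & f[W] & ~f[N] & ~f[S]
        ns = f[N] & f[S] & ~f[E] & ~f[W]
        f[E] ^= ew; f[W] ^= ew; f[N] ^= ew; f[S] ^= ew
        f[E] ^= ns; f[W] ^= ns; f[N] ^= ns; f[S] ^= ns
        # streaming: move each particle one cell along its velocity (periodic box)
        f[E] = np.roll(f[E],  1, axis=1)
        f[W] = np.roll(f[W], -1, axis=1)
        f[N] = np.roll(f[N], -1, axis=0)
        f[S] = np.roll(f[S],  1, axis=0)
        return f

    rng = np.random.default_rng(0)
    f = rng.random((4, 64, 64)) < 0.2                        # dilute random gas
    f[:, 24:40, 24:40] = rng.random((4, 16, 16)) < 0.8       # dense blob that spreads out
    n0 = int(f.sum())
    for _ in range(200):
        f = hpp_step(f)
    print("particle number conserved:", int(f.sum()) == n0)
    ```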

  6. Optimization of spectral printer modeling based on a modified cellular Yule-Nielsen spectral Neugebauer model.

    Science.gov (United States)

    Liu, Qiang; Wan, Xiaoxia; Xie, Dehong

    2014-06-01

    The study presented here optimizes several steps in the spectral printer modeling workflow based on a cellular Yule-Nielsen spectral Neugebauer (CYNSN) model. First, a printer subdividing method was developed that reduces the number of sub-models while maintaining the maximum device gamut. Second, the forward spectral prediction accuracy of the CYNSN model for each subspace of the printer was improved using back propagation artificial neural network (BPANN) estimated n values. Third, a sequential gamut judging method, which clearly reduced the complexity of the optimal sub-model and cell searching process during printer backward modeling, was proposed. After that, we further modified the use of the modeling color metric and comprehensively improved the spectral and perceptual accuracy of the spectral printer model. The experimental results show that the proposed optimization approaches provide obvious improvements in aspects of the modeling accuracy or efficiency for each of the corresponding steps, and an overall improvement of the optimized spectral printer modeling workflow was also demonstrated.
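
    The forward prediction at the heart of such a model follows the Yule-Nielsen spectral Neugebauer equation, sketched below for a plain (non-cellular) CMY case with Demichel weights; in the cellular variant the ink coverages would first be mapped into the enclosing sub-cube and that cell's corner primaries used instead. The primary spectra and n value here are invented placeholders, not measured data.

    ```python
    import numpy as np
    from itertools import product

    def demichel_weights(c, m, y):
        """Demichel area coverages of the eight CMY Neugebauer primaries for
        effective ink coverages c, m, y in [0, 1]."""
        w = {}
        for bits in product((0, 1), repeat=3):
            w[bits] = ((c if bits[0] else 1 - c) *
                       (m if bits[1] else 1 - m) *
                       (y if bits[2] else 1 - y))
        return w

    def ynsn_predict(c, m, y, primaries, n=2.0):
        """Yule-Nielsen spectral Neugebauer forward prediction.
        primaries: dict mapping a (c, m, y) binary overprint to its reflectance spectrum."""
        acc = sum(w * primaries[k] ** (1.0 / n)
                  for k, w in demichel_weights(c, m, y).items())
        return acc ** n

    # placeholder flat primary spectra over 31 bands (real use needs measured spectra)
    bands = 31
    primaries = {k: np.full(bands, 0.9 * 0.35 ** sum(k)) for k in product((0, 1), repeat=3)}
    spectrum = ynsn_predict(0.4, 0.2, 0.7, primaries, n=2.0)
    print(spectrum[:5])
    ```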

  7. On stochastic geometry modeling of cellular uplink transmission with truncated channel inversion power control

    KAUST Repository

    Elsawy, Hesham; Hossain, Ekram

    2014-01-01

    Using stochastic geometry, we develop a tractable uplink modeling paradigm for outage probability and spectral efficiency in both single and multi-tier cellular wireless networks. The analysis accounts for per user equipment (UE) power control

  8. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  9. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  10. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a specified level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and the scientific methods of analysis and analogy. The result of the work is a model for automated evaluation and assurance of software usability that makes it possible not only to estimate the current level of usability during every iteration of agile development but also to manage the usability of the software products being created. The results can be used to build automated support systems for managing software usability.

  11. Advances in the simulation and automated measurement of well-sorted granular material: 1. Simulation

    Science.gov (United States)

    Daniel Buscombe,; Rubin, David M.

    2012-01-01

    In this, the first of a pair of papers that address the simulation and automated measurement of well-sorted natural granular material, a method is presented for the simulation of two-phase (solid, void) assemblages of discrete non-cohesive particles. The purpose is to have a flexible, yet computationally and theoretically simple, suite of tools with well constrained and well known statistical properties, in order to simulate realistic granular material as a discrete element model with realistic size and shape distributions, for a variety of purposes. The stochastic modeling framework is based on three-dimensional tessellations with variable degrees of order in particle-packing arrangement. Examples of sediments with a variety of particle size distributions and spatial variability in grain size are presented. The relationship between particle shape and porosity conforms to published data. The immediate application is testing new algorithms for automated measurements of particle properties (mean and standard deviation of particle sizes, and apparent porosity) from images of natural sediment, as detailed in the second of this pair of papers. The model could also prove useful for simulating specific depositional structures found in natural sediments, the result of physical alterations to packing and grain fabric, using discrete particle flow models. While the principal focus here is on naturally occurring sediment and sedimentary rock, the methods presented might also be useful for simulations of similar granular or cellular material encountered in engineering, industrial and life sciences.

  12. Modeling the properties of closed-cell cellular materials from tomography images using finite shell elements

    International Nuclear Information System (INIS)

    Caty, O.; Maire, E.; Youssef, S.; Bouchet, R.

    2008-01-01

    Closed-cell cellular materials exhibit several interesting properties. These properties are, however, very difficult to simulate and understand from the knowledge of the cellular microstructure. This problem is mostly due to the highly complex organization of the cells and to their very fine walls. X-ray tomography can produce three-dimensional (3-D) images of the structure, enabling one to visualize locally the damage of the cell walls that would result in the structure collapsing. These data could be used for meshing with continuum elements of the structure for finite element (FE) calculations. But when the density is very low, the walls are fine and the meshes based on continuum elements are not suitable to represent accurately the structure while preserving the representativeness of the model in terms of cell size. This paper presents a shell FE model obtained from tomographic 3-D images that allows bigger volumes of low-density closed-cell cellular materials to be calculated. The model is enriched by direct thickness measurement on the tomographic images. The values measured are ascribed to the shell elements. To validate and use the model, a structure composed of stainless steel hollow spheres is firstly compressed and scanned to observe local deformations. The tomographic data are also meshed with shells for a FE calculation. The convergence of the model is checked and its performance is compared with a continuum model. The global behavior is compared with the measures of the compression test. At the local scale, the model allows the local stress and strain field to be calculated. The calculated deformed shape is compared with the deformed tomographic images

  13. Automating CPM-GOMS

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the approaches available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the

  14. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.

  15. Automated reconstruction of 3D models from real environments

    Science.gov (United States)

    Sequeira, V.; Ng, K.; Wolfart, E.; Gonçalves, J. G. M.; Hogg, D.

    This paper describes an integrated approach to the construction of textured 3D scene models of building interiors from laser range data and visual images. This approach has been implemented in a collection of algorithms and sensors within a prototype device for 3D reconstruction, known as the EST (Environmental Sensor for Telepresence). The EST can take the form of a push trolley or of an autonomous mobile platform. The Autonomous EST (AEST) has been designed to provide an integrated solution for automating the creation of complete models. Embedded software performs several functions, including triangulation of the range data, registration of video texture, registration and integration of data acquired from different capture points. Potential applications include facilities management for the construction industry and creating reality models to be used in general areas of virtual reality, for example, virtual studios, virtualised reality for content-related applications (e.g., CD-ROMs), social telepresence, architecture and others. The paper presents the main components of the EST/AEST, and presents some example results obtained from the prototypes. The reconstructed model is encoded in VRML format so that it is possible to access and view the model via the World Wide Web.

  16. Multiplicity of Mathematical Modeling Strategies to Search for Molecular and Cellular Insights into Bacteria Lung Infection.

    Science.gov (United States)

    Cantone, Martina; Santos, Guido; Wentker, Pia; Lai, Xin; Vera, Julio

    2017-01-01

    Even today two bacterial lung infections, namely pneumonia and tuberculosis, are among the 10 most frequent causes of death worldwide. These infections still lack effective treatments in many developing countries and in immunocompromised populations like infants, elderly people and transplanted patients. The interaction between bacteria and the host is a complex system of interlinked intercellular and intracellular processes, enriched in regulatory structures like positive and negative feedback loops. Severe pathological conditions can emerge when the immune system of the host fails to neutralize the infection. This failure can result in systemic spreading of pathogens or an overwhelming immune response followed by a systemic inflammatory response. Mathematical modeling is a promising tool to dissect the complexity underlying pathogenesis of bacterial lung infection at the molecular, cellular and tissue levels, and also at the interfaces among levels. In this article, we introduce mathematical and computational modeling frameworks that can be used for investigating molecular and cellular mechanisms underlying bacterial lung infection. Then, we compile and discuss published results on the modeling of regulatory pathways and cell populations relevant for lung infection and inflammation. Finally, we discuss how to make use of this multiplicity of modeling approaches to open new avenues in the search of the molecular and cellular mechanisms underlying bacterial infection in the lung.

  17. Automated Protocol for Large-Scale Modeling of Gene Expression Data.

    Science.gov (United States)

    Hall, Michelle Lynn; Calkins, David; Sherman, Woody

    2016-11-28

    With the continued rise of phenotypic- and genotypic-based screening projects, computational methods to analyze, process, and ultimately make predictions in this field take on growing importance. Here we show how automated machine learning workflows can produce models that are predictive of differential gene expression as a function of a compound structure using data from A673 cells as a proof of principle. In particular, we present predictive models with an average accuracy of greater than 70% across a highly diverse ∼1000 gene expression profile. In contrast to the usual in silico design paradigm, where one interrogates a particular target-based response, this work opens the opportunity for virtual screening and lead optimization for desired multitarget gene expression profiles.

  18. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    Science.gov (United States)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor and time intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).

  19. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have a great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  20. Estimating cellular network performance during hurricanes

    International Nuclear Information System (INIS)

    Booker, Graham; Torres, Jacob; Guikema, Seth; Sprintson, Alex; Brumbelow, Kelly

    2010-01-01

    Cellular networks serve a critical role during and immediately after a hurricane, allowing citizens to contact emergency services when land-line communication is lost and serving as a backup communication channel for emergency responders. However, due to their ubiquitous deployment and limited design for extreme loading events, basic network elements, such as cellular towers and antennas are prone to failures during adverse weather conditions such as hurricanes. Accordingly, a systematic and computationally feasible approach is required for assessing and improving the reliability of cellular networks during hurricanes. In this paper we develop a new multi-disciplinary approach to efficiently and accurately assess cellular network reliability during hurricanes. We show how the performance of a cellular network during and immediately after future hurricanes can be estimated based on a combination of hurricane wind field models, structural reliability analysis, Monte Carlo simulation, and cellular network models and simulation tools. We then demonstrate the use of this approach for assessing the improvement in system reliability that can be achieved with discrete topological changes in the system. Our results suggest that adding redundancy, particularly through a mesh topology or through the addition of an optical fiber ring around the perimeter of the system can be an effective way to significantly increase the reliability of some cellular systems during hurricanes.
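
    A minimal sketch of the Monte Carlo ingredient described above: sample tower failures from assumed wind-dependent fragility probabilities and estimate how often the network keeps enough towers in service. The probabilities, threshold and trial count are illustrative, not taken from the study.

    import random

    def mc_network_reliability(failure_probs, min_towers_up, trials=10000, seed=1):
        rng = random.Random(seed)
        ok = 0
        for _ in range(trials):
            # each tower survives this storm realization with probability (1 - p)
            up = sum(1 for p in failure_probs if rng.random() >= p)
            ok += up >= min_towers_up
        return ok / trials

    # Ten towers with assumed hurricane-wind failure probabilities
    probs = [0.05, 0.10, 0.20, 0.15, 0.30, 0.25, 0.10, 0.05, 0.40, 0.20]
    print(mc_network_reliability(probs, min_towers_up=8))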

  1. Altering user' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  2. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    Idris, N H; Jackson, M J; Ishak, M H I

    2014-01-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to help realize the development of Digital Earth. The main barrier to the use of this data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assess these data is by relying on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. There are two main components proposed to be assessed in the conceptual model – metadata and data. The metadata component comprises the indicators of the hosting (websites) and the sources of data / information. The data component comprises the indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess the components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess the correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.

  3. An improved cellular automaton method to model multispecies biofilms.

    Science.gov (United States)

    Tang, Youneng; Valocchi, Albert J

    2013-10-01

    Biomass-spreading rules used in previous cellular automaton methods to simulate multispecies biofilm introduced extensive mixing between different biomass species or resulted in spatially discontinuous biomass concentration and distribution; this caused results based on the cellular automaton methods to deviate from experimental results and those from the more computationally intensive continuous method. To overcome the problems, we propose new biomass-spreading rules in this work: Excess biomass spreads by pushing a line of grid cells that are on the shortest path from the source grid cell to the destination grid cell, and the fractions of different biomass species in the grid cells on the path change due to the spreading. To evaluate the new rules, three two-dimensional simulation examples are used to compare the biomass distribution computed using the continuous method and three cellular automaton methods, one based on the new rules and the other two based on rules presented in two previous studies. The relationship between the biomass species is syntrophic in one example and competitive in the other two examples. Simulation results generated using the cellular automaton method based on the new rules agree much better with the continuous method than do results using the other two cellular automaton methods. The new biomass-spreading rules are no more complex to implement than the existing rules. Copyright © 2013 Elsevier Ltd. All rights reserved.
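
    A minimal one-dimensional sketch of the proposed spreading rule, assuming a fixed per-cell capacity: excess biomass at a source cell is pushed cell by cell along the path to the nearest cell with spare capacity. The paper's two-dimensional, multispecies version with changing species fractions is not reproduced here.

    def push_excess(biomass, capacity, src):
        # One spreading event: move the excess at `src` toward the nearest cell
        # with spare capacity, shifting biomass through every cell on the path.
        excess = biomass[src] - capacity
        spare = [i for i, b in enumerate(biomass) if b < capacity and i != src]
        if excess <= 0 or not spare:
            return biomass
        dst = min(spare, key=lambda i: abs(i - src))
        step = 1 if dst > src else -1
        biomass[src] = capacity
        for i in range(src + step, dst, step):   # intermediate cells stay full;
            excess += biomass[i] - capacity      # their own excess joins the push
            biomass[i] = capacity
        biomass[dst] += excess                   # destination absorbs the rest and
        return biomass                           # may itself spread on the next sweep

    print(push_excess([1.0, 2.5, 1.0, 0.2], capacity=1.0, src=1))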

  4. Advances in automated noise data acquisition and noise source modeling for power reactors

    International Nuclear Information System (INIS)

    Clapp, N.E. Jr.; Kryter, R.C.; Sweeney, F.J.; Renier, J.A.

    1981-01-01

    A newly expanded program, directed toward achieving a better appreciation of both the strengths and limitations of on-line, noise-based, long-term surveillance programs for nuclear reactors, is described. Initial results in the complementary experimental (acquisition and automated screening of noise signatures) and theoretical (stochastic modeling of likely noise sources) areas of investigation are given

  5. An improved cellular automata model for train operation simulation with dynamic acceleration

    Science.gov (United States)

    Li, Wen-Jun; Nie, Lei

    2018-03-01

    Urban rail transit plays an important role in the urban public traffic because of its advantages of fast speed, large transport capacity, high safety, reliability and low pollution. This study proposes an improved cellular automaton (CA) model by considering the dynamic characteristic of the train acceleration to analyze the energy consumption and train running time. Constructing an effective model for calculating energy consumption to aid train operation improvement is the basis for studying and analyzing energy-saving measures for urban rail transit system operation.
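
    A minimal sketch of the general idea, not the paper's model: a step-by-step train run in which the applied acceleration decays with speed (the dynamic characteristic), with simple braking to stop at the target and an estimate of traction energy. All parameters are illustrative.

    def simulate_run(distance_m, v_max=22.0, a_max=1.0, b=1.0, dt=1.0):
        pos, v, t, energy = 0.0, 0.0, 0.0, 0.0
        mass = 2.0e5                                  # assumed train mass [kg]
        while pos < distance_m:
            braking_dist = v * v / (2 * b)
            if distance_m - pos <= braking_dist:      # brake to stop at the target
                a = -b
            else:
                a = a_max * (1.0 - v / v_max)         # speed-dependent (dynamic) acceleration
            v = max(0.0, min(v_max, v + a * dt))
            pos += v * dt
            if a > 0:
                energy += mass * a * v * dt           # traction work only
            t += dt
        return t, energy

    t, e = simulate_run(2000.0)
    print(f"run time {t:.0f} s, traction energy {e / 3.6e6:.1f} kWh")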

  6. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  7. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Ueberlingen Model

    Science.gov (United States)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on­-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Ueberlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated Traffic Alert and Collision Avoidance System on-board warning system. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Überlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an

  8. A distributed substation automation model based on the multi-agents technology; Um modelo distribuido de automacao de subestacoes baseado em tecnologia multiagentes

    Energy Technology Data Exchange (ETDEWEB)

    Geus, Klaus de; Milsztajn, Flavio; Kolb, Carlos Jose Johann; Dometerco, Jose Henrique; Souza, Alexandre Mendonca de; Braga, Ciro de Carvalho; Parolin, Emerson Luis; Frisch, Arlenio Carneiro; Fortunato Junior, Luiz Kiss; Erzinger Junior, Augusto; Jonack, Marco Antonio; Guiera, Anderson Juliano Azambuja [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)]. E-mail: klaus@copel.com; flaviomil@copel.com; kolb@copel.com; dometerc@copel.com; alexandre.mendonca@copel.com; ciro@copel.com; parolin@copel.com; arlenio@copel.com; luiz.kiss@copel.com; aerzinger@copel.com; jonack@copel.com; guiera@copel.com

    2006-10-15

    The main purpose of this paper is to analyse distributed computing technology that can be used in substation automation systems. Based on comparative performance results obtained in the laboratory, a specific model for distributed substation automation is proposed, considering the current model employed at COPEL - Companhia Paranaense de Energia. The proposed model is based on multi-agent technology, which has lately received special attention in the development of distributed systems with local intelligence. (author)

  9. A Co-Opetitive Automated Negotiation Model for Vertical Allied Enterprises Teams and Stakeholders

    Directory of Open Access Journals (Sweden)

    Taiguang Gao

    2018-04-01

    Full Text Available Upstream and downstream supply chain enterprises often form a tactical vertical alliance to enhance their operational efficiency and maintain their competitive edge in the market. Hence, it is critical for an alliance to collaborate over its internal resources and resolve the profit conflicts among members, so that the functionality required by stakeholders can be fulfilled. As an effective solution, automated negotiation between the vertically allied enterprise team and the stakeholder makes full use of the emerging team advantages and significantly reduces profit conflicts within teams by means of group decisions rather than unilateral decisions by some leader. In this paper, an automated negotiation model is designed to describe both the collaborative game process among the team members and the competitive negotiation process between the allied team and the stakeholder. Considering the co-opetitive nature of the vertical allied team, the designed model helps the team members make decisions in their own interest, and the team counter-offers for the ongoing negotiation are generated through a non-cooperative game process, in which the profit derived from the negotiation result is distributed with the Shapley value method according to the contribution or importance of each team member. Finally, a case study is given to verify the effectiveness of the designed model.
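
    The profit split mentioned above can be illustrated with a brute-force Shapley value computation: each member receives its average marginal contribution over all orderings of the team. The coalition values below are invented for the example.

    from itertools import permutations

    def shapley(players, value):
        shares = {p: 0.0 for p in players}
        orders = list(permutations(players))
        for order in orders:
            coalition = []
            for p in order:
                before = value(frozenset(coalition))
                coalition.append(p)
                shares[p] += value(frozenset(coalition)) - before   # marginal contribution
        return {p: s / len(orders) for p, s in shares.items()}

    # assumed coalition values for a three-member vertical alliance
    v = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 20, frozenset("C"): 30,
         frozenset("AB"): 40, frozenset("AC"): 50, frozenset("BC"): 60, frozenset("ABC"): 90}
    print(shapley(["A", "B", "C"], lambda s: v[s]))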

  10. Applications of Bayesian temperature profile reconstruction to automated comparison with heat transport models and uncertainty quantification of current diffusion

    International Nuclear Information System (INIS)

    Irishkin, M.; Imbeaux, F.; Aniel, T.; Artaud, J.F.

    2015-01-01

    Highlights: • We developed a method for automated comparison of experimental data with models. • A unique platform implements Bayesian analysis and integrated modelling tools. • The method is tokamak-generic and is applied to Tore Supra and JET pulses. • Validation of a heat transport model is carried out. • We quantified the uncertainties due to Te profiles in current diffusion simulations. - Abstract: In the context of present and future long pulse tokamak experiments yielding a growing size of measured data per pulse, automating data consistency analysis and comparisons of measurements with models is a critical matter. To address these issues, the present work describes an expert system that carries out in an integrated and fully automated way (i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis (ii) a prediction of the reconstructed quantities, according to some models and (iii) a comparison of the first two steps. The first application shown is devoted to the development of an automated comparison method between the experimental plasma profiles reconstructed using Bayesian methods and time dependent solutions of the transport equations. The method was applied to model validation of a simple heat transport model with three radial shape options. It has been tested on a database of 21 Tore Supra and 14 JET shots. The second application aims at quantifying uncertainties due to the electron temperature profile in current diffusion simulations. A systematic reconstruction of the Ne, Te, Ti profiles was first carried out for all time slices of the pulse. The Bayesian 95% highest probability intervals on the Te profile reconstruction were then used for (i) data consistency check of the flux consumption and (ii) defining a confidence interval for the current profile simulation. The method has been applied to one Tore Supra pulse and one JET pulse.

  11. An Algorithm to Automate Yeast Segmentation and Tracking

    Science.gov (United States)

    Doncic, Andreas; Eser, Umut; Atay, Oguzhan; Skotheim, Jan M.

    2013-01-01

    Our understanding of dynamic cellular processes has been greatly enhanced by rapid advances in quantitative fluorescence microscopy. Imaging single cells has emphasized the prevalence of phenomena that can be difficult to infer from population measurements, such as all-or-none cellular decisions, cell-to-cell variability, and oscillations. Examination of these phenomena requires segmenting and tracking individual cells over long periods of time. However, accurate segmentation and tracking of cells is difficult and is often the rate-limiting step in an experimental pipeline. Here, we present an algorithm that accomplishes fully automated segmentation and tracking of budding yeast cells within growing colonies. The algorithm incorporates prior information of yeast-specific traits, such as immobility and growth rate, to segment an image using a set of threshold values rather than one specific optimized threshold. Results from the entire set of thresholds are then used to perform a robust final segmentation. PMID:23520484
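
    A minimal sketch of the multi-threshold idea, assuming a plain intensity image: segment at a whole range of thresholds and keep the pixels classified as cell by a majority of them. The yeast-specific priors (immobility, growth rate) used by the algorithm are not modelled here.

    import numpy as np

    def robust_segment(image, thresholds, min_votes=None):
        # each threshold casts one "vote" per pixel; keep pixels with enough votes
        votes = sum((image > t).astype(int) for t in thresholds)
        if min_votes is None:
            min_votes = (len(thresholds) + 1) // 2        # simple majority
        return votes >= min_votes

    rng = np.random.default_rng(0)
    img = rng.random((8, 8))
    img[2:6, 2:6] += 1.0                                   # bright "cell" patch
    mask = robust_segment(img, thresholds=np.linspace(0.5, 1.5, 11))
    print(mask.astype(int))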

  12. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  13. Statistical mechanics of cellular automata

    International Nuclear Information System (INIS)

    Wolfram, S.

    1983-01-01

    Cellular automata are used as simple mathematical models to investigate self-organization in statistical mechanics. A detailed analysis is given of "elementary" cellular automata consisting of a sequence of sites with values 0 or 1 on a line, with each site evolving deterministically in discrete time steps according to definite rules involving the values of its nearest neighbors. With simple initial configurations, the cellular automata either tend to homogeneous states, or generate self-similar patterns with fractal dimensions of approximately 1.59 or 1.69. With "random" initial configurations, the irreversible character of the cellular automaton evolution leads to several self-organization phenomena. Statistical properties of the structures generated are found to lie in two universality classes, independent of the details of the initial state or the cellular automaton rules. More complicated cellular automata are briefly considered, and connections with dynamical systems theory and the formal theory of computation are discussed
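
    For concreteness, the sketch below evolves one such elementary rule (rule 90, whose single-seed evolution grows a self-similar Sierpinski pattern) on a small periodic line of sites.

    def step(cells, rule=90):
        n = len(cells)
        table = [(rule >> k) & 1 for k in range(8)]   # rule bits indexed by the 3-site pattern
        return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
                for i in range(n)]

    cells = [0] * 31
    cells[15] = 1                                     # single-site initial configuration
    for _ in range(16):
        print("".join(".#"[c] for c in cells))
        cells = step(cells)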

  14. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    KAUST Repository

    Marquet, P.

    2016-05-03

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, and is therefore very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.
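
    The dependence referred to above can be written as phi = (2*pi/lambda) * (n_cell - n_medium) * d, i.e. the phase signal scales with both the thickness d and the refractive-index difference. The sketch below evaluates this relation with illustrative values; it is not the instrument's processing chain.

    import math

    def phase_shift(thickness_um, n_cell, n_medium, wavelength_um=0.68):
        return 2 * math.pi * (n_cell - n_medium) * thickness_um / wavelength_um

    def thickness_from_phase(phi, n_cell, n_medium, wavelength_um=0.68):
        return phi * wavelength_um / (2 * math.pi * (n_cell - n_medium))

    phi = phase_shift(thickness_um=5.0, n_cell=1.38, n_medium=1.335)
    print(f"phase {phi:.2f} rad -> thickness {thickness_from_phase(phi, 1.38, 1.335):.2f} um")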

  15. Cellular Automata Simulation for Wealth Distribution

    Science.gov (United States)

    Lo, Shih-Ching

    2009-08-01

    The wealth distribution of a country is a complicated system. A model based on Epstein & Axtell's "Sugarscape" model is presented in NetLogo. The model considers income, age, working opportunity and salary as control variables. There are still other variables that should be considered when an artificial society is established. In this study, a more complicated cellular automata model for wealth distribution is proposed. The effects of social welfare, tax, economic investment and inheritance are considered and simulated. The cellular automata simulation of wealth distribution provides a deeper insight into the financial policy of the government.
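
    A loose sketch of one update sweep in an automaton of this kind, assuming only income, a flat tax and uniform welfare redistribution; the paper's richer rules (age, working opportunity, investment, inheritance) are not reproduced.

    import random

    def sweep(wealth, incomes, tax_rate=0.2):
        collected = 0.0
        for i, income in enumerate(incomes):
            tax = tax_rate * income
            wealth[i] += income - tax
            collected += tax
        welfare = collected / len(wealth)            # tax revenue returned uniformly
        return [w + welfare for w in wealth]

    random.seed(0)
    wealth = [0.0] * 10
    for year in range(20):
        incomes = [random.expovariate(1.0) for _ in wealth]   # assumed income draw
        wealth = sweep(wealth, incomes)
    print([round(w, 1) for w in wealth])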

  16. Cellular Automata for Modeling the field-scale erosion

    International Nuclear Information System (INIS)

    Diaz Suarez, Jorge; Bagarotti Marin, Angel; Ruiz Perez, Maria Elena

    2008-01-01

    Full text: The Cellular Automaton (CA) is a system used discrete dynamic modeling of many physical systems. Their fundamental properties are the interaction at the local level, homogeneity and parallelism. It has been used as a secondary for the simulation of large systems where the use of equations in partial derivatives is complex and costly from the computational point of view. On the other hand, the high complexity of spatial interaction in the processes involved in the erosion-transport-deposition of sediments at field level, considerably limiting the use of base models physics. The objective of this study is to model the main processes involved in erosion water supply of soils through the use of the CAMELot system, based on an extension of the original paradigm of the CA. The CAMELot system has been used in the simulation of systems of large spatial extent, where the laws of local interaction between automata have a deep physical sense. This system guarantees both the input of the necessary specifications and simulation in parallel, as the visualization and the general management of the system. They are exposed to each of the submodels used in it and the overall dynamics of the system is analyzed. (author)

  17. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
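
    A minimal sketch of a synchronous probabilistic automaton on a ring, with invented transition probabilities that depend only on the number of ones in the three-cell neighbourhood; the all-zeros and all-ones configurations are absorbing, as in the behaviour discussed above.

    import random

    def sync_step(cells, rng):
        n = len(cells)
        prob_one = {0: 0.0, 1: 1 / 3, 2: 2 / 3, 3: 1.0}    # probability of becoming 1
        counts = [cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] for i in range(n)]
        return [1 if rng.random() < prob_one[c] else 0 for c in counts]

    rng = random.Random(42)
    cells = [rng.randint(0, 1) for _ in range(20)]
    for _ in range(200):
        cells = sync_step(cells, rng)
        if sum(cells) in (0, len(cells)):                   # absorbed: all zeros or all ones
            break
    print(cells)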

  18. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information, and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.

  19. Design and evaluation of cellular power converter architectures

    Science.gov (United States)

    Perreault, David John

    Power electronic technology plays an important role in many energy conversion and storage applications, including machine drives, power supplies, frequency changers and UPS systems. Increases in performance and reductions in cost have been achieved through the development of higher performance power semiconductor devices and integrated control devices with increased functionality. Manufacturing techniques, however, have changed little. High power is typically achieved by paralleling multiple die in a single package, producing the physical equivalent of a single large device. Consequently, both the device package and the converter in which the device is used continue to require large, complex mechanical structures, and relatively sophisticated heat transfer systems. An alternative to this approach is the use of a cellular power converter architecture, which is based upon the parallel connection of a large number of quasi-autonomous converters, called cells, each of which is designed for a fraction of the system rating. The cell rating is chosen such that single-die devices in inexpensive packages can be used, and the cell fabricated with an automated assembly process. The use of quasi-autonomous cells means that system performance is not compromised by the failure of a cell. This thesis explores the design of cellular converter architectures with the objective of achieving improvements in performance, reliability, and cost over conventional converter designs. New approaches are developed and experimentally verified for highly distributed control of cellular converters, including methods for ripple cancellation and current-sharing control. The performance of these techniques is quantified, and their dynamics are analyzed. Cell topologies suitable to the cellular architecture are investigated, and their use for systems in the 5-500 kVA range is explored. The design, construction, and experimental evaluation of a 6 kW cellular switched-mode rectifier are also addressed.

  20. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed

  1. Cellular modelling of secondary radial growth in conifer trees: application to Pinus radiata (D. Don).

    Science.gov (United States)

    Forest, Loïc; Demongeot, Jacques

    2006-05-01

    The radial growth of conifer trees proceeds from the dynamics of a meristematic tissue called vascular cambium or cambium. Cambium is a thin layer of active proliferating cells. The purpose of this paper was to model the main characteristics of cambial activity and its consecutive radial growth. Cell growth is under the control of the auxin hormone indole-3-acetic acid. The model is composed of a discrete part, which accounts for cellular proliferation, and a continuous part involving the transport of auxin. Cambium is modeled in a two-dimensional cross-section by a cellular automaton that describes the set of all its constitutive cells. Proliferation is defined as growth and division of cambial cells under neighbouring constraints, which can eliminate some cells from the cambium. The cell-growth rate is determined from auxin concentration, calculated with the continuous model. We studied the integration of each elementary cambial cell activity into the global coherent movement of macroscopic morphogenesis. Cases of normal and abnormal growth of Pinus radiata (D. Don) are modelled. Abnormal growth includes deformed trees where gravity influences auxin transport, producing heterogeneous radial growth. Cross-sectional microscopic views are also provided to validate the model's hypothesis and results.

  2. Modeling of time dependent localized flow shear stress and its impact on cellular growth within additive manufactured titanium implants.

    Science.gov (United States)

    Zhang, Ziyu; Yuan, Lang; Lee, Peter D; Jones, Eric; Jones, Julian R

    2014-11-01

    Bone augmentation implants are porous to allow cellular growth, bone formation and fixation. However, the design of the pores is currently based on simple empirical rules, such as minimum pore and interconnects sizes. We present a three-dimensional (3D) transient model of cellular growth based on the Navier-Stokes equations that simulates the body fluid flow and stimulation of bone precursor cellular growth, attachment, and proliferation as a function of local flow shear stress. The model's effectiveness is demonstrated for two additive manufactured (AM) titanium scaffold architectures. The results demonstrate that there is a complex interaction of flow rate and strut architecture, resulting in partially randomized structures having a preferential impact on stimulating cell migration in 3D porous structures for higher flow rates. This novel result demonstrates the potential new insights that can be gained via the modeling tool developed, and how the model can be used to perform what-if simulations to design AM structures to specific functional requirements. © 2014 Wiley Periodicals, Inc.

  3. A material optimization model to approximate energy bounds for cellular materials under multiload conditions

    DEFF Research Database (Denmark)

    Guedes, J.M.; Rodrigues, H.C.; Bendsøe, Martin P.

    2003-01-01

    This paper describes a computational model, based on inverse homogenization and topology design, for approximating energy bounds for two-phase composites under multiple load cases. The approach allows for the identification of possible single-scale cellular materials that give rise to the optimal...

  4. Continuum-level modelling of cellular adhesion and matrix production in aggregates.

    Science.gov (United States)

    Geris, Liesbet; Ashbourn, Joanna M A; Clarke, Tim

    2011-05-01

    Key regulators in tissue-engineering processes such as cell culture and cellular organisation are the cell-cell and cell-matrix interactions. As mathematical models are increasingly applied to investigate biological phenomena in the biomedical field, it is important, for some applications, that these models incorporate an adequate description of cell adhesion. This study describes the development of a continuum model that represents a cell-in-gel culture system used in bone-tissue engineering, namely that of a cell aggregate embedded in a hydrogel. Cell adhesion is modelled through the use of non-local (integral) terms in the partial differential equations. The simulation results demonstrate that the effects of cell-cell and cell-matrix adhesion are particularly important for the survival and growth of the cell population and the production of extracellular matrix by the cells, concurring with experimental observations in the literature.
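
    A rough numerical sketch of what a non-local adhesion term can look like on a one-dimensional grid: the adhesive velocity at a point is a direction-weighted average of the cell density within a sensing radius, so cells drift toward nearby aggregates. The kernel and parameters are illustrative and are not taken from the paper.

    import numpy as np

    def adhesion_velocity(n, dx, R, alpha):
        m = int(round(R / dx))                        # number of grid cells in the sensing radius
        v = np.zeros_like(n)
        for off in range(-m, m + 1):
            if off == 0:
                continue
            direction = np.sign(off)
            weight = 1.0 - abs(off) * dx / R          # linearly decaying kernel
            v += direction * weight * np.roll(n, -off)
        return alpha * dx / R * v

    n = np.zeros(50)
    n[20:30] = 1.0                                    # a small cell aggregate
    print(np.round(adhesion_velocity(n, dx=0.1, R=0.5, alpha=1.0), 2)[15:35])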

  5. Idealized Mesoscale Model Simulations of Open Cellular Convection Over the Sea

    DEFF Research Database (Denmark)

    Vincent, Claire Louise; Hahmann, Andrea N.; Kelly, Mark C.

    2012-01-01

    The atmospheric conditions during an observed case of open cellular convection over the North Sea were simulated using the Weather Research and Forecasting (WRF) numerical model. Wind, temperature and water vapour mixing ratio profiles from the WRF simulation were used to initialize an idealized...... version of the model, which excluded the effects of topography, surface inhomogeneities and large-scale weather forcing. Cells with an average diameter of 17.4 km developed. Simulations both with and without a capping inversion were made, and the cell-scale kinetic energy budget was calculated for each...... case. By considering all sources of explicit diffusion in the model, the budgets were balanced. In comparison with previous work based on observational studies, the use of three-dimensional, gridded model data afforded the possibility of calculating all terms in the budgets, which showed...

  6. An examination of adaptive cellular protective mechanisms using a multi-stage carcinogenesis model

    International Nuclear Information System (INIS)

    Schollnberger, H.; Stewart, R. D.; Mitchel, R. E. J.; Hofmann, W.

    2004-01-01

    A multi-stage cancer model that describes the putative rate-limiting steps in carcinogenesis was developed and used to investigate the potential impact on lung cancer incidence of the hormesis mechanisms suggested by Feinendegen and Pollycove. In this deterministic cancer model, radiation and endogenous processes damage the DNA of target cells in the lung. Some fraction of the misrepaired or unrepaired DNA damage induces genomic instability and, ultimately, leads to the accumulation of malignant cells. The model accounts for cell birth and death processes. It also includes a rate of malignant transformation and a lag period for tumour formation. Cellular defence mechanisms are incorporated into the model by postulating dose and dose rate dependent radical scavenging. The accuracy of DNA damage repair also depends on dose and dose rate. Sensitivity studies were conducted to identify critical model inputs and to help define the shapes of the cumulative lung cancer incidence curves that may arise when dose and dose rate dependent cellular defence mechanisms are incorporated into a multi-stage cancer model. For lung cancer, both linear no-threshold (LNT) and non-LNT shaped responses can be obtained. The reported studies clearly show that it is critical to know whether or not and to what extent multiply damaged DNA sites are formed by endogenous processes. Model inputs that give rise to U-shaped responses are consistent with an effective cumulative lung cancer incidence threshold that may be as high as 300 mGy (4 mGy per year for 75 years). (Author) 11 refs

  7. Fast and accurate automated cell boundary determination for fluorescence microscopy

    Science.gov (United States)

    Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider

    2013-07-01

    Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.

  8. Continuous Automated Model EvaluatiOn (CAMEO) complementing the critical assessment of structure prediction in CASP12.

    Science.gov (United States)

    Haas, Jürgen; Barbato, Alessandro; Behringer, Dario; Studer, Gabriel; Roth, Steven; Bertoni, Martino; Mostaguir, Khaled; Gumienny, Rafal; Schwede, Torsten

    2018-03-01

    Every second year, the community experiment "Critical Assessment of Techniques for Structure Prediction" (CASP) is conducting an independent blind assessment of structure prediction methods, providing a framework for comparing the performance of different approaches and discussing the latest developments in the field. Yet, developers of automated computational modeling methods clearly benefit from more frequent evaluations based on larger sets of data. The "Continuous Automated Model EvaluatiOn (CAMEO)" platform complements the CASP experiment by conducting fully automated blind prediction assessments based on the weekly pre-release of sequences of those structures, which are going to be published in the next release of the PDB Protein Data Bank. CAMEO publishes weekly benchmarking results based on models collected during a 4-day prediction window, on average assessing ca. 100 targets during a time frame of 5 weeks. CAMEO benchmarking data is generated consistently for all participating methods at the same point in time, enabling developers to benchmark and cross-validate their method's performance, and directly refer to the benchmarking results in publications. In order to facilitate server development and promote shorter release cycles, CAMEO sends weekly email with submission statistics and low performance warnings. Many participants of CASP have successfully employed CAMEO when preparing their methods for upcoming community experiments. CAMEO offers a variety of scores to allow benchmarking diverse aspects of structure prediction methods. By introducing new scoring schemes, CAMEO facilitates new development in areas of active research, for example, modeling quaternary structure, complexes, or ligand binding sites. © 2017 Wiley Periodicals, Inc.

  9. Incremental learning for automated knowledge capture

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Basilico, Justin Derrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Davis, Warren Leon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dixon, Kevin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Brian S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Nathaniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wendt, Jeremy Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, require models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effect. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision-making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.

  10. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software supporting different engineering activities. In this paper, original software developed to automate engineering tasks at the stage of a product's geometrical shape design is presented. The software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with the .NET technology and the NX SNAP library. The software allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn with the standard tools of specialized CAD systems. This stems from the fact that in CAD systems an involute curve is usually drawn through 3 points corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. The Generator module also makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
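
    For illustration, the short sketch below samples an involute of a base circle at 11 parameter values between the base and addendum radii; the radii, point count and sampling scheme are assumptions for the example and not the Generator module's actual routine.

      import math

      def involute_points(r_base, r_addendum, n_points=11):
          """Sample points on the involute of a circle of radius r_base,
          from the base circle out to the addendum radius (illustrative sketch)."""
          # The involute reaches radius r at roll angle t = sqrt((r / r_base)**2 - 1).
          t_max = math.sqrt((r_addendum / r_base) ** 2 - 1.0)
          pts = []
          for i in range(n_points):
              t = t_max * i / (n_points - 1)
              x = r_base * (math.cos(t) + t * math.sin(t))
              y = r_base * (math.sin(t) - t * math.cos(t))
              pts.append((x, y))
          return pts

      # Example: base radius 40 mm, addendum radius 48 mm (hypothetical gear dimensions).
      for x, y in involute_points(40.0, 48.0):
          print(f"{x:8.3f} {y:8.3f}")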

  11. Use of the DynaLearn learning environment by naïve student modelers : Implications for automated support

    NARCIS (Netherlands)

    Noble, R.; Bredeweg, B.; Biswas, G.; Bull, S.; Kay, J.; Mitrovic, A.

    2011-01-01

    This paper shows that naïve students will require coaching to overcome the difficulties they face in identifying the important concepts to be modeled, and understanding the causal meta-vocabulary needed for conceptual models. The results of this study will be incorporated in the automated feedback

  12. On Modeling Large-Scale Multi-Agent Systems with Parallel, Sequential and Genuinely Asynchronous Cellular Automata

    International Nuclear Information System (INIS)

    Tosic, P.T.

    2011-01-01

    We study certain types of Cellular Automata (CA) viewed as an abstraction of large-scale Multi-Agent Systems (MAS). We argue that the classical CA model needs to be modified in several important respects, in order to become a relevant and sufficiently general model for large-scale MAS, and so that the generalized model can capture many important MAS properties at the level of agent ensembles and their long-term collective behavior patterns. We specifically focus on the issue of inter-agent communication in CA, and propose sequential cellular automata (SCA) as the first step, and genuinely Asynchronous Cellular Automata (ACA) as the ultimate deterministic CA-based abstract models for large-scale MAS made of simple reactive agents. We first formulate deterministic and nondeterministic versions of sequential CA, and then summarize some interesting configuration space properties (i.e., possible behaviors) of a restricted class of sequential CA. In particular, we compare and contrast those properties of sequential CA with the corresponding properties of the classical (that is, parallel and perfectly synchronous) CA with the same restricted class of update rules. We analytically demonstrate failure of the studied sequential CA models to simulate all possible behaviors of perfectly synchronous parallel CA, even for a very restricted class of non-linear totalistic node update rules. The lesson learned is that the interleaving semantics of concurrency, when applied to sequential CA, is not refined enough to adequately capture the perfect synchrony of parallel CA updates. Last but not least, we outline what would be an appropriate CA-like abstraction for large-scale distributed computing insofar as the inter-agent communication model is concerned, and in that context we propose genuinely asynchronous CA. (author)
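
    To make the parallel-versus-sequential distinction concrete, here is a minimal sketch (not the paper's formal construction) of a one-dimensional two-state CA updated once with the classical synchronous rule and once node-by-node in a fixed order; the totalistic rule used is an arbitrary example.

      def totalistic_rule(left, centre, right):
          # Example totalistic rule: next state depends only on the neighbourhood sum (assumed rule).
          return 1 if (left + centre + right) in (1, 2) else 0

      def step_parallel(cells):
          n = len(cells)
          return [totalistic_rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
                  for i in range(n)]

      def step_sequential(cells):
          # Nodes update one at a time; later nodes already see earlier nodes' new states.
          cells = list(cells)
          n = len(cells)
          for i in range(n):
              cells[i] = totalistic_rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
          return cells

      start = [0, 0, 1, 1, 0, 1, 0, 0]
      print("parallel  :", step_parallel(start))
      print("sequential:", step_sequential(start))
      # The two trajectories generally differ, illustrating why interleaving semantics
      # cannot in general reproduce perfectly synchronous updates.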

  13. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    Full Text Available The ongoing penetration of building automation by information technology is far from saturated. Today's systems not only need to be reliable and fault tolerant, they also have to take energy efficiency and flexibility in the overall consumption into account. Meeting the quality and comfort goals in building automation while at the same time optimizing towards energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands—a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re-)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy optimal systems. A recently developed model for environment recognition and decision-making processes, which is based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.

  14. Interoperability between OPC UA and AutomationML

    OpenAIRE

    Henßen, Robert; Schleipen, Miriam

    2014-01-01

    OPC UA (OPC Unified Architecture) is a platform-independent standard series (IEC 62541) [1], [2] for communication of industrial automation devices and systems. The OPC Unified Architecture is an advanced communication technology for process control. Certainly the launching costs for the initial information model are quite high. AutomationML (Automation Markup Language) is an upcoming open standard series (IEC 62714) [3], [4] for describing production plants or plant components. The goal of t...

  15. Studying human-automation interactions: methodological lessons learned from the human-centred automation experiments 1997-2001

    International Nuclear Information System (INIS)

    Massaiu, Salvatore; Skjerve, Ann Britt Miberg; Skraaning, Gyrd Jr.; Strand, Stine; Waeroe, Irene

    2004-04-01

    This report documents the methodological lessons learned from the Human Centred Automation (HCA) programme, both in terms of psychometric evaluation of the measurement techniques developed for human-automation interaction study, and in terms of the application of advanced statistical methods for analysis of experiments. The psychometric evaluation is based on data from the four experiments performed within the HCA programme. The result is a single-source reference text of measurement instruments for the study of human-automation interaction, some of which were specifically developed by the programme. The application of advanced statistical techniques is exemplified by additional analyses performed on the IPSN-HCA experiment of 1998. Special importance is given to the statistical technique Structural Equation Modeling, for the possibility it offers to advance, and empirically test, comprehensive explanations about human-automation interactions. The additional analyses of the IPSN-HCA experiment investigated how the operators formed judgments about their own performance. The issue is of substantive interest for human-automation interaction research because the operators' over- or underestimation of their own performance could be seen as a symptom of human-machine mismatch, and a potential latent failure. These analyses concluded that the operators' bias in performance self-estimation is determined by the interplay of several factors: (1) the level of automation, (2) the nature of the task, (3) the level of scenario complexity, and (4) the level of trust in the automatic system. A structural model that expresses the interplay of all these factors was empirically evaluated and was found able to provide a concise and elegant explanation of the intricate pattern of relationships between the identified factors. (Author)

  16. A new cellular automata model of traffic flow with negative exponential weighted look-ahead potential

    Science.gov (United States)

    Ma, Xiao; Zheng, Wei-Fan; Jiang, Bao-Shan; Zhang, Ji-Ye

    2016-10-01

    With the development of traffic systems, some issues such as traffic jams become more and more serious. Efficient traffic flow theory is needed to guide the overall controlling, organizing and management of traffic systems. On the basis of the cellular automata model and the traffic flow model with look-ahead potential, a new cellular automata traffic flow model with negative exponential weighted look-ahead potential is presented in this paper. By introducing the negative exponential weighting coefficient into the look-ahead potential and endowing the potential of vehicles closer to the driver with a greater coefficient, the modeling process is more suitable for the driver’s random decision-making process which is based on the traffic environment that the driver is facing. The fundamental diagrams for different weighting parameters are obtained by using numerical simulations which show that the negative exponential weighting coefficient has an obvious effect on high density traffic flux. The complex high density non-linear traffic behavior is also reproduced by numerical simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11572264, 11172247, 11402214, and 61373009).
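
    A rough sketch of the idea follows, under assumed details (single-cell vehicles on a ring whose hop probability is modulated by an exponentially weighted count of empty cells ahead); it is not the authors' exact update rule or parameterisation.

      import math, random

      def step(road, beta=0.5, horizon=5, p0=0.9):
          """One update of a toy look-ahead traffic CA on a ring (illustrative sketch).
          road[i] is 1 if cell i is occupied. beta is the decay rate of the
          negative exponential weighting; closer empty cells count more."""
          n = len(road)
          new = road[:]
          for i in range(n):
              if road[i] == 1 and road[(i + 1) % n] == 0:
                  # Exponentially weighted look-ahead potential of the cells ahead.
                  potential = sum(math.exp(-beta * k) * (1 - road[(i + 1 + k) % n])
                                  for k in range(horizon))
                  p_move = p0 * potential / sum(math.exp(-beta * k) for k in range(horizon))
                  if random.random() < p_move:
                      new[i], new[(i + 1) % n] = 0, 1
          return new

      road = [1 if random.random() < 0.3 else 0 for _ in range(100)]
      for _ in range(200):
          road = step(road)
      print("density:", sum(road) / len(road))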

  17. Protein-protein interaction networks identify targets which rescue the MPP+ cellular model of Parkinson’s disease

    Science.gov (United States)

    Keane, Harriet; Ryan, Brent J.; Jackson, Brendan; Whitmore, Alan; Wade-Martins, Richard

    2015-11-01

    Neurodegenerative diseases are complex multifactorial disorders characterised by the interplay of many dysregulated physiological processes. As an exemplar, Parkinson’s disease (PD) involves multiple perturbed cellular functions, including mitochondrial dysfunction and autophagic dysregulation in preferentially-sensitive dopamine neurons, a selective pathophysiology recapitulated in vitro using the neurotoxin MPP+. Here we explore a network science approach for the selection of therapeutic protein targets in the cellular MPP+ model. We hypothesised that analysis of protein-protein interaction networks modelling MPP+ toxicity could identify proteins critical for mediating MPP+ toxicity. Analysis of protein-protein interaction networks constructed to model the interplay of mitochondrial dysfunction and autophagic dysregulation (key aspects of MPP+ toxicity) enabled us to identify four proteins predicted to be key for MPP+ toxicity (P62, GABARAP, GBRL1 and GBRL2). Combined, but not individual, knockdown of these proteins increased cellular susceptibility to MPP+ toxicity. Conversely, combined, but not individual, over-expression of the network targets provided rescue of MPP+ toxicity associated with the formation of autophagosome-like structures. We also found that modulation of two distinct proteins in the protein-protein interaction network was necessary and sufficient to mitigate neurotoxicity. Together, these findings validate our network science approach to multi-target identification in complex neurological diseases.

  18. A Computational Model of Cellular Engraftment on Lung Scaffolds.

    Science.gov (United States)

    Pothen, Joshua J; Rajendran, Vignesh; Wagner, Darcy; Weiss, Daniel J; Smith, Bradford J; Ma, Baoshun; Bates, Jason H T

    2016-01-01

    The possibility that stem cells might be used to regenerate tissue is now being investigated for a variety of organs, but these investigations are still essentially exploratory and have few predictive tools available to guide experimentation. We propose, in this study, that the field of lung tissue regeneration might be better served by predictive tools that treat stem cells as agents that obey certain rules of behavior governed by both their phenotype and their environment. Sufficient knowledge of these rules of behavior would then, in principle, allow lung tissue development to be simulated computationally. Toward this end, we developed a simple agent-based computational model to simulate geographic patterns of cells seeded onto a lung scaffold. Comparison of the simulated patterns to those observed experimentally supports the hypothesis that mesenchymal stem cells proliferate preferentially toward the scaffold boundary, whereas alveolar epithelial cells do not. This demonstrates that a computational model of this type has the potential to assist in the discovery of rules of cellular behavior.
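
    A toy sketch in the same spirit (not the authors' model or its parameters): agents on a square lattice divide with a probability that, for the mesenchymal-like phenotype, grows toward the scaffold boundary.

      import random

      SIZE = 51                     # square lattice representing a scaffold slice (assumed)
      CENTRE = (SIZE // 2, SIZE // 2)

      def division_probability(x, y, boundary_biased):
          """Division probability; 'boundary_biased' cells divide more readily near the edge."""
          if not boundary_biased:
              return 0.2                                  # uniform division probability (assumed)
          d = max(abs(x - CENTRE[0]), abs(y - CENTRE[1])) / (SIZE // 2)
          return 0.05 + 0.3 * d                           # higher probability toward the boundary

      def grow(boundary_biased, steps=30, seed_cells=20):
          occupied = set()
          while len(occupied) < seed_cells:
              occupied.add((random.randrange(SIZE), random.randrange(SIZE)))
          for _ in range(steps):
              for (x, y) in list(occupied):
                  if random.random() < division_probability(x, y, boundary_biased):
                      nx, ny = x + random.choice((-1, 0, 1)), y + random.choice((-1, 0, 1))
                      if 0 <= nx < SIZE and 0 <= ny < SIZE:
                          occupied.add((nx, ny))
          return occupied

      print("boundary-biased cells:", len(grow(True)), " unbiased cells:", len(grow(False)))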

  19. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for quality assurance of electronics is presented as most important. To provide quality, the sequence of processes is considered and modeled as a Markov chain. The improvement relies on simple database means of design for manufacturing, intended for future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. MATLAB modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the process sequence, from individual processes up to the whole life cycle.
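
    As a minimal illustration of modelling a process sequence as a Markov chain, the sketch below uses hypothetical states and transition probabilities; the actual states and probabilities of the paper are not reproduced here.

      import random

      # Hypothetical process states and transition probabilities (illustrative only).
      states = ["design", "prototyping", "manufacturing", "test", "done"]
      P = {
          "design":        {"prototyping": 0.8, "design": 0.2},
          "prototyping":   {"manufacturing": 0.7, "design": 0.3},
          "manufacturing": {"test": 1.0},
          "test":          {"done": 0.9, "manufacturing": 0.1},
          "done":          {"done": 1.0},
      }

      def run(start="design", max_steps=50):
          s, path = start, [start]
          for _ in range(max_steps):
              if s == "done":
                  break
              nxt = random.choices(list(P[s]), weights=list(P[s].values()))[0]
              path.append(nxt)
              s = nxt
          return path

      print(run())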

  20. A Cellular Automata Model of Infection Control on Medical Implants

    Science.gov (United States)

    Prieto-Langarica, Alicia; Kojouharov, Hristo; Chen-Charpentier, Benito; Tang, Liping

    2011-01-01

    S. epidermidis infections on medically implanted devices are a common problem in modern medicine due to the abundance of the bacteria. Once inside the body, S. epidermidis gather in communities called biofilms and can become extremely hard to eradicate, causing the patient serious complications. We simulate the complex S. epidermidis-Neutrophils interactions in order to determine the optimum conditions for the immune system to be able to contain the infection and avoid implant rejection. Our cellular automata model can also be used as a tool for determining the optimal amount of antibiotics for combating biofilm formation on medical implants. PMID:23543851

  1. [Reparative and neoplastic spheroid cellular structures and their mathematical model].

    Science.gov (United States)

    Kogan, E A; Namiot, V A; Demura, T A; Faĭzullina, N M; Sukhikh, G T

    2014-01-01

    Spheroid cell structures in cell cultures have been described and are used for studying cell-cell and cell-matrix interactions. At the same time, the participation of spheroid cell structures in repair and in the development of cancer in vivo remains unexplored. The aim of this study was to investigate the cellular composition of spherical structures and their functional significance in the repair of squamous epithelium in human papilloma virus-associated cervical pathology (chronic cervicitis and cervical intraepithelial neoplasia grades 1-3), and also to construct a mathematical model explaining the development and behavior of such spheroid cell structures.

  2. Silver Nanoparticle-Mediated Cellular Responses in Various Cell Lines: An in Vitro Model

    Directory of Open Access Journals (Sweden)

    Xi-Feng Zhang

    2016-09-01

    Full Text Available Silver nanoparticles (AgNPs have attracted increased interest and are currently used in various industries including medicine, cosmetics, textiles, electronics, and pharmaceuticals, owing to their unique physical and chemical properties, particularly as antimicrobial and anticancer agents. Recently, several studies have reported both beneficial and toxic effects of AgNPs on various prokaryotic and eukaryotic systems. To develop nanoparticles for mediated therapy, several laboratories have used a variety of cell lines under in vitro conditions to evaluate the properties, mode of action, differential responses, and mechanisms of action of AgNPs. In vitro models are simple, cost-effective, rapid, and can be used to easily assess efficacy and performance. The cytotoxicity, genotoxicity, and biocompatibility of AgNPs depend on many factors such as size, shape, surface charge, surface coating, solubility, concentration, surface functionalization, distribution of particles, mode of entry, mode of action, growth media, exposure time, and cell type. Cellular responses to AgNPs are different in each cell type and depend on the physical and chemical nature of AgNPs. This review evaluates significant contributions to the literature on biological applications of AgNPs. It begins with an introduction to AgNPs, with particular attention to their overall impact on cellular effects. The main objective of this review is to elucidate the reasons for different cell types exhibiting differential responses to nanoparticles even when they possess similar size, shape, and other parameters. Firstly, we discuss the cellular effects of AgNPs on a variety of cell lines; Secondly, we discuss the mechanisms of action of AgNPs in various cellular systems, and try to elucidate how AgNPs interact with different mammalian cell lines and produce significant effects; Finally, we discuss the cellular activation of various signaling molecules in response to AgNPs, and conclude with

  3. Mathematical modelling of the automated FADU assay for the quantification of DNA strand breaks and their repair in human peripheral mononuclear blood cells

    International Nuclear Information System (INIS)

    Junk, Michael; Salzwedel, Judy; Sindlinger, Thilo; Bürkle, Alexander; Moreno-Villanueva, Maria

    2014-01-01

    Cells continuously undergo DNA damage from exogenous agents like irradiation or genotoxic chemicals or from endogenous radicals produced by normal cellular metabolic activities. DNA strand breaks are one of the most common genotoxic lesions and they can also arise as intermediates of DNA repair activity. Unrepaired DNA damage can lead to genomic instability, which can massively compromise the health status of organisms. Therefore it is important to measure and quantify DNA damage and its repair. We have previously published an automated method for measuring DNA strand breaks based on fluorimetric detection of alkaline DNA unwinding [1], and here we present a mathematical model of the FADU assay, which leads to an analytic expression for the relation between measured fluorescence and the number of strand breaks. Assessment of the formation and also the repair of DNA strand breaks is a crucial functional parameter to investigate genotoxicity in living cells. A reliable and convenient method to quantify DNA strand breakage is therefore of significant importance for a wide variety of scientific fields, e.g. toxicology, pharmacology, epidemiology and medical sciences.

  4. An algorithm to automate yeast segmentation and tracking.

    Directory of Open Access Journals (Sweden)

    Andreas Doncic

    Full Text Available Our understanding of dynamic cellular processes has been greatly enhanced by rapid advances in quantitative fluorescence microscopy. Imaging single cells has emphasized the prevalence of phenomena that can be difficult to infer from population measurements, such as all-or-none cellular decisions, cell-to-cell variability, and oscillations. Examination of these phenomena requires segmenting and tracking individual cells over long periods of time. However, accurate segmentation and tracking of cells is difficult and is often the rate-limiting step in an experimental pipeline. Here, we present an algorithm that accomplishes fully automated segmentation and tracking of budding yeast cells within growing colonies. The algorithm incorporates prior information of yeast-specific traits, such as immobility and growth rate, to segment an image using a set of threshold values rather than one specific optimized threshold. Results from the entire set of thresholds are then used to perform a robust final segmentation.
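
    A minimal sketch of the multi-threshold idea (the published algorithm additionally uses yeast-specific priors such as immobility and growth rate, which are omitted here): segment the image at a whole set of thresholds and keep the pixels that are foreground in most of them.

      import numpy as np

      def multi_threshold_segment(img, n_thresholds=20, vote_fraction=0.5):
          """Segment a fluorescence image by voting over many thresholds (illustrative)."""
          img = img.astype(float)
          lo, hi = np.percentile(img, 5), np.percentile(img, 95)
          thresholds = np.linspace(lo, hi, n_thresholds)
          votes = np.zeros(img.shape, dtype=int)
          for t in thresholds:
              votes += (img > t).astype(int)
          return votes >= vote_fraction * n_thresholds    # robust final mask

      # Synthetic example: a bright disc on a noisy background (hypothetical data).
      yy, xx = np.mgrid[0:100, 0:100]
      img = 50 + 10 * np.random.randn(100, 100) + 100 * ((xx - 50) ** 2 + (yy - 50) ** 2 < 400)
      mask = multi_threshold_segment(img)
      print("segmented pixels:", int(mask.sum()))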

  5. Modeling and Experimental Study of Soft Error Propagation Based on Cellular Automaton

    OpenAIRE

    He, Wei; Wang, Yueke; Xing, Kefei; Yang, Jianwei

    2016-01-01

    Aiming to estimate SEE soft error performance of complex electronic systems, a soft error propagation model based on cellular automaton is proposed and an estimation methodology based on circuit partitioning and error propagation is presented. Simulations indicate that different fault grade jamming and different coupling factors between cells are the main parameters influencing the vulnerability of the system. Accelerated radiation experiments have been developed to determine the main paramet...

  6. Cellular Automaton Modeling of Dendritic Growth Using a Multi-grid Method

    International Nuclear Information System (INIS)

    Natsume, Y; Ohsasa, K

    2015-01-01

    A two-dimensional cellular automaton model with a multi-grid method was developed to simulate dendritic growth. In the present model, we used a triple-grid system for temperature, solute concentration and solid fraction fields as a new approach of the multi-grid method. In order to evaluate the validity of the present model, we carried out simulations of single dendritic growth, secondary dendrite arm growth, multi-columnar dendritic growth and multi-equiaxed dendritic growth. From the results of the grid dependency from the simulation of single dendritic growth, we confirmed that the larger grid can be used in the simulation and that the computational time can be reduced dramatically. In the simulation of secondary dendrite arm growth, the results from the present model were in good agreement with the experimental data and the simulated results from a phase-field model. Thus, the present model can quantitatively simulate dendritic growth. From the simulated results of multi-columnar and multi-equiaxed dendrites, we confirmed that the present model can perform simulations under practical solidification conditions. (paper)

  7. A new stochastic cellular automaton model on traffic flow and its jamming phase transition

    International Nuclear Information System (INIS)

    Sakai, Satoshi; Nishinari, Katsuhiro; Iida, Shinji

    2006-01-01

    A general stochastic traffic cellular automaton (CA) model, which includes the slow-to-start effect and driver's perspective, is proposed in this paper. It is shown that this model includes well-known traffic CA models such as the Nagel-Schreckenberg model, the quick-start model and the slow-to-start model as specific cases. Fundamental diagrams of this new model clearly show metastable states around the critical density even when the stochastic effect is present. We also obtain analytic expressions of the phase transition curve in phase diagrams by using approximate flow-density relations at boundaries. These phase transition curves are in excellent agreement with numerical results
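
    For context, a compact sketch of a Nagel-Schreckenberg-type CA extended with a slow-to-start rule, one of the specific cases the general model reduces to; the parameters and rule details below are assumptions for illustration.

      import random

      def step(pos, vel, length, vmax=5, p=0.3, p_slow=0.5):
          """One parallel update of a NaSch-like CA with a slow-to-start effect (sketch)."""
          n = len(pos)
          order = sorted(range(n), key=lambda i: pos[i])
          new_pos, new_vel = pos[:], vel[:]
          for idx, i in enumerate(order):
              j = order[(idx + 1) % n]                       # vehicle ahead on the ring
              gap = (pos[j] - pos[i] - 1) % length
              v = vel[i]
              if v == 0 and random.random() < p_slow:        # slow-to-start: stopped cars may stay put
                  new_vel[i] = 0
                  continue
              v = min(v + 1, vmax, gap)                      # accelerate, but respect the gap
              if v > 0 and random.random() < p:              # random deceleration
                  v -= 1
              new_vel[i] = v
              new_pos[i] = (pos[i] + v) % length
          return new_pos, new_vel

      length, ncars = 200, 50
      pos = random.sample(range(length), ncars)
      vel = [0] * ncars
      for _ in range(500):
          pos, vel = step(pos, vel, length)
      print("mean speed:", sum(vel) / ncars)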

  8. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows one to change processes quickly and to adapt easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that, with the BPM standard, a method of sharing process knowledge between laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  9. Oxidative Damage and Cellular Defense Mechanisms in Sea Urchin Models of Aging

    Science.gov (United States)

    Du, Colin; Anderson, Arielle; Lortie, Mae; Parsons, Rachel; Bodnar, Andrea

    2013-01-01

    The free radical or oxidative stress theory of aging proposes that the accumulation of oxidative cellular damage is a major contributor to the aging process and a key determinant of species longevity. This study investigates the oxidative stress theory in a novel model for aging research, the sea urchin. Sea urchins present a unique model for the study of aging due to the existence of species with tremendously different natural life spans including some species with extraordinary longevity and negligible senescence. Cellular oxidative damage, antioxidant capacity and proteasome enzyme activities were measured in the tissues of three sea urchin species: short-lived Lytechinus variegatus, long-lived Strongylocentrotus franciscanus and Strongylocentrotus purpuratus which has an intermediate lifespan. Levels of protein carbonyls and 4-hydroxynonenal (HNE) measured in tissues (muscle, nerve, esophagus, gonad, coelomocytes, ampullae) and 8-hydroxy-2’-deoxyguanosine (8-OHdG) measured in cell-free coelomic fluid showed no general increase with age. The fluorescent age-pigment lipofuscin measured in muscle, nerve and esophagus, increased with age however it appeared to be predominantly extracellular. Antioxidant mechanisms (total antioxidant capacity, superoxide dismutase) and proteasome enzyme activities were maintained with age. In some instances, levels of oxidative damage were lower and antioxidant activity higher in cells or tissues of the long-lived species compared to the short-lived species, however further studies are required to determine the relationship between oxidative damage and longevity in these animals. Consistent with the predictions of the oxidative stress theory of aging, the results suggest that negligible senescence is accompanied by a lack of accumulation of cellular oxidative damage with age and maintenance of antioxidant capacity and proteasome enzyme activities may be important mechanisms to mitigate damage. PMID:23707327

  10. Automation of the ELISpot assay for high-throughput detection of antigen-specific T-cell responses.

    Science.gov (United States)

    Almeida, Coral-Ann M; Roberts, Steven G; Laird, Rebecca; McKinnon, Elizabeth; Ahmed, Imran; Pfafferott, Katja; Turley, Joanne; Keane, Niamh M; Lucas, Andrew; Rushton, Ben; Chopra, Abha; Mallal, Simon; John, Mina

    2009-05-15

    The enzyme linked immunospot (ELISpot) assay is a fundamental tool in cellular immunology, providing both quantitative and qualitative information on cellular cytokine responses to defined antigens. It enables the comprehensive screening of patient-derived peripheral blood mononuclear cells to reveal the antigenic restriction of T-cell responses and is an emerging technique in clinical laboratory investigation of certain infectious diseases. As with all cellular-based assays, the final results of the assay are dependent on a number of technical variables that may impact precision if not highly standardised between operators. When large-scale studies or studies using multiple antigens are set up manually, these assays may be labour intensive, have many manual handling steps, are subject to data and sample integrity failure and may show large inter-operator variability. Here we describe the successful automated performance of the interferon (IFN)-gamma ELISpot assay from cell counting through to electronic capture of cytokine quantitation and present the results of a comparison between automated and manual performance of the ELISpot assay. The mean number of spot forming units enumerated by both methods for limiting dilutions of CMV, EBV and influenza (CEF)-derived peptides in six healthy individuals were highly correlated (r > 0.83). The automated system compared favourably with the manual ELISpot and further ensured electronic tracking, increased throughput and reduced turnaround time.

  11. The Design of Fault Tolerant Quantum Dot Cellular Automata Based Logic

    Science.gov (United States)

    Armstrong, C. Duane; Humphreys, William M.; Fijany, Amir

    2002-01-01

    As transistor geometries are reduced, quantum effects begin to dominate device performance. At some point, transistors cease to have the properties that make them useful computational components. New computing elements must be developed in order to keep pace with Moore's Law. Quantum dot cellular automata (QCA) represent an alternative paradigm to transistor-based logic. QCA architectures that are robust to manufacturing tolerances and defects must be developed. We are developing software that allows the exploration of fault tolerant QCA gate architectures by automating the specification, simulation, analysis and documentation processes.

  12. Physics of automated driving in framework of three-phase traffic theory.

    Science.gov (United States)

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.

  13. Necessity of supporting situation understanding to prevent over-trust in automation

    International Nuclear Information System (INIS)

    Itoh, Makoto

    2010-01-01

    It is necessary to clarify how a human operator comes to expect that an automation will perform a task successfully even beyond the limit of the automation. This paper proposes a way of modeling trust in automation by which it is possible to discuss how an operator's trust in automation becomes over-trust. On the basis of the model, an experiment using a micro world was conducted to examine whether the range of user's expectation per se surpasses the limit of the capability of the automation. The results revealed that informing the human operator of the functional limit of capability of automation without giving an appropriate reason was effective but not perfect for avoiding human operator's over trust. It was also shown that the understanding of the automation's limitation can be changed through experiences due to confusion about the situation, therefore, it is necessary to support adequate situation understanding of the human operator in order to prevent over-trust in automation. (author)

  15. Automated multiscale morphometry of muscle disease from second harmonic generation microscopy using tensor-based image processing.

    Science.gov (United States)

    Garbe, Christoph S; Buttgereit, Andreas; Schürmann, Sebastian; Friedrich, Oliver

    2012-01-01

    Practically, all chronic diseases are characterized by tissue remodeling that alters organ and cellular function through changes to normal organ architecture. Some morphometric alterations become irreversible and account for disease progression even on cellular levels. Early diagnostics to categorize tissue alterations, as well as monitoring progression or remission of disturbed cytoarchitecture upon treatment in the same individual, are a new emerging field. They strongly challenge spatial resolution and require advanced imaging techniques and strategies for detecting morphological changes. We use a combined second harmonic generation (SHG) microscopy and automated image processing approach to quantify morphology in an animal model of inherited Duchenne muscular dystrophy (mdx mouse) with age. Multiphoton XYZ image stacks from tissue slices reveal vast morphological deviation in muscles from old mdx mice at different scales of cytoskeleton architecture: cell calibers are irregular, myofibrils within cells are twisted, and sarcomere lattice disruptions (detected as "verniers") are larger in number compared to samples from healthy mice. In young mdx mice, such alterations are only minor. The boundary-tensor approach, adapted and optimized for SHG data, is a suitable approach to allow quick quantitative morphometry in whole tissue slices. The overall detection performance of the automated algorithm compares very well with manual "by eye" detection, the latter being time consuming and prone to subjective errors. Our algorithm outperforms manual detection in terms of time, with similar reliability. This approach will be an important prerequisite for the implementation of clinical image databases to diagnose and monitor specific morphological alterations in chronic (muscle) diseases. © 2011 IEEE

  16. Cellular automaton model of mass transport with chemical reactions

    International Nuclear Information System (INIS)

    Karapiperis, T.; Blankleider, B.

    1993-10-01

    The transport and chemical reactions of solutes are modelled as a cellular automaton in which molecules of different species perform a random walk on a regular lattice and react according to a local probabilistic rule. The model describes advection and diffusion in a simple way, and as no restriction is placed on the number of particles at a lattice site, it is also able to describe a wide variety of chemical reactions. Assuming molecular chaos and a smooth density function, we obtain the standard reaction-transport equations in the continuum limit. Simulations on one- and two-dimensional lattices show that the discrete model can be used to approximate the solutions of the continuum equations. We discuss discrepancies which arise from correlations between molecules and how these discrepancies disappear as the continuum limit is approached. Of particular interest are simulations displaying long-time behaviour which depends on long-wavelength statistical fluctuations not accounted for by the standard equations. The model is applied to the reactions a + b ↔ c and a + b → c with homogeneous and inhomogeneous initial conditions as well as to systems subject to autocatalytic reactions and displaying spontaneous formation of spatial concentration patterns. (author) 9 figs., 34 refs
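
    A bare-bones sketch of this kind of lattice model follows, with illustrative assumptions: a one-dimensional lattice, an unbiased random walk, and the irreversible reaction a + b → c applied with a fixed per-site probability.

      import random
      from collections import defaultdict

      L, STEPS, K_REACT = 100, 200, 0.5   # lattice size, time steps, reaction probability (assumed)

      # Each site holds a dict of species counts; no limit on particles per site, as in the model described.
      def init():
          lattice = [defaultdict(int) for _ in range(L)]
          for _ in range(200):
              lattice[random.randrange(L // 2)]["a"] += 1           # species a seeded on the left half
              lattice[L // 2 + random.randrange(L // 2)]["b"] += 1  # species b seeded on the right half
          return lattice

      def step(lattice):
          new = [defaultdict(int) for _ in range(L)]
          for x, site in enumerate(lattice):                        # diffusion: unbiased random walk
              for species, count in site.items():
                  for _ in range(count):
                      new[(x + random.choice((-1, 1))) % L][species] += 1
          for site in new:                                          # local probabilistic reaction a + b -> c
              while site["a"] > 0 and site["b"] > 0 and random.random() < K_REACT:
                  site["a"] -= 1
                  site["b"] -= 1
                  site["c"] += 1
          return new

      lattice = init()
      for _ in range(STEPS):
          lattice = step(lattice)
      print("total c produced:", sum(site["c"] for site in lattice))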

  17. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum.

    Science.gov (United States)

    Brunger, Axel T; Das, Debanu; Deacon, Ashley M; Grant, Joanna; Terwilliger, Thomas C; Read, Randy J; Adams, Paul D; Levitt, Michael; Schröder, Gunnar F

    2012-04-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence.

  18. Mathematical Modeling of Tuberculosis Bacillary Counts and Cellular Populations in the Organs of Infected Mice

    Science.gov (United States)

    Bru, Antonio; Cardona, Pere-Joan

    2010-01-01

    Background Mycobacterium tuberculosis is a particularly aggressive microorganism and the host's defense is based on the induction of cellular immunity, in which the creation of a granulomatous structure has an important role. Methodology We present here a new 2D cellular automata model based on the concept of a multifunctional process that includes key factors such as the chemokine attraction of the cells; the role of innate immunity triggered by natural killers; the presence of neutrophils; apoptosis and necrosis of infected macrophages; the removal of dead cells by macrophages, which induces the production of foamy macrophages (FMs); the life cycle of the bacilli as a determinant for the evolution of infected macrophages; and the immune response. Results The results obtained after the inclusion of two degrees of tolerance to the inflammatory response triggered by the infection show that the model can cover a wide spectrum, ranging from highly-tolerant (i.e. mice) to poorly-tolerant hosts (i.e. mini-pigs or humans). Conclusions This model suggests that stopping bacillary growth at the onset of the infection might be difficult, and highlights the important role played by FMs in bacillary drainage in poorly-tolerant hosts, together with apoptosis and innate lymphocytes. It also shows the poor ability of the cellular immunity to control the infection, provides a clear protective character to the granuloma, due to its ability to attract a sufficient number of cells, and explains why an already infected host can be constantly reinfected. PMID:20886087

  19. Strength analysis and modeling of cellular lattice structures manufactured using selective laser melting for tooling applications

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Hansen, Hans Nørgaard; Loft Højbjerre, Klaus

    2016-01-01

    Additive manufacturing is rapidly developing and gaining popularity for direct metal fabrication systems like selective laser melting (SLM). The technology has shown significant improvement for high-quality fabrication of lightweight design-efficient structures such as conformal cooling channels in injection molding tools and lattice structures. This research examines the effect of cellular lattice structures on the strength of workpieces additively manufactured from ultra high-strength steel powder. Two commercial SLM machines are used to fabricate cellular samples based on four architectures: solid, hollow, lattice structure and rotated lattice structure. A compression test is applied to the specimens while they are deformed. The analytical approach includes finite element (FE), geometrical and mathematical models for prediction of collapse strength. The results from the models are verified

  20. Origami interleaved tube cellular materials

    International Nuclear Information System (INIS)

    Cheung, Kenneth C; Tachi, Tomohiro; Calisch, Sam; Miura, Koryo

    2014-01-01

    A novel origami cellular material based on a deployable cellular origami structure is described. The structure is bi-directionally flat-foldable in two orthogonal (x and y) directions and is relatively stiff in the third orthogonal (z) direction. While such mechanical orthotropicity is well known in cellular materials with extruded two dimensional geometry, the interleaved tube geometry presented here consists of two orthogonal axes of interleaved tubes with high interfacial surface area and relative volume that changes with fold-state. In addition, the foldability still allows for fabrication by a flat lamination process, similar to methods used for conventional expanded two dimensional cellular materials. This article presents the geometric characteristics of the structure together with corresponding kinematic and mechanical modeling, explaining the orthotropic elastic behavior of the structure with classical dimensional scaling analysis. (paper)

  1. Origami interleaved tube cellular materials

    Science.gov (United States)

    Cheung, Kenneth C.; Tachi, Tomohiro; Calisch, Sam; Miura, Koryo

    2014-09-01

    A novel origami cellular material based on a deployable cellular origami structure is described. The structure is bi-directionally flat-foldable in two orthogonal (x and y) directions and is relatively stiff in the third orthogonal (z) direction. While such mechanical orthotropicity is well known in cellular materials with extruded two dimensional geometry, the interleaved tube geometry presented here consists of two orthogonal axes of interleaved tubes with high interfacial surface area and relative volume that changes with fold-state. In addition, the foldability still allows for fabrication by a flat lamination process, similar to methods used for conventional expanded two dimensional cellular materials. This article presents the geometric characteristics of the structure together with corresponding kinematic and mechanical modeling, explaining the orthotropic elastic behavior of the structure with classical dimensional scaling analysis.

  2. Predictive modeling of multicellular structure formation by using Cellular Particle Dynamics simulations

    Science.gov (United States)

    McCune, Matthew; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan

    2014-03-01

    Cellular Particle Dynamics (CPD) is an effective computational method for describing and predicting the time evolution of biomechanical relaxation processes of multicellular systems. A typical example is the fusion of spheroidal bioink particles during post-bioprinting structure formation. In CPD cells are modeled as an ensemble of cellular particles (CPs) that interact via short-range contact interactions, characterized by an attractive (adhesive interaction) and a repulsive (excluded volume interaction) component. The time evolution of the spatial conformation of the multicellular system is determined by following the trajectories of all CPs through integration of their equations of motion. CPD was successfully applied to describe and predict the fusion of 3D tissue constructs involving identical spherical aggregates. Here, we demonstrate that CPD can also predict tissue formation involving uneven spherical aggregates whose volumes decrease during the fusion process. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
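
    A stripped-down sketch of the CPD idea (not the authors' implementation): cellular particles in two dimensions follow overdamped dynamics under a short-range potential with a repulsive core and an attractive shoulder, plus thermal noise; all parameters below are placeholders.

      import math, random

      R_CORE, R_CUT = 1.0, 2.5                          # repulsive core and interaction cutoff (assumed units)
      K_REP, K_ADH, NOISE, DT = 5.0, 1.0, 0.05, 0.01    # illustrative parameters

      def force(dx, dy):
          r = math.hypot(dx, dy)
          if r < 1e-9 or r > R_CUT:
              return 0.0, 0.0
          if r < R_CORE:
              f = K_REP * (R_CORE - r)          # excluded-volume repulsion
          else:
              f = -K_ADH * (r - R_CORE)         # short-range adhesion pulls particles together
          return f * dx / r, f * dy / r

      def step(particles):
          new = []
          for i, (x, y) in enumerate(particles):
              fx = fy = 0.0
              for j, (xo, yo) in enumerate(particles):
                  if i != j:
                      dfx, dfy = force(x - xo, y - yo)
                      fx, fy = fx + dfx, fy + dfy
              # Overdamped update with thermal noise (illustrative CPD-style dynamics).
              new.append((x + DT * fx + NOISE * random.gauss(0, 1),
                          y + DT * fy + NOISE * random.gauss(0, 1)))
          return new

      # Two small touching aggregates of cellular particles (hypothetical initial state).
      particles = [(random.gauss(-2, 0.8), random.gauss(0, 0.8)) for _ in range(30)] + \
                  [(random.gauss(+2, 0.8), random.gauss(0, 0.8)) for _ in range(30)]
      for _ in range(1000):
          particles = step(particles)
      print("final spread:", max(x for x, _ in particles) - min(x for x, _ in particles))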

  3. Designing a Software Test Automation Framework

    Directory of Open Access Journals (Sweden)

    Sabina AMARICAI

    2014-01-01

    Full Text Available Testing is an art and science that should ultimately lead to lower cost businesses through increasing control and reducing risk. Testing specialists should thoroughly understand the system or application from both the technical and the business perspective, and then design, build and implement the minimum-cost, maximum-coverage validation framework. Test Automation is an important ingredient for testing large scale applications. In this paper we discuss several test automation frameworks, their advantages and disadvantages. We also propose a custom automation framework model that is suited for applications with very complex business requirements and numerous interfaces.

  4. Advanced spatial metrics analysis in cellular automata land use and cover change modeling

    International Nuclear Information System (INIS)

    Zamyatin, Alexander; Cabral, Pedro

    2011-01-01

    This paper proposes an approach for a more effective definition of cellular automata transition rules for landscape change modeling using an advanced spatial metrics analysis. This approach considers a four-stage methodology based on: (i) the search for the appropriate spatial metrics with minimal correlations; (ii) the selection of the appropriate neighborhood size; (iii) the selection of the appropriate technique for spatial metrics application; and (iv) the analysis of the contribution level of each spatial metric for joint use. The case study uses an initial set of 7 spatial metrics of which 4 are selected for modeling. Results show a better model performance when compared to modeling without any spatial metrics or with the initial set of 7 metrics.

  5. Algorithm for cellular reprogramming.

    Science.gov (United States)

    Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika

    2017-11-07

    The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.
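
    A toy sketch of the underlying control idea, under heavy simplification: linear TAD-level dynamics x(k+1) = A x(k) + B u(k), where each column of B is an assumed TF influence profile, and the ranking simply asks which single TF input best moves the state toward a target expression profile. None of the matrices below come from the paper's data.

      import numpy as np

      rng = np.random.default_rng(0)
      n_tads, n_tfs, horizon = 8, 4, 5

      A = 0.9 * np.eye(n_tads) + 0.02 * rng.standard_normal((n_tads, n_tads))  # assumed dynamics
      B = rng.standard_normal((n_tads, n_tfs))                                 # assumed TF effects
      x0 = rng.standard_normal(n_tads)           # current (fibroblast-like) TAD expression state
      x_target = rng.standard_normal(n_tads)     # desired (reprogrammed) TAD expression state

      def best_single_tf():
          scores = []
          for tf in range(n_tfs):
              x = x0.copy()
              for _ in range(horizon):
                  u = np.zeros(n_tfs)
                  u[tf] = 1.0                     # constantly activate one candidate TF
                  x = A @ x + B @ u
              scores.append(np.linalg.norm(x - x_target))
          return int(np.argmin(scores)), scores

      tf, scores = best_single_tf()
      print("best candidate TF index:", tf, " distances:", np.round(scores, 2))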

  6. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  7. Analysis of an Automated Vehicle Routing Problem in Logistics considering Path Interruption

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-01-01

    Full Text Available The application of automated vehicles in logistics can efficiently reduce the cost of logistics and reduce the potential risks in the last mile. Considering the path restriction in the initial stage of the application of automated vehicles in logistics, the conventional model for a vehicle routing problem (VRP is modified. Thus, the automated vehicle routing problem with time windows (AVRPTW model considering path interruption is established. Additionally, an improved particle swarm optimisation (PSO algorithm is designed to solve this problem. Finally, a case study is undertaken to test the validity of the model and the algorithm. Four automated vehicles are designated to execute all delivery tasks required by 25 stores. Capacities of all of the automated vehicles are almost fully utilised. It is of considerable significance for the promotion of automated vehicles in last-mile situations to develop such research into real problems arising in the initial period.
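
    For orientation, a minimal continuous PSO sketch; the paper's improved, routing-specific variant is not reproduced here, the objective below is a stand-in cost function, and the coefficients are common textbook defaults rather than the authors' settings.

      import random

      def cost(x):
          # Stand-in objective; a real AVRPTW cost would encode routes, capacities and time windows.
          return sum((xi - 3.0) ** 2 for xi in x)

      def pso(dim=5, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
          pos = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(n_particles)]
          vel = [[0.0] * dim for _ in range(n_particles)]
          pbest = [p[:] for p in pos]
          gbest = min(pbest, key=cost)[:]
          for _ in range(iters):
              for i in range(n_particles):
                  for d in range(dim):
                      vel[i][d] = (w * vel[i][d]
                                   + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                   + c2 * random.random() * (gbest[d] - pos[i][d]))
                      pos[i][d] += vel[i][d]
                  if cost(pos[i]) < cost(pbest[i]):
                      pbest[i] = pos[i][:]
                      if cost(pbest[i]) < cost(gbest):
                          gbest = pbest[i][:]
          return gbest, cost(gbest)

      print(pso())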

  8. Modeling and Analysis of Hybrid Cellular/WLAN Systems with Integrated Service-Based Vertical Handoff Schemes

    Science.gov (United States)

    Xia, Weiwei; Shen, Lianfeng

    We propose two vertical handoff schemes for cellular network and wireless local area network (WLAN) integration: integrated service-based handoff (ISH) and integrated service-based handoff with queue capabilities (ISHQ). Compared with existing handoff schemes in integrated cellular/WLAN networks, the proposed schemes consider a more comprehensive set of system characteristics such as different features of voice and data services, dynamic information about the admitted calls, user mobility and vertical handoffs in two directions. The code division multiple access (CDMA) cellular network and IEEE 802.11e WLAN are taken into account in the proposed schemes. We model the integrated networks by using multi-dimensional Markov chains and the major performance measures are derived for voice and data services. The important system parameters such as thresholds to prioritize handoff voice calls and queue sizes are optimized. Numerical results demonstrate that the proposed ISHQ scheme can maximize the utilization of overall bandwidth resources with the best quality of service (QoS) provisioning for voice and data services.

  9. Shot Automation for the National Ignition Facility

    International Nuclear Information System (INIS)

    Lagin, L J; Bettenhausen, R C; Beeler, R G; Bowers, G A; Carey, R.; Casavant, D.D.; Cline, B.D.; Demaret, R.D.; Domyancic, D.M.; Elko, S.D.; Fisher, J.M.; Hermann, M.R.; Krammen, J.E.; Kohut, T.R.; Marshall, C.D.; Mathisen, D.G.; Ludwigsen, A.P.; Patterson, Jr. R.W.; Sanchez, R.J.; Stout, E.A.; Van Arsdall, P.J.; Van Wonterghem, B.M.

    2005-01-01

    A shot automation framework has been developed and deployed during the past year to automate shots performed on the National Ignition Facility (NIF) using the Integrated Computer Control System. This framework automates a 4-8 hour shot sequence that includes inputting shot goals from a physics model, setup of the laser and diagnostics, automatic alignment of laser beams and verification of status. This sequence consists of a set of preparatory verification shots leading to amplified system shots using a 4-minute countdown, triggering during the last 2 seconds using a high-precision timing system, followed by post-shot analysis and archiving. The framework provides for flexible, model-based execution of scriptable automation units called macro steps. The framework is driven by high-level shot director software that provides a restricted set of shot life cycle state transitions to 25 collaboration supervisors that automate bundles of 8 laser beams and a common set of shared resources. Each collaboration supervisor commands approximately 10 subsystem shot supervisors that perform automated control and status verification. Collaboration supervisors translate shot life cycle state commands from the shot director into sequences of 'macro steps' to be distributed to each of their shot supervisors. Each shot supervisor maintains the order of macro steps for its subsystem and supports collaboration between macro steps. Shot supervisors also manage failures, restarts and rejoining into the shot cycle (if necessary), manage automatic/manual macro step execution, and coordinate with other collaboration supervisors. Shot supervisors execute the macro step shot functions commanded by collaboration supervisors. Each macro step has database-driven verification phases and a scripted perform phase. This provides a highly flexible methodology for performing a variety of NIF shot types. Database tables define the order of work and dependencies (workflow) of macro steps to be performed for a

  10. Genome-wide assessment of the carriers involved in the cellular uptake of drugs: a model system in yeast.

    Science.gov (United States)

    Lanthaler, Karin; Bilsland, Elizabeth; Dobson, Paul D; Moss, Harry J; Pir, Pınar; Kell, Douglas B; Oliver, Stephen G

    2011-10-24

    The uptake of drugs into cells has traditionally been considered to be predominantly via passive diffusion through the bilayer portion of the cell membrane. The recent recognition that drug uptake is mostly carrier-mediated raises the question of which drugs use which carriers. To answer this, we have constructed a chemical genomics platform built upon the yeast gene deletion collection, using competition experiments in batch fermenters and robotic automation of cytotoxicity screens, including protection by 'natural' substrates. Using these, we tested 26 different drugs and identified the carriers required for 18 of the drugs to gain entry into yeast cells. As well as providing a useful platform technology, these results further substantiate the notion that the cellular uptake of pharmaceutical drugs normally occurs via carrier-mediated transport and indicate that establishing the identity and tissue distribution of such carriers should be a major consideration in the design of safe and effective drugs.

  11. The similia principle: results obtained in a cellular model system.

    Science.gov (United States)

    Wiegant, Fred; Van Wijk, Roeland

    2010-01-01

    This paper describes the results of a research program focused on the beneficial effect of low dose stress conditions that were applied according to the similia principle to cells previously disturbed by more severe stress conditions. First, we discuss criteria for research on the similia principle at the cellular level. Then, the homologous ('isopathic') approach is reviewed, in which the initial (high dose) stress used to disturb cellular physiology and the subsequent (low dose) stress are identical. Beneficial effects of low dose stress are described in terms of increased cellular survival capacity and, at the molecular level, as an increase in the synthesis of heat shock proteins (hsps). Both phenomena reflect a stimulation of the endogenous cellular self-recovery capacity. Low dose stress conditions applied in a homologous approach stimulate the synthesis of hsps and enhance survival in comparison with stressed cells that were incubated in the absence of low dose stress conditions. Third, the specificity of the low dose stress condition is described for the heterologous or 'heteropathic' approach, in which the initial (high dose) stress is different in nature from the subsequently applied (low dose) stress. The results support the similia principle at the cellular level and add to our understanding of how low dose stress conditions influence the regulatory processes underlying self-recovery. In addition, the phenomenon of 'symptom aggravation', which is also observed at the cellular level, is discussed in the context of self-recovery. Finally, the difference in efficiency between the homologous and the heterologous approach is discussed; a perspective is indicated for further research; and the relationship between studies on the similia principle and the recently introduced concept of 'postconditioning hormesis' is emphasized. Copyright 2009 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  12. Automation Marketplace 2010: New Models, Core Systems

    Science.gov (United States)

    Breeding, Marshall

    2010-01-01

    In a year when a difficult economy presented fewer opportunities for immediate gains, the major industry players have defined their business strategies with fundamentally different concepts of library automation. This is no longer an industry where companies compete on the basis of the best or the most features in similar products but one where…

  13. Designing and implementing a regional urban modeling system using the SLEUTH cellular urban model

    Science.gov (United States)

    Jantz, Claire A.; Goetz, Scott J.; Donato, David I.; Claggett, Peter

    2010-01-01

    This paper presents a fine-scale (30 meter resolution) regional land cover modeling system, based on the SLEUTH cellular automata model, that was developed for a 257000 km2 area comprising the Chesapeake Bay drainage basin in the eastern United States. As part of this effort, we developed a new version of the SLEUTH model (SLEUTH-3r), which introduces new functionality and fit metrics that substantially increase the performance and applicability of the model. In addition, we developed methods that expand the capability of SLEUTH to incorporate economic, cultural and policy information, opening up new avenues for the integration of SLEUTH with other land-change models. SLEUTH-3r is also more computationally efficient (by a factor of 5) and uses less memory (reduced 65%) than the original software. With the new version of SLEUTH, we were able to achieve high accuracies at both the aggregate level of 15 sub-regional modeling units and at finer scales. We present forecasts to 2030 of urban development under a current trends scenario across the entire Chesapeake Bay drainage basin, and three alternative scenarios for a sub-region within the Chesapeake Bay watershed to illustrate the new ability of SLEUTH-3r to generate forecasts across a broad range of conditions.

  14. Robotic Automation of In Vivo Two-Photon Targeted Whole-Cell Patch-Clamp Electrophysiology.

    Science.gov (United States)

    Annecchino, Luca A; Morris, Alexander R; Copeland, Caroline S; Agabi, Oshiorenoya E; Chadderton, Paul; Schultz, Simon R

    2017-08-30

    Whole-cell patch-clamp electrophysiological recording is a powerful technique for studying cellular function. While in vivo patch-clamp recording has recently benefited from automation, it is normally performed "blind," meaning that throughput for sampling some genetically or morphologically defined cell types is unacceptably low. One solution to this problem is to use two-photon microscopy to target fluorescently labeled neurons. Combining this with robotic automation is difficult, however, as micropipette penetration induces tissue deformation, moving target cells from their initial location. Here we describe a platform for automated two-photon targeted patch-clamp recording, which solves this problem by making use of a closed loop visual servo algorithm. Our system keeps the target cell in focus while iteratively adjusting the pipette approach trajectory to compensate for tissue motion. We demonstrate platform validation with patch-clamp recordings from a variety of cells in the mouse neocortex and cerebellum. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers methods for building an automation plan; the design of automation facilities; automation of chip-making processes, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling, tapping, boring, milling and slide units; applications of oil pressure (hydraulics), including characteristics and basic oil pressure circuits; applications of pneumatics; and the kinds of automation and their application to processing, assembly, transportation, automatic machines and factory automation.

  16. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. Combining all relevant subprocesses (whether automated or manually performed, and independently of where and in which organizational unit they run) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, connecting the control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on the new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  17. Modeling the Land Use/Cover Change in an Arid Region Oasis City Constrained by Water Resource and Environmental Policy Change using Cellular Automata Model

    Science.gov (United States)

    Hu, X.; Li, X.; Lu, L.

    2017-12-01

    Land use/cover change (LUCC) is an important subject in research on global environmental change and sustainable development, while spatial simulation of land use/cover change is one of the key topics of LUCC research and is also difficult due to the complexity of the system. The cellular automata (CA) model plays an irreplaceable role in simulating land use/cover change processes because of its powerful spatial computing capability. However, the majority of current CA land use/cover models are binary-state models that cannot provide more general information about the overall spatial pattern of land use/cover change. Here, a multi-state logistic-regression-based Markov cellular automata (MLRMCA) model and a multi-state artificial-neural-network-based Markov cellular automata (MANNMCA) model were developed and used to simulate the complex land use/cover evolutionary process in an arid region oasis city constrained by water resources and environmental policy change, the city of Zhangye, during the period 1990-2010. The results indicated that the MANNMCA model was superior to the MLRMCA model in simulation accuracy, showing that combining an artificial neural network with CA can more effectively capture the complex relationships between land use/cover change and a set of spatial variables. Although the MLRMCA model also had some advantages, the MANNMCA model was more appropriate for simulating complex land use/cover dynamics. The two proposed models were effective and reliable, and could reflect the spatial evolution of regional land use/cover changes. They also have potential implications for the impact assessment of water resources, ecological restoration, and sustainable urban development in arid areas.
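
    The MLRMCA idea sketched above (fit per-cell transition probabilities from spatial drivers, then iterate a CA that also responds to the neighbourhood) can be illustrated on synthetic data. The drivers, labels, weights and thresholds below are assumptions for illustration only, not the Zhangye inputs.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)

      # Synthetic drivers and labels (not the Zhangye data), just to show the mechanics.
      H = W = 50
      dist_water = rng.uniform(0, 10, (H, W))    # hypothetical distance to water (km)
      slope = rng.uniform(0, 30, (H, W))         # hypothetical slope (degrees)
      X = np.column_stack([dist_water.ravel(), slope.ravel()])
      p_true = 1.0 / (1.0 + np.exp(0.6 * X[:, 0] + 0.1 * X[:, 1] - 3.0))
      y = rng.random(X.shape[0]) < p_true        # fake historical conversion labels

      # Step 1: per-cell conversion suitability from the spatial variables.
      clf = LogisticRegression().fit(X, y)
      suitability = clf.predict_proba(X)[:, 1].reshape(H, W)

      # Step 2: CA update combining suitability with the built-up neighbourhood fraction.
      def ca_step(state, suit, threshold=0.55, w_neigh=0.4):
          neigh = sum(np.roll(np.roll(state, di, 0), dj, 1)
                      for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)) / 8.0
          score = (1 - w_neigh) * suit + w_neigh * neigh
          return np.where(score > threshold, 1, state).astype(int)

      state = (rng.random((H, W)) < 0.05).astype(int)           # initial built-up cells
      for _ in range(20):
          state = ca_step(state, suitability)
      print("built-up fraction after 20 steps:", state.mean())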

  18. Surface Dynamic Process Simulation with the Use of Cellular Automata

    International Nuclear Information System (INIS)

    Adamska-Szatko, M.; Bala, J.

    2010-01-01

    Cellular automata are known for many applications, especially physical and biological simulations. Universal cellular automata can be used for modelling complex natural phenomena. The paper presents a simulation of a surface dynamic process using a 2-dimensional cellular automata algorithm. Modelling and visualisation were implemented in in-house software built on the standard OpenGL graphics library. (authors)
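
    The abstract does not give the local rule, so the sketch below assumes a simple height-relaxation rule on a 2D lattice purely to illustrate how a surface dynamic process can be advanced by a cellular automaton; visualisation (e.g. via OpenGL) is omitted.

      import numpy as np

      # Assumed relaxation rule (the paper's rule is not given in the abstract):
      # each cell passes a fraction of its height excess to each lower neighbour.
      def step(h, rate=0.1):
          new = h.copy()
          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):    # von Neumann neighbours
              neighbour = np.roll(h, (di, dj), axis=(0, 1))
              flow = rate * np.maximum(h - neighbour, 0.0)     # downhill transfer only
              new -= flow                                      # leaves the donor cell
              new += np.roll(flow, (-di, -dj), axis=(0, 1))    # arrives at the neighbour
          return new

      rng = np.random.default_rng(1)
      height = rng.random((100, 100))                          # initial rough surface
      for _ in range(200):
          height = step(height)
      print("surface roughness (std of heights):", height.std())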

  19. A Computational Model of Cellular Engraftment on Lung Scaffolds

    Directory of Open Access Journals (Sweden)

    Joshua J. Pothen

    2016-10-01

    Full Text Available The possibility that stem cells might be used to regenerate tissue is now being investigated for a variety of organs, but these investigations are still essentially exploratory and have few predictive tools available to guide experimentation. We propose, in this study, that the field of lung tissue regeneration might be better served by predictive tools that treat stem cells as agents that obey certain rules of behavior governed by both their phenotype and their environment. Sufficient knowledge of these rules of behavior would then, in principle, allow lung tissue development to be simulated computationally. Toward this end, we developed a simple agent-based computational model to simulate geographic patterns of cells seeded onto a lung scaffold. Comparison of the simulated patterns to those observed experimentally supports the hypothesis that mesenchymal stem cells proliferate preferentially toward the scaffold boundary, whereas alveolar epithelial cells do not. This demonstrates that a computational model of this type has the potential to assist in the discovery of rules of cellular behavior
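
    A toy agent-based sketch of the hypothesis described above: dividing cells place daughters nearby, optionally with an outward bias toward the scaffold boundary, and the radial distributions of the two variants are compared. Division probability, bias strength and geometry are assumed values, not the authors' calibrated rules.

      import numpy as np

      rng = np.random.default_rng(2)

      def grow(n_steps=200, boundary_bias=0.0, radius=50.0):
          """Toy agent-based model: cells occupy a disc-shaped scaffold; each step a
          cell may divide, placing the daughter nearby with an optional outward bias."""
          cells = rng.normal(0, 5, size=(20, 2))                # initial seeded cluster
          for _ in range(n_steps):
              dividing = cells[rng.random(len(cells)) < 0.02]   # assumed division prob.
              if len(dividing) == 0:
                  continue
              jitter = rng.normal(0, 1.5, size=dividing.shape)
              outward = dividing / (np.linalg.norm(dividing, axis=1, keepdims=True) + 1e-9)
              daughters = dividing + jitter + boundary_bias * outward
              r = np.linalg.norm(daughters, axis=1)
              cells = np.vstack([cells, daughters[r < radius]]) # stay on the scaffold
          return np.linalg.norm(cells, axis=1)

      r_unbiased = grow(boundary_bias=0.0)    # epithelial-like: no boundary preference
      r_biased = grow(boundary_bias=2.0)      # mesenchymal-like: boundary-seeking growth
      print("mean radial position, unbiased vs biased:",
            r_unbiased.mean(), r_biased.mean())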

  20. A coupled Cellular Automaton – Finite Element approach for the modelling of grain structure development in TIG welding

    Directory of Open Access Journals (Sweden)

    Chen Shijia

    2013-11-01

    Full Text Available In the welding area, the final properties of the weld bead are mainly determined by the grain structure developed during the melting and solidification steps. Mastering the properties of the final joint therefore requires better knowledge of this grain structure. To this end, a 3D coupled Cellular Automaton – Finite Element model is proposed to simulate the development of the grain structure in the TIG welding process. The model is applied to the welding of a Duplex 2202 stainless steel grade, and the evolution of the grain structure with the process parameters is discussed.

  1. Cellular automata model for human articular chondrocytes migration, proliferation and cell death: An in vitro validation.

    Science.gov (United States)

    Vaca-González, J J; Gutiérrez, M L; Guevara, J M; Garzón-Alvarado, D A

    2017-01-01

    Articular cartilage is characterized by low cell density of only one cell type, chondrocytes, and has limited self-healing properties. When articular cartilage is affected by traumatic injuries, a therapeutic strategy such as autologous chondrocyte implantation is usually proposed for its treatment. This approach requires in vitro chondrocyte expansion to yield high cell numbers for cell transplantation. To improve the efficiency of this procedure, it is necessary to assess cell dynamics such as migration, proliferation and cell death during culture. Computational models such as cellular automata can be used to simulate cell dynamics in order to enhance the result of cell culture procedures. This methodology has been implemented for several cell types; however, an experimental validation is required for each one. For this reason, in this research a cellular automata model, based on random-walk theory, was devised in order to predict articular chondrocyte behavior in monolayer culture during cell expansion. Results demonstrated that the cellular automata model corresponded to cell dynamics and computed accurate quantitative results. Moreover, it was possible to observe that cell dynamics depend on weighted probabilities derived from experimental data and that cell behavior varies according to the cell culture period. Thus, depending on whether cells had just been seeded or were proliferating exponentially, the culture-time probabilities used in the CA model differed. Furthermore, in the experimental assessment a decrease in chondrocyte proliferation was observed with increasing passage number. This approach is expected to have other uses, such as enhancing articular cartilage therapies based on tissue engineering and regenerative medicine.
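
    A minimal random-walk cellular automaton in the spirit described above, assuming illustrative per-step probabilities for migration, proliferation and death rather than the experimentally calibrated, culture-time-dependent values.

      import numpy as np

      rng = np.random.default_rng(3)

      # Assumed per-step probabilities (not the calibrated experimental values).
      P_MIGRATE, P_DIVIDE, P_DEATH = 0.5, 0.05, 0.01
      NEIGH = [(1, 0), (-1, 0), (0, 1), (0, -1)]

      def step(grid):
          occupied = list(zip(*np.nonzero(grid)))
          rng.shuffle(occupied)
          for i, j in occupied:
              if grid[i, j] == 0:                              # defensive: site vacated earlier
                  continue
              free = [(i + di, j + dj) for di, dj in NEIGH
                      if 0 <= i + di < grid.shape[0] and 0 <= j + dj < grid.shape[1]
                      and grid[i + di, j + dj] == 0]
              u = rng.random()
              if u < P_DEATH:
                  grid[i, j] = 0                               # cell death
              elif u < P_DEATH + P_DIVIDE and free:
                  grid[free[rng.integers(len(free))]] = 1      # daughter in a free site
              elif u < P_DEATH + P_DIVIDE + P_MIGRATE and free:
                  grid[i, j] = 0                               # random-walk migration
                  grid[free[rng.integers(len(free))]] = 1
          return grid

      grid = np.zeros((100, 100), dtype=int)
      grid[45:55, 45:55] = rng.random((10, 10)) < 0.3          # sparse initial seeding
      counts = [grid.sum()]
      for _ in range(100):
          counts.append(step(grid).sum())
      print("cell counts during simulated culture:", counts[::20])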

  2. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

    In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit from this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle

  3. Building mathematics cellular phone learning communities

    Directory of Open Access Journals (Sweden)

    Wajeeh M. Daher

    2011-04-01

    Full Text Available Researchers emphasize the importance of maintaining learning communities and environments. This article describes the building and nourishment of a learning community, one composed of middle school students who learned mathematics out of class using the cellular phone. The building of the learning community was led by three third-year pre-service teachers majoring in mathematics and computers. The pre-service teachers selected thirty 8th grade students to learn mathematics with the cellular phone and be part of a learning community experimenting with this learning. To analyze the building and development stages of the cellular phone learning community, two models of community building stages were used: first, the team development model developed by Tuckman (1965); second, the life cycle model of a virtual learning community developed by Garber (2004). The research findings indicate that a learning community which is centered on a new technology has five 'life' phases of development: pre-birth, birth, formation, performing, and maturity. Further, the research findings indicate that the norms encouraged by the pre-service teachers who initiated the cellular phone learning community resulted in a community which developed, was nourished and matured to be similar to a community of experienced applied mathematicians who use mathematical formulae to study everyday phenomena.

  4. A Modified Cellular Automaton Approach for Mixed Bicycle Traffic Flow Modeling

    Directory of Open Access Journals (Sweden)

    Xiaonian Shan

    2015-01-01

    Full Text Available Several previous studies have used the Cellular Automaton (CA) for the modeling of bicycle traffic flow. However, previous CA models have several limitations, resulting in differences between the simulated and the observed traffic flow features. The primary objective of this study is to propose a modified CA model for simulating the characteristics of mixed bicycle traffic flow. Field data were collected on a physically separated bicycle path in Shanghai, China, and were used to calibrate the CA model using a genetic algorithm. Traffic flow features from simulations of several CA models were compared with field observations. The results showed that our modified CA model produced more accurate simulations of the fundamental diagram and the passing events in mixed bicycle traffic flow. Based on our model, the bicycle traffic flow features, including the fundamental diagram, the number of passing events, and the number of lane changes, were analyzed. We also analyzed the traffic flow features for different traffic densities and traffic compositions on different travel lanes. Results of the study can provide important information for understanding and simulating the operations of mixed bicycle traffic flow.
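
    A minimal single-lane sketch in the Nagel-Schreckenberg spirit, with two bicycle classes of different maximum speeds standing in for mixed traffic. All parameters are assumed, and the paper's calibrated multi-lane rules (passing, lane changing) are not reproduced.

      import numpy as np

      rng = np.random.default_rng(4)

      # Assumed parameters (not the calibrated Shanghai values).
      L, density, p_slow = 200, 0.2, 0.2
      n = int(L * density)
      pos = np.sort(rng.choice(L, size=n, replace=False))      # riders on a ring of L cells
      vmax = rng.choice([2, 3], size=n)                        # slow and fast bicycle classes
      vel = np.zeros(n, dtype=int)

      def step(pos, vel):
          gap = (np.roll(pos, -1) - pos - 1) % L               # free cells ahead of each rider
          vel = np.minimum(vel + 1, vmax)                      # acceleration
          vel = np.minimum(vel, gap)                           # no overtaking / collisions
          dawdle = rng.random(n) < p_slow
          vel = np.where(dawdle, np.maximum(vel - 1, 0), vel)  # random slowdown
          return (pos + vel) % L, vel

      flows = []
      for t in range(500):
          pos, vel = step(pos, vel)
          flows.append(vel.mean() * density)                   # flow = density x mean speed
      print("mean flow after warm-up:", float(np.mean(flows[100:])))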

  5. Design, Development and Evaluation of a Pneumatic Seeder for Automatic Planting of Seeds in Cellular Trays

    Directory of Open Access Journals (Sweden)

    E Movahedi

    2014-04-01

    Full Text Available For planting fine seeds in cellular trays, an automatic pneumatic seeder was designed, constructed and evaluated. CATIA software was used to design and analyse the system parts of the seeder. Different parts of the seeder, including the vibrating seed hopper, vacuum boom, seed picking nozzles, seed tube, pneumatic system and electronic control unit for automation of the seeder, were designed and constructed. The area of the nozzle orifice was used to calculate the required pressure at the nozzle tip. The seeder was evaluated using two sizes of trays. Experiments were performed with five replications, and the seeding errors for the 105-cell and 390-cell trays were 1.9 and 0.46 percent, respectively. The planting time for the 105-cell and 390-cell trays was reduced from 20 min (for manual seeding) to 35 s and from 90 min to 160 s, respectively.

  6. Cellular Model of Atherogenesis Based on Pluripotent Vascular Wall Pericytes.

    Science.gov (United States)

    Ivanova, Ekaterina A; Orekhov, Alexander N

    2016-01-01

    Pericytes are pluripotent cells that can be found in the vascular wall of both microvessels and large arteries and veins. They have distinct morphology with long branching processes and form numerous contacts with each other and with endothelial cells, organizing the vascular wall cells into a three-dimensional network. Accumulating evidence demonstrates that pericytes may play a key role in the pathogenesis of vascular disorders, including atherosclerosis. Macrovascular pericytes are able to accumulate lipids and contribute to growth and vascularization of the atherosclerotic plaque. Moreover, they participate in the local inflammatory process and thrombosis, which can lead to fatal consequences. At the same time, pericytes can represent a useful model for studying the atherosclerotic process and for the development of novel therapeutic approaches. In particular, they are suitable for testing various substances' potential for decreasing lipid accumulation induced by the incubation of cells with atherogenic low-density lipoprotein. In this review we will discuss the application of cellular models for studying atherosclerosis and provide several examples of successful application of these models to drug research.

  7. Embryonic stem cells as an ectodermal cellular model of human p63-related dysplasia syndromes.

    NARCIS (Netherlands)

    Rostagno, P.; Wolchinsky, Z.; Vigano, A.M.; Shivtiel, S.; Zhou, Huiqing; Bokhoven, J.H.L.M. van; Ferone, G.; Missero, C.; Mantovani, R.; Aberdam, D.; Virolle, T.

    2010-01-01

    Heterozygous mutations in the TP63 transcription factor underlie the molecular basis of several similar autosomal dominant ectodermal dysplasia (ED) syndromes. Here we provide a novel cellular model derived from embryonic stem (ES) cells that recapitulates in vitro the main steps of embryonic skin

  8. 3D hierarchical computational model of wood as a cellular material with fibril reinforced, heterogeneous multiple layers

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2009-01-01

    A 3D hierarchical computational model of deformation and stiffness of wood, which takes into account the structures of wood at several scale levels (cellularity, multilayered nature of cell walls, composite-like structures of the wall layers), is developed. At the mesoscale, the softwood cell is presented as a 3D hexagon-shape tube with multilayered walls. The layers in the softwood cell are considered as composites reinforced by microfibrils (celluloses). The elastic properties of the layers are determined with Halpin–Tsai equations and introduced into the mesoscale finite element cellular model. With the use of the developed hierarchical model, the influence of the microstructure, including microfibril angles (MFAs, which characterize the orientation of the cellulose fibrils with respect to the cell axis), the thickness of the cell wall, and the shape of the cell cross...

  9. On Elementary and Algebraic Cellular Automata

    Science.gov (United States)

    Gulak, Yuriy

    In this paper we study elementary cellular automata from an algebraic viewpoint. The goal is to relate the emergent complex behavior observed in such systems with the properties of corresponding algebraic structures. We introduce algebraic cellular automata as a natural generalization of elementary ones and discuss their applications as generic models of complex systems.

  10. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large numbers of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  11. From equilibrium spin models to probabilistic cellular automata

    International Nuclear Information System (INIS)

    Georges, A.; Le Doussal, P.

    1989-01-01

    The general equivalence between D-dimensional probabilistic cellular automata (PCA) and (D + 1)-dimensional equilibrium spin models satisfying a disorder condition is first described in a pedagogical way and then used to analyze the phase diagrams, the critical behavior, and the universality classes of some automata. Diagrammatic representations of time-dependent correlation functions of PCA are introduced. Two important classes of PCA are singled out for which these correlation functions simplify: (1) quasi-Hamiltonian automata, which have a current-carrying steady state and for which some correlation functions are those of a D-dimensional static model; PCA satisfying the detailed balance condition appear as a particular case of these rules, for which the current vanishes. (2) Linear (and more generally affine) PCA, for which the diagrammatics reduces to a random walk problem closely related to (D + 1)-dimensional directed SAWs: both problems display critical behavior with mean-field exponents in any dimension. The correlation length and effective velocity of propagation of excitations can be calculated for affine PCA, as is shown for an explicit D = 1 example. The authors conclude with some remarks on nonlinear PCA, for which the diagrammatics is related to reaction-diffusion processes, and which belong in some cases to the universality class of Reggeon field theory
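
    A small numerical companion to the abstract, assuming an arbitrary D = 1 probabilistic cellular automaton rule: the chain is simulated on a ring, and equal-time and one-step correlation functions are estimated in the steady state. The rule probabilities are illustrative and unrelated to the specific classes analysed in the paper.

      import numpy as np

      rng = np.random.default_rng(5)

      # Assumed D = 1 PCA rule on a ring: a site becomes +1 with a probability
      # that depends only on its two neighbours at the previous time step.
      L, T = 256, 2000
      p_up = {(-1, -1): 0.1, (-1, 1): 0.5, (1, -1): 0.5, (1, 1): 0.9}

      s = rng.choice([-1, 1], size=L)
      history = np.empty((T, L), dtype=int)
      for t in range(T):
          history[t] = s
          left, right = np.roll(s, 1), np.roll(s, -1)
          prob = np.array([p_up[(int(l), int(r))] for l, r in zip(left, right)])
          s = np.where(rng.random(L) < prob, 1, -1)

      steady = history[500:]                                   # discard the transient
      m = steady.mean()
      c_space = (steady * np.roll(steady, 1, axis=1)).mean() - m**2
      c_time = (steady[:-1] * steady[1:]).mean() - m**2
      print(f"magnetisation {m:.3f}, spatial corr {c_space:.3f}, temporal corr {c_time:.3f}")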

  12. Automation and Job Satisfaction among Reference Librarians.

    Science.gov (United States)

    Whitlatch, Jo Bell

    1991-01-01

    Discussion of job satisfaction and the level of job performance focuses on the effect of automation on job satisfaction among reference librarians. The influence of stress is discussed, a job strain model is explained, and examples of how to design a job to reduce the stress caused by automation are given. (12 references) (LRW)

  13. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed consumers' mode of access to services. ICT-enabled services have further stimulated the perception of automated service quality with renewed dimensions and their subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study was conducted on the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  14. Automated Test Case Generation for an Autopilot Requirement Prototype

    Science.gov (United States)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and in the rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution that allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component in the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.

  15. A radiation measurement study on cellular phone

    International Nuclear Information System (INIS)

    Mohd Yusof Mohd Ali; Rozaimah Abd Rahim; Roha Tukimin; Khairol Nizam Mohamed; Mohd Amirul Nizam Mohamad Thari; Ahmad Fadzli Ahmad Sanusi

    2007-01-01

    This paper explains the radiation levels produced by selected cellular phones of various models and brands available on the market. The results obtained from this study also indicate whether a cellular phone is safe for public use or might have an effect on public health. Finally, a database of the radiation levels measured for the selected cellular phones is also developed and presented in this paper. (Author)

  16. Meta-domains for Automated System Identification

    National Research Council Canada - National Science Library

    Easley, Matthew; Bradley, Elizabeth

    2000-01-01

    .... In particular we introduce a new structure for automated model building known as a meta-domain which, when instantiated with domain-specific components, tailors the space of candidate models to the system at hand...

  17. Mining Repair Actions for Guiding Automated Program Fixing

    OpenAIRE

    Martinez , Matias; Monperrus , Martin

    2012-01-01

    Automated program fixing consists of generating source code in order to fix bugs in an automated manner. Our intuition is that automated program fixing can imitate human-based program fixing. Hence, we present a method to mine repair actions from software repositories. A repair action is a small semantic modification on code such as adding a method call. We then decorate repair actions with a probability distribution also learnt from software repositories. Our probabilistic repair models enab...

  18. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  19. Role of cellular adhesions in tissue dynamics spectroscopy

    Science.gov (United States)

    Merrill, Daniel A.; An, Ran; Turek, John; Nolte, David

    2014-02-01

    Cellular adhesions play a critical role in cell behavior, and modified expression of cellular adhesion compounds has been linked to various cancers. We tested the role of cellular adhesions in drug response by studying three cellular culture models: three-dimensional tumor spheroids with well-developed cellular adhesions and extracellular matrix (ECM), dense three-dimensional cell pellets with moderate numbers of adhesions, and dilute three-dimensional cell suspensions in agarose having few adhesions. Our technique for measuring the drug response for the spheroids and cell pellets was biodynamic imaging (BDI), and for the suspensions was quasi-elastic light scattering (QELS). We tested several cytoskeletal chemotherapeutic drugs (nocodazole, cytochalasin-D, paclitaxel, and colchicine) on three cancer cell lines chosen from human colorectal adenocarcinoma (HT-29), human pancreatic carcinoma (MIA PaCa-2), and rat osteosarcoma (UMR-106) to exhibit differences in adhesion strength. Comparing tumor spheroid behavior to that of cell suspensions showed shifts in the spectral motion of the cancer tissues that match predictions based on different degrees of cell-cell contacts. The HT-29 cell line, which has the strongest adhesions in the spheroid model, exhibits anomalous behavior in some cases. These results highlight the importance of using three-dimensional tissue models in drug screening with cellular adhesions being a contributory factor in phenotypic differences between the drug responses of tissue and cells.

  20. Cell motility dynamics: a novel segmentation algorithm to quantify multi-cellular bright field microscopy images.

    Directory of Open Access Journals (Sweden)

    Assaf Zaritsky

    Full Text Available Confocal microscopy analysis of fluorescence and morphology is becoming the standard tool in cell biology and molecular imaging. Accurate quantification algorithms are required to enhance the understanding of different biological phenomena. We present a novel approach based on image-segmentation of multi-cellular regions in bright field images demonstrating enhanced quantitative analyses and better understanding of cell motility. We present MultiCellSeg, a segmentation algorithm to separate between multi-cellular and background regions for bright field images, which is based on classification of local patches within an image: a cascade of Support Vector Machines (SVMs) is applied using basic image features. Post processing includes additional classification and graph-cut segmentation to reclassify erroneous regions and refine the segmentation. This approach leads to a parameter-free and robust algorithm. Comparison to an alternative algorithm on wound healing assay images demonstrates its superiority. The proposed approach was used to evaluate common cell migration models such as wound healing and scatter assay. It was applied to quantify the acceleration effect of Hepatocyte growth factor/scatter factor (HGF/SF) on healing rate in a time lapse confocal microscopy wound healing assay and demonstrated that the healing rate is linear in both treated and untreated cells, and that HGF/SF accelerates the healing rate by approximately two-fold. A novel fully automated, accurate, zero-parameters method to classify and score scatter-assay images was developed and demonstrated that multi-cellular texture is an excellent descriptor to measure HGF/SF-induced cell scattering. We show that exploitation of textural information from differential interference contrast (DIC) images on the multi-cellular level can prove beneficial for the analyses of wound healing and scatter assays. The proposed approach is generic and can be used alone or alongside traditional

  1. Cell motility dynamics: a novel segmentation algorithm to quantify multi-cellular bright field microscopy images.

    Science.gov (United States)

    Zaritsky, Assaf; Natan, Sari; Horev, Judith; Hecht, Inbal; Wolf, Lior; Ben-Jacob, Eshel; Tsarfaty, Ilan

    2011-01-01

    Confocal microscopy analysis of fluorescence and morphology is becoming the standard tool in cell biology and molecular imaging. Accurate quantification algorithms are required to enhance the understanding of different biological phenomena. We present a novel approach based on image-segmentation of multi-cellular regions in bright field images demonstrating enhanced quantitative analyses and better understanding of cell motility. We present MultiCellSeg, a segmentation algorithm to separate between multi-cellular and background regions for bright field images, which is based on classification of local patches within an image: a cascade of Support Vector Machines (SVMs) is applied using basic image features. Post processing includes additional classification and graph-cut segmentation to reclassify erroneous regions and refine the segmentation. This approach leads to a parameter-free and robust algorithm. Comparison to an alternative algorithm on wound healing assay images demonstrates its superiority. The proposed approach was used to evaluate common cell migration models such as wound healing and scatter assay. It was applied to quantify the acceleration effect of Hepatocyte growth factor/scatter factor (HGF/SF) on healing rate in a time lapse confocal microscopy wound healing assay and demonstrated that the healing rate is linear in both treated and untreated cells, and that HGF/SF accelerates the healing rate by approximately two-fold. A novel fully automated, accurate, zero-parameters method to classify and score scatter-assay images was developed and demonstrated that multi-cellular texture is an excellent descriptor to measure HGF/SF-induced cell scattering. We show that exploitation of textural information from differential interference contrast (DIC) images on the multi-cellular level can prove beneficial for the analyses of wound healing and scatter assays. The proposed approach is generic and can be used alone or alongside traditional fluorescence single

  2. Error performance analysis in downlink cellular networks with interference management

    KAUST Repository

    Afify, Laila H.; Elsawy, Hesham; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2015-01-01

    Modeling aggregate network interference in cellular networks has recently gained immense attention both in academia and industry. While stochastic geometry based models have succeeded in accounting for the cellular network geometry, they mostly

  3. Trust in automation: integrating empirical evidence on factors that influence trust.

    Science.gov (United States)

    Hoff, Kevin Anthony; Bashir, Masooda

    2015-05-01

    We systematically review recent empirical research on factors that influence trust in automation to present a three-layered trust model that synthesizes existing knowledge. Much of the existing research on factors that guide human-automation interaction is centered around trust, a variable that often determines the willingness of human operators to rely on automation. Studies have utilized a variety of different automated systems in diverse experimental paradigms to identify factors that impact operators' trust. We performed a systematic review of empirical research on trust in automation from January 2002 to June 2013. Papers were deemed eligible only if they reported the results of a human-subjects experiment in which humans interacted with an automated system in order to achieve a goal. Additionally, a relationship between trust (or a trust-related behavior) and another variable had to be measured. All together, 101 total papers, containing 127 eligible studies, were included in the review. Our analysis revealed three layers of variability in human-automation trust (dispositional trust, situational trust, and learned trust), which we organize into a model. We propose design recommendations for creating trustworthy automation and identify environmental conditions that can affect the strength of the relationship between trust and reliance. Future research directions are also discussed for each layer of trust. Our three-layered trust model provides a new lens for conceptualizing the variability of trust in automation. Its structure can be applied to help guide future research and develop training interventions and design procedures that encourage appropriate trust. © 2014, Human Factors and Ergonomics Society.

  4. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of distributing functions between the operator and technical means are outlined. The processes subject to automation are enumerated. Development of optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shutdown or emergency situations, is also becoming important [ru

  5. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  6. Cellular Automata Modelling of Photo-Induced Oxidation Processes in Molecularly Doped Polymers

    Directory of Open Access Journals (Sweden)

    David M. Goldie

    2016-11-01

    Full Text Available The possibility of employing cellular automata (CA) to model photo-induced oxidation processes in molecularly doped polymers is explored. It is demonstrated that the oxidation dynamics generated using CA models exhibit stretched-exponential behavior. This dynamical characteristic is in general agreement with an alternative analysis conducted using standard rate equations, provided the molecular doping levels are sufficiently low to prohibit the presence of safe-sites which are impenetrable to dissolved oxygen. The CA models therefore offer the advantage of exploring the effect of dopant agglomeration, which is difficult to assess from standard rate equation solutions. The influence of UV-induced bleaching or darkening upon the resulting oxidation dynamics may also be easily incorporated into the CA models, and these optical effects are investigated for various photo-oxidation product scenarios. Output from the CA models is evaluated for experimental photo-oxidation data obtained from a series of hydrazone-doped polymers.
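
    A toy cellular-automaton sketch of the qualitative behaviour described above, assuming a simple oxygen-ingress rule: sites oxidise with a probability that increases with the number of already-oxidised neighbours, and the surviving fraction is then fitted to a stretched exponential. The rules and rates are assumptions, not the paper's model.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(6)

      # Assumed rules: oxygen enters from one film surface and a site oxidises with a
      # probability that grows with its number of already-oxidised neighbours.
      N, T = 64, 80
      oxidised = np.zeros((N, N), dtype=bool)
      oxidised[:, 0] = True

      survival = []
      for t in range(T):
          neigh = sum(np.roll(oxidised, s, axis=a) for s in (1, -1) for a in (0, 1))
          p = (0.1 + 0.25 * neigh) * (neigh > 0)               # only reachable sites react
          oxidised |= rng.random((N, N)) < p
          survival.append(1.0 - oxidised.mean())               # un-oxidised fraction

      def stretched_exp(t, tau, beta):
          return np.exp(-(t / tau) ** beta)

      t_axis = np.arange(1, T + 1, dtype=float)
      (tau, beta), _ = curve_fit(stretched_exp, t_axis, survival, p0=(40.0, 0.8))
      print(f"fitted tau = {tau:.1f} steps, stretching exponent beta = {beta:.2f}")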

  7. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.

  8. Mobility-Aware Modeling and Analysis of Dense Cellular Networks With C-Plane/U-Plane Split Architecture

    KAUST Repository

    Ibrahim, Hazem

    2016-09-19

    The unrelenting increase in the population of mobile users and their traffic demands drives cellular network operators to densify their network infrastructure. Network densification shrinks the footprint of base stations (BSs) and reduces the number of users associated with each BS, leading to improved spatial frequency reuse and spectral efficiency, and thus higher network capacity. However, the densification gain comes at the expense of higher handover rates and network control overhead. Hence, user mobility can diminish or even nullify the foreseen densification gain. In this context, splitting the control plane (C-plane) and user plane (U-plane) is proposed as a potential solution to harvest the densification gain with reduced cost in terms of handover rate and network control overhead. In this paper, we use stochastic geometry to develop a tractable mobility-aware model for a two-tier downlink cellular network with ultra-dense small cells and a C-plane/U-plane split architecture. The developed model is then used to quantify the effect of mobility on the foreseen densification gain with and without the C-plane/U-plane split. To this end, we shed light on the handover problem in dense cellular environments, show scenarios where the network fails to support certain mobility profiles, and obtain network design insights.

  9. Applying genetic algorithms for calibrating a hexagonal cellular automata model for the simulation of debris flows characterised by strong inertial effects

    Science.gov (United States)

    Iovine, G.; D'Ambrosio, D.; Di Gregorio, S.

    2005-03-01

    In modelling complex a-centric phenomena which evolve through local interactions within a discrete time-space, cellular automata (CA) represent a valid alternative to standard solution methods based on differential equations. Flow-type phenomena (such as lava flows, pyroclastic flows, earth flows, and debris flows) can be viewed as a-centric dynamical systems, and they can therefore be properly investigated in CA terms. SCIDDICA S 4a is the last release of a two-dimensional hexagonal CA model for simulating debris flows characterised by strong inertial effects. S 4a has been obtained by progressively enriching an initial simplified model, originally derived for simulating very simple cases of slow-moving flow-type landslides. Using an empirical strategy, in S 4a, the inertial character of the flowing mass is translated into CA terms by means of local rules. In particular, in the transition function of the model, the distribution of landslide debris among the cells is obtained through a double cycle of computation. In the first phase, the inertial character of the landslide debris is taken into account by considering indicators of momentum. In the second phase, any remaining debris in the central cell is distributed among the adjacent cells, according to the principle of maximum possible equilibrium. The complexities of the model and of the phenomena to be simulated suggested the need for an automated technique of evaluation for the determination of the best set of global parameters. Accordingly, the model is calibrated using a genetic algorithm and by considering the May 1998 Curti-Sarno (Southern Italy) debris flow. The boundaries of the area affected by the debris flow are simulated well with the model. Errors computed by comparing the simulations with the mapped areal extent of the actual landslide are smaller than those previously obtained without genetic algorithms. As the experiments have been realised in a sequential computing environment, they could be
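
    The calibration loop itself can be sketched compactly: a genetic algorithm proposes global parameter pairs, runs the automaton, and scores each run by the areal overlap between the simulated and the mapped footprint. The toy two-parameter automaton and the fitness measure below are stand-ins; SCIDDICA's actual transition function is far richer.

      import numpy as np

      rng = np.random.default_rng(7)

      # Toy two-parameter automaton standing in for SCIDDICA: debris spreads from a
      # source cell, losing "friction" each hop and retaining a (1 - adhesion) share.
      def run_ca(friction, adhesion, steps=20, size=40):
          grid = np.zeros((size, size))
          grid[size // 2, size // 2] = 1.0                     # source cell
          for _ in range(steps):
              best_neigh = np.maximum.reduce([np.roll(grid, s, axis=a)
                                              for s in (1, -1) for a in (0, 1)])
              inflow = np.clip(best_neigh - friction, 0, None) * (1 - adhesion)
              grid = np.maximum(grid, inflow)
          return grid > 0.01                                   # affected-area footprint

      observed = run_ca(0.02, 0.30)          # stands in for the mapped landslide extent

      def fitness(params):
          sim = run_ca(*params)
          return (np.logical_and(sim, observed).sum()
                  / np.logical_or(sim, observed).sum())        # areal fit index in [0, 1]

      pop = rng.uniform([0.0, 0.0], [0.1, 0.9], size=(30, 2))  # initial genomes
      for _ in range(40):
          scores = np.array([fitness(p) for p in pop])
          parents = pop[np.argsort(scores)[-10:]]              # elitist selection
          children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.01, (30, 2))
          pop = np.clip(children, [0.0, 0.0], [0.1, 0.9])      # mutation within bounds
      best = pop[np.argmax([fitness(p) for p in pop])]
      print("recovered parameters:", best, "true values:", (0.02, 0.30))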

  10. Dynamic PET scanning and compartmental model analysis to determine cellular level radiotracer distribution in vivo

    International Nuclear Information System (INIS)

    Smith, G.T.; Hubner, K.F.; Goodman, M.M.; Stubbs, J.B.

    1992-01-01

    Positron emission tomography (PET) has been used to measure tissue radiotracer concentration in vivo. Radiochemical distribution can be determined with compartmental model analysis. A two-compartment model describes the kinetics of N-13 ammonia (13NH3) in the myocardium. The model consists of a vascular space, Q1, and a space for 13NH3 bound within the tissue, Q2. Differential equations for the model can be written: dX(t)/dt = AX(t) + BU(t), Y(t) = CX(t) + DU(t) (1), where X(t) is the column vector [Q1(t); Q2(t)], U(t) is the arterial input activity measured from the left ventricular blood pool, and Y(t) is the tissue activity measured using PET. The matrices A, B, C, and D depend on physiological parameters describing the kinetics of 13NH3 in the myocardium. Estimated parameter matrices in Equation 1 have been validated in dog experiments by measuring myocardial perfusion with dynamic PET scanning and intravenous injection of 13NH3. Tracer concentrations for each compartment can be calculated by direct integration of Equation 1. If the cellular-level distribution of each compartment is known, the concentration of tracer within the intracellular and extracellular space can be determined. Applications of this type of modeling include parameter estimation for measurement of physiological processes, organ-level dosimetry, and determination of cellular radiotracer distribution
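
    Equation (1) can be integrated directly once A, B, C, D and the arterial input U(t) are specified. The sketch below uses assumed rate constants and an assumed gamma-variate-like input purely to show the mechanics of computing the tissue curve Y(t); the values are illustrative, not fitted physiological parameters.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Assumed matrices and input for Equation (1): dX/dt = AX + BU, Y = CX + DU.
      A = np.array([[-0.8, 0.0],    # Q1: free tracer (assumed washout 0.6 + binding 0.2 /min)
                    [ 0.2, 0.0]])   # Q2: bound tracer, fed from Q1, trapped (no egress)
      B = np.array([1.0, 0.0])      # arterial input enters the vascular/free space only
      C = np.array([1.0, 1.0])      # the PET voxel measures total tissue activity Q1 + Q2
      D = 0.05                      # small direct blood-pool contribution to the signal

      def U(t):                     # assumed gamma-variate-like bolus input
          return 100.0 * t * np.exp(-t / 0.5)

      def rhs(t, x):
          return A @ x + B * U(t)

      sol = solve_ivp(rhs, (0.0, 10.0), y0=[0.0, 0.0], dense_output=True, max_step=0.05)
      t = np.linspace(0.0, 10.0, 101)
      X = sol.sol(t)                # compartment contents Q1(t), Q2(t)
      Y = C @ X + D * U(t)          # modelled PET tissue curve
      print("bound fraction at 10 min:", X[1, -1] / (X[0, -1] + X[1, -1]))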

  11. MATHEMATICAL MODEL OF AUTOMATED REHABILITATION SYSTEM WITH BIOLOGICAL FEEDBACK FOR REHABILITATION AND DEVELOPMENT OF MUSCULOSKELETAL SYSTEM

    Directory of Open Access Journals (Sweden)

    Kirill A. Kalyashin

    2013-01-01

    Full Text Available In order to increase the efficiency and safety of musculoskeletal rehabilitation, a model and an algorithm for patient interaction with an automated rehabilitation system with biological feedback were developed, based on registration and management of a second functional parameter, which prevents the risk of overexertion during intensive exercises.

  12. Automated parameter tuning applied to sea ice in a global climate model

    Science.gov (United States)

    Roach, Lettie A.; Tett, Simon F. B.; Mineter, Michael J.; Yamazaki, Kuniko; Rae, Cameron D.

    2018-01-01

    This study investigates the hypothesis that a significant portion of spread in climate model projections of sea ice is due to poorly-constrained model parameters. New automated methods for optimization are applied to historical sea ice in a global coupled climate model (HadCM3) in order to calculate the combination of parameters required to reduce the difference between simulation and observations to within the range of model noise. The optimized parameters result in a simulated sea-ice time series which is more consistent with Arctic observations throughout the satellite record (1980-present), particularly in the September minimum, than the standard configuration of HadCM3. Divergence from observed Antarctic trends and mean regional sea ice distribution reflects broader structural uncertainty in the climate model. We also find that the optimized parameters do not cause adverse effects on the model climatology. This simple approach provides evidence for the contribution of parameter uncertainty to spread in sea ice extent trends and could be customized to investigate uncertainties in other climate variables.
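
    The tuning idea, reduced to a toy: an optimiser adjusts a single parameter of a cheap stand-in model until its simulated September extent matches noisy pseudo-observations within the noise level. HadCM3 itself, the parameter name and the noise level are all replaced by assumptions here.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(8)

      # Toy stand-in for the climate model: one assumed parameter controls the rate of
      # September sea-ice decline; pseudo-observations are generated with known noise.
      years = np.arange(1980, 2015)

      def toy_model(albedo):
          return 8.0 - 0.06 * (years - 1980) * (1.2 - albedo)  # extent in 10^6 km^2

      obs_noise = 0.3
      observations = toy_model(0.65) + rng.normal(0, obs_noise, len(years))

      def cost(params):
          residual = toy_model(params[0]) - observations
          return float(np.mean((residual / obs_noise) ** 2))   # chi-square per year

      result = minimize(cost, x0=[0.9], bounds=[(0.3, 1.1)])
      print("tuned parameter:", result.x[0], "cost:", result.fun)  # cost near 1 = within noise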

  13. Evaluation of the Cytotoxicity and Genotoxicity of Flavonolignans in Different Cellular Models

    Directory of Open Access Journals (Sweden)

    Michal Bijak

    2017-12-01

    Full Text Available Flavonolignans are the main components of silymarin, which represents 1.5–3% of the dry fruit weight of milk thistle (Silybum marianum (L.) Gaertn.). In ancient Greece and Rome, physicians and herbalists used Silybum marianum to treat a range of liver diseases. Besides their hepatoprotective action, silymarin flavonolignans have many other beneficial properties, such as anti-platelet and anti-inflammatory actions. The aim of this study was to evaluate the toxic effect of flavonolignans on blood platelets, peripheral blood mononuclear cells (PBMCs) and the human lung cancer cell line A549, using different molecular techniques. We established that the three major flavonolignans, silybin, silychristin and silydianin, in concentrations of up to 100 µM, have neither a cytotoxic nor a genotoxic effect on blood platelets, PBMCs or A549 cells. We also found that silybin and silychristin have a protective effect on cellular mitochondria, observed as a reduction of spontaneous mitochondrial DNA (mtDNA) damage in A549 cells, measured as mtDNA copies and mtDNA lesions in the ND1 and ND5 genes. Additionally, we observed that flavonolignans increase the blood platelets' mitochondrial membrane potential and reduce the generation of reactive oxygen species in blood platelets. Our current findings show for the first time that the three major flavonolignans, silybin, silychristin and silydianin, show neither cytotoxicity nor genotoxicity in various cellular models, and that they actually protect cellular mitochondria. This supports the view that the antiplatelet and anti-inflammatory effects of these compounds are part of their molecular health mechanisms.

  14. Multiscale modeling of porous ceramics using movable cellular automaton method

    Science.gov (United States)

    Smolin, Alexey Yu.; Smolin, Igor Yu.; Smolina, Irina Yu.

    2017-10-01

    The paper presents a multiscale model for porous ceramics based on the movable cellular automaton method, a particle method in the novel computational mechanics of solids. The initial scale of the proposed approach corresponds to the characteristic size of the smallest pores in the ceramics. At this scale, we model uniaxial compression of several representative samples with an explicit account of pores of the same size but with unique positions in space. As a result, we obtain the average values of Young's modulus and strength, as well as the parameters of the Weibull distribution of these properties at the current scale level. These data allow us to describe the material behavior at the next scale level, where only the larger pores are considered explicitly, while the influence of the small pores is included via the effective properties determined earlier. If the pore size distribution function of the material has N maxima, we need to perform computations for N-1 levels in order to obtain the properties step by step from the lowest scale up to the macroscale. The proposed approach was applied to modeling zirconia ceramics with a bimodal pore size distribution. The obtained results show correct behavior of the model sample at the macroscale.

  15. Modelling and experimental study for automated congestion driving

    NARCIS (Netherlands)

    Urhahne, Joseph; Piastowski, P.; van der Voort, Mascha C.; Bebis, G; Boyle, R.; Parvin, B.; Koracin, D.; Pavlidis, I.; Feris, R.; McGraw, T.; Elendt, M.; Kopper, R.; Ragan, E.; Ye, Z.; Weber, G.

    2015-01-01

    Taking a collaborative approach in automated congestion driving with a Traffic Jam Assist system requires the driver to take over control in certain traffic situations. In order to warn the driver appropriately, warnings are issued (“pay attention” vs. “take action”) due to a control transition

  16. Glider-based computing in reaction-diffusion hexagonal cellular automata

    International Nuclear Information System (INIS)

    Adamatzky, Andrew; Wuensche, Andrew; De Lacy Costello, Benjamin

    2006-01-01

    A three-state hexagonal cellular automaton, discovered in [Wuensche A. Glider dynamics in 3-value hexagonal cellular automata: the beehive rule. Int J Unconvention Comput, in press], presents a conceptual discrete model of a reaction-diffusion system with inhibitor and activator reagents. The automaton model of reaction-diffusion exhibits mobile localized patterns (gliders) in its space-time dynamics. We show how to implement the basic computational operations with these mobile localizations, and thus demonstrate collision-based logical universality of the hexagonal reaction-diffusion cellular automaton

  17. Computational Modeling of Proteins based on Cellular Automata: A Method of HP Folding Approximation.

    Science.gov (United States)

    Madain, Alia; Abu Dalhoum, Abdel Latif; Sleit, Azzam

    2018-06-01

    The design of a protein folding approximation algorithm is not straightforward even when a simplified model is used. The folding problem is a combinatorial problem, where approximation and heuristic algorithms are usually used to find near-optimal folds of protein primary structures. Approximation algorithms provide guarantees on the distance to the optimal solution. The folding approximation approach proposed here depends on two-dimensional cellular automata to fold proteins represented in a well-studied simplified model called the hydrophobic-hydrophilic (HP) model. Cellular automata are discrete computational models that rely on local rules to produce some overall global behavior. One-third and one-fourth approximation algorithms choose a subset of the hydrophobic amino acids to form H-H contacts. Those algorithms start by finding a point at which to fold the protein sequence into two sides, where one side ignores H's at even positions and the other side ignores H's at odd positions. In addition, blocks or groups of amino acids fold in the same way according to a predefined normal form. We intend to improve approximation algorithms by considering all hydrophobic amino acids and folding based on the local neighborhood instead of using normal forms. The CA does not assume a fixed folding point. The proposed approach guarantees a one-half approximation minus the H-H endpoints. This guaranteed lower bound applies to short sequences only; it is proved by showing that, for all short sequences, the core and the folds of the protein have two identical sides.
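
    The parity-split idea behind the classical one-fourth-style approximations can be illustrated with a short Python sketch that simply counts hydrophobic residues at odd and even positions; the example sequence is hypothetical, and the cellular-automaton folding itself is not reproduced here.

      def parity_split(hp_sequence):
          """Count hydrophobic residues at odd and even positions of an HP string.

          The classical folding approximations pick the larger of the two parity
          classes to form H-H contacts along one side of a fold; the CA approach
          described in the abstract instead keeps all H's and folds by local rules.
          """
          odd_h = sum(1 for i, aa in enumerate(hp_sequence) if aa == 'H' and i % 2 == 1)
          even_h = sum(1 for i, aa in enumerate(hp_sequence) if aa == 'H' and i % 2 == 0)
          return odd_h, even_h

      seq = "HPHPPHHPHPPHPHHPPHPH"      # example sequence, hypothetical
      odd_h, even_h = parity_split(seq)
      print(f"H at odd positions: {odd_h}, H at even positions: {even_h}")
      print(f"parity class used by a one-fourth-style fold: {max(odd_h, even_h)}")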

  18. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time designed and implemented software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  19. A mathematical model in cellular manufacturing system considering subcontracting approach under constraints

    Directory of Open Access Journals (Sweden)

    Kamran Forghani

    2012-10-01

    Full Text Available In this paper, a new mathematical model for cellular manufacturing systems (CMSs) is presented. In order to increase the performance of the manufacturing system, the production quantity of each part is treated as a decision variable, i.e. each part can be produced in-house and outsourced simultaneously. This extension minimizes the unused capacity of the machines. The exceptional elements (EEs) are taken into account and are totally outsourced to an external supplier in order to remove intercellular material handling cost. The problem is formulated as a mixed-integer program that minimizes the sum of manufacturing variable costs under budget, machine capacity and demand constraints. To evaluate the advantages of the model, several illustrative numerical examples are provided to compare the performance of the proposed model with the available classical approaches in the literature.
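
    A minimal sketch of a make-or-subcontract decision under capacity, budget and demand constraints, written with the PuLP solver, is shown below. The data, variable names and constraint set are hypothetical simplifications and do not reproduce the exact mixed-integer formulation of the paper.

      import pulp

      # Hypothetical data: two parts, one machine cell.
      demand = {"P1": 100, "P2": 80}
      make_cost = {"P1": 2.0, "P2": 3.0}        # in-cell variable cost per unit
      buy_cost = {"P1": 3.5, "P2": 4.0}         # subcontracting cost per unit
      time_per_unit = {"P1": 0.5, "P2": 0.8}    # machine hours per unit
      capacity = 90.0                           # available machine hours
      budget = 500.0                            # in-cell manufacturing budget

      prob = pulp.LpProblem("cms_subcontracting", pulp.LpMinimize)
      make = {p: pulp.LpVariable(f"make_{p}", lowBound=0) for p in demand}
      buy = {p: pulp.LpVariable(f"buy_{p}", lowBound=0) for p in demand}

      # Objective: total variable cost of in-cell production plus outsourcing.
      prob += pulp.lpSum(make_cost[p] * make[p] + buy_cost[p] * buy[p] for p in demand)

      for p in demand:                                   # demand satisfaction
          prob += make[p] + buy[p] == demand[p]
      prob += pulp.lpSum(time_per_unit[p] * make[p] for p in demand) <= capacity
      prob += pulp.lpSum(make_cost[p] * make[p] for p in demand) <= budget

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      for p in demand:
          print(p, make[p].value(), buy[p].value())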

  20. Set of information technologies and their role in automation of agricultural production

    OpenAIRE

    V. V. Al’t

    2018-01-01

    Modern agro-industrial enterprises are characterized by a high level of automation of technological processes, with a technological development level corresponding to the fifth and sixth technological revolutions. Automatic and automated technologies in crop and livestock production use internet technologies, global positioning satellite surveys and observations, and automated operation of machine and tractor units. The model nucleus and a number of information models of a...

  1. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    Science.gov (United States)

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration and sensitivity and uncertainty analysis capabilities, through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., for inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration and sensitivity and uncertainty analysis.
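
    The published framework is R-based (FME), but the underlying pattern of wrapping an external model and calibrating it against observations can be sketched in Python with scipy; the toy recession model and synthetic observations below are assumptions for illustration only.

      import numpy as np
      from scipy.optimize import least_squares

      def run_model(params, t):
          """Stand-in for a call to an external hydrological model (e.g. via
          subprocess or a compiled library); here a toy recession curve."""
          q0, k = params
          return q0 * np.exp(-k * t)

      # Synthetic "observed" streamflow with noise (hypothetical data).
      t = np.linspace(0, 30, 31)
      rng = np.random.default_rng(0)
      observed = run_model([12.0, 0.15], t) + rng.normal(0, 0.3, t.size)

      def residuals(params):
          return run_model(params, t) - observed

      fit = least_squares(residuals, x0=[5.0, 0.05], bounds=([0, 0], [50, 1]))
      print("calibrated parameters:", fit.x)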

  2. Ciona intestinalis notochord as a new model to investigate the cellular and molecular mechanisms of tubulogenesis.

    Science.gov (United States)

    Denker, Elsa; Jiang, Di

    2012-05-01

    Biological tubes are a prevalent structural design across living organisms. They provide essential functions during the development and adult life of an organism. Increasing progress has been made recently in delineating the cellular and molecular mechanisms underlying tubulogenesis. This review aims to introduce ascidian notochord morphogenesis to the wider cell and developmental biology community as an interesting model system for studying the cell biology of tube formation. We present fundamental morphological and cellular events involved in notochord morphogenesis, compare and contrast them with other more established tubulogenesis model systems, and point out some unique features, including the bipolarity of the notochord cells and the use of cell shape changes and cell rearrangement to connect lumens. We highlight some initial findings on the molecular mechanisms of notochord morphogenesis. Based on these findings, we present intriguing problems and put forth hypotheses that can be addressed in future studies. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Automated computer-based CT stratification as a predictor of outcome in hypersensitivity pneumonitis

    International Nuclear Information System (INIS)

    Jacob, Joseph; Mak, S.M.; Mok, W.; Hansell, D.M.; Bartholmai, B.J.; Rajagopalan, S.; Karwoski, R.; Della Casa, G.; Sugino, K.; Walsh, S.L.F.; Wells, A.U.

    2017-01-01

    Hypersensitivity pneumonitis (HP) has a variable clinical course. Modelling of quantitative CALIPER-derived CT data can identify distinct disease phenotypes. Mortality prediction using CALIPER analysis was compared to the interstitial lung disease gender, age, physiology (ILD-GAP) outcome model. CALIPER CT analysis of parenchymal patterns in 98 consecutive HP patients was compared to visual CT scoring by two radiologists. Functional indices, including forced vital capacity (FVC) and diffusion capacity for carbon monoxide (DLco), were examined in univariate and multivariate Cox mortality models. Automated stratification of CALIPER scores was evaluated against outcome models. Univariate predictors of mortality included visual and CALIPER CT fibrotic patterns, and all functional indices. Multivariate analyses identified only two independent predictors of mortality: CALIPER reticular pattern (p = 0.001) and DLco (p < 0.0001). Automated stratification distinguished three distinct HP groups (log-rank test p < 0.0001). Substitution of automated stratified groups for FVC and DLco in the ILD-GAP model demonstrated no loss of model strength (C-Index = 0.73 for both models). Model strength improved when automated stratified groups were combined with the ILD-GAP model (C-Index = 0.77). CALIPER-derived variables are the strongest CT predictors of mortality in HP. Automated CT stratification is equivalent to functional indices in the ILD-GAP model for predicting outcome in HP. (orig.)

  4. Automated computer-based CT stratification as a predictor of outcome in hypersensitivity pneumonitis

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Joseph; Mak, S.M.; Mok, W.; Hansell, D.M. [Royal Brompton and Harefield NHS Foundation Trust, Department of Radiology, Royal Brompton Hospital, London (United Kingdom); Bartholmai, B.J. [Mayo Clinic Rochester, Division of Radiology, Rochester, MN (United States); Rajagopalan, S.; Karwoski, R. [Mayo Clinic Rochester, Biomedical Imaging Resource, Rochester, MN (United States); Della Casa, G. [Universita degli Studi di Modena e Reggio Emilia, Modena, Emilia-Romagna (Italy); Sugino, K. [Toho University Omori Medical Centre, Tokyo (Japan); Walsh, S.L.F. [Kings College Hospital, London (United Kingdom); Wells, A.U. [Royal Brompton and Harefield NHS Foundation Trust, Interstitial Lung Disease Unit, Royal Brompton Hospital, London (United Kingdom)

    2017-09-15

    Hypersensitivity pneumonitis (HP) has a variable clinical course. Modelling of quantitative CALIPER-derived CT data can identify distinct disease phenotypes. Mortality prediction using CALIPER analysis was compared to the interstitial lung disease gender, age, physiology (ILD-GAP) outcome model. CALIPER CT analysis of parenchymal patterns in 98 consecutive HP patients was compared to visual CT scoring by two radiologists. Functional indices, including forced vital capacity (FVC) and diffusion capacity for carbon monoxide (DLco), were examined in univariate and multivariate Cox mortality models. Automated stratification of CALIPER scores was evaluated against outcome models. Univariate predictors of mortality included visual and CALIPER CT fibrotic patterns, and all functional indices. Multivariate analyses identified only two independent predictors of mortality: CALIPER reticular pattern (p = 0.001) and DLco (p < 0.0001). Automated stratification distinguished three distinct HP groups (log-rank test p < 0.0001). Substitution of automated stratified groups for FVC and DLco in the ILD-GAP model demonstrated no loss of model strength (C-Index = 0.73 for both models). Model strength improved when automated stratified groups were combined with the ILD-GAP model (C-Index = 0.77). CALIPER-derived variables are the strongest CT predictors of mortality in HP. Automated CT stratification is equivalent to functional indices in the ILD-GAP model for predicting outcome in HP. (orig.)

  5. Automated Synthesis of 18F-Fluoropropoxytryptophan for Amino Acid Transporter System Imaging

    Directory of Open Access Journals (Sweden)

    I-Hong Shih

    2014-01-01

    Full Text Available Objective. This study was to develop a cGMP grade of [18F]fluoropropoxytryptophan (18F-FTP) to assess tryptophan transporters using an automated synthesizer. Methods. Tosylpropoxytryptophan (Ts-TP) was reacted with K18F/kryptofix complex. After column purification, solvent evaporation, and hydrolysis, the identity and purity of the product were validated by radio-TLC (1 M ammonium acetate : methanol = 4 : 1) and HPLC (C-18 column, methanol : water = 7 : 3) analyses. In vitro cellular uptake of 18F-FTP and 18F-FDG was performed in human prostate cancer cells. PET imaging studies were performed with 18F-FTP and 18F-FDG in prostate and small cell lung tumor-bearing mice (3.7 MBq/mouse, i.v.). Results. Radio-TLC and HPLC analyses of 18F-FTP showed that the Rf and Rt values were 0.9 and 9 min, respectively. Radiochemical purity was >99%. The radiochemical yield was 37.7% (EOS 90 min, decay corrected). Cellular uptake of 18F-FTP and 18F-FDG showed enhanced uptake as a function of incubation time. PET imaging studies showed that 18F-FTP had less tumor uptake than 18F-FDG in the prostate cancer model. However, 18F-FTP had more uptake than 18F-FDG in the small cell lung cancer model. Conclusion. 18F-FTP could be synthesized with high radiochemical yield. Assessment of upregulated transporter activity by 18F-FTP may provide potential applications in differential diagnosis and prediction of early treatment response.

  6. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    Science.gov (United States)

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles, thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standard regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman imaging processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate nearly in real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures whereas SVD, in general, requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without
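
    One plausible minimal reading of combining z-score normalization with a least-squares regression for spectral contrast is sketched below in numpy; the toy hyperspectral array and the choice of the mean spectrum as the regression reference are assumptions, not the published Z-LSR implementation.

      import numpy as np

      # Hypothetical hyperspectral Raman image: 64 x 64 pixels, 300 spectral channels.
      rng = np.random.default_rng(1)
      img = rng.normal(size=(64, 64, 300)) + np.linspace(0, 1, 300)   # toy data

      spectra = img.reshape(-1, img.shape[-1])

      # Z-score normalization of each pixel spectrum (zero mean, unit variance).
      z = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

      # Ordinary least-squares fit of every normalized spectrum against the mean
      # spectrum; the fitted slope and the residual give label-free contrast that
      # depends on spectral content rather than raw intensity.
      ref = z.mean(axis=0)
      slope = (z @ ref) / (ref @ ref)
      residual = np.linalg.norm(z - np.outer(slope, ref), axis=1)

      contrast = residual.reshape(img.shape[:2])
      print(contrast.shape)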

  7. On the engineering design for systematic integration of agent-orientation in industrial automation.

    Science.gov (United States)

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematical integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  8. miR-103 Promotes Neurite Outgrowth and Suppresses Cells Apoptosis by Targeting Prostaglandin-Endoperoxide Synthase 2 in Cellular Models of Alzheimer's Disease.

    Science.gov (United States)

    Yang, Hui; Wang, Hongcai; Shu, Yongwei; Li, Xuling

    2018-01-01

    miR-103 has been reported to be decreased in the brain of a transgenic mouse model of Alzheimer's disease (AD) and in the cerebrospinal fluid (CSF) of AD patients, but the detailed mechanism of its effect on AD is obscure; this study therefore aimed to investigate the effect of miR-103 expression on neurite outgrowth and cell apoptosis, as well as its targets, in cellular models of AD. Blank mimic (NC1-mimic), miR-103 mimic, blank inhibitor (NC2-inhibitor) and miR-103 inhibitor plasmids were transfected into a PC12 cellular AD model and a cellular AD model of cerebral cortex neurons, both established by Aβ1-42 insult. A rescue experiment was subsequently performed by transfecting PTGS2 (prostaglandin-endoperoxide synthase 2) and miR-103 mimic plasmids. mRNA and protein expression were detected by qPCR and Western blot assays. Total neurite outgrowth was assessed by microscopy, cell apoptosis was determined by Hoechst/PI assay, and expression of the apoptotic markers Caspase 3 and p38 was determined by Western blot assay. In both the PC12 and cerebral cortex neuron cellular AD models, miR-103 mimic increased total neurite outgrowth compared with NC1-mimic, while miR-103 inhibitor decreased total neurite outgrowth compared with NC2-inhibitor. The apoptosis rate was lower in the miR-103 mimic group than in the NC1-mimic group and higher in the miR-103 inhibitor group than in the NC2-inhibitor group. PTGS2, a disintegrin and metalloproteinase 10 (ADAM10) and neprilysin (NEP) were selected as target genes of miR-103 by bioinformatics analysis. PTGS2 was found to be conversely regulated by miR-103 expression, while ADAM10 and NEP were not affected. After transfection with PTGS2 and miR-103 mimic plasmids in the PC12 cellular AD model, total neurite outgrowth was shortened and cell apoptosis was enhanced compared with the miR-103 mimic group, indicating that PTGS2 mimic attenuated the influence of miR-103 mimic on the progression of AD. In conclusion, miR-103 promotes total neurite outgrowth and inhibits cell apoptosis

  9. STATISTIC MODEL OF DYNAMIC DELAY AND DROPOUT ON CELLULAR DATA NETWORKED CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    MUHAMMAD A. MURTI

    2017-07-01

    Full Text Available Delay and dropout are important parameters that influence overall control performance in a Networked Control System (NCS). The goal of this research is to find a model of the delay and dropout of the data communication link in the NCS. Experiments were performed on a water level control of a boiler tank, as part of an NCS based on an internet communication network using High Speed Packet Access (HSPA) cellular technology. From these experiments, the closed-loop system response as well as the delay and dropout of data packets were obtained. This research contributes to the modeling of the NCS as a combination of the controlled plant and the data communication link. Another contribution is a statistical model of delay and dropout on the NCS.

  10. Algorithmic crystal chemistry: A cellular automata approach

    International Nuclear Information System (INIS)

    Krivovichev, S. V.

    2012-01-01

    Atomic-molecular mechanisms of crystal growth can be modeled based on crystallochemical information using cellular automata (a particular case of finite deterministic automata). In particular, the formation of heteropolyhedral layered complexes in uranyl selenates can be modeled by applying a one-dimensional three-colored cellular automaton. The use of the theory of computation (in particular, the theory of automata) in crystallography allows one to interpret crystal growth as a computational process (the realization of an algorithm or program with a finite number of steps).
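
    A one-dimensional three-state ("three-coloured") cellular automaton of the kind mentioned above can be sketched in a few lines of Python; the random local rule used here is a placeholder, whereas the crystal-growth rule in the paper is derived from crystallochemical data.

      import numpy as np

      def step(row, rule):
          """One synchronous update of a 1-D three-state cellular automaton with
          nearest-neighbour interaction and periodic boundaries. `rule` maps each
          (left, centre, right) triple to a new state."""
          left = np.roll(row, 1)
          right = np.roll(row, -1)
          return np.array([rule[(l, c, r)] for l, c, r in zip(left, row, right)])

      rng = np.random.default_rng(7)
      # A randomly chosen local rule over states {0, 1, 2}; a crystal-growth rule
      # would be constructed from crystal-chemical data instead.
      states = (0, 1, 2)
      rule = {(l, c, r): rng.integers(0, 3) for l in states for c in states for r in states}

      row = rng.integers(0, 3, size=40)
      history = [row]
      for _ in range(20):
          row = step(row, rule)
          history.append(row)
      print(np.array(history))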

  11. Modeling and Experimental Study of Soft Error Propagation Based on Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Wei He

    2016-01-01

    Full Text Available Aiming to estimate the SEE soft error performance of complex electronic systems, a soft error propagation model based on a cellular automaton is proposed and an estimation methodology based on circuit partitioning and error propagation is presented. Simulations indicate that different fault grade jamming and different coupling factors between cells are the main parameters influencing the vulnerability of the system. Accelerated radiation experiments were carried out to determine the main parameters for the raw soft error vulnerability of the module and the coupling factors. Results indicate that the proposed method is feasible.

  12. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA) through querying a representation of the system in a SysML model. This work builds off the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior, and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.

  13. Cellular blebs: pressure-driven, axisymmetric, membrane protrusions

    KAUST Repository

    Woolley, Thomas E.

    2013-07-16

    Blebs are cellular protrusions that are used by cells for multiple purposes including locomotion. A mechanical model for the problem of pressure-driven blebs based on force and moment balances of an axisymmetric shell model is proposed. The formation of a bleb is initiated by weakening the shell over a small region, and the deformation of the cellular membrane from the cortex is obtained during inflation. However, simply weakening the shell leads to an area increase of more than 4 %, which is physically unrealistic. Thus, the model is extended to include a reconfiguration process that allows large blebs to form with small increases in area. It is observed that both geometric and biomechanical constraints are important in this process. In particular, it is shown that although blebs are driven by a pressure difference across the cellular membrane, it is not the limiting factor in determining bleb size. © 2013 Springer-Verlag Berlin Heidelberg.

  14. Method for automated building of spindle thermal model with use of CAE system

    Science.gov (United States)

    Kamenev, S. V.

    2018-03-01

    The spindle is one of the most important units of a metal-cutting machine tool. Its performance is critical to minimizing the machining error, especially the thermal error. Various methods are applied to improve the thermal behaviour of spindle units. One of the most important is mathematical modelling based on finite element analysis. The most common approach to its realization is the use of CAE systems. This approach, however, is not capable of addressing a number of important effects that need to be taken into consideration for proper simulation. In the present article, the authors propose a solution that overcomes these disadvantages by automating the building of the thermal model of the spindle unit in the CAE system ANSYS.

  15. Framework for Human-Automation Collaboration: Conclusions from Four Studies

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); O' Hara, John [Brookhaven National Lab. (BNL), Upton, NY (United States); Joe, Jeffrey C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Whaley, April M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Medema, Heather [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    The Human Automation Collaboration (HAC) research project is investigating how advanced technologies that are planned for Advanced Small Modular Reactors (AdvSMR) will affect the performance and the reliability of the plant from a human factors and human performance perspective. The HAC research effort investigates the consequences of allocating functions between the operators and automated systems. More specifically, the research team is addressing how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. Oxstrand et al. (2013 - March) describes the efforts conducted by the researchers to identify the research needs for HAC. The research team reviewed the literature on HAC, developed a model of HAC, and identified gaps in the existing knowledge of human-automation collaboration. As described in Oxstrand et al. (2013 - June), the team then prioritized the research topics identified based on the specific needs in the context of AdvSMR. The prioritization was based on two sources of input: 1) the preliminary functions and tasks, and 2) the model of HAC. As a result, three analytical studies were planned and conducted: 1) Models of Teamwork, 2) Standardized HAC Performance Measurement Battery, and 3) Initiators and Triggering Conditions for Adaptive Automation. Additionally, one field study was also conducted at Idaho Falls Power.

  16. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others]

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  17. Fuzzy Case-Based Reasoning in Product Style Acquisition Incorporating Valence-Arousal-Based Emotional Cellular Model

    Directory of Open Access Journals (Sweden)

    Fuqian Shi

    2012-01-01

    Full Text Available Emotional cellular (EC), proposed in our previous works, is a kind of semantic cell that contains a kernel and a shell; the kernel is formalized by a triple L = (P, d, δ), where P denotes a typical set of positive examples relative to the word L, d is a pseudo-distance measure on the emotional two-dimensional valence-arousal space, and δ is a probability density function on the positive real number field. The basic idea of the EC model is to assume that the neighborhood radius of each semantic concept is uncertain, and this uncertainty is measured by the one-dimensional density function δ. In this paper, product form features were evaluated using ECs to establish a product style database, and a fuzzy case-based reasoning (FCBR) model, under a defined similarity measurement based on fuzzy nearest neighbors (FNN) incorporating EC, was applied to extract product styles. A mathematically formalized inference system for product style was also proposed, which includes the emotional cellular as an uncertainty measurement tool. A case study of style acquisition of mobile phones illustrated the effectiveness of the proposed methodology.

  18. Automated robust generation of compact 3D statistical shape models

    Science.gov (United States)

    Vrtovec, Tomaz; Likar, Bostjan; Tomazevic, Dejan; Pernus, Franjo

    2004-05-01

    Ascertaining the detailed shape and spatial arrangement of anatomical structures is important not only within diagnostic settings but also in the areas of planning, simulation, intraoperative navigation, and tracking of pathology. Robust, accurate and efficient automated segmentation of anatomical structures is difficult because of their complexity and inter-patient variability. Furthermore, the position of the patient during image acquisition, the imaging device and protocol, image resolution, and other factors induce additional variations in shape and appearance. Statistical shape models (SSMs) have proven quite successful in capturing structural variability. A possible approach to obtain a 3D SSM is to extract reference voxels by precisely segmenting the structure in one, reference image. The corresponding voxels in other images are determined by registering the reference image to each other image. The SSM obtained in this way describes statistically plausible shape variations over the given population as well as variations due to imperfect registration. In this paper, we present a completely automated method that significantly reduces shape variations induced by imperfect registration, thus allowing a more accurate description of variations. At each iteration, the derived SSM is used for coarse registration, which is further improved by describing finer variations of the structure. The method was tested on 64 lumbar spinal column CT scans, from which 23, 38, 45, 46 and 42 volumes of interest containing vertebra L1, L2, L3, L4 and L5, respectively, were extracted. Separate SSMs were generated for each vertebra. The results show that the method is capable of reducing the variations induced by registration errors.

  19. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

    The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to the module-hierarchical principle. The main operating conditions of the system are as follows: preprocessing, installation analysis, interpretation, accuracy analysis and controlling parameters. A definition of the quasi-real experiment, which permits planning of the real experiment, is given. It is pointed out that realization of the quasi-real experiment by means of the computerized installation model, with subsequent automated processing, makes it possible to examine the quantitative behaviour of the system as a whole and provides optimal design of installation parameters for obtaining maximum resolution [ru]

  20. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  1. Dockomatic - automated ligand creation and docking.

    Science.gov (United States)

    Bullock, Casey W; Jacob, Reed B; McDougal, Owen M; Hampikian, Greg; Andersen, Tim

    2010-11-08

    The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to setup and run jobs, and collect results. This paper presents DockoMatic, a user friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high throughput screening of ligand to receptor interactions. DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  2. Dockomatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Full Text Available Abstract. Background: The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to setup and run jobs, and collect results. This paper presents DockoMatic, a user friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high throughput screening of ligand to receptor interactions. Results: DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions: DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  3. Predictability in cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one needs also a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists. It is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighborhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
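
    The empirical check described above can be mimicked with a small Python sketch: simulate a synchronous probabilistic automaton on a short ring, tally visited configurations, and inspect how visit frequency falls with the number of zero-one borders. The local rule and parameters below are hypothetical and are not the rules studied in the paper.

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(3)
      N, STEPS, P = 10, 100_000, 0.8        # small ring so states can be tallied

      def borders(config):
          """Number of zero-one borders on the periodic lattice."""
          return int(np.sum(config != np.roll(config, 1)))

      def sync_step(config):
          """Synchronous probabilistic update (hypothetical rule): each cell adopts
          the majority of its three-cell neighbourhood with probability P,
          otherwise the minority."""
          maj = ((np.roll(config, 1) + config + np.roll(config, -1)) >= 2).astype(int)
          keep = rng.random(config.size) < P
          return np.where(keep, maj, 1 - maj)

      config = rng.integers(0, 2, N)
      counts = Counter()
      for _ in range(STEPS):
          config = sync_step(config)
          counts[tuple(config)] += 1

      # Empirical check: does log-frequency decrease roughly linearly with the
      # number of zero-one borders, as the exponential formula predicts?
      for b in range(0, N + 1, 2):
          freqs = [c for cfg, c in counts.items() if borders(np.array(cfg)) == b]
          if freqs:
              print(f"{b:2d} borders: mean visit count {np.mean(freqs):10.1f}")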

  4. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  5. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States); Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States); Collis, Jon [Colorado School of Mines, Golden, CO (United States)

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  6. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
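
    One common reading of the simple output-ratio approach, scaling simulated output so that annual totals match the utility bills, can be sketched as follows; the monthly values and the retrofit factor are hypothetical, and the sketch does not reproduce the BEopt/DOE-2.2 workflow evaluated in the study.

      import numpy as np

      # Hypothetical monthly electricity use (kWh): synthetic "utility bills" and an
      # uncalibrated simulation of the same home.
      observed  = np.array([980, 900, 820, 700, 650, 880, 1150, 1200, 990, 760, 800, 950])
      simulated = np.array([850, 800, 760, 640, 600, 830, 1010, 1080, 900, 700, 740, 860])

      # Simple output-ratio calibration: scale simulated output so that the annual
      # totals match, then apply the same ratio to retrofit-scenario predictions.
      ratio = observed.sum() / simulated.sum()
      calibrated = simulated * ratio

      retrofit_simulated = simulated * 0.85          # hypothetical retrofit run
      predicted_savings = (calibrated - retrofit_simulated * ratio).sum()
      print(f"scale factor {ratio:.3f}, predicted annual savings {predicted_savings:.0f} kWh")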

  7. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  8. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    The velocity effect and a critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown is explained. Various congested patterns induced by an on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► The velocity effect is added into the average space gap cellular automaton model. ► The physics of traffic breakdown is explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
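
    The paper's average space gap model is not reproduced here, but the general flavour of a stochastic traffic cellular automaton (acceleration, gap-limited braking, random slowdown as the driver's delay, synchronous motion) can be sketched with a standard Nagel-Schreckenberg-style update; all parameters below are illustrative.

      import numpy as np

      rng = np.random.default_rng(11)
      L, N_CARS, V_MAX, P_SLOW, STEPS = 200, 40, 5, 0.3, 100

      pos = np.sort(rng.choice(L, N_CARS, replace=False))
      vel = np.zeros(N_CARS, dtype=int)

      for _ in range(STEPS):
          gaps = (np.roll(pos, -1) - pos - 1) % L          # space gap to the leader
          vel = np.minimum(vel + 1, V_MAX)                 # acceleration
          vel = np.minimum(vel, gaps)                      # braking to avoid collision
          slow = rng.random(N_CARS) < P_SLOW               # random slowdown (driver delay)
          vel = np.maximum(vel - slow.astype(int), 0)
          pos = (pos + vel) % L                            # synchronous move

      mean_speed = vel.mean()
      print(f"density {N_CARS / L:.2f}, mean speed {mean_speed:.2f}")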

  9. Automation, consolidation, and integration in autoimmune diagnostics.

    Science.gov (United States)

    Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola

    2015-08-01

    Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.

  10. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.

  11. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there is probably steric and cooperative influence on binding. An alternative, more flexible mathematical model based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity is stressed. Published methods for automated data reduction of Scatchard plots
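
    The curve-fitting step suggested above, a third-order polynomial in the square root of concentration, can be sketched in numpy as follows; the standards and responses are hypothetical, and a production assay would add the linear point connection near zero dose and saturation described in the abstract.

      import numpy as np

      # Hypothetical radioimmunoassay standards: concentration vs. fraction bound.
      conc = np.array([0.0, 5, 10, 25, 50, 100, 250, 500])                 # e.g. ng/mL
      bound = np.array([0.95, 0.90, 0.84, 0.72, 0.58, 0.42, 0.24, 0.15])   # B/B0

      # Third-order polynomial in the square root of concentration: fit the
      # response as a function of sqrt(dose).
      x = np.sqrt(conc)
      coeffs = np.polyfit(x, bound, deg=3)

      # Read back an unknown: invert the fitted curve numerically over the
      # calibrated range only (values outside the responsive region are unreliable).
      grid = np.linspace(x.min(), x.max(), 2000)
      curve = np.polyval(coeffs, grid)

      def dose_from_response(b):
          i = int(np.argmin(np.abs(curve - b)))
          return grid[i] ** 2

      print(f"estimated dose at B/B0 = 0.5: {dose_from_response(0.5):.1f} ng/mL")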

  12. Automated Slide Scanning and Segmentation in Fluorescently-labeled Tissues Using a Widefield High-content Analysis System.

    Science.gov (United States)

    Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick

    2018-05-03

    Automated slide scanning and segmentation of fluorescently-labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Although the WHCAS was not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide-scanning images in the WHCAS, which can be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently-labeled nuclear or cytoplasmic marker is possible. Furthermore, there are a variety of other quantitative software modules that can be run, including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis. This technique will save researchers time and effort and create an automated protocol for slide analysis.

  13. Sub-cellular force microscopy in single normal and cancer cells.

    Science.gov (United States)

    Babahosseini, H; Carmichael, B; Strobl, J S; Mahmoodi, S N; Agah, M

    2015-08-07

    This work investigates the biomechanical properties of sub-cellular structures of breast cells using atomic force microscopy (AFM). The cells are modeled as a triple-layered structure where the Generalized Maxwell model is applied to experimental data from AFM stress-relaxation tests to extract the elastic modulus, the apparent viscosity, and the relaxation time of sub-cellular structures. The triple-layered modeling results allow for determination and comparison of the biomechanical properties of the three major sub-cellular structures between normal and cancerous cells: the upper plasma membrane/actin cortex, the middle cytoplasm/nucleus, and the lower nuclear/integrin sub-domains. The results reveal that the sub-domains become stiffer and significantly more viscous with depth, regardless of cell type. In addition, there is a decreasing trend in the average elastic modulus and apparent viscosity of all the corresponding sub-cellular structures from normal to cancerous cells, which becomes most remarkable in the deeper sub-domain. The modeling presented in this work constitutes a unique AFM-based experimental framework to study the biomechanics of sub-cellular structures. Copyright © 2015 Elsevier Inc. All rights reserved.
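
    A single-branch generalized Maxwell (standard linear solid) relaxation fit of the kind used to extract relaxation parameters from AFM stress-relaxation data can be sketched with scipy; the force record below is synthetic, and the conversion to moduli and apparent viscosity, which requires the contact geometry, is omitted.

      import numpy as np
      from scipy.optimize import curve_fit

      def maxwell_sls(t, f_inf, f1, tau):
          """Force relaxation of a single-branch generalized Maxwell (standard
          linear solid) model under a held indentation."""
          return f_inf + f1 * np.exp(-t / tau)

      # Hypothetical AFM stress-relaxation record (force in nN vs. time in s).
      t = np.linspace(0, 10, 200)
      rng = np.random.default_rng(5)
      force = maxwell_sls(t, 0.8, 1.2, 1.5) + rng.normal(0, 0.02, t.size)

      popt, _ = curve_fit(maxwell_sls, t, force, p0=[0.5, 1.0, 1.0])
      f_inf, f1, tau = popt
      print(f"equilibrium force {f_inf:.2f} nN, relaxing branch {f1:.2f} nN, "
            f"relaxation time {tau:.2f} s")
      # Elastic moduli and apparent viscosity would follow from these fitted terms
      # combined with the contact geometry (e.g. a Hertzian indenter model).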

  14. Coupling biomechanics to a cellular level model: an approach to patient-specific image driven multi-scale and multi-physics tumor simulation.

    Science.gov (United States)

    May, Christian P; Kolokotroni, Eleni; Stamatakos, Georgios S; Büchler, Philippe

    2011-10-01

    Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning. Copyright © 2011 Elsevier Ltd. All rights reserved.
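
    As a hedged illustration of the shape metric used above, the sketch below computes the ratio of smallest to largest principal moment of inertia from a binary 3D tumor mask; the synthetic ellipsoidal mask is purely illustrative and unrelated to the paper's imaging data.

      import numpy as np

      # synthetic ellipsoidal "tumor" mask on a 64^3 voxel grid (illustrative)
      z, y, x = np.mgrid[-32:32, -32:32, -32:32]
      mask = (x / 20.0) ** 2 + (y / 14.0) ** 2 + (z / 10.0) ** 2 <= 1.0

      coords = np.argwhere(mask).astype(float)
      coords -= coords.mean(axis=0)                         # centre of mass at the origin

      # inertia tensor of unit point masses at each occupied voxel
      inertia = np.eye(3) * (coords ** 2).sum() - coords.T @ coords
      moments = np.sort(np.linalg.eigvalsh(inertia))
      print(f"shape ratio I_min / I_max = {moments[0] / moments[-1]:.3f}")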

  15. Automated optimization and construction of chemometric models based on highly variable raw chromatographic data.

    Science.gov (United States)

    Sinkov, Nikolai A; Johnston, Brandon M; Sandercock, P Mark L; Harynuk, James J

    2011-07-04

    Direct chemometric interpretation of raw chromatographic data (as opposed to integrated peak tables) has been shown to be advantageous in many circumstances. However, this approach presents two significant challenges: data alignment and feature selection. In order to interpret the data, the time axes must be precisely aligned so that the signal from each analyte is recorded at the same coordinates in the data matrix for each and every analyzed sample. Several alignment approaches exist in the literature, and they work well when the samples being aligned are reasonably similar; in cases where the background matrix for a series of samples to be modeled is highly variable, their performance suffers. Regarding feature selection, when the raw data are used, each signal at each time point is treated as an individual, independent variable; with the data rates of modern chromatographic systems, this generates hundreds of thousands of candidate variables, or tens of millions of candidate variables if multivariate detectors such as mass spectrometers are utilized. Consequently, an automated approach to identify and select appropriate variables for inclusion in a model is desirable. In this research we present an alignment approach that relies on a series of deuterated alkanes, which act as retention anchors for an alignment signal, and couple this with an automated feature selection routine based on our novel cluster resolution metric for the construction of a chemometric model. The model system used to demonstrate these approaches is a series of simulated arson debris samples analyzed by passive headspace extraction and GC-MS and interpreted using partial least squares discriminant analysis (PLS-DA). Copyright © 2011 Elsevier B.V. All rights reserved.
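
    A minimal sketch of the anchor-based alignment idea: retention times are warped onto a reference axis by piecewise-linear interpolation between the retention times of the alkane anchors. The anchor times and the placeholder signal are invented for illustration; the cluster-resolution feature selection and the PLS-DA modeling are not reproduced.

      import numpy as np

      ref_anchors    = np.array([3.1, 6.4, 10.2, 15.8, 22.5])   # alkane anchors, reference run (min)
      sample_anchors = np.array([3.3, 6.5, 10.6, 16.1, 23.0])   # same alkanes in the current sample

      t_sample = np.linspace(3.3, 23.0, 5000)                   # raw time axis within the anchor window
      signal   = np.random.default_rng(0).random(5000)          # placeholder chromatogram

      # map raw times into the reference frame, then resample on a common grid
      t_aligned = np.interp(t_sample, sample_anchors, ref_anchors)
      t_grid = np.linspace(3.1, 22.5, 5000)
      signal_aligned = np.interp(t_grid, t_aligned, signal)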

  16. Towards Automated Bargaining in Electronic Markets: A Partially Two-Sided Competition Model

    Science.gov (United States)

    Gatti, Nicola; Lazaric, Alessandro; Restelli, Marcello

    This paper focuses on the prominent issue of automating bargaining agents within electronic markets. Models of bargaining in the literature deal with settings in which there are only two agents; no model satisfactorily captures settings with competition among multiple buyers and, analogously, among multiple sellers. In this paper, we extend the principal bargaining protocol, i.e. the alternating-offers protocol, to capture bargaining in markets. The model we propose is such that, in the presence of a single buyer and a single seller, the agents' equilibrium strategies are those of the original protocol. Moreover, we study the resulting game game-theoretically and provide the following results: in the presence of one-sided competition (several buyers and one seller, or vice versa) we derive the agents' equilibrium strategies for all parameter values; in the presence of two-sided competition (several buyers and several sellers) we provide an algorithm that produces the agents' equilibrium strategies for a large set of parameter values, and we experimentally evaluate its effectiveness.
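
    For reference, a minimal sketch of the subgame-perfect split in the classical two-agent alternating-offers protocol that the paper generalizes; the market extension with competing buyers and sellers is not reproduced here, and the discount factors used in the example are arbitrary.

      def alternating_offers_split(delta_proposer: float, delta_responder: float):
          """Equilibrium shares of a unit surplus when the first mover proposes."""
          proposer = (1 - delta_responder) / (1 - delta_proposer * delta_responder)
          return proposer, 1 - proposer

      # e.g. buyer proposes first with discount factor 0.9, seller responds with 0.8
      print(alternating_offers_split(0.9, 0.8))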

  17. Software complex AS (automation of spectrometry). User interface of experiment automation system implementation

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Bogdzel', A.A.; Butorin, P.E.; Vasilovskij, S.G.; Gundorin, N.A.; Zlokazov, V.B.; Kutuzov, S.A.; Salamatin, I.M.; Shvetsov, V.N.

    2003-01-01

    An instrumental software complex for automation of spectrometry (AS) has been developed that enables prompt implementation of experiment automation systems for spectrometers that use data buffering. New programming methods, new approaches to building automation systems, and modern network technologies were employed in its development. It is suggested that programs for scheduling and conducting experiments be based on a parametric model of the spectrometer, an approach that makes it possible to write programs suitable for any FLNP (Frank Laboratory of Neutron Physics) spectrometer and experimental technique and to use different hardware interfaces for feeding the spectrometric data into the data acquisition system. The article describes the facilities provided to the user for scheduling and controlling the experiment, viewing data, and controlling the spectrometer parameters. The current spectrometer state, the programs, and the experimental data can be presented on the Internet as dynamically generated protocols and graphs, and the experiment can be controlled via the Internet. No application programs are needed on the client side to use these Internet facilities; it suffices to know how to use the two programs that carry out experiments in automated mode. The package is designed for experiments in condensed matter and nuclear physics and is ready for use. (author)
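
    A hedged sketch of the parametric-model idea only: scheduling code reads the instrument description from a declarative record instead of hard-coding a particular spectrometer. The field names, the values, and the planning function are invented for illustration and do not reflect the actual AS data structures.

      from dataclasses import dataclass, field

      @dataclass
      class SpectrometerModel:
          name: str
          detectors: int
          channels_per_detector: int
          parameters: dict = field(default_factory=dict)    # movable axes, limits, etc.

      def plan_measurement(spec: SpectrometerModel, exposure_s: float) -> dict:
          """Build a hardware-independent measurement request from the model."""
          return {"instrument": spec.name,
                  "spectra": spec.detectors * spec.channels_per_detector,
                  "exposure_s": exposure_s}

      demo = SpectrometerModel("demo_spectrometer", detectors=8, channels_per_detector=4096)
      print(plan_measurement(demo, exposure_s=3600))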

  18. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real...... protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite resulting in 97% and 99% success rate for the client and server implementation, respectively. The tests show...

  19. Cellularized Cellular Solids via Freeze-Casting.

    Science.gov (United States)

    Christoph, Sarah; Kwiatoszynski, Julien; Coradin, Thibaud; Fernandes, Francisco M

    2016-02-01

    The elaboration of metabolically active cell-containing materials is a decisive step toward the successful application of cell-based technologies. The present work unveils a new process that simultaneously encapsulates living cells and shapes the cell-containing material into solid-state macroporous foams with precisely controlled morphology. Our strategy is based on freeze-casting, an ice-templating materials-processing technique that has recently emerged for structuring colloids into macroporous materials. Our results indicate that it is possible to combine precise structuring of the materials with cellular metabolic activity of the model organism Saccharomyces cerevisiae. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Cellular signaling identifiability analysis: a case study.

    Science.gov (United States)

    Roper, Ryan T; Pia Saccomani, Maria; Vicini, Paolo

    2010-05-21

    Two primary purposes for mathematical modeling in cell biology are (1) simulation for making predictions of experimental outcomes and (2) parameter estimation for drawing inferences from experimental data about unobserved aspects of biological systems. While the former purpose has become common in the biological sciences, the latter is less common, particularly when studying cellular and subcellular phenomena such as signaling, the focus of the current study. Data are difficult to obtain at this level. Therefore, even models of only modest complexity can contain parameters for which the available data are insufficient for estimation. In the present study, we use a set of published cellular signaling models to address issues related to global parameter identifiability. That is, we address the following question: assuming known time courses for some model variables, which parameters is it theoretically impossible to estimate, even with continuous, noise-free data? Following an introduction to this problem and its relevance, we perform a full identifiability analysis on a set of cellular signaling models using DAISY (Differential Algebra for the Identifiability of SYstems). We use our analysis to bring to light important issues related to parameter identifiability in ordinary differential equation (ODE) models. We contend that this is, as yet, an under-appreciated issue in biological modeling and, more particularly, cell biology. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
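
    DAISY performs symbolic differential-algebra analysis of structural identifiability; as a simpler, hedged stand-in, the sketch below checks local identifiability numerically through the rank of the output sensitivity matrix for a toy two-state signaling model. The model, the parameter values, and the choice of observed species are all illustrative assumptions, not models from the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, y, k1, k2, k3):
          a, b = y
          return [-k1 * a + k3 * b, k1 * a - (k2 + k3) * b]

      def observed(params, t_eval, y0=(1.0, 0.0)):
          sol = solve_ivp(rhs, (0, t_eval[-1]), y0, t_eval=t_eval, args=tuple(params))
          return sol.y[1]                                   # only the second species is measured

      p0 = np.array([0.7, 0.3, 0.1])
      t_eval = np.linspace(0, 10, 50)
      base = observed(p0, t_eval)

      # finite-difference sensitivities of the output with respect to each parameter
      S = np.column_stack([(observed(p0 + dp, t_eval) - base) / 1e-6
                           for dp in np.eye(3) * 1e-6])
      print("rank of sensitivity matrix:", np.linalg.matrix_rank(S, tol=1e-8))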